CN108269304B - Scene fusion visualization method under multiple geographic information platforms - Google Patents

Scene fusion visualization method under multiple geographic information platforms

Info

Publication number
CN108269304B
Authority
CN
China
Prior art keywords
scene
dimensional
projection
layer
view
Prior art date
Legal status
Active
Application number
CN201711400221.8A
Other languages
Chinese (zh)
Other versions
CN108269304A (en
Inventor
徐汇军
顾爽
赵爽
钱晶
付啟明
张尧
Current Assignee
Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Original Assignee
Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Priority date
Filing date
Publication date
Application filed by Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Priority to CN201711400221.8A
Publication of CN108269304A
Application granted
Publication of CN108269304B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • G06T15/20: Perspective computation
    • G06T15/205: Image-based rendering

Abstract

The invention discloses a scene fusion visualization method under multiple geographic information platforms. The method first fuses and synchronizes an upper-layer scene and a bottom-layer scene by: (1) synchronizing the map projection; (2) synchronizing the view frustum; (3) establishing a bidirectional interaction function; (4) synchronizing the view matrix; (5) synchronizing the azimuth, the inclination, and the distance from the center of the view matrix. After synchronization, a connection is established so that the upper-layer and bottom-layer scenes remain consistent in spatial analysis. Finally, the scenes are cyclically rendered under a judgment mechanism, so that the bottom-layer three-dimensional WebGIS platform can be fused with a variety of WebGIS scenes; the core of the method is to synchronize and render the states of multiple 2.5-dimensional or 2-dimensional GIS scenes with the same algorithm. The final effect is complete agreement of the scene elements, with good bidirectional interaction between the entities. The method enriches the visualization forms of the GIS platform, reduces development cost, and widens the applicability of the platform.

Description

Scene fusion visualization method under multiple geographic information platforms
Technical Field
The invention relates to the field of geographic information visualization, in particular to a scene fusion visualization method under a multi-geographic information platform.
Background
With the explosive development of the Internet, WebGIS technology has perfected and extended traditional GIS technology and given many more users access to GIS. WebGIS has good extensibility and cross-platform capability. In the Internet environment, WebGIS enables spatial data to be shared across all fields and departments throughout society, greatly improving the efficiency of querying, publishing, and maintaining spatial information.
With the development of WebGL technology and scripting languages such as JavaScript, demand for higher-level visualization of geographic big data keeps rising, and excellent 2.5-dimensional and 2-dimensional GIS platforms such as Echarts, DeckGL, and Mapbox, together with many open-source libraries, have emerged at home and abroad. Each platform and library has its own strengths: Echarts provides intuitive, lively, and interactive data-visualization charts; DeckGL provides various types of visualization layers; Three.js offers a general-purpose WebGL 3D engine. However, the scenes and entities created by these platforms are independent of one another and integrate poorly. A geographic scene built on only one or a few platforms has a low level of visualization and easily causes aesthetic fatigue in users, whereas a diversified GIS platform rich in geographic elements is an efficient one with market value.
Meanwhile, owing to technical limitations, current three-dimensional GIS technology cannot yet support large-scale commercial application at a good cost-performance ratio, and adopting a fully three-dimensional GIS inevitably incurs high system-construction costs. A GIS design based on a mixed two- and three-dimensional architecture is a more practical way to deliver three-dimensional GIS under the present conditions; the shallow, low-level coupling of the two-dimensional and three-dimensional parts must therefore be broken through to achieve a tight two-three-dimensional integration. The core problem is how to render two-dimensional data in a three-dimensional system.
These problems include: 1. the monotony of the geographic scenes and data-visualization forms of a single WebGIS platform; 2. the inconsistency in data and space between 2-dimensional or 2.5-dimensional scenes and the 3-dimensional bottom-layer platform; 3. the poor cost-performance of fully three-dimensional GIS development.
Therefore, how to fuse the various two-dimensional geographic scenes well with three-dimensional scenes, and how to form good two-way interaction between the scenes, are the problems that urgently need to be solved.
Disclosure of Invention
The invention aims to provide a scene fusion visualization method under a multi-geographic information platform, which solves the problems in the prior art.
The technical scheme adopted by the invention to solve the above technical problems is as follows:
a scene fusion visualization method under a multi-geographic information platform, which specifically comprises the following steps:
step 1, fusing a bottom-layer scene and an upper-layer scene;
step 2, keeping the upper-layer and bottom-layer scenes consistent in spatial analysis;
step 3, cyclically rendering the WebGIS scene.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, step 1 specifically comprises the following steps:
step 1.1, creating a bottom-layer WebGIS scene and an upper-layer scene: a three-dimensional GIS platform with good interfaces is selected as the bottom layer, and a three-dimensional digital earth, a base layer, a camera, and a renderer are set up;
step 1.2, introducing the upper-layer scene database: data files are imported in JSON format, and the upper-layer scene framework is imported into the project files as a plug-in;
step 1.3, performing a series of synchronization operations.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, step 1.3 is specifically as follows:
(1) synchronizing the map projection: Web Mercator is adopted uniformly as the projection; the 3-dimensional coordinates corresponding to the 2-dimensional or 2.5-dimensional upper-layer scene are computed with the projection formula and stored in a new array, so that the upper-layer scene corresponds to the three-dimensional coordinate system;
(2) synchronizing the view frustum: the camera of the upper-layer scene is adjusted and the upper-layer scene coordinates are multiplied by the perspective projection matrix, making the result coincide with the bottom layer;
(3) establishing a bidirectional interaction function: a state attribute is set in the component class; a change in any scene triggers a change of this attribute, and the attribute change in turn drives the corresponding operations of the other scenes, achieving linkage among multiple scenes and consistency of data;
(4) synchronizing the view matrix: the view matrix of the bottom-layer scene is set as the view matrix of the upper-layer scene;
(5) synchronizing the azimuth, the inclination, and the distance from the center of the view matrix: these values are obtained from the bottom-layer scene's view matrix, stored in an array, and assigned to the corresponding data of the upper-layer scene.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, in step 2, the synchronized map projection method is as follows:
given the coordinates (x, y) of the 2-dimensional or 2.5-dimensional upper-layer scene, where x and y are the horizontal and vertical coordinates under the Mercator projection, and assuming the origin of the Mercator projection coordinate system is $(0, \lambda_0)$, the Mercator projection formula is:
$$x = R(\lambda - \lambda_0), \qquad y = R \ln \tan\left(\frac{\pi}{4} + \frac{\phi}{2}\right) \tag{1}$$
where λ is longitude, φ is latitude, and R is the Earth radius.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, in step 2, the synchronized view frustum method is as follows:
the perspective projection matrix M is specifically as follows:
$$M = \begin{bmatrix} \dfrac{\cot(\theta/2)}{\mathrm{ratio}} & 0 & 0 & 0 \\ 0 & \cot(\theta/2) & 0 & 0 \\ 0 & 0 & -\dfrac{f+n}{f-n} & -\dfrac{2fn}{f-n} \\ 0 & 0 & -1 & 0 \end{bmatrix} \tag{2}$$
where θ is the field-of-view angle of the frustum in the Y direction, ratio is the aspect ratio of the projection plane, and n and f are the distances from the camera to the upper-layer and bottom-layer scenes, respectively.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, in step 2, the method for establishing the bidirectional interaction function is as follows:
the bidirectional interaction function, i.e., the event-trigger mechanism, is established with the following conversion formulas:
linkage formula from the upper-layer scene to the bottom-layer scene: h = H × match (3)
linkage formula from the bottom-layer scene to the upper-layer scene: scale = (h / H) × match (4)
where h is the view height in the bottom-layer map, H is the overall view height, scale is the scale of the upper-layer scene, and match is the matching parameter from the upper-layer scene to the bottom-layer scene.
As a further preferable scheme of the scene fusion visualization method under the multi-geographic information platform, in step 2, the synchronized view matrix method is as follows:
the role of the model-view matrix is: multiplying a point coordinate yields a new point coordinate, which represents the transformed point in the world. If the observer is regarded as a model, the view matrix is the inverse of the observer's model matrix, and translating the observer by $(t_x, t_y, t_z)$ is equivalent to translating the whole world by $(-t_x, -t_y, -t_z)$:
$$T = \begin{bmatrix} 1 & 0 & 0 & -t_x \\ 0 & 1 & 0 & -t_y \\ 0 & 0 & 1 & -t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
the observer rotating by an angle θ around the Z axis is equivalent to the whole world rotating by $-\theta$ around the Z axis:
$$R_z(-\theta) = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
the observer shrinking uniformly by a factor of S in all three directions is equivalent to magnifying the whole world by a factor of S:
$$S_w = \begin{bmatrix} S & 0 & 0 & 0 \\ 0 & S & 0 & 0 \\ 0 & 0 & S & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
After adopting this method, the invention has the following beneficial effects:
1. The invention provides a scene fusion method under multiple geographic information platforms that fuses scenes across multiple dimensions and platforms under a unified standard, achieving synchronized rendering and synchronized operation.
2. The invention combines single geographic scenes with the digital earth, endowing the geographic scenes with spatial-coordinate attributes while raising the visualization level of the digital earth, diversifying the elements of Earth big data, and freeing the WebGIS platform from dependence on a single geographic scene.
3. Bidirectional interaction can be realized between different scenes, achieving consistency of data expression and spatial analysis.
4. The invention provides a cross-platform, plug-in-free, installation-free solution that effectively reduces development cost, expands the multi-level scene display capability, and provides a good environment for comparative analysis between scenes.
Drawings
FIG. 1 is a flow chart of a scene fusion method under a multi-geographic information platform;
FIG. 2 is a flow chart of a method for fusing a bottom layer scene and an upper layer scene;
FIG. 3 is a flow chart of a method for maintaining spatial analysis consistency of an upper layer scene and a lower layer scene;
FIG. 4 is a flow chart of WebGIS scene loop rendering.
Detailed Description
The first part
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
FIG. 1 is a block diagram of the mechanism of the method of the present invention. The scene fusion visualization method under multiple geographic information platforms comprises 3 stages: stage 1 is the fusion of the bottom-layer scene and the upper-layer scene; stage 2 keeps the spatial analysis of the upper-layer and bottom-layer scenes consistent; stage 3 is the cyclic rendering of the WebGIS scene.
FIG. 2 is a flowchart of the method for fusing the bottom-layer scene and the upper-layer scene. First, the bottom-layer WebGIS scene and the upper-layer scene are created: a three-dimensional GIS platform with good interfaces is selected as the bottom layer, and the three-dimensional digital earth, base layer, camera, and renderer are set up. The upper-layer scene database is then introduced, data files are imported in JSON format, and the upper-layer scene framework is imported into the project files as a plug-in. At this point the upper-layer scene and the bottom-layer platform are still two independent canvases, merely overlaid by the browser. A series of synchronization operations is then performed: (1) synchronizing the map projection: Web Mercator is uniformly adopted as the projection; the 3-dimensional coordinates corresponding to the 2-dimensional or 2.5-dimensional upper-layer scene are computed with the projection formula and stored in a new array, so that the upper-layer scene corresponds to the three-dimensional coordinate system. (2) Synchronizing the view frustum: the camera of the upper-layer scene is adjusted and the upper-layer scene coordinates are multiplied by the perspective projection matrix, making the result coincide with the bottom layer. (3) Establishing the bidirectional interaction function: a state attribute is set in the component class; a change in any scene triggers a change of this attribute, which in turn drives the corresponding operations of the other scenes, achieving linkage among multiple scenes and consistency of data. (4) Synchronizing the view matrix: the view matrix of the bottom-layer scene is set as the view matrix of the upper-layer scene. (5) Synchronizing the azimuth, the inclination, and the distance from the view-matrix center: these values are obtained from the bottom-layer scene's view matrix, stored in an array, and assigned to the corresponding data of the upper-layer scene. A minimal sketch of this synchronization flow is given below.
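The synchronization flow can be illustrated with a short TypeScript sketch. `SceneAdapter` and `CameraState` are illustrative abstractions invented here, not the API of any particular platform; only the order and direction of the synchronization follow the method above.

```typescript
// A minimal sketch of the synchronization operations of step 1.3,
// assuming hypothetical SceneAdapter wrappers around the two platforms.
interface CameraState {
  viewMatrix: Float64Array; // 4x4 view matrix, column-major
  azimuth: number;          // azimuth angle
  inclination: number;      // inclination angle
  range: number;            // distance from the view-matrix center
}

interface SceneAdapter {
  getCameraState(): CameraState;
  setCameraState(s: CameraState): void;
  onChange(cb: () => void): void; // fires when the user moves this scene
}

function synchronize(bottom: SceneAdapter, upper: SceneAdapter): void {
  let syncing = false; // guard against feedback loops in the two-way linkage

  const copyState = (from: SceneAdapter, to: SceneAdapter): void => {
    if (syncing) return;
    syncing = true;
    const s = from.getCameraState();
    // (4) synchronize the view matrix; (5) azimuth / inclination / range
    to.setCameraState({ ...s, viewMatrix: s.viewMatrix.slice() });
    syncing = false;
  };

  // (3) bidirectional interaction: a change in either scene drives the other
  bottom.onChange(() => copyState(bottom, upper));
  upper.onChange(() => copyState(upper, bottom));
}
```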
FIG. 3 is a flow chart for maintaining consistency in the spatial analysis of the upper-layer and bottom-layer scenes. The upper-layer GIS platform provides advanced spatial-analysis functions. After the computation finishes, the analysis results (layers, statistical forms, and the like) are returned to the bottom-layer platform, and the bottom-layer GIS platform loads them into the bottom-layer scene through its data-loading function, so that the advanced spatial-analysis results of the upper layer can be displayed visually on the bottom-layer GIS platform.
FIG. 4 is a flow chart of the cyclic rendering of the WebGIS scenes. The bottom-layer and upper-layer WebGIS scenes are rendered in a loop; in each iteration, entity changes in the scene and changes of the camera view matrix are detected through renderRequired and viewMatrix. Changed entities are re-rendered; if nothing has changed, the rendering operation is forcibly suspended by the forcedPaus command. This feature makes good use of computer resources and improves running efficiency. A hedged sketch of such a loop is given below.
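In the sketch below, `renderRequired`, `viewMatrix`, and the scene objects are illustrative stand-ins for the judgment mechanism described above, not a published API, and a browser environment providing `requestAnimationFrame` is assumed.

```typescript
// Cyclic rendering with a dirty check: a scene is re-rendered only when
// its entities changed (renderRequired) or its camera view matrix moved.
interface RenderableScene {
  render(): void;
  renderRequired: boolean;
  viewMatrix: Float64Array;
}

function startRenderLoop(scenes: RenderableScene[]): void {
  const lastMatrices = scenes.map(s => s.viewMatrix.slice());

  const frame = (): void => {
    scenes.forEach((scene, i) => {
      const moved = !scene.viewMatrix.every((v, j) => v === lastMatrices[i][j]);
      if (scene.renderRequired || moved) {
        scene.render();                         // re-render the changed scene
        lastMatrices[i] = scene.viewMatrix.slice();
        scene.renderRequired = false;
      }                                         // otherwise rendering is skipped
    });
    requestAnimationFrame(frame);
  };
  requestAnimationFrame(frame);
}
```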
The second part
The technical scheme and the scientific principles of the invention are explained in detail below.
1. The synchronous map projection method in phase 2 is as follows:
Coordinates (x, y) of the 2-dimensional or 2.5-dimensional upper-layer scene are given, where (x, y) are the Mercator plane coordinates. Assuming the origin of the Mercator projection coordinate system is $(0, \lambda_0)$, the Mercator projection formula is:
$$x = R(\lambda - \lambda_0), \qquad y = R \ln \tan\left(\frac{\pi}{4} + \frac{\phi}{2}\right) \tag{1}$$
where λ is longitude, φ is latitude, and R is the Earth radius.
The conformal ("equal-angle") property of the Mercator projection preserves the directions and relative positions in the scene, ensuring that the GIS platform does not err when querying the bearing of ground features. The synchronized coordinate system must be unified, and WGS84 is generally chosen.
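As an illustration, formula (1) can be implemented directly. The sketch below uses the WGS84 semi-major axis as R and assumes inputs in degrees; the function name is illustrative.

```typescript
// Forward Web Mercator projection (formula (1)).
const R = 6378137; // WGS84 semi-major axis in meters

function webMercator(lonDeg: number, latDeg: number, lon0Deg = 0):
    { x: number; y: number } {
  const lambda = ((lonDeg - lon0Deg) * Math.PI) / 180;
  const phi = (latDeg * Math.PI) / 180;
  return {
    x: R * lambda,
    y: R * Math.log(Math.tan(Math.PI / 4 + phi / 2)),
  };
}

// e.g. webMercator(-1.415, 52.232) gives approximately
// { x: -157517, y: 6.84e6 }, in line with P1/P2 of the embodiment.
```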
2. The synchronized view frustum method in phase 2 is as follows:
only the part of the scene inside the six planes of the frustum (top, bottom, left, right, near, far) is visible. The purpose of perspective projection is to transform the frustum into a cube (the canonical view volume), ensuring that the upper-layer and bottom-layer scenes appear completely coincident from the camera's viewpoint. The perspective projection matrix is M:
$$M = \begin{bmatrix} \dfrac{\cot(\theta/2)}{\mathrm{ratio}} & 0 & 0 & 0 \\ 0 & \cot(\theta/2) & 0 & 0 \\ 0 & 0 & -\dfrac{f+n}{f-n} & -\dfrac{2fn}{f-n} \\ 0 & 0 & -1 & 0 \end{bmatrix} \tag{2}$$
where θ is the field-of-view angle of the frustum in the Y direction, ratio is the aspect ratio of the projection plane, and n and f are the distances from the camera (the apex of the frustum) to the upper-layer and bottom-layer scenes, respectively.
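A small builder for this matrix is sketched below; it returns a column-major array in the WebGL convention, which is an assumption about the layout rather than something stated in the description.

```typescript
// Perspective projection matrix M of formula (2): theta is the vertical
// field-of-view angle, ratio the aspect ratio, n and f the near/far distances.
function perspective(theta: number, ratio: number, n: number, f: number):
    Float64Array {
  const c = 1 / Math.tan(theta / 2); // cot(theta / 2)
  return new Float64Array([
    c / ratio, 0,  0,                      0,  // column 0
    0,         c,  0,                      0,  // column 1
    0,         0, -(f + n) / (f - n),     -1,  // column 2
    0,         0, -(2 * f * n) / (f - n),  0,  // column 3
  ]);
}
```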
3. The method for establishing the bidirectional interaction function in phase 2 is as follows:
the bidirectional interaction function, i.e., the event-trigger mechanism, is established with the following conversion formulas:
linkage formula from the upper-layer scene to the bottom-layer scene: h = H × match (3)
linkage formula from the bottom-layer scene to the upper-layer scene: scale = (h / H) × match (4)
where h is the view height in the bottom-layer map, H is the overall view height, scale is the scale of the upper-layer scene, and match is the matching parameter from the upper-layer scene to the bottom-layer scene.
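The two linkage formulas, as reconstructed above, translate into two small functions; the event wiring around them is assumed to follow the state-attribute mechanism of step 1.3 (3).

```typescript
// Upper scene -> bottom scene: recompute the bottom-layer view height h.
function bottomHeight(H: number, match: number): number {
  return H * match; // formula (3)
}

// Bottom scene -> upper scene: recompute the upper-layer scale.
function upperScale(h: number, H: number, match: number): number {
  return (h / H) * match; // formula (4)
}
```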
4. The synchronized view matrix method in phase 2 is as follows:
The role of the model-view matrix is: multiplying a point coordinate yields a new point coordinate, which represents the transformed point in the world. If the observer is regarded as a model, the view matrix is the inverse of the observer's model matrix. Translating the observer by $(t_x, t_y, t_z)$ is equivalent to translating the whole world by $(-t_x, -t_y, -t_z)$:
$$T = \begin{bmatrix} 1 & 0 & 0 & -t_x \\ 0 & 1 & 0 & -t_y \\ 0 & 0 & 1 & -t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
The observer rotating by an angle θ around the Z axis is equivalent to the whole world rotating by $-\theta$ around the Z axis:
$$R_z(-\theta) = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
The observer shrinking uniformly by a factor of S in all three directions is equivalent to magnifying the whole world by a factor of S:
$$S_w = \begin{bmatrix} S & 0 & 0 & 0 \\ 0 & S & 0 & 0 \\ 0 & 0 & S & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
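The three world transforms can be written out explicitly; the sketch below uses column-major 4x4 arrays (a WebGL-style assumption) and illustrative function names.

```typescript
type Mat4 = Float64Array;

// Observer moves by (tx, ty, tz) => the world moves by (-tx, -ty, -tz).
function worldTranslation(tx: number, ty: number, tz: number): Mat4 {
  return new Float64Array([
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    -tx, -ty, -tz, 1, // translation in the last column
  ]);
}

// Observer rotates by +theta about Z => the world rotates by -theta.
function worldRotationZ(theta: number): Mat4 {
  const c = Math.cos(theta), s = Math.sin(theta);
  return new Float64Array([
    c, -s, 0, 0,
    s,  c, 0, 0,
    0,  0, 1, 0,
    0,  0, 0, 1,
  ]);
}

// Observer shrinks uniformly by 1/S => the world appears scaled by S.
function worldScale(S: number): Mat4 {
  return new Float64Array([
    S, 0, 0, 0,
    0, S, 0, 0,
    0, 0, S, 0,
    0, 0, 0, 1,
  ]);
}
```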
The third part
The specific implementation steps are as follows:
Step 1: According to the framework structure, the life cycle of the scene fusion method under the multi-geographic-information platform is divided into initial loading, rendering, and end of loading. During initial loading, a sphere is defined as the basic sphere based on the three-dimensional digital earth engine Cesium, and a third-party WebGIS scene is then brought in as the upper-layer scene. A Gaode base map is added to the basic sphere for base-map rendering.
Step 2: The requested map tiles are rendered onto the sphere as sectors above it. As in the synchronized map projection of FIG. 2, when a tile request completes transmission, the tile's index value (currently (2052, 431)) is used to compute the Web Mercator geographic coordinates P1[-157517.079, 6842183.225] of the tile's starting point (bottom-layer scene), and these Web Mercator coordinates are then used to compute the corresponding longitude/latitude position P2[-1.415, 52.232] on the sphere for the upper-layer scene (formula (1)). A sketch of this conversion is given below.
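The conversion from a tile index to tile-origin coordinates can be sketched as below. The zoom level z and the standard XYZ tiling scheme are assumptions: the embodiment gives only the index (2052, 431), not the tiling scheme, so the constants here are illustrative.

```typescript
const R = 6378137;                       // WGS84 semi-major axis
const EXTENT = 2 * Math.PI * R;          // width of the Web Mercator world

// Web Mercator origin of tile (x, y) at zoom z, plus its longitude/latitude.
function tileOrigin(x: number, y: number, z: number):
    { mx: number; my: number; lon: number; lat: number } {
  const tileSize = EXTENT / 2 ** z;
  const mx = x * tileSize - EXTENT / 2;  // mercator x of the tile origin
  const my = EXTENT / 2 - y * tileSize;  // mercator y of the tile origin
  const lon = (mx / EXTENT) * 360;       // inverse of formula (1) for x
  const lat = ((2 * Math.atan(Math.exp(my / R)) - Math.PI / 2) * 180) / Math.PI;
  return { mx, my, lon, lat };
}
```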
Step3, adjusting the camera position as the synchronous view cone in fig. 2, taking P2 as the current viewpoint, using the world coordinates of the current viewpoint and the view cone boundary points (upper left, lower left, upper right and lower right points), and taking the viewpoint as the vertex to take the ray and the three-dimensional earth for picking up and detecting. When the detection ray and the three-dimensional earth have an intersection point, the world coordinate of the intersection point is used for calculating the geographic coordinate range of the window request image where the detection ray is located, and a matrix is obtained by formula 2
Figure BDA0001519281300000081
If all vertices of the upper-layer scene lie within the frustum, the region under test is inside the frustum; if only some of the vertices lie within it, the region intersects the frustum and is still considered visible; if none of the vertices lies within it, the region is invisible. A sketch of this classification is given below.
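This classification can be sketched as a clip-space test; the vertex list and the `Vec4` representation are illustrative, and the all-vertices-outside case is treated as invisible exactly as the description states (a conservative simplification).

```typescript
type Vec4 = [number, number, number, number];

// After multiplication by M, a visible point satisfies -w <= x, y, z <= w.
function insideClipVolume([x, y, z, w]: Vec4): boolean {
  return Math.abs(x) <= w && Math.abs(y) <= w && Math.abs(z) <= w;
}

function classify(clipVerts: Vec4[]): 'inside' | 'intersects' | 'outside' {
  const n = clipVerts.filter(insideClipVolume).length;
  if (n === clipVerts.length) return 'inside';  // all vertices in the frustum
  if (n > 0) return 'intersects';               // partially inside: visible
  return 'outside';                             // invisible per the description
}
```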
Step 4: The bidirectional interaction function is established as in FIG. 2: scene changes are monitored through a state index, and an event-trigger mechanism is established. When the height of the upper-layer scene changes, the state index changes as well; an execution function is then triggered, the current matching parameter is obtained as 5.25, and the height of the bottom-layer scene is recalculated by formula (3) as 1550370.0351, which serves as a parameter for the final rendering. This guarantees the linkage of the upper and lower layers and the consistency of the data.
Step 5: The azimuth, the inclination, and the distance from the view-matrix center point are synchronized as in FIG. 2. The initial values of these state parameters are set to -27.39, 40.5, and 4500, respectively. When the user rotates the sphere, the azimuth, the inclination, and the distance from the bottom-layer scene's view-matrix center to the viewpoint change; the state parameters are updated immediately (here to -26.39, 40.5, and 4500) and assigned to the upper-layer scene, as sketched below.
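A sketch of this state propagation, using the embodiment's initial values; the property and method names are illustrative stand-ins.

```typescript
// State array shared between the two scenes (initial values from Step 5).
const state = { azimuth: -27.39, inclination: 40.5, range: 4500 };

interface UpperScene { setView(s: typeof state): void; }

// Called whenever the bottom-layer camera changes, e.g. when the user
// rotates the sphere and the azimuth becomes -26.39.
function onBottomCameraChanged(
  camera: { azimuth: number; inclination: number; range: number },
  upper: UpperScene,
): void {
  state.azimuth = camera.azimuth;
  state.inclination = camera.inclination;
  state.range = camera.range;
  upper.setView(state); // assign the updated values to the upper scene
}
```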
Step 6: Consistency of the spatial analysis is maintained as in FIG. 3. Every ground feature in the bottom-layer scene corresponds to one or several entities or vector data of the upper-layer scene; that is, the three-dimensional GIS platform shares the vector data source of the two-dimensional GIS platform through WFS. When data query, addition, editing, deletion, or other operations are performed, the upper-layer scene data is analyzed and computed according to the specific requirement, and the result (coordinates) is stored in a position array. The bottom-layer scene then selects a suitable visualization form and creates a geometric entity, such as a line segment representing a traffic road or a circular dot representing the location of an event. When the viewing angle becomes small, the visible range is determined from the frustum as in Step 3, and a curve or polygon is generally chosen as the position attribute of the analysis data. In this way the analysis results of the upper-layer scene are displayed on the bottom layer, achieving consistency of spatial analysis across the multiple geographic scenes. A minimal sketch follows.
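Below is a minimal sketch of returning an analysis result to the bottom layer; `BottomScene` and its two methods are hypothetical stand-ins for the bottom platform's data-loading interface.

```typescript
interface BottomScene {
  addPolyline(positions: [number, number][]): void; // e.g. a traffic road
  addPoint(position: [number, number]): void;       // e.g. an event location
}

// Load an upper-layer analysis result (stored as a coordinate array)
// into the bottom-layer scene as a geometric entity.
function loadAnalysisResult(
  result: { kind: 'line' | 'point'; coords: [number, number][] },
  scene: BottomScene,
): void {
  if (result.kind === 'line') scene.addPolyline(result.coords);
  else result.coords.forEach(c => scene.addPoint(c));
}
```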
Step 7: As shown in FIG. 4, when the bottom-layer and upper-layer WebGIS scenes are rendered cyclically, the entity changes and the camera view-matrix changes in each scene are checked on every iteration; a scene whose entities and view matrix are unchanged is not rendered, and otherwise it is rendered. This avoids repeated work, reduces memory usage, and makes loading smoother.

Claims (5)

1. A scene fusion visualization method under multiple geographic information platforms, characterized in that the method specifically comprises the following steps:
step 1, fusing a bottom-layer scene and an upper-layer scene;
step 1 specifically comprises the following steps:
step 1.1, creating a bottom-layer WebGIS scene and an upper-layer scene: selecting a three-dimensional GIS platform with good interfaces as the bottom layer, and setting up a three-dimensional digital earth, a base layer, a camera, and a renderer;
step 1.2, introducing the upper-layer scene database: importing data files in JSON format, and importing the upper-layer scene framework into the project files as a plug-in;
step 1.3, performing a series of synchronization operations;
step 1.3 is specifically as follows:
(1) synchronizing the map projection: Web Mercator is adopted uniformly as the projection; the 3-dimensional coordinates corresponding to the 2-dimensional or 2.5-dimensional upper-layer scene are computed with the projection formula and stored in a new array, so that the upper-layer scene corresponds to the three-dimensional coordinate system;
(2) synchronizing the view frustum: adjusting the camera of the upper-layer scene and multiplying the upper-layer scene coordinates by the perspective projection matrix, so that the result coincides with the bottom layer;
(3) establishing a bidirectional interaction function: setting a state attribute in the component class, where a change in any scene triggers a change of the attribute, and the attribute change drives the corresponding operations of the other scenes, achieving linkage among multiple scenes and consistency of data;
(4) synchronizing the view matrix: setting the view matrix of the bottom-layer scene as the view matrix of the upper-layer scene;
(5) synchronizing the azimuth, the inclination, and the distance from the center of the view matrix: obtaining these values from the bottom-layer scene's view matrix, storing them in an array, and assigning the array to the corresponding data of the upper-layer scene;
step 2, keeping the upper-layer and bottom-layer scenes consistent in spatial analysis;
the upper-layer GIS platform analyzes the data and returns the analysis result to the bottom-layer GIS platform; the bottom-layer GIS platform loads the analysis result into the bottom-layer WebGIS scene through its data-loading function, extracts the scene performance characteristics of the bottom-layer WebGIS scene, and returns them to the upper-layer GIS platform, which continues to analyze the data; this cycle maintains the consistency of the spatial analysis of the upper-layer and bottom-layer scenes;
step 3, cyclically rendering the WebGIS scene;
the bottom-layer and upper-layer WebGIS scenes are rendered in a loop; in each iteration, entity changes in the scene and changes of the camera view matrix are detected through renderRequired and viewMatrix; changed entities are re-rendered, and if nothing has changed, rendering is forcibly suspended by the forcedPaus command.
2. The method for scene fusion visualization under multiple geographic information platforms according to claim 1, wherein in step 2 the synchronized map projection method is as follows:
given the coordinates (x, y) of the 2-dimensional or 2.5-dimensional upper-layer scene, where x and y are the horizontal and vertical coordinates under the Mercator projection, and assuming the origin of the Mercator projection coordinate system is $(0, \lambda_0)$, the Mercator projection formula is:
$$x = R(\lambda - \lambda_0), \qquad y = R \ln \tan\left(\frac{\pi}{4} + \frac{\phi}{2}\right) \tag{1}$$
where λ is longitude, φ is latitude, and R is the Earth radius.
3. The method for scene fusion visualization under multiple geographic information platforms according to claim 1, wherein in step 2 the synchronized view frustum method is as follows:
the perspective projection matrix M is specifically as follows:
$$M = \begin{bmatrix} \dfrac{\cot(\theta/2)}{\mathrm{ratio}} & 0 & 0 & 0 \\ 0 & \cot(\theta/2) & 0 & 0 \\ 0 & 0 & -\dfrac{f+n}{f-n} & -\dfrac{2fn}{f-n} \\ 0 & 0 & -1 & 0 \end{bmatrix} \tag{2}$$
where θ is the field-of-view angle of the frustum in the Y direction, ratio is the aspect ratio of the projection plane, and n and f are the distances from the camera to the upper-layer and bottom-layer scenes, respectively.
4. The method for scene fusion visualization under multiple geographic information platforms according to claim 1, wherein in step 2 the method for establishing the bidirectional interaction function is as follows:
the bidirectional interaction function, i.e., the event-trigger mechanism, is established with the following conversion formulas:
linkage formula from the upper-layer scene to the bottom-layer scene: h = H × match (3)
linkage formula from the bottom-layer scene to the upper-layer scene: scale = (h / H) × match (4)
where h is the view height in the bottom-layer map, H is the overall view height, scale is the scale of the upper-layer scene, and match is the matching parameter from the upper-layer scene to the bottom-layer scene.
5. The method for scene fusion visualization under multiple geographic information platforms according to claim 1, wherein in step 2 the synchronized view matrix method is as follows:
the role of the model-view matrix is: multiplying a point coordinate yields a new point coordinate, which represents the transformed point in the world; if the observer is regarded as a model, the view matrix is the inverse of the observer's model matrix, and translating the observer by $(t_x, t_y, t_z)$ is equivalent to translating the whole world by $(-t_x, -t_y, -t_z)$:
$$T = \begin{bmatrix} 1 & 0 & 0 & -t_x \\ 0 & 1 & 0 & -t_y \\ 0 & 0 & 1 & -t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
the observer rotating by an angle θ around the Z axis is equivalent to the whole world rotating by $-\theta$ around the Z axis:
$$R_z(-\theta) = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
the observer shrinking uniformly by a factor of S in all three directions is equivalent to magnifying the whole world by a factor of S:
$$S_w = \begin{bmatrix} S & 0 & 0 & 0 \\ 0 & S & 0 & 0 \\ 0 & 0 & S & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
CN201711400221.8A 2017-12-22 2017-12-22 Scene fusion visualization method under multiple geographic information platforms Active CN108269304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711400221.8A CN108269304B (en) 2017-12-22 2017-12-22 Scene fusion visualization method under multiple geographic information platforms

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711400221.8A CN108269304B (en) 2017-12-22 2017-12-22 Scene fusion visualization method under multiple geographic information platforms

Publications (2)

Publication Number Publication Date
CN108269304A CN108269304A (en) 2018-07-10
CN108269304B (en) 2022-03-11

Family

ID=62772256

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711400221.8A Active CN108269304B (en) 2017-12-22 2017-12-22 Scene fusion visualization method under multiple geographic information platforms

Country Status (1)

Country Link
CN (1) CN108269304B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111091620B (en) * 2019-12-03 2023-09-26 深圳震有科技股份有限公司 Map dynamic road network processing method and system based on graphics and computer equipment
CN111221911B (en) * 2020-01-02 2023-09-15 深圳震有科技股份有限公司 GIS system fusion rendering and data synchronous processing method, system and equipment
CN111861890A (en) * 2020-08-03 2020-10-30 北京庚图科技有限公司 Three-dimensional map generation method and device
CN112380309A (en) * 2020-11-23 2021-02-19 深圳航天智慧城市系统技术研究院有限公司 WebGL-based GIS data visualization method and device
CN112364117A (en) * 2020-11-25 2021-02-12 深圳航天智慧城市系统技术研究院有限公司 GIS data visualization method and device based on physical engine
CN112711407A (en) * 2020-12-28 2021-04-27 河北志晟信息技术股份有限公司 Construction method and use method of universal WebGIS development client
CN115471550B (en) * 2022-08-31 2023-05-26 北京四维远见信息技术有限公司 2.5-dimensional image space geometric azimuth correction method, device, equipment and medium
CN116662435B (en) * 2023-05-25 2024-02-02 北京龙软科技股份有限公司 Cloud GIS two-three-dimensional integrated visualization system and integrated visualization method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9165397B2 (en) * 2013-06-19 2015-10-20 Google Inc. Texture blending between view-dependent texture and base texture in a geographic information system
CN104615735B (en) * 2015-02-11 2019-03-15 中科星图股份有限公司 A kind of space time information method for visualizing based on geographical information space system
CN106547880B (en) * 2016-10-26 2020-05-12 重庆邮电大学 Multi-dimensional geographic scene identification method fusing geographic area knowledge
CN106846474B (en) * 2016-12-29 2020-04-03 中国科学院电子学研究所苏州研究院 WebGIS (Web geographic information System) time-space process simulation method based on time sequence characteristics and particle systems
CN107369205B (en) * 2017-07-04 2020-10-16 东南大学 Mobile terminal city two-dimensional and three-dimensional linkage display method

Also Published As

Publication number Publication date
CN108269304A (en) 2018-07-10

Similar Documents

Publication Publication Date Title
CN108269304B (en) Scene fusion visualization method under multiple geographic information platforms
US11222465B2 (en) Embedded urban design scene emulation method and system
CN105719343A (en) Method for constructing virtual streetscape map
Liu Three-dimensional visualized urban landscape planning and design based on virtual reality technology
CN108572951B (en) Mapping data three-dimensional display system based on geographic information
CN108959434A (en) A kind of scene fusion visualization method under more geographical information platforms
KR101591427B1 (en) Method for Adaptive LOD Rendering in 3-D Terrain Visualization System
CN114756937A (en) Visualization system and method based on UE4 engine and Cesium framework
CN112017270A (en) Live-action three-dimensional visualization online application system
CN204695673U (en) A kind of for the three-dimensional digital sand table in urban planning and construction
Bi et al. Research on CIM basic platform construction
KR20220155245A (en) Positioning method, method for generating visual map and device thereof
Ding et al. The interactive modeling method of virtual city scene based on building codes
CN115619986A (en) Scene roaming method, device, equipment and medium
CN114972665A (en) Three-dimensional visual virtual scene modeling method in unmanned aerial vehicle virtual simulation
Zhang Development of virtual campus system based on ArcGIS
Song et al. Research on 3D campus integrated management based on ArcGIS Pro and CityEngine
CN106875480B (en) Method for organizing urban three-dimensional data
CN116310093B (en) Virtual three-dimensional urban geographic scene sand table model construction system and method thereof
Ren et al. Design and Development of 3D Urban Planning Management System Based on Oblique Image Technology
Sun et al. Budi: Building urban designs interactively a spatial-based visualization and collaboration platform for urban planning
Yi et al. Research on the Application of Computer Artificial Intelligence Technology in Environmental Art Design
Huang et al. Innovative Application and Improvement of Panoramic Digital Technology in Indoor Display Scenes
Hua et al. Review of 3D GIS Data Fusion Methods and Progress
Ding et al. Rapid construction of indoor and outdoor three-dimensional scenes and augmented reality navigation design and application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant