CN110706325A - Real-time dynamic rendering method and system for three-dimensional submarine environment - Google Patents


Info

Publication number
CN110706325A
CN110706325A (application CN201910932100.0A)
Authority
CN
China
Prior art keywords
texture
rendering
etching
bubble
bubbles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910932100.0A
Other languages
Chinese (zh)
Other versions
CN110706325B (en)
Inventor
马国军
丁静静
朱琎
曾庆军
王彪
周大年
Current Assignee
Jiangsu University of Science and Technology
Original Assignee
Jiangsu University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Jiangsu University of Science and Technology filed Critical Jiangsu University of Science and Technology
Priority to CN201910932100.0A priority Critical patent/CN110706325B/en
Publication of CN110706325A publication Critical patent/CN110706325A/en
Application granted granted Critical
Publication of CN110706325B publication Critical patent/CN110706325B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing

Abstract

The invention discloses a real-time dynamic rendering method for a three-dimensional submarine environment and a rendering system implementing the method. The rendering method comprises the following steps: 1. establishing an obj model from the digital elevation model data of the submarine area to be rendered; acquiring a ground image of the submarine area in real time to generate a terrain normal texture map; setting a texture surrounding mode and a filtering mode; loading the terrain normal texture map onto the obj model using OpenGL to render the seabed ground; 2. rendering the etching from the water etching texture pictures, and rendering light beams in light-source space; 3. setting the texture data and transparency of the bubbles; generating new bubbles and setting bubble attributes; deleting bubbles whose lifetime exceeds a preset lifetime threshold and bubbles whose height exceeds a preset sea-depth threshold; performing collision detection based on bubble position and velocity, and updating the attributes of bubbles after collision. The method draws the submarine environment rapidly, and the rendering effect is vivid.

Description

Real-time dynamic rendering method and system for three-dimensional submarine environment
Technical Field
The invention belongs to the field of computer image simulation, and particularly relates to a method and a rendering system for performing three-dimensional real-time dynamic rendering on a submarine environment by using a computer.
Background
Human exploration of the ocean is no longer satisfied with two-dimensional representations; building a realistic three-dimensional dynamic seabed environment has become the spatial basis for computer visualization and three-dimensional display of ocean information and scene elements. On this basis, scientific research activities and virtual operations can be carried out in areas such as ocean information data processing and analysis, ocean operation display, and ocean phenomenon simulation. Three-dimensional dynamic submarine environment visualization plays an important role in movies, games, submarine exploration, and disaster relief.
Currently, dynamic visualization of the entire subsea environment is incomplete and rendering is slow. For terrain, simulation divides into simulated terrain and real terrain. Simulated terrain generally cannot reflect the true state of the seabed, while real terrain is usually built from Digital Elevation Model (DEM) data, whose types are not uniform. The data-conversion method is usually determined by the system in use: terrain data are converted into a type that system can recognize, but such a conversion suits only that one system, cannot be reused, involves a large workload, and is hard to implement. For the interaction of sea water, terrain, and light, the sea surface light is divided into many cones, and the light-intensity integral over these cones consumes a large amount of computation time. For light beams, usually only light attenuation is considered, ignoring refraction and the influence of terrain on the light. One prior method introduces the caustic-map ("etching graph") technique into underwater illumination rendering, improves the caustic map by exploiting the periodicity of the FFT water surface, and renders underwater light beams with ray marching; however, the periodic rendering reduces the efficiency of the FFT water surface, and the visualization of the submarine environment cannot be achieved dynamically.
In addition, the data processing involved in such visualization methods is complex and its efficiency is low.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the problems in the prior art, the invention discloses a real-time dynamic rendering method for a three-dimensional submarine environment, which can be used for rapidly rendering the submarine environment and has a vivid rendering effect.
The technical scheme is as follows: the invention discloses a real-time dynamic rendering method of a three-dimensional submarine environment, which comprises the following steps:
(1) establishing an obj model according to the digital elevation model data of the submarine area to be rendered;
acquiring a ground image of the submarine area in real time to generate a terrain normal texture map;
setting a texture surrounding mode and a filtering mode; loading the terrain normal texture map onto an obj model by utilizing OpenGL to realize rendering of the seabed ground;
(2) rendering the etching: acquiring static water etching texture pictures at n consecutive moments before the current moment for the submarine area to be rendered, wherein the time difference between adjacent pictures in the static water etching texture pictures is smaller than a preset time interval Δt;
setting a texture surrounding mode and a filtering mode; loading the water etching normal line texture picture to a water area part of a seabed area to be rendered by utilizing an OpenGL timer to realize dynamic rendering of water etching;
rendering the light beam: rendering a full-screen rectangle in an OpenGL main program; in a fragment shader, starting from a near cutting surface, light rays travel to a scene body, a water etching texture picture is sampled along the sight direction, a current sampling point is projected to a light source space, and the obtained x and y components are used as a light beam texture coordinate sampling etching picture; the x and y components are window relative coordinates of the current fragment; the sampling value is used as the strength contribution of the sampling point to the fragment, exponential attenuation is added, and the final fragment strength is calculated; the fragment intensity S (p) at position p on the ray is: s (P) ═ I + α (P)1,p)·β(p,e)·c;
Wherein, P1As the starting point of the light, I is the light intensity of the light on the sea surface, and e is the observationThe position of the point, c is the scattering coefficient of the seawater, α (P)1P) is the attenuation coefficient of the light intensity from the starting point to p; β (p, e) is the attenuation coefficient of the light intensity from the eye position to the ray position at p;
(3) setting texture data and transparency of the bubbles; generating new bubbles and setting bubble attributes, the bubble attributes comprising: bubble position, size, lifetime, velocity; deleting the bubbles with the lifetime exceeding a preset lifetime threshold value and the bubbles with the height exceeding a preset sea depth threshold value; and performing collision detection according to the position and the speed of the bubble, and updating the attribute of the bubble after the collision.
The step (1) also comprises optimization and simplification of the obj model;
the optimization is as follows: optimizing the ratio of the established obj model adjusting data;
the simplification is to merge repeated vertexes, normals and materials in the obj model file.
The texture surrounding mode in step (1) is a repeated texture image, and the texture filtering mode is a multi-level progressively distant texture (mipmap).
The method for generating the terrain normal texture map in step (1) is: converting the texture map into a normal map using the filter plug-in of Photoshop (PS).
On the other hand, the invention discloses a system implementing the above real-time dynamic rendering method for a three-dimensional seabed environment, comprising: a terrain rendering module, an etching and light beam rendering module, and a bubble simulation module;
the terrain rendering module is used for rendering the seabed ground according to the digital elevation model data and the ground image of the seabed area;
the etching and light beam rendering module renders etching according to the water etching texture picture and renders light beams according to a light source;
the bubble simulation module is used for generating and deleting bubbles, performing collision detection according to the positions and the speeds of the bubbles, and updating the attributes of the bubbles after collision.
The terrain rendering module, the etching and beam rendering module, and the bubble simulation module are computers equipped with AMD Radeon HD8210 graphics cards.
Beneficial effects: compared with the prior art, the real-time dynamic rendering method of the three-dimensional submarine environment disclosed by the invention has the following advantages:
(1) the obtained terrain accuracy is high by modeling the real terrain elevation data;
(2) data model conversion is carried out before rendering, an obj model is loaded and the model is simplified, so that the drawing speed can be increased;
(3) generating etching-pattern textures analytically requires a large amount of integral calculation and extra workload; by converting video frames into textures instead, the invention is not only more realistic, but also saves time and improves efficiency.
Drawings
FIG. 1 is a flow chart of a method for real-time dynamic rendering of a three-dimensional subsea environment;
FIG. 2 is a schematic diagram of a three-dimensional subsea environment real-time dynamic rendering system;
FIG. 3 is a schematic illustration of etching and light beam;
fig. 4 is a flow chart of bubble simulation.
Detailed Description
The invention is further elucidated with reference to the drawings and the detailed description.
As shown in fig. 1, the invention discloses a real-time dynamic rendering method for a three-dimensional submarine environment, comprising the following steps:
step 1, rendering the seabed ground, and specifically comprises the following steps:
(1.1) establishing an obj model according to the digital elevation model data of the seabed area to be rendered;
A Digital Elevation Model (DEM) is a discrete function of elevation z with respect to the two plane-coordinate arguments x and y. The most common elevation reference is the altitude relative to sea level, or the relative height above some reference plane. The invention uses DEM data in TIF format, which preserves the elevation values to the maximum extent. From the size and spatial resolution of the TIF picture, the pixel values and their image coordinates can be fully extracted; the magnitude of a pixel value represents the elevation of the seafloor terrain, and the image coordinates represent relative position information.
An obj model is generated from the DEM data using Python, and the model is optimized using Blender. Because changes in the DEM data may be subtle, the effect in the subsequent display can be inconspicuous; the model is therefore optimized by adjusting the proportional scale of the data.
Although the obj specification is very simple, obj files can still contain geometry and mesh/texture combinations that are not always best suited to 3D rendering engines. A simple cleaning operation reduces the file size and makes model loading and rendering faster.
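As a hedged illustration of the DEM-to-obj conversion described above, the following Python sketch turns a small elevation grid into Wavefront obj text; the function name, cell size, and z-scale parameter are illustrative assumptions, not the patent's actual code:

```python
import io

def dem_to_obj(heights, cell_size=1.0, z_scale=1.0):
    """Convert a 2-D elevation grid (list of rows) to Wavefront .obj text.

    Each grid sample becomes one vertex; each grid cell is split into two
    triangles. `z_scale` plays the role of the proportional adjustment of
    the elevation data mentioned in the text.
    """
    rows, cols = len(heights), len(heights[0])
    out = io.StringIO()
    for i in range(rows):
        for j in range(cols):
            # x/z from grid position, y (up) from the elevation value.
            out.write(f"v {j * cell_size} {heights[i][j] * z_scale} {i * cell_size}\n")
    idx = lambda i, j: i * cols + j + 1  # .obj face indices are 1-based
    for i in range(rows - 1):
        for j in range(cols - 1):
            out.write(f"f {idx(i, j)} {idx(i + 1, j)} {idx(i, j + 1)}\n")
            out.write(f"f {idx(i, j + 1)} {idx(i + 1, j)} {idx(i + 1, j + 1)}\n")
    return out.getvalue()
```

A cleaning pass such as the one the text describes (merging duplicate vertices, normals, and materials) would then operate on this text before loading.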
(1.2) acquiring a ground image of the seabed area in real time by using sonar to generate a terrain normal texture map;
the object reflects light in different directions at different positions, and for each texture picture of the ground, all normal vectors on the surface of the texture picture can be calculated and processed and stored in the normal map. The corresponding normal direction is obtained by sampling from the normal map, so that the method is closer to reality. There are two main methods for making the normal map: firstly, high-mold baking is adopted, namely a high-precision model with millions of surfaces or even millions of surfaces is made, then a low mold with thousands of surfaces and even millions of surfaces is made, and the detail information of the high mold is baked on the low mold to obtain a normal map, but higher modeling skill and a great deal of energy are needed; secondly, the texture mapping is used for rotating the normal mapping, and the method is convenient and fast. The present invention can be accomplished directly in ps using the second approach, using the filter insert of ps.
(1.3) setting a texture surrounding mode and a filtering mode; loading the terrain normal texture map onto an obj model by utilizing OpenGL to realize rendering of the seabed ground;
the texture surrounding mode is a repeated texture image, and the texture filtering mode is a multi-stage gradually-distant texture.
The texture surrounding mode comprises repeated texture images, mirror image repetition, edge colors specified by a user and the like, the repeated texture images are selected, texture coordinates are constrained between (0,1), and the edge of the texture coordinates can be repeated by the excessive part. Texture maps can be linear, square, rectangular, or even three-dimensional, but when mapped onto a polygon or object surface and transformed to the screen coordinate system, it is almost impossible for individual texels on the texture to directly correspond to the final picture pixel on the screen. In this case texels are scaled up or down as required by a multi-level progressive texture (mipmap) technique. When the viewpoint changes, the texture may be filtered in advance and the filtered image stored as a continuous low resolution version in order to reduce rendering artifacts.
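The mipmap chain described above can be sketched by repeated 2x2 box-filter downsampling; this minimal Python example assumes a square, power-of-two, single-channel texture (OpenGL's `glGenerateMipmap` does the equivalent on the GPU):

```python
def build_mipmaps(texture):
    """Build a mipmap chain for a square texture (2-D list of floats,
    side length a power of two) by repeatedly averaging 2x2 blocks,
    down to a single texel."""
    chain = [texture]
    while len(chain[-1]) > 1:
        prev = chain[-1]
        size = len(prev) // 2
        chain.append([
            [(prev[2 * i][2 * j] + prev[2 * i][2 * j + 1] +
              prev[2 * i + 1][2 * j] + prev[2 * i + 1][2 * j + 1]) / 4.0
             for j in range(size)]
            for i in range(size)])
    return chain
```

At render time the filter picks the level (or blends two levels) whose texel size best matches the on-screen pixel size.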
Rendering with OpenGL mainly uses a vertex shader, a geometry shader, and a fragment shader. The procedure is as follows: set up the illumination by creating a light source and setting its attributes, including position, direction, and color, as well as brightness gain and scattering gain; load the data from the obj file and load the terrain normal texture map into the OpenGL main program; build a Mesh object and draw the seabed ground through the mesh; prepare the shader programs; finally, the vertex shader, geometry shader, and fragment shader realize the rendering of the seabed ground.
Step 2, rendering, etching and rendering the light beam, which specifically comprises the following steps:
(2.1) rendering and etching:
the difficulty of the simulated etching is to obtain the intersection point of the light and the seabed, and when the seabed is a plane, the intersection can be obtained only by using the intersection formula of the light and the seabed; when the seabed is uneven, the traditional formula has huge calculation amount, and the etching pattern can be quickly obtained by adopting a dynamic texture method, so that the calculation amount is reduced. The water etching normal line texture is projected on an object in a scene, so that real-time calculation of refraction and reflection of a large amount of light rays is omitted.
Static water etching texture pictures at n consecutive moments before the current moment of the seabed area to be rendered are acquired; the time difference between adjacent pictures in the static water etching texture pictures is smaller than a preset time interval Δt, so that the pictures are coherent.
The water etching texture pictures need to be filtered before rendering the etching, because regions where photons are relatively dispersed are undersampled, so the pictures are usually noisy. In this embodiment, GPU-based Gaussian blur, a low-pass filter, is used to reduce the noise; choosing a suitable blur radius yields a satisfactory result. A blur radius of 3 achieves a good filtering effect in this embodiment.
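GPU Gaussian blurs are typically implemented as a separable filter: one horizontal pass and one vertical pass. The following Python sketch mirrors that structure on the CPU; the sigma choice is an assumption, since the patent only specifies a radius of 3:

```python
import math

def gaussian_kernel(radius, sigma=None):
    """Normalized 1-D Gaussian kernel; sigma defaults to radius/2
    (an assumed convention, not stated in the patent)."""
    sigma = sigma or max(radius / 2.0, 1e-6)
    k = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_1d(row, kernel):
    """Convolve one row with the kernel, clamping samples at the edges."""
    r = len(kernel) // 2
    n = len(row)
    return [sum(kernel[t + r] * row[min(max(i + t, 0), n - 1)]
                for t in range(-r, r + 1)) for i in range(n)]

def gaussian_blur(image, radius=3):
    """Separable blur: horizontal pass, then vertical pass, as a GPU
    two-pass implementation would do."""
    k = gaussian_kernel(radius)
    horiz = [blur_1d(row, k) for row in image]
    cols = [blur_1d([horiz[i][j] for i in range(len(horiz))], k)
            for j in range(len(horiz[0]))]
    # Transpose the blurred columns back into row-major order.
    return [[cols[j][i] for j in range(len(cols))] for i in range(len(horiz))]
```

Separating the filter turns an O(d²) kernel per pixel into two O(d) passes, which is why the two-pass form is preferred on the GPU.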
The filtered water etching texture pictures are then converted into water etching normal texture pictures to enhance the visual effect; in this embodiment, the filter plug-in of PS is again used for this conversion.
Setting the texture surrounding mode and filtering mode: the water etching normal texture pictures are loaded onto the water area of the seabed region to be rendered using an OpenGL timer to generate texture resources. The quadrilateral position and texture-coordinate data remain unchanged while different textures are dynamically bound, producing a video-playback effect and realizing the dynamic rendering of the water etching. This is the same as step (1.3), except that water etching normal texture pictures are loaded.
(2.2) Rendering the beam: a full-screen rectangle is rendered in the OpenGL main program. In the fragment shader, the ray marches from the near clipping plane into the scene volume, and the water etching texture picture is sampled along the view direction. The sampling step length determines the simulation precision: a large step gives low precision, while a small step gives high precision but slow rendering. In this embodiment the step value is determined from the ratio of the distance between the water etching texture position and the viewpoint to the total ray length, and is limited to the range (0.01, 1).
The current sampling point is projected into light-source space, and the resulting x and y components are used as beam texture coordinates to sample the etching pattern; the x and y components are the window-relative coordinates of the current fragment. The sampled value serves as the intensity contribution of the sampling point to the fragment, exponential attenuation is applied, and the final fragment intensity is computed. The fragment intensity S(p) at position p on the ray is: S(p) = I · α(P₁, p) · β(p, e) · c;
wherein P₁ is the starting point of the ray, I is the intensity of the light at the sea surface, e is the position of the observation point, c is the scattering coefficient of the seawater, α(P₁, p) is the attenuation coefficient of the light intensity from the starting point to p, and β(p, e) is the attenuation coefficient of the light intensity from the observation point to the ray position p. Fig. 3 shows a schematic diagram of the etching and light beam.
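A minimal CPU sketch of the per-fragment accumulation described above. The exponential forms used here for the attenuation coefficients α and β, and the per-sample pseudo-distances, are assumptions for illustration; the patent does not state their exact form:

```python
import math

def beam_intensity(samples, I=1.0, c=0.02, atten=0.1):
    """Accumulate fragment intensity along a view ray.

    `samples` are caustic-texture values at successive ray positions.
    Each sample contributes I * alpha * beta * c, with alpha/beta modeled
    as exp(-atten * distance) -- an assumed exponential attenuation.
    """
    total = 0.0
    for k, caustic in enumerate(samples):
        d_light = k + 1  # pseudo-distance from the ray start to sample k
        d_eye = k + 1    # pseudo-distance from sample k back to the eye
        alpha = math.exp(-atten * d_light)
        beta = math.exp(-atten * d_eye)
        total += caustic * I * alpha * beta * c
    return total
```

In the real fragment shader this loop runs per pixel, with the caustic value fetched from the etching texture at the light-space projection of each sample point.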
Step 3: set the texture data and transparency of the bubbles. In this embodiment the bubble texture is set fully transparent, which makes the simulation more realistic.
New bubbles are generated with the OpenGL particle system and their attributes are set, the bubble attributes including: bubble position, size, lifetime, and velocity. In this embodiment the number of bubbles is set to 500, the initial bubble position is on the seabed ground, and the initial velocity is v₀; the velocity at lifetime t is given by a formula that appears only as an image in the original document.
Bubbles whose lifetime exceeds the preset lifetime threshold and bubbles whose height exceeds the preset sea-depth threshold are deleted; collision detection is performed based on bubble position and velocity, and bubble attributes are updated after collision. In this embodiment collision detection is implemented in the vertex shader. Since vertex data cannot be updated immediately, the position and velocity vectors of the bubbles are stored in two buffers, and on each frame the earlier buffer is fetched to update the rendering.
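The bubble lifecycle described in step 3 (spawn on the seabed, rise and age, delete on lifetime or depth thresholds, respawn) can be sketched as follows; the threshold values, spawn ranges, and attribute layout are illustrative assumptions, not the patent's numbers (except the population of 500):

```python
import random

SEA_DEPTH = 50.0      # preset sea-depth (surface) threshold -- assumed value
MAX_LIFETIME = 20.0   # preset lifetime threshold -- assumed value

def spawn_bubble(seabed_height=0.0):
    """New bubble starting on the seabed with randomized attributes."""
    return {
        "pos": [random.uniform(-10, 10), seabed_height, random.uniform(-10, 10)],
        "size": random.uniform(0.05, 0.3),
        "life": 0.0,
        "vel": [0.0, random.uniform(0.5, 1.5), 0.0],  # initial upward speed v0
    }

def update_bubbles(bubbles, dt, target_count=500):
    """One simulation step: age and move bubbles, delete those exceeding
    a lifetime or depth threshold, and respawn to keep the population at
    `target_count` (the embodiment uses 500 bubbles)."""
    alive = []
    for b in bubbles:
        b["life"] += dt
        b["pos"][1] += b["vel"][1] * dt  # rise along the up axis
        if b["life"] <= MAX_LIFETIME and b["pos"][1] <= SEA_DEPTH:
            alive.append(b)
    while len(alive) < target_count:
        alive.append(spawn_bubble())
    return alive
```

On the GPU this update runs in the vertex shader with the two ping-ponged position/velocity buffers the text describes; here it is a single-threaded sketch.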
Fig. 4 shows the flow chart of the bubble simulation. Rasterization determines the screen space covered by a given portion of geometry; once the screen-space information and input vertex data are available, the rasterization unit linearly interpolates each variable used by the fragment shader and passes the result values on. The fragment shader mainly sets the color of the bubble, including the transparency of that color; the transparency is set mainly according to brightness and scattering, where the initial brightness is the color of the bubble texture picture and varies within a certain range as the bubble changes.
As shown in fig. 2, a rendering system implementing the real-time dynamic rendering method for a three-dimensional subsea environment comprises: a terrain rendering module 1, an etching and light beam rendering module 2, and a bubble simulation module 3. The terrain rendering module renders the seabed ground from the digital elevation model data and the ground image of the seabed area; the etching and light beam rendering module renders the etching from the water etching texture pictures and renders light beams according to the light source; the bubble simulation module generates and deletes bubbles, performs collision detection based on bubble position and velocity, and updates bubble attributes after collision. In this embodiment, the terrain rendering module, the etching and beam rendering module, and the bubble simulation module are computers equipped with AMD Radeon HD8210 graphics cards, configured as follows:
CPU: AMD E1-2100 APU with Radeon(TM) HD Graphics, 1.00 GHz;
Memory: 2 GB;
Graphics card: AMD Radeon HD8210;
Video memory: 1881 MB.

Claims (6)

1. A real-time dynamic rendering method for a three-dimensional submarine environment is characterized by comprising the following steps:
(1) establishing an obj model according to the digital elevation model data of the submarine area to be rendered;
acquiring a ground image of the submarine area in real time to generate a terrain normal texture map;
setting a texture surrounding mode and a filtering mode; loading the terrain normal texture map onto an obj model by utilizing OpenGL to realize rendering of the seabed ground;
(2) rendering the etching: acquiring static water etching texture pictures at n consecutive moments before the current moment for the submarine area to be rendered, wherein the time difference between adjacent pictures in the static water etching texture pictures is smaller than a preset time interval Δt;
setting a texture surrounding mode and a filtering mode; loading the water etching normal line texture picture to a water area part of a seabed area to be rendered by utilizing an OpenGL timer to realize dynamic rendering of water etching;
rendering the light beam: rendering a full-screen rectangle in the OpenGL main program; in the fragment shader, the ray marches from the near clipping plane into the scene volume, the water etching texture picture is sampled along the view direction, the current sampling point is projected into light-source space, and the resulting x and y components are used as beam texture coordinates to sample the etching pattern; the x and y components are the window-relative coordinates of the current fragment; the sampled value serves as the intensity contribution of the sampling point to the fragment, exponential attenuation is applied, and the final fragment intensity is computed; the fragment intensity S(p) at position p on the ray is: S(p) = I · α(P₁, p) · β(p, e) · c;
wherein P₁ is the starting point of the ray, I is the intensity of the light at the sea surface, e is the position of the observation point, c is the scattering coefficient of the seawater, α(P₁, p) is the attenuation coefficient of the light intensity from the starting point to p, and β(p, e) is the attenuation coefficient of the light intensity from the observation point to the ray position p;
(3) setting texture data and transparency of the bubbles; generating new bubbles and setting bubble attributes, the bubble attributes comprising: bubble position, size, lifetime, velocity; deleting the bubbles with the lifetime exceeding a preset lifetime threshold value and the bubbles with the height exceeding a preset sea depth threshold value; and performing collision detection according to the position and the speed of the bubble, and updating the attribute of the bubble after the collision.
2. The method for real-time dynamic rendering of a three-dimensional seabed environment according to claim 1, wherein the step (1) further comprises optimization and simplification of an obj model;
the optimization is as follows: optimizing the ratio of the established obj model adjusting data;
the simplification is to merge repeated vertexes, normals and materials in the obj model file.
3. The method for real-time dynamic rendering of a three-dimensional seabed environment as claimed in claim 1, wherein the texture surrounding mode in step (1) is a repeated texture image, and the texture filtering mode is a multi-level progressively distant texture (mipmap).
4. The real-time dynamic rendering method for the three-dimensional seabed environment according to claim 1, wherein the method for generating the terrain normal texture map in step (1) comprises: converting the texture map into a normal map using the filter plug-in of PS.
5. A three-dimensional subsea environment real-time dynamic rendering system, comprising: the device comprises a terrain rendering module, an etching and light beam rendering module and a bubble simulation module;
the terrain rendering module is used for rendering the seabed ground according to the digital elevation model data and the ground image of the seabed area;
the etching and light beam rendering module renders etching according to the water etching texture picture and renders light beams according to a light source;
the bubble simulation module is used for generating and deleting bubbles, performing collision detection according to the positions and the speeds of the bubbles, and updating the attributes of the bubbles after collision.
6. The system of claim 5, wherein the terrain rendering module, the etching and beam rendering module, and the bubble simulation module are computers equipped with AMD Radeon HD8210 graphics cards.
CN201910932100.0A 2019-09-29 2019-09-29 Real-time dynamic rendering method and system for three-dimensional submarine environment Active CN110706325B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910932100.0A CN110706325B (en) 2019-09-29 2019-09-29 Real-time dynamic rendering method and system for three-dimensional submarine environment


Publications (2)

Publication Number Publication Date
CN110706325A true CN110706325A (en) 2020-01-17
CN110706325B CN110706325B (en) 2022-12-30

Family

ID=69197148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910932100.0A Active CN110706325B (en) 2019-09-29 2019-09-29 Real-time dynamic rendering method and system for three-dimensional submarine environment

Country Status (1)

Country Link
CN (1) CN110706325B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111862291A (en) * 2020-07-10 2020-10-30 完美世界(北京)软件科技发展有限公司 Aqueous baking method and apparatus, storage medium, and electronic apparatus
CN112907720A (en) * 2021-02-08 2021-06-04 中国海洋大学 Sea ice data visualization method and device for realistic rendering
CN113298918A (en) * 2020-02-24 2021-08-24 广东博智林机器人有限公司 Different color display method and device for overlapped area
CN114898026A (en) * 2022-05-10 2022-08-12 北京领为军融科技有限公司 Dynamic loading and unloading method for landscape based on position and sight

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103577656A (en) * 2013-11-25 2014-02-12 哈尔滨工业大学 Three-dimensional dynamic simulation method for water outlet process of submarine-launched missiles
CN104200506A (en) * 2014-08-04 2014-12-10 广东威创视讯科技股份有限公司 Method and device for rendering three-dimensional GIS mass vector data
CN105279782A (en) * 2015-07-02 2016-01-27 苏州蜗牛数字科技股份有限公司 Simulation and rendering method of real-time sea system
CN107918949A (en) * 2017-12-11 2018-04-17 网易(杭州)网络有限公司 Rendering intent, storage medium, processor and the terminal of virtual resource object

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN103577656A (en) * 2013-11-25 2014-02-12 哈尔滨工业大学 Three-dimensional dynamic simulation method for water outlet process of submarine-launched missiles
CN104200506A (en) * 2014-08-04 2014-12-10 广东威创视讯科技股份有限公司 Method and device for rendering three-dimensional GIS mass vector data
CN105279782A (en) * 2015-07-02 2016-01-27 苏州蜗牛数字科技股份有限公司 Simulation and rendering method of real-time sea system
CN107918949A (en) * 2017-12-11 2018-04-17 网易(杭州)网络有限公司 Rendering intent, storage medium, processor and the terminal of virtual resource object

Non-Patent Citations (2)

Title
LIU Ding: "Research on Real-time Rendering Algorithms for Multiple Elements in Virtual Seabed Scenes", China Master's Theses Full-text Database (Information Science and Technology) *
CAO Hongfei: "Research on Simulation Technology for Taihu Lake Water Scenes", China Master's Theses Full-text Database (Information Science and Technology) *

Cited By (7)

Publication number Priority date Publication date Assignee Title
CN113298918A (en) * 2020-02-24 2021-08-24 广东博智林机器人有限公司 Different color display method and device for overlapped area
CN113298918B (en) * 2020-02-24 2022-12-27 广东博智林机器人有限公司 Different color display method and device for overlapped area
CN111862291A (en) * 2020-07-10 2020-10-30 完美世界(北京)软件科技发展有限公司 Aqueous baking method and apparatus, storage medium, and electronic apparatus
CN111862291B (en) * 2020-07-10 2024-01-09 完美世界(北京)软件科技发展有限公司 Baking method and device for water system, storage medium, and electronic device
CN112907720A (en) * 2021-02-08 2021-06-04 中国海洋大学 Sea ice data visualization method and device for realistic rendering
CN112907720B (en) * 2021-02-08 2022-08-05 中国海洋大学 Sea ice data visualization method and device for realistic rendering
CN114898026A (en) * 2022-05-10 2022-08-12 北京领为军融科技有限公司 Dynamic loading and unloading method for landscape based on position and sight

Also Published As

Publication number Publication date
CN110706325B (en) 2022-12-30

Similar Documents

Publication Publication Date Title
CN110706325B (en) Real-time dynamic rendering method and system for three-dimensional submarine environment
US10290142B2 (en) Water surface rendering in virtual environment
EP3080781B1 (en) Image rendering of laser scan data
Ritschel et al. Micro-rendering for scalable, parallel final gathering
WO2017206325A1 (en) Calculation method and apparatus for global illumination
US10049486B2 (en) Sparse rasterization
JPH0778267A (en) Method for display of shadow and computer-controlled display system
CN114820906B (en) Image rendering method and device, electronic equipment and storage medium
JP6885693B2 (en) Graphics processing system
CN108805971B (en) Ambient light shielding method
EP2410492A2 (en) Optimal point density using camera proximity for point-based global illumination
EP3211601B1 (en) Rendering the global illumination of a 3d scene
US6791544B1 (en) Shadow rendering system and method
CN103700134A (en) Three-dimensional vector model real-time shadow deferred shading method based on controllable texture baking
JP4584956B2 (en) Graphics processor and drawing processing method
Chiu et al. GPU-based ocean rendering
KR101118597B1 (en) Method and System for Rendering Mobile Computer Graphic
JPH10162161A (en) Efficient rendering that uses user-defined room and window
KR101208826B1 (en) Real time polygonal ambient occlusion method using contours of depth texture
Olajos Real-time rendering of volumetric clouds
Seng et al. Realistic real-time rendering of 3D terrain scenes based on OpenGL
CN116993894B (en) Virtual picture generation method, device, equipment, storage medium and program product
US20180005432A1 (en) Shading Using Multiple Texture Maps
EP3940651A1 (en) Direct volume rendering apparatus
Gün Interactive editing of complex terrains on parallel graphics architectures

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant