CN112233214B - Snow scene rendering method, device and equipment for large scene and storage medium - Google Patents
- Publication number
- CN112233214B (application CN202011103832.8A)
- Authority
- CN
- China
- Prior art keywords
- particle
- scene
- preset
- rendering
- ambient light
- Prior art date: 2020-10-15
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
Abstract
The application relates to a snow scene rendering method, device, equipment and storage medium for a large scene, wherein the method comprises the following steps: acquiring a depth texture and a normal texture of a scene to be rendered; calculating a screen-space ambient occlusion amount according to the depth texture and the normal texture; rendering accumulated snow in the scene to be rendered by utilizing the ambient occlusion amount; and performing snowfall effect rendering with a GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle, the preset particle movement rule being to control particle movement according to real-time particle offset coordinates. The ambient occlusion technique improves the realism of the accumulated snow effect, and the GPU particle system improves the naturalness of the snowfall effect.
Description
Technical Field
The application relates to the technical field of computer graphics, in particular to a snow scene rendering method, device and equipment for a large scene and a storage medium.
Background
With the development of computer graphics, techniques for rendering natural scenes realistically have become increasingly mature. Natural scenes differ from one geographic location to another and change throughout the year. Snow scenes, for example, are natural scenes common in northern areas, and snow scene rendering is required in more and more applications. With the development of software technology, the requirements on the realism of snow scene rendering are also increasing.
Most existing snow scene rendering techniques simply fill upward-facing surfaces with white or paste a snow map onto them, and realize snowfall with a full-scene particle system, or even directly in screen space. On the one hand the resulting snow scene is very stiff, lacking any stereoscopic or layered appearance; on the other hand, a full-scene particle system poses no problem in a small scene, but in a large scene it occupies a large amount of memory, sometimes more than can even be allocated, and it also greatly reduces rendering efficiency.
Disclosure of Invention
In view of the above, the present application aims to overcome the defects of the prior art and to provide a snow scene rendering method, device, equipment and storage medium for large scenes.
In order to achieve the above purpose, the application adopts the following technical scheme:
a snowscape rendering method of a large scene, comprising:
acquiring depth textures and normal textures of a scene to be rendered;
calculating the screen space ambient light shielding amount according to the depth texture and the normal texture;
rendering snow in the scene to be rendered by utilizing the ambient light shielding amount;
utilizing a GPU particle system to combine with a preset particle movement rule and performing snowfall effect rendering according to movement of a user visual angle; the preset particle movement rule is to control particle movement according to the real-time particle offset coordinates.
Optionally, the controlling the particle movement according to the real-time particle offset coordinates includes:
calculating a particle offset coordinate according to a preset offset formula, wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the preset cube side length, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system; wherein vCameraGridPosition is calculated according to the formula vCameraGridPosition = mod(vEye, fExtent);
and controlling the movement of the particles according to the offset coordinates.
Optionally, the calculating the screen-space ambient occlusion amount according to the depth texture and the normal texture includes:
calculating view coordinates and view normals of all pixel points in the scene to be rendered according to the depth texture and the normal texture;
calculating a sampling radius according to the depth texture;
selecting a set number of sampling points on a plurality of concentric circles centered at the view coordinate of a set pixel point, with the sampling radius as the radius;
calculating an ambient occlusion contribution value of each sampling point to the set pixel point by combining the view coordinates and the view normals;
and carrying out a weighted-average calculation on the ambient occlusion contribution values to obtain the screen-space ambient occlusion amount.
Optionally, the calculating a sampling radius according to the depth texture includes:
obtaining a pixel depth by decoding the depth texture;
and performing a projection operation on the pixel depth to obtain the sampling radius.
Optionally, the sampling radius is smaller than a preset maximum sampling radius.
Optionally, the method further comprises:
performing fog filling on the scene to be rendered in a set range; the set range has no intersection with the snowfall range.
A snow scene rendering device for a large scene, comprising:
the texture acquisition module is used for acquiring depth textures and normal textures of the scene to be rendered;
an ambient occlusion calculation module, configured to calculate a screen-space ambient occlusion amount from the depth texture and the normal texture;
a snow rendering module, configured to render accumulated snow in the scene to be rendered by utilizing the ambient occlusion amount;
a snowfall rendering module, configured to perform snowfall effect rendering with a GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle; the preset particle movement rule is to control particle movement according to real-time particle offset coordinates.
Optionally, the snowfall rendering module includes:
an offset coordinate calculation unit, configured to calculate a particle offset coordinate according to a preset offset formula; wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the side length of the preset cube, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system; wherein vCameraGridPosition is calculated according to the formula vCameraGridPosition = mod(vEye, fExtent);
and the particle moving unit is used for controlling the movement of the particles according to the offset coordinates.
Snow scene rendering equipment for a large scene, comprising:
a processor, and a memory coupled to the processor;
the memory is used for storing a computer program, and the computer program is at least used for executing the snow scene rendering method of the large scene;
the processor is configured to invoke and execute the computer program in the memory.
A storage medium storing a computer program which, when executed by a processor, implements the steps of the snow scene rendering method for a large scene described above.
The technical scheme provided by the application can comprise the following beneficial effects:
the application discloses a snow scene rendering method of a large scene, which comprises the following steps: acquiring depth textures and normal textures of a scene to be rendered, calculating the ambient light shielding amount of a screen space through the depth textures and the normal textures, rendering snow in the scene by using the ambient light shielding amount, and then performing snowfall effect rendering according to movement of a user viewing angle by using a GPU particle system in combination with a preset particle movement rule; the preset particle movement rule is to control the particle movement according to the real-time particle offset coordinates. According to the snow scene rendering method, the depth texture and the normal texture are utilized to calculate the ambient light shielding amount of the screen space, snow rendering is carried out by utilizing the ambient light shielding amount, so that a vivid snow effect is generated, meanwhile, the GPU particle system and a preset particle movement rule are utilized to render the snow scene, wherein the movement of particles is controlled according to the real-time particle offset coordinates during particle movement, so that a more natural snow effect in the scene browsing process is realized, and the sense of reality of snow and snow in the scene is greatly improved.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a snow scene rendering method for a large scene according to an embodiment of the present application;
FIG. 2 is a block diagram of a large scene snowscape rendering apparatus according to an embodiment of the present application;
fig. 3 is a block diagram of a snowscape rendering apparatus for a large scene according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail below. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of protection of the present application.
Fig. 1 is a flowchart of a snow scene rendering method of a large scene according to an embodiment of the present application. Referring to fig. 1, a snow scene rendering method of a large scene includes:
step 101: and acquiring the depth texture and the normal texture of the scene to be rendered. Specific: and rendering the whole scene by using a multi-texture technology to obtain self-luminous textures, depth textures and normal textures of the screen space of the whole scene.
Step 102: calculating a screen-space ambient occlusion amount according to the depth texture and the normal texture;
Step 103: rendering accumulated snow in the scene to be rendered by using the ambient occlusion amount;
Step 104: performing snowfall effect rendering with a GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle; the preset particle movement rule is to control particle movement according to real-time particle offset coordinates.
In this method, the depth texture and the normal texture are used to calculate the ambient occlusion amount, which is then used for snow rendering, producing a vivid accumulated snow effect. At the same time, the GPU particle system combined with the preset particle movement rule performs the snowfall effect rendering, where the rule controls particle movement according to real-time particle offset coordinates, so that the user sees a more natural snowfall effect while browsing the scene, improving the user experience.
On the basis of the above embodiment, the application further discloses the specific process of calculating the ambient occlusion amount, which comprises: calculating view coordinates and view normals of all pixel points in the scene to be rendered according to the depth texture and the normal texture; calculating a sampling radius according to the depth texture; selecting a set number of sampling points on a plurality of concentric circles centered at the view coordinate of a set pixel point, with the sampling radius as the radius; calculating an ambient occlusion contribution value of each sampling point to the set pixel point by combining the view coordinates and the view normals; and carrying out a weighted-average calculation on the ambient occlusion contribution values to obtain the screen-space ambient occlusion amount. The calculation of the sampling radius further comprises: obtaining a pixel depth by decoding the depth texture, and performing a projection operation on the pixel depth to obtain the sampling radius.
For each pixel point in the whole scene, the depth texture and the normal texture are combined and the view coordinate and view normal of the pixel are calculated through a projection matrix. The projection matrix is computed in advance from the relevant parameters: screen height, screen width and viewing angle. Sampling points are then placed on a plurality of concentric circles centered at the view coordinate of the current pixel with a suitable distance as the radius; for each sampling point an ambient occlusion contribution to the current pixel is calculated, the contributions of all sampling points are collected, and finally the contributions are weighted-averaged and used as the final ambient occlusion amount of the current pixel, simulating the ambient occlusion effect.
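As a concrete illustration of this step, the following Python sketch reconstructs a view-space position from a pixel depth, places sampling points on concentric circles around the current pixel, and combines per-sample occlusion contributions into a weighted average. It is a minimal sketch: the field-of-view parametrization in view_position and the normal-weighted, distance-attenuated contribution in ambient_occlusion are common screen-space ambient occlusion choices assumed here, not the exact formulas of the patent.

```python
import numpy as np

def view_position(px, py, depth, width, height, fov_y):
    """Unproject pixel (px, py) with view-space depth `depth` to view coordinates.
    The vertical-field-of-view parametrization is an assumption; the patent only
    states that the projection matrix is precomputed from screen height, width
    and viewing angle."""
    aspect = width / height
    ndc_x = (px + 0.5) / width * 2.0 - 1.0        # normalized device x in [-1, 1]
    ndc_y = 1.0 - (py + 0.5) / height * 2.0       # normalized device y in [-1, 1]
    tan_half = np.tan(0.5 * fov_y)
    return np.array([ndc_x * aspect * tan_half * depth,
                     ndc_y * tan_half * depth,
                     -depth])

def circle_sample_pixels(px, py, radius, rings=3, per_ring=6):
    """Sampling points on several concentric circles centred on the current pixel."""
    points = []
    for r in range(1, rings + 1):
        ring_radius = radius * r / rings
        for k in range(per_ring):
            angle = 2.0 * np.pi * (k + 0.5 * r) / per_ring   # rotate rings against each other
            points.append((px + ring_radius * np.cos(angle),
                           py + ring_radius * np.sin(angle)))
    return points

def ambient_occlusion(p, n, sample_positions, sample_weights):
    """Weighted average of per-sample occlusion contributions for one pixel.
    A sample occludes more when it lies above the surface along the normal and
    close to the pixel (a common SSAO heuristic, assumed here)."""
    total, weight_sum = 0.0, 0.0
    for q, w in zip(sample_positions, sample_weights):
        v = q - p
        dist = np.linalg.norm(v) + 1e-6
        contribution = max(float(np.dot(n, v / dist)), 0.0) / (1.0 + dist)
        total += w * contribution
        weight_sum += w
    return total / max(weight_sum, 1e-6)
```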
It should be noted that during the ambient occlusion calculation the sampling circle of each pixel should be stationary with respect to the scene, i.e. nearby pixels should be sampled with a larger radius and distant pixels with a smaller one. However, computing the sampling radius dynamically introduces a new problem: the sampling radius of pixels very close to the camera becomes very large, which is highly detrimental to the occlusion calculation. An overly large sampling radius scatters the sampling points, increasing the uncertainty of the final result and producing a flickering occlusion effect; since ambient occlusion is a static effect, such flickering is naturally undesirable. Therefore a maximum sampling radius limit is added on top of the dynamically calculated radius, preserving the layering and realism of the ambient occlusion effect to the greatest extent while ensuring the effect does not flicker.
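One way to realize this behaviour is to project a fixed world-space radius into screen space and clamp the result, as in the sketch below; the proj_scale relation and the parameter names are assumptions, since the patent only states that a projection operation is applied to the pixel depth and that a maximum sampling radius is imposed.

```python
import math

def screen_sampling_radius(pixel_depth, world_radius, screen_height, fov_y, max_radius):
    """Screen-space sampling radius for one pixel.

    A fixed world-space radius appears larger for nearby pixels (small depth)
    and smaller for distant ones, keeping the sampling circle stationary with
    respect to the scene; the clamp to `max_radius` prevents the scattered
    samples and flickering that an overly large radius would cause."""
    proj_scale = screen_height / (2.0 * math.tan(0.5 * fov_y))   # assumed projection factor
    radius = world_radius * proj_scale / max(pixel_depth, 1e-4)
    return min(radius, max_radius)

# Example: the same 0.5 m world radius sampled at different depths (1080p, 60-degree FOV).
for depth in (2.0, 20.0, 200.0):
    print(depth, screen_sampling_radius(depth, 0.5, 1080, math.radians(60.0), 64.0))
```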
Besides the sampling radius, the ambient occlusion of a large scene must also filter out irrelevant sampling points. For example, the depth written by the sky box during scene rendering is not a conventional depth value, so no pixel should be occluded by the sky box and the sky box itself should not be occluded; the occlusion amount is therefore set to 0 for all pixels and sampling points identified as sky box. Meanwhile, plants are rendered as billboards, so their normal information is unreliable; however, the plant texture is a composited texture that already contains the result of the illumination calculation. The normals of the brighter parts are therefore made to face upward, and a good billboard snow effect is obtained through parameter adjustment.
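A minimal sketch of this filtering step is given below, assuming the sky box is flagged by a far-plane depth value and that bright billboard (plant) texels are detected by a luminance threshold; both the flag value and the threshold are illustrative assumptions.

```python
import numpy as np

def filter_ao_inputs(depth, normal, color, sky_depth=1.0, bright_threshold=0.7):
    """Return a per-pixel occlusion mask and adjusted normals.

    Pixels whose depth matches the sky box flag receive zero occlusion, and
    bright billboard (plant) pixels have their unreliable normals replaced by
    an upward-facing normal so they still receive a snow layer."""
    occlusion_mask = np.where(depth >= sky_depth, 0.0, 1.0)    # sky box: no occlusion
    luminance = color @ np.array([0.299, 0.587, 0.114])        # per-pixel brightness
    bright = luminance > bright_threshold
    adjusted_normal = normal.copy()
    adjusted_normal[bright] = np.array([0.0, 1.0, 0.0])        # bright texels face up
    return occlusion_mask, adjusted_normal

# Example with a tiny 2x2 frame: one sky pixel and one bright plant pixel.
depth = np.array([[1.0, 0.4], [0.3, 0.6]])
normal = np.tile(np.array([0.0, 0.0, 1.0]), (2, 2, 1))
color = np.array([[[0.5, 0.7, 0.9], [0.9, 0.9, 0.8]],
                  [[0.2, 0.3, 0.2], [0.1, 0.2, 0.1]]])
mask, normals = filter_ao_inputs(depth, normal, color)
print(mask)
print(normals)
```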
In the above embodiments the ambient occlusion technique is applied to accumulated snow rendering, and calculating and utilizing the ambient occlusion amount greatly improves the realism of the snow.
In more detail, the application also discloses the specific process of the preset particle movement rule, namely controlling particle movement according to the real-time particle offset coordinates, which comprises: calculating a particle offset coordinate according to a preset offset formula, wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the preset cube side length, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system, calculated according to the formula vCameraGridPosition = mod(vEye, fExtent); and controlling particle movement according to the offset coordinate.
In this embodiment, snowfall takes place inside a cubic box near the camera (the user's viewing angle) whose side length is a certain distance, for example 300 m. Once the camera moves, some snowflake particles fall outside the box while another part of the box becomes empty, so the particles that left the box only need to be "replenished" into the blank area. When this algorithm is applied, the whole world scene is first divided into an unlimited number of adjacent cubes whose side length is fExtent; the chosen origin lies on a vertex of one of the cubes, the position of the camera is denoted vEye, and the center of the particle system is always the camera position. The cube closest to the origin with non-negative vertex coordinates is taken as the reference cube, and every particle of the whole world scene has a position corresponding to, and fixed within, that cube.
The particles within the camera range where snow scene rendering is required are assumed to be distributed in a cube centered at vEye with side length fExtent. As this cube moves with the camera, a particle may jump into another cube of the world division, where its position in the new cube is the same as its position in the previous one; that is, the position of a particle inside a cube is fixed relative to that cube, i.e. the local coordinates of the particle stay the same. Thus, no matter where the camera moves in the world, each particle automatically follows the cube and sits at its predetermined position within a cube of the world division. Let the random position of a particle within a cube be vOffset, i.e. the local coordinate of the particle relative to the cube, and let the position of the camera within its world-division cube be vCameraGridPosition; from these, the offset coordinate of the particle relative to the particle system can be calculated.
In this method, the particles of the particle system move along with the user's viewing angle. After the particles move, the position of a particle within its cube, the position of that cube within the world scene, and the position of the camera within its cube must be determined; the offset coordinate of the particle is derived from these three pieces of position information. Once the offset coordinate of a particle relative to the particle system is obtained, the particle can be placed directly at the position corresponding to that offset coordinate, so that the snow scene follows the movement of the user's viewing angle.
This algorithm realizes the self-scheduling of the particle system surrounding the camera in a large scene, so that the snowfall effect is expressed naturally during browsing.
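The exact offset formula in the patent is given as an image that is not reproduced in this text, so the Python sketch below shows one plausible wrap-around form that is consistent with the variables the text does define (vOffset, vEye, fExtent, and vCameraGridPosition = mod(vEye, fExtent)); treat the precise expression as an assumption.

```python
import numpy as np

def particle_world_position(v_offset, v_eye, f_extent):
    """Place a particle with fixed grid-local offset `v_offset` inside the cube
    of side `f_extent` centred on the camera position `v_eye`.

    The wrap keeps each particle at the same local coordinate of whatever
    world-grid cube it currently occupies, so particles leaving the camera cube
    on one side reappear ("are replenished") on the opposite side."""
    v_camera_grid_position = np.mod(v_eye, f_extent)   # camera position inside its grid cube
    wrapped = np.mod(v_offset - v_camera_grid_position + 0.5 * f_extent, f_extent) - 0.5 * f_extent
    return v_eye + wrapped

# Example: the same particle before and after a 450 m camera move along x.
offset = np.array([10.0, 5.0, 20.0])
p0 = particle_world_position(offset, np.array([0.0, 0.0, 0.0]), 300.0)
p1 = particle_world_position(offset, np.array([450.0, 0.0, 0.0]), 300.0)
print(p0, p1)   # both are congruent to the offset modulo fExtent and stay near the camera
```

In a GPU particle system this computation would typically run per particle in the vertex stage, with the camera position and cube side length supplied as uniforms, so no per-frame particle data needs to be uploaded.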
Further, on the basis of the foregoing embodiment, the method of the present application further includes:
step 105: performing fog filling on the scene to be rendered in a set range; the set range has no intersection with the snowfall range. Since snowfall only covers around the user's view, the user's view is far away and the detail is lost, so the far away is filled with fog, so the entire scene is full of the feeling of snowing. Meanwhile, the mist effect hardly affects the memory, the video memory and the performance of the webpage.
The application realizes snow scene rendering with deferred rendering and a GPU particle system. It occupies almost no main memory and only about 20 MB of video memory (depending on the viewport size; a 1080P viewport is taken as the example here), which is almost negligible compared with large scene data. Because almost all of the special-effect computation is performed on the graphics card, scene rendering efficiency is essentially unaffected.
The embodiment of the application also provides a snow scene rendering device of the large scene. Please see the examples below.
Fig. 2 is a block diagram of a snowscape rendering device for a large scene according to an embodiment of the present application. Referring to fig. 2, a snow scene rendering apparatus of a large scene includes:
a texture acquisition module 201, configured to acquire a depth texture and a normal texture of a scene to be rendered;
an ambient light occlusion amount calculation module 202 for calculating a screen space ambient light occlusion amount from the depth texture and the normal texture;
the snow rendering module 203 is configured to render accumulated snow in the scene to be rendered by using the ambient occlusion amount;
the snowfall rendering module 204 is configured to perform snowfall effect rendering with the GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle; the preset particle movement rule is to control particle movement according to real-time particle offset coordinates.
In more detail, the snowfall rendering module 204 includes: an offset coordinate calculation unit, configured to calculate a particle offset coordinate according to a preset offset formula; wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the preset cube side length, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system; wherein vCameraGridPosition is calculated according to the formula vCameraGridPosition = mod(vEye, fExtent);
and the particle moving unit is used for controlling the movement of the particles according to the offset coordinates.
In more detail, the ambient occlusion calculation module 202 is specifically configured to: calculate view coordinates and view normals of all pixel points in the scene to be rendered according to the depth texture and the normal texture; calculate a sampling radius according to the depth texture; select a set number of sampling points on a plurality of concentric circles centered at the view coordinate of a set pixel point, with the sampling radius as the radius; calculate an ambient occlusion contribution value of each sampling point to the set pixel point by combining the view coordinates and the view normals; and carry out a weighted-average calculation on the ambient occlusion contribution values to obtain the screen-space ambient occlusion amount.
The device greatly improves the realism of the accumulated snow effect by utilizing the ambient occlusion technique, while the GPU particle system makes the snowfall effect more natural.
On the basis of the above embodiment, the device of the present application further includes: a fog filling module 205, configured to perform fog filling on the scene to be rendered within a set range; the set range has no intersection with the snowfall range.
By adding the fog effect in this embodiment, the whole scene can be filled with the feeling of snowing, enhancing the realism of the scene rendering.
In order to more clearly introduce a hardware system for implementing the embodiment of the application, the embodiment of the application also provides a snow scene rendering device of a large scene, which corresponds to the snow scene rendering method of the large scene provided by the embodiment of the application. Please see the examples below.
Fig. 3 is a block diagram of a snowscape rendering apparatus for a large scene according to an embodiment of the present application. Referring to fig. 3, a snow scene rendering apparatus of a large scene includes:
a processor 301 and a memory 302 connected to the processor 301;
the memory 302 is configured to store a computer program, where the computer program is configured to at least perform the above-described snow-scene rendering method of a large scene;
the processor 301 is used to invoke and execute computer programs in the memory 302.
Meanwhile, the embodiment also discloses a storage medium, wherein the storage medium stores a computer program, and when the computer program is executed by a processor, each step in the large-scene snow scene rendering method is realized.
It is to be understood that the same or similar parts in the above embodiments may be referred to each other, and that in some embodiments, the same or similar parts in other embodiments may be referred to.
It should be noted that in the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present application, unless otherwise indicated, the meaning of "plurality" means at least two.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.
Claims (8)
1. A snow scene rendering method for a large scene, comprising:
acquiring depth textures and normal textures of a scene to be rendered;
calculating a screen-space ambient occlusion amount according to the depth texture and the normal texture;
rendering accumulated snow in the scene to be rendered by utilizing the ambient occlusion amount;
performing snowfall effect rendering with a GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle; the preset particle movement rule is to control particle movement according to real-time particle offset coordinates;
the controlling the movement of the particles according to the real-time particle shift coordinates comprises:
calculating a particle offset coordinate according to a preset offset formula, wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the preset cube side length, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system; wherein vCameraGridPosition is calculated according to the formula vCameraGridPosition = mod(vEye, fExtent), where vEye is the position of the preset camera;
and controlling the movement of the particles according to the offset coordinates.
2. The method of claim 1, wherein said calculating a screen-space ambient occlusion amount from said depth texture and said normal texture comprises:
calculating view coordinates and view normals of all pixel points in the scene to be rendered according to the depth texture and the normal texture;
calculating a sampling radius according to the depth texture;
selecting a set number of sampling points on a plurality of concentric circles centered at the view coordinate of a set pixel point, with the sampling radius as the radius;
calculating an ambient occlusion contribution value of each sampling point to the set pixel point by combining the view coordinates and the view normals;
and carrying out a weighted-average calculation on the ambient occlusion contribution values to obtain the screen-space ambient occlusion amount.
3. The method of claim 2, wherein said computing a sampling radius from said depth texture comprises:
obtaining a pixel depth by decoding the depth texture;
and performing a projection operation on the pixel depth to obtain the sampling radius.
4. The method of claim 2, wherein the sampling radius is less than a preset maximum sampling radius.
5. The method as recited in claim 1, further comprising:
performing fog filling on the scene to be rendered in a set range; the set range has no intersection with the snowfall range.
6. A snow scene rendering device for a large scene, comprising:
the texture acquisition module is used for acquiring depth textures and normal textures of the scene to be rendered;
an ambient occlusion calculation module, configured to calculate a screen-space ambient occlusion amount from the depth texture and the normal texture;
a snow rendering module, configured to render accumulated snow in the scene to be rendered by utilizing the ambient occlusion amount;
a snowfall rendering module, configured to perform snowfall effect rendering with a GPU particle system combined with a preset particle movement rule, following the movement of the user's viewing angle; the preset particle movement rule is to control particle movement according to real-time particle offset coordinates;
the snowfall rendering module includes:
an offset coordinate calculation unit, configured to calculate a particle offset coordinate according to a preset offset formula; wherein vParticalPosition is the offset coordinate of the particle, vOffset is the preset local position of the particle in a preset local coordinate system, fExtent is the preset cube side length, and vCameraGridPosition is the world position of the user's viewing angle in a preset world coordinate system; wherein vCameraGridPosition is calculated according to the formula vCameraGridPosition = mod(vEye, fExtent), where vEye is the position of the preset camera;
and the particle moving unit is used for controlling the movement of the particles according to the offset coordinates.
7. Snow scene rendering equipment for a large scene, comprising:
a processor, and a memory coupled to the processor;
the memory is used for storing a computer program at least for executing the snow scene rendering method of a large scene according to any of claims 1-5;
the processor is configured to invoke and execute the computer program in the memory.
8. A storage medium storing a computer program which, when executed by a processor, implements the steps of the snow scene rendering method for a large scene as claimed in any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011103832.8A CN112233214B (en) | 2020-10-15 | 2020-10-15 | Snow scene rendering method, device and equipment for large scene and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011103832.8A CN112233214B (en) | 2020-10-15 | 2020-10-15 | Snow scene rendering method, device and equipment for large scene and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112233214A CN112233214A (en) | 2021-01-15 |
CN112233214B (en) | 2023-11-28
Family
ID=74117317
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011103832.8A Active CN112233214B (en) | 2020-10-15 | 2020-10-15 | Snow scene rendering method, device and equipment for large scene and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112233214B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115423914A (en) * | 2021-05-31 | 2022-12-02 | 北京字跳网络技术有限公司 | Particle rendering method and device |
CN113593052B (en) * | 2021-08-06 | 2022-04-29 | 贝壳找房(北京)科技有限公司 | Scene orientation determining method and marking method |
CN117934784B (en) * | 2024-03-25 | 2024-06-07 | 山东捷瑞数字科技股份有限公司 | Method and system for constructing and enhancing natural vision based on graphics |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104143208A (en) * | 2013-05-12 | 2014-11-12 | 哈尔滨点石仿真科技有限公司 | Real-time rendering method for large-scale realistic snowscape |
CN106293565A (en) * | 2015-06-05 | 2017-01-04 | 福建星网视易信息系统有限公司 | A kind of analog information method and device based on particle motioning models |
CN107730578A (en) * | 2017-10-18 | 2018-02-23 | 广州爱九游信息技术有限公司 | The rendering intent of luminous environment masking figure, the method and apparatus for generating design sketch |
CN108805971A (en) * | 2018-05-28 | 2018-11-13 | 中北大学 | A kind of ambient light masking methods |
CN110097624A (en) * | 2019-05-07 | 2019-08-06 | 洛阳众智软件科技股份有限公司 | Generate the method and device of three-dimensional data LOD simplified model |
- 2020-10-15: CN application CN202011103832.8A filed; granted as CN112233214B (active)
Non-Patent Citations (2)
Title |
---|
Simulation of three-dimensional snowfall scenes based on a particle system; Zhou Qiang et al.; Computer Technology and Development; Vol. 27, No. 1; pp. 130-134 *
Real-time rendering of large-scale realistic snow scenes; Wang Gang et al.; Acta Electronica Sinica; Vol. 40, No. 9; pp. 1746-1751 *
Also Published As
Publication number | Publication date |
---|---|
CN112233214A (en) | 2021-01-15 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CP03 | Change of name, title or address | Address after: Floor 13, 14 and 15, building 3, lianfei building, No.1, Fenghua Road, high tech Development Zone, Luoyang City, Henan Province, 471000; Patentee after: Zhongzhi Software Co.,Ltd.; Country or region after: China; Address before: Floor 13, 14 and 15, building 3, lianfei building, No.1, Fenghua Road, high tech Development Zone, Luoyang City, Henan Province, 471000; Patentee before: Luoyang Zhongzhi Software Technology Co.,Ltd.; Country or region before: China |