CN117893670A - Real-time global illumination rendering method and device and electronic equipment - Google Patents


Info

Publication number
CN117893670A
CN117893670A
Authority
CN
China
Prior art keywords
voxel
voxels
distance field
illumination
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410057767.1A
Other languages
Chinese (zh)
Inventor
王雪健
白欲立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo New Vision Beijing Technology Co Ltd
Original Assignee
Lenovo New Vision Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo New Vision Beijing Technology Co Ltd filed Critical Lenovo New Vision Beijing Technology Co Ltd
Priority to CN202410057767.1A
Publication of CN117893670A
Pending legal-status Critical Current

Landscapes

  • Image Generation (AREA)

Abstract

The application provides a real-time global illumination rendering method, apparatus, and electronic device. The method comprises the following steps: voxelizing the scene geometry of the current frame to obtain distance field voxels, where the attributes of the distance field voxels include a vector distance field, a normal vector, an albedo, and an emission value; calculating the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and taking it as the outgoing radiance; anisotropically filtering the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels; and collecting the outgoing radiance of the directional voxels with cone-traced ray bundles to calculate the indirect illumination. The method and the device improve the runtime efficiency of real-time lighting, preserve lighting accuracy, and improve the 3D rendering result.

Description

Real-time global illumination rendering method and device and electronic equipment
Technical Field
The present application relates to the field of computer graphics, and in particular, to a real-time global illumination rendering method, apparatus and electronic device.
Background
In computer graphics, accurately calculating the distribution of light in a scene is an important aspect of realism, and such calculations are often expensive because light does not simply travel along straight paths: it bounces, scatters, and is absorbed by the different objects in the scene.
Standard rendering pipelines use triangles to represent the geometry of a scene; the triangles are then rasterized, shaded, and finally displayed on screen. This representation works well for local per-fragment operations (e.g., direct illumination) but has many limitations for more complex operations (e.g., global illumination).
At present, global illumination methods in 3D graphics generally suffer from excessive overhead, light leakage, low efficiency, and similar problems.
Disclosure of Invention
In view of the above, the present application provides a real-time global illumination rendering method, apparatus, and electronic device. The method stores outgoing radiance in voxels, includes solutions for voxel occlusion and voxel normal-attenuation errors, and uses an efficient voxel global illumination process based on cone tracing to address the excessive overhead, light leakage, and low efficiency of existing global illumination methods in 3D graphics.
In a first aspect, an embodiment of the present application provides a real-time global illumination rendering method, including:
voxelizing the scene geometry of the current frame to obtain distance field voxels, where the attributes of the distance field voxels include: a vector distance field, a normal vector, an albedo, and an emission value;
calculating the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and taking it as the outgoing radiance;
anisotropically filtering the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels;
collecting the outgoing radiance of the directional voxels with cone-traced ray bundles and calculating the indirect illumination.
In one possible implementation, voxelizing the scene geometry of the current frame to obtain distance field voxels comprises:
generating, for each scene geometry of the current frame, a boundary polygon larger than the projected triangle so that at least one fragment is generated, each fragment carrying its own attributes, including: a normal vector, an albedo, and an emission value;
averaging the attributes of all fragments within a voxel to obtain the attributes of that voxel;
deriving the vector distance field of the scene geometry using a layered distance field representation stored in a 3D texture.
In one possible implementation, the method further includes:
when the distance field voxels of the previous frame are dynamic voxels, updating the attribute volumes and the radiance volume for the current frame;
when the distance field voxels of the previous frame are static voxels, reusing the previous frame's values for the current frame's distance field voxels;
marking the distance field voxels of the current frame as dynamic voxels or static voxels.
In one possible implementation, calculating the direct illumination cast by the primary light source of preset intensity onto each distance field voxel and taking it as the outgoing radiance comprises:
calculating a normal attenuation value for each face of the distance field voxel:
f_i = max(0, N_i · L), i = 1, ..., 6
where L is the light direction and N_i is the voxel normal vector of face i;
selecting three main faces according to the axis signs of the voxel's average normal, giving the attenuation vector F = (f_x, f_y, f_z) of the three main faces;
the outgoing radiance produced by the primary light source of preset intensity on the distance field voxel is:
L_v = I × ρ × (w · F)
where I is the preset intensity of the primary light source, ρ is the albedo of the voxel, N is the voxel normal vector, w = (N_x², N_y², N_z²) is the voxel average-normal weight vector, and F is the vector of normal-attenuation values of the three main faces.
In one possible implementation, determining whether a voxel is occluded includes:
The position of the voxel is projected in the light space and the depth of the projected point is compared with the stored depth from the shadow map to determine if the voxel is occluded.
In one possible implementation, anisotropically filtering the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels includes:
filtering the distance field voxels by integrating and averaging them along each axial direction, and interpolating according to the ray angle at sampling time to obtain a radiance value;
storing six direction values for each distance field voxel using anisotropic filtering;
obtaining the directional voxels by linear interpolation of the three samples taken from the selected directional volumes.
In one possible implementation, collecting the outgoing radiance of the directional voxels with cone-traced ray bundles and calculating the indirect illumination includes:
obtaining an occlusion value a and an outgoing radiance c from the directional voxels;
after the m-th step of the cone, the indirect illumination value C_m and the tracking occlusion value A_m are:
C_m = C_{m-1} + (1 − A_{m-1}) × a × c
A_m = A_{m-1} + (1 − A_{m-1}) × a
where C_{m-1} and A_{m-1} are the indirect illumination value and the tracking occlusion value after the (m−1)-th step of the cone.
In a second aspect, an embodiment of the present application provides a real-time global illumination rendering apparatus, including:
a voxelization unit, configured to voxelize the scene geometry of the current frame to obtain distance field voxels, where the attributes of the distance field voxels include: a vector distance field, a normal vector, an albedo, and an emission value;
a direct illumination calculation unit, configured to calculate the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and take it as the outgoing radiance;
a processing unit, configured to anisotropically filter the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels;
an indirect illumination calculation unit, configured to collect the outgoing radiance of the directional voxels with cone-traced ray bundles and calculate the indirect illumination.
In a third aspect, an embodiment of the present application provides an electronic device, including: the system comprises a memory and a processor, wherein the memory stores executable programs, and the processor executes the executable programs to realize the steps of the method of the embodiment of the application.
In a fourth aspect, embodiments of the present application provide a storage medium carrying one or more computer programs which, when executed by a processor, implement the steps of the methods of the embodiments of the present application.
The method and the device improve the running efficiency of real-time illumination, ensure the accuracy of illumination and improve the 3D rendering effect.
Drawings
FIG. 1 is a flowchart of a real-time global illumination rendering method according to an embodiment of the present application;
FIG. 2 (a) is a schematic diagram of determining whether there are voxels in a ray location according to an embodiment of the present application;
FIG. 2 (b) is a schematic view of a ray passing through a voxelized geometric boundary according to an embodiment of the present application;
FIG. 3 is a functional block diagram of a real-time global illumination rendering device according to an embodiment of the present application;
fig. 4 is a functional block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Various aspects and features of the present application are described herein with reference to the accompanying drawings.
It should be understood that various modifications may be made to the embodiments of the application herein. Therefore, the above description should not be taken as limiting, but merely as exemplification of the embodiments. Other modifications within the scope and spirit of the application will occur to persons of ordinary skill in the art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and, together with a general description of the application given above, and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the application will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the accompanying drawings.
It is also to be understood that, although the application has been described with reference to some specific examples, those skilled in the art can certainly realize many other equivalent forms of the application.
The above and other aspects, features and advantages of the present application will become more apparent in light of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the application in unnecessary detail. Therefore, specific structural and functional details disclosed herein are not intended to be limiting, but merely serve as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the word "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
First, the design idea of the embodiment of the present application will be briefly described.
Computing global illumination is a challenging and complex problem for real-time rendering in 3D applications. At present, global illumination methods in 3D graphics generally suffer from excessive overhead, light leakage, and low efficiency.
To this end, the application proposes a global illumination rendering method that steps along a vector distance field to compute the outgoing radiance stored in voxels and to calculate, in real time, the indirect illumination of a simplified version of the scene. The method covers diffuse and specular reflection, two-bounce indirect illumination, and emissive materials. The underlying distance field structure is a directional hierarchy stored in 3D textures via mipmaps and updated in real time on the GPU, which makes it possible to approximate the indirect illumination of dynamic scenes. A vector-based distance field light path directly computes the valid voxels in the distance field, and the simplified outgoing radiance is used for global illumination effects: ambient occlusion, soft shadows, and indirect illumination.
Using a voxelized discretization of the scene together with cone tracing, the method of the embodiments of the application realizes:
a real-time distance field computation process that stores only the information required to capture the direct and indirect illumination of each voxel;
an efficient voxel-based illumination process for storing outgoing radiance, including solutions for voxel occlusion and voxel normal-attenuation errors;
an efficient voxel global illumination process that uses cone tracing and augments the filtered outgoing radiance with an indirect diffuse term.
The method and the device improve the running efficiency of real-time illumination, ensure the accuracy of illumination and improve the 3D rendering effect.
After the application scenario and the design idea of the embodiment of the present application are introduced, the technical solution provided by the embodiment of the present application is described below.
As shown in fig. 1, an embodiment of the present application provides a real-time global illumination rendering method, including:
Step 101: voxelization is carried out on the scene geometry of the current frame to obtain distance field voxels, wherein the attributes of the distance field voxels comprise: vector distance field, normal vector, albedo, and emission value;
Here, the embodiments of the application represent the discrete scene geometry with a layered distance field stored in a 3D texture, built by a stepped vector-distance-field construction process. Only the lighting data required for diffuse illumination, such as normals and albedo, are stored in the voxels, using the average of all fragments within each voxel's space; in addition, an emission value is stored to approximate emissive surfaces.
Step 102: calculate the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and take it as the outgoing radiance.
The outgoing radiance is calculated for each distance field voxel, using standard lighting techniques for direct diffuse illumination. The averaging operation used during voxelization can leave the resulting average normal pointing in an undesired direction, which causes errors in the normal-attenuation term. To reduce this problem, a normal-weighted shading model for normal attenuation is proposed. To compute occlusion, ray casting can be used in addition to standard shadow mapping: a ray is traced from the voxel position in the light direction, and the voxel volume is sampled to test for ray-voxel collisions.
Step 103: anisotropically filter the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels.
The present embodiment uses anisotropic filtering to populate the levels of the voxel hierarchy and generate the directional voxels. To this end, values are filtered from higher to lower levels of detail along the axis directions using a directional integration and averaging operation. The levels of detail are stored in the mipmap levels of the 3D texture, which correspond to texture LODs: when an object is close to the viewer, a high-resolution mipmap image is used; as the object moves farther from the viewer, a lower-resolution image is used.
A texture is not necessarily a single image; it can be a series of images. Because a texture must be interpolated when it is magnified or minified, and that process is relatively time-consuming, graphics cards avoid the cost and improve performance by keeping multiple levels of each texture and selecting a level according to the distance between the camera and the textured surface: a larger texture when the camera is close, a smaller one when it is far away. This is the multi-level texture (mipmap).
Step 104: collect the outgoing radiance of the directional voxels with cone-traced ray bundles and calculate the indirect illumination.
To approximate the second bounce of indirect light, another pass can be added after filtering the computed outgoing radiance values. The voxel structure provides the information necessary to compute the indirect diffuse component using cone tracing. Cone tracing is performed for each voxel to approximate its indirect diffuse component, and the filtering step is then repeated, so the outgoing radiance in the voxel representation now contains direct illumination plus the first bounce of indirect diffuse illumination.
Diffuse and specular reflection and ambient occlusion: direct and indirect illumination are calculated for each visible fragment. For the indirect component, a final gathering scheme traces cones over the hemisphere around the normal direction to collect the outgoing radiance stored in the voxel representation.
Specifically, the implementation of step 101 is as follows:
During voxelization, normal vectors, albedo, and emission values are stored in the voxels, each attribute in its own 3D texture. This information is necessary to calculate diffuse reflection and normal attenuation in the per-voxel light pass, where illumination is calculated per voxel rather than per pixel. The structure can be extended to support more complex shading models, but this implies higher memory consumption for the additional data.
In addition, the distance field stepping process uses another structure. The illumination result for each voxel is stored in a further 3D texture called the radiance volume. The level of detail of the voxelized scene is used to approximate the growing diameter of the cone and its sample volume. Anisotropic voxels require six 3D textures at half the resolution of the radiance volume, one for each axis direction, covering the positive and negative directions. Their levels of detail are stored in the mipmap levels of these textures, referred to as the directional volumes.
The radiance volume represents the maximum level of detail of the voxelized scene and is kept separate from the directional volumes. To bind the two structures, linear interpolation between samples of both is used whenever the mipmap level required by the distance field range lies between the maximum level and the first filtered level of detail.
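As a concrete illustration, the following C++ sketch models one possible layout of these structures; the struct and field names are hypothetical, and dense CPU arrays stand in for GPU 3D textures:

```cpp
#include <array>
#include <vector>

// Minimal sketch (illustrative names, not the patent's API) of the volume
// layout described above.
struct Vec4 { float x, y, z, w; };

struct Volume3D {
    int res = 0;                 // voxels per axis
    std::vector<Vec4> texels;    // res^3 texels
    Vec4& at(int x, int y, int z) { return texels[(z * res + y) * res + x]; }
};

struct VoxelScene {
    // Attribute volumes written during voxelization, one per attribute.
    Volume3D normal, albedo, emission;
    // Radiance volume: per-voxel outgoing radiance at the finest level of detail.
    Volume3D radiance;
    // Six anisotropic directional volumes (+X, -X, +Y, -Y, +Z, -Z), each a mip
    // chain at half the radiance-volume resolution.
    std::array<std::vector<Volume3D>, 6> directional;
};
```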
The voxelization process uses a conservative voxelization scheme: if geometry exists within the space of a voxel, a voxel must be generated in that space. Thin-surface voxelization of a triangle can be computed for each voxel B by testing whether the plane defined by the triangle's vertices intersects B and whether the 2D projection of the triangle along its dominant normal axis intersects the 2D projection of B.
For each projected triangle, a slightly larger boundary polygon is generated to ensure that at least one fragment is produced no matter how small the projected triangle is. To create this polygon, each triangle edge is moved outward to enlarge the triangle. So that the coverage of the projected triangle is not overestimated, excess fragments are then discarded using an expanded bounding box defined by the projected triangle's vertices.
A voxel may contain many fragments, and all of these fragments may carry different attribute values (i.e., albedo, normal, and emission). To obtain a good approximation of the discrete geometry, an averaging operation is applied to all values within the voxel space of each attribute. The attributes must be written and read at each voxel location in their respective 3D textures using atomic operations to ensure a temporally consistent result.
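A minimal sketch of this per-voxel averaging follows; on the GPU the updates are atomic read-modify-writes on the attribute textures, which the incremental mean below merely approximates on the CPU (names are illustrative):

```cpp
// Running-mean accumulation of fragment attributes into one voxel (sketch).
struct Vec3 { float x = 0, y = 0, z = 0; };

struct Fragment { Vec3 normal, albedo, emission; };

struct VoxelAttribs {
    Vec3 normal, albedo, emission;
    int count = 0;

    // Incremental mean m += (x - m) / n avoids storing all fragments.
    static void lerpInto(Vec3& m, const Vec3& v, float k) {
        m.x += (v.x - m.x) * k;
        m.y += (v.y - m.y) * k;
        m.z += (v.z - m.z) * k;
    }

    void accumulate(const Fragment& f) {
        float k = 1.0f / ++count;
        lerpInto(normal, f.normal, k);
        lerpInto(albedo, f.albedo, k);
        lerpInto(emission, f.emission, k);
    }
};
```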
For dynamic updates, the computation of the attributes and of the corresponding vector distance fields is kept separate for static and dynamic geometry. Static voxelization happens only once, while dynamic voxelization happens every frame or on demand. Both types of geometry are voxelized by the same process described above, so a method is needed to indicate which voxels are static; the embodiments of the application use a single-value 3D texture, referred to as the flow volume, to identify them. In the static voxelization pass, after a voxel is generated, a value marking the location as static is written into the flow volume. In the dynamic voxelization pass, the flow volume is read at the write location before a voxel is generated; if the value indicates the location is marked static, the write is aborted, leaving the static voxel unchanged.
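The write rule around the flow volume might look like the following sketch (hypothetical names; the patent does not specify an API). Reading before writing in the dynamic pass is what lets static voxels survive every per-frame re-voxelization untouched:

```cpp
#include <cstdint>
#include <vector>

// Single-value "flow volume": 0 = free/dynamic, 1 = static.
struct FlowVolume {
    int res;
    std::vector<uint8_t> flags;  // res^3 entries
    explicit FlowVolume(int r) : res(r), flags(r * r * r, 0) {}
    uint8_t& at(int x, int y, int z) { return flags[(z * res + y) * res + x]; }
};

// Static pass (runs once): write the voxel, then mark the location static.
template <typename WriteFn>
void writeStaticVoxel(FlowVolume& fv, int x, int y, int z, WriteFn write) {
    write(x, y, z);
    fv.at(x, y, z) = 1;
}

// Dynamic pass (runs every frame): read the flag first and abort the write
// if the location is static, leaving the static voxel unchanged.
template <typename WriteFn>
void writeDynamicVoxel(FlowVolume& fv, int x, int y, int z, WriteFn write) {
    if (fv.at(x, y, z) == 1) return;
    write(x, y, z);
}
```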
To continuously re-voxelize the scene, previously stored dynamic voxels must be cleared from the 3D textures. This is done before dynamic voxelization using general-purpose computation on the graphics processing unit (GPU), i.e. compute shaders. The flow volume is read, and a voxel is cleared when two conditions hold: a voxel is present, and it is dynamic.
Specifically, the implementation of step 102 includes:
In order to properly approximate indirect illumination during the vector distance field tracing step, the outgoing radiance must be stored in the radiance volume of the voxel structure. Inspired by deferred rendering, a per-voxel light pass computes the direct illumination of each voxel. Unlike a deferred light pass executed for each visible pixel, however, the illumination calculation here is performed for every voxel, visible or not. This is necessary to avoid visibility problems, since the structure is used to approximate indirect lighting: geometry hidden from the camera still contributes to the light distribution of the visible fragments. To accelerate this process, a GPGPU-based method executes one thread for each voxel reserved in the radiance volume.
For the Lambert reflection model, the information stored in the 3D textures during voxelization can be used to compute the shading of each voxel. For each light source, the following equation is evaluated per voxel to describe the voxel's outgoing radiance L_v:
L_v = I × ρ × max(0, N · L)
where I is the light source intensity, ρ is the albedo of the voxel, N is the voxel normal vector, and L is the light direction. For non-uniform light sources such as spotlights and point lights, the position of the voxel is also required; it can be obtained by projecting the voxel's position from texture space to world space.
When the voxel's normal vector is used in the normal-attenuation term max(0, N · L), it can give unwanted results: because the normal vector is stored via an averaging operation, the resulting average normal may end up pointing in an inconvenient direction. The problem is noticeable when the normal vectors within the voxel space are non-uniform. To reduce it, this embodiment uses normal-weighted attenuation, in which the normal attenuation is first calculated per face of the voxel:
f_i = max(0, N_i · L), i = 1, ..., 6
where L is the light direction and N_i is the voxel normal vector of face i.
Three main faces are selected according to the axis signs of the voxel's average normal, giving the attenuation vector F = (f_x, f_y, f_z) of the three main faces.
The outgoing radiance produced by the primary light source of preset intensity on the distance field voxel is then:
L_v = I × ρ × (w · F)
where I is the preset intensity of the primary light source, ρ is the albedo of the voxel, N is the voxel normal vector, w = (N_x², N_y², N_z²) is the voxel average-normal weight vector, and F is the vector of normal-attenuation values of the three main faces.
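A compact sketch of this normal-weighted shading follows; the squared-component weights w = (N_x², N_y², N_z²) are an assumption (consistent with a unit average normal, for which they sum to 1), since the exact weight formula appears only in the patent's drawings:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Normal-weighted Lambert term for one voxel (sketch). N is the averaged
// unit voxel normal, L the direction toward the light, I the preset
// intensity, rho a scalar albedo (a color in practice).
float voxelOutgoingRadiance(Vec3 N, Vec3 L, float I, float rho) {
    // Per-face attenuation f_i = max(0, N_i . L) for the three main faces,
    // where each face normal is the axis direction matching the sign of N.
    float fx = std::max(0.0f, N.x >= 0 ? L.x : -L.x);
    float fy = std::max(0.0f, N.y >= 0 ? L.y : -L.y);
    float fz = std::max(0.0f, N.z >= 0 ? L.z : -L.z);
    // Weighted dot product w . F replaces the plain max(0, N . L) term.
    float wF = N.x * N.x * fx + N.y * N.y * fy + N.z * N.z * fz;
    return I * rho * wF;
}
```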
To generate accurate results in the vector distance field tracing step, voxel occlusion must be determined; otherwise voxelized geometry that should emit almost no outgoing radiance would still contribute to the indirect illumination calculation.
The position of the voxel is projected into light space, and the depth of the projected point is compared with the depth stored in the shadow map to determine whether the voxel is occluded. A simple improvement of this technique is the following: instead of using the voxel center position Vp directly, the position is offset along the voxel's normal vector by half the voxel size Vsize, i.e. Vp' = Vp + N × Vsize × 0.5, which exposes the voxel position in cases where the center might be blocked by geometry near the voxel.
Embodiments of the present application use ray casting within a volume to calculate occlusion. Any volume resulting from the voxelization process may be used, since the algorithm only needs to determine whether a voxel exists at a particular location. To determine the occlusion of a voxel, a ray is traced from the voxel's position along the light direction, and the volume is sampled to determine whether a voxel exists at the ray's position (as shown in fig. 2(a)); if so, the voxel is occluded.
Instead of stopping the ray immediately after a voxel is found, a soft shadow can be approximated with a single ray by accumulating a value k per collision and dividing by the tracking distance t; the current soft shadow value is:
s_m = s_{m-1} + k / t
where s_{m-1} is the soft shadow value from the previous step. From the light's perspective, the number of collisions is typically higher for rays that cross the voxelized geometry's boundary (as shown in fig. 2(b)).
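The ray-cast occlusion and soft-shadow accumulation just described might be sketched as follows, with hasVoxelAt standing in for an assumed volume-sampling callback:

```cpp
#include <algorithm>
#include <functional>

struct P3 { float x, y, z; };

// Soft-shadow estimate by ray casting through the voxel volume (sketch).
// The start position is offset half a voxel along the normal to reduce
// self-occlusion; a value k is accumulated per collision, divided by t.
float voxelSoftShadow(P3 p, P3 n, P3 lightDir, float voxelSize, float maxDist,
                      float k, const std::function<bool(P3)>& hasVoxelAt) {
    P3 o { p.x + n.x * voxelSize * 0.5f,   // Vp' = Vp + N * Vsize * 0.5
           p.y + n.y * voxelSize * 0.5f,
           p.z + n.z * voxelSize * 0.5f };
    float shadow = 0.0f;
    for (float t = voxelSize; t < maxDist; t += voxelSize) {
        P3 s { o.x + lightDir.x * t, o.y + lightDir.y * t, o.z + lightDir.z * t };
        if (hasVoxelAt(s))
            shadow += k / t;               // s_m = s_{m-1} + k / t
    }
    return std::min(shadow, 1.0f);         // 0 = fully lit, 1 = fully occluded
}
```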
After the direct illumination value is obtained, the emission term from the voxelization process is added to each voxel's value. In the cone tracing step these voxels then appear bright even in occluded regions, so indirect light from emissive regions of the voxelized scene accumulates along each cone.
Specifically, the implementation of step 103 includes:
To obtain more accurate results in the vector distance field tracing step, anisotropic voxels are used. The mipmap levels store six direction values for each voxel, one per axis direction. Each cone has an origin, an aperture angle, and a direction; the direction determines which three directional volumes are sampled. A directional sample is obtained by linear interpolation of the three samples taken from the selected directional volumes.
To generate the anisotropic voxels, the direction values are computed by volume integration over depth followed by averaging, which yields the result value for a particular direction. This embodiment executes one GPU thread per voxel of a mipmap level; each thread filters using the values of the previous level, and this is repeated at every mipmap level.
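One filtering step of this kind, for the +X directional volume, might look like the sketch below (illustrative; real code runs one GPU thread per destination voxel and covers all six axes and every mipmap level):

```cpp
#include <vector>

struct Vec4 { float x, y, z, w; };  // rgb radiance + opacity in w

struct Volume3D {
    int res;
    std::vector<Vec4> texels;
    const Vec4& at(int x, int y, int z) const {
        return texels[(z * res + y) * res + x];
    }
};

// Pairs of texels along +X are combined front-to-back (volume integration
// over depth), then the four combined pairs of the 2x2x2 block are averaged.
Vec4 filterPlusX(const Volume3D& src, int x, int y, int z) {
    Vec4 sum {0, 0, 0, 0};
    for (int dy = 0; dy < 2; ++dy)
        for (int dz = 0; dz < 2; ++dz) {
            const Vec4& a = src.at(2 * x,     2 * y + dy, 2 * z + dz); // nearer
            const Vec4& b = src.at(2 * x + 1, 2 * y + dy, 2 * z + dz); // farther
            // Front-to-back: the farther texel is attenuated by the nearer opacity.
            sum.x += a.x + (1 - a.w) * b.x;
            sum.y += a.y + (1 - a.w) * b.y;
            sum.z += a.z + (1 - a.w) * b.z;
            sum.w += a.w + (1 - a.w) * b.w;
        }
    return { sum.x / 4, sum.y / 4, sum.z / 4, sum.w / 4 };  // average of 4 pairs
}
```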
Vector distance field tracing is similar to ray marching, advancing a certain length at each step, except that the sample volume grows with the diameter of the cone. The mipmap levels of the directional volumes approximate the expansion of the sample volume along the cone trajectory; to ensure smooth variation between samples, quadrilinear interpolation (four linear interpolations) can be used.
During the cone stepping, the diameter d of the cone at tracking distance t is:
d = 2t × tan(θ/2)
where θ is the aperture angle.
The cone diameter determines which mipmap level Vlevel should be sampled, using the following equation:
Vlevel = log2(d ÷ Vsize)
where Vsize is the size of a voxel at the maximum level of detail.
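Both quantities are straightforward to compute; a minimal sketch of the two formulas above, with illustrative helper names:

```cpp
#include <algorithm>
#include <cmath>

// Cone-stepping helpers (sketch). theta is the cone aperture angle in
// radians, t the tracked distance, Vsize the voxel size at the maximum
// level of detail.
float coneDiameter(float t, float theta) {
    return 2.0f * t * std::tan(theta * 0.5f);        // d = 2t * tan(theta/2)
}

float mipLevelFor(float d, float Vsize) {
    return std::log2(std::max(d / Vsize, 1.0f));     // Vlevel = log2(d / Vsize)
}
```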
Specifically, the implementation of step 104 includes:
obtaining an occlusion value a and an outgoing radiance c from the directional voxels;
after the m-th step of the cone, the indirect illumination value C_m and the tracking occlusion value A_m are:
C_m = C_{m-1} + (1 − A_{m-1}) × a × c
A_m = A_{m-1} + (1 − A_{m-1}) × a
where C_{m-1} and A_{m-1} are the indirect illumination value and the tracking occlusion value after the (m−1)-th step of the cone.
To ensure good integration quality between samples, the distance d0 between steps is scaled by a factor β. With β = 1, d0 equals the cone diameter d; values smaller than 1 produce higher-quality results but require more samples, which reduces performance.
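Putting the diameter, mip-level, and accumulation formulas together, a single cone trace might be sketched as follows; sampleDirectional is an assumed callback returning the interpolated directional-voxel value at a given distance and mipmap level:

```cpp
#include <algorithm>
#include <cmath>
#include <functional>

struct Sample { float r, g, b, a; };  // rgb = outgoing radiance c, a = occlusion

// Front-to-back accumulation along one cone (illustrative sketch).
Sample traceCone(float theta, float Vsize, float maxDist, float beta,
                 const std::function<Sample(float, float)>& sampleDirectional) {
    Sample acc {0, 0, 0, 0};
    float t = Vsize;                                  // skip the first voxel
    while (t < maxDist && acc.a < 1.0f) {
        float d = 2.0f * t * std::tan(theta * 0.5f);  // current cone diameter
        float level = std::log2(std::max(d / Vsize, 1.0f));
        Sample s = sampleDirectional(t, level);
        // C_m = C_{m-1} + (1 - A_{m-1})*a*c ; A_m = A_{m-1} + (1 - A_{m-1})*a
        acc.r += (1.0f - acc.a) * s.a * s.r;
        acc.g += (1.0f - acc.a) * s.a * s.g;
        acc.b += (1.0f - acc.a) * s.a * s.b;
        acc.a += (1.0f - acc.a) * s.a;
        t += beta * d;                                // step size d0 = beta * d
    }
    return acc;
}
```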
To improve efficiency, this process is implemented in screen space using deferred rendering. For each visible fragment, a set of cones is traced to obtain the different effects.
Indirect illumination uses a coarse Monte Carlo approximation. The hemispherical integration domain of the rendering equation can be divided into a sum of integrals; with a regular partition, each partition region resembles a cone. Each cone is approximated by tracing the valid voxels in the distance field, and the resulting values are weighted to obtain the accumulated result at the cone origin.
For a Blinn-Phong material, several wide cones distributed over the hemisphere around the normal estimate diffuse reflection, while a single cone in the reflection direction, with an aperture that depends on the specular exponent, approximates specular reflection.
To increase efficiency, the same cones used for diffuse reflection can be used to approximate ambient occlusion. For the ambient occlusion term δ, only occlusion values are accumulated, and at each step the sampled value is multiplied by a weighting function f(r):
f(r) = 1/(1 + λr)
where r is the current radius of the cone and λ is a user-defined value that controls how quickly f(r) decays along the tracking distance. After the m-th step of the cone, the ambient occlusion term is updated to:
δ_m = δ_{m-1} + (1 − δ_{m-1}) × a × f(r)
where δ_{m-1} is the ambient occlusion term after the (m−1)-th step of the cone and a is the occlusion value sampled at the current step.
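A one-step sketch of this update, under the formulas above:

```cpp
// Ambient-occlusion update for one cone step (sketch). The sampled occlusion
// a is weighted by f(r) = 1 / (1 + lambda * r), so distant occluders count
// less; the accumulation mirrors the tracking-occlusion update above.
float accumulateAO(float deltaPrev, float a, float r, float lambda) {
    float f = 1.0f / (1.0f + lambda * r);             // f(r) = 1 / (1 + λr)
    return deltaPrev + (1.0f - deltaPrev) * a * f;    // δ_m update
}
```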
Cone tracing can also produce soft shadows by tracing a cone from the surface point toward the light direction. The cone aperture controls the softness and spread of the generated shadow. For soft shadows with cones, only the occlusion values are accumulated at each step.
To calculate diffuse reflection at a surface point with vector distance field stepping, the point's normal vector, albedo, and incident radiance are required. Since the geometry's normal vectors and albedo are voxelized into 3D textures, all the information needed for the indirect diffuse term is available once the per-voxel direct illumination has been computed. Using the GPGPU, a distance field trace is performed for each valid voxel to calculate the first indirect diffuse bounce at each voxel. This step runs after the anisotropic filtering of the outgoing radiance values produced by the per-voxel direct illumination pass.
For each voxel, a set of cones is generated over the hemisphere around its average normal vector to calculate the indirect diffuse lighting at the voxel's position, as sketched below. The weighted result of all valid cone traces is then multiplied by the voxel's albedo and added to the direct illumination value.
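A final-gather sketch under assumed cone parameters (five cones, 45° tilt, ~60° aperture, cosine weights; the patent does not fix a distribution):

```cpp
#include <cmath>
#include <functional>

struct V3 { float x, y, z; };

static V3 norm(V3 v) {
    float l = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / l, v.y / l, v.z / l };
}

// Indirect diffuse at a point from a small set of wide cones over the
// hemisphere around the normal n. traceCone(dir, theta) is an assumed
// callback returning the accumulated radiance of one cone.
V3 gatherIndirectDiffuse(V3 n, V3 albedo,
                         const std::function<V3(V3, float)>& traceCone) {
    const float theta = 1.0472f;                   // ~60 deg aperture (assumed)
    V3 up = std::fabs(n.z) < 0.99f ? V3{0, 0, 1} : V3{1, 0, 0};
    V3 t = norm({ up.y * n.z - up.z * n.y,         // tangent = cross(up, n)
                  up.z * n.x - up.x * n.z,
                  up.x * n.y - up.y * n.x });
    V3 b { n.y * t.z - n.z * t.y,                  // bitangent = cross(n, t)
           n.z * t.x - n.x * t.z,
           n.x * t.y - n.y * t.x };

    V3 sum {0, 0, 0};
    float wSum = 0;
    auto add = [&](V3 d, float w) {
        V3 r = traceCone(norm(d), theta);
        sum.x += w * r.x; sum.y += w * r.y; sum.z += w * r.z;
        wSum += w;
    };
    const float st = 0.7071f, ct = 0.7071f;        // 45-degree tilt (assumed)
    add(n, 1.0f);                                  // central cone, weight cos(0)
    add({ n.x * ct + t.x * st, n.y * ct + t.y * st, n.z * ct + t.z * st }, ct);
    add({ n.x * ct - t.x * st, n.y * ct - t.y * st, n.z * ct - t.z * st }, ct);
    add({ n.x * ct + b.x * st, n.y * ct + b.y * st, n.z * ct + b.z * st }, ct);
    add({ n.x * ct - b.x * st, n.y * ct - b.y * st, n.z * ct - b.z * st }, ct);
    // Weighted average times albedo; added to the voxel's direct illumination.
    return { albedo.x * sum.x / wSum, albedo.y * sum.y / wSum, albedo.z * sum.z / wSum };
}
```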
The final outgoing radiance in the radiance volume now stores direct illumination plus the first indirect diffuse bounce. The anisotropic filtering process must be repeated for the new values. The second bounce of indirect illumination is then approximated in the final per-pixel distance field stepping pass.
Based on the same inventive concept, an embodiment of the present application provides a real-time global illumination rendering device, as shown in fig. 3, where the real-time global illumination rendering device 200 provided by the embodiment of the present application at least includes:
a voxelization unit 201, configured to voxelize the scene geometry of the current frame to obtain distance field voxels, where the attributes of the distance field voxels include: a vector distance field, a normal vector, an albedo, and an emission value;
a direct illumination calculation unit 202, configured to calculate the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and take it as the outgoing radiance;
a processing unit 203, configured to anisotropically filter the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels;
an indirect illumination calculation unit 204, configured to collect the outgoing radiance of the directional voxels with cone-traced ray bundles and calculate the indirect illumination.
It should be noted that, the principle of solving the technical problem of the real-time global illumination rendering device 200 provided by the embodiment of the present application is similar to that of the method provided by the embodiment of the present application, so that the implementation of the real-time global illumination rendering device 200 provided by the embodiment of the present application can refer to the implementation of the method provided by the embodiment of the present application, and the repetition is omitted.
As shown in fig. 4, the electronic device 300 provided by an embodiment of the present application includes at least: a processor 301, a memory 302, and a computer program stored in the memory 302 and runnable on the processor 301; the real-time global illumination rendering method provided by the embodiments of the present application is realized when the processor 301 executes the computer program.
The electronic device 300 provided by embodiments of the present application may also include a bus 303 that connects the different components, including the processor 301 and the memory 302. Bus 303 represents one or more of several types of bus structures, including a memory bus, a peripheral bus, a local bus, and so forth.
Memory 302 may include readable media in the form of volatile memory, such as random access memory (Random Access Memory, RAM) 3021 and/or cache memory 3022, and may further include Read Only Memory (ROM) 3023.
The memory 302 may also include a program tool 3024 having a set (at least one) of program modules 3025, the program modules 3025 including, but not limited to: an operating subsystem, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The electronic device 300 may also communicate with one or more external devices 304 (e.g., keyboard, remote control, etc.), one or more devices that enable a user to interact with the electronic device 300 (e.g., cell phone, computer, etc.), and/or any device that enables the electronic device 300 to communicate with one or more other electronic devices 300 (e.g., router, modem, etc.). Such communication may occur through an Input/Output (I/O) interface 305. Also, the electronic device 300 may communicate with one or more networks, such as a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN), and/or a public network such as the Internet, via the network adapter 306. As shown in fig. 4, the network adapter 306 communicates with other modules of the electronic device 300 over the bus 303. It should be appreciated that although not shown in fig. 4, other hardware and/or software modules may be used in connection with the electronic device 300, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (Redundant Arrays of Independent Disks) subsystems, tape drives, data backup storage subsystems, and the like.
It should be noted that the electronic device 300 shown in fig. 4 is only an example, and should not be construed as limiting the function and the application scope of the embodiment of the present application.
The embodiment of the application also provides a computer readable storage medium which stores computer instructions which when executed by a processor realize the real-time global illumination rendering method provided by the embodiment of the application.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. A real-time global illumination rendering method, comprising:
voxelizing the scene geometry of the current frame to obtain distance field voxels, wherein the attributes of the distance field voxels comprise: a vector distance field, a normal vector, an albedo, and an emission value;
calculating the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and taking it as the outgoing radiance;
anisotropically filtering the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels;
collecting the outgoing radiance of the directional voxels with cone-traced ray bundles and calculating the indirect illumination.
2. The real-time global illumination rendering method according to claim 1, wherein voxelizing the scene geometry of the current frame to obtain distance field voxels comprises:
generating, for each scene geometry of the current frame, a boundary polygon larger than the projected triangle so that at least one fragment is generated, each fragment carrying its own attributes, comprising: a normal vector, an albedo, and an emission value;
averaging the attributes of all fragments within a voxel to obtain the attributes of that voxel;
deriving the vector distance field of the scene geometry using a layered distance field representation stored in a 3D texture.
3. The real-time global illumination rendering method of claim 1, further comprising:
when the distance field voxels of the previous frame are dynamic voxels, updating the attribute volumes and the radiance volume for the current frame;
when the distance field voxels of the previous frame are static voxels, reusing the previous frame's values for the current frame's distance field voxels;
marking the distance field voxels of the current frame as dynamic voxels or static voxels.
4. The real-time global illumination rendering method according to claim 1, wherein calculating the direct illumination cast by the primary light source of preset intensity onto each distance field voxel and taking it as the outgoing radiance comprises:
calculating a normal attenuation value for each face of the distance field voxel:
f_i = max(0, N_i · L), i = 1, ..., 6
where L is the light direction and N_i is the voxel normal vector of face i;
selecting three main faces according to the axis signs of the voxel's average normal, giving the attenuation vector F = (f_x, f_y, f_z) of the three main faces;
the outgoing radiance produced by the primary light source of preset intensity on the distance field voxel being:
L_v = I × ρ × (w · F)
where I is the preset intensity of the primary light source, ρ is the albedo of the voxel, N is the voxel normal vector, w = (N_x², N_y², N_z²) is the voxel average-normal weight vector, and F is the vector of normal-attenuation values of the three main faces.
5. The method of real-time global illumination rendering according to claim 1, wherein determining whether a voxel is occluded comprises:
The position of the voxel is projected in the light space and the depth of the projected point is compared with the stored depth from the shadow map to determine if the voxel is occluded.
6. The real-time global illumination rendering method of claim 1, wherein anisotropically filtering the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels comprises:
filtering the distance field voxels by integrating and averaging them along each axial direction, and interpolating according to the ray angle at sampling time to obtain a radiance value;
storing six direction values for each distance field voxel using anisotropic filtering;
obtaining the directional voxels by linear interpolation of the three samples taken from the selected directional volumes.
7. The real-time global illumination rendering method of claim 1, wherein collecting the outgoing radiance of the directional voxels with cone-traced ray bundles and calculating the indirect illumination comprises:
obtaining an occlusion value a and an outgoing radiance c from the directional voxels;
after the m-th step of the cone, the indirect illumination value C_m and the tracking occlusion value A_m being:
C_m = C_{m-1} + (1 − A_{m-1}) × a × c
A_m = A_{m-1} + (1 − A_{m-1}) × a
where C_{m-1} and A_{m-1} are the indirect illumination value and the tracking occlusion value after the (m−1)-th step of the cone.
8. A real-time global illumination rendering apparatus, comprising:
a voxelization unit, configured to voxelize the scene geometry of the current frame to obtain distance field voxels, wherein the attributes of the distance field voxels comprise: a vector distance field, a normal vector, an albedo, and an emission value;
a direct illumination calculation unit, configured to calculate the direct illumination cast by a primary light source of preset intensity onto each distance field voxel and take it as the outgoing radiance;
a processing unit, configured to anisotropically filter the outgoing radiance of the distance field voxels to form a hierarchy and generate directional voxels;
an indirect illumination calculation unit, configured to collect the outgoing radiance of the directional voxels with cone-traced ray bundles and calculate the indirect illumination.
9. An electronic device, comprising: a memory and a processor, the memory having stored therein an executable program, the processor executing the executable program to implement the steps of the method of any one of claims 1 to 7.
10. A storage medium carrying one or more computer programs which, when executed by a processor, implement the steps of the method of any of claims 1 to 7.
CN202410057767.1A 2024-01-15 2024-01-15 Real-time global illumination rendering method and device and electronic equipment Pending CN117893670A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410057767.1A CN117893670A (en) 2024-01-15 2024-01-15 Real-time global illumination rendering method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410057767.1A CN117893670A (en) 2024-01-15 2024-01-15 Real-time global illumination rendering method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN117893670A 2024-04-16

Family

ID=90651781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410057767.1A Pending CN117893670A (en) 2024-01-15 2024-01-15 Real-time global illumination rendering method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN117893670A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200302683A1 (en) * 2016-05-30 2020-09-24 Netease (Hangzhou) Network Co.,Ltd. Global Illumination Calculation Method and Apparatus
CN109364481A (en) * 2018-10-30 2019-02-22 网易(杭州)网络有限公司 Real-time global illumination method, apparatus, medium and electronic equipment in game
CN115713584A (en) * 2022-11-10 2023-02-24 上海纵游网络技术有限公司 Method, system, device and storage medium for rendering volume cloud based on directed distance field
CN115830208A (en) * 2023-01-09 2023-03-21 腾讯科技(深圳)有限公司 Global illumination rendering method and device, computer equipment and storage medium
CN117152237A (en) * 2023-09-25 2023-12-01 不鸣科技(杭州)有限公司 Distance field generation method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
戚爽; 喻光继: "多级存储优化的大规模全局光照快速计算" [Fast computation of large-scale global illumination with multi-level storage optimization], 测绘通报 (Bulletin of Surveying and Mapping), no. 03, 25 March 2020 (2020-03-25) *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination