CN117671125A - Illumination rendering method, device, equipment and storage medium


Info

Publication number: CN117671125A
Authority: CN (China)
Prior art keywords: illumination, target model, shadow, map, rendering
Legal status: Pending
Application number: CN202311364931.5A
Other languages: Chinese (zh)
Inventor: 李展钊
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311364931.5A
Publication of CN117671125A

Landscapes

  • Image Generation (AREA)

Abstract

The disclosure relates to the field of graphics processing and provides an illumination rendering method, device, equipment, and storage medium. The method comprises the following steps: generating a shadow map corresponding to a target model according to the positional relationship between the target model and a light source; calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model; and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, the method can take real-time shadows into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.

Description

Illumination rendering method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of graphics processing, and in particular, to a method, apparatus, device, and storage medium for illumination rendering.
Background
Illumination is one of the key elements for recreating the real world in 3D games. To achieve realistic results, developers have long sought more efficient and accurate ways to calculate and render illumination. Typical lighting models, such as the Phong model and the Lambert model, rely on the surface normal of the model and the relative direction of the light source to determine the lighting effect. Both models have limitations: because they are based on real physical phenomena, they can simulate real-world illumination, but they may fail to meet requirements when a specific style or artistic effect needs to be expressed. In the prior art, the Ramp technique is generally used to achieve a specific visual effect and realize a specific style or artistic look. However, the Ramp technique relies on a specific chart that cannot be modified while the game is running, so the influence of real-time shadows cannot be taken into account during illumination rendering.
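For concreteness, the following is a minimal illustrative sketch of the Lambert diffuse term just described, in which intensity depends only on the surface normal and the light direction (the function name and NumPy setting are assumptions for illustration):

```python
# Minimal illustrative sketch of the Lambert diffuse term: I = max(0, N . L).
import numpy as np

def lambert_diffuse(normal: np.ndarray, light_dir: np.ndarray) -> float:
    """Diffuse intensity from the surface normal and the direction to the light."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return float(max(0.0, np.dot(n, l)))

print(lambert_diffuse(np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # 1.0
print(lambert_diffuse(np.array([0.0, 1.0, 0.0]), np.array([1.0, 1.0, 0.0])))  # ~0.707
```

Because the result depends only on geometry, such a model cannot by itself express a stylized look, which is what motivates the Ramp technique discussed below.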
Disclosure of Invention
The main purpose of this disclosure is to solve the technical problem in the prior art that the influence of real-time shadows cannot be taken into account when illumination rendering is performed to realize a specific style or artistic effect.
The first aspect of the present disclosure provides an illumination rendering method, the method comprising:
Generating a shadow map corresponding to the target model according to the position relation between the target model and the light source;
calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect.
A second aspect of the present disclosure provides an illumination rendering apparatus, the apparatus comprising:
the mapping generation module is used for generating a shadow mapping corresponding to the target model according to the position relation between the target model and the light source;
the illumination calculation module is used for calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
and the rendering module is used for carrying out color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and carrying out illumination rendering on the target model according to the final illumination effect.
A third aspect of the present disclosure provides an illumination rendering apparatus, comprising: a memory and at least one processor, the memory having instructions stored therein, the memory and the at least one processor being interconnected by a communication line; the at least one processor invokes the instructions in the memory to cause the illumination rendering apparatus to perform the steps of the illumination rendering method described above.
A fourth aspect of the present disclosure provides a computer-readable storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the steps of the illumination rendering method described above.
In the technical solution provided by the present disclosure, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the disclosure. The objectives and other advantages of the disclosure will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
FIG. 1 is a schematic diagram of one embodiment of an illumination rendering method in an embodiment of the disclosure;
FIG. 2 is a schematic diagram of another embodiment of an illumination rendering method in an embodiment of the disclosure;
FIG. 3 is a schematic diagram of another embodiment of an illumination rendering method in an embodiment of the disclosure;
FIG. 4 is a schematic diagram of one embodiment of an illumination rendering apparatus in an embodiment of the disclosure;
FIG. 5 is a schematic view of another embodiment of an illumination rendering apparatus in an embodiment of the disclosure;
fig. 6 is a schematic diagram of one embodiment of an illumination rendering apparatus in an embodiment of the disclosure.
Detailed Description
The embodiments of the disclosure provide an illumination rendering method, device, equipment, and storage medium, which generate a shadow map corresponding to a target model according to the positional relationship between the target model and a light source; calculate the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model; and perform color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which the target model is rendered. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
For ease of understanding, a specific flow of an embodiment of the disclosure is described below. Referring to fig. 1, a first embodiment of the illumination rendering method in an embodiment of the disclosure includes:
101. generating a shadow map corresponding to the target model according to the position relation between the target model and the light source;
In this embodiment, in an illumination scene, the target model is the model that needs to receive illumination and produce a shadow effect, such as a person, a building, or another entity. The target model may have various shapes and materials, for example a planar, volumetric, or complex curved-surface model. The light source is a luminous model that emits light and illuminates the surrounding environment; common light sources include the sun, bulbs, spotlights, and the like. The light emitted by the light source strikes the target model and, through reflection, refraction, scattering, and other processes determined by the material characteristics of the target model, forms different illumination and shadow effects. The positional relationship between the target model and the light source directly affects the appearance of the illuminated scene.
Specifically, the light source is set in advance and its position and direction are determined; illumination is typically represented using a parallel (directional) light source or a point light source. After the light source is set, the scene is rendered from the light source's point of view to generate a depth map, which records the distance from each pixel to the light source; this can be achieved using a projection matrix and the light source's viewpoint during rendering. The target model is then rendered from the camera's point of view, and the depth map is passed to a shader for processing. During rendering of the target model, the shadow value of each pixel is calculated using the depth map: the shadow intensity is determined by comparing the depth value in the depth map with the distance from the current pixel to the light source, and the calculated shadow values are stored as a shadow map. A shadow map is typically a grayscale texture image in which the color value represents the shadow intensity of each pixel. It should be noted that a suitable shader is required to process the depth map and compute the shadow effect; that the position and direction of the light source strongly influence the shadow effect, so the light source's parameters need to be adjusted according to actual requirements; and that the shadow map can be combined with other texture maps to achieve a more realistic rendering result.
In practical applications, other ways of generating the shadow map of the target model may also be used; for example, the scene is rendered from the light source's point of view with the depth information stored in a depth map, the scene is then re-rendered from the camera's point of view, and the shadow intensity is determined by comparing the depth of each pixel with the depth values in the depth map, thereby generating the shadow map.
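The depth-comparison step described above can be sketched as follows (an illustrative NumPy stand-in for shader code; all names and the bias value are assumptions):

```python
# Illustrative depth-test step of shadow mapping: a pixel is shadowed when the
# depth stored in the light's depth map is closer to the light than the pixel.
import numpy as np

def shadow_value(depth_map: np.ndarray, uv: tuple, pixel_light_depth: float,
                 bias: float = 1e-3) -> float:
    """Return 1.0 if the pixel is lit, 0.0 if it is in shadow.

    depth_map: depths rendered from the light's point of view.
    uv: integer texel coordinates of the pixel projected into light space.
    pixel_light_depth: the pixel's own distance from the light.
    bias: small offset that avoids self-shadowing ("shadow acne").
    """
    stored = depth_map[uv]  # nearest occluder seen by the light at this texel
    return 0.0 if pixel_light_depth - bias > stored else 1.0

dm = np.full((4, 4), 0.5)             # an occluder at depth 0.5 everywhere
print(shadow_value(dm, (2, 2), 0.8))  # 0.0: pixel lies behind the occluder
print(shadow_value(dm, (2, 2), 0.4))  # 1.0: pixel is nearer the light
```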
102. Calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
In this embodiment, there are various ways to calculate the actual illumination intensity of the target model from the shadow map and the model surface illumination intensity. For example, for each vertex or primitive of the target model, the pixel value at the corresponding position is sampled from the shadow map and resolved into a shadow intensity; normally, black (0) indicates complete shadow and white (1) indicates no shadow. The pixel values of the shadow map may be interpolated and adjusted according to actual requirements so that they map into a suitable shadow-intensity range. The actual illumination intensity is then computed from the model surface illumination intensity and the resolved shadow intensity in combination with an illumination model; traditional illumination models include diffuse reflection, specular reflection, ambient light, and the like, and one or more of them may be used. Furthermore, to obtain a smooth illumination effect on the model surface, the actual illumination intensity may be interpolated and smoothed; for example, the illumination intensity between vertices can be linearly interpolated in a vertex shader, or the illumination intensity between pixels interpolated in a fragment shader, ensuring a smoother and more continuous variation of illumination across the model surface.
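The combination step can be sketched minimally as follows, assuming a simple diffuse-plus-ambient model (the function and the ambient term are illustrative assumptions):

```python
# Illustrative combination of the sampled shadow value with the surface intensity.
def actual_intensity(surface_intensity: float, shadow_sample: float,
                     ambient: float = 0.1) -> float:
    """surface_intensity: diffuse/specular term from the lighting model.
    shadow_sample: 0 = fully shadowed, 1 = fully lit, from the shadow map.
    ambient: ambient light, conventionally unaffected by shadowing."""
    return min(1.0, ambient + surface_intensity * shadow_sample)

print(actual_intensity(0.9, 1.0))  # lit pixel: 1.0
print(actual_intensity(0.9, 0.0))  # shadowed pixel falls back to ambient: 0.1
```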
In this embodiment, calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model includes: acquiring a shadow model of the scene and illumination data of the light source; determining a shadow coefficient of the shadow map according to the shadow model and the illumination data; and calculating the actual illumination intensity of the target model according to the shadow coefficient and the model surface illumination intensity of the target model.
Specifically, the shadow map is combined with the illumination intensity of the model surface to obtain the influence of the shadow on the model's illumination, from which the actual illumination intensity of the target model can be calculated: if a pixel is in shadow, its illumination intensity is attenuated. The exact attenuation depends on the shadow model and the properties of the light source; for example, the illumination intensity of the pixel is multiplied by a shadow coefficient determined from the shadow data of the shadow map, which may be 0.5 or another value.
Further, determining the shadow coefficient of the shadow map according to the shadow model and the illumination data comprises: acquiring a preset curve chart; determining corresponding curve data from the curve chart according to the shadow model and the illumination data; and determining the shadow coefficient of the shadow map according to the curve data.
In particular, a curve chart (Curve Atlas) may be used when calculating shadow coefficients. A Curve Atlas is a special texture containing multiple different sets of curve data, used to store and manage one-dimensional functions; the curves may be arbitrary as long as they can be mapped into the two-dimensional space of the texture. The main advantages of a Curve Atlas are performance and flexibility: it can store a large amount of curve data in a single texture, reducing video-memory usage, and because it is a two-dimensional texture, the GPU's texture-filtering hardware can perform efficient interpolation and sampling, yielding better results. In this embodiment, the corresponding curve data is determined from the curve chart through the shadow model and the illumination data, generating a curve that represents the influence of shadow on the object's illumination. A Curve Atlas can be regarded as a series of curves, each representing a specific value-mapping relationship that can serve as a shadow coefficient. For example, if the shadow coefficient of such a curve is 0.625, the curve maps an input value of 0.8 to an output value of 0.5; reducing the original illumination intensity to an actual illumination intensity of 0.5 thus expresses the influence of the shadow on the object's illumination.
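The following illustrative sketch shows how such a Curve Atlas lookup might work, with each row of a small 2D array standing in for one curve (the layout and helper names are assumptions); it reproduces the 0.8 → 0.5 example above:

```python
# Illustrative Curve Atlas: each row is one sampled curve; lookups use the
# same linear filtering a GPU texture unit would apply.
import numpy as np

def sample_curve_atlas(atlas: np.ndarray, curve_id: int, x: float) -> float:
    """atlas: (num_curves, resolution) array of curve samples in [0, 1].
    curve_id: which curve (row) to evaluate.
    x: input value in [0, 1], e.g. the raw illumination intensity."""
    row = atlas[curve_id]
    pos = np.clip(x, 0.0, 1.0) * (len(row) - 1)
    i = int(pos)
    j = min(i + 1, len(row) - 1)
    t = pos - i
    return float((1.0 - t) * row[i] + t * row[j])

# Curve 0 encodes a shadow coefficient of 0.625, i.e. y = 0.625 * x,
# so an input of 0.8 maps to an output of 0.5, as in the example above.
atlas = np.stack([0.625 * np.linspace(0.0, 1.0, 256)])
print(round(sample_curve_atlas(atlas, 0, 0.8), 3))  # 0.5
```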
103. And performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect.
In practical applications, the color gradient process is a ramp technique: the color or illumination of an object's surface is modified using a color-gradient (ramp) image. In this embodiment, a ramp map may be created from normalized illumination-intensity data, where each pixel corresponds to a specific illumination-intensity value. The ramp map is a gradient-colored texture in which the color of each pixel is determined by the illumination-intensity value, and the sampling position in the ramp map is calculated from the position of each vertex on the target model: the normal vector and the illumination direction determine the illumination intensity at each point on the model surface, and the ramp map is then sampled at the calculated position to obtain the corresponding illumination color. The sampled illumination color is applied to the shader or material of the target model, and the illumination effect is blended with the target model as required by the rendering engine in use, realizing the final illumination rendering effect.
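The sampling step can be sketched as follows (the ramp contents and helper names are assumptions, chosen to show a two-tone stylized look):

```python
# Illustrative ramp lookup: the illumination intensity indexes a 1D color
# gradient from dark (left) to bright (right).
import numpy as np

def sample_ramp(ramp: np.ndarray, intensity: float) -> np.ndarray:
    """ramp: (width, 3) RGB gradient; intensity: value in [0, 1]."""
    pos = np.clip(intensity, 0.0, 1.0) * (len(ramp) - 1)
    i = int(pos)
    j = min(i + 1, len(ramp) - 1)
    t = pos - i
    return (1.0 - t) * ramp[i] + t * ramp[j]

# A two-tone "toon" ramp: cool dark color in shadow, warm light color when lit.
ramp = np.array([[0.10, 0.10, 0.30]] * 128 + [[1.00, 0.95, 0.80]] * 128)
print(sample_ramp(ramp, 0.3))  # intensity 0.3 falls in the dark half
```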
In this embodiment, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Referring to fig. 2, another embodiment of the illumination rendering method in the embodiment of the disclosure includes:
201. generating a light source view and a projection matrix of the light source according to the light source data of the light source;
In this embodiment, the position and direction of the light source are obtained from the light source data; this may be a three-dimensional coordinate representing the light source's position, or a direction vector pointing from the light source toward a target point. Based on the position and orientation of the light source, a camera view matrix is created for observing the scene, which may be computed with a suitable mathematical library or custom code; the view matrix defines the position, orientation, and rotation of the observer (the light source). The projection matrix of the light source is then computed according to the selected projection type, where the projection matrix projects objects in the scene from three-dimensional space onto the two-dimensional screen space; different projection types, such as orthographic and perspective projection, have different calculation methods.
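A sketch of this step for a directional light follows, assuming right-handed, OpenGL-style conventions (the function names and conventions are illustrative assumptions):

```python
# Illustrative construction of a directional light's view and orthographic
# projection matrices, the pair used to render the scene from the light.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)) -> np.ndarray:
    """View matrix placing the observer (the light) at eye, looking at target."""
    eye = np.asarray(eye, float)
    f = np.asarray(target, float) - eye
    f /= np.linalg.norm(f)                        # forward axis
    r = np.cross(f, up); r /= np.linalg.norm(r)   # right axis
    u = np.cross(r, f)                            # corrected up axis
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = r, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def orthographic(left, right, bottom, top, near, far) -> np.ndarray:
    """Orthographic projection mapping the light's view volume to clip space."""
    m = np.eye(4)
    m[0, 0] = 2.0 / (right - left); m[0, 3] = -(right + left) / (right - left)
    m[1, 1] = 2.0 / (top - bottom); m[1, 3] = -(top + bottom) / (top - bottom)
    m[2, 2] = -2.0 / (far - near); m[2, 3] = -(far + near) / (far - near)
    return m

light_view = look_at(eye=(10.0, 10.0, 10.0), target=(0.0, 0.0, 0.0))
light_proj = orthographic(-20.0, 20.0, -20.0, 20.0, 0.1, 100.0)
light_view_proj = light_proj @ light_view  # world space -> light clip space
```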
202. Rendering a scene where the target model is located based on the light source view, the projection matrix and the position relationship between the target model and the light source;
in this embodiment, before rendering, a depth buffer is created for a scene of the rendering target model, for storing depth information rendered from the perspective of the light source. The size of the depth buffer should be consistent with the size of the final output depth map. All objects in the scene are then rendered to the depth buffer using the light source view and the projection matrix. This may be achieved by using a custom shader in the rendering pipeline that only calculates the depth values of the fragments and writes them to the depth buffer. For each object to be rendered, the position of the object vertices in the light source space may be calculated using the light source view and the projection matrix, and in the fragment shader, depth values are calculated and saved from the positions in the light source space, the rendered depth information being read from the depth buffer. This may be achieved by sampling the depth buffer as a texture and finally converting the depth information in the depth buffer into a depth map. This may involve converting the depth value read from the depth buffer from NDC (Normalized Device Coordinates) to a range of [0,1 ].
203. Sampling the depth value of each pixel in the scene, and generating a shadow map corresponding to the target model based on the actual depth value of each pixel and the sampled depth value;
In this embodiment, sampling the depth value of each pixel in the scene and generating the shadow map corresponding to the target model based on the actual depth value and the sampled depth value of each pixel includes: sampling the depth value of each pixel in the scene a preset number of times to obtain a plurality of depth values corresponding to each pixel; comparing the actual depth value of each pixel with the corresponding plurality of depth values in sequence to obtain a plurality of comparison results; and calculating the probability that the corresponding pixel is in shadow based on the plurality of comparison results, and generating the shadow map corresponding to the target model from the pixels whose probability is greater than a preset probability threshold.
In particular, the vertex data of the target model may be used: the vertices are transformed into the coordinate system of the light source's viewpoint (the light source space), which is achieved by multiplying the target model's vertices by the light source view and projection matrices. For each pixel of the target model, the vertex position under the light source's perspective (i.e., the coordinates in light source space) is used to sample the depth map, and the depth values are read out through texture sampling. The depth value sampled from the depth map is compared with the actual depth value of the pixel: if the actual depth value is greater than the sampled depth value, the pixel is occluded and lies in a shadow region; otherwise it lies in a visible region. A shadow map is generated for the target model from the comparison results. The shadow map may be represented as a binary black-and-white image in which black represents shadow areas and white represents visible areas; the shadow state of each pixel is written to the corresponding position in the shadow map.
Further, the above steps may produce very hard shadow edges. To achieve a more natural effect, a technique such as PCF (Percentage Closer Filtering) can be used to soften them. The basic idea of PCF is to sample the depth texture multiple times for each pixel and calculate the probability of the pixel being in shadow from these samples: the shadow map is sampled several times, typically with bilinear interpolation; the proportions of shadowed and unshadowed samples are counted; and the shadow intensity of the current pixel is interpolated from the shadow proportion, yielding a softened shadow value.
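The PCF idea can be sketched as follows (a box kernel with nearest-texel sampling is a simplifying assumption; a GPU implementation would typically use bilinear-filtered comparisons):

```python
# Illustrative PCF: average several depth tests around the pixel so shadow
# edges fade instead of switching hard between 0 and 1.
import numpy as np

def pcf_shadow(depth_map: np.ndarray, x: int, y: int,
               pixel_light_depth: float, radius: int = 1,
               bias: float = 1e-3) -> float:
    """Return the fraction of neighborhood samples that are lit (0..1)."""
    h, w = depth_map.shape
    lit, total = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), w - 1)   # clamp to the map borders
            sy = min(max(y + dy, 0), h - 1)
            if pixel_light_depth - bias <= depth_map[sy, sx]:
                lit += 1
            total += 1
    return lit / total

dm = np.full((8, 8), 1.0)
dm[:4, :] = 0.5                                         # occluder over the top half
print(pcf_shadow(dm, x=3, y=4, pixel_light_depth=0.8))  # ~0.67: softened edge
```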
204. Calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
205. and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect.
In this embodiment, steps 204-205 are similar to steps 102-103 in the first embodiment, and will not be described again.
In this embodiment, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Referring to fig. 3, another embodiment of the illumination rendering method in the embodiment of the disclosure includes:
301. generating a shadow map corresponding to the target model according to the position relation between the target model and the light source;
302. calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
in this embodiment, the steps 301-302 are similar to the steps 101-102 in the first embodiment, and will not be described herein.
303. Acquiring a color gradient map subjected to color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity;
in this embodiment, the light intensity values are first processed through Curve Atlas before being used in the Ramp technique. That is, the illumination intensity values are first remapped according to the curves in Curve Atlas, and then the mapped values are used to sample the Ramp texture. For example, assume that the original illumination intensity value is 0.5, but there is a Curve in Curve Atlas that maps 0.5 to 0.7. Before the Ramp technique is applied, first, this mapping relationship is used to convert 0.5 into 0.7, then the Ramp texture is sampled using 0.7 as an index, and corresponding color information is acquired. In practice, the color gradient (Ramp) refers to a color that continuously varies over a range. In computer graphics, color gradients are typically used to represent colors at different illumination intensities of an object surface. For example, in a gray color gradient, black may represent no illumination, white may represent the strongest illumination, and middle gray may represent a medium intensity illumination.
In this embodiment, the developer creates a color-gradient image, typically a horizontal strip in which illumination intensity is represented from dark on the left to bright on the right; this color-gradient image defines the relationship between the various illumination intensities and colors. The Ramp texture is then sampled using the calculated actual illumination intensity as the index or UV coordinate (mainly the U coordinate); for example, an illumination intensity of 0.5 (on a 0-to-1 scale) samples the middle of the Ramp texture. The color sampled from the Ramp texture is the color information of the pixel.
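Putting the two steps together, a minimal end-to-end sketch follows (the specific curve, ramp contents, and helper names are assumptions); it reproduces the 0.5 → 0.7 remapping from the example above before sampling the ramp:

```python
# Illustrative Curve-Atlas-then-Ramp pipeline for one pixel.
import numpy as np

def lerp_1d(tex: np.ndarray, x: float):
    """Linearly filtered lookup into a 1D texture (rows may be RGB)."""
    pos = np.clip(x, 0.0, 1.0) * (len(tex) - 1)
    i = int(pos)
    j = min(i + 1, len(tex) - 1)
    t = pos - i
    return (1.0 - t) * tex[i] + t * tex[j]

def shade(intensity: float, curve: np.ndarray, ramp: np.ndarray):
    remapped = float(lerp_1d(curve, intensity))  # Curve Atlas remap, e.g. 0.5 -> ~0.7
    return lerp_1d(ramp, remapped)               # then index the Ramp by the new value

curve = np.linspace(0.0, 1.0, 256) ** 0.515      # gamma-like curve: 0.5 maps to ~0.7
ramp = np.linspace([0.1, 0.1, 0.3], [1.0, 0.95, 0.8], 256)  # dark -> warm gradient
print(shade(0.5, curve, ramp))
```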
304. Performing color gradient processing on the target model according to the color information to obtain a final illumination effect;
In this embodiment, the selected color information is finally applied to the pixel or to the surface of the target model to provide a stylized lighting effect, so that the lighting is no longer a monotone gray value but a color rendering full of artistic expression.
305. And performing illumination rendering on the target model according to the final illumination effect.
In this embodiment, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
The illumination rendering method in the embodiments of the present disclosure is described above; the illumination rendering device in the embodiments of the present disclosure is described below. Referring to fig. 4, an embodiment of the illumination rendering device in an embodiment of the disclosure includes:
the map generating module 401 is configured to generate a shadow map corresponding to the target model according to a positional relationship between the target model and the light source;
an illumination calculation module 402, configured to calculate an actual illumination intensity of the target model according to the shadow map and a model surface illumination intensity of the target model;
and the rendering module 403 is configured to perform color gradient processing on the target model according to the actual illumination intensity, obtain a final illumination effect, and perform illumination rendering on the target model according to the final illumination effect.
In this embodiment, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Referring to fig. 5, another embodiment of an illumination rendering apparatus in an embodiment of the disclosure includes:
the map generating module 401 is configured to generate a shadow map corresponding to the target model according to a positional relationship between the target model and the light source;
an illumination calculation module 402, configured to calculate an actual illumination intensity of the target model according to the shadow map and a model surface illumination intensity of the target model;
and the rendering module 403 is configured to perform color gradient processing on the target model according to the actual illumination intensity, obtain a final illumination effect, and perform illumination rendering on the target model according to the final illumination effect.
In a possible implementation, the map generation module 401 includes:
a matrix generation unit 4011 for generating a light source view and a projection matrix of the light source according to light source data of the light source;
a scene rendering unit 4012 for rendering a scene in which the target model is located based on the light source view, the projection matrix, and a positional relationship between the target model and the light source;
the depth value sampling unit 4013 is configured to sample a depth value of each pixel in the scene, and generate a shadow map corresponding to the target model based on the actual depth value and the sampled depth value of each pixel.
In one possible implementation, the depth value sampling unit 4013 is specifically configured to:
sampling the depth value of each pixel in the scene for a preset number of times to obtain a plurality of depth values corresponding to each pixel;
comparing the actual depth value of each pixel with a plurality of corresponding depth values in sequence to obtain a plurality of comparison results;
and calculating the probability that the corresponding pixel is in shadow based on the plurality of comparison results, and generating a shadow map corresponding to the target model according to the pixels whose probability is greater than a preset probability threshold.
In one possible implementation, the illumination calculation module 402 includes:
a data acquisition unit 4021 configured to acquire a shadow model of the scene and illumination data of the light source;
a coefficient determination unit 4022 for determining a shadow coefficient of the shadow map from the shadow model and the illumination data;
an intensity calculating unit 4023 for calculating an actual illumination intensity of the target model based on the shadow coefficient and a model surface illumination intensity of the target model.
In a possible embodiment, the coefficient determination unit 4022 is specifically configured to:
acquiring a preset curve chart;
determining corresponding curve data from the curve chart according to the shadow model and the illumination data;
and determining the shadow coefficient of the shadow map according to the curve data.
In one possible implementation, the rendering module 403 includes:
a color acquisition unit 4031 for acquiring a color gradient map subjected to color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity;
the gradient processing unit 4032 is configured to perform color gradient processing on the target model according to the color information, so as to obtain a final illumination effect;
and the illumination rendering unit 4033 is used for performing illumination rendering on the target model according to the final illumination effect.
In a possible embodiment, the color acquisition unit 4031 is specifically configured to:
acquiring a color gradient map for performing color gradient processing and model parameters of the model;
sampling the color gradient map according to the model parameters and the illumination intensity to obtain a first sampling result and a second sampling result;
and calculating color information corresponding to the target model according to the first sampling result and the second sampling result.
In this embodiment, a shadow map corresponding to the target model is generated according to the positional relationship between the target model and the light source; the actual illumination intensity of the target model is calculated according to the shadow map and the model surface illumination intensity of the target model; and color gradient processing is performed on the target model according to the actual illumination intensity to obtain a final illumination effect, according to which illumination rendering is performed on the target model. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
The illumination rendering device in the embodiments of the present disclosure is described in detail above with reference to fig. 4 and 5 from the perspective of modular functional entities; the illumination rendering equipment in the embodiments of the present disclosure is described in detail below from the perspective of hardware processing.
Fig. 6 is a schematic diagram of an illumination rendering device according to an embodiment of the disclosure. The illumination rendering device 600 may vary considerably in configuration or performance and may include one or more processors (central processing units, CPU) 610 (e.g., one or more processors), a memory 620, and one or more storage media 630 (e.g., one or more mass storage devices) storing applications 633 or data 632, where the memory 620 and the storage medium 630 may provide transient or persistent storage. The program stored in the storage medium 630 may include one or more modules (not shown), each of which may include a series of instruction operations for the illumination rendering device 600. The illumination rendering device 600 may also include one or more power supplies 640, one or more wired or wireless network interfaces 650, one or more input/output interfaces 660, and/or one or more operating systems 631, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. Those skilled in the art will appreciate that the structure shown in fig. 6 does not constitute a limitation of the illumination rendering device provided by the present disclosure, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently. Further, the processor 610 may communicate with the storage medium 630 and execute the series of instruction operations in the storage medium 630 on the illumination rendering device 600 to implement the following steps:
Generating a shadow map corresponding to the target model according to the positional relationship between the target model and the light source; calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model; and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Optionally, generating the shadow map corresponding to the target model according to the position relationship between the target model and the light source includes: generating a light source view and a projection matrix of the light source according to the light source data of the light source; rendering a scene where the target model is located based on the light source view, the projection matrix and the position relationship between the target model and the light source; and sampling the depth value of each pixel in the scene, and generating a shadow map corresponding to the target model based on the actual depth value of each pixel and the sampled depth value.
The method can achieve real-time dynamic shadow effects by dynamically updating the light source data, the light source view, and the projection matrix: when the light source or the model position changes, the shadow map is updated accordingly to reflect the change, achieving a more realistic real-time rendering effect. Generating the shadow map in this way also more accurately simulates real-world lighting, so the shadows in the rendered scene appear more realistic.
Optionally, the step of sampling the depth value of each pixel in the scene and generating the shadow map corresponding to the target model based on the actual depth value and the sampled depth value of each pixel includes: sampling the depth value of each pixel in the scene a preset number of times to obtain a plurality of depth values corresponding to each pixel; comparing the actual depth value of each pixel with the corresponding plurality of depth values in sequence to obtain a plurality of comparison results; and calculating the probability that the corresponding pixel is in shadow based on the plurality of comparison results, and generating a shadow map corresponding to the target model according to the pixels whose probability is greater than the preset probability threshold.
By sampling a plurality of depth values for each pixel and calculating the shadow probability based on the comparison result, the method can increase the softness and the sense of reality of the shadow and provide accurate control of the shadow effect. Meanwhile, the method can improve rendering performance and efficiency, so that the process of generating the shadow map is more efficient.
Optionally, calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model includes: acquiring a shadow model of a scene and illumination data of a light source; determining a shadow coefficient of the shadow map according to the shadow model and the illumination data; and calculating the actual illumination intensity of the target model according to the shadow coefficient and the model surface illumination intensity of the target model.
According to the method, a corresponding shadow value can be calculated for each pixel in the shadow map from the illumination data and the shadow model, so that the occlusion and shadowing of the target model by the light source can be accurately projected and simulated during rendering, achieving a more realistic illumination appearance.
Optionally, determining the shadow coefficient of the shadow map according to the shadow model and the illumination data includes: acquiring a preset curve chart; determining corresponding curve data from the curve chart according to the shadow model and the illumination data; and determining the shadow coefficient of the shadow map according to the curve data.
In this method, multiple different curves are stored through the Curve Atlas technique, and the chart can be modified at runtime, which greatly improves flexibility and practicality. Determining the corresponding curve data from the curve chart according to the shadow model and the illumination data allows the shadow coefficient of the shadow map to be determined accurately; the curve data serves as the basis for the shadow coefficient, providing a more precise and fine-grained means of control. By selecting curve data for different illumination conditions and shadow types, the intensity and shape of the shadow can be better adjusted according to the relative position of the light source and the model, the model's shape, and other information, so the shadow effect better matches the actual illumination.
Optionally, performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect includes: acquiring a color gradient map subjected to color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity; performing color gradient processing on the target model according to the color information to obtain the final illumination effect; and performing illumination rendering on the target model according to the final illumination effect.
By matching a prepared color gradient map with the actual illumination intensity, the method saves computation and rendering time. Compared with computing the illumination effect in real time, obtaining the color information from the color gradient map and applying it to the target model greatly improves rendering efficiency, saving computing resources while maintaining high-quality rendering.
Optionally, the obtaining a color gradient map for performing color gradient processing, and obtaining corresponding color information from the color gradient map according to the actual illumination intensity includes: acquiring a color gradient map for performing color gradient processing and model parameters of the model; sampling the color gradient map according to the model parameters and the illumination intensity to obtain a first sampling result and a second sampling result; and calculating color information corresponding to the target model according to the first sampling result and the second sampling result.
Obtaining a color gradient map for color gradient processing provides a customizable way to realize the illumination effect of the model surface. Using the color gradient map for illumination during rendering can simulate the reflective characteristics of the model surface under different illumination conditions, achieving more vivid and varied illumination effects.
The present disclosure also provides a computer-readable storage medium, which may be a non-volatile or a volatile computer-readable storage medium, having instructions stored therein which, when run on a computer, cause the computer to perform the following steps:
generating a shadow map corresponding to the target model according to the positional relationship between the target model and the light source; calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model; and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect. By generating a shadow map to calculate the influence of shadows on the model's illumination, obtaining the actual illumination intensity of the target model, and applying color gradient processing, real-time shadows can be taken into account while achieving a specific style or artistic effect, improving the realism of illumination rendering.
Optionally, generating the shadow map corresponding to the target model according to the position relationship between the target model and the light source includes: generating a light source view and a projection matrix of the light source according to the light source data of the light source; rendering a scene where the target model is located based on the light source view, the projection matrix and the position relationship between the target model and the light source; and sampling the depth value of each pixel in the scene, and generating a shadow map corresponding to the target model based on the actual depth value of each pixel and the sampled depth value.
The method can achieve real-time dynamic shadow effects by dynamically updating the light source data, the light source view, and the projection matrix: when the light source or the model position changes, the shadow map is updated accordingly to reflect the change, achieving a more realistic real-time rendering effect. Generating the shadow map in this way also more accurately simulates real-world lighting, so the shadows in the rendered scene appear more realistic.
Optionally, the step of sampling the depth value of each pixel in the scene and generating the shadow map corresponding to the target model based on the actual depth value and the sampled depth value of each pixel includes: sampling the depth value of each pixel in the scene a preset number of times to obtain a plurality of depth values corresponding to each pixel; comparing the actual depth value of each pixel with the corresponding plurality of depth values in sequence to obtain a plurality of comparison results; and calculating the probability that the corresponding pixel is in shadow based on the plurality of comparison results, and generating a shadow map corresponding to the target model according to the pixels whose probability is greater than the preset probability threshold.
By sampling a plurality of depth values for each pixel and calculating the shadow probability based on the comparison result, the method can increase the softness and the sense of reality of the shadow and provide accurate control of the shadow effect. Meanwhile, the method can improve rendering performance and efficiency, so that the process of generating the shadow map is more efficient.
Optionally, calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model includes: acquiring a shadow model of a scene and illumination data of a light source; determining a shadow coefficient of the shadow map according to the shadow model and the illumination data; and calculating the actual illumination intensity of the target model according to the shadow coefficient and the model surface illumination intensity of the target model.
According to the method, a corresponding shadow value can be calculated for each pixel in the shadow map from the illumination data and the shadow model, so that the occlusion and shadowing of the target model by the light source can be accurately projected and simulated during rendering, achieving a more realistic illumination appearance.
Optionally, determining the shadow coefficient of the shadow map according to the shadow model and the illumination data includes: acquiring a preset curve chart; determining corresponding curve data from the curve chart according to the shadow model and the illumination data; and determining the shadow coefficient of the shadow map according to the curve data.
In this method, multiple different curves are stored through the Curve Atlas technique, and the chart can be modified at runtime, which greatly improves flexibility and practicality. Determining the corresponding curve data from the curve chart according to the shadow model and the illumination data allows the shadow coefficient of the shadow map to be determined accurately; the curve data serves as the basis for the shadow coefficient, providing a more precise and fine-grained means of control. By selecting curve data for different illumination conditions and shadow types, the intensity and shape of the shadow can be better adjusted according to the relative position of the light source and the model, the model's shape, and other information, so the shadow effect better matches the actual illumination.
Optionally, performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect includes: acquiring a color gradient map subjected to color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity; performing color gradient processing on the target model according to the color information to obtain the final illumination effect; and performing illumination rendering on the target model according to the final illumination effect.
By matching a prepared color gradient map with the actual illumination intensity, the method saves computation and rendering time. Compared with computing the illumination effect in real time, obtaining the color information from the color gradient map and applying it to the target model greatly improves rendering efficiency, saving computing resources while maintaining high-quality rendering.
Optionally, the obtaining a color gradient map for performing color gradient processing, and obtaining corresponding color information from the color gradient map according to the actual illumination intensity includes: acquiring a color gradient map for performing color gradient processing and model parameters of the model; sampling the color gradient map according to the model parameters and the illumination intensity to obtain a first sampling result and a second sampling result; and calculating color information corresponding to the target model according to the first sampling result and the second sampling result.
Obtaining a color gradient map for color gradient processing provides a customizable way to realize the illumination effect of the model surface. Using the color gradient map for illumination during rendering can simulate the reflective characteristics of the model surface under different illumination conditions, achieving more vivid and varied illumination effects.
Those skilled in the art will clearly understand that, for convenience and brevity of description, for the specific working process of the system, apparatus, and units described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated herein.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method described in the embodiments of the present disclosure. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The above embodiments are merely intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present disclosure.

Claims (10)

1. An illumination rendering method, characterized in that the illumination rendering method comprises:
generating a shadow map corresponding to the target model according to the position relation between the target model and the light source;
calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
and performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect.
2. The illumination rendering method according to claim 1, wherein the generating the shadow map corresponding to the target model according to the positional relationship between the target model and the light source includes:
generating a light source view and a projection matrix of the light source according to the light source data of the light source;
rendering a scene where the target model is located based on the light source view, the projection matrix and the positional relationship between the target model and the light source;
and sampling the depth value of each pixel in the scene, and generating a shadow map corresponding to the target model based on the actual depth value of each pixel and the sampled depth value.
3. The illumination rendering method according to claim 2, wherein the sampling the depth value of each pixel in the scene, and generating the shadow map corresponding to the target model based on the actual depth value of each pixel and the sampled depth value comprises:
sampling the depth value of each pixel in the scene for a preset number of times to obtain a plurality of depth values corresponding to each pixel;
comparing the actual depth value of each pixel with a plurality of corresponding depth values in sequence to obtain a plurality of comparison results;
and calculating the probability of the corresponding pixel in the shadow based on the comparison results, and generating a shadow map corresponding to the target model according to the pixel with the probability larger than a preset probability threshold.
4. The illumination rendering method according to claim 1, wherein the calculating the actual illumination intensity of the target model from the shadow map and the model surface illumination intensity of the target model comprises:
acquiring a shadow model of the scene and illumination data of the light source;
determining a shadow coefficient of the shadow map according to the shadow model and the illumination data;
and calculating the actual illumination intensity of the target model according to the shadow coefficient and the model surface illumination intensity of the target model.
5. The illumination rendering method according to claim 4, wherein the determining the shadow coefficient of the shadow map according to the shadow model and the illumination data comprises:
acquiring a preset curve chart;
determining corresponding curve data from the curve chart according to the shadow model and the illumination data;
and determining the shadow coefficient of the shadow map according to the curve data.
6. The illumination rendering method according to claim 1, wherein performing color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and performing illumination rendering on the target model according to the final illumination effect comprises:
acquiring a color gradient map for performing color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity;
performing color gradient processing on the target model according to the color information to obtain a final illumination effect;
and carrying out illumination rendering on the target model according to the final illumination effect.
7. The illumination rendering method according to claim 6, wherein the acquiring a color gradient map for performing color gradient processing, and acquiring corresponding color information from the color gradient map according to the actual illumination intensity comprises:
acquiring a color gradient map for performing color gradient processing and model parameters of the target model;
sampling the color gradient map according to the model parameters and the actual illumination intensity to obtain a first sampling result and a second sampling result;
and calculating color information corresponding to the target model according to the first sampling result and the second sampling result.
8. An illumination rendering device, characterized in that the illumination rendering device comprises:
the mapping generation module is used for generating a shadow mapping corresponding to the target model according to the position relation between the target model and the light source;
the illumination calculation module is used for calculating the actual illumination intensity of the target model according to the shadow map and the model surface illumination intensity of the target model;
and the rendering module is used for carrying out color gradient processing on the target model according to the actual illumination intensity to obtain a final illumination effect, and carrying out illumination rendering on the target model according to the final illumination effect.
9. An illumination rendering device, characterized in that the illumination rendering device comprises: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the illumination rendering device to perform the steps of the illumination rendering method according to any one of claims 1 to 7.
10. A computer readable storage medium having instructions stored thereon, which when executed by a processor, implement the steps of the illumination rendering method according to any of claims 1-7.
CN202311364931.5A 2023-10-18 2023-10-18 Illumination rendering method, device, equipment and storage medium Pending CN117671125A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311364931.5A CN117671125A (en) 2023-10-18 2023-10-18 Illumination rendering method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117671125A true CN117671125A (en) 2024-03-08

Family

ID=90065134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311364931.5A Pending CN117671125A (en) 2023-10-18 2023-10-18 Illumination rendering method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117671125A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination