WO2022206380A1 - Lighting rendering method and apparatus, computer device, and storage medium - Google Patents

Lighting rendering method and apparatus, computer device, and storage medium

Info

Publication number
WO2022206380A1
WO2022206380A1 · PCT/CN2022/081063 · CN2022081063W
Authority
WO
WIPO (PCT)
Prior art keywords
light source
light
virtual scene
illumination
lighting
Application number
PCT/CN2022/081063
Other languages
English (en)
French (fr)
Inventor
徐华兵
曹舜
李元亨
魏楠
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Publication of WO2022206380A1
Priority to US17/985,107 (published as US20230076326A1)

Classifications

    • G — Physics › G06 — Computing; calculating or counting › G06T — Image data processing or generation, in general
        • G06T 15/00 — 3D [Three Dimensional] image rendering
            • G06T 15/02 — Non-photorealistic rendering
            • G06T 15/04 — Texture mapping
            • G06T 15/06 — Ray-tracing
            • G06T 15/50 — Lighting effects
                • G06T 15/503 — Blending, e.g. for anti-aliasing
                • G06T 15/506 — Illumination models
                • G06T 15/55 — Radiosity
        • G06T 2215/12 — Shadow map, environment map (indexing scheme for image rendering)
    • A — Human necessities › A63 — Sports; games; amusements › A63F — Card, board, or roulette games; indoor games using small moving playing bodies; video games; games not otherwise provided for
        • A63F 13/52 — Video games: controlling the output signals based on the game progress, involving aspects of the displayed game scene
        • A63F 2300/66 — Methods for processing data by generating or executing the game program for rendering three-dimensional images

Definitions

  • the present application relates to the technical field of image processing, and in particular, to a lighting rendering method, apparatus, computer device and storage medium.
  • Global illumination is the combined effect of direct lighting and indirect lighting.
  • global illumination can be achieved through ray tracing, ambient occlusion, light probes, etc.
  • a lighting rendering method, apparatus, computer device, storage medium, and computer program product are provided.
  • a lighting rendering method executed by a computer device, the method comprising: determining light source change information when the light source in the virtual scene changes; determining, according to the light source change information, the current light source projection coefficient corresponding to the changed light source; determining the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient; determining the direct illumination value corresponding to the target pixel under the changed light source; and performing lighting rendering on the target pixel according to the direct illumination value and the indirect illumination value.
  • a lighting rendering apparatus, the apparatus comprising:
  • a light source determination module configured to determine light source change information when the light source in the virtual scene changes;
  • a projection coefficient updating module configured to determine the current light source projection coefficient corresponding to the changed light source according to the light source change information
  • an indirect illumination determination module configured to determine the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel in the virtual scene and the current light source projection coefficient;
  • a direct illumination determination module configured to determine the direct illumination value corresponding to the target pixel point under the changed light source
  • a lighting rendering module configured to perform lighting rendering on the target pixel point according to the direct lighting value and the indirect lighting value.
  • a computer device comprising a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform the steps of the above lighting rendering method.
  • one or more non-transitory readable storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to implement the steps of the above lighting rendering method.
  • a computer program product comprising computer-readable instructions that, when executed by a processor, implement the steps of the above lighting rendering method.
  • FIG. 1 is an application environment diagram of a lighting rendering method in some embodiments.
  • FIG. 2 is a schematic flowchart of a lighting rendering method in some embodiments
  • FIG. 3 is a schematic diagram of the lighting effect of a light source in some embodiments.
  • FIG. 4 is a schematic diagram of the lighting effect of a light source in some embodiments.
  • FIG. 5 is a schematic diagram of illumination transfer parameters of each surface corresponding to an illumination probe in some embodiments
  • FIG. 6 is a schematic diagram of a scene screen in some embodiments.
  • FIG. 7 is a schematic diagram after projecting a scene image in some embodiments.
  • FIG. 8 is a schematic flowchart of a lighting rendering method in some specific embodiments.
  • FIG. 9 is a schematic diagram of a rendering effect of a scene picture in some embodiments.
  • FIG. 10 is a schematic diagram of a rendering effect of a scene picture in some embodiments.
  • FIG. 11 is a schematic diagram of the effect of lighting rendering in three different ways in some embodiments.
  • FIG. 12 is a structural block diagram of a lighting rendering apparatus in some embodiments.
  • FIG. 13 is a diagram of the internal structure of a computer device in some embodiments.
  • the lighting rendering method in a three-dimensional scene provided by this application can be applied to a computer device.
  • the computer device can be a terminal or a server.
  • the lighting rendering method in a three-dimensional scene provided by the present application can be applied to a terminal, a server, or a system including a terminal and a server, and realized through interaction between the terminal and the server.
  • the lighting rendering method provided by the present application can be applied to the application environment shown in FIG. 1 .
  • the terminal 102 communicates with the server 104 through the network. Specifically, the terminal 102 may obtain scene data corresponding to the virtual scene from the server 104, where the scene data includes illumination data such as light sources, light source change information, and light source projection coefficients. During the process of running the virtual scene, the terminal 102 determines the light source change information when the light source in the virtual scene changes.
  • the terminal 102 determines the current light source projection coefficient corresponding to the changed light source according to the light source change information; determines the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient; determines the direct illumination value corresponding to the target pixel under the changed light source; and performs lighting rendering on the target pixel according to the direct illumination value and the indirect illumination value.
  • the terminal 102 may be a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc., but is not limited thereto.
  • the server 104 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
  • the terminal 102 and the server 104 may be directly or indirectly connected through wired or wireless communication, which is not limited in this application.
  • the lighting rendering method of the present application may be a method of performing image processing on target pixels of an image to be rendered in a virtual scene based on computer vision technology, so that the lighting rendering effect in a dynamic lighting scene can be effectively improved.
  • a lighting rendering method is provided, and the method is applied to a computer device as an example for illustration.
  • the computer device may be the terminal or the server in FIG. 1; the method may also be applied to a system including a terminal and a server, and realized through interaction between the terminal and the server.
  • the following steps are included:
  • a virtual scene is a digital scene constructed by a computer using digital communication technology, and includes two-dimensional and three-dimensional virtual scenes.
  • a three-dimensional virtual scene can present the shapes of objects more expressively and display a virtual world more intuitively.
  • the objects in the three-dimensional virtual scene may include at least one of terrain, houses, trees, characters, and the like.
  • the applications of virtual scenes are becoming more and more extensive. For example, virtual scenes can be applied to scenarios such as game development and video production.
  • a virtual scene can be displayed on a computer by means of 3D graphics, presenting a 3D simulation environment on the screen; all objects in the virtual scene can be described by 3D scene data.
  • 3D scene data can be loaded into a 3D scene to display a 3D simulation environment.
  • the three-dimensional scene data includes at least one of model data, texture data, lighting data, vector data, terrain data, grid volume data, and the like.
  • the illumination data includes at least one of light sources and light source change information.
  • the terminal may perform a series of rendering processing on the scene picture in the virtual scene, so that the picture content in the virtual scene is displayed on the screen in the form of a two-dimensional image.
  • rendering refers to the process of two-dimensionally projecting an object model in a three-dimensional scene into a digital image according to the set environment, material, lighting and rendering parameters.
  • lighting rendering is the process of converting three-dimensional radiosity processing into a two-dimensional image; entities in virtual scenes and environments are represented in three-dimensional form, which is closer to the real world and easier to manipulate and transform.
  • a light source is an object that emits light by itself; light sources include but are not limited to the sun, electric lamps, burning substances, and the like.
  • the light source in the virtual scene is a series of lighting data that can realistically model the lighting effect of the light source in reality.
  • the light source in the virtual scene works by calculating the light-dark distribution and color distribution on the surface of the illuminated object, giving the illuminated object and its surroundings a contrast of light, shade, and color, so as to exhibit the lighting effect on the object.
  • the light source in the virtual scene can be a virtual light source node without shape or outline.
  • the lighting effect of the illuminated object can be calculated according to the position of the light source in the virtual scene, and the viewer (observer) cannot see the light source itself. Similar to the light source in the real world, when the colored light source in the virtual scene is projected on the surface of the colored object, the final rendered color depends on the reflection and absorption of the light. Wherein, based on the pre-built three-dimensional geometric model in the virtual scene, the shadow effect when the light source in the virtual scene is irradiated on the object in the virtual scene can also be constructed.
  • a simulated lighting effect can be achieved by simulating the propagation of light in the environment.
  • light propagates in straight lines in a medium; when it encounters obstacles such as objects during propagation, reflection, scattering, diffuse reflection, and absorption of light may occur.
  • surfaces of different materials reflect light at different rates. For example, specular reflection occurs when light reaches a mirror-like surface, which has a high reflectance; when light reaches a tabletop, diffuse reflection occurs and the reflected energy is attenuated.
  • the light source change information refers to the change information generated by the light source after the change compared to the light source before the change in the virtual scene.
  • the light source change information may include at least one of light source position change information, light source intensity change information, or light source direction change information.
  • when the terminal detects that the light source in the virtual scene has changed, the terminal obtains the current light source information and compares it with the light source information before the change to determine the light source change information.
  • the light source projection coefficient may refer to the light intensity coefficient corresponding to the light source.
  • the light source has a linear relationship with the light source projection coefficient, and different light sources correspond to different projection coefficients.
  • a light source in a virtual scene can be regarded as a virtual light source, that is, a set of light source functions. Projecting this virtual light source onto the projection basis function will obtain a set of coefficients, which are the light source projection coefficients.
  • the current light source projection coefficient refers to the light source projection coefficient corresponding to the changed light source.
  • basis functions refer to elements of a particular basis in the space of functions.
  • each continuous function can be represented as a linear combination of basis functions.
  • Basis functions are also called mixing functions, which can be used as interpolation functions by mixing basis functions.
  • the projection basis function which is used to input the light source into the basis function, so as to calculate the reflection of the light source light through the basis function.
  • the projection basis function may specifically be a spherical harmonic function.
  • a spherical harmonic is the restriction to the unit sphere, in spherical coordinates, of a harmonic function satisfying Laplace's equation, and can be used to represent directional distributions on the sphere.
  • spherical harmonics form a generalized Fourier basis defined on the sphere; a discrete spherical function can be parameterized with them to approximate low-frequency ambient lighting.
  • the BRDF (Bidirectional Reflectance Distribution Function) describes how light is reflected at a surface.
  • Spherical harmonic functions can be used to capture lighting and then relight to calculate global illumination in virtual environments in real time.
  • the projection basis function may also use a spherical Gaussian function, a user-defined piecewise function, etc. as the projection basis function, which is not limited in this application.
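  • While the patent describes this projection only in prose, a minimal sketch of projecting a light source function onto spherical harmonic basis functions to obtain light source projection coefficients might look as follows (Monte Carlo integration over the sphere; `light_fn`, `sh_basis_l1`, and the two-band basis are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def sh_basis_l1(d):
    """First four real spherical harmonic basis functions (bands l=0,1)
    evaluated at a unit direction d = (x, y, z)."""
    x, y, z = d
    return np.array([
        0.282095,          # Y_0^0
        0.488603 * y,      # Y_1^-1
        0.488603 * z,      # Y_1^0
        0.488603 * x,      # Y_1^1
    ])

def project_light(light_fn, n_samples=4096, rng=None):
    """Monte Carlo projection of a spherical light source function onto the
    SH basis: c_i = integral over the sphere of L(d) * Y_i(d) dd."""
    rng = rng or np.random.default_rng(0)
    coeffs = np.zeros(4)
    for _ in range(n_samples):
        d = rng.normal(size=3)          # uniform direction on the unit sphere
        d /= np.linalg.norm(d)
        coeffs += light_fn(d) * sh_basis_l1(d)
    # Each uniform sample accounts for 4*pi / n_samples of solid angle.
    return coeffs * (4.0 * np.pi / n_samples)

# Example: a directional light concentrated around direction (0, 0, 1).
light_dir = np.array([0.0, 0.0, 1.0])
coeffs = project_light(lambda d: max(0.0, float(d @ light_dir)))
print(coeffs)  # the light source projection coefficients
```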
  • lighting in the virtual scene is a linear system: the lighting effect produced by a linear combination of light sources equals the same linear combination of the lighting effects of the individual light sources.
  • for example, part (2) of FIG. 3 includes light source 3a and light source 3b, part (2a) includes only light source 3a, and part (2b) includes only light source 3b; the lighting effect in part (2) is the superposition of the lighting in parts (2a) and (2b).
  • likewise, part (2) of FIG. 4 includes light source 4a, and part (2a) includes light source 4a at half its luminance; the lighting effect in part (2) of FIG. 4 is twice that in part (2a).
  • that is, the lighting effect in part (2) of FIG. 3 is obtained by linearly combining the lighting effect of light source 3a in part (2a) and that of light source 3b in part (2b) according to their corresponding light source projection coefficients.
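  • As an illustration of this linearity in coefficient space (the values below are made up for the example): because projection onto basis functions is linear, the coefficients of a combined or scaled light source are the same linear combination of the individual coefficients.

```python
import numpy as np

# Illustrative SH coefficient vectors for two light sources.
coeffs_3a = np.array([0.8, 0.1, 0.3, 0.0])  # projection of light source 3a
coeffs_3b = np.array([0.5, 0.0, 0.2, 0.1])  # projection of light source 3b

# Both lights together: coefficients simply add (superposition).
coeffs_combined = coeffs_3a + coeffs_3b
# The same light at half its luminance: coefficients scale by 0.5.
coeffs_half = 0.5 * coeffs_3a
```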
  • when the terminal detects that the light source in the virtual scene changes, the terminal updates the lighting in the current virtual scene according to the light source change information. Specifically, the terminal uses the light source change information to update the illumination data corresponding to the target pixels in the virtual scene. That is, when the light source changes, the terminal needs to recalculate, according to the light source change information, the current light source projection coefficient corresponding to the current light source in the virtual scene.
  • S206: Determine the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel in the virtual scene and the current light source projection coefficient.
  • the pixel point may refer to the center position of a pixel.
  • a pixel can be regarded as a small square area, and the center of that square is the pixel point.
  • An image consists of multiple pixels.
  • a three-dimensional virtual scene needs to undergo a series of renderings to make its content appear on the screen in two-dimensional form.
  • the three-dimensional virtual scene can be displayed by continuous scene picture images, and the scene picture images are composed of multiple pixels.
  • the target pixel refers to the pixel corresponding to the to-be-rendered image in the virtual scene, that is, the pixel that needs to be rendered currently.
  • the pixel information such as the color value, brightness value, and depth value of the target pixel point may be determined through scene data in the virtual scene.
  • the brightness value and color value of the target pixel point are related to the light intensity of the environment where it is located, the number of surrounding virtual objects that block light, and so on.
  • the light transfer parameters may include light transfer vectors.
  • the illumination transfer vector is a vector used to convert the incident light into the transferred incident light including self-occlusion and mutual reflection, and can be specifically used to determine the irradiance (Irradiance) per unit area.
  • through the illumination transfer parameters, the global illumination effect can be generated.
  • global illumination is a unique term in 3D software, and light has the properties of reflection and refraction.
  • during the day, sunlight hitting the ground is reflected and refracted countless times, so the ground is clearly visible to the human eye.
  • 3D software global illumination can not only calculate the bright and dark sides of objects, but also calculate various light effects such as reflection and refraction of light in the virtual environment.
  • the light received by the surface of the real object is not all from the light source, but also includes the light reflected by other objects.
  • the lighting from the light source is direct lighting
  • the lighting from other objects is indirect lighting.
  • direct lighting refers to the brightness of the light that the light source directly illuminates on the object and is reflected to the virtual observation point or the camera.
  • Indirect lighting means that the light source first irradiates other objects, and after one or more bounces, it finally reaches the surface of the object to be observed, and then reflects the brightness of the light to the virtual observation point or camera.
  • the brightness of light here refers to light intensity, that is, radiance: the radiant flux per unit solid angle per unit projected area.
  • the projected area is measured in the plane perpendicular to the light direction; the radiant flux arriving per unit of this projected area gives the radiance.
  • radiant flux is the total light energy flowing through an area per second. The same light energy spread over object surfaces of different areas produces different brightness; tilting the surface spreads the incident light over a larger area and reduces the received radiance.
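  • In standard radiometric notation (the patent states these relations only in prose; $Q$ is radiant energy, $\Phi$ radiant flux, $\omega$ solid angle, $A$ area, and $\theta$ the angle between the surface normal and the light direction):

$$\Phi = \frac{dQ}{dt}, \qquad L = \frac{d^{2}\Phi}{d\omega \, dA \cos\theta}$$

  so increasing $\theta$ spreads the same flux over a larger surface area and lowers the received radiance.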
  • the direct illumination value refers to the color value of the object in the virtual environment after receiving the direct illumination of the light source, that is, the illumination intensity value.
  • the indirect lighting value refers to the color value of objects in the virtual environment after receiving indirect lighting.
  • after determining the current light source projection coefficient corresponding to the changed light source from the light source change information, the terminal determines the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient.
  • the terminal may pre-calculate the lighting transfer parameters corresponding to each object in the virtual scene, which may specifically include the lighting transfer parameters corresponding to each static object in the virtual scene and the lighting transfer parameters corresponding to the lighting probe.
  • the indirect illumination value of the target pixel in the virtual scene can be directly calculated from the illumination transfer parameter and the projection coefficient of the current light source.
  • when the light source changes, the direct illumination brightness received by each object in the virtual environment also changes.
  • the direct illumination and shadow in the virtual scene can be dynamically calculated by using the runtime rendering pipeline, so that the direct illumination can be calculated in real time.
  • the terminal can determine, from the light source change information, the brightness of the light cast by the changed light source on the target pixel, and calculate in real time the direct illumination value corresponding to the target pixel under the changed light source according to the changed light brightness, the color of the object at the target pixel, and the angle between the target pixel and the virtual observation point.
  • a preset method can be used to calculate the brightness contribution of the light after the light of the changed light source is projected to the target pixel, and use it as the color value of the pixel, so as to obtain the corresponding direct illumination value of the target pixel under the changed light source.
  • the direct illumination value corresponding to the target pixel may be zero; that is, the target pixel may only receive the indirect illumination value produced after light from the light source has been reflected multiple times in the virtual environment.
  • the target pixel points to be rendered include multiple points, and for each pixel point, it may only include the direct lighting value or the indirect lighting value, or may include both the direct lighting value and the indirect lighting value.
  • when the direct lighting value is a valid value, the indirect lighting value can be zero; when the indirect lighting value is a valid value, the direct lighting value can be zero.
  • the terminal further determines the shadow corresponding to each target pixel according to the direct illumination value of that pixel. Specifically, to compute the shadow value, a ray is emitted from each pixel toward the light source: if the ray intersects an object, the pixel is in shadow and the shadow value is 0; if it does not intersect, the shadow value is 1; in a penumbra, the shadow value is between 0 and 1.
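  • A minimal sketch of this shadow-value computation (the `intersects_any` scene query and the area-light sampling are hypothetical helpers, not part of the patent):

```python
import numpy as np

def shadow_value(point, light_pos, intersects_any):
    """Hard shadow: cast a ray from the surface point toward the light;
    returns 0 if an occluder blocks it, 1 if unoccluded."""
    direction = np.asarray(light_pos, dtype=float) - np.asarray(point, dtype=float)
    distance = np.linalg.norm(direction)
    direction /= distance
    # Offset the origin slightly to avoid self-intersection ("shadow acne").
    origin = np.asarray(point, dtype=float) + 1e-4 * direction
    return 0.0 if intersects_any(origin, direction, distance) else 1.0

def soft_shadow_value(point, light_samples, intersects_any):
    """Penumbra: average hard-shadow tests over sample points on an area
    light, giving a shadow value between 0 and 1."""
    values = [shadow_value(point, s, intersects_any) for s in light_samples]
    return sum(values) / len(values)
```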
  • the lighting rendering refers to performing lighting calculation processing in the rendering process on the target pixels in the to-be-rendered image, so that the final target pixels have lighting effects.
  • the global illumination information of each target pixel in the virtual environment is obtained.
  • after calculating the direct and indirect illumination values corresponding to the target pixels to be rendered, the terminal performs lighting rendering on each target pixel according to its direct and indirect illumination values, so as to obtain the final screen rendering result.
  • when the light source in the virtual scene changes, the terminal first determines the light source change information, and then determines the current light source projection coefficient corresponding to the changed light source according to that information, so that the projection coefficient of the current light source can be updated accurately, effectively, and in real time. The terminal further determines the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the illumination transfer parameter corresponding to the target pixel; in this way, the indirect illumination value of the target pixel under the changed light source can be accurately calculated.
  • the terminal further determines the direct illumination value corresponding to the target pixel under the changed light source, and performs lighting rendering on the target pixel according to the direct and indirect illumination values, so that lighting rendering under a light source changing in real time can be processed accurately and efficiently, thereby effectively improving the lighting rendering effect in dynamic lighting scenes.
  • determining the current light source projection coefficient corresponding to the changed light source according to the light source change information includes: determining the illumination transformation matrix according to the light source change information; determining the initial light source projection coefficient before the light source changes in the virtual scene; The illumination transformation matrix and the initial light source projection coefficient determine the current light source projection coefficient corresponding to the changed light source.
  • the light source change information corresponding to the light source change is specifically determined according to the difference between the light source after the change and the light source before the change.
  • a transformation matrix is a linear-algebra construct that performs a linear transformation: any linear transformation can be represented by a matrix in a consistent form that is easy to compute, and multiple transformations can be chained together by matrix multiplication.
  • affine transformations and perspective projections in projective space can be represented as higher-dimensional linear transformations using homogeneous coordinates.
  • Lighting transformation matrix refers to the transformation matrix used to represent the change of light.
  • the initial light source projection coefficient refers to the light source projection coefficient corresponding to the light source before the light source changes.
  • the terminal after determining the light source change information, the terminal first updates the illumination transformation matrix after the light source change according to the light source change information.
  • the terminal obtains the initial light source projection coefficient corresponding to the light source before the change, and then calculates the current light source projection coefficient corresponding to the changed light source according to the updated illumination transformation matrix and the initial light source projection coefficient.
  • in this way, the current light source projection coefficient corresponding to the changed light source is determined from the illumination transformation matrix and the initial light source projection coefficient, so the projection coefficient of the current light source in the virtual scene can be calculated in real time and the light source projection coefficient under a dynamic light source can be updated quickly.
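  • A minimal sketch of this update step, assuming the coefficients are stored as a vector and the illumination transformation matrix acts on it by matrix-vector multiplication (the concrete matrix, e.g. an SH rotation or a scaling, depends on how the light source changed):

```python
import numpy as np

def update_light_coeffs(illum_transform, prev_coeffs):
    """Current light source projection coefficients: apply the illumination
    transformation matrix to the initial (or historical) coefficient vector
    (the 'dot product processing' described above)."""
    return illum_transform @ prev_coeffs

# Example: doubling the light's intensity corresponds to a scaling matrix;
# rotating an SH-projected light would instead use an SH rotation matrix.
initial_coeffs = np.array([0.8, 0.1, 0.3, 0.0])
current_coeffs = update_light_coeffs(2.0 * np.eye(4), initial_coeffs)
```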
  • the original light source preset in the virtual scene is fixed before the virtual scene is displayed, and this preset original light source does not change.
  • basis functions and the projection basis function here are as described above; the projection basis function may specifically be a spherical harmonic function, or alternatively a spherical Gaussian function or a user-defined piecewise function, which is not limited in this application.
  • the light source in the virtual scene has a corresponding light source function. Project the light source into the projection basis function, that is, project the light source function onto the preset projection basis function.
  • the terminal compares the changed light source with the original light source preset in the virtual scene to obtain light source change information. Then, the terminal obtains the initial light source projection coefficient corresponding to the original light source.
  • the initial light source projection coefficient is a projection coefficient obtained by projecting the original light source onto the projection basis function.
  • the terminal determines a corresponding illumination transformation matrix according to the light source change information, and then updates the initial light source projection coefficient according to this matrix. Specifically, the terminal may perform dot product processing on the illumination transformation matrix and the initial light source projection coefficient to obtain the updated projection coefficient. The terminal uses the updated projection coefficient as the current light source projection coefficient corresponding to the changed light source in the virtual scene, so that it can be quickly calculated in real time.
  • the initial light source projection coefficient is updated to obtain the current light source projection coefficient corresponding to the changed light source; the updated projection coefficient is used as the current light source projection coefficient of the changed light source in the virtual scene, so that it can be quickly calculated in real time.
  • the light source change information is the change information of the changed light source relative to the historical light source before the change in the virtual scene; determining the initial light source projection coefficient before the light source change in the virtual scene includes: obtaining the historical light source projection coefficient corresponding to the historical light source. Determining the current light source projection coefficient corresponding to the changed light source according to the illumination transformation matrix and the initial light source projection coefficient includes: updating the historical light source projection coefficient according to the illumination transformation matrix to obtain the current light source projection coefficient corresponding to the changed light source.
  • the historical light source refers to the light source before the change. For example, if the light source at the current moment has changed compared to the light source at the previous moment, the light source at the current moment is the light source after the change, and the light source at the previous moment is the light source before the change, that is, the light source at the previous moment is the historical light source.
  • the historical light source projection coefficient is the calculated light source projection coefficient corresponding to the historical light source. Specifically, it may be the light source projection coefficient obtained by projecting the historical light source to the projection basis function.
  • the terminal compares the changed light source with the historical light source before the change in the virtual scene to obtain light source change information.
  • the terminal further directly obtains the calculated historical light source projection coefficients corresponding to the historical light sources.
  • the terminal further determines a corresponding illumination transformation matrix according to the light source change information, so as to update the historical light source projection coefficient according to the illumination transformation matrix.
  • the terminal can perform dot product processing on the illumination transformation matrix and the historical light source projection coefficients, i.e., multiply the matrix by a vector (the set of projection coefficients), to obtain the updated projection coefficients.
  • the terminal uses the obtained updated projection coefficient as the current light source projection coefficient corresponding to the changed light source in the virtual scene.
  • the historical light source projection coefficient from the previous moment can be updated accurately and effectively, so as to calculate the illumination information after the light source changes accurately and in real time.
  • the steps of obtaining the lighting transfer parameters corresponding to the target pixels in the virtual scene are as follows: for target pixels belonging to static objects in the virtual scene, obtain a light map matching the target pixels, and obtain the lighting transfer parameters corresponding to the target pixels from that light map.
  • the static object in the virtual scene refers to the fixed object in the virtual scene, and its position and orientation will not change.
  • a light map refers to a picture that uses a global illumination algorithm to pre-generate lighting information for a static target object in a virtual scene, which is used to represent the lighting visual effect of the static object.
  • a light map is used, in a manner similar to color rendering, to perform lighting rendering processing for the corresponding static objects so as to render the corresponding lighting effects and simulate the influence of lighting on objects in real scenes.
  • the light map includes lighting information corresponding to multiple texels, that is, pixel values or color values.
  • a texel, short for texture element, is the basic unit in the texture space of computer graphics; texels can be mapped to the appropriate output image pixels by texture mapping.
  • the texels in the embodiments of the present application correspond to pixels, that is, the texels in the lightmap correspond to the pixels on the surface of the static object. Through the lighting information of the pixel points, the object can be illuminated and rendered.
  • a light map may correspond to an object surface map corresponding to a static object in a virtual scene.
  • one light map may correspond to one or more static objects in the virtual scene, for example, it may be multiple surface maps of objects on the same plane.
  • the use of lightmap technology can generate lighting information offline for real-time rendering objects, which can improve image quality while ensuring performance.
  • pre-calculated light transfer parameters corresponding to the texels of the static object are stored.
  • the lighting transfer parameters corresponding to the static objects may be pre-calculated based on a PRT technology (Precomputed Radiance Transfer, a rendering technology for pre-calculating radiance transfer), and then the lighting transfer parameters may be stored in the corresponding light map. Therefore, when calculating the indirect illumination of the static object, the terminal can directly obtain the corresponding illumination transfer parameters from the light map to perform illumination calculation processing.
  • the terminal determines the current light source projection coefficient corresponding to the changed light source according to the light source change information after the light source change, and then obtains the pre-calculated illumination transfer parameter corresponding to the target pixel point to Calculate the indirect lighting value of the target pixel in the virtual scene according to the lighting transfer parameters and the current light source projection coefficient.
  • the terminal obtains, from the matching light map, the lighting transfer parameters corresponding to the target pixels. Then, the terminal calculates the indirect illumination value of the target pixels in the virtual scene according to the current light source projection coefficient and the acquired lighting transfer parameters.
  • in this way, when calculating the indirect lighting of static objects, the terminal can obtain the corresponding lighting transfer parameters directly from the light map for lighting calculation, which improves the efficiency of obtaining lighting transfer parameters, so that the indirect lighting of static objects can be calculated accurately and quickly, improving the efficiency of lighting rendering.
  • obtaining the light transfer parameters corresponding to the target pixels in the virtual scene from the light map includes: for target pixels belonging to static objects in the virtual scene, searching the static object's light map, based on the texture mapping relationship, for the texel that matches the target pixel; and obtaining, according to the matched texel, the light transfer parameters corresponding to the target pixel from the light map.
  • texture mapping is the process of mapping texture pixels in texture space to pixels in screen space.
  • Texture mapping relationship that is, the mapping relationship between texels in texture space and pixels in screen space.
  • in the process of calculating the indirect illumination value of a target pixel belonging to a static object in the virtual scene, for each target pixel the terminal first obtains the light map corresponding to the static object, and then, according to the texture mapping relationship, finds the texel matching the target pixel in that light map. The texels of the static object in the light map correspond to the pixels on the surface of the static object.
  • the terminal obtains the pre-stored lighting transfer parameters corresponding to each target pixel from the lightmap according to the matched texels.
  • the terminal then performs dot product processing on the current light source projection coefficient and the illumination transfer parameter corresponding to each target pixel, and then the indirect illumination value of each target pixel in the virtual scene can be calculated.
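  • Under this PRT-style formulation, the per-pixel computation reduces to an inner product; a minimal single-channel sketch (in practice one such product per color channel):

```python
import numpy as np

def indirect_illumination(transfer_vector, light_coeffs):
    """PRT relighting: the indirect illumination value at a target pixel is
    the dot product of its precomputed light transfer vector and the current
    light source projection coefficients."""
    return float(np.dot(transfer_vector, light_coeffs))

# transfer_vector is fetched from the light map texel matched to the target
# pixel through the texture mapping relationship.
```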
  • the lighting transfer parameters of each static object in the virtual scene are pre-baked and stored in the corresponding light map.
  • the steps of acquiring the illumination transfer parameters corresponding to the target pixels in the virtual scene are as follows: for target pixels belonging to dynamic objects in the virtual scene, acquire the illumination probes that match the target pixels, and obtain the light transfer parameters corresponding to the target pixels from the probes.
  • a dynamic object in a virtual scene refers to a non-fixed object in the virtual scene, that is, a moving object that can move in the virtual scene, and at least one of its position and direction can be changed.
  • a light probe is pre-placed in the virtual scene and may be a sphere or a polyhedron, such as a cube.
  • Light probes can capture and use information about the light passing through the empty space of the scene, which can be used to provide high-quality lighting information including indirect reflected light for dynamic objects in virtual scenes, and high-precision detailed lighting for static objects information.
  • light probes store baking information for lighting in a virtual scene, including precomputed lighting transfer parameters.
  • the difference is that lightmaps store lighting information about light hitting the surfaces of static objects in a virtual scene, while light probes store information about light passing through empty spaces in the scene.
  • the terminal obtains a light probe that matches the target pixel and gets the corresponding light transfer parameters from it. Based on the distance between the target pixel and each light probe in the virtual scene, the terminal can select a matching probe: for example, the terminal can calculate the distance between the target pixel and a light probe and, when that distance is less than a distance threshold, determine the probe as the one matching the target pixel. The distance threshold can be preset or set as needed. The terminal then calculates the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the illumination transfer parameters stored in the matching illumination probe.
  • in this way, when calculating the indirect illumination of dynamic objects, the terminal can obtain the corresponding lighting transfer parameters directly from the illumination probes for lighting calculation, which improves the efficiency of obtaining lighting transfer parameters, so that the indirect lighting of dynamic objects can be calculated accurately and quickly, improving the efficiency of lighting rendering.
  • determining the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient includes: updating, according to the current light source projection coefficient and the illumination transfer parameter, the light brightness corresponding to each direction on the light probe; interpolating the updated light brightness corresponding to each light probe; and determining the indirect illumination value of the target pixel in the virtual scene according to the interpolated light brightness and the normal direction of the target pixel.
  • each direction has a corresponding pre-baked light intensity. That is, for each direction on the light probe, the light transfer parameters corresponding to each direction are pre-calculated.
  • the normal is a straight line that is always perpendicular to a plane.
  • a normal is a line in the plane that is perpendicular to the tangent of a curve at a point.
  • a normal has a direction: by convention, the direction from the inside of a solid to the outside is the positive normal direction, and the opposite is the negative normal direction.
  • the normal direction of the target pixel refers to the direction of the normal at the target pixel on the side facing the virtual observation point or camera.
  • in the process of calculating the indirect illumination value of a target pixel belonging to a dynamic object in the virtual scene, for each target pixel the terminal first obtains the illumination probes matching the dynamic object; for example, it can take the light probes within a preset distance of the dynamic object, and then get the light transfer parameters stored in each probe. The terminal then calculates the light brightness in each direction of each illumination probe according to the illumination transfer parameters stored in the probe and the current light source projection coefficient corresponding to each probe surface. Specifically, the calculated light brightness may be an indirect lighting value.
  • the terminal then performs interpolation processing on the brightness of the light corresponding to the lighting probe matched with the dynamic object to obtain the brightness of the light after the interpolation.
  • the terminal then calculates the indirect illumination value corresponding to the target pixel on the dynamic object according to the interpolated light brightness and the normal direction of the target pixel, so that the indirect illumination value of each target pixel in the virtual scene can be accurately calculated.
  • the illumination transfer parameters of each illumination probe in the virtual scene are pre-baked and stored in the corresponding probe. Therefore, when the virtual scene is running, the light transfer parameters are obtained from the light probes, which improves the efficiency of obtaining them; the indirect lighting value of dynamic objects under a changed light source can be calculated accurately, quickly, and in real time, improving the efficiency of lighting rendering.
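  • A minimal sketch of this probe path, with inverse-distance interpolation weights and a cosine-weighted lookup against the pixel normal as illustrative assumptions (the patent does not fix the weighting scheme or data layout):

```python
import numpy as np

def relight_probe(probe_transfer, light_coeffs):
    """Update each directional sample of a probe: rows of probe_transfer are
    transfer vectors for the probe's sample directions."""
    return probe_transfer @ light_coeffs  # brightness per direction

def indirect_from_probes(pixel_pos, normal, probes, light_coeffs):
    """Interpolate relit probes near the pixel, then evaluate the
    interpolated brightness in the pixel's normal direction."""
    weights, radiances = [], []
    for probe in probes:  # probe: dict with 'pos', 'dirs', 'transfer'
        dist = np.linalg.norm(probe["pos"] - pixel_pos)
        weights.append(1.0 / (dist + 1e-6))  # illustrative weighting
        radiances.append(relight_probe(probe["transfer"], light_coeffs))
    radiance = np.average(radiances, axis=0, weights=weights)
    # Cosine-weighted lookup of the sample directions against the normal.
    dirs = probes[0]["dirs"]  # assume probes share sample directions
    cos = np.clip(dirs @ normal, 0.0, None)
    return float((radiance * cos).sum() / (cos.sum() + 1e-6))
```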
  • in some embodiments, before determining the light source change information when the light source in the virtual scene changes, the above lighting rendering method further includes: projecting the original light source onto a plurality of projection basis functions, and using each projected basis function as a corresponding virtual light source; for the pixels of static objects in the virtual scene, determining, based on ray tracing, the lighting transfer parameters corresponding to each virtual light source at those pixels, and storing them in the corresponding light map; and for the light probes in the virtual scene, determining, based on ray tracing, the light transfer parameters corresponding to each virtual light source at the probe, and storing them in the probe.
  • ray tracing, here also referred to as path tracing, is a technique used to render virtual scenes in computer graphics.
  • the principle is to emit a ray from the viewpoint; when the ray intersects the surface of an object, a random direction is selected according to the material properties of the surface and another ray is emitted and traced, and so on, until a ray hits the light source or escapes the scene.
  • the light contribution is then calculated as the color value of the pixel, for example, a Monte Carlo method can be used to calculate the light contribution.
  • the final rendered image can converge to obtain a high-accuracy lighting rendering result.
  • Ray tracing can render high-realistic scene effects, but it also requires relatively high computational overhead.
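  • A minimal sketch of such a path-tracing loop, restricted to diffuse bounces; `scene.trace`, `scene.background`, and `scene.sample_hemisphere` are hypothetical helpers standing in for the scene intersection and sampling machinery:

```python
import numpy as np

def path_trace(origin, direction, scene, rng, max_bounces=4):
    """Trace one path: at each surface hit, pick a random direction based on
    the material and continue until the ray hits a light or escapes."""
    throughput = np.ones(3)
    radiance = np.zeros(3)
    for _ in range(max_bounces):
        hit = scene.trace(origin, direction)       # hypothetical intersector
        if hit is None:                            # ray escaped the scene
            radiance += throughput * scene.background(direction)
            break
        if hit.emission is not None:               # ray reached a light source
            radiance += throughput * hit.emission
            break
        # Diffuse bounce: cosine-weighted direction about the surface normal.
        direction = scene.sample_hemisphere(hit.normal, rng)
        throughput *= hit.albedo                   # brightness attenuation
        origin = hit.position + 1e-4 * hit.normal  # avoid self-intersection
    return radiance  # averaged over many such paths per pixel (Monte Carlo)
```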
  • the radiance of light reflected at an object's surface does not change during optical propagation; that is, the light transfer parameters do not change. Therefore, the illumination transfer parameters of object surfaces in the virtual scene can be pre-baked, effectively reducing the computational overhead of real-time rendering and improving rendering efficiency.
  • the lighting transfer parameters corresponding to the pixels of static objects in the virtual scene and those corresponding to the light probes may also be pre-baked. Specifically, the lighting transfer parameters can be baked based on the basis-function light sources in the virtual scene.
  • a basis-function light source is the virtual light source: after the original light source is projected onto the preset projection basis functions, each projected basis function is used as a virtual light source, and that projected basis function is the basis-function light source.
  • the lighting transfer parameters can be pre-baked by ray tracing according to the projection basis function.
  • the spherical harmonic function can be used as the projection basis function. Since spherical harmonics are rotation-invariant and efficient to project, the light transfer parameters do not change when the light source changes; only the light source projection coefficient changes with the light source. Therefore, using spherical harmonics with hemispherical integration can effectively bake out the light transfer parameters.
  • the terminal can project the original light source onto multiple projection basis functions, and use each projected projection basis function as a corresponding virtual light source, and then use ray tracing to calculate the brightness information of the virtual light source on the surface of each object in the virtual scene.
  • the terminal in the process of pre-computing the light transfer parameters, the terminal only samples the indirect light, that is, only samples the rays whose light has been bounced twice or more.
  • each virtual light source is projected onto the surface of each object in the virtual scene, and each pixel on the surface of each object in the virtual scene receives brightness information corresponding to each virtual light source.
  • the terminal first projects the preset original light source onto a plurality of preset different projection basis functions, for example, can project onto three different projection basis functions.
  • "a plurality" means at least two. Projecting the original light source onto different projection basis functions produces an approximately dynamic light source; each projected basis function is a kind of virtual light source, so each can be used as the corresponding virtual light source.
  • each projected basis function is re-projected into the virtual scene as a virtual light source, and each pixel on the surface of each object in the virtual scene can receive the brightness information of each virtual light source, so that the lighting information corresponding to an approximately dynamic light source can be sampled in the virtual scene.
  • after receiving the brightness information of each virtual light source, each pixel on the surface of each object in the virtual scene emits reflected rays from that pixel. Ray tracing is then performed on the reflected rays to capture their brightness contribution in the virtual scene, so as to determine the illumination transfer parameters corresponding to each pixel on the object surface.
  • the terminal can bake out corresponding illumination transfer parameters for static objects and illumination probes in the virtual scene, respectively. For each pixel point of the static object in the virtual scene, the terminal determines the light transfer parameter corresponding to each pixel point of each virtual light source based on ray tracing, and stores it in the corresponding light map. For the light probes in the virtual scene, the terminal determines the light transfer parameters corresponding to the light probes for each virtual light source based on ray tracing, and stores them in the corresponding light probes.
  • the lighting transfer parameters corresponding to the static objects and the light probes in the virtual scene are pre-calculated and stored in the corresponding lightmaps and light probes respectively, so that when performing lighting rendering processing in real time, it is possible to directly Obtaining the illumination transfer parameters corresponding to the target pixels for indirect illumination calculation can effectively reduce the calculation consumption in the real-time rendering process and effectively improve the effect of illumination rendering.
  • determining, based on ray tracing, the light transfer parameter corresponding to each virtual light source at the pixel, and storing it in the corresponding light map, includes: for each pixel of a static object in the virtual scene, obtaining the light brightness corresponding to the pixel after projection by each virtual light source; taking the pixel as the starting point, emitting rays carrying that light brightness toward the hemisphere that the pixel's normal faces; sampling, in the virtual scene, the reflection brightness and brightness attenuation of each ray after hemispherical reflection; and determining, based on the reflected light brightness and light source brightness corresponding to each ray, the light transfer parameters corresponding to the pixel, and storing them in the corresponding light map.
  • the light brightness after the pixel points are projected by the original light source refers to the light brightness information received by each pixel point on the static object in the virtual scene after the original light source in the virtual scene is projected to the preset projection basis function, and also It is the brightness of the color of the pixel after receiving the illumination of the light source.
  • the spherical harmonic function is an orthonormal basis defined on the unit sphere.
  • the projection basis function can be used to represent the directional distribution on the sphere.
• Each pixel on a static object receives the light brightness corresponding to each virtual light source, which may be the light brightness of each virtual light source after multiple reflections.
• The terminal takes the world-space position corresponding to the center of each pixel as the starting point and, based on the received light brightness, emits rays carrying that brightness towards the hemisphere facing the pixel's normal.
• The terminal then samples in the virtual scene, computes the reflected brightness and brightness attenuation of each ray after hemispherical reflection, and computes the brightness contribution of the ray after multiple bounces from these values. Specifically, when a reflected ray intersects an object in the virtual scene, the brightness contribution of the ray after multiple bounces on the current projection basis function is computed. For each ray, the corresponding brightness contribution is computed and fitted to the corresponding projection basis function, and the illumination transfer parameter on that basis function is computed from the contribution.
• After projecting the original light source onto multiple projection basis functions, for each basis function the terminal samples the brightness contribution of rays after multiple bounces in the virtual scene and computes the illumination transfer parameter on the corresponding basis function. The illumination transfer parameters on the projection basis functions are then fitted to obtain the final illumination transfer parameter of the pixel.
• After computing the illumination transfer parameter of each pixel on the static objects, the terminal stores it at the corresponding texel position in the light map according to the texture mapping relationship, as sketched below.
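The hemisphere baking step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `basis_radiance` stands in for the path-traced multi-bounce radiance of one virtual (basis-function) light source and is an assumed callback, and cosine-weighted sampling is one common choice for the hemisphere rays.

```python
import numpy as np

def cosine_sample_hemisphere(normal, rng):
    # Cosine-weighted direction on the hemisphere around `normal`
    # (pdf = cos(theta) / pi), a common choice for diffuse baking.
    u1, u2 = rng.random(), rng.random()
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    local = np.array([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)])
    # Build an orthonormal basis (t, b, normal) and rotate to world space.
    t = np.cross(normal, [0.0, 1.0, 0.0])
    if np.linalg.norm(t) < 1e-6:
        t = np.cross(normal, [1.0, 0.0, 0.0])
    t /= np.linalg.norm(t)
    b = np.cross(normal, t)
    return local[0] * t + local[1] * b + local[2] * normal

def bake_transfer_vector(position, normal, basis_radiance,
                         n_rays=1024, n_basis=9, rng=None):
    """Estimate one texel's transfer vector: one scalar per virtual
    (basis-function) light source.  `basis_radiance(pos, dir, k)` is an
    assumed callback returning the multi-bounce radiance of virtual
    light source k arriving at `pos` from `dir`, i.e. the path-traced
    quantity described in the text."""
    rng = rng or np.random.default_rng(0)
    transfer = np.zeros(n_basis)
    for _ in range(n_rays):
        d = cosine_sample_hemisphere(np.asarray(normal, float), rng)
        # With cosine-weighted sampling, the cos/pdf factors cancel to pi.
        for k in range(n_basis):
            transfer[k] += basis_radiance(position, d, k) * np.pi
    return transfer / n_rays
```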
• Determining, based on ray tracing, the illumination transfer parameters of each virtual light source corresponding to the light probes and storing them in the probes includes: for a light probe in the virtual scene, obtaining the light brightness corresponding to each face of the probe after projection of each virtual light source; based on that brightness, taking the center point of the probe as the starting point and emitting rays towards a sphere of preset radius centered on that point; sampling, in the virtual scene, the reflected brightness and brightness attenuation of each ray after spherical reflection; and determining the illumination transfer parameters of each probe based on the reflected brightness and attenuation of each ray, then storing them in the probe.
  • Each light probe may be preset with a plurality of corresponding projection basis functions, which may be spherical basis functions in particular.
• When the terminal precomputes the illumination transfer parameters of the light probes in the virtual scene, after each virtual light source is projected in the scene, the faces in the multiple directions of each probe receive the light brightness corresponding to each virtual light source. For the face in each direction, the terminal takes the probe's center point as the starting point and, according to the brightness received by the probe, emits rays towards a sphere of preset radius centered on that point.
• The computation is the same as for the pixels on static objects: after the brightness contribution of each ray is computed, the contribution is fitted to the corresponding projection basis function, and the corresponding illumination transfer parameter is computed from the contribution.
• For each ray and its brightness value, the included angle of the ray is computed from the ray's direction and the direction of each face of the probe. When the angle is less than 90 degrees, the brightness contribution of the ray after multiple bounces on the current projection basis function is computed. For each ray, the contribution is computed and fitted to the projection basis function of the corresponding face, and the illumination transfer parameter on that face's basis function is computed from the contribution.
• The ambient light cube includes six faces, namely the x, -x, y, -y, z, and -z faces. Each face corresponds to a projection basis function, and each basis function stores a corresponding illumination transfer parameter, which may specifically be a light transfer vector (Transfer Vector). That is, the x face corresponds to the transfer vector (x), the -x face to (-x), the y face to (y), the -y face to (-y), the z face to (z), and the -z face to (-z).
• After computing the illumination transfer parameter of each face of the light probes in the virtual scene, the terminal stores the parameters on the probes, as sketched below.
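A minimal sketch of the probe-side baking described above, under the same assumptions as the light-map sketch: rays are emitted uniformly over a sphere around the probe center, and a ray is accumulated into a face's transfer parameters only when its angle to the face direction is below 90 degrees. `basis_radiance` is again an assumed path-tracing callback.

```python
import numpy as np

# The six ambient-cube face directions: +x, -x, +y, -y, +z, -z.
FACE_DIRS = np.array([[ 1, 0, 0], [-1, 0, 0],
                      [ 0, 1, 0], [ 0, -1, 0],
                      [ 0, 0, 1], [ 0, 0, -1]], dtype=float)

def bake_probe_transfer(center, basis_radiance,
                        n_rays=2048, n_basis=9, rng=None):
    """Accumulate per-face transfer parameters for one light probe."""
    rng = rng or np.random.default_rng(0)
    transfer = np.zeros((6, n_basis))
    for _ in range(n_rays):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)        # uniform direction on the sphere
        weights = FACE_DIRS @ d       # cos(angle) to each face
        for k in range(n_basis):
            radiance = basis_radiance(center, d, k)
            # Accumulate only into faces the ray points towards (< 90 deg).
            transfer[:, k] += np.maximum(weights, 0.0) * radiance
    # Normalise by the uniform-sphere pdf (1 / 4*pi) and the ray count.
    return transfer * (4.0 * np.pi / n_rays)
```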
• When the virtual scene is a virtual game scene and the light source in it is a distant light source, determining the light source change information includes: monitoring the light source direction of the distant light source while the virtual game scene is running; and, when the direction changes, determining the light source change information according to the changed direction and the initial direction of the distant light source.
  • a virtual game scene refers to a virtual scene deployed in a game application, and may specifically be a three-dimensional virtual scene.
  • the distant light source may specifically refer to a light source with a far range displayed in the picture, that is, a light source far away from an object in a virtual scene, such as sunlight or sky light in a virtual game scene.
• Sky light in a game scene is usually computed by simulating atmospheric scattering to obtain the scattered light in the environment, which can render the sky and scenery under different sun positions and atmospheric characteristics.
• The final color value is computed by an integral formula; since the integration interval is continuous, an approximate result can be obtained by computing a cumulative sum with discrete sampling.
  • a game application is deployed in the terminal, and the game application includes a virtual game scene.
  • the terminal monitors the light source direction of the distant light source in the virtual game scene in real time.
• When the direction changes, the light source change information is determined according to the changed light source direction and the initial direction of the distant light source. For example, when the light source rotates, the rotation angle is determined from the rotation direction. The terminal then computes the rotation matrix corresponding to the spherical harmonics from the rotation angle, updates the light source projection coefficients in the virtual game scene in real time according to the rotation matrix and the initial light source projection coefficients, and thereby computes the current light source projection coefficient corresponding to the target pixels to be displayed. Lighting rendering is then performed on the target pixels according to the current light source projection coefficient computed in real time, as the sketch below illustrates.
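The runtime coefficient update can be illustrated as follows. This is a hedged sketch, not the patent's code: it rotates only the first two spherical-harmonic bands (the l = 0 coefficient is rotation-invariant, and the three l = 1 coefficients transform like a vector); a full SH rotation uses one rotation block per band.

```python
import numpy as np

def rotation_z(angle):
    # 3x3 rotation about the z axis by `angle` radians.
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rotate_sh_l1(coeffs, rot):
    """Rotate 4 real-SH coefficients (bands l = 0 and l = 1).
    In the usual real-SH ordering the band-1 coefficients correspond
    to (y, z, x), so they transform with the 3x3 rotation matrix after
    a component permutation; band 0 is unchanged."""
    out = np.empty(4)
    out[0] = coeffs[0]                               # l = 0: invariant
    v = np.array([coeffs[3], coeffs[1], coeffs[2]])  # to (x, y, z) order
    rv = rot @ v
    out[1], out[2], out[3] = rv[1], rv[2], rv[0]     # back to (y, z, x)
    return out

# Runtime update when the distant light rotates by 30 degrees about z:
initial = np.array([1.0, 0.2, 0.8, -0.1])  # baked initial coefficients
current = rotate_sh_l1(initial, rotation_z(np.pi / 6.0))
```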
• FIG. 6 shows a scene picture 6a in some embodiments.
• The distant light source in the scene picture is sunlight.
• The virtual scene has a house on the left and two walls on the right, red and green, which reflect corresponding indirect illumination in the virtual scene.
• When the distant light source changes, for example when the sunlight in the virtual scene rotates, the light received by each object in the virtual scene also changes.
• FIG. 7 shows the scene picture 7a after the sunlight in the virtual scene of FIG. 6 has rotated.
• When the walls receive brighter light, the corresponding reflected light on the left house wall is also brighter.
• The direct and indirect illumination values in the virtual game scene are computed dynamically, and real-time dynamic lighting rendering of the scene follows the dynamic changes of the light source, presenting an effect close to the real natural environment and effectively improving the lighting rendering effect under dynamic light sources in the virtual scene.
• A specific lighting rendering method includes the following steps:
  • S804 Determine a lighting transformation matrix according to the light source change information; determine a current light source projection coefficient corresponding to the changed light source according to the lighting transformation matrix and the initial light source projection coefficient before the light source change in the virtual scene.
  • S810 Determine the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the illumination transfer parameter corresponding to the target pixel.
  • S816 Interpolate the updated light intensity corresponding to each illumination probe; determine the indirect illumination value of the target pixel in the virtual scene according to the interpolated light intensity and the normal direction of the target pixel.
• When the terminal detects that the light source in the virtual scene has changed, the light source projection coefficient corresponding to the changed light source also changes.
  • the terminal determines the illumination transformation matrix according to the light source change information. For example, the rotation matrix of the light source can be determined according to the angle of the light source transformation.
  • the terminal then updates the current light source projection coefficient corresponding to the current light source in the virtual scene according to the illumination transformation matrix and the initial light source projection coefficient before the light source changes.
• Lighting in the virtual scene is a linear system: the light source in the virtual scene can be represented by a function, and the lighting in the virtual scene is computed by projecting the light source function onto the light source projection basis functions.
• The lighting linear system can be expressed as: Lighting = a·f(x) + b·f(y) + c·f(z),
• where x, y, and z are the light source projection basis functions in the virtual scene, and a, b, and c are the light source projection coefficients of the light source projected onto those basis functions.
• A distant light source can be represented by a CubeMap (cube map) in the virtual scene, so that it can be projected onto the basis functions when performing lighting calculations.
• The projection expression of the distant light source can be: CubeMap ≈ a·sh0 + b·sh1 + c·sh2, where sh0, sh1, and sh2 are basis functions corresponding to x, y, and z, respectively.
• The lighting of the virtual scene obtained this way is an approximation, which is suitable for smooth lighting such as indirect light.
• When the light source changes, the basis functions do not change; only the corresponding light source projection coefficients a, b, and c change. Therefore f(x), f(y), and f(z) in the above formula are unchanged.
• The unchanged part can be computed by pre-baking, i.e. the corresponding light transfer parameter (Transfer Vector), while the light source projection coefficients a, b, and c need to be computed in real time according to the current light source in the virtual scene, as sketched below.
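At runtime, the indirect term then reduces to a dot product between the baked transfer vector and the per-frame coefficients. A minimal sketch (the coefficient values are made up for illustration):

```python
import numpy as np

def indirect_illumination(transfer_vector, light_coeffs):
    # Runtime indirect term: baked transfer vector (from the light map
    # or light probe) dotted with the current light source projection
    # coefficients updated for this frame.
    return float(np.dot(transfer_vector, light_coeffs))

# Example with a 9-coefficient (third-order SH) transfer vector:
t = np.array([0.8, 0.1, -0.05, 0.02, 0.0, 0.01, -0.02, 0.0, 0.03])
coeffs = np.array([1.0, 0.3, 0.5, -0.2, 0.05, 0.0, 0.1, -0.05, 0.02])
print(indirect_illumination(t, coeffs))
```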
• Spherical harmonic basis functions of any order can be used to fit the light source according to application requirements.
• The more orders used, the more accurate the fit of complex light sources, but also the more computing resources consumed.
• When only indirect lighting based on the PRT technology is considered, a third-order spherical harmonic basis (i.e. 9 basis functions) or a fourth-order spherical harmonic basis (i.e. 16 basis functions) can be used, which meets the fitting requirements of the distant light source.
  • the scene picture 92 includes a distant light source 92a.
• The projection schematic diagram 10a is obtained after projecting the distant light source 92a in FIG. 9. It can be seen that the brightness of the light source and of the area around it is relatively high.
• In the case of day-and-night change, the distant light source in the virtual scene rotates around the scene.
• The choice of projection basis function affects both the baking of the illumination transfer parameters and the projection of the light source. Since the light source projection is an approximation of the distant light source, if the projection basis functions used are not rotation-invariant, the coefficients will jitter as the distant light source rotates, causing the scene illumination to jitter.
• Spherical harmonics are used as the light source projection basis functions. Because spherical harmonics are rotation-invariant and have high projection performance, the lighting information in the virtual scene can be computed more accurately, which effectively improves the final lighting rendering effect and runtime performance.
• The terminal may project the original light source onto the projection basis functions in advance according to the state information of the preset original light source, and compute the initial projection coefficients based on the Monte Carlo method. Then, while rendering the virtual scene, when the light source changes, for example rotates, the terminal computes the light rotation matrix from the rotation angle and only needs to perform one matrix-vector product with the initial projection coefficients to update the current light source projection coefficient after the change. Compared with recomputing the light source projection coefficients by a large number of samples and re-projection each time, the method of this embodiment effectively saves computing resources.
• Before rendering the target pixels to be rendered in the virtual scene, the terminal needs to pre-bake the illumination transfer parameters corresponding to each pixel of the static objects in the virtual scene and those corresponding to the light probes.
• The light transfer parameters can be baked by path tracing based on the PRT method.
• The illumination contribution of the entire distant light source to the scene is computed, but only the brightness information of the indirect lighting in the virtual scene is considered; that is, only the low-frequency indirect lighting is accumulated into the light transfer parameter (Transfer Vector).
• Using a method based on path integration, the brightness of indirect lighting in the virtual scene can be written as:
Lo = Le + Σn P(pn)·T(pn)
where Lo is the brightness of light emitted from a position on an object in a certain direction, Le is the brightness of the light source or a self-illuminating object, P(pn) is the brightness contribution of the light after n bounces, and T(pn) is the attenuation of the light brightness after n bounces.
• Each projection basis function computes its corresponding brightness contribution according to the above steps.
• The brightness information of multiple projection basis functions can share the same ray bounce path and be packed together for computation, so the illumination transfer parameters corresponding to the indirect lighting in the virtual scene can be baked more quickly, as sketched below.
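One way to share a bounce path across basis functions, as suggested above, is to accumulate all per-basis contributions while walking a single traced path. The record format and the `basis_emission` callback below are assumptions for illustration, not the patent's data layout:

```python
import numpy as np

def trace_basis_contributions(ray_path, basis_emission, n_basis=9):
    """Accumulate the indirect-lighting contribution of one bounce path
    for all projection basis functions at once, sharing the path instead
    of re-tracing per basis function.  `ray_path` is an assumed list of
    (throughput, hit_point, direction) records produced by a path
    tracer; `basis_emission(point, direction, k)` is the assumed emitted
    radiance of virtual light source k at a hit point."""
    contrib = np.zeros(n_basis)
    for throughput, point, direction in ray_path:
        for k in range(n_basis):
            # T(p_n) is the accumulated throughput (attenuation) and
            # P(p_n) the per-basis brightness picked up at this bounce.
            contrib[k] += throughput * basis_emission(point, direction, k)
    return contrib
```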
• The lighting baking data of each static object, i.e. the illumination transfer parameters, are stored in the light map corresponding to each static object.
• Each texel in the light map stores corresponding brightness information, i.e. each texel stores corresponding light transfer parameter information.
• Each pixel on the surfaces of the static objects in the virtual scene receives corresponding brightness information. Then, for each pixel on a static object, which corresponds to a texel in the light map, a ray is emitted from the world-space position corresponding to the center of the texel towards the hemisphere facing the normal. The scene is then sampled to compute the brightness attenuation of the ray over multiple bounces. When a ray intersects an object in the scene, the brightness contribution of each ray on the current basis function is computed; the contributions are then fitted, and the corresponding light transfer parameter on each basis function is computed from the fitted brightness contribution.
• The expression for fitting the light transfer parameters by emitting rays over the hemisphere can be written as:
T = ∫Ω L(w)·cos(θ) dw
where T is the light transfer parameter, L(w) is the brightness of the light incident from direction w, and θ is the angle between the incident direction and the normal.
• An Ambient Cube can be used to represent the faces in multiple directions on the light probe.
• Each face corresponds to a projection basis function, and a light transfer parameter is stored on the basis function corresponding to each face.
• The baking data for dynamic objects in the virtual scene is stored in the light probes, and the faces of a light probe store the incident light information from all directions.
• The baking of a light probe takes the position of the probe as the starting point and emits rays uniformly over the entire sphere.
• The computation is the same as for the light map: the contribution of each ray on each basis function is accumulated and then fitted to the corresponding basis function.
• The included angle is computed from the ray's direction and the direction of each face; when the angle is less than 90 degrees, the contribution is accumulated into the illumination transfer parameters of the corresponding face.
• When the terminal renders the virtual scene, for example when running an application including the virtual scene, it needs to update the light source projection coefficient corresponding to the changed light source according to the real-time light source information.
• Taking a distant light source such as sunlight as an example, the illumination transfer parameters are mapped onto the surfaces of the static objects according to the texture mapping relationship.
• The projection of the distant light source onto the basis functions can be computed by the Monte Carlo method: starting from the center of the world coordinate system in the virtual scene, rays are emitted uniformly in all directions, and the coefficients on the projection basis functions are solved.
• The expression for solving the light source projection coefficient on a projection basis function can be written as:
a = ∫S f(w)·a(w) dw ≈ (1/N) Σi f(wi)·a(wi) / pdf(wi)
where a is the spherical harmonic coefficient to be solved for projection onto a given basis function, i.e. the light source projection coefficient; S is the entire sphere; w is the solid angle; f(w) is the brightness value in the direction of the solid angle; a(w) is the corresponding spherical harmonic basis function; wi is the currently sampled solid-angle parameter; and pdf is the corresponding weight.
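The Monte Carlo estimate above can be sketched as follows; `env_radiance` (e.g. a cube-map lookup) and `basis_fn` are assumed callbacks, and uniform sphere sampling gives pdf = 1/(4π):

```python
import numpy as np

def project_to_basis(env_radiance, basis_fn, n_samples=4096, rng=None):
    """Monte Carlo estimate of one light source projection coefficient:
    a = integral over the sphere of f(w) * a(w) dw, approximated as
    (1/N) * sum of f(w_i) * a(w_i) / pdf(w_i) with uniform sphere
    sampling."""
    rng = rng or np.random.default_rng(0)
    pdf = 1.0 / (4.0 * np.pi)
    total = 0.0
    for _ in range(n_samples):
        w = rng.normal(size=3)
        w /= np.linalg.norm(w)   # uniform direction on the sphere
        total += env_radiance(w) * basis_fn(w) / pdf
    return total / n_samples
```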
• For spherical harmonic basis functions, the basis functions can first be projected based on a default direction, and the lighting transformation matrix can then be computed when the light source changes. For example, when the light source rotates, the rotation matrix corresponding to the spherical harmonics is computed, and the rotated coefficients are updated from the default initial projection coefficients and the rotation matrix, so the light source projection coefficients can be updated more quickly.
• The rotation matrix of the spherical harmonics is then applied to the projection coefficients as a matrix-vector product.
• For a dynamic object in the virtual scene, multiple light probes around the dynamic object are obtained, along with the light transfer parameters stored for the multiple faces (specifically, six faces) of each probe. Then, according to the computed current light source projection coefficient and the illumination transfer parameters of each face, the light brightness of each face is updated; that is, the indirect illumination value corresponding to each face of each probe is computed. The light brightness of the surrounding probes is then linearly interpolated, and the final brightness of the target pixel in its direction is computed from the interpolated brightness and the normal direction of the target pixel on the dynamic object, yielding the indirect illumination value of the target pixel.
• The indirect illumination value of a dynamic object can also be computed from the light probes using spherical harmonics. Similar to the Ambient Cube approach, the projection coefficient on each basis function in the probe is first computed in real time from the precomputed light transfer parameters stored in the probe and the current light source projection coefficients, giving a spherical harmonic representation of the probe's brightness. Then, according to the normal direction of the target pixel on the dynamic object, the diffuse spherical harmonic conversion parameters corresponding to the target pixel are computed and dotted with the harmonic coefficients, so the final indirect illumination value of the dynamic object's target pixel can be computed accurately.
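The ambient-cube variant of this dynamic-object shading can be sketched as below. The probe weights, the face ordering (+x, -x, +y, -y, +z, -z), and the squared-normal blend are the conventional ambient-cube lookup, used here as an assumption rather than the patent's exact formula:

```python
import numpy as np

def shade_dynamic_pixel(probe_face_radiance, probe_weights, normal):
    """Indirect lighting of one dynamic-object pixel from surrounding
    probes.  `probe_face_radiance` has shape (n_probes, 6) and holds
    each probe's already-updated face brightness (transfer parameters
    dotted with the current light coefficients); `probe_weights` are
    the linear interpolation weights; `normal` is the pixel normal."""
    # Interpolate the six per-face radiances across the nearby probes.
    faces = np.tensordot(probe_weights, probe_face_radiance, axes=1)  # (6,)
    n = np.asarray(normal, float)
    n2 = n * n  # squared-normal blend weights per axis
    # Pick the +/- face per axis depending on the normal's sign.
    pos = np.where(n >= 0.0, [faces[0], faces[2], faces[4]],
                             [faces[1], faces[3], faces[5]])
    return float(np.dot(n2, pos))
```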
• By splitting direct and indirect illumination and adopting a PRT-based approach, the direct and indirect illumination values of the target pixels in the virtual scene are computed dynamically and separately.
• A PRT-based method is used to compute the lighting rendering effect diagrams of direct lighting and indirect lighting respectively.
• FIG. 11 b in FIG. 11 is a lighting rendering effect diagram in which direct and indirect lighting are computed together based on the PRT method.
• FIG. 11 c in FIG. 11 is a lighting rendering effect diagram in which only indirect lighting is computed with the PRT-based method. It can be seen that by splitting direct and indirect lighting, the overall indirect lighting in the virtual scene can be updated dynamically, while the high-frequency direct lighting and shadow information is also more accurate, so the overall lighting effect in the virtual scene is better.
  • the present application also provides an application scenario, where the application scenario is an open world type three-dimensional game scenario, and the above-mentioned lighting rendering method is applied to the game scenario.
  • the 3D game scene of the open world type includes a virtual scene, which includes static objects and dynamic objects and light probes, and also includes distant light sources that simulate the real world, such as sunlight and sky light.
  • the lighting transfer parameters corresponding to each static object and the light probe in the virtual scene can be pre-baked, and stored in the corresponding light map and light probe respectively.
  • a game application corresponding to a three-dimensional game scene is deployed in the terminal.
  • the light source state of the distant light source in the three-dimensional game scene is detected in real time.
  • the light source change information is determined.
  • the current light source projection coefficient corresponding to the changed distant view light source is determined according to the light source change information.
• For the target pixels to be displayed in the virtual scene that belong to static objects, the precomputed matching illumination transfer parameters are obtained from the corresponding light map, and the indirect illumination value of the target pixel in the virtual scene is computed according to the current light source projection coefficient and the illumination transfer parameters.
• For target pixels belonging to dynamic objects, the precomputed matching illumination transfer parameters are obtained from the corresponding light probes, and the indirect illumination value of the target pixel in the virtual scene is computed according to the current light source projection coefficient and the illumination transfer parameters.
  • the terminal also calculates the direct illumination value corresponding to the target pixel under the changed light source according to the light source change information, so that the dynamic direct illumination value and indirect illumination value in the virtual scene can be calculated in real time.
  • the terminal then performs illumination rendering on the target pixel point according to the direct illumination value and the indirect illumination value. In this way, the lighting rendering processing of the real-time changing distant light source in the game scene can be accurately and efficiently performed, thereby effectively improving the lighting rendering effect under the dynamic distant light source in the virtual scene.
  • the present application further provides an application scenario, where the application scenario is a three-dimensional environment demonstration scenario, such as a tourism environment demonstration scenario, a building demonstration scenario, and a virtual training scenario.
  • the 3D environment demonstration scene for various environments includes a preset virtual scene, and the virtual scene includes static objects, dynamic objects and light probes.
  • the terminal may be preloaded with scene data corresponding to the three-dimensional environment demonstration scene, and may also acquire the required scene data from the server in real time.
  • the state of the light source in the virtual scene is detected in real time.
  • the current light source projection coefficient corresponding to the target pixel to be displayed in the virtual scene is updated in real time according to the light source change information.
• For target pixels belonging to static objects in the virtual scene, the precomputed matching illumination transfer parameters are obtained from the corresponding light map, and the indirect illumination value of the target pixel in the virtual scene is computed according to the current light source projection coefficient and the illumination transfer parameters.
• For target pixels belonging to dynamic objects, the precomputed matching illumination transfer parameters are obtained from the corresponding light probes, and the indirect illumination value of the target pixel in the virtual scene is computed according to the current light source projection coefficient and the illumination transfer parameters.
  • the terminal also calculates the direct illumination value corresponding to the target pixel under the changed light source according to the light source change information, so that the dynamic direct illumination value and indirect illumination value in the virtual scene can be calculated in real time.
  • the terminal then performs illumination rendering on the target pixel point according to the direct illumination value and the indirect illumination value. This can not only effectively save the computing resources of indirect lighting, but also accurately perform lighting rendering processing on the real-time changing light sources in the virtual scene, effectively improving the lighting rendering effect under the dynamic light source in the virtual scene.
• An apparatus for lighting rendering 1200 may be implemented as a software module or a hardware module, or a combination of the two, forming part of a computer device.
• The apparatus specifically includes: a light source determination module 1202, a projection coefficient update module 1204, an indirect illumination determination module 1206, a direct illumination determination module 1208, and an illumination rendering module 1210, wherein:
  • the light source determination module 1202 is configured to determine light source change information when the light source in the virtual scene changes.
  • the projection coefficient updating module 1204 is configured to determine the current light source projection coefficient corresponding to the changed light source according to the light source change information.
  • the indirect illumination determination module 1206 is configured to determine the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel in the virtual scene and the current light source projection coefficient.
  • the direct illumination determination module 1208 is configured to determine the direct illumination value corresponding to the target pixel point under the changed light source.
  • the illumination rendering module 1210 is configured to perform illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.
• The projection coefficient update module 1204 is further configured to determine the lighting transformation matrix according to the light source change information, determine the initial light source projection coefficient before the light source change in the virtual scene, and determine the current light source projection coefficient corresponding to the changed light source according to the lighting transformation matrix and the initial light source projection coefficient.
• The light source change information is the change information of the changed light source compared with the original light source preset in the virtual scene; the projection coefficient update module 1204 is further configured to obtain the initial light source projection coefficient obtained by projecting the original light source onto the projection basis functions, and to update the initial light source projection coefficient according to the lighting transformation matrix to obtain the current light source projection coefficient corresponding to the changed light source.
• The light source change information is the change information of the changed light source compared with the historical light source before the change in the virtual scene; the projection coefficient update module 1204 is further configured to obtain the historical light source projection coefficient corresponding to the historical light source, and to update it according to the lighting transformation matrix to obtain the current light source projection coefficient corresponding to the changed light source.
• The lighting rendering apparatus 1200 further includes a first transfer parameter obtaining module, configured to obtain, for target pixels belonging to static objects in the virtual scene, a light map matching the target pixels, and to obtain the illumination transfer parameters corresponding to the target pixels from the light map.
  • the first transfer parameter acquisition module is further configured to, for target pixels belonging to static objects in the virtual scene, search for texels that match the target pixels in the lightmap of the static object based on the texture mapping relationship; According to the matching texel, the light transfer parameters corresponding to the target pixel are obtained from the lightmap.
• The lighting rendering apparatus 1200 further includes a second transfer parameter obtaining module, configured to obtain, for target pixels belonging to dynamic objects in the virtual scene, a light probe matching the target pixel, and to obtain the illumination transfer parameter corresponding to the target pixel from the light probe.
  • the indirect illumination determination module 1206 is further configured to, for the target pixels belonging to the dynamic objects in the virtual scene, determine the illumination probes that match the target pixels of the dynamic objects; according to the current light source projection coefficient and illumination transfer parameters , update the light intensity corresponding to each direction on the light probe; interpolate the updated light intensity corresponding to each light probe; determine the target pixel in the virtual scene according to the interpolated light intensity and the normal direction of the target pixel The indirect lighting value in .
• The above lighting rendering apparatus further includes an illumination transfer parameter baking module, configured to project the original light source onto multiple projection basis functions and use each projected basis function as a corresponding virtual light source; for each pixel of the static objects in the virtual scene, determine, based on ray tracing, the illumination transfer parameters of each virtual light source at the pixel and store them in the corresponding light map; and, for the light probes in the virtual scene, determine, based on ray tracing, the illumination transfer parameters of each virtual light source corresponding to the probes and store them in the probes.
• The illumination transfer parameter baking module is further configured to: for each pixel of a static object in the virtual scene, obtain the light brightness corresponding to the pixel after projection of each virtual light source; emit rays carrying that brightness towards the hemisphere pointed to by the pixel's normal; sample, in the virtual scene, the reflected brightness and brightness attenuation of each ray after hemispherical reflection; and determine the illumination transfer parameter of the pixel based on the reflected brightness and light source brightness of each ray, storing it in the corresponding light map.
• The illumination transfer parameter baking module is further configured to: for a light probe in the virtual scene, obtain the light brightness corresponding to each face of the probe after projection of each virtual light source; taking the probe's center point as the starting point, emit rays towards a sphere of preset radius centered on that point; sample, in the virtual scene, the reflected brightness and brightness attenuation of each ray after spherical reflection; and determine the illumination transfer parameters of each probe based on the reflected brightness and attenuation of each ray, storing them in the probe.
• The virtual scene is a virtual game scene, and the light source in it is a distant light source. The light source determination module 1202 is further configured to monitor the light source direction of the distant light source while the virtual game scene is running, and, when the direction changes, determine the light source change information according to the changed direction and the initial direction of the distant light source.
  • Each module in the above-mentioned lighting rendering apparatus may be implemented in whole or in part by software, hardware and combinations thereof.
  • the above modules can be embedded in or independent of the processor in the computer device in the form of hardware, or stored in the memory in the computer device in the form of software, so that the processor can call and execute the operations corresponding to the above modules.
  • a computer device is provided, and the computer device may be a terminal, and its internal structure diagram may be as shown in FIG. 13 .
  • the computer equipment includes a processor, memory, a communication interface, a display screen, and an input device connected by a system bus.
  • the processor of the computer device is used to provide computing and control capabilities.
• The memory of the computer device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and computer-readable instructions.
  • the internal memory provides an environment for the execution of the operating system and computer-readable instructions in the non-volatile storage medium.
  • the communication interface of the computer device is used for wired or wireless communication with an external terminal, and the wireless communication can be realized by WIFI, operator network, NFC (Near Field Communication) or other technologies.
  • the computer-readable instructions when executed by a processor, implement a lighting rendering method.
• The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a button, trackball, or touchpad on the housing of the computer device, or an external keyboard, trackpad, or mouse.
• FIG. 13 is only a block diagram of part of the structure related to the solution of this application and does not constitute a limitation on the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
• A computer device is provided, including a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the processor(s), cause the one or more processors to implement the steps in the above method embodiments.
• One or more non-transitory readable storage media are also provided, storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to implement the steps in the foregoing method embodiments.
  • a computer program product including computer-readable instructions, which, when executed by a processor, implement the steps in the foregoing method embodiments.
  • Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory, or optical memory, and the like.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • the RAM may be in various forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).


Abstract

This application relates to a lighting rendering method and apparatus, a computer device, and a storage medium. The method includes: when the light source in a virtual scene changes, determining light source change information; determining, according to the light source change information, the current light source projection coefficient corresponding to the changed light source; determining the indirect illumination value of a target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient; determining the direct illumination value of the target pixel under the changed light source; and performing illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.

Description

Lighting rendering method and apparatus, computer device, and storage medium

This application claims priority to Chinese patent application No. 202110359533.9, filed with the Chinese Patent Office on April 2, 2021 and entitled "Lighting rendering method and apparatus, computer device, and storage medium", the entire contents of which are incorporated herein by reference.

Technical Field

This application relates to the field of image processing technology, and in particular to a lighting rendering method and apparatus, a computer device, and a storage medium.

Background

In image rendering, realistic scene lighting requires global illumination to be specified in the renderer. Global illumination expresses the combined effect of direct and indirect illumination, and can be implemented by, for example, ray tracing, ambient occlusion, or light probes. After rays are emitted from a light source and hit obstacles, they undergo several reflections and refractions, so that the surfaces and corners of objects in the virtual scene all receive light.

However, existing rendering methods are not suitable for dynamic lighting scenes, resulting in poor lighting rendering effects under dynamic lighting.
Summary

According to various embodiments provided in this application, a lighting rendering method and apparatus, a computer device, a storage medium, and a computer program product are provided.

A lighting rendering method, executed by a computer device, the method including:

when a light source in a virtual scene changes, determining light source change information;

determining, according to the light source change information, a current light source projection coefficient corresponding to the changed light source;

determining an indirect illumination value of a target pixel in the virtual scene according to an illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient;

determining a direct illumination value of the target pixel in the virtual scene under the changed light source; and

performing illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.
A lighting rendering apparatus, the apparatus including:

a light source determination module, configured to determine light source change information when a light source in a virtual scene changes;

a projection coefficient update module, configured to determine, according to the light source change information, a current light source projection coefficient corresponding to the changed light source;

an indirect illumination determination module, configured to determine an indirect illumination value of a target pixel in the virtual scene according to an illumination transfer parameter corresponding to the target pixel in the virtual scene and the current light source projection coefficient;

a direct illumination determination module, configured to determine a direct illumination value of the target pixel under the changed light source; and

an illumination rendering module, configured to perform illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.
A computer device, including a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the processor(s), cause the one or more processors to perform the steps of the above lighting rendering method.

One or more non-volatile readable storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to implement the steps of the above lighting rendering method.

A computer program product, including computer-readable instructions that, when executed by a processor, implement the steps of the above lighting rendering method.

Details of one or more embodiments of this application are set forth in the accompanying drawings and the description below. Other features, objectives, and advantages of this application will become apparent from the specification, the drawings, and the claims.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
FIG. 1 is a diagram of an application environment of a lighting rendering method in some embodiments;

FIG. 2 is a schematic flowchart of a lighting rendering method in some embodiments;

FIG. 3 is a schematic diagram of the lighting effect of light sources in some embodiments;

FIG. 4 is a schematic diagram of the lighting effect of light sources in some embodiments;

FIG. 5 is a schematic diagram of the illumination transfer parameters of each face of a light probe in some embodiments;

FIG. 6 is a schematic diagram of a scene picture in some embodiments;

FIG. 7 is a schematic diagram of a scene picture after projection in some embodiments;

FIG. 8 is a schematic flowchart of a lighting rendering method in some specific embodiments;

FIG. 9 is a schematic diagram of the rendering effect of a scene picture in some embodiments;

FIG. 10 is a schematic diagram of the rendering effect of a scene picture in some embodiments;

FIG. 11 is a schematic diagram of the effects of three different ways of lighting rendering in some embodiments;

FIG. 12 is a structural block diagram of a lighting rendering apparatus in some embodiments;

FIG. 13 is an internal structure diagram of a computer device in some embodiments.
Detailed Description

To make the objectives, technical solutions, and advantages of this application clearer, the following further describes this application in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain this application, and are not intended to limit this application.
The image processing method in a three-dimensional scene provided in this application can be applied to a computer device. The computer device may be a terminal or a server. The method may be applied to a terminal, to a server, or to a system including a terminal and a server and implemented through interaction between the terminal and the server.

The lighting rendering method provided in this application can be applied to the application environment shown in FIG. 1, in which a terminal 102 communicates with a server 104 over a network. Specifically, the terminal 102 can obtain scene data corresponding to a virtual scene from the server 104; the scene data includes lighting data such as the light source, light source change information, and light source projection coefficients. While running the virtual scene, when the light source in the virtual scene changes, the terminal 102 determines the light source change information. The terminal 102 then determines, according to the light source change information, the current light source projection coefficient corresponding to the changed light source; determines the indirect illumination value of a target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient; determines the direct illumination value of the target pixel under the changed light source; and performs illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.

The terminal 102 may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, or a smart watch. The server 104 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The terminal 102 and the server 104 may be directly or indirectly connected in a wired or wireless manner, which is not limited in this application.

The lighting rendering method of this application may be a method for performing image processing, based on computer vision technology, on the target pixels of an image to be rendered in a virtual scene, thereby effectively improving the lighting rendering effect in dynamic lighting scenes.

In some embodiments, as shown in FIG. 2, a lighting rendering method is provided. The method is described by using an example in which it is applied to a computer device, which may specifically be the terminal or the server in FIG. 1; the method may also be applied to a system including a terminal and a server and implemented through their interaction. In this embodiment, the method includes the following steps:
S202: When the light source in the virtual scene changes, determine light source change information.

A virtual scene is a digital scene outlined by a computer through digital communication technology, and includes two-dimensional and three-dimensional virtual scenes; virtualization technology can realistically simulate information such as the various material forms and spatial relationships of the real world. A three-dimensional virtual scene can display the forms of objects more attractively and show a virtual reality world more intuitively. For example, objects in a three-dimensional virtual scene may include at least one of terrain, houses, trees, characters, and so on. Virtual scenes are used more and more widely, for example in game development and video production.

A virtual scene can be displayed through computer three-dimensional graphics, presenting a three-dimensional simulated environment on the screen, and all objects in the virtual scene can be described by three-dimensional scene data. For example, three-dimensional scene data can be loaded into a three-dimensional scene to display a three-dimensional simulated environment. The three-dimensional scene data includes at least one of model data, texture data, lighting data, vector data, terrain data, voxel data, and so on, and the lighting data includes at least one of the light source, light source change information, and so on.

The terminal can perform a series of rendering processes on the scene pictures in the virtual scene, so that the picture content is displayed on the screen in the form of two-dimensional images. In computer graphics, rendering is the process of projecting the object models in a three-dimensional scene into a two-dimensional digital image according to the set environment, materials, lighting, and rendering parameters. Lighting rendering is the process of converting three-dimensional light-energy transfer into a two-dimensional image. The virtual scene and the entities in the environment are represented in three-dimensional form, which is closer to the real world and easy to manipulate and transform. Through a series of rendering processes such as vertex rendering, shading, and lighting rendering on the pixels to be displayed in the virtual scene, the presentation of the virtual scene becomes more refined and attractive.

A light source is an object that can emit light by itself and is emitting light; light sources include but are not limited to at least one of the sun, electric lights, burning substances, and so on. A light source in a virtual scene is a set of lighting data that can realistically model the lighting effect of a real light source.

A light source in a virtual scene expresses the lighting effect of objects by performing lighting calculations on the light-dark and color distribution of illuminated object surfaces, and by creating light-dark and color contrast between the illuminated objects and the surrounding environment. Unlike real light sources, a light source in a virtual scene can be a virtual light source node without shape or outline.

For example, the lighting effect of an illuminated object can be calculated according to the position of the light source in the virtual scene, while the viewer (observer) cannot see the light source itself. Similar to real light sources, when a colored light source in a virtual scene is projected onto a colored object surface, the finally rendered color depends on the reflection and absorption of light. In addition, the shadow effect of objects illuminated by the light source in the virtual scene can be constructed based on the three-dimensional geometric models pre-built in the virtual scene.

In a virtual scene, simulated lighting effects can be achieved by simulating the propagation of light in the environment. Light can propagate in a straight line in a medium; when it encounters obstacles such as objects during propagation, reflection, scattering, diffuse reflection, and absorption may occur. Objects of different materials reflect light differently: for example, light reflecting off a mirror has a high reflectance, while light hitting a desktop undergoes diffuse reflection and its reflectance is attenuated.

When the light source in the virtual scene changes, the ambient lighting effect in the virtual scene also changes. The light source change information is the change information of the changed light source compared with the light source before the change in the virtual scene, and may include at least one of light source position change information, light source intensity change information, and light source direction change information.

Specifically, while displaying the virtual scene or running an application including the virtual scene, when the terminal detects that the light source in the virtual scene has changed, it obtains the current light source information and compares it with the light source information before the change, thereby determining the light source change information.
S204: Determine, according to the light source change information, the current light source projection coefficient corresponding to the changed light source.

The light source projection coefficient may refer to the light intensity coefficient corresponding to the light source. The light source and the light source projection coefficient are linearly related, and different light sources correspond to different projection coefficients. For example, the light source in the virtual scene can be regarded as a virtual light source, i.e. a set of light source functions. Projecting this virtual light source onto the projection basis functions yields a set of coefficients, which are the light source projection coefficients. The current light source projection coefficient is the light source projection coefficient corresponding to the changed light source.

In mathematics, a basis function is an element of a particular basis of a function space. In a function space, every continuous function can be expressed as a linear combination of basis functions. Basis functions are also called blending functions; blended basis functions can serve as interpolation functions. The projection basis functions are the basis functions onto which the light source is projected, so that the reflection of the light source's rays can be computed through the basis functions.

In some embodiments, the projection basis functions may specifically be spherical harmonics. Spherical harmonics are harmonic functions satisfying the Laplace equation, restricted to the unit sphere in spherical coordinates, and can represent directional distributions on the sphere. In graphics, spherical harmonics are a generalized Fourier basis defined on the sphere; they can parameterize discrete spherical functions to simulate low-frequency ambient lighting, for example to represent direction-dependent functions such as the BRDF (Bidirectional Reflectance Distribution Function) of object surfaces in a virtual scene, environment maps, light transport, and visibility distributions. Spherical harmonics can be used to capture lighting and then relight, so as to compute global illumination in the virtual environment in real time. In other embodiments, the projection basis functions may also be spherical Gaussians, custom piecewise functions, and so on, which are not limited in this application.

Lighting in a virtual scene is a linear system: it is the lighting effect produced by the linear combination of light sources, and also the linear combination of the lighting effects of the individual light sources. In some embodiments, as shown in FIG. 3, picture 32 includes light sources 3a and 3b, picture 32a includes light source 3a, and picture 32b includes light source 3b; the lighting effect in picture 32 is the superposition of the lighting in pictures 32a and 32b. As shown in FIG. 4, picture 42 includes light source 4a, and picture 42a includes half the brightness of light source 4a; the lighting effect in picture 42 is twice that of picture 42a. The lighting effect in picture 32 is obtained by linearly combining the lighting effect of light source 3a in picture 32a and that of light source 3b in picture 32b according to the corresponding light source projection coefficients.

When the terminal detects that the light source in the virtual scene has changed, the terminal updates the lighting in the current virtual scene according to the light source change information. Specifically, the terminal updates, according to the light source change information, the lighting data corresponding to the target pixels in the virtual scene. That is, when the light source changes, the terminal needs to recompute, according to the light source change information, the current light source projection coefficient corresponding to the current light source in the virtual scene.
S206: Determine the indirect illumination value of a target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel in the virtual scene and the current light source projection coefficient.

A pixel point may refer to the center position of a pixel. A pixel can be a small square region; each small square represents a pixel, and the center of the pixel is the pixel point. An image is composed of multiple pixels.

A three-dimensional virtual scene needs to undergo a series of rendering passes so that its content is presented on the screen in two-dimensional form. A three-dimensional virtual scene can be displayed through consecutive scene picture images, and a scene picture image is composed of multiple pixels.

A target pixel is a pixel corresponding to the image to be rendered in the virtual scene, i.e. a pixel that currently needs to be rendered. Pixel information of the target pixel such as its color value, brightness value, and depth value can be determined from the scene data of the virtual scene. For example, the brightness and color values of a target pixel are related to the density of light in its environment and to the number of surrounding virtual objects blocking the light.

The illumination transfer parameters may include a light transfer vector. A light transfer vector is a vector used to transform incident light into transferred incident light that includes self-occlusion and interreflection, and can specifically be used to determine the irradiance per unit area.

After the light source is projected into the virtual environment, a global illumination effect can be produced. Global illumination is a term specific to three-dimensional software; light has the properties of reflection and refraction. In the real natural environment, daytime sunlight reaching the ground has undergone countless reflections and refractions, so the ground seen by the human eye is clear. In three-dimensional software, global illumination can compute not only the lit and shadowed faces of objects, but also various light effects such as reflection and refraction in the virtual environment.

The illumination received by object surfaces in reality does not all come from light sources; it also includes illumination reflected by other objects. The illumination from light sources is direct illumination, and the illumination from other objects is indirect illumination. The global illumination effect includes both, i.e. global illumination = direct illumination + indirect illumination.

Direct illumination is the light brightness of light that shines directly from the light source onto an object and is reflected to the virtual viewpoint or camera. Indirect illumination is the light brightness of light that first shines on other objects, bounces one or more times, finally reaches the observed object surface, and is then reflected to the virtual viewpoint or camera.

Light brightness refers to light intensity, i.e. the radiance per unit projected area. The unit projected area is the plane perpendicular to the light; the radiant flux per unit area is called radiance, and radiant flux is the total light energy flowing through a region per second. The same light energy shining on object surfaces of different areas has different brightness; the angle of the object surface causes incident light to spread out, reducing the radiance.

The direct illumination value is the color value of an object in the virtual environment after receiving direct illumination from the light source, i.e. the light intensity value. Similarly, the indirect illumination value is the color value of an object in the virtual environment after receiving indirect illumination.

After determining, according to the light source change information, the current light source projection coefficient corresponding to the changed light source, the terminal determines the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient.

Specifically, before rendering the target pixel, the terminal can precompute the illumination transfer parameters corresponding to the objects in the virtual scene, which may specifically include the illumination transfer parameters corresponding to the static objects in the virtual scene and those corresponding to the light probes.

The indirect illumination value of a target pixel in the virtual scene can be computed directly from the illumination transfer parameter and the current light source projection coefficient.
S208: Determine the direct illumination value of the target pixel under the changed light source.

When the light source in the virtual environment changes, the direct illumination brightness received by each object in the virtual environment also changes. Direct illumination and shadows in the virtual scene can be computed dynamically using the runtime rendering pipeline, so that direct illumination is computed in real time.

Specifically, the terminal can determine, according to the light source change information, the ray brightness of the changed light source shining on the target pixel, and compute in real time the direct illumination value of the target pixel under the changed light source according to the changed ray brightness, the object color corresponding to the target pixel, and the angle between the target pixel and the virtual viewpoint. Specifically, a preset method can be used to compute the luminance contribution of the changed light source's rays after they are cast onto the target pixel, which serves as the pixel's color value, thereby obtaining the direct illumination value of the target pixel under the changed light source.

When the light source does not shine directly on the target pixel, the direct illumination value of the target pixel can be zero; that is, the target pixel only includes the indirect illumination value of light from the light source after multiple reflections in the virtual environment. There are multiple target pixels to be rendered; each pixel may include only the direct illumination value or only the indirect illumination value, or both. When the direct illumination value is an effective value, the indirect illumination value can be zero; when the indirect illumination value is an effective value, the direct illumination value can be zero.

In some embodiments, the terminal also determines the shadow corresponding to each target pixel according to its direct illumination value. Specifically, for the shadow value, a ray is emitted from each pixel towards the light source; if the ray intersects an object, the object is in shadow and the shadow value is 0; if it does not intersect, the shadow value is 1; at shadow edges there may be penumbra, in which case the shadow value lies between 0 and 1, as sketched below.
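A minimal sketch of this per-pixel shadow test, with `occluded` as an assumed scene intersection query; penumbra handling (values between 0 and 1 at shadow edges) is left out:

```python
def shadow_value(hit_point, light_dir, occluded):
    # Cast a ray from the pixel towards the light source:
    # 0 if an occluder is hit, 1 if the light is reached.
    # `occluded(origin, direction)` is an assumed intersection query.
    return 0.0 if occluded(hit_point, light_dir) else 1.0
```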
S210: Perform illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.

Illumination rendering refers to the lighting calculation performed on the target pixels of the image to be rendered during rendering, so that the final target pixels have a lighting effect.

By computing the direct and indirect illumination values of each target pixel in the virtual environment, the global illumination information of each target pixel in the virtual environment is obtained.

Specifically, after computing the direct and indirect illumination values of the target pixels to be rendered, the terminal performs illumination rendering on the target pixels according to those values to obtain the final picture rendering result.

In the above lighting rendering method, when the light source in the virtual scene changes, the terminal first determines the light source change information and then determines, according to it, the current light source projection coefficient corresponding to the changed light source, so the current light source projection coefficient can be updated accurately and effectively in real time. The terminal then determines the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the illumination transfer parameter corresponding to the target pixel, so the indirect illumination value under the changed light source can be computed accurately. The terminal further determines the direct illumination value of the target pixel under the changed light source and performs illumination rendering on the target pixel according to the direct and indirect illumination values. In this way, lighting rendering for a light source that changes in real time can be performed accurately and efficiently, effectively improving the lighting rendering effect in dynamic lighting scenes.
In some embodiments, determining, according to the light source change information, the current light source projection coefficient corresponding to the changed light source includes: determining a lighting transformation matrix according to the light source change information; determining the initial light source projection coefficient before the light source change in the virtual scene; and determining the current light source projection coefficient corresponding to the changed light source according to the lighting transformation matrix and the initial light source projection coefficient.

When the light source changes, the corresponding light source projection coefficient also changes. The light source change information corresponding to the light source change is specifically determined according to the difference between the changed light source and the light source before the change.

A transformation matrix is a linear-algebra representation of a linear transformation: any linear transformation can be expressed as a matrix in a consistent, easily computable form, and multiple transformations can easily be concatenated by matrix multiplication. For example, in three-dimensional computer graphics, affine transformations and perspective projections in projective space can use homogeneous coordinates to represent multi-dimensional linear transformations.

A lighting transformation matrix is a transformation matrix representing the change of light. The initial light source projection coefficient is the light source projection coefficient corresponding to the light source before the change.

Specifically, after determining the light source change information, the terminal first updates the lighting transformation matrix after the light source change according to the light source change information. The terminal then obtains the initial light source projection coefficient before the light source change, and computes the current light source projection coefficient corresponding to the changed light source according to the updated lighting transformation matrix and the initial light source projection coefficient. In this way, the current light source projection coefficient corresponding to the current light source in the virtual scene can be computed in real time, so the light source projection coefficient under a dynamic light source can be updated quickly.

In this embodiment, the current light source projection coefficient corresponding to the changed light source is determined according to the lighting transformation matrix and the initial light source projection coefficient, so it can be computed in real time and updated quickly under a dynamic light source.
In some embodiments, the light source change information is the change information of the changed light source compared with the original light source preset in the virtual scene. Determining the initial light source projection coefficient before the light source change includes: obtaining the initial light source projection coefficient obtained by projecting the original light source onto the projection basis functions. Determining the current light source projection coefficient according to the lighting transformation matrix and the initial light source projection coefficient includes: updating the initial light source projection coefficient according to the lighting transformation matrix to obtain the current light source projection coefficient corresponding to the changed light source.

The original light source preset in the virtual scene is a fixed original light source set in the virtual scene before it is displayed; the preset original light source does not change.

In mathematics, a basis function is an element of a particular basis of a function space. In a function space, every continuous function can be expressed as a linear combination of basis functions. Basis functions are also called blending functions; blended basis functions can serve as interpolation functions. The projection basis functions are the basis functions onto which the light source is projected, so that the reflection of the light source's rays can be computed through the basis functions.

In some embodiments, the projection basis functions may specifically be spherical harmonics. Spherical harmonics are harmonic functions satisfying the Laplace equation, restricted to the unit sphere in spherical coordinates, and can represent directional distributions on the sphere. In graphics, spherical harmonics are a generalized Fourier basis defined on the sphere; they can parameterize discrete spherical functions to simulate low-frequency ambient lighting, for example to represent direction-dependent functions such as the BRDF of object surfaces in a virtual scene, environment maps, light transport, and visibility distributions. Spherical harmonics can be used to capture lighting and then relight, so as to compute global illumination in the virtual environment in real time. In other embodiments, the projection basis functions may also be spherical Gaussians, custom piecewise functions, and so on, which are not limited in this application.

A light source in the virtual scene has a corresponding light source function. Projecting the light source onto the projection basis functions means projecting the light source function onto the preset projection basis functions.

After the light source is projected onto the projection basis functions, for each point on a surface, light incident from all directions contributes brightness to that point. Computing the final outgoing intensity of a single point requires many ray samples; concretely, solving the rendering equation can be regarded as an integral over a hemisphere.

When the light source in the virtual scene changes, the terminal compares the changed light source with the original light source preset in the virtual scene to obtain the light source change information. The terminal then obtains the initial light source projection coefficient corresponding to the original light source, which is the projection coefficient obtained by projecting the original light source onto the projection basis functions.

The terminal determines the corresponding lighting transformation matrix according to the light source change information, and then updates the initial light source projection coefficient according to the lighting transformation matrix. Specifically, the terminal can multiply the lighting transformation matrix with the initial light source projection coefficients (a matrix-vector product), thereby obtaining the updated projection coefficients. The terminal takes the updated projection coefficients as the current light source projection coefficient corresponding to the changed light source in the virtual scene, so the current light source projection coefficient can be computed quickly in real time.

In this embodiment, the initial light source projection coefficient is updated according to the lighting transformation matrix, and the updated projection coefficients are taken as the current light source projection coefficient corresponding to the changed light source, so the current light source projection coefficient can be computed quickly in real time.
In some embodiments, the light source change information is the change information of the changed light source compared with the historical light source before the change in the virtual scene. Determining the initial light source projection coefficient before the light source change includes: obtaining the historical light source projection coefficient corresponding to the historical light source. Determining the current light source projection coefficient according to the lighting transformation matrix and the initial light source projection coefficient includes: updating the historical light source projection coefficient according to the lighting transformation matrix to obtain the current light source projection coefficient corresponding to the changed light source.

The historical light source is the light source before the change. For example, if the light source at the current moment has changed compared with the previous moment, the light source at the current moment is the changed light source and the light source at the previous moment is the light source before the change, i.e. the historical light source. The historical light source projection coefficient is the already-computed light source projection coefficient corresponding to the historical light source, which may specifically be obtained by projecting the historical light source onto the projection basis functions.

When the light source in the virtual scene changes, the terminal compares the changed light source with the historical light source before the change to obtain the light source change information, and then directly obtains the already-computed historical light source projection coefficient corresponding to the historical light source.

The terminal then determines the corresponding lighting transformation matrix according to the light source change information, so as to update the historical light source projection coefficient. Specifically, the terminal can multiply the lighting transformation matrix with the historical light source projection coefficients, similar to transforming a vector (a set of projection coefficients) by a matrix, thereby obtaining the updated projection coefficients, which are taken as the current light source projection coefficient corresponding to the changed light source.

In this embodiment, by updating the historical light source projection coefficient of the previous moment in real time, the coefficient can be updated accurately and effectively, so the lighting information after the light source change can be computed accurately in real time.
In some embodiments, the illumination transfer parameter corresponding to a target pixel in the virtual scene is obtained as follows: for a target pixel belonging to a static object in the virtual scene, obtain a light map matching the target pixel, and obtain the illumination transfer parameter corresponding to the target pixel from the light map.

A static object in the virtual scene is a fixed object whose position, orientation, and so on do not change.

A light map is an image in which lighting information is pre-generated for static target objects in the virtual scene using a global illumination algorithm, and is used to represent the lighting visual effect of static objects. Similar to color rendering, lighting rendering is used to perform lighting rendering for the corresponding static objects to render the corresponding lighting effect, simulating the influence of light on objects in a real scene.

A light map contains lighting information, i.e. pixel or color values, corresponding to multiple texels. A texel, short for texture element, is the basic unit in the texture space of computer graphics; texels can be mapped onto the appropriate output image pixels through texture mapping. In the embodiments of this application, texels correspond to pixels, i.e. the texels in the light map correspond to the pixels on the surfaces of static objects, and the lighting information of the pixels can then be used to render the objects.

Specifically, one light map can correspond to the surface map of one static object in the virtual scene. In other embodiments, one light map can correspond to one or more static objects, for example multiple object surface maps in the same plane. Light-map technology can generate lighting information offline for objects drawn in real time, improving picture quality while ensuring performance.

The light map corresponding to a static object in the virtual scene stores precomputed illumination transfer parameters corresponding to the static object's texels. For example, the illumination transfer parameters of static objects may be precomputed based on PRT (Precomputed Radiance Transfer) technology and then stored in the corresponding light map. Thus, when computing the indirect illumination of a static object, the terminal can directly obtain the corresponding illumination transfer parameters from the light map for lighting calculation.

When the light source in the virtual scene changes, after the terminal determines, according to the light source change information, the current light source projection coefficient corresponding to the changed light source, the terminal obtains the precomputed illumination transfer parameter corresponding to the target pixel, so as to compute the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter and the current light source projection coefficient.

Specifically, when the target pixels to be rendered include target pixels of static objects in the virtual scene, for a target pixel belonging to a static object, the terminal obtains the illumination transfer parameter corresponding to the target pixel from the matching light map, and then computes the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the obtained illumination transfer parameter.

In this embodiment, the illumination transfer parameters corresponding to static objects in the virtual scene are baked in advance and stored in the corresponding light maps, so when computing the indirect illumination of static objects, the terminal can directly obtain the corresponding illumination transfer parameters from the light maps for lighting calculation. This improves the efficiency of obtaining illumination transfer parameters, so the indirect illumination of static objects can be computed accurately and quickly, improving the efficiency of lighting rendering.
In some embodiments, obtaining the illumination transfer parameter corresponding to a target pixel from the light map includes: for a target pixel belonging to a static object in the virtual scene, searching the static object's light map for the texel matching the target pixel based on the texture mapping relationship; and obtaining the illumination transfer parameter corresponding to the target pixel from the light map according to the matching texel.

Texture mapping is the process of mapping texture pixels in texture space onto pixels in screen space. The texture mapping relationship is the mapping between texels in texture space and pixels in screen space. Through texture mapping, an image can be attached to the surface of a three-dimensional object to enhance realism, and it can be combined with lighting calculation, image blending, and other techniques to form many very attractive effects.

Specifically, in computing the indirect illumination values of target pixels belonging to static objects, for each target pixel the terminal first obtains the light map corresponding to the static object, and then searches the light map for the texel matching the target pixel according to the texture mapping relationship. The texels of the static object in the light map correspond to the pixels on the static object's surface.

The terminal then obtains, according to the matching texel, the pre-stored illumination transfer parameter corresponding to each target pixel from the light map. The terminal then computes the dot product of the current light source projection coefficients and the illumination transfer parameters of each target pixel, obtaining the indirect illumination value of each target pixel in the virtual scene.

In this embodiment, the illumination transfer parameters of the static objects in the virtual scene are baked in advance and stored in the corresponding light maps. Thus, when the virtual scene is running, the efficiency of obtaining illumination transfer parameters is improved, the indirect illumination values of static objects can be computed quickly in real time, the efficiency of lighting rendering is improved, and the indirect illumination values of static objects after the light source changes can be computed accurately and quickly.
In some embodiments, the illumination transfer parameter corresponding to a target pixel is obtained as follows: for a target pixel belonging to a dynamic object in the virtual scene, obtain the light probe matching the target pixel, and obtain the illumination transfer parameter corresponding to the target pixel from the light probe.

A dynamic object in the virtual scene is a non-fixed object, i.e. a movable object that can move in the virtual scene; at least one of its position and orientation is changeable.

A light probe is a light detector placed in the virtual scene in advance; it may specifically be a sphere or a polyhedron such as a cube. Light probes can capture and use information about light passing through the empty space of the scene; they can provide high-quality lighting information, including indirectly reflected light, for dynamic objects in the virtual scene, and high-precision detail lighting information for static objects.

Similar to light maps, light probes store baked information about the lighting in the virtual scene, including the precomputed illumination transfer parameters. The difference is that light maps store lighting information about light hitting the surfaces of static objects in the scene, while light probes store information about light passing through empty space in the scene.

Specifically, when the target pixels to be rendered include target pixels of dynamic objects in the virtual scene, for a target pixel belonging to a dynamic object, the terminal obtains the light probe matching the target pixel and obtains the corresponding illumination transfer parameter from the light probe. The terminal can select the light probe matching the target pixel from the light probes in the virtual scene based on the distance between the target pixel and the probes. For example, the terminal can compute the distance between the target pixel and a light probe, and when the computed distance is less than a distance threshold, determine that probe as the probe matching the target pixel; the distance threshold can be preset or set as needed. The terminal then computes the indirect illumination value of the target pixel in the virtual scene according to the current light source projection coefficient and the illumination transfer parameters stored in the matching light probe.

In this embodiment, the illumination transfer parameters corresponding to the light probes in the virtual scene are baked in advance and stored in the corresponding probes, so when computing the indirect illumination of dynamic objects, the terminal can directly obtain the corresponding illumination transfer parameters from the probes for lighting calculation. This improves the efficiency of obtaining illumination transfer parameters, so the indirect illumination of dynamic objects can be computed accurately and quickly, improving the efficiency of lighting rendering.
In some embodiments, determining the indirect illumination value of the target pixel in the virtual scene according to the illumination transfer parameter corresponding to the target pixel and the current light source projection coefficient includes: updating the light brightness corresponding to each direction on the light probes according to the current light source projection coefficient and the illumination transfer parameters; interpolating the updated light brightness corresponding to each light probe; and determining the indirect illumination value of the target pixel in the virtual scene according to the interpolated light brightness and the normal direction of the target pixel.

A light probe has multiple directions, each with a corresponding pre-baked light brightness; that is, for each direction on the light probe, the illumination transfer parameter corresponding to that direction is precomputed.

A normal is a line that is always perpendicular to a plane. In geometry, a normal is a line on a plane perpendicular to the tangent of a curve at a point. For a solid surface, the normal has direction: the direction pointing from the inside of the solid to the outside is the positive normal direction, and vice versa. In this embodiment, the normal direction of a target pixel is the direction of the normal pointing from the target pixel towards the virtual viewpoint or camera.

Specifically, in computing the indirect illumination values of target pixels belonging to dynamic objects, for each target pixel the terminal first obtains the light probes matching the dynamic object, for example probes within a preset distance of the dynamic object, and then obtains the illumination transfer parameters stored in each probe. The terminal then computes the light brightness in each direction of each probe according to the illumination transfer parameters stored in each probe and the current projection coefficients corresponding to each probe face. Specifically, the computed light brightness can be an indirect illumination value.

Further, the terminal interpolates the light brightness of the light probes matching the dynamic object to obtain the interpolated light brightness, and then computes the indirect illumination value of the target pixel according to the interpolated brightness and the normal direction of the target pixel on the dynamic object, so the indirect illumination value of each target pixel in the virtual scene can be computed accurately.

In this embodiment, the illumination transfer parameters of the light probes in the virtual scene are baked in advance and stored in the corresponding probes. Thus, when the virtual scene is running, the illumination transfer parameters are obtained from the probes, which improves the efficiency of obtaining them; the indirect illumination values of dynamic objects can be computed quickly in real time, and the indirect illumination values of dynamic objects after the light source changes can be computed accurately and quickly, improving the efficiency of lighting rendering.
In some embodiments, before determining the light source change information when the light source in the virtual scene changes, the above lighting rendering method further includes: projecting the original light source onto multiple projection basis functions, and using each projected projection basis function as a corresponding virtual light source; for the pixels of static objects in the virtual scene, determining, based on ray tracing, the illumination transfer parameters of each virtual light source corresponding to each pixel, and storing them in the corresponding light map; and, for the light probes in the virtual scene, determining, based on ray tracing, the illumination transfer parameters of each virtual light source corresponding to the light probes, and storing them in the light probes.

Ray tracing, i.e. path tracing, is a technique in computer graphics for rendering virtual scenes; it traces the propagation paths of rays in the virtual scene to obtain the ray path during propagation. Its principle is to emit a ray from the viewpoint; when the ray intersects an object surface, a random direction is chosen according to the surface's material properties and another ray is emitted and traced, iterating until the ray hits a light source or escapes the scene. The contribution of the ray is then computed as the pixel's color value; for example, the Monte Carlo method can be used to compute the ray's contribution.

Therefore, as long as there are enough iterations and enough rendering time, the finally rendered image converges to a highly accurate lighting rendering result. Ray tracing can render highly realistic scene effects, but correspondingly requires high computational overhead.

In geometric optics, the radiance reflected by an object surface does not change during optical propagation; that is, the illumination transfer parameters do not change. Therefore, the illumination transfer parameters of the object surfaces in the virtual scene can be baked in advance, which effectively reduces the computational overhead during real-time rendering and improves the efficiency of rendering.

Before rendering the target pixels to be rendered in the virtual scene, the illumination transfer parameters corresponding to the pixels of static objects and those corresponding to the light probes can also be baked in advance. Specifically, the illumination transfer parameters can be baked based on the basis-function light sources in the virtual scene. A basis-function light source, i.e. a virtual light source, is obtained by projecting the original light source onto the preset projection basis functions and treating each basis function as a virtual light source; each projected basis function is then a basis-function light source.

The illumination transfer parameters can be baked in advance by ray tracing according to the projection basis functions. The projection basis functions can be spherical harmonics; since spherical harmonics are rotation-invariant and have high projection performance, when the light source changes the illumination transfer parameters do not change, and only the light source projection coefficients change with the light source. Therefore, using spherical harmonics in the form of hemispherical surface integrals can effectively bake out the illumination transfer parameters.

The terminal can project the original light source onto multiple projection basis functions, use each projected basis function as a corresponding virtual light source, and then use ray tracing to compute the brightness information of the virtual light sources on the object surfaces in the virtual scene. In precomputing the illumination transfer parameters, the terminal samples only indirect illumination, i.e. only rays that have bounced two or more times.

Then, each virtual light source is projected onto the surfaces of the objects in the virtual scene, and each pixel on the object surfaces receives the brightness information corresponding to each virtual light source.
具体地,终端在烘培光照传递参数的过程中,首先将预设的原始光源投影至预设的多个不同的投影基函数上,例如可以投影至3个不同的投影基函数。其中,多个即至少两个以上。将原始光源投影至不同的投影基函数上,则会产生近似动态的光源,投影后的每个投影基函数则为一种虚拟光源,因此可以将投影后的每个投影基函数分别作为对应的虚拟光源。将投影后的每个投影基函数作为虚拟光源再投影至虚拟场景中,虚拟场景中各物体表面的各像素点则可以接收到各虚拟光源的亮度信息,从而能够在虚拟场景中采样近似动态光源对应的光照信息。
虚拟场景中各物体表面的各像素点,接收到各虚拟光源的亮度信息后,从各物体的像素点发出反射的射线。然后对经过反射的射线进行光线追踪,以捕捉反射的射线在虚拟场景中的光亮度贡献,从而确定物体表面上各像素点对应的光照传递参数。
具体地,终端可以对虚拟场景中的静态物体和光照探针,分别烘培出对应的光照传递参 数。针对虚拟场景中静态物体的各像素点,终端则基于光线追踪确定各虚拟光源在各像素点对应的光照传递参数,并存储至对应的光照贴图中。针对虚拟场景中的光照探针,终端则基于光线追踪确定各虚拟光源在光照探针对应的光照传递参数,并存储至对应的光照探针中。
本实施例中,通过预先计算出虚拟场景中静态物体和光照探针对应的光照传递参数,并分别存储至对应的光照贴图和光照探针中,由此在实时进行光照渲染处理时,可以直接获取目标像素点对应的光照传递参数进行间接光照计算,由此能够有效减少实时渲染过程中的计算消耗,有效提高了光照渲染的效果。
In some embodiments, determining, for the pixels of static objects in the virtual scene and based on ray tracing, the light transfer parameter of each virtual light source at a pixel, and storing it in the corresponding lightmap, includes: for each pixel of a static object in the virtual scene, obtaining the ray brightness at the pixel after each virtual light source is projected; emitting, from the pixel as origin, rays carrying that ray brightness toward the hemisphere pointed to by the pixel's normal; sampling, in the virtual scene, the reflected brightness and brightness attenuation of each ray after reflection off the hemisphere; and determining the light transfer parameter corresponding to the pixel based on the reflected brightness and brightness attenuation corresponding to each ray, and storing the light transfer parameter in the corresponding lightmap.
The ray brightness of a pixel after projection of the original light source refers to the brightness information received by each pixel of the static objects in the virtual scene after the original light source is projected onto the preset projection basis functions, that is, how light or dark the pixel's color is after the pixel receives the light.
Taking spherical harmonics as the projection basis functions as an example: spherical harmonics are an orthogonal basis defined on the unit sphere. After the light source is projected onto the basis functions, the basis functions can be used to represent the directional distribution over the sphere. The hemisphere pointed to by a pixel's normal corresponds to the orthogonal basis on the unit hemisphere of the projection basis functions for that pixel.
Specifically, when the terminal precomputes the light transfer parameters of the static objects in the virtual scene, after the virtual light sources are projected into the scene, each pixel of a static object receives the ray brightness corresponding to each virtual light source, which may specifically be the brightness of the virtual light sources after multiple reflections.
Then, for each pixel of the static object, the terminal takes the world-space position of the pixel center as origin and, based on the received ray brightness, emits rays toward the hemisphere facing the pixel's normal; that is, rays carrying the ray brightness are emitted.
The terminal then samples in the virtual scene, computing the reflected brightness and brightness attenuation of the rays after reflection off the hemisphere, and from these computes the radiance contribution of the rays after multiple bounces. Specifically, when a reflected ray intersects an object in the virtual scene, the radiance contribution of the ray after multiple bounces is computed on the current projection basis function. For each ray, its radiance contribution is computed, the contribution is fitted to the corresponding projection basis function, and the transfer parameter on that basis function is computed from the radiance contribution.
After projecting the original light source onto the multiple projection basis functions, the terminal samples, for each basis function, the multi-bounce radiance contributions of the rays in the virtual scene and computes the transfer parameter on the corresponding basis function. The transfer parameters on the basis functions are then fitted together to obtain the final transfer parameter of the pixel.
After computing the transfer parameters of the pixels of the static objects in the virtual scene, the terminal stores them at the corresponding texel positions in the lightmaps according to the texture-mapping relationship.
In this embodiment, after the original light source is projected onto multiple projection basis functions, rays are emitted from each pixel toward the hemisphere pointed to by the pixel's normal, and the multi-bounce radiance contributions of the rays are sampled in the virtual scene, so that the light transfer parameter of each pixel of the static objects in the virtual scene can be computed accurately.
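A hedged sketch of the per-texel baking loop just described: rays are cast into the normal-facing hemisphere and the traced multi-bounce radiance is accumulated. Here trace_indirect is an assumed callable standing in for the path tracer, and uniform hemisphere sampling is used for simplicity (a production baker would more likely use the cosine-weighted sampling discussed later):

```python
import math
import random

def sample_hemisphere(normal):
    """Uniform direction on the hemisphere around the texel normal,
    by rejection sampling inside the unit ball."""
    while True:
        v = [random.uniform(-1, 1) for _ in range(3)]
        n2 = sum(c * c for c in v)
        if 0 < n2 <= 1:
            v = [c / math.sqrt(n2) for c in v]
            if sum(a * b for a, b in zip(v, normal)) > 0:
                return v

def bake_texel_transfer(texel_pos, normal, trace_indirect, n_rays=1024):
    """Fit one texel's transfer parameter: shoot rays into the hemisphere and
    accumulate the multi-bounce radiance returned by the assumed tracer
    trace_indirect(origin, direction), weighted by the incident cosine."""
    total = 0.0
    for _ in range(n_rays):
        d = sample_hemisphere(normal)
        cos_theta = sum(a * b for a, b in zip(d, normal))
        total += trace_indirect(texel_pos, d) * cos_theta
    # uniform-hemisphere pdf is 1/(2*pi), hence the 2*pi factor
    return total * 2.0 * math.pi / n_rays
```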
In some embodiments, determining, for the light probes in the virtual scene and based on ray tracing, the light transfer parameter of each virtual light source at a probe, and storing it in each probe, includes: for a light probe in the virtual scene, obtaining the ray brightness at each face of the probe after each virtual light source is projected; emitting rays, based on that ray brightness, from the center point of the probe toward a sphere of preset radius centered on the center point; sampling, in the virtual scene, the reflected brightness and brightness attenuation of each ray after reflection off the sphere; and determining the transfer parameter corresponding to each probe based on the reflected brightness and brightness attenuation corresponding to each ray, and storing the transfer parameter in the probe.
Each light probe may be preset with multiple corresponding projection basis functions, which may specifically be spherical basis functions.
When the terminal precomputes the transfer parameters of the probes in the virtual scene, after the virtual light sources are projected into the scene, the faces in the multiple directions of each probe receive the ray brightness corresponding to each virtual light source. For each directional face of a probe, the terminal then takes the probe's center point as origin and, according to the ray brightness received by the probe, emits rays toward a sphere of preset radius centered on that point.
The radiance contribution of each ray is computed in the same way as for the pixels of the static objects: after the contribution of each ray is computed, the contribution is fitted to the corresponding projection basis function, and the corresponding transfer parameter is computed from the radiance contribution.
Specifically, for each ray and its corresponding brightness value, the angle between the ray direction and the direction of each face of the probe is computed. When the angle is smaller than 90 degrees, the multi-bounce radiance contribution of the ray on the current projection basis function is computed. For each ray, its radiance contribution is computed, fitted to the basis function of the corresponding face, and the transfer parameter on that face's basis function is computed from the radiance contribution.
For example, as shown in FIG. 5, a cube such as an Ambient Cube can be used to represent the faces in multiple directions on a light probe. The ambient cube has six faces, namely the x, -x, y, -y, z and -z faces; each face corresponds to one projection basis function, and each basis function stores a corresponding light transfer parameter, which may specifically be a transfer vector: the x face corresponds to Transfer Vector(x), the -x face to Transfer Vector(-x), the y face to Transfer Vector(y), the -y face to Transfer Vector(-y), the z face to Transfer Vector(z), and the -z face to Transfer Vector(-z).
After computing the transfer parameter of each face of the probes in the virtual scene, the terminal stores the transfer parameters in the probes.
In this embodiment, after the original light source is projected onto multiple projection basis functions, rays are emitted from the center point of each probe toward the sphere, and the multi-bounce radiance contributions of the rays are sampled in the virtual scene, so that the transfer parameters corresponding to the multiple faces of the probes in the virtual scene can be computed accurately.
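A sketch of this probe-baking step under the same assumptions (trace_indirect is a stand-in for the path tracer); the 90-degree face test appears here as the condition cos > 0, and the per-face normalization is simplified for the example:

```python
import numpy as np

FACE_NORMALS = np.array([[1, 0, 0], [-1, 0, 0],
                         [0, 1, 0], [0, -1, 0],
                         [0, 0, 1], [0, 0, -1]], dtype=float)

def bake_probe(center, trace_indirect, n_rays=2048):
    """Bake an ambient-cube probe: shoot rays uniformly over the sphere around
    the probe center and accumulate each ray's multi-bounce contribution into
    every face whose normal makes an angle of less than 90 degrees with it."""
    transfer = np.zeros(6)
    counts = np.zeros(6)
    for _ in range(n_rays):
        d = np.random.normal(size=3)
        d /= np.linalg.norm(d)                  # uniform direction on the sphere
        contribution = trace_indirect(center, d)
        cos = FACE_NORMALS @ d
        mask = cos > 0.0                        # angle smaller than 90 degrees
        transfer[mask] += contribution * cos[mask]
        counts[mask] += 1
    return transfer / np.maximum(counts, 1)
```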
In some embodiments, the virtual scene is a virtual game scene, and the light source in the virtual game scene is a distant light source. Determining the light-source change information when the light source in the virtual scene changes includes: monitoring, while the virtual game scene is running, the light direction of the distant light source in the virtual game scene; and, when the light direction changes, determining the light-source change information according to the changed light direction and the initial light direction of the distant light source.
A virtual game scene is a virtual scene deployed in a game application, and may specifically be a three-dimensional virtual scene. A distant light source is a light source that appears far away in the picture, that is, a light source far from the objects in the virtual scene, such as sunlight or skylight in the virtual game scene. Skylight in a game scene is usually computed as the scattered light of the environment by simulating atmospheric scattering, which makes it possible to render the sky and scenery under different sun positions and atmospheric characteristics. The final color value is usually obtained from an integral whose integration interval is continuous, so an approximate result can be computed as a cumulative sum by discrete sampling.
Specifically, a game application including a virtual game scene is deployed on the terminal. While the virtual game scene on the terminal is running, the terminal monitors the light direction of the distant light source in the scene in real time.
When the terminal detects that the light direction in the virtual game scene has changed, it determines the light-source change information according to the changed light direction and the initial light direction of the distant light source. For example, when the light source rotates, the rotation angle of the light source is determined from the rotation direction. The terminal then computes the rotation matrix of the spherical harmonics from the rotation angle, updates the light-source projection coefficients of the virtual game scene in real time according to the rotation matrix and the initial projection coefficients of the scene, computes the current projection coefficients corresponding to the target pixels to be displayed, and then performs illumination rendering on the target pixels according to the current projection coefficients computed in real time.
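As one possible illustration of deriving the rotation from the monitored direction change (not the patent's prescribed procedure), Rodrigues' formula gives the 3x3 rotation taking the initial direction onto the current one, from which the spherical-harmonic rotation can then be built:

```python
import numpy as np

def rotation_between(d0, d1):
    """3x3 rotation taking the initial light direction d0 onto the current
    direction d1 (Rodrigues' formula). The antiparallel case (dot == -1) is
    omitted for brevity; it needs an explicitly chosen perpendicular axis."""
    d0 = np.asarray(d0, dtype=float); d0 /= np.linalg.norm(d0)
    d1 = np.asarray(d1, dtype=float); d1 /= np.linalg.norm(d1)
    v = np.cross(d0, d1)
    c = float(np.dot(d0, d1))
    if c > 1.0 - 1e-9:                  # directions already coincide
        return np.eye(3)
    k = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + k + (k @ k) / (1.0 + c)
```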
For example, FIG. 6 shows a scene picture 6a of some embodiments, in which the distant light source of the scene is sunlight. The virtual scene has a house on the left and two walls on the right, one red and one green, which reflect corresponding indirect lighting in the scene. When the distant light source changes, for example when the sunlight in the virtual scene rotates, the lighting received by the objects in the scene also changes. FIG. 7 shows the scene picture 7a after the sunlight in the virtual scene of FIG. 6 has rotated. As can be seen from FIG. 7, when the walls receive brighter lighting, the corresponding reflected light on the wall of the house on the left is also brighter.
In this embodiment, the light-source projection coefficients of the scene are updated in real time according to the change information of the dynamic light source in the virtual game scene, so that the direct and indirect illumination values in the scene are computed dynamically. The scene can thus be rendered with real-time dynamic lighting as the light source changes dynamically, producing an effect close to a real natural environment and effectively improving the illumination rendering result under dynamic light sources in the virtual scene.
In a specific embodiment, as shown in FIG. 8, a specific illumination rendering method is provided, including the following steps:
S802: When the light source in the virtual scene changes, determine light-source change information.
S804: Determine a lighting transform matrix according to the light-source change information; determine, according to the transform matrix and the initial light-source projection coefficients before the change in the virtual scene, the current projection coefficients corresponding to the changed light source.
S806: For a target pixel belonging to a static object in the virtual scene, look up, based on the texture-mapping relationship, the texel matching the target pixel in the static object's lightmap.
S808: Obtain, from the lightmap and according to the matching texel, the light transfer parameter corresponding to the target pixel.
S810: Determine the indirect illumination value of the target pixel in the virtual scene according to the current projection coefficients and the pixel's light transfer parameter.
S812: For a target pixel belonging to a dynamic object in the virtual scene, determine the light probe matching the target pixel of the dynamic object.
S814: Update the ray brightness corresponding to each direction on the probe according to the current projection coefficients and the transfer parameters stored in the probe.
S816: Interpolate the updated ray brightness of the probes; determine the indirect illumination value of the target pixel in the virtual scene according to the interpolated ray brightness and the pixel's normal direction.
S818: Determine the direct illumination value of the target pixel under the changed light source.
S820: Perform illumination rendering on the target pixel according to the direct and indirect illumination values.
While the terminal displays the virtual scene or runs an application including it, when the terminal detects that the light source in the scene has changed, the projection coefficients corresponding to the changed light source also change accordingly. The terminal determines a lighting transform matrix according to the change information, for example a light-source rotation matrix according to the rotation angle of the light source. The terminal then updates the current projection coefficients corresponding to the scene's current light source according to the transform matrix and the initial projection coefficients before the change.
Lighting in a virtual scene is a linear system: the light source in the scene can be represented by a function, and the lighting of the scene is computed by projecting the light-source function onto the light-source projection basis functions.
Specifically, the linear lighting system can be expressed as follows:
$$f(a \cdot x + b \cdot y + c \cdot z + \cdots) = a \cdot f(x) + b \cdot f(y) + c \cdot f(z) + \cdots$$
where x, y and z are the basis functions onto which the light source of the virtual scene is projected, a, b and c are the projection coefficients of the light source on those basis functions, and f denotes the scene's lighting response.
For example, a distant light source can be represented in the virtual scene by a CubeMap, so that it can be projected onto the basis functions during the lighting computation. The projection of the distant light source can be expressed as:
$$L \approx a \cdot sh_0 + b \cdot sh_1 + c \cdot sh_2 + \cdots, \quad (sh_0, sh_1, sh_2, \ldots) \leftrightarrow (x, y, z, \ldots)$$
where sh0, sh1 and sh2 are basis functions corresponding to x, y and z respectively. After the light source is projected, the lighting it produces in the virtual scene is an approximation, which can be used for smooth lighting such as indirect light. When the light source rotates, the basis functions do not change; only the corresponding projection coefficients a, b and c change, so f(x), f(y) and f(z) in the above formula do not change. They can be computed by baking in advance, that is, they are the corresponding light transfer parameters (Transfer Vector), whereas the projection coefficients a, b and c must be computed at run time from the corresponding light source in the virtual scene.
In practical applications, spherical-harmonic basis functions of any order can be used to fit the light source according to the requirements of the application. With more orders, complex light sources are fitted more accurately, but more computing resources are consumed. In this embodiment, only the indirect lighting baked with the PRT technique may be considered; specifically, third-order spherical harmonics, that is, 9 basis functions, or fourth-order spherical harmonics, that is, 16 basis functions, can satisfy the fitting requirements of a distant light source.
For example, FIG. 9 shows a scene picture 92 of some embodiments, which includes a distant light source 92a. FIG. 10 shows a projection diagram 10a obtained by projecting the distant light source 92a of FIG. 9; it can be seen that the light source and the region around it have high brightness.
Taking the distant light source in the virtual scene as an example: under day-night changes, the distant light source rotates around the scene. The choice of projection basis functions affects both the baking of the transfer parameters and the projection of the light source. Since the light-source projection is an approximation of the distant light, if the basis functions used are not rotation-invariant, the coefficients jitter while the distant light rotates, causing the scene lighting to jitter. In this embodiment, spherical harmonics are used as the projection basis functions of the light source; their rotation invariance and high projection performance allow the lighting information of the scene to be computed more accurately, effectively improving the final rendering result and the run-time performance.
Specifically, the terminal can project the original light source onto the projection basis functions in advance according to the preset state information of the original light source in the virtual scene, and compute the initial projection coefficients by the Monte Carlo method. Then, while rendering the scene, when the light source changes, for example rotates, after the lighting rotation matrix is computed from the rotation angle, a single matrix-vector product with the initial projection coefficients suffices to update the current projection coefficients of the changed light source. Compared with re-projecting the light source with a large number of samples every time, this approach effectively saves computing resources.
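The run-time update described above amounts to a single matrix-vector product; a trivial sketch, with the block-diagonal structure of the SH rotation noted in a comment:

```python
import numpy as np

def current_coefficients(sh_rotation, initial_coeffs):
    """Run-time update: one matrix-vector product maps the baked initial
    projection coefficients onto the rotated light's current coefficients,
    instead of re-projecting the light source with heavy sampling."""
    return np.asarray(sh_rotation) @ np.asarray(initial_coeffs)

# For third-order SH (9 coefficients) the rotation is block-diagonal:
# a 1x1 block (band 0), a 3x3 block (band 1) and a 5x5 block (band 2),
# so each band rotates independently of the others.
```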
Before rendering the target pixels of the virtual scene, the terminal needs to bake in advance the light transfer parameters corresponding to the pixels of the static objects and to the light probes. This can be done with PRT, using path tracing to bake the transfer parameters. Traditional radiosity computations compute the lighting contribution of the whole distant light source to the scene; in this embodiment, only the brightness information of the scene's indirect lighting is considered, and specifically the transfer parameter (Transfer Vector) accounts only for low-frequency indirect lighting. Using a path-integral formulation, the brightness of the indirect lighting in the virtual scene can be expressed as follows:
$$L_o(p, \omega) = L_e(p, \omega) + \sum_{n=1}^{N} P(\bar{p}_n)$$
$$P(\bar{p}_n) = \int L_e(p_n) \, T(\bar{p}_n) \, d\bar{p}_n$$
where Lo is the radiance emitted from a position on an object in a given direction, Le is the brightness of the light source or of a self-emissive object, P(pn) is the brightness contribution of a ray after N bounces, and T(pn) is the attenuation of the radiance after N bounces.
While the transfer parameters are baked, the light source in the virtual scene is projected onto the spherical-harmonic basis functions, and the corresponding radiance contribution is computed for every projection basis function following the steps above.
In some embodiments, to speed up the baking of the transfer parameters, the radiance information of multiple projection basis functions can share the same ray bounce paths and be computed together in a single batch, so that the transfer parameters corresponding to the scene's indirect lighting can be baked more quickly.
For the static objects in the virtual scene, the baked lighting data of each static object, that is, its light transfer parameters, are stored in the object's corresponding lightmap. Each texel of the lightmap stores the corresponding brightness information, that is, one light transfer parameter per texel.
Specifically, after the original light source of the virtual scene is projected onto the spherical-harmonic basis functions, the pixels on the surfaces of the static objects receive the corresponding brightness information. Then, for each pixel of a static object, that is, each texel of the lightmap, rays are emitted from the world-space position of the texel center toward the hemisphere facing the normal. Sampling is then performed in the scene to compute the radiance attenuation of the rays over multiple bounces. When a ray intersects an object in the scene, the radiance contribution of each ray on the current basis function is computed, the radiance contributions of the rays are fitted, and the corresponding transfer parameter is computed from the contributions fitted on each basis function.
The expression for fitting the transfer parameter by emitting rays toward the hemisphere can be as follows:
$$T = \int_{\Omega} L(\omega) \cos\theta \, d\omega \approx \frac{1}{N} \sum_{i=1}^{N} \frac{L(\omega_i)\cos\theta_i}{pdf(\omega_i)} = \frac{\pi}{N} \sum_{i=1}^{N} L(\omega_i)$$
where T is the light transfer parameter, L(w) is the brightness of the light incident along direction w, and θ is the angle between the incident direction and the normal. When rays are cast with cosine weighting, pdf(wi) = cos(θ)/π, and after simplification the final fitting expression above is obtained.
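A sketch of this Monte Carlo fit under the cosine-weighted pdf; sample_radiance is an assumed closure that traces one cosine-distributed ray around the texel normal and returns its radiance:

```python
import math

def fit_transfer_cosine(sample_radiance, n_samples=4096):
    """Monte Carlo fit of T = integral of L(w) * cos(theta) dw with
    cosine-weighted sampling: since pdf(w) = cos(theta)/pi, the cosine terms
    cancel and the estimator reduces to pi times the average sampled radiance."""
    total = sum(sample_radiance() for _ in range(n_samples))
    return math.pi * total / n_samples
```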
For the light probes in the virtual scene, an Ambient Cube can be used to represent the faces in multiple directions on a probe. Each face corresponds to one projection basis function, and the basis function corresponding to each face stores one light transfer parameter. The baked data of the dynamic objects in the virtual scene are stored in the probes, and each face of a probe stores the incident-light information of its direction.
Specifically, a probe is baked by taking the probe's position as origin and emitting rays uniformly over the whole sphere. Each ray is computed in the same way as for the lightmaps: the radiance contributions of the rays on each basis function are accumulated and then fitted to the corresponding basis functions. For a ray and its corresponding brightness value, the angle between the ray direction and the direction of each face is computed, and when the angle is smaller than 90 degrees the contribution is accumulated into the transfer parameter of the corresponding face.
When rendering the virtual scene, for example when running an application including it, the terminal needs to update the projection coefficients of the changed light source according to the real-time light-source information. Taking a distant light source such as sunlight as an example: the coefficients of the distant light projected onto the basis functions are first updated according to the current skylight and sunlight of the scene. Then, for the static objects in the virtual scene, the pre-baked transfer parameters of the corresponding pixels are fetched from the corresponding lightmaps, and the final radiance contribution, that is, the indirect illumination value of the static object, can be computed directly from the current projection coefficients and the fetched transfer parameters and then mapped onto the static object's surface according to the texture-mapping relationship.
The projection of the distant light onto the basis functions can be sampled and computed by the Monte Carlo method; specifically, rays are emitted uniformly in all directions from the center of the world coordinate system of the virtual scene, and the coefficients of the projection basis functions are solved. The expression for solving the light-source projection coefficients of the projection basis functions can be as follows:
$$a = \int_{S} f(\omega) \, a(\omega) \, d\omega \approx \frac{1}{N} \sum_{i=1}^{N} \frac{f(\omega_i) \, a(\omega_i)}{pdf(\omega_i)}$$
where a is the spherical-harmonic coefficient to be projected onto a given basis function, that is, the light-source projection coefficient; S is the whole sphere; w is the solid angle; f(w) is the brightness value in the direction of that solid angle; a(w) is the corresponding spherical-harmonic basis function; wi is the currently sampled solid-angle parameter; and pdf is the corresponding weight. With uniform sphere sampling, pdf = 1/(4π).
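A sketch of this uniform-sphere projection; radiance_fn and the sh_basis callables are assumed to be supplied by the engine (for example, evaluating the CubeMap and the real SH basis functions respectively):

```python
import math
import numpy as np

def project_light_to_sh(radiance_fn, sh_basis, n_samples=8192):
    """Monte Carlo projection of the distant light onto SH basis functions:
    a_k ~ (1/N) * sum of f(w_i) * sh_k(w_i) / pdf, with pdf = 1/(4*pi)
    for uniform sphere sampling."""
    coeffs = np.zeros(len(sh_basis))
    for _ in range(n_samples):
        w = np.random.normal(size=3)
        w /= np.linalg.norm(w)                      # uniform direction on the sphere
        coeffs += radiance_fn(w) * np.array([b(w) for b in sh_basis])
    return coeffs * (4.0 * math.pi / n_samples)
```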
For the spherical-harmonic basis functions, the basis can first be projected for a default direction, and the lighting transform matrix is computed when the light source changes. For example, when the light source rotates, the rotation matrix of the spherical harmonics is computed, and the rotated coefficients are updated from the default initial projection coefficients and the rotation matrix, which updates the light-source projection coefficients more quickly. The spherical-harmonic rotation matrix can specifically be as follows:
$$R_{SH} = \operatorname{diag}\left(R^{(0)}, R^{(1)}, R^{(2)}, \ldots\right)$$
a block-diagonal matrix in which each block R(l) is a (2l+1) x (2l+1) rotation acting on the coefficients of spherical-harmonic band l.
For a dynamic object in the virtual scene, the multiple light probes around the object are obtained, together with the transfer parameters stored for the probes' faces, specifically 6 faces. The ray brightness of each face, that is, the indirect illumination value corresponding to each face of each probe, is then updated from the computed current projection coefficients and the transfer parameter corresponding to each face. The ray brightness of the multiple surrounding probes is then linearly interpolated, and the final brightness of the target pixel in its normal direction, that is, the pixel's indirect illumination value, is computed from the interpolated ray brightness and the normal direction of the target pixel on the dynamic object.
In some embodiments, spherical harmonics can also be used to compute the indirect illumination values of dynamic objects from the light probes. Similarly to the Ambient Cube approach, the projection coefficient on each basis function of a probe is first computed in real time from the precomputed transfer parameters stored in the probe and the current light-source projection coefficients, giving a spherical-harmonic representation of the probe's radiance. The diffuse spherical-harmonic conversion parameters corresponding to the target pixel are then computed from the normal direction of the target pixel on the dynamic object, and finally the dot product of those conversion parameters with the spherical-harmonic coefficients of the probe's radiance representation yields, accurately, the final indirect illumination value corresponding to the dynamic object's target pixel.
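A sketch of the final dot product for this spherical-harmonic probe path, restricted to the first two SH bands and following the widely used irradiance formulation of Ramamoorthi and Hanrahan; the constants and coefficient ordering are one common convention, not necessarily the one used in this disclosure:

```python
import numpy as np

A0, A1 = np.pi, 2.0 * np.pi / 3.0   # clamped-cosine kernel, bands 0 and 1
C0, C1 = 0.282095, 0.488603         # real SH normalization constants

def probe_sh_coeffs(transfer_matrix, light_coeffs):
    """Per-probe run-time coefficients: baked transfer applied to the current
    light-source projection coefficients."""
    return np.asarray(transfer_matrix) @ np.asarray(light_coeffs)

def diffuse_from_probe(probe_sh, normal):
    """Dot product of the probe's SH radiance coefficients with the diffuse
    (cosine-lobe) SH coefficients oriented along the pixel normal."""
    x, y, z = np.asarray(normal, dtype=float) / np.linalg.norm(normal)
    lobe = np.array([A0 * C0,
                     A1 * C1 * y,   # l=1, m=-1
                     A1 * C1 * z,   # l=1, m=0
                     A1 * C1 * x])  # l=1, m=1
    return float(np.dot(np.asarray(probe_sh)[:4], lobe))
```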
In some embodiments, by splitting direct and indirect lighting, the direct and indirect illumination values of the target pixels in the virtual scene are computed dynamically and separately with a PRT-based approach. As shown in FIG. 11, part 11a of FIG. 11 is the illumination rendering result of this embodiment, in which direct and indirect lighting are split and computed separately with PRT. Part 11b of FIG. 11 is the rendering result of computing direct and indirect lighting mixed together with PRT, and part 11c of FIG. 11 is the rendering result of computing only indirect lighting with PRT. It can be seen that splitting direct and indirect lighting makes it possible to update the overall indirect lighting of the virtual scene dynamically while keeping the high-frequency direct lighting and shadow information more accurate, so that the overall lighting of the virtual scene is better.
This application further provides an application scenario: an open-world three-dimensional game scene to which the above illumination rendering method is applied. Specifically, the open-world 3D game scene includes a virtual scene containing static objects, dynamic objects and light probes, as well as distant light sources simulating the real world, such as sunlight and skylight. The light transfer parameters corresponding to the static objects and probes in the virtual scene can be baked in advance and stored in the corresponding lightmaps and probes respectively.
A game application corresponding to the 3D game scene is deployed on the terminal. While the game application on the terminal is running, the light-source state of the distant light source in the 3D game scene is detected in real time. When the distant light source in the virtual scene changes, the light-source change information is determined, and the current projection coefficients corresponding to the changed distant light source are then determined according to the change information.
Then, for the target pixels to be displayed in the virtual scene: for a target pixel belonging to a static object, the precomputed matching transfer parameter is fetched from the corresponding lightmap, and the indirect illumination value of the target pixel in the scene is computed from the current projection coefficients and the transfer parameter; for a target pixel belonging to a dynamic object, the precomputed matching transfer parameter is fetched from the corresponding light probe, and the indirect illumination value of the target pixel in the scene is computed from the current projection coefficients and the transfer parameter.
At the same time, the terminal also computes, according to the change information, the direct illumination value of the target pixel under the changed light source, so that the dynamic direct and indirect illumination values of the virtual scene can be computed in real time. The terminal then performs illumination rendering on the target pixel according to the direct and indirect illumination values. The distant light source changing in real time in the game scene can thus be rendered accurately and efficiently, effectively improving the illumination rendering result under dynamic distant light sources in the virtual scene.
This application additionally provides another application scenario: a three-dimensional environment demonstration scene, for example a tourism demonstration scene, a building demonstration scene or a virtual training scene, to which the above illumination rendering method for 3D scenes is applied. The 3D demonstration scenes for the various environments include a preset virtual scene containing static objects, dynamic objects and light probes.
Before the 3D demonstration scene is displayed, the light transfer parameters corresponding to the static objects and probes in the virtual scene are baked in advance and stored in the corresponding lightmaps and probes respectively.
The terminal may preload the scene data corresponding to the 3D demonstration scene, or fetch the required scene data from a server in real time. While demonstrating the 3D demonstration scene, the terminal detects the light-source state in the virtual scene in real time. When the light source in the virtual scene changes, the current projection coefficients corresponding to the target pixels to be displayed are updated in real time according to the change information.
Then, for a target pixel belonging to a static object in the virtual scene, the precomputed matching transfer parameter is fetched from the corresponding lightmap, and the indirect illumination value of the target pixel in the scene is computed from the current projection coefficients and the transfer parameter; for a target pixel belonging to a dynamic object, the precomputed matching transfer parameter is fetched from the corresponding probe, and the indirect illumination value is computed likewise.
At the same time, the terminal also computes, according to the change information, the direct illumination value of the target pixel under the changed light source, so that the dynamic direct and indirect illumination values of the virtual scene can be computed in real time. The terminal then performs illumination rendering on the target pixel according to the direct and indirect illumination values. This not only effectively saves the computing resources of indirect lighting, but also renders the light source changing in real time in the virtual scene accurately, effectively improving the illumination rendering result under dynamic light sources in the virtual scene.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, they are not necessarily executed in the order indicated. Unless explicitly stated herein, there is no strict ordering restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in the above embodiments may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments, and whose execution order is not necessarily sequential; they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In some embodiments, as shown in FIG. 12, an illumination rendering apparatus 1200 is provided. The apparatus may be implemented as software modules or hardware modules, or a combination of both, forming part of a computer device, and specifically includes a light-source determining module 1202, a projection-coefficient updating module 1204, an indirect-lighting determining module 1206, a direct-lighting determining module 1208 and an illumination rendering module 1210, where:
the light-source determining module 1202 is configured to determine light-source change information when the light source in the virtual scene changes;
the projection-coefficient updating module 1204 is configured to determine, according to the change information, the current projection coefficients corresponding to the changed light source;
the indirect-lighting determining module 1206 is configured to determine the indirect illumination value of a target pixel in the virtual scene according to the light transfer parameter corresponding to the target pixel and the current projection coefficients;
the direct-lighting determining module 1208 is configured to determine the direct illumination value of the target pixel under the changed light source; and
the illumination rendering module 1210 is configured to perform illumination rendering on the target pixel according to the direct and indirect illumination values.
In some embodiments, the projection-coefficient updating module 1204 is further configured to determine a lighting transform matrix according to the change information; determine the initial projection coefficients before the light-source change in the virtual scene; and determine, according to the transform matrix and the initial coefficients, the current projection coefficients corresponding to the changed light source.
In some embodiments, the light-source change information is the change information of the changed light source relative to the preset original light source in the virtual scene; the projection-coefficient updating module 1204 is further configured to obtain the initial projection coefficients obtained by projecting the original light source onto the projection basis functions, and to update the initial coefficients according to the transform matrix to obtain the current coefficients corresponding to the changed light source.
In some embodiments, the light-source change information is the change information of the changed light source relative to the historical light source before the change in the virtual scene; the projection-coefficient updating module 1204 is further configured to obtain the historical projection coefficients corresponding to the historical light source, and to update them according to the transform matrix to obtain the current coefficients corresponding to the changed light source.
In some embodiments, the illumination rendering apparatus 1200 further includes a first transfer-parameter obtaining module, configured to obtain, for a target pixel belonging to a static object in the virtual scene, the lightmap matching the target pixel, and to obtain the pixel's light transfer parameter from the lightmap.
In some embodiments, the first transfer-parameter obtaining module is further configured to look up, for a target pixel belonging to a static object in the virtual scene and based on the texture-mapping relationship, the texel matching the target pixel in the static object's lightmap, and to obtain the pixel's transfer parameter from the lightmap according to the matching texel.
In some embodiments, the illumination rendering apparatus 1200 further includes a second transfer-parameter obtaining module, configured to obtain, for a target pixel belonging to a dynamic object in the virtual scene, the light probe matching the target pixel, and to obtain the pixel's transfer parameter from the probe.
In some embodiments, the indirect-lighting determining module 1206 is further configured to determine, for a target pixel belonging to a dynamic object in the virtual scene, the probe matching the target pixel; update the ray brightness corresponding to each direction on the probe according to the current projection coefficients and the transfer parameters; interpolate the updated ray brightness of the probes; and determine the indirect illumination value of the target pixel in the virtual scene according to the interpolated ray brightness and the pixel's normal direction.
In some embodiments, the above illumination rendering apparatus further includes a transfer-parameter baking module, configured to project the original light source onto multiple projection basis functions and use each projected basis function as a corresponding virtual light source; determine, for the pixels of static objects in the virtual scene and based on ray tracing, the transfer parameter of each virtual light source at each pixel and store it in the corresponding lightmap; and determine, for the light probes in the virtual scene and based on ray tracing, the transfer parameter of each virtual light source at each probe and store it in the probe.
In some embodiments, the transfer-parameter baking module is further configured to obtain, for each pixel of a static object in the virtual scene, the ray brightness at the pixel after each virtual light source is projected; emit, from the pixel as origin, rays carrying that brightness toward the hemisphere pointed to by the pixel's normal; sample, in the virtual scene, the reflected brightness and brightness attenuation of each ray after reflection off the hemisphere; and determine the pixel's transfer parameter based on the reflected brightness and brightness attenuation corresponding to each ray and store it in the corresponding lightmap.
In some embodiments, the transfer-parameter baking module is further configured to obtain, for a light probe in the virtual scene, the ray brightness at each face of the probe after each virtual light source is projected; emit rays, based on that brightness, from the probe's center point toward a sphere of preset radius centered on that point; sample, in the scene, the reflected brightness and brightness attenuation of each ray after reflection off the sphere; and determine each probe's transfer parameter based on the reflected brightness and brightness attenuation corresponding to each ray and store it in the probe.
In some embodiments, the virtual scene is a virtual game scene and the light source in the virtual game scene is a distant light source; the light-source determining module 1202 is further configured to monitor, while the virtual game scene is running, the light direction of the distant light source in the scene, and, when the light direction changes, to determine the change information according to the changed light direction and the initial light direction of the distant light source.
For specific limitations of the illumination rendering apparatus, reference may be made to the limitations of the illumination rendering method above. Each module of the above illumination rendering apparatus may be implemented wholly or partly by software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to each module.
In some embodiments, a computer device is provided. The computer device may be a terminal, and its internal structure may be as shown in FIG. 13. The computer device includes a processor, a memory, a communication interface, a display screen and an input apparatus connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory includes a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system and computer-readable instructions, and the internal memory provides an environment for running the operating system and the computer-readable instructions in the non-volatile storage medium. The communication interface is used for wired or wireless communication with external terminals; the wireless communication can be implemented through WiFi, a carrier network, NFC (near-field communication) or other technologies. The computer-readable instructions, when executed by the processor, implement an illumination rendering method. The display screen of the computer device may be a liquid-crystal display or an electronic-ink display, and the input apparatus may be a touch layer covering the display screen, a key, a trackball or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad or mouse.
Those skilled in the art can understand that the structure shown in FIG. 13 is only a block diagram of part of the structure related to the solution of this application and does not limit the computer device to which the solution of this application is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In some embodiments, a computer device is further provided, including a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the processors, cause the one or more processors to implement the steps of the above method embodiments.
In some embodiments, one or more non-volatile readable storage media storing computer-readable instructions are further provided; when executed by one or more processors, the computer-readable instructions cause the one or more processors to implement the steps of the above method embodiments.
In some embodiments, a computer program product is further provided, including computer-readable instructions that, when executed by a processor, implement the steps of the above method embodiments.
Those of ordinary skill in the art can understand that all or part of the processes of the above method embodiments can be completed by computer-readable instructions instructing the relevant hardware. The computer-readable instructions can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, database or other media used in the embodiments provided in this application may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory or optical memory, among others. Volatile memory may include random-access memory (RAM) or an external cache. By way of illustration and not limitation, RAM can take many forms, such as static random-access memory (SRAM) or dynamic random-access memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments are described; however, as long as such combinations of technical features are not contradictory, they should be considered within the scope of this specification.
The above embodiments express only several implementations of this application, and their description is relatively specific and detailed, but they should not therefore be understood as limiting the scope of the invention patent. It should be noted that those of ordinary skill in the art can make several variations and improvements without departing from the concept of this application, all of which fall within the scope of protection of this application. The scope of protection of this patent shall therefore be subject to the appended claims.

Claims (16)

  1. An illumination rendering method, performed by a computer device, the method comprising:
    when a light source in a virtual scene changes, determining light-source change information;
    determining, according to the light-source change information, current light-source projection coefficients corresponding to the changed light source;
    determining an indirect illumination value of a target pixel in the virtual scene according to a light transfer parameter corresponding to the target pixel in the virtual scene and the current light-source projection coefficients;
    determining a direct illumination value of the target pixel under the changed light source; and
    performing illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.
  2. The method according to claim 1, wherein the determining, according to the light-source change information, current light-source projection coefficients corresponding to the changed light source comprises:
    determining a lighting transform matrix according to the light-source change information;
    determining initial light-source projection coefficients before the light-source change in the virtual scene; and
    determining, according to the lighting transform matrix and the initial light-source projection coefficients, the current light-source projection coefficients corresponding to the changed light source.
  3. The method according to claim 2, wherein the light-source change information is change information of the changed light source relative to a preset original light source in the virtual scene;
    the determining initial light-source projection coefficients before the light-source change in the virtual scene comprises:
    obtaining initial light-source projection coefficients obtained by projecting the original light source onto projection basis functions; and
    the determining, according to the lighting transform matrix and the initial light-source projection coefficients, the current light-source projection coefficients corresponding to the changed light source comprises:
    updating the initial light-source projection coefficients according to the lighting transform matrix to obtain the current light-source projection coefficients corresponding to the changed light source.
  4. The method according to claim 2, wherein the light-source change information is change information of the changed light source relative to a historical light source before the change in the virtual scene;
    the determining initial light-source projection coefficients before the light-source change in the virtual scene comprises:
    obtaining historical light-source projection coefficients corresponding to the historical light source; and
    the determining, according to the lighting transform matrix and the initial light-source projection coefficients, the current light-source projection coefficients corresponding to the changed light source comprises:
    updating the historical light-source projection coefficients according to the lighting transform matrix to obtain the current light-source projection coefficients corresponding to the changed light source.
  5. The method according to claim 1, wherein the light transfer parameter corresponding to the target pixel in the virtual scene is obtained as follows:
    for a target pixel belonging to a static object in the virtual scene, obtaining a lightmap matching the target pixel; and
    obtaining, from the lightmap, the light transfer parameter corresponding to the target pixel in the virtual scene.
  6. The method according to claim 5, wherein the obtaining, from the lightmap, the light transfer parameter corresponding to the target pixel in the virtual scene comprises:
    for a target pixel belonging to a static object in the virtual scene, looking up, based on a texture-mapping relationship, a texel matching the target pixel in the lightmap of the static object; and
    obtaining, from the lightmap and according to the matching texel, the light transfer parameter corresponding to the target pixel.
  7. The method according to claim 1, wherein the light transfer parameter corresponding to the target pixel in the virtual scene is obtained as follows:
    for a target pixel belonging to a dynamic object in the virtual scene, obtaining a light probe matching the target pixel; and
    obtaining, from the light probe, the light transfer parameter corresponding to the target pixel in the virtual scene.
  8. The method according to claim 7, wherein the determining an indirect illumination value of the target pixel in the virtual scene according to the light transfer parameter corresponding to the target pixel in the virtual scene and the current light-source projection coefficients comprises:
    updating, according to the current light-source projection coefficients and the light transfer parameter, ray brightness corresponding to each direction on the light probe;
    interpolating the updated ray brightness corresponding to each light probe; and
    determining the indirect illumination value of the target pixel in the virtual scene according to the interpolated ray brightness and a normal direction of the target pixel.
  9. The method according to claim 1, wherein before the determining light-source change information when the light source in the virtual scene changes, the method further comprises:
    projecting an original light source onto multiple projection basis functions, and using each projected basis function as a corresponding virtual light source;
    for pixels of static objects in the virtual scene, determining, based on ray tracing, the light transfer parameter of each virtual light source at the pixel, and storing it in a corresponding lightmap; and
    for light probes in the virtual scene, determining, based on ray tracing, the light transfer parameter of each virtual light source at the light probe, and storing it in the light probe.
  10. The method according to claim 9, wherein the determining, for pixels of static objects in the virtual scene and based on ray tracing, the light transfer parameter of each virtual light source at the pixel, and storing it in a corresponding lightmap comprises:
    for each pixel of a static object in the virtual scene, obtaining the ray brightness corresponding to the pixel after each virtual light source is projected;
    emitting, from the pixel as origin, rays carrying the ray brightness toward the hemisphere pointed to by the pixel's normal;
    sampling, in the virtual scene, the reflected brightness and brightness attenuation of each ray after reflection off the hemisphere; and
    determining, based on the reflected brightness and brightness attenuation corresponding to each ray, the light transfer parameter corresponding to the pixel, and storing the light transfer parameter in the corresponding lightmap.
  11. The method according to claim 9, wherein the determining, for light probes in the virtual scene and based on ray tracing, the light transfer parameter of each virtual light source at the light probe, and storing it in the light probe comprises:
    for a light probe in the virtual scene, obtaining the ray brightness corresponding to each face of the light probe after each virtual light source is projected;
    emitting rays, based on the ray brightness, from the center point of the light probe toward a sphere of preset radius centered on the center point;
    sampling, in the virtual scene, the reflected brightness and brightness attenuation of each ray after reflection off the sphere; and
    determining, based on the reflected brightness and brightness attenuation corresponding to each ray, the light transfer parameter corresponding to each light probe, and storing the light transfer parameter in the light probe.
  12. The method according to any one of claims 1 to 11, wherein the virtual scene is a virtual game scene, and the light source in the virtual game scene is a distant light source;
    the determining light-source change information when the light source in the virtual scene changes comprises:
    monitoring, while the virtual game scene is running, the light direction of the distant light source in the virtual game scene; and
    when the light direction changes, determining the light-source change information according to the changed light direction and the initial light direction of the distant light source.
  13. An illumination rendering apparatus, comprising:
    a light-source determining module, configured to determine light-source change information when a light source in a virtual scene changes;
    a projection-coefficient updating module, configured to determine, according to the light-source change information, current light-source projection coefficients corresponding to the changed light source;
    an indirect-lighting determining module, configured to determine an indirect illumination value of a target pixel in the virtual scene according to a light transfer parameter corresponding to the target pixel in the virtual scene and the current light-source projection coefficients;
    a direct-lighting determining module, configured to determine a direct illumination value of the target pixel under the changed light source; and an illumination rendering module, configured to perform illumination rendering on the target pixel according to the direct illumination value and the indirect illumination value.
  14. A computer device, comprising a memory and one or more processors, the memory storing computer-readable instructions, wherein the computer-readable instructions, when executed by the one or more processors, cause the one or more processors to perform the steps of the method according to any one of claims 1 to 12.
  15. One or more non-volatile readable storage media storing computer-readable instructions, wherein the computer-readable instructions, when executed by one or more processors, cause the one or more processors to implement the steps of the method according to any one of claims 1 to 12.
  16. A computer program product, comprising computer-readable instructions, wherein the computer-readable instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 12.
PCT/CN2022/081063 2021-04-02 2022-03-16 Illumination rendering method and apparatus, computer device, and storage medium WO2022206380A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/985,107 US20230076326A1 (en) 2021-04-02 2022-11-10 Illumination rendering method and apparatus, computer device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110359533.9 2021-04-02
CN202110359533.9A CN112927341A (zh) 2021-04-02 Illumination rendering method and apparatus, computer device, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/985,107 Continuation US20230076326A1 (en) 2021-04-02 2022-11-10 Illumination rendering method and apparatus, computer device, and storage medium

Publications (1)

Publication Number Publication Date
WO2022206380A1 2022-10-06

Family

ID=76173917

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081063 WO2022206380A1 (zh) 2021-04-02 2022-03-16 Illumination rendering method and apparatus, computer device, and storage medium

Country Status (3)

Country Link
US (1) US20230076326A1 (zh)
CN (1) CN112927341A (zh)
WO (1) WO2022206380A1 (zh)
