WO2023088047A1 - A rendering method and device - Google Patents

A rendering method and device

Info

Publication number
WO2023088047A1
Authority
WO
WIPO (PCT)
Prior art keywords
lighting component
shading point
patch
rendering
point
Prior art date
Application number
PCT/CN2022/127466
Other languages
English (en)
French (fr)
Inventor
李洪珊
Original Assignee
华为云计算技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为云计算技术有限公司
Publication of WO2023088047A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models

Definitions

  • the embodiments of the present application relate to the field of computer technology, and in particular, to a rendering method and device.
  • Rendering technology refers to outputting, based on three-dimensional model data (including object geometry, surface material, etc.) and lighting data (including light source position, color, intensity, etc.), realistic pictures that simulate what the same model would look like under the same lighting conditions in the real world.
  • Ray tracing technology is a means to achieve rendering.
  • The rendering results of each object in the virtual scene, computed with the ray tracing algorithm, can be stored in advance; when the user views the scene, the rendering results can be read directly from the stored data without performing further rendering calculations, thereby reducing the amount of computation and the rendering delay.
  • However, the surface materials of 3D models are relatively complex and the number of objects in a virtual scene is large, so the amount of data stored for a virtual scene is extremely large, which in turn leads to high storage costs.
  • Embodiments of the present application provide a rendering method and device to solve the problem of high storage costs.
  • According to a first aspect, an embodiment of the present application provides a rendering method for rendering a virtual scene.
  • the virtual scene includes at least one model, and the at least one model includes a plurality of patches, and each patch includes a plurality of shading points.
  • the method includes: obtaining a pre-calculated common lighting component of the target patch and a pre-calculated specular lighting component of the first shading point.
  • the first shading point is located on the target patch.
  • the common lighting component of the target patch is used to calculate the rendering result of the plurality of shading points.
  • The specular lighting component of the first shading point is used to indicate the light intensity of the outgoing light at the first shading point. The rendering result of the first shading point is then calculated according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point.
  • The texture map parameters of the shading points on a patch determine the material at different shading points of the patch. Therefore, in the embodiments of the present application, the texture map is no longer taken into account in the pre-calculation stage, which reduces the amount of storage; the texture map parameters are instead combined in the rendering stage, which improves the rendering fineness.
  • The above-mentioned target patch is one of the multiple patches and may be any patch of any model in the frame to be rendered. When all the shading points on the patches of the models in the frame have been rendered, the rendering of the frame is complete.
  • each patch corresponds to a common lighting component, which can reduce the amount of computation.
  • each shading point included in the target patch corresponds to a specular lighting component.
  • The point where the incident light reaches the target patch can be understood as a shading point. Therefore, the multiple shading points included in the target patch can be understood as the intersection points of outgoing rays in different set directions with the target patch. Specular lighting components can be precomputed for the different shading points.
  • Obtaining the pre-calculated specular lighting component of the first shading point includes: determining an approximate ray corresponding to the outgoing ray according to the normal map parameters of the target patch, where both the outgoing ray and the approximate ray pass through the target patch, and the first shading point is the intersection point of the outgoing ray and the target patch; and determining the pre-calculated specular lighting component of the approximate ray as the specular lighting component of the first shading point.
  • A new direction can be determined by combining the normal map parameters of the patch with the direction of the outgoing light (which can be understood as the direction in which light from other directions enters the human eye after being reflected by the patch); this new direction is the direction of the approximate ray.
  • The specular lighting component in that direction can approximate the effect of combining a normal map in the direction of the outgoing light of the patch. The normal map parameters therefore do not need to be taken into account in the precomputation stage, which improves rendering accuracy without increasing storage.
  • Before the pre-calculated common lighting component of the target patch and the pre-calculated specular lighting component of the first shading point are acquired, the method further includes: calculating, based on a ray tracing method, the common lighting component of the target patch and the specular lighting component of the first shading point, where the elements used to calculate the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angle between the one or more incident lights and the normal direction of the target patch. The elements used to calculate the common lighting component of the target patch do not include the material parameters of the target patch.
  • In the pre-calculation, the common lighting component of each patch and the specular lighting component of each shading point are calculated without counting the material parameters on the patch that affect the rendering result. Then, when a certain patch is rendered, the calculation results of that patch, which do not involve the material parameters of the patch's pixels, can be queried from the pre-computation results, and the rendering result of the patch is obtained according to those calculation results and the material parameters of the patch.
  • The material parameters of the shading points on a patch determine the material at different shading points of the patch. Therefore, in the embodiments of the present application, the material parameters are no longer taken into account in the pre-calculation stage, which reduces the amount of storage; the material parameters are instead combined in the rendering stage, which improves the rendering fineness.
  • The method further includes: acquiring a pre-calculated specular lighting component of a second shading point, where the second shading point is located on the target patch and the specular lighting component of the second shading point is used to indicate the light intensity of the outgoing light at the second shading point; and calculating the rendering result of the second shading point according to the common lighting component of the target patch, the specular lighting component of the second shading point, and the texture map parameters of the second shading point.
  • Each patch can include multiple shading points. Only the common lighting component of the patch to which the multiple shading points belong and the specular lighting component of each shading point need to be obtained; there is no need to pre-calculate, for each shading point, the partial lighting component after the texture map is combined. This reduces the amount of calculation and improves the rendering fineness.
  • the common lighting component of the target patch includes a first weight
  • the specular lighting component of the first shading point includes a second weight
  • The method further includes: providing a weight setting interface, and receiving the first weight and/or the second weight set by a user; calculating the common lighting component of the target patch according to the first weight; and calculating the specular lighting component of the first shading point according to the second weight.
  • The weights are taken into account in the pre-calculation stage, so the weights do not need to participate in the calculation in the rendering stage, which improves rendering efficiency.
  • The method further includes: providing a weight setting interface to receive the first weight and/or the second weight set by the user. In this case, calculating the rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point includes: calculating the rendering result of the first shading point according to the product of the common lighting component of the target patch and the first weight, the product of the specular lighting component of the first shading point and the second weight, and the texture map parameter of the first shading point.
  • The weights are not taken into account in the pre-computation stage, so the rendering stage allows users to modify the weights, which enhances the rendering effect.
  • According to a second aspect, an embodiment of the present application provides a rendering engine. The rendering engine is used to render a virtual scene, where the virtual scene includes at least one model, the at least one model includes a plurality of patches, and each patch includes a plurality of shading points. The rendering engine includes a processing unit and a storage unit.
  • The processing unit is configured to acquire the pre-calculated common lighting component of the target patch and the pre-calculated specular lighting component of the first shading point from the storage unit, where the first shading point is located on the target patch, the common lighting component of the target patch is used to calculate the rendering results of the multiple shading points, and the specular lighting component of the first shading point is used to indicate the light intensity of the outgoing light at the first shading point; and to calculate the rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point.
  • each patch corresponds to a common lighting component.
  • each shading point included in the target patch corresponds to a specular lighting component.
  • The processing unit is specifically configured to: determine an approximate ray corresponding to the outgoing ray according to the normal map parameters of the target patch, where both the outgoing ray and the approximate ray pass through the target patch, and the first shading point is the intersection point of the outgoing ray and the target patch; and determine the pre-calculated specular lighting component of the approximate ray as the specular lighting component of the first shading point.
  • The processing unit is further configured to: before obtaining the pre-calculated common lighting component of the target patch and the pre-calculated specular lighting component of the first shading point, calculate, based on a ray tracing method, the common lighting component of the target patch and the specular lighting component of the first shading point, where the elements used to calculate the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angle between the one or more incident lights and the normal direction of the target patch.
  • The processing unit is further configured to: acquire a pre-calculated specular lighting component of a second shading point, where the second shading point is located on the target patch and the specular lighting component of the second shading point is used to indicate the light intensity of the outgoing light at the second shading point; and calculate the rendering result of the second shading point according to the common lighting component of the target patch, the specular lighting component of the second shading point, and the texture map parameters of the second shading point.
  • the common lighting component of the target patch includes a first weight
  • the specular lighting component of the first shading point includes a second weight
  • The rendering engine further includes a weight setting interface; the weight setting interface is configured to receive the first weight and/or the second weight set by the user; the processing unit is further configured to calculate the common lighting component of the target patch according to the first weight, and to calculate the specular lighting component of the first shading point according to the second weight.
  • The rendering engine further includes a weight setting interface; the weight setting interface is configured to receive the first weight and/or the second weight set by the user; the processing unit is specifically configured to calculate the rendering result of the first shading point according to the product of the common lighting component of the target patch and the first weight, the product of the specular lighting component of the first shading point and the second weight, and the texture map parameter of the first shading point.
  • According to a third aspect, an embodiment of the present application provides another rendering method for rendering a virtual scene, where the virtual scene includes at least one model, each model includes a plurality of patches, and each patch includes a plurality of shading points. The method includes: acquiring a pre-calculated common lighting component of a target patch, where the first shading point is located on the target patch and the common lighting component of the target patch is used to calculate the rendering results of the multiple shading points.
  • An approximate ray corresponding to the outgoing ray is determined, where both the outgoing ray and the approximate ray pass through the target patch, and the first shading point is the intersection point of the outgoing ray with the target patch.
  • The pre-calculated specular lighting component of the approximate ray is obtained, and the rendering result of the first shading point is calculated according to the common lighting component of the target patch and the specular lighting component of the approximate ray.
  • The pre-calculated specular lighting component of the approximate ray is taken as an approximation of the specular lighting component of the first shading point after the normal map is combined.
  • In a case where the target patch has a texture map, calculating the rendering result of the first shading point according to the common lighting component of the target patch and the specular lighting component of the first shading point includes: calculating the rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point.
  • each patch corresponds to a common lighting component.
  • each shading point included in the target patch corresponds to a specular lighting component.
  • Before the pre-calculated common lighting component of the target patch and the pre-calculated specular lighting component of the approximate ray are acquired, the method further includes: calculating, based on a ray tracing method, the common lighting component of the target patch and the specular lighting component of the approximate ray, where the elements used to calculate the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angle between the one or more incident lights and the normal direction of the target patch.
  • the common lighting component of the target patch includes a first weight
  • the specular lighting component of the first shading point includes a second weight
  • The method further includes: providing a weight setting interface, and receiving the first weight and/or the second weight set by the user; calculating the common lighting component of the target patch according to the first weight; and calculating the specular lighting component of the approximate ray according to the second weight.
  • The method further includes: providing a weight setting interface to receive the first weight and/or the second weight set by the user. In this case, calculating the rendering result of the first shading point according to the common lighting component of the target patch and the specular lighting component of the approximate ray includes: calculating the rendering result according to the product of the common lighting component of the target patch and the first weight and the product of the specular lighting component of the approximate ray and the second weight.
  • According to a fourth aspect, an embodiment of the present application provides another rendering engine. The rendering engine is used to render a virtual scene, where the virtual scene includes at least one model, the at least one model includes a plurality of patches, and each patch includes a plurality of shading points. The rendering engine includes a processing unit and a storage unit.
  • The processing unit is configured to acquire a pre-calculated common lighting component of the target patch from the storage unit, where the first shading point is located on the target patch and the common lighting component of the target patch is used to calculate the rendering results of the multiple shading points.
  • An approximate ray corresponding to the outgoing ray is determined, where both the outgoing ray and the approximate ray pass through the target patch, and the first shading point is the intersection point of the outgoing ray with the target patch.
  • The pre-calculated specular lighting component of the approximate ray is acquired from the storage unit, and the rendering result of the first shading point is calculated according to the common lighting component of the target patch and the specular lighting component of the approximate ray.
  • According to a fifth aspect, an embodiment of the present application provides a computer program product containing instructions.
  • When the instructions are executed by a cluster of computing devices, the cluster of computing devices performs the method provided by the first aspect or any possible design of the first aspect.
  • According to a sixth aspect, an embodiment of the present application provides a computer-readable storage medium including computer program instructions.
  • When the computer program instructions are executed by a cluster of computing devices, the cluster of computing devices performs the method provided by the first aspect or any possible design of the first aspect, or performs the method provided by the third aspect or any possible design of the third aspect.
  • According to a seventh aspect, an embodiment of the present application provides a computing device cluster including at least one computing device, where each computing device includes a processor and a memory; the processor of the at least one computing device is used to execute instructions stored in the memory of the at least one computing device, so that the computing device cluster performs the method provided by the first aspect or any possible design of the first aspect, or performs the method provided by the third aspect or any possible design of the third aspect.
  • In a possible design, the computing device cluster includes one computing device including a processor and a memory; the processor is used to execute instructions stored in the memory to run the rendering engine provided by the second aspect or any possible design of the second aspect, so that the computing device performs the method provided by the first aspect or any possible design of the first aspect; or the processor is used to execute instructions stored in the memory to run the rendering engine provided by the fourth aspect or any possible design of the fourth aspect, so that the computing device performs the method provided by the third aspect or any possible design of the third aspect.
  • the cluster of computing devices includes at least two computing devices, each computing device including a processor and memory.
  • The processors of the at least two computing devices are used to execute the instructions stored in the memories of the at least two computing devices to run the rendering engine provided by the second aspect or any possible design of the second aspect, so that the computing device cluster performs the method provided by the first aspect or any possible design of the first aspect; or, the processors of the at least two computing devices are used to execute the instructions stored in the memories of the at least two computing devices to run the rendering engine provided by the fourth aspect or any possible design of the fourth aspect, so that the computing device cluster performs the method provided by the third aspect or any possible design of the third aspect.
  • Each computing device runs a portion of the rendering engine.
  • Fig. 1 is the schematic diagram of UV map in the embodiment of the present application.
  • Fig. 2 is a schematic diagram of an open hemispherical space in the embodiment of the present application
  • Fig. 3 is a schematic diagram before and after texture mapping in the embodiment of the present application.
  • FIG. 4 is a schematic diagram of material parameters of other patches in the reflection scene in the embodiment of the present application.
  • FIG. 5 is a schematic diagram of the rendering system in the embodiment of the present application.
  • Fig. 6 is a schematic flow chart of the pre-calculation part in the embodiment of the present application.
  • FIG. 7 is a schematic flowchart of the real-time rendering part in the embodiment of the present application.
  • Fig. 8 is a schematic diagram of the principle of the approximate direction of the normal map in the embodiment of the present application.
  • FIG. 9A is a schematic diagram of a rendering process in which only texture maps exist in the embodiment of the present application.
  • FIG. 9B is a schematic diagram of a rendering process in which only normal maps exist in the embodiment of the present application.
  • FIG. 9C is a schematic diagram of the rendering process with normal maps and texture maps in the embodiment of the present application.
  • FIG. 10 is a schematic diagram of the rendering node structure in the embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a computing device in an embodiment of the present application.
  • a 3D model is a polygonal representation of an object, usually displayed on a computer or other video device.
  • the shape of the three-dimensional model can be various, for example, it can be a sphere, a cone, a curved object, a plane object, an object with an irregular surface, and the like.
  • the objects shown are either real-world entities or fictional objects.
  • 3D models are often generated using specialized software, such as 3D modeling tools, but can also be generated in other ways. As data composed of points and other components, a 3D model can be created manually or generated according to a certain algorithm.
  • a patch can also be called a mesh.
  • The planes are also called patches, and a patch can be any polygon; triangles and quadrilaterals are the most commonly used.
  • the shapes of the patches of 3D models of different shapes may also be different. For example, the shapes of the patches of a sphere and the patches of a curved object can be completely different.
  • The size of a patch can be set as required; the higher the required accuracy of the rendered image, the smaller the patch size can be set.
  • the position, shape or size of the patch in the 3D model can be described by geometric parameters.
  • Ray-tracing rendering is a rendering method that generates a rendered image by tracing the path of light that enters a virtual scene from the viewpoint of an observer (such as a camera or human eye) toward each pixel of the rendered image.
  • the virtual scene includes a light source and a three-dimensional model.
  • Ray tracing is divided into forward ray tracing and reverse ray tracing.
  • Forward ray tracing refers to starting from the light source and forward tracing the transfer process of light in the virtual scene.
  • the rendering device performs forward ray tracing on the light generated by the light source in the virtual scene, so as to obtain the light intensity of each patch (or grid) in the three-dimensional model in the virtual scene.
  • Reverse ray tracing traces rays from a set viewpoint into the patches of the 3D model and follows them through the virtual scene back to the light sources.
  • the set viewing angle is a certain angle at which the user observes the virtual scene.
  • the light generated by the light source shines on the 3D model.
  • The light source may be a point light source, a line light source, a surface light source, or the like. After the light generated by the light source hits one or more patches of the 3D model, one of the following occurs at each contact point: refraction, reflection, or diffuse reflection; the light then enters the user's eyes.
  • Each patch in the 3D model has certain material characteristics.
  • There are three types of materials for patches: transparent materials, smooth opaque materials, and rough opaque materials.
  • Accordingly, the behavior of a patch with respect to light can be divided into three situations: refraction, reflection, and diffuse reflection. Light is refracted when it hits a transparent material, reflected when it hits an opaque material with a smooth surface, and diffusely reflected when it hits an opaque material with a rough surface.
  • material characteristics can also be expressed by material parameters. Material parameters may include texture, surface unevenness, metalness, roughness, opacity, refraction, and the like.
  • When the value of a material parameter is given position by position by a map whose specified value is a red green blue (RGB) color, this map is a texture map (texture).
  • The values of all material parameters can be constant (the same parameter value for the whole object) or can be given by a texture (different values of the material parameter at different positions on the surface of the object).
  • A texture can be a pixel feature component of a two-dimensional image. When a texture is mapped onto the surface of a three-dimensional model in a specific way, it is also called a texture map. Textures are used to express the color (or refraction absorption coefficient) of the surface of a 3D model, while the degree of surface bumpiness is represented by a normal map.
  • the normal map affects the normal direction of the surface of the 3D model, but does not affect the color. It is mainly used to express the height, potholes, etc. of the surface to create a bumpy surface effect.
  • Metalness can be expressed by a metalness map.
  • Metalness maps represent the specular properties of an object's surface.
  • Metalness maps define whether a 3D model is metallic or dielectric (non-metallic).
  • Roughness can be expressed with a roughness map.
  • a roughness map represents the diffuse properties of an object's surface.
  • a roughness map defines how rough or smooth the surface reflections of an object are.
  • Transparency indicates the degree of transparency of the object.
  • For each patch, data expressed in an open hemispherical space is stored.
  • The open hemispherical space can be divided into multiple set directions according to the polar angle θ and the azimuth angle φ.
  • A set direction can be expressed as (θ, φ) in the hemispherical coordinate system, where 0 ≤ θ < 90° and 0 ≤ φ < 360°.
  • The color of the outgoing light is the color of the intersection point on the patch when the patch is viewed from the set direction.
  • The intersection of an outgoing ray in a set direction with the patch can be understood as a shading point, such as p1, p2, and p3 in Figure 2. If the data corresponding to outgoing rays in 90*360 directions is pre-calculated for a patch, this can be understood as calculating the data corresponding to 90*360 shading points for the patch. Taking one patch as an example, the stored data of a shading point can also be understood as the color value of the patch viewed from one set direction.
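  • As an illustration of this discretization, the following sketch enumerates the set directions of the open hemisphere and converts a (θ, φ) pair into a unit direction vector. The 1° step and the convention that the patch normal is the local +z axis are assumptions made for illustration, not values fixed by the application.

```python
import math

THETA_STEPS = 90   # polar angle bins: 0 <= theta < 90 (assumed 1-degree step)
PHI_STEPS = 360    # azimuth bins: 0 <= phi < 360 (assumed 1-degree step)

def set_direction_to_vector(theta_deg: float, phi_deg: float):
    """Convert a set direction (theta, phi) in degrees to a unit vector
    in the patch's local frame, where the patch normal is +z."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# One patch stores data for THETA_STEPS * PHI_STEPS set directions,
# i.e. 90*360 = 32400 shading points such as p1, p2, p3 in Figure 2.
directions = [(t, p) for t in range(THETA_STEPS) for p in range(PHI_STEPS)]
assert len(directions) == 90 * 360
```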
  • the color value of the patch viewed in each set direction can be obtained based on the ray tracing algorithm.
  • Ray tracing may include forward ray tracing and/or backward ray tracing.
  • The calculation and storage of the color value of each patch viewed from multiple set directions is called pre-computation.
  • The pixel density of a map is generally far greater than the patch density of the geometry; the specific density depends on the fineness of the texture.
  • the leather texture map of a car steering wheel may be 4096*4096 or 8192*8192, while the leather part of the steering wheel may only have thousands of patches.
  • If the rendering results of the steering wheel geometry were stored with the leather texture map baked in, results would have to be stored at the texture resolution of 4096*4096 or 8192*8192 instead of for thousands of patches.
  • a car scene includes not only the steering wheel, but also other objects, such as seats.
  • a car scene includes 110 objects. Some of the 110 objects have texture maps, some have both texture maps and normal maps, and some have normal maps only. For example, let's say 50% of objects have texture maps and normal maps.
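  • To make the storage comparison concrete, the following back-of-the-envelope sketch contrasts a hypothetical scheme that bakes results per texel and per viewing direction with the separated per-patch scheme described in this application. The patch count of 5,000 and the per-texel baking assumption are illustrative only, not figures given by the application.

```python
# Compare baked per-texel storage with the separated per-patch scheme.
directions = 90 * 360             # set directions per patch (hemisphere grid above)
texels = 4096 * 4096              # leather texture resolution from the example
patches = 5_000                   # assumed patch count for the leather part

baked = texels * directions                 # one result per texel, per direction
separated = patches + patches * directions  # common per patch + specular per direction
print(f"storage ratio under these assumptions: {baked / separated:.0f}x")
```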
  • each value that determines the material properties of a 3D model can be entered as a material parameter.
  • Material parameters can be expressed through textures.
  • The material parameters of the shading points on a patch that affect the rendering result are not included in the pre-calculation; then, when a certain patch is rendered, the pre-calculation result of the patch, which does not involve the material parameters of the patch, can be queried from the pre-computation results, and the rendering result of the patch is obtained according to that pre-calculation result and the material parameters of the patch.
  • The rendering result of a shading point can be obtained by a linear combination of the diffuse color (diffuse color) component and the specular lighting (specular lighting) component.
  • For the diffuse reflection color component, the color of light reflected from a contact point is usually the same from all viewing angles. In other words, provided that the relative positions of the 3D model and the light source and other conditions remain unchanged, the diffuse reflection color component of a shading point where diffuse reflection occurs is the same in any two viewing directions.
  • For the diffuse reflection color component, the color of a shading point on a patch as seen by the user is related to the material parameters of that shading point.
  • The material parameters of the shading points on a patch determine the color values (or RGB values) of the different shading points on the patch.
  • To improve fineness and reduce storage, a diffuse reflection color component that does not involve the material parameters of the patch's own pixels can be calculated for each patch in the 3D model.
  • The diffuse reflection color component calculated without considering the material parameters of the patch itself is called the common lighting component.
  • The specular lighting component is related to the user's viewing angle. For a shading point on a patch, the specular lighting components seen by the user from different viewing angles are different. That is, the specular lighting components of different shading points can be different.
  • For the specular lighting component of each patch, the present application may store data expressed in the form of an open hemispherical space.
  • The open hemispherical space can be divided into multiple set directions according to the polar angle θ and the azimuth angle φ.
  • A set direction can be expressed as (θ, φ) in the hemispherical coordinate system, where 0 ≤ θ < 90° and 0 ≤ φ < 360°.
  • For each patch, the specular lighting component of the outgoing light in each set direction is stored.
  • the specular lighting component of each shading point can be obtained based on the ray tracing algorithm.
  • Ray tracing may include forward ray tracing and/or backward ray tracing.
  • the diffuse reflection color (diffuse color) component and the specular lighting (specular lighting) component are described below in conjunction with the rendering equation.
  • The rendering equation is used to approximate the contribution of each incident ray to the final reflected light on a plane of a given material.
  • The rendering equation may be shown in the following formula (1):

    $L_0(p, \omega_0) = \sum_{i=1}^{N} \left( k_d \frac{c}{\pi} + k_s \frac{D\,F\,G}{4\,(\omega_0 \cdot n)(\omega_i \cdot n)} \right) L_i(p, \omega_i)\,(n \cdot \omega_i)$  (1)
  • ω_0 represents the viewing angle direction observed by the user.
  • the viewing angle direction can also be understood as the outgoing light direction.
  • p represents a shading point on the patch.
  • L 0 (p, ⁇ 0 ) represents the illumination brightness of the colored point p in the direction of viewing angle ⁇ 0 .
  • ⁇ i represents the incident direction of light.
  • n represents the normal direction of the plane where point p is located.
  • L i (p, ⁇ i ) represents the incident luminance of point p in the direction of ⁇ i .
  • N represents the number of light incident light directions at point p.
  • D represents a normal distribution function (normal distribution function, NDF).
  • The normal distribution function describes the distribution of normals on the patch, that is, the concentration of correctly oriented normals; this distribution determines the overall shape and intensity of the specular highlight.
  • Low roughness produces sharp, clear specular reflections and narrow, elongated specular lobes.
  • Roughness determines whether incoming light is diffusely or specularly reflected as outgoing light.
  • F represents the Fresnel function (fresnel equation), which is used to describe the ratio of light reflected at different surface angles.
  • G represents the geometric function, also known as the masking-shadow function (masking-shadow function), which describes the self-shadowing properties of the patch, that is, the percentage of the patch that is not covered.
  • c represents the texture color corresponding to point p in the texture map data; k_d and k_s both represent weights.
  • Separating the two terms of formula (1) gives formula (2):

    $L_0(p, \omega_0) = \sum_{i=1}^{N} k_d \frac{c}{\pi}\,L_i(p, \omega_i)\,(n \cdot \omega_i) + \sum_{i=1}^{N} k_s \frac{D\,F\,G}{4\,(\omega_0 \cdot n)(\omega_i \cdot n)}\,L_i(p, \omega_i)\,(n \cdot \omega_i)$  (2)

  • In formula (2), the first term can be understood as the diffuse reflection color component, or diffuse item.
  • The second term can be understood as the specular lighting component, or specular item.
  • The common lighting component of each patch of the 3D models in the virtual scene and the specular lighting components of the multiple set directions of each patch can be pre-calculated.
  • Taking the first patch as an example, the common lighting component of the first patch and the specular lighting components of the multiple shading points of the first patch are obtained by ray tracing based on the light source in the virtual scene, the geometric parameters of the first patch, and the material parameters and geometric parameters of other patches. It should be noted that, in the embodiments of the present application, the calculation of the common lighting component and the specular lighting components of the first patch requires the participation of the material parameters of other patches that affect the first patch. For example, when a light source shines on some other patch and the light is reflected onto the first patch, that other patch will affect the calculation of the common lighting component and the specular lighting components.
  • Assume the virtual scene has only one light source 410, an opaque cube 411, and an opaque sphere 412.
  • A ray is emitted from the light source 410 and projected onto facet 1 of the opaque cube 411, whose center point is Q1; it is then reflected to facet 2 of the opaque sphere 412, whose center point is Q2.
  • If facet 1 is colored, the light reflected by facet 1 will affect facet 2.
  • FIG. 4 is only used as an example.
  • In practice, the virtual scene is far more complicated than the one in Figure 4.
  • There may be multiple opaque objects and multiple transparent objects in the virtual scene, so light will be reflected or diffusely reflected multiple times, and there may be not just one light source but two or more. Therefore, when precomputing the common lighting component and the specular lighting components of a certain patch, it is necessary to consider the multiple other patches that affect the lighting of that patch.
  • the common lighting component of the target patch can be understood as the diffuse reflection color component without considering the material parameters of the target patch.
  • the specular lighting components of the target patch can be understood as multiple specular lighting components corresponding to multiple set directions (ie, multiple shading points) of the target patch.
  • The common lighting component can be expressed by the following formula (3) or formula (4):

    $L_{\text{common}}(p) = \sum_{i=1}^{N} \frac{1}{\pi}\,L_i(p, \omega_i)\,(n \cdot \omega_i)$  (3)

    $L_{\text{common}}(p) = k_d \sum_{i=1}^{N} \frac{1}{\pi}\,L_i(p, \omega_i)\,(n \cdot \omega_i)$  (4)
  • p represents a shading point on the patch
  • ω_i represents the incident direction of light
  • n represents the normal direction of the patch where point p is located
  • L_i represents the incident light brightness of point p in the direction ω_i
  • N represents the number of incident light directions at point p
  • The user or the provider of the virtual scene can configure different k_d for different virtual scenes, for different models in a virtual scene, or for different patches; this is not specifically limited in the embodiments of the present application. It can be seen from formulas (3) and (4) that the common lighting component of a patch has nothing to do with the viewing angle direction. Therefore, the common lighting components corresponding to all shading points belonging to the same patch are the same.
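  • A minimal sketch of the formula (3) variant as reconstructed above: the common lighting component sums incident brightness weighted by the cosine of the incidence angle, with no texture color and no weight involved. The list-of-samples representation of the incident lights is an assumption for illustration.

```python
import math

def common_lighting_component(incident_samples):
    """Formula (3) variant: sum over incident lights of (1/pi) * L_i * (n . w_i).
    incident_samples: list of (brightness L_i, cos_angle n.w_i) pairs for the patch.
    Material parameters of the patch itself do not participate."""
    return sum((1.0 / math.pi) * L_i * cos_ni
               for L_i, cos_ni in incident_samples
               if cos_ni > 0.0)  # only light arriving from the upper hemisphere

# Example: two incident lights on the target patch
print(common_lighting_component([(3.0, 0.8), (1.5, 0.3)]))
```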
  • The specular lighting component can be expressed by the following formula (5) or formula (6):

    $L_{\text{spec}}(p, \omega_0) = \sum_{i=1}^{N} \frac{D\,F\,G}{4\,(\omega_0 \cdot n)(\omega_i \cdot n)}\,L_i(p, \omega_i)\,(n \cdot \omega_i)$  (5)

    $L_{\text{spec}}(p, \omega_0) = k_s \sum_{i=1}^{N} \frac{D\,F\,G}{4\,(\omega_0 \cdot n)(\omega_i \cdot n)}\,L_i(p, \omega_i)\,(n \cdot \omega_i)$  (6)
  • ω_0 represents the viewing angle direction
  • p represents a shading point on the patch
  • ω_i represents the incident direction of light
  • n represents the normal direction of the patch where point p is located
  • L_i represents the incident light brightness of point p
  • N represents the number of incident light directions of light at point p
  • D represents the normal distribution function
  • F represents the Fresnel function
  • G represents the geometric function.
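  • A sketch of the summand of formula (5) as reconstructed above. Since the application does not fix concrete forms for D, F, and G here, they are passed in as already-evaluated values; all parameter names are illustrative.

```python
def specular_lighting_component(samples, eps=1e-6):
    """Formula (5) variant: sum over incident lights of
    (D*F*G / (4 (w_o.n)(w_i.n))) * L_i * (n . w_i).
    samples: list of (D, F, G, L_i, cos_o = w_o.n, cos_i = w_i.n) tuples
    for one shading point, i.e. one set direction w_o on the patch."""
    total = 0.0
    for D, F, G, L_i, cos_o, cos_i in samples:
        if cos_o <= 0.0 or cos_i <= 0.0:
            continue  # grazing or back-facing samples contribute nothing
        brdf = (D * F * G) / max(4.0 * cos_o * cos_i, eps)
        total += brdf * L_i * cos_i
    return total
```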
  • FIG. 5 is a schematic structural diagram of a rendering system involved in the present application.
  • The rendering system of the present application is used to obtain a two-dimensional (2D) image, that is, a rendered image, by rendering the three-dimensional models of a virtual scene through the rendering method.
  • the rendering system of the present application may include: one or more terminal devices 100 , network devices 200 and a remote rendering platform 300 .
  • the remote rendering platform 300 may be deployed on a cloud server.
  • the remote rendering platform 300 and the terminal device 100 are generally deployed in different data centers.
  • The terminal device 100 may be a device that needs to display rendered images in real time; for example, it may be a virtual reality (VR) device for flight training, a computer for virtual games, or a smartphone for a virtual shopping mall, which is not specifically limited here.
  • The terminal device can be a device with high configuration and high performance (for example, multi-core, high frequency, large memory), or a device with low configuration and low performance (for example, single-core, low frequency, small memory).
  • the terminal device 100 may include hardware, an operating system, and a rendering application client.
  • the network device 200 is used to transmit data between the terminal device 100 and the remote rendering platform 300 through any communication mechanism/communication standard communication network.
  • the communication network may be a wide area network, a local area network, a point-to-point connection, etc., or any combination thereof.
  • the remote rendering platform 300 includes one or more remote rendering nodes.
  • Remote rendering nodes can also be referred to simply as rendering nodes.
  • the remote rendering platform 300 can be implemented by one or more computing devices. Multiple computing devices may form a computing device cluster.
  • the function of the rendering node can be implemented by one or more computing devices.
  • Remote rendering nodes can include rendering hardware, virtualization services, rendering engines, and rendering application servers from bottom to top.
  • rendering hardware includes computing resources, storage resources, and network resources.
  • Computing resources can adopt a heterogeneous computing architecture, for example, a central processing unit (CPU) + graphics processing unit (GPU) architecture, a CPU + AI chip architecture, or a CPU + GPU + AI chip architecture, which is not specifically limited here.
  • Storage resources may include storage devices such as internal memory and video memory.
  • Network resources may include network cards, port resources, address resources, and so on.
  • The virtualization service virtualizes the resources of the rendering node into vCPUs and the like through virtualization technology, and flexibly isolates mutually independent resources according to the user's needs to run the user's application programs.
  • the virtualization service may include a virtual machine (virtual machine, VM) service and a container (container) service, and the VM and the container may run a rendering engine and a rendering application server.
  • Rendering engines are used to implement rendering algorithms.
  • the rendering application server is used to call the rendering engine to complete the rendering of the rendered image.
  • the rendering application client on the terminal device 100 and the rendering application server of the remote rendering platform 300 may be collectively referred to as a rendering application.
  • Common rendering applications may include: game applications, VR applications, movie special effects, animation, and so on.
  • The user inputs an operation instruction through the rendering application client, and the rendering application client sends the operation instruction to the rendering application server; the rendering application server invokes the rendering engine to generate a rendering result and sends the rendering result to the rendering application client, which then converts the rendering result into an image and presents it to the user.
  • the rendering application server and the rendering application client may be provided by the rendering application provider, and the rendering engine may be provided by the cloud service provider.
  • the rendering application may be a game application.
  • The game developer of the game application installs the game application server on the remote rendering platform provided by the cloud service provider, and provides the game application client for users to download and install on their terminal devices.
  • the cloud service provider also provides a rendering engine, which can provide computing power for game applications.
  • the rendering application client, the rendering application server, and the rendering engine may all be provided by a cloud service provider.
  • the rendering system may further include a management device (not shown in FIG. 5 ).
  • the management device may be a device provided by a third party other than the user's terminal device and the cloud service provider's remote rendering platform 300 .
  • the management device may be a device provided by a game developer. Game developers can manage rendering applications through management devices. It can be understood that the management device can be set on the remote rendering platform, or can be set outside the remote rendering platform, which is not specifically limited here.
  • the rendering solution provided by the embodiment of the present application may be executed by the remote rendering platform 300, or executed by the rendering node, or implemented by the rendering engine in the rendering node.
  • the rendering solution provided by this application may include a pre-calculation part and a real-time rendering part.
  • FIG. 6 shows a schematic flowchart of the pre-calculation part provided by the embodiment of the present application.
  • The pre-calculation part includes steps 601 to 603.
  • Step 601: the remote rendering platform obtains relevant information about the virtual scene.
  • the virtual scene includes a light source and one or more three-dimensional models.
  • the relevant information of the virtual scene includes light source parameters and information of one or more models.
  • the information of one or more models may include, for example, the mesh division of each model, the mesh number, the geometric parameters and material parameters of each model and mesh, and so on.
  • the relevant information of the virtual scene may be sent by the terminal device to the remote rendering platform, or may be sent by the management device to the remote rendering platform.
  • Step 602: the remote rendering platform performs ray tracing on each patch according to the relevant information of the virtual scene to obtain the common lighting component of each patch and the specular lighting components of the multiple shading points included on each patch.
  • Taking a target patch as an example, the remote rendering platform can perform ray tracing according to the light source, the geometric parameters of the target patch, and the material parameters and geometric parameters of other patches to obtain the common lighting component and specular lighting components of the target patch.
  • the other patches mentioned here include at least one patch other than the target patch among the patches included in one or more three-dimensional models in the virtual scene.
  • the at least one patch is a patch that affects lighting on the target patch among the patches included in the one or more three-dimensional models.
  • The manner in which the target patch is affected by reflection is as described above and will not be described here again.
  • the remote rendering platform can calculate the common lighting components of each patch based on formula (3) or formula (4).
  • the specular illumination components of multiple set directions of each patch can be calculated based on the above formula (5) or formula (6).
  • the multiple set directions may be set by the provider or the user of the virtual scene.
  • For example, the provider of the virtual scene or the user sets the set-direction parameters.
  • the set direction parameter may be the number of set directions or a number of set angles.
  • Step 603: the remote rendering platform stores the common lighting component of each patch and the specular lighting components of the multiple set directions of each patch.
  • the remote rendering platform stores the common lighting components of each patch and the specular lighting components of multiple set directions of each patch in the memory.
  • The memory may include volatile memory (volatile memory), such as random access memory (random access memory, RAM), and may also include non-volatile memory (non-volatile memory), such as read-only memory (read-only memory, ROM), flash memory, mechanical hard disks (hard disk drive, HDD), or solid state disks (solid state drive, SSD).
  • the memory can also be deployed outside the remote rendering platform, or inside the remote rendering platform.
  • memory can be deployed inside the rendering engine.
  • The remote rendering platform may establish a lighting component information table for the patches.
  • The lighting component information table may include the number of each patch, the common lighting component, and the specular lighting components for the different set directions.
  • The lighting component information table may also include the material parameters of each patch.
  • For an example of the lighting component information table, refer to Table 1.
  • Alternatively, the remote rendering platform can establish a common lighting component information table and a specular lighting component information table.
  • The common lighting component information table may include the number of each patch and the corresponding common lighting component.
  • the specular lighting component information table includes the number of each patch and the specular lighting components corresponding to multiple set directions of each patch.
  • For an example of the common lighting component information table, refer to Table 2.
  • For an example of the specular lighting component information table, refer to Table 3.
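  • The following sketch shows one way Tables 1 to 3 could be organized in memory: a common-component table keyed by patch number, and a specular table keyed by patch number plus quantized set direction. The dictionary layout and key format are assumptions, not structures specified by the application.

```python
# Table 2 analogue: patch number -> common lighting component
common_table: dict[int, float] = {}

# Table 3 analogue: (patch number, theta bin, phi bin) -> specular component
specular_table: dict[tuple[int, int, int], float] = {}

def store_precomputation(patch_id: int, common: float, specular_grid):
    """specular_grid[theta][phi] holds the specular component for each
    set direction of the patch (e.g. a 90 x 360 grid)."""
    common_table[patch_id] = common
    for t, row in enumerate(specular_grid):
        for p, value in enumerate(row):
            specular_table[(patch_id, t, p)] = value

def query(patch_id: int, theta_bin: int, phi_bin: int):
    """Real-time stage: the lookups behind read request 1 and read request 2."""
    return common_table[patch_id], specular_table[(patch_id, theta_bin, phi_bin)]
```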
  • FIG. 7 shows a schematic flowchart of a real-time rendering part provided by the embodiment of the present application.
  • The real-time rendering part includes steps 701 to 704.
  • Step 701: the remote rendering platform receives a rendering request.
  • Rendering requests can come from end devices.
  • the terminal device may send the rendering request to the remote rendering platform through the network device.
  • the user's viewing angle parameter can be included in the rendering request.
  • the viewing angle parameter of the user may be a user viewing angle range or a user viewing angle direction.
  • the remote rendering platform can determine the user's viewing angle range according to the user's viewing angle orientation.
  • the direction of the viewing angle of the user may be a direction in which the user's eyes look directly.
  • the rendering request may also include the identifier of the virtual scene.
  • The identifier of the virtual scene uniquely identifies the virtual scene in the rendering system. In some scenarios, when the rendering system includes only one virtual scene, the identifier of the virtual scene may be omitted from the rendering request.
  • The method for rendering other patches is similar to the method for rendering the target patch and will not be repeated here.
  • The following takes one direction within the user's viewing angle range as an example.
  • The shading point on the target patch can be determined according to the direction of the user's viewing angle.
  • the embodiment of the present application takes the rendering of the first shading point of the target patch as an example.
  • Step 702: the remote rendering platform acquires the saved common lighting component of the target patch.
  • The common lighting component of the target patch may be obtained by querying the lighting component information table.
  • Alternatively, the common lighting component of the target patch may be obtained by querying the common lighting component information table.
  • The remote rendering platform may send a read request 1 to the memory, where read request 1 is used to request reading the common lighting component of the target patch; the memory then sends the common lighting component of the target patch to the remote rendering platform.
  • Step 703: the remote rendering platform acquires the specular lighting component of the first shading point of the target patch.
  • the specular lighting component of the first shading point of the target patch is obtained by querying the lighting component information table.
  • the specular lighting component of the first shading point of the target patch is obtained by querying the specular lighting component information table.
  • In a possible implementation, the specular lighting component of the first shading point can be approximately determined based on the specular lighting components of other, approximate directions on the patch. The other approximate directions and the set direction corresponding to the first shading point all pass through the target patch. The relevant description of this approximate determination is detailed later and will not be repeated here.
  • The remote rendering platform may send a read request 2 to the memory, where read request 2 is used to request reading the specular lighting component of the first shading point of the target patch; the memory then sends the specular lighting component of the first shading point of the target patch to the remote rendering platform.
  • The first shading point may be determined according to the direction of the user's viewing angle.
  • The direction of the user's viewing angle can be understood as the direction in which the outgoing light of the patch viewed by the user enters the human eye, that is, the outgoing light direction; the intersection point of the outgoing light direction with the patch is the first shading point.
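  • A sketch of how the stored entry for the first shading point might be located from the user's viewing angle direction: the outgoing direction is expressed in the patch's local frame, converted to (θ, φ), and quantized to the nearest set-direction bin. The local-frame convention and 1° bins carry over from the earlier discretization sketch and are assumptions for illustration.

```python
import math

def direction_to_bins(w_out_local):
    """w_out_local: outgoing (viewing) direction as a unit vector in the
    patch's local frame, patch normal = +z. Returns (theta_bin, phi_bin)."""
    x, y, z = w_out_local
    theta = math.degrees(math.acos(max(min(z, 1.0), -1.0)))  # polar angle, 0..180
    phi = math.degrees(math.atan2(y, x)) % 360.0              # azimuth, 0..360
    # Quantize to the assumed 1-degree grid, clamped to the open hemisphere
    return min(int(theta), 89), int(phi) % 360

theta_bin, phi_bin = direction_to_bins((0.3, 0.4, math.sqrt(1 - 0.25)))
```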
  • Step 702 and step 703 do not need to be executed sequentially; step 702 and step 703 may also be executed simultaneously.
  • Step 704: the remote rendering platform determines the rendering result of the first shading point of the target patch according to the material parameters of the target patch, the common lighting component of the target patch, and the specular lighting component of the first shading point of the target patch.
  • The texture map determines the color values in the diffuse color component, that is, the RGB values.
  • The remote rendering platform can combine the texture map parameters of the target patch with the common lighting component of the target patch to obtain the color component of the target patch after texture mapping (which can be understood as the diffuse reflection color component).
  • The texture map parameters of the shading points in the target patch may differ, so for each shading point, the texture map parameter of that shading point can be combined with the common lighting component of the target patch to obtain the color component of that shading point, thereby obtaining the color components of all shading points on the target patch.
  • The remote rendering platform can obtain the rendering result of the first shading point of the target patch by weighted superposition of the color component of the target patch and the specular lighting component of the first shading point of the target patch.
  • The remote rendering platform determines the weights corresponding to the common lighting component and the specular lighting component respectively; the rendering result of point p on the target patch can then be expressed by the following formula (7):

    $L_0(p, \omega_0) = k_d \cdot c \cdot L_{\text{common}}(p) + k_s \cdot L_{\text{spec}}(p, \omega_0)$  (7)
  • c represents the texture map parameter of point p on the target patch.
  • k_d represents the first weight of the common lighting component.
  • k_s represents the second weight of the specular lighting component.
  • Alternatively, take the case where the common lighting component is calculated using formula (4) and the specular lighting component using formula (6). The rendering result of point p on the target patch determined by the remote rendering platform can then be expressed by formula (8), where c again represents the texture map parameter of point p on the target patch.
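  • The formula images themselves are not reproduced in this text. Reconstructed from the surrounding symbol definitions, and therefore an assumption rather than the patent's exact notation, formulas (7) and (8) plausibly read:

```latex
% Reconstruction from the surrounding definitions; treat as an assumption.
% Formula (7): weights applied at render time to the unweighted components.
L_o(p,\omega_o) = k_d \, c \, L_{\mathrm{common}}(p) + k_s \, L_{\mathrm{spec}}(p,\omega_o)
% Formula (8): weights already folded into the precomputed components.
L_o(p,\omega_o) = c \, \tilde{L}_{\mathrm{common}}(p) + \tilde{L}_{\mathrm{spec}}(p,\omega_o)
```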
  • The above k_d and k_s can be set by the provider or the user of the virtual scene. In some scenarios, when the provider of the virtual scene sets k_d and k_s, they can be included in the precomputation of the common lighting component and the specular lighting component: the common lighting component is then calculated using formula (4) and the specular lighting component using formula (6). In other scenarios, when k_d and k_s are user-settable, they are not included when precomputing the common lighting component and the specular lighting component: the common lighting component is then calculated using formula (3) and the specular lighting component using formula (5).
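  • To make the combination concrete, here is a minimal Python sketch of the weighted superposition. It assumes the reconstructed forms of formulas (7)/(8) above, with an RGB texture value, a scalar common lighting component, and an RGB specular component; these data shapes and the function name are assumptions.

```python
def shade_point(c, common, spec, k_d=1.0, k_s=1.0):
    """Weighted superposition per the reconstructed formula (7).

    c:      RGB texture map value of the shading point, e.g. (r, g, b)
    common: scalar common lighting component of the patch (formula (3))
    spec:   RGB specular lighting component of the shading point (formula (5))
    If k_d/k_s were folded into the stored components during precomputation
    (formulas (4) and (6)), call with the defaults k_d = k_s = 1.0.
    """
    return tuple(k_d * ci * common + k_s * si for ci, si in zip(c, spec))

# Hypothetical usage:
# result = shade_point((0.8, 0.6, 0.4), 1.2, (0.1, 0.1, 0.1), k_d=0.9, k_s=0.5)
```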
  • The following takes the case where the material parameters of the target patch include normal map parameters as an example. In the embodiments of the present application, an approximation is adopted for patches with normal maps: an effect similar to applying the normal map is achieved by perturbing the surface normal during real-time rendering.
  • A map can be considered a layer of skin attached to the surface of a geometric object. If the map is high-precision, one patch can be mapped to a region of the map, and that region can contain multiple map values.
  • the normal map of a patch can be used to express bump details smaller than the size of the patch.
  • In the embodiments of the present application, a new direction can be determined by combining the normal map parameters of the patch with the user's viewing direction; the specular lighting component in this new direction can approximate the effect of applying the map at the shading point corresponding to the user's viewing direction.
  • The normal map parameters and the approximation provided by the embodiments of the present application apply to the embodiment corresponding to FIG. 7 above. For a given shading point, an approximate ray direction can be determined from the user's viewing direction and the normal map parameters of the patch, and the specular lighting component of that approximate ray direction is used as the approximate specular lighting component of the shading point after the normal map is applied.
  • The following exemplarily describes how an approximate direction is determined from the user's viewing direction and the normal map parameters of the shading point. Referring to FIG. 8, take the user's viewing direction wo as an example, and take determining the approximate direction for the first shading point of the target patch as an example; the calculation for other shading points is similar and is not repeated in the embodiments of the present application.
  • u1: the remote rendering platform determines, according to the normal map parameters of the target patch, the first normal direction of the first shading point p of the target patch after the normal map is applied. The remote rendering platform can obtain the parameter value corresponding to point p of the target patch in the normal map parameters; the meaning of this value depends on the type of normal map. For example, it can be the height at point p or the tangent-space normal at point p. After obtaining the parameter value of point p, the first normal direction of point p after applying the normal map can be determined from the normal direction of the target patch. For ease of distinction, the normal direction of the target patch itself is called the second normal direction. Referring to FIG. 8, the first normal direction is denoted Ns and the second normal direction is denoted Ng.
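  • For illustration, a minimal sketch of step u1 under the assumption that the normal map stores tangent-space normals (one of the map types mentioned above); the numpy-based layout, the [0,1]-to-[-1,1] remapping, and the function names are assumptions, not the patent's implementation.

```python
import numpy as np

def perturbed_normal(ng, tangent, n_ts):
    """Map a tangent-space normal-map sample n_ts to the first normal Ns.

    ng:      geometric patch normal Ng, unit length
    tangent: a unit tangent of the patch; the bitangent is derived from it
    n_ts:    normal-map value remapped to [-1, 1], e.g. 2 * rgb - 1
    """
    bitangent = np.cross(ng, tangent)
    tbn = np.column_stack((tangent, bitangent, ng))  # tangent-to-world basis
    ns = tbn @ np.asarray(n_ts, dtype=float)
    return ns / np.linalg.norm(ns)
```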
  • u2: the remote rendering platform determines, with the first normal direction as the reference, the first direction into which the user's viewing direction is reflected. This can be understood as the first direction and the user's viewing direction being mirror-symmetric about the first normal direction. In FIG. 8, the first direction is denoted wi'.
  • u3: the remote rendering platform determines the second direction into which the first direction wi' is reflected, with the second normal direction Ng of the target patch as the reference. In FIG. 8, the second direction is denoted wo'. For ease of description, the wo' direction is called the approximate direction of the wo direction after the normal map is applied at shading point p. It should be noted that wo' is not the user's real viewing direction; rather, after the normal map is applied at shading point p, the effect of the specular lighting component of shading point p seen from the user's viewing direction can be approximated by the effect of the specular lighting component seen from the viewing direction wo'.
  • The following briefly explains why the specular lighting component in the wo' direction can approximate the specular lighting component, with the normal map applied, in the user's viewing direction at shading point p. It can be assumed that, without the normal map, the lighting effect of point p near the wo direction is caused by incident light from the wi direction; as shown in FIG. 8, the wi direction is the reflection of the wo direction about Ng. After the normal map is applied at point p, the normal direction of point p changes from Ng to Ns, so in theory the light brightness seen from the wo direction should be caused by incident light from the wi' direction. With the patch's own normal direction as the reference, the viewing direction corresponding to the brightness contributed by wi'-direction incident light is wo'. The lighting result actually produced by the wi' direction has therefore already been stored, namely as the specular lighting component stored for the wo' direction; the specular lighting component corresponding to the wo' direction on the target patch can thus be approximately regarded as the result of the wo direction after the normal map is applied.
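  • Steps u2 and u3 amount to two mirror reflections, which can be sketched as follows; the numpy-based helpers and names are assumptions, and all directions are unit vectors pointing away from the surface.

```python
import numpy as np

def reflect(v, n):
    # Mirror direction v about unit normal n (v points away from the surface).
    return 2.0 * np.dot(v, n) * n - v

def approximate_view_dir(wo, ns, ng):
    # FIG. 8 construction: wo -> wi' (about Ns, step u2) -> wo' (about Ng, step u3).
    wi_prime = reflect(wo, ns)
    return reflect(wi_prime, ng)
```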
  • It should be understood that, in an actual virtual scene, some patches may have only a normal map, such as an uneven, single-color surface; some patches may have only a texture map, such as a smooth wood-grain surface; and some patches may have both a normal map and a texture map, such as an uneven wood-grain surface.
  • Referring to FIG. 9A, consider a patch with a texture map and no normal map, for example patch 1. The remote rendering platform can obtain the common lighting component of patch 1 and then obtain the specular lighting components of patch 1 within the user's viewing range. It then calculates the color component of each shading point of patch 1 after the texture map is applied, according to the texture map parameters of the multiple shading points of patch 1 within the user's viewing range and the common lighting component of patch 1. Finally, the color components of patch 1 after texture mapping are weighted and superimposed with the specular lighting components of the multiple shading points within the user's viewing range to obtain the rendering result of patch 1 (a sketch follows below).
  • Taking shading point 1 of patch 1 as an example, the color component of shading point 1 is weighted and superimposed with the specular lighting component of patch 1 to obtain the rendering result of shading point 1.
  • Taking shading point 2 of patch 1 as an example, the color component of shading point 2 is weighted and superimposed with the specular lighting component of patch 1 in the user's viewing direction to obtain the rendering result of shading point 2.
  • FIG. 9A takes the case where the user sets the first weight k_d and the second weight k_s as an example.
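  • A minimal sketch of the FIG. 9A flow, reusing the shade_point sketch above; the dictionary-based data layout and the function name are assumptions.

```python
def render_patch_fig9a(common, tex_params, spec_components, k_d, k_s):
    """Texture-only patch: one common component, per-point texture and specular.

    common:          scalar common lighting component of the patch
    tex_params:      {shading_point_id: (r, g, b)} texture map parameters
    spec_components: {shading_point_id: (r, g, b)} specular components for the
                     set directions within the user's viewing range
    """
    results = {}
    for point_id, c in tex_params.items():
        results[point_id] = shade_point(c, common, spec_components[point_id], k_d, k_s)
    return results
```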
  • Referring to FIG. 9B, take patch 2 as an example, where the texture map parameters of the shading points of patch 2 are all the same but the normal map parameters of the shading points differ. The remote rendering platform can obtain the common lighting component of patch 2. For each shading point, it determines, from that shading point's normal map parameters combined with the principle of FIG. 8, the approximate ray direction corresponding to the shading point's outgoing ray direction. The remote rendering platform then obtains the specular lighting component in the approximate ray direction determined for each shading point, and determines the color component of patch 2 after the texture map is applied according to the texture map parameters of patch 2 and the common lighting component of patch 2.
  • Further, the color component of patch 2 is weighted and superimposed with the specular lighting component in the approximate ray direction determined for each shading point to obtain the rendering result of each shading point.
  • Taking shading point 3 of patch 2 as an example, the color component of patch 2 is weighted and superimposed with the specular lighting component in the approximate ray direction determined for shading point 3 to obtain the rendering result of shading point 3.
  • Taking shading point 4 of patch 2 as an example, the color component of patch 2 is weighted and superimposed with the specular lighting component in the approximate ray direction determined for shading point 4 to obtain the rendering result of shading point 4.
  • FIG. 9B takes the case where the user sets the first weight k_d and the second weight k_s as an example.
  • Referring to FIG. 9C, consider a patch with both a texture map and a normal map, for example patch 3. The remote rendering platform can obtain the common lighting component of patch 3. For each shading point, an approximate ray direction is determined for the shading point's outgoing ray direction from its normal map parameters combined with the principle of FIG. 8, and the remote rendering platform obtains the specular lighting component in the approximate ray direction determined for each shading point. Further, the color component of each shading point of patch 3 after the texture map is applied is calculated according to the texture map parameters of each shading point of patch 3 and the common lighting component of patch 3. The color component of each shading point is then weighted and superimposed with the obtained specular lighting component in the approximate ray direction determined for that shading point to obtain the rendering result of each shading point.
  • Taking shading point 5 of patch 3 as an example, the color component of shading point 5 is weighted and superimposed with the specular lighting component in the approximate ray direction determined for shading point 5 to obtain the rendering result of shading point 5.
  • Taking shading point 6 of patch 3 as an example, the color component of shading point 6 is weighted and superimposed with the specular lighting component in the approximate ray direction determined for shading point 6 to obtain the rendering result of shading point 6.
  • FIG. 9C takes the case where the user sets the first weight k_d and the second weight k_s as an example.
  • It should be understood that an image frame includes multiple pixels, and each pixel may correspond to one or more shading points. The value of each pixel can be obtained from the average of the rendering results of its one or more shading points; the average may be a weighted average or a plain average. Once the rendering results of the multiple pixels are obtained, the rendering result of one image frame is obtained.
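  • A minimal sketch of this per-pixel averaging, covering both the weighted and the plain average mentioned above; the function name and data shapes are assumptions.

```python
def pixel_value(shading_results, weights=None):
    """Average the RGB rendering results of the shading points behind one pixel.

    shading_results: list of (r, g, b) tuples
    weights:         optional per-point weights; plain averaging when omitted
    """
    n = len(shading_results)
    if weights is None:
        weights = [1.0 / n] * n
    total = sum(weights)
    return tuple(sum(w * c[i] for w, c in zip(weights, shading_results)) / total
                 for i in range(3))
```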
  • In the embodiments of the present application, the high-precision map part is decoupled from the lighting-effect part of the lower-precision patches. Only the relatively low-precision lighting results of the individual patches are precomputed and stored, while the high-precision maps are applied through simple, fast computation or approximation at real-time viewing. Rendering precision is thereby improved while the amount of storage is reduced.
  • The embodiment of the present application further provides a rendering engine, as shown in FIG. 10. The rendering engine is used to render a virtual scene, the virtual scene includes at least one model, the at least one model includes a plurality of patches, and each patch includes a plurality of shading points; the rendering engine includes a processing unit 1001 and a storage unit 1002.
  • The processing unit 1001 is configured to obtain, from the storage unit 1002, a precomputed common lighting component of a target patch and a precomputed specular lighting component of a first shading point, where the first shading point is located on the target patch, the common lighting component of the target patch is used to calculate the rendering results of the plurality of shading points, and the specular lighting component of the first shading point is used to indicate the light intensity of the outgoing light of the first shading point; and to calculate a rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point.
  • each patch corresponds to a common lighting component.
  • each shading point included in the target patch corresponds to a specular lighting component.
  • In a possible implementation, the processing unit 1001 determines an approximate ray corresponding to the outgoing ray according to the normal map parameters of the target patch, where both the outgoing ray and the approximate ray pass through the target patch, and the first shading point is the intersection of the outgoing ray and the target patch; the precomputed specular lighting component of the approximate ray is determined as the specular lighting component of the first shading point.
  • In a possible implementation, before acquiring the precomputed common lighting component of the target patch and the precomputed specular lighting component of the first shading point, the processing unit 1001 calculates, based on a ray tracing rendering engine, the common lighting component of the target patch and the specular lighting component of the first shading point, where the elements used to calculate the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angles between the one or more incident lights and the normal direction of the target patch.
  • In a possible implementation, the processing unit 1001 acquires a precomputed specular lighting component of a second shading point, where the second shading point is located on the target patch and the specular lighting component of the second shading point is used to indicate the light intensity of the light entering the second shading point; and calculates a rendering result of the second shading point according to the common lighting component of the target patch, the specular lighting component of the second shading point, and the texture map parameters of the second shading point.
  • In a possible implementation, the common lighting component of the target patch includes a first weight, the specular lighting component of the first shading point includes a second weight, and the rendering engine further includes a weight setting interface 1003. The weight setting interface 1003 is configured to receive the first weight and/or the second weight set by the user; the processing unit 1001 is further configured to calculate the common lighting component of the target patch according to the first weight, and to calculate the specular lighting component of the first shading point according to the second weight.
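  • Purely as an illustration, a hypothetical sketch of such a weight setting interface; the class, method, and attribute names are invented here and are not from the patent.

```python
class WeightSettingInterface:
    """Hypothetical sketch of weight setting interface 1003 (names assumed)."""

    def __init__(self, processing_unit):
        self.processing_unit = processing_unit

    def set_weights(self, k_d=None, k_s=None):
        # Receive the user-set first and/or second weight and hand them to
        # the processing unit for the weighted superposition of components.
        if k_d is not None:
            self.processing_unit.k_d = k_d
        if k_s is not None:
            self.processing_unit.k_s = k_s
```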
  • The present application also provides a computing device 900. As shown in FIG. 11, the computing device includes: a bus 902, a processor 904, a memory 906, and a communication interface 908. The processor 904, the memory 906, and the communication interface 908 communicate through the bus 902.
  • Computing device 900 may be a server or a terminal device. It should be understood that the present application does not limit the number of processors and memories in the computing device 900 .
  • The bus 902 may be a peripheral component interconnect (PCI) bus or an extended industry standard architecture (EISA) bus, among others. Buses can be divided into address buses, data buses, control buses, and so on. For ease of representation, only one line is used in FIG. 11, but this does not mean that there is only one bus or one type of bus. The bus 902 may include pathways for transferring information between the various components of the computing device 900 (e.g., the memory 906, the processor 904, the communication interface 908).
  • The processor 904 may include any one or more of processors such as a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor (MP), or a digital signal processor (DSP).
  • In some possible implementations, the processor 904 may include one or more graphics processors, and the processor 904 is configured to execute instructions stored in the memory 906 to implement the method described in the embodiment corresponding to FIG. 6 or FIG. 7.
  • In other possible implementations, the processor 904 may include one or more central processing units and one or more graphics processors, and the processor 904 is likewise configured to execute instructions stored in the memory 906 to implement the method described in the embodiment corresponding to FIG. 6 or FIG. 7.
  • The memory 906 may include a volatile memory, such as a random access memory (RAM). The memory 906 may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid state drive (SSD). Executable program code is stored in the memory 906, and the processor 904 executes the executable program code to implement the method described in the embodiment corresponding to FIG. 6 or FIG. 7. Specifically, the memory 906 stores the instructions with which the rendering node executes the method described in the embodiment corresponding to FIG. 6 or FIG. 7.
  • the communication interface 908 implements communication between the computing device 900 and other devices or communication networks using transceiver modules such as but not limited to network interface cards and transceivers.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • The computer-readable storage medium may be any available medium in which a computing device can store data, or a data storage device, such as a data center, that contains one or more available media. The available media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), or semiconductor media (e.g., solid state drives), etc.
  • the computer-readable storage medium includes instructions, and the instructions instruct a computing device to execute the method described in the embodiment corresponding to FIG. 6 or FIG. 7 above.
  • the embodiment of the present application also provides a computer program product including instructions.
  • The computer program product may be software or a program product containing instructions that can run on a computing device or be stored in any available medium.
  • When the computer program product runs on at least one computer device, the at least one computer device is caused to execute the method described in the embodiment corresponding to FIG. 6 or FIG. 7 above.
  • Those skilled in the art should understand that the embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction means, the instruction means implementing the functions specified in one or more procedures of the flowchart and/or one or more blocks of the block diagram.


Abstract

A rendering method and apparatus, used to solve the problem of high storage cost. In the precomputation stage of the present application, the material parameters of the individual shading points on a patch, which affect the rendering result, are left out of the calculation; when a given patch is subsequently rendered, a calculation result for that patch in which the material parameters of the patch's pixels did not participate can be queried from the precomputed results, and the rendering result of the patch is then obtained from that calculation result together with the material parameters of the patch. The material parameters of the shading points on a patch determine the materials of the different shading points on that patch. The embodiments of the present application therefore no longer take the material parameters into account in the precomputation stage, which reduces the amount of storage, and combine the material parameters again in the rendering stage, which improves rendering fineness.

Description

一种渲染方法及装置
相关申请的交叉引用
本申请要求在2021年11月22日提交中国专利局、申请号为202111382296.4、申请名称为“一种渲染方法、系统及设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中;本申请要求在2022年02月22日提交中国专利局、申请号为202210162160.0、申请名称为“一种渲染方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及计算机技术领域,尤其涉及一种渲染方法及装置。
背景技术
渲染技术是指根据三维模型数据(包括物体几何模型、表面材质等等)和光线数据(包括光源位置、颜色、强度等),输出模拟真实世界中相同模型和光照条件下的真实图片。而根据三维模型数据以及光线数据渲染图像的过程所需的计算量较大,消耗计算资源较多,使得实现渲染一张图片需要较长时间。
光线追踪技术是实现渲染的一种手段。为了减少虚拟场景包括的物体的渲染时长,可以将采用光线追踪算法针对虚拟场景中的各个物体的渲染结果预先存储下来,在用户观看时直接从存储的数据中读取渲染的结果即可,无需再进一步进行渲染计算,从而减少计算量,减少渲染时延。但是三维模型表面材质较复杂,一个虚拟场景中物体的数量较多,使得针对一个虚拟场景存储的数据量极大,进而导致存储成本较高。
发明内容
本申请实施例提供一种渲染方法及装置,用以解决存储成本高的问题。
第一方面,本申请实施例提供一种渲染方法,所述渲染方法用于渲染虚拟场景。所述虚拟场景包括至少一个模型,所述至少一个模型包括多个面片,每个面片包括多个着色点。所述方法包括:获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量。所述第一着色点位于所述目标面片上。所述目标面片的公共光照分量用于计算所述多个着色点的渲染结果。所述第一着色点的镜面光照分量用于指示所述第一着色点的出射光线的光照强度。根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。上述方案中,在预计算阶段中,计算各个面片的公共光照分量,以及各个着色点的镜面光照分量,将影响渲染结果的面片上的各个着色点的纹理贴图参数不计算在内,进而在渲染某个面片时,可以从预计算结果中,查询到未由该面片的像素点的纹理贴图参数参与的该面片的计算结果,然后再根据计算结果以及该面片的纹理贴图参数来得到该面片的渲染结果。面片上各个着色点的纹理贴图参数能够决定一个面片上不同着色点的材质。因此本申请实施例在预计算阶段不再将纹理贴图考虑在内,可以减少存储量,而在渲染阶段再结合纹理贴图参数,以 提高渲染精细度。
需要说明的是,上述目标面片为多个面片中的一个面片,该目标面片可以是渲染帧的画面中一个模型中任一面片。针对渲染帧中的画面中模型的面片中的着色点均完成渲染,则完成该帧的渲染。
在一种可能的设计中,每个面片对应一个公共光照分量,可以减少计算量。
在一种可能的设计中,所述目标面片包括的每个着色点对应一个镜面光照分量。入射光线到达目标面片的位置点可以理解为着色点。因此,目标面片包括的多个着色点可以理解为不同设定方向的出射光线与目标面片的交点。针对不同着色点可以均预计算镜面光照分量。
在一种可能的设计中,获取所述预先计算的第一着色点的镜面光照分量,包括:根据所述目标面片的法线贴图参数,确定与出射光线对应的近似光线,其中,所述出射光线和所述近似光线均经过所述目标面片,所述第一着色点为所述出射光线与所述目标面片的交点;将预先计算的所述近似光线的镜面光照分量确定为所述第一着色点的镜面光照分量。
上述设计中,通过面片的法线贴图参数结合出射光线方向(可以理解为从其他方向的光线经过面片反射后进入人眼的方向)能够确定一个新的方向,即近似光线的方向,这个方向上的镜面光照分量可以近似于在该面片的出射光线方向上结合法线贴图的效果。无需在预计算阶段将法线贴图参数考虑在内,从而在无需增加存储量的同时提高渲染精度。
在一种可能的设计中,所述获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量前,所述方法还包括:基于光线追踪方法计算所述目标面片的公共光照分量和所述第一着色点的镜面光照分量,其中,计算所述目标面片的公共光照分量利用的元素仅包括所述目标面片上一条或多条入射光的亮度和所述一条或多条入射光与所述目标面片法线方向的夹角。计算所述目标面片的公共光照分量利用的元素不包括目标面片的材质参数。
上述方案中,在预计算阶段中,计算各个面片的公共光照分量,以及各个着色点的镜面光照分量,将影响渲染结果的面片上材质参数不计算在内,进而在渲染某个面片时,可以从预计算结果中,查询到未由该面片的像素点的材质参数参与的该面片的计算结果,然后再根据计算结果以及该面片的材质参数来得到该面片的渲染结果。面片上各个着色点的材质参数能够决定一个面片上不同着色点的材质。因此本申请实施例在预计算阶段不再将材质参数考虑在内,可以减少存储量,而在渲染阶段再结合材质参数,以提高渲染精细度。
在一种可能的设计中,所述方法还包括:获取预先计算的第二着色点的镜面光照分量,其中,所述第二着色点位于所述目标面片上,所述第二着色点的镜面光照分量用于指示进入所述第二着色点的光线的光照强度;根据所述目标面片的公共光照分量、所述第二着色点的镜面光照分量和所述第二着色点的纹理贴图参数,计算所述第二着色点的渲染结果。每个面片上可以包括多个着色点,仅需要获取多个着色点所属的面片的公共光照分量,并且获取该着色点的镜面光照分量,无需预计算每个着色点在结合纹理贴图后部分光照分量,从而减少计算量的通知提高渲染精细度。
在一种可能的设计中,所述目标面片的公共光照分量包括第一权重,所述第一着色点的镜面光照分量包括第二权重,所述方法还包括:提供权重设置接口,接收用户设置的所述第一权重和/或所述第二权重;根据所述第一权重计算所述目标面片的公共光照分量;根据所述第二权重计算所述第一着色点的公共光照分量。
上述设计中,在预计算阶段,将权重考虑在内,从而渲染阶段权重无需参与计算,提供渲染效率。
在一种可能的设计中,所述方法还包括:提供权重设置接口,接收用户设置的所述第一权重和/或所述第二权重;所述根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果,包括:根据所述目标面片的公共光照分量与所述第一权重的乘积、所述第一着色点的镜面光照分量与所述第二权重的乘积和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。
上述设计中,在预计算阶段,权重不再考虑在内,从而渲染阶段支持用户修改权重,增强渲染效果。
第二方面,本申请实施例提供一种渲染引擎,所述渲染节点用于渲染虚拟场景,所述虚拟场景包括至少一个模型,所述至少一个模型包括多个面片,每个面片包括多个着色点;所述渲染节点包括处理单元和存储单元;
所述处理单元,用于从存储单元获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量,其中,所述第一着色点位于所述目标面片上,所述目标面片的公共光照分量用于计算所述多个着色点的渲染结果,所述第一着色点的镜面光照分量用于指示进入所述第一着色点的光线的光照强度;根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。
在一种可能的设计中,每个面片对应一个公共光照分量。
在一种可能的设计中,所述目标面片包括的每个着色点对应一个镜面光照分量。
在一种可能的设计中,所述处理单元,具体用于:根据所述目标面片的法线贴图参数,确定与出射光线对应的近似光线,其中,所述出射光线和所述近似光线均经过所述目标面片,所述第一着色点为所述出射光线与所述目标面片的交点;将预先计算的所述近似光线的镜面光照分量确定为所述第一着色点的镜面光照分量。
在一种可能的设计中,所述处理单元,还用于:在获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量前,基于光线追踪渲染引擎计算所述目标面片的公共光照分量和所述第一着色点的镜面光照分量,其中,计算所述目标面片的公共光照分量利用的元素仅包括所述目标面片上一条或多条入射光的亮度和所述一条或多条入射光与所述目标面片法线方向的夹角。
在一种可能的设计中,所述处理单元,还用于:获取预先计算的第二着色点的镜面光照分量,其中,所述第二着色点位于所述目标面片上,所述第二着色点的镜面光照分量用于指示进入所述第二着色点的光线的光照强度;根据所述目标面片的公共光照分量、所述第二着色点的镜面光照分量和所述第二着色点的纹理贴图参数,计算所述第二着色点的渲染结果。
在一种可能的设计中,所述目标面片的公共光照分量包括第一权重,所述第一着色点的镜面光照分量中包括第二权重,所述渲染引擎还包括权重设置接口;所述权重设置接口,用于接收用户设置的所述第一权重和/或所述第二权重;所述处理单元,还用于根据所述第一权重计算所述目标面片的公共光照分量;根据所述第二权重计算所述第一着色点的公共光照分量。
在一种可能的设计中,所述渲染引擎还包括权重设置接口;所述权重设置接口,用于接收用户设置的所述第一权重和/或所述第二权重;所述处理单元,具体用于根据所述目标面片的公共光照分量与所述第一权重的乘积、所述第一着色点的镜面光照分量与所述第二权重的乘积和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。
第三方面,本申请实施例还提供一种渲染方法,所述渲染方法用于渲染虚拟场景,所述虚拟场景包括至少一个模型,每个模型包括多个面片,每个面片包括多个着色点,所述方法包括:获取预先计算的目标面片的公共光照分量,其中,所述第一着色点位于所述目标面片上,所述目标面片的公共光照分量用于计算所述多个着色点的渲染结果。根据所述目标面片的法线贴图参数,确定与出射光线对应的近似光线,其中,所述出射光线和所述近似光线均经过所述目标面片,所述第一着色点为所述出射光线与所述目标面片的交点。获取预先计算的所述近似光线的镜面光照分量。根据所述目标面片的公共光照分量、所述近似光线的镜面光照分量,计算所述第一着色点的渲染结果。
其中,在渲染阶段将预先计算的所述近似光线的镜面光照分量近似作为第一着色点结合法线贴图后对应的镜面光照分量。
在一种可能的设计中,所述目标面片具有纹理贴图;根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量,计算所述第一着色点的渲染结果,包括:
根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。
在一种可能的设计中,每个面片对应一个公共光照分量。
在一种可能的设计中,所述目标面片包括的每个着色点对应一个镜面光照分量。
在一种可能的设计中,所述获取预先计算的目标面片的公共光照分量和预先计算的近似光线的镜面光照分量前,所述方法还包括:基于光线追踪方法计算所述目标面片的公共光照分量和所述近似光线的镜面光照分量,其中,计算所述目标面片的公共光照分量利用的元素仅包括所述目标面片上一条或多条入射光的亮度和所述一条或多条入射光与所述目标面片法线方向的夹角。
在一种可能的设计中,所述目标面片的公共光照分量包括第一权重,所述第一着色点的镜面光照分量中包括第二权重,所述方法还包括:提供权重设置接口,接收用户设置的所述第一权重和/或所述第二权重;根据所述第一权重计算所述目标面片的公共光照分量;根据所述第二权重计算所述近似光线的镜面光照分量。
在一种可能的设计中,所述方法还包括:提供权重设置接口,接收用户设置的所述第一权重和/或所述第二权重;所述根据所述目标面片的公共光照分量、所述近似光线的镜面光照分量,计算所述第一着色点的渲染结果,包括:
根据所述目标面片的公共光照分量与所述第一权重的乘积、所述近似光线的镜面光照分量与所述第二权重的乘积,计算所述第一着色点的渲染结果。
第四方面,本申请实施例提供一种渲染引擎,所述渲染节点用于渲染虚拟场景,所述虚拟场景包括至少一个模型,所述至少一个模型包括多个面片,每个面片包括多个着色点;所述渲染节点包括处理单元和存储单元;
所述处理单元,用于从存储单元获取预先计算的目标面片的公共光照分量,其中,所述第一着色点位于所述目标面片上,所述目标面片的公共光照分量用于计算所述多个着色点的渲染结果。根据所述目标面片的法线贴图参数,确定与出射光线对应的近似光线,其 中,所述出射光线和所述近似光线均经过所述目标面片,所述第一着色点为所述出射光线与所述目标面片的交点。从存储单元获取预先计算的所述近似光线的镜面光照分量。根据所述目标面片的公共光照分量、所述近似光线的镜面光照分量,计算所述第一着色点的渲染结果。
第五方面,本申请实施例提供提一种包含指令的计算机程序产品,当该指令被计算机设备集群运行时,使得该计算机设备集群执行如第一方面或第一方面的任意可能的设计提供的方法,或者,使得该计算机设备集群执行如第三方面或第三方面的任意可能的设计提供的方法。
第六方面,本申请实施例提供一种计算机可读存储介质,包括计算机程序指令,当所述计算机程序指令由计算设备集群执行时,所述计算设备集群执行如第一方面或第一方面的任意可能的设计提供的方法,或者执行如第三方面或第三方面的任意可能的设计提供的方法。
第七方面,本申请实施例提供一种计算设备集群,包括至少一个计算设备,每个计算设备包括处理器和存储器;至少一个计算设备的处理器用于执行至少一个计算设备的存储器中存储的指令,以使得该计算设备执行如第一方面或第一方面的任意可能的设计提供的方法,或者执行如第三方面或第三方面的任意可能的设计提供的方法。
在一些可能的设计中,该计算设备集群包括一个计算设备,该计算设备包括处理器和存储器;该处理器用于执行该存储器中存储的指令以运行第二方面或第二方面的任意可能的设计提供的渲染引擎,以使得该计算设备执行如第一方面或第一方面的任意可能的设计提供的方法,或者该处理器用于执行该存储器中存储的指令以运行第四方面或第四方面的任意可能的设计提供的渲染引擎,以使得该计算设备执行如第三方面或第三方面的任意可能的设计提供的方法。
在一些可能的设计中,该计算设备集群包括至少两个计算设备,每个计算设备包括处理器和存储器。该至少两个计算设备的处理器用于执行该至少两个计算设备的存储器中存储的指令以运行第二方面或第二方面的任意可能的设计提供的渲染引擎,以使得该计算设备集群执行如第一方面或第一方面的任意可能的设计提供的方法;或者,该至少两个计算设备的处理器用于执行该至少两个计算设备的存储器中存储的指令以运行第四方面或第四方面的任意可能的设计提供的渲染引擎,以使得该计算设备集群执行如第三方面或第三方面的任意可能的设计提供的方法。每个计算设备运行了渲染引擎的包括的部分单元。
附图说明
图1为本申请实施例中UV贴图示意图;
图2为本申请实施例中开放半球空间示意图;
图3为本申请实施例中纹理贴图前后示意图;
图4为本申请实施例中反射场景中其他面片的材质参数示意图;
图5为本申请实施例中渲染系统示意图;
图6为本申请实施例中预计算部分流程示意图;
图7为本申请实施例中实时渲染部分流程示意图;
图8为本申请实施例中法线贴图近似方向原理示意图;
图9A为本申请实施例中仅存在纹理贴图的渲染流程示意图;
图9B为本申请实施例中仅存在法线贴图的渲染流程示意图;
图9C为本申请实施例中存在法线贴图和纹理贴图的渲染流程示意图;
图10为本申请实施例中渲染节点结构示意图;
图11为本申请实施例中计算设备结构示意图。
具体实施方式
如下先对本申请实施例中涉及的部分用语以及相关技术进行解释说明,以便于本领域技术人员容易理解。
三维模型是物体的多边形表示,通常用计算机或者其它视频设备进行显示。三维模型的形状可以是多种多样的,例如,可以是球体、锥体、曲面物体、平面物体以及表面不规则物体等等。显示的物体是可以是现实世界的实体,也可以是虚构的物体。三维模型经常用三维建模工具这种专门的软件生成,但是也可以用其它方法生成。作为点和其它分量集合的数据,三维模型可以手工生成,也可以按照一定的算法生成。
面片也可以称为网格。通常在渲染中,需要将空间(或者虚拟场景)中的模型划分成无数个微小的平面。这些平面又被称为面片,它们可以是任意多边形,常用的是三角形和四边形。形状不同的三维模型的面片的形状也可以不同。例如,球体的面片和曲面物体的面片的形状可以完全不同。面片的尺寸可以根据需要进行设置,对渲染出的图像的精度要求越高的情况下,面片的尺寸可以设置得越小。面片在三维模型的位置、形状或者尺寸等可以通过几何参数来描述。
光线追踪渲染是通过跟踪从观察者(例如相机或者人眼)的视点朝着渲染图像的每个像素发射的光线射入虚拟场景的光的路径来产生渲染图像的渲染方法。其中,虚拟场景包括光源以及三维模型。光线追踪分为正向光线追踪和反向光线追踪。正向光线跟踪是指从光源出发,正向跟踪光线在虚拟场景中的传递过程。渲染设备对虚拟场景中的光源产生的光线进行正向光线跟踪,从而得到虚拟场景中的三维模型中每个面片(或者网格)的光线强度。反向光线追踪是通过跟踪从设定视角进入三维模型的网格的光线,在虚拟场景中的传递至光源的过程。其中,设定视角是用户观察虚拟场景的某个角度。
假设虚拟场景中存在一个或者多个光源,以及一个或者多个三维模型。光源产生的光线照射到三维模型上。其中,光源可以是点光源、线光源或者面光源等等。光源产生的光线在接触到三维模型中的一个或者多个面片后,在每个接触点会发生如下情形中的一种:折射、反射或漫反射。然后进入到用户的眼睛。
三维模型中每个面片具有一定的材质特征。面片的材质可以分为以下三种:透明材质、光滑的不透明材质和粗糙的不透明材质。根据面片材质的不同,面片对于光的折/反射情况又可以分为折射、反射和漫反射三种情况。其中,光线接触透明材质会发生折射,光线接触表面光滑的不透明材质会发生反射,光线接触表面粗糙的不透明材质会发生漫反射。示例性地,材质特征也可以通过材质参数来表达。材质参数可以包括纹理、表面凹凸程度、金属度(metalness)、粗糙度(roughness)、透明度(opacity)、折反射度等。
在渲染中,可以为不同的物体指定不同的材质参数。如果需要某个物体上的不同位置具有不同的表现效果,则可以用贴图来为该物体指定材质参数。物体到贴图之间有一个从立体几何体表面展开到2D平面的一个映射(即UV展开)关系。U和V指的是2D平面的水平轴和垂直轴。例如,参见图1所示,图1中(a)汽车座椅进行UV展开后如图1 中(b)所示。如果为展开后的2D平面图中不同位置指定不同的值,就相当于在物体表面贴上一张图片。如果指定的值是红绿蓝(red green blue,RGB)颜色,则决定材质参数中的颜色值,那这张贴图就是纹理贴图(texture)。同样的技术理论上也适用于其它的材质参数,也就是说,所有的材质参数的取值都可以为常数(整个物体相同参数值),也可以由贴图给出(物体表面不同位置的材质参数取值不一样)。
纹理可以为二维图像上的像素特征分量,当把纹理按照特定的方式映射到三维模型表面上也称纹理贴图。纹理用于表达三维模型表面的颜色(或者折射吸收系数)。表面凹凸程度通过法线贴图来表达。
法线贴图影响三维模型表面的法线朝向,不影响颜色,主要用于表达表面的高低、坑洼等,制造出凹凸不平的表面效果。
金属度可以通过金属度贴图来表达。金属度贴图表示物体表明的镜面反射属性。金属度贴图可以定义三维模型是否为金属或者电介质(非金属)。粗糙度可以通过粗糙度贴图来表达。粗糙度贴图表示物体表面的漫反射属性。
粗糙度贴图可以定义物体表面反射的粗糙度或平滑程度。透明度(opacity)表示物体的透明程度。
理论上,如果把从不同设定方向进行预追踪的每个着色点的光线强度保存在该点上,在之后的渲染过程中,就不需要在计算该点的渲染结果,可以直接复用。其中,针对每个面片通常可以保存多个着色点的光线强度。例如,参见图2所示。对于面片存储一个以开放半球空间形式表达的数据。开放半球空间按照极角θ、方位角α可以分成多个设定方向。设定方向可以表达为半球坐标系中的(θ,α),其中,0≤θ≤90,0≤α<360。针对设定方向的出射光线与面片的交点存在出射光的颜色,即按照设定方向观看到面片上交点的颜色。设定方向的出射光线与面片的交点可以理解为一个着色点,比如图2中p1、p2和p3。如果针对一个面片预计算从90*360个方向的出射光线对应的数据,可以理解为,针对该面片计算90*360个着色点对应的数据。以面片为例,存下来的一个着色点的数据也可以理解为从设定方向观看到所述面片的颜色值。每个设定方向观看到的面片的颜色值可以基于光线追踪算法得到。光线追踪可以包括正向光线追踪和/或反向光线追踪。为了便于描述,将计算多个设定方向观看到的每个面片的颜色值存储下来,称为预计算。
应理解的是,一般贴图的像素密度远远大于几何体的面片密度,具体密度视纹理的精细度而定。比如一个汽车方向盘的皮革纹理贴图可能是4096*4096或8192*8192,而方向盘的皮革部分可能只有几千个面片。如果一个汽车方向盘的几何体通过皮革纹理贴图渲染时,可以针对4096*4096或8192*8192个面片进行渲染。而一个汽车场景中不仅仅包括方向盘,还包括其它物体,比如座椅等。作为一种举例,汽车场景中包括110个物体。110个物体中一些物体具有纹理贴图,一些物体具有纹理贴图和法线贴图,还有一些仅具有法线贴图。举例来说,以50%的物体具有纹理贴图和法线贴图为例。以每张贴图的分辨率为4K为例,即4096*4096。针对该汽车场景中的50%的物体来说,需要存储55*4096*4096个开放半球空间。每个开放半球空间有90*360个角度(着色点)。因此需要存储1.8*10^14个浮点数,存储成本较高。因此,为了避免了较高的存储成本,预计算的方式通常所应对渲染的精细度是有限的,一般只能用来存储比较平缓、粗糙度较大的物体。一般来说存储的开放半球空间的数量和面片的数量规模相当。但是仅缓存面片数量规模的开放半球空间,使得在实际渲染时无法还原精细的贴图效果。例如参见图3所示,图3中(a)表示经过纹 理贴图后的效果图,图3中(b)表示使用预计算的数据渲染的效果图。
基于此,本申请实施例提供一种渲染方案,在不增加存储成本的前提下,提高渲染的精细度。如前所述,决定三维模型的材质属性的每个值均可以由材质参数来输入。材质参数可以通过贴图来表达。本申请中,在预计算阶段中,将影响渲染结果的面片上的各个着色点的材质参数不计算在内,进而在渲染某个面片时,可以从预计算结果中,查询到未考虑该面片的材质参数的该面片的预计算结果,然后再根据预计算结果以及该面片的材质参数来得到该面片的渲染结果。
一个着色点的渲染结果可以由漫反射颜色(diffuse颜色)分量以及镜面光照(specular光照)分量线性组合而得。镜面光照分量可以称为高光光照分量。需要说明的是,对于漫反射颜色分量而言,接触点从各个角度反射出光线的颜色通常是一样的。换言之,在三维模型和光源的相对位置和其他条件不变的前提下,不同的两个视角方向看到同一个可以发生漫反射的着色点的漫反射颜色分量是一样的。而漫反射颜色分量中用户观看到一个面片上某个着色点的颜色与该着色点的材质参数相关,换言之,面片上各个着色点的材质参数能够决定一个面片上不同着色点的颜色值(或者说RGB值)。本申请实施例中为了提高精细度的同时减少存储量,可以针对三维模型中每个面片计算未由自身面片的像素点的材质参数参与的漫反射颜色分量。为了与由目前的由材质参数参与的漫反射颜色分量区分,本申请实施例中将未考虑自身面片的材质参数而计算得到的漫反射颜色分量称为公共光照分量。
针对镜面光照分量来说,与漫反射颜色分量不同的是,镜面光照分量与用户视角有关。针对一个面片上的着色点,用户通过不同的视角看到的镜面光照分量是不同的。也即,不同的缓存点的镜面光照分量可以是不同的。具体地,本申请对于每个面片的镜面光照分量,可以存储一个以开放半球空间形式表达的数据。开放半球空间按照极角θ、方位角α可以分成多个设定方向。设定方向可以表达为半球坐标系中的(θ,α),其中,0≤θ≤90,0≤α<360。针对每个面片,在每个设定方向上存储出射光的镜面光照分量。以面片为例,将多个设定方向对应的着色点的镜面光照分量存储下来。每个着色点的镜面光照分量可以基于光线追踪算法得到。光线追踪可以包括正向光线追踪和/或反向光线追踪。
为了便于理解,下面结合渲染方程对漫反射颜色(diffuse颜色)分量以及镜面光照(specular光照)分量进行说明。
渲染方程用于近似的求出每束光线对一个给定材质的平面上最终反射出来的光线所作出的贡献程度。渲染方程可以参见如下公式(1)所示。
Figure PCTCN2022127466-appb-000001
其中,ω 0表示用户观察的视角方向。视角方向也可以理解为出射光线方向。p表示面片上的一个着色点。L 0(p,ω 0)表示在视角ω 0方向的着色点p的光照亮度。ω i表示光线入射方向。n表示p点所在平面的法线方向。L i(p,ω i)表示p点的ω i方向的入射光亮度。N表示p点的光线入射光方向的数量。D表示法线分布函数(normal distribution function,NDF)。法线分布函数,用于描述面片上法线分布的概率,即正确朝向的法线的浓度,因为该分布决定了镜面高光整体的形状和强度,低粗糙度会产生锐利清晰的镜面反射和狭窄细长的镜面波瓣。通过美术材质中的粗糙度上做参数化,来描述和控制微平面的凹凸程度和其结果,即 通过D函数可以表达面片的粗糙度。粗糙度可以决定入射的光线是以漫反射还是镜面反射的形式产生出射光。F表示菲涅尔函数(fresnel equation),用于描述不同的表面角下反射的光线所占的比率。G表示几何函数,也被称为掩蔽阴影函数(masking-shadow function),描述面片自阴影属性,即面片并未被遮蔽的百分比。c表示所述纹理贴图数据中p点对应的纹理颜色,k d和k s均表示权重。
上述渲染方程经过变换处理,可以通过如下公式(2)来表达。
Figure PCTCN2022127466-appb-000002
其中,公式(2)中的
Figure PCTCN2022127466-appb-000003
这一项可以理解为漫反射颜色分量或者称为diffuse项。
Figure PCTCN2022127466-appb-000004
这一项可以理解为镜面光照分量或者称为specular项。
本申请实施例在预计算阶段可以预先计算虚拟场景中三维模型的每个面片的公共光照分量以及每个面片的多个设定方向的镜面光照分量(即每个面片的多个着色点的镜面光照分量)。以第一面片为例,第一面片的公共光照分量以及第一面片的多个着色点的镜面光照分量是根据虚拟场景中的光源、第一面片的几何参数以及虚拟场景中的其它面片的材质参数及几何参数进行光线追踪获得的。需要说明的是,本申请实施例中,在计算第一面片的公共光照分量以及镜面光照分量,需要其它对第一面片产生影响的面片的材质参数的参与。比如光源照射到其它某个面片上经过反射后会到达该第一面片,则该某个面片对公共光照分量以及镜面光照分量的计算产生影响。
比如,参见图4所示,反射场景中,假设虚拟场景只有一个光源410、不透明正方体411以及不透明球体412。从光源410发出一条光线,投射到不透明正方体411的中心点为Q 1的面片1,然后,被反射到不透明球体412的一个中心点为点Q 2的面片2上。光线照射到不透明正方体411的面片1,比如,面片1带有颜色,经过面片1反射后,会对面片2产生影响。因此,在计算面片2的公共光照分量以及镜面光照分量时,需要将面片1的材质参数考虑在内。应理解的是,在计算时,仅需要将面片1上反射点的材质参数考虑在内,并不需要对面片1所有的着色点的材质参数均考虑在内,因此并不会增加计算量。
应理解的是,上述图4仅作为一种举例。在实际应用中,虚拟场景远远比图4要复杂,例如,虚拟场景中可能同时存在多个不透明物体以及多个透明物体,因此,光线会被多次反射或者漫反射,另外光源可能不止一个,而是两个或者更多。因此对预计算某个面片的公共光照分量以及镜面光照分量时,需要考虑对该面片的光照产生影响的多个其它面片。
如下结合渲染方程对公共光照分量以及镜面光照分量进行简要描述。以目标面片为例,如前所述,目标面片的公共光照分量可以理解为未考虑目标面片的材质参数的漫反射颜色分量。目标面片的镜面光照分量可以理解为目标面片的多个设定方向(即多个着色点)对应的多个镜面光照分量。
示例性地,结合上述渲染方程,公共光照分量可以通过如下公式(3)或者公式(4)来表达:
Figure PCTCN2022127466-appb-000005
Figure PCTCN2022127466-appb-000006
其中,p表示面片对应的着色点,
Figure PCTCN2022127466-appb-000007
表示p点的公共光照分量,ω i表示光线入射方向,n表示p点所在面片的法线方向,L i表示p点的ω i方向的入射光亮度,N表示p点的光线入射光方向的数量,c表示所述纹理贴图数据中p点对应的纹理颜色。应理解的是,上述公式(4)与公式(3)的区别在于,公式(4)将权重k d考虑在内,而公式(3)并非将权重k d计算在内。k d可以由虚拟场景的提供者或者用户来设置。一些实施例中,用户或者虚拟场景的提供者可以针对不同的虚拟场景配置不同的k d,也可以针对虚拟场景中不同的模型配置不同的k d,也可以针对不同的面片配置不同的k d,本申请实施例对此不作具体限定。从上述公式(3)和公式(4)可以确定,同一面片上不同着色点的公共光照分量与视角方向没有关系,因此,属于同一面片的各个着色点对应的公共光照分量均相同。
示例性地,镜面光照分量可以通过如下公式(5)或者公式(6)来表达:
Figure PCTCN2022127466-appb-000008
Figure PCTCN2022127466-appb-000009
其中,ω 0表示视角方向,p表示面片对应的着色点,
Figure PCTCN2022127466-appb-000010
表示p点的镜面光照分量,ω i表示光线入射方向,n表示p点所在面片的法线方向,L i表示p点的入射光亮度,N表示p点的光线入射光方向的数量,D表示法线分布函数,F表示菲涅尔函数,G表示几何函数。应理解的是,上述公式(6)与公式(5)的区别在于,公式(6)将权重k s考虑在内,而公式(5)并非将权重k s计算在内。k s可以由虚拟场景的提供者或者用户来设置。一些实施例中,用户或者虚拟场景的提供者可以针对不同的虚拟场景配置不同的k s,也可以针对虚拟场景中不同的模型配置不同的k s,也可以针对不同的面片配置不同的k s,本申请实施例对此不作具体限定。
下面结合应用场景对本申请实施例提供的方案进行详细说明。
首先对本申请适用的渲染系统进行描述。参见图5,图5是本申请涉及的一种渲染系统的结构示意图。本申请的渲染系统用于通过渲染方法对虚拟场景的三维(2D)模型进行渲染得到的2D图像,即渲染图像。本申请的渲染系统可以包括:一个或多个终端设备100、网络设备200以及远程渲染平台300。远程渲染平台300具体可以部署在云服务器上。远程渲染平台300和终端设备100一般部署在不同的数据中心内。
终端设备100可以是需要实时显示渲染图像的设备,例如,可以是用于飞行训练的虚拟现实设备(virtual reality,VR)、可以是用于虚拟游戏的电脑以及用于虚拟商城的智能手机等等,此处不作具体限定。终端设备可以是高配置、高性能(例如,多核、高主频、内存大等等)的设备,也可以是低配置,低性能(例如,单核、低主频、内存小等等)的设备。在一具体的实施例中,终端设备100可以包括硬件、操作系统以及渲染应用客户端。
网络设备200用于在终端设备100通过任何通信机制/通信标准的通信网络与远程渲染平台300之间传输数据。其中,通信网络可以是广域网、局域网、点对点连接等方式,或它们的任意组合。
远程渲染平台300包括一个或多个远程渲染节点。远程渲染节点也可以简称为渲染节点。远程渲染平台300可以通过一个或者多个计算设备来实现。多个计算设备可以构成计算设备集群。渲染节点的功能可以由一个或者多个计算设备配合实现。远程渲染节点自下而上可以包括渲染硬件、虚拟化服务、渲染引擎以及渲染应用服务端等。其中,渲染硬件包括计算资源、存储资源以及网络资源。计算资源可以采用异构计算架构,例如,可以采用中央处理器(central processing unit,CPU)+图形处理器(graphics processing unit,GPU)架构,CPU+AI芯片,CPU+GPU+AI芯片架构等等,此处不作具体限定。存储资源可以包括内存、显存等存储设备。网络资源可以包括网卡、端口资源、地址资源等。虚拟化服务是通过虚拟化技术将渲染节点的资源虚拟化为vCPU等,并按照用户的需要灵活地隔离出相互独立的资源以运行用户的应用程序的服务。常见地,虚拟化服务可以包括虚拟机(virtual machine,VM)服务以及容器(container)服务,VM和容器可以运行渲染引擎和渲染应用服务端。渲染引擎用于实现渲染算法。渲染应用服务端用于调用渲染引擎以完成渲染图像的渲染。
终端设备100上的渲染应用客户端和远程渲染平台300的渲染应用服务端可以合称为渲染应用。常见的渲染应用可以包括:游戏应用、VR应用、电影特效以及动画等等。用户通过渲染应用客户端输入操作指令,渲染应用客户端将操作指令发送给渲染应用服务端,渲染应用服务端调用渲染引擎生成渲染结果,将渲染结果发送至渲染应用客户端。然后再由渲染应用客户端将渲染结果转换成图像呈现给用户。可在一具体的实施方式中,渲染应用服务端和渲染应用客户端可以是渲染应用提供商提供的,渲染引擎可以是云服务提供商提供的。举例来说,渲染应用可以是游戏应用,游戏应用的游戏开发商将游戏应用服务端安装在云服务提供商提供的远程渲染平台上,游戏应用的游戏开发商将游戏应用客户端通过互联网提供给用户下载,并安装在用户的终端设备上。此外,云服务提供商还提供了渲染引擎,渲染引擎可以为游戏应用提供计算能力。在另一种具体的实施方式中,渲染应用客户端、渲染应用服务端和渲染引擎可以均是云服务提供商提供的。
一些实施例中,渲染系统中,还可以包括管理设备(图5中未示出)。管理设备可以是用户的终端设备和云服务提供商的远程渲染平台300之外的第三方提供的设备。例如,管理设备可以是游戏开发商提供的设备。游戏开发商可以通过管理设备对渲染应用进行管理。可以理解,管理设备可以设置于远程渲染平台之上,也可以设置于远程渲染平台之外,此处不作具体限定。
接下来对本申请涉及的渲染方案进行详细的介绍。本申请实施例提供的渲染方案可以由远程渲染平台300执行,或者由渲染节点执行,或者由渲染节点中的渲染引擎实现。为了便于描述,后续不再对渲染系统中各个组件的图示进行描述。本申请提供的渲染方案可以包括预计算部分、实时渲染部分。
参见图6所示,示出本申请实施例提供的一种预计算部分流程示意图。预计算部分包括步骤601-步骤603。
步骤601,远程渲染平台获取虚拟场景的相关信息。虚拟场景中包括光源、一个或者多个三维模型。虚拟场景的相关信息包括光源参数、一个或者多个模型的信息。一个或者多个模型的信息,比如可以包括各个模型的面片划分情况、面片编号、各个模型和面片的几何参数和材质参数等。
在一些实施例中,虚拟场景的相关信息可以是由终端设备发送给远程渲染平台,也可 以是由管理设备发送给远程渲染平台。
步骤602,远程渲染平台根据虚拟场景的相关信息对各个面片进行光线追踪获得各个面片的公共光照分量、以及各个面片上包括的多个着色点的镜面光照分量。
以目标面片的公共光照分量和目标面片上多个着色点的镜面光照分量为例。远程渲染平台可以根据光源、目标面片的几何参数以及其它面片的材质参数和几何参数进行光线追踪获得目标面片的公共光照分量和镜面光照分量。这里所说的其它面片包括虚拟场景中的一个或者多个三维模型包括的面片中除该目标面片以外的至少一个面片。该至少一个面片是对一个或者多个三维模型包括的面片中对目标面片上的光照产生影响的面片。比如通过反射对目标面片产生影响,具体如前所述,此处不再重复描述。
一些实施例中,远程渲染平台可以基于公式(3)或者公式(4)来计算各个面片的公共光照分量。可以基于上述公式(5)或者公式(6)来计算各个面片的多个设定方向的镜面光照分量。
示例性地,多个设定方向可以是由虚拟场景的提供者或者用户设置的。比如虚拟场景的提供者或者用户设置设定方向参数。设定方向参数可以是设定方向的数量或者多个设定角度。
步骤603,远程渲染平台存储各个面片的公共光照分量、以及各个面片的多个设定方向的镜面光照分量。比如,远程渲染平台将各个面片的公共光照分量、以及各个面片的多个设定方向的镜面光照分量存储在存储器中。存储器可以包括易失性存储器(volatile memory),例如随机存取存储器(random access memory,RAM)。存储器也可以是非易失性存储器(non-volatile memory),例如只读存储器(read-only memory,ROM),快闪存储器,机械硬盘(hard disk drive,HDD)或固态硬盘(solid state drive,SSD)等。
一些实施例中,存储器也可以部署于远程渲染平台外部,也可以部署于远程渲染平台内部。比如,存储器可以部署于渲染引擎内部。
示例性地,远程渲染平台可以针对每个面片建立光照分量信息表。光照分量信息表中可以包括各个面片的编号、公共光照分量、不同设定方向的镜面光照分量。光照分量信息表中还可以包括各个面片的材质参数。例如,光照分量信息表可以参见表1所示。
表1
Figure PCTCN2022127466-appb-000011
一些实施例中,远程渲染平台可以针对每个面片建立公共光照分量信息表和镜面光照 分量信息表。其中公共光照分量信息表中可以包括各个面片的编号以及对应的公共光照分量。镜面光照分量信息表中包括各个面片的编号、每个面片的多个设定方向对应的镜面光照分量。例如,公共光照分量信息表可以参见表2所示。镜面光照分量信息表可以参见表3所示。
表2
面片编号 公共光照分量
1 a1
2 a2
…… ……
N aN
表3
Figure PCTCN2022127466-appb-000012
参见图7所示,示出本申请实施例提供的一种实时渲染部分流程示意图。实时渲染部分包括步骤701-步骤703。
步骤701,远程渲染平台接收渲染请求。渲染请求可以来自于终端设备。比如终端设备可以通过网络设备向远程渲染平台发送该渲染请求。
渲染请求中可以包括用户的视角参数。用户的视角参数可以是用户视角范围,也可以是一个用户视角方向。进而远程渲染平台可以根据该用户视角方位确定用户视角范围。比如用户视角方向可以是用户眼睛直视的方向。
渲染请求中还可以包括虚拟场景的标识。该虚拟场景的标识在渲染系统唯一标识该虚拟场景。一些场景中,渲染系统中仅包括一种虚拟场景时,渲染请求中可以不携带虚拟场景的标识。以用户的视角方向观看到虚拟场景中的三维模型的目标面片为例。应理解的是,用户的视角范围内观看到虚拟场景中的三维模型的面片不仅仅包括目标面片,还包括其它面片,本申请实施例仅以渲染目标面片为例,其它面片的渲染方式与渲染目标面片的方式类似,不再赘述。本申请实施例中以用户视角范围内的其中一个方向(比如以用户视角方向)为例。根据用户视角方向能够确定目标面片上的着色点。例如,本申请实施例以目标面片的第一着色点的渲染为例。
步骤702,远程渲染平台获取已保存的目标面片的公共光照分量。
比如,通过查询光照分量信息表获取该目标面片的公共光照分量。再比如,通过查询公共光照分量信息表获取该目标面片的公共光照分量。
一些实施例中,远程渲染平台可以向存储器发送读取请求1,读取请求1用于请求读取目标面片的公共光照分量,然后存储器将目标面片的公共光照分量发送给远程渲染平台。
步骤703,远程渲染平台获取目标面片的第一着色点的镜面光照分量。比如,通过查询光照分量信息表获取该目标面片的第一着色点的镜面光照分量。再比如,通过查询镜面光照分量信息表获取该目标面片的第一着色点的镜面光照分量。再比如,如果该目标面片具有法线贴图,该第一着色点的镜面光照分量(即第一着色点应用法线贴图后的镜面光照分量)可以根据面片上其它近似方向上的镜面光照分量来近似确定。其它近似方向与该第一着色点对应的设定方向均经过目标面片。关于近似确定的相关说明,后续会详细说明,此处不再赘述。
一些实施例中,远程渲染平台可以向存储器发送读取请求2,读取请求2用于请求读取目标面片的第一着色点的镜面光照分量,然后存储器将目标面片的第一着色点的镜面光照分量发送给远程渲染平台。第一着色点的确定可以根据用户视角方向确定。用户视角方向可以理解为用户观看到的面片的出射光线进入人眼的方向,即出射光线方向,该出射光线方向与面片的交点即为第一着色点。
需要说明的是,本申请并不限定步骤702和步骤703先后执行步骤,步骤702和步骤703可以同时执行。
步骤704,远程渲染平台根据目标面片的材质参数、目标面片的公共光照分量以及目标面片的第一着色点的镜面光照分量确定该目标面片的第一着色点的渲染结果。
如下以目标面片的材质参数包括纹理贴图参数为例。如前所示,纹理贴图决定漫反射颜色分量中的颜色值,即RGB值。远程渲染平台可以结合目标面片的纹理贴图参数以及目标面片的公共光照分量来获得目标面片进行纹理贴图后的颜色分量(可以理解为漫反射颜色分量)。应理解的是,目标面片中各个着色点的纹理贴图参数可能不同,则可以针对每个着色点,使用该着色点的纹理贴图参数,再结合目标面片的公共光照分量来获得该着色点的颜色分量,从而获得目标面片上每个着色点的颜色分量。通过上述方式,并不需要针对每个面片上的各个着色点均存储漫反射颜色分量,仅存储面片的公共光照分量,并没有增加存储量,但在实时渲染阶段,结合各个着色点的纹理贴图参数来提高渲染精度。
进一步地,远程渲染平台可以基于目标面片的颜色分量与目标面片的第一着色点的镜面光照分量进行加权叠加来得到目标面片的渲染结果。
结合渲染方程来说,以公共光照分量采用公式(3),镜面光照分量采用公式(5)为例。远程渲染平台确定公共光照分量以及镜面光照分量分别对应的权重,然后确定的目标面片上p点渲染结果可以通过如下公式(7)来表示。
Figure PCTCN2022127466-appb-000013
其中,c表示目标面片上p点的纹理贴图参数。k d表示公共光照分量的第一权重。k s表示镜面光照分量的第二权重。
以公共光照分量采用公式(4),镜面光照分量采用公式(6)为例。远程渲染平台确定的目标面片上p点渲染结果可以通过如下公式(8)来表示。
Figure PCTCN2022127466-appb-000014
其中,c表示目标面片上p点的纹理贴图参数。
上述k d和k s可以由虚拟场景的提供者或者用户来设置。一些场景中,虚拟场景的提供者设置k d和k s时,在预计算公共光照分量和镜面光照分量时,可以将k d和k s计算在内。比如公共光照分量采用公式(4)计算,镜面光照分量采用公式(6)计算。另一些场景中,k d和k s支持用户的设置时,在预计算公共光照分量和镜面光照分量时,可以不将k d和k s计算在内,比如公共光照分量采用公式(3)计算,镜面光照分量采用公式(5)计算。
如下以目标面片的材质参数包括法线贴图参数为例。本申请实施例中,针对存在法线贴图的面片,采用近似的方式,在实时渲染的时候通过扰动表面法线的方式达到类似针对法线贴图的效果。如前所述,贴图可以认为是贴在几何物体表面的一层蒙皮,如果贴图精度高的话,一个面片可以映射到贴图上的一片区域,该片区域内可以有多个贴图上的值。而一个面片的法线贴图可以用于表达比面片尺寸更小的凹凸细节。本申请实施例中通过面片的法线贴图参数结合用户视角方向能够确定一个新的方向,这个方向上的镜面光照分量可以近似于在该用户视角方向对应的着色点上蒙上贴图的效果。法线贴图参数以及本申请实施例提供的近似方式应用于到如前图7对应的实施例中。针对某个着色点来说,可以由用户视角方向以及面片对应的法线贴图参数确定近似光线方向,该近似光线方向的镜片光照分量来近似作为着色点结合法线贴图后的镜面光照分量。
如下示例性的描述一种根据用户视角方向以及该着色点的法线贴图参数确定近似的方向进行描述。参见图8所示,以用户视角方向为wo为例。如下针对目标面片的第一着色点确定近似方向为例,其它的着色点的计算方式类似,本申请实施例中不再赘述。
u1,远程渲染平台根据目标面片的法线贴图参数确定目标面片的第一着色点p在应用法线贴图后的第一法线方向。远程渲染平台可以获取法线贴图参数中该目标面片的p点对应的参数值,该参数值的含义与法线贴图的种类相关。比如可以是该p点的高度或者该p点切线空间的法线等等。获取法线贴图参数中该目标面片的P点的参数值后,根据该目标面片的法线方向,可以确定该P点应用法线贴图后的第一法线方向。为了便于区分,将目标面片的法线方向称为第二法线方向。参见图8所示,第一法线方向通过Ns表示,第二法线方向通过Ng表示。
u2,远程渲染平台以第一法线方向为基准确定用户视角方向反射的第一方向。
以第一法线方向为基准,用户视角方向反射的第一方向,可以理解为,第一方向与用户视角方向以第一法线方向为轴对称。图8中,将第一方向称为wi’。
u3,远程渲染平台确定所述第一方向wi’以所述目标面片的第二法线方向Ng为基准反射的第二方向。图8中,将第二方向称为wo’。后续为了便于描述,将wo’方向称为p着色点应用法线贴图后wo方向的近似方向。需要说明的是,wo’并非用户真正的视角方向,而是在着色点P上进行法线贴图之后,从用户视角方向看到的着色点P的镜面光照分量相应效果可以近似于从wo’视角方向观看到的镜面光照分量的效果。
如下简要描述下将wo’方向上的镜面光照分量可以近似于在着色点p的用户视角方向上蒙上贴图的镜面光照分量的原理。可以假设P点近似wo方向的光照效果在没有应用法线贴图的时候,是由wi方向入射光造成的。参见图8所示,wi方向是wo方向被Ng反射的方向。在该p点应用法线贴图后,由于P点的法线方向由Ng变成了Ns,因此理论上从wo方向看到的光照亮度应该由wi’方向的入射光造成的;而在以该面片的法线方向为基准时,wi’方向入射光光照亮度对应的视角方向为wo’。因此以该面片的法线方向为基准时, wi’方向事实上造成的光照计算结果其实已经存储下来了,是以wo’方向存储的镜面光照分量。因此目标面片上wo’方向对应的镜面光照分量,可以近似认为是应用法线贴图后wo方向的结果。
应理解的是,是实际的虚拟场景中,某个面片可以仅存在法线贴图,比如凹凸不平且颜色单一的表面。某个面片可以仅存在纹理贴图,比如表面光滑的木纹表面。某些面片可能既存在法线贴图,也存在纹理贴图,比如凹凸不平的木纹表面。
参见图9A所示,以存在纹理贴图且不存在法线贴图的面片来说,比如面片1。远程渲染平台可以获取面片1的公共光照分量。然后获取用户视角范围的面片1的镜面光照分量。然后根据结合面片1的用户视角范围内多个着色点的纹理贴图参数以及面片1的公共光照分量计算面片1的各个着色点应用纹理贴图后的颜色分量。然后再对面片1应用纹理贴图后的颜色分量与用户视角范围内的多个着色点的镜面光照分量进行加权叠加获得面片1的渲染结果。以面片1的着色点1为例,着色点1的颜色分量与面片1的镜面光照分量加权叠加得到着色点1的渲染结果。再比如,以面片1的着色点2为例,着色点2的颜色分量与用户视角方向的面片1的镜面光照分量加权叠加得到着色点2的渲染结果。图9A中以用户设置第一权重k d和第二权重k s为例。
参见图9B所示,以面片2为例,面片2各个着色点的纹理贴图参数相同。面片2各个着色点的法线贴图参数不同。远程渲染平台可以获取面片2的公共光照分量。根据每个着色点的法线贴图参数结合图8原理来针对每个着色点确定每个着色点的出射光线方向对应的近似光线方向。进一步,远程渲染平台获取针对每个着色点确定的近似光线方向上的镜面光照分量。然后根据结合面片2的纹理贴图参数以及面片2的公共光照分量确定面片2的应用纹理贴图后的颜色分量。进一步地,对各个面片2的颜色分量与针对每个着色点确定的近似光线方向上的镜面光照分量进行加权叠加获得每个着色点的渲染结果。以面片2的着色点3为例,面片2的颜色分量与针对着色点3确定的近似光线方向上的镜面光照分量加权叠加得到着色点3的渲染结果。再比如,以面片2的着色点4为例,面片2的颜色分量与针对着色点4确定的近似光线方向上的镜面光照分量加权叠加得到着色点4的渲染结果。图9B中以用户设置第一权重k d和第二权重k s为例。
参见图9C所示,以存在纹理贴图和法线贴图的面片来说,比如面片3。远程渲染平台可以获取面片3的公共光照分量。根据每个着色点的法线贴图参数结合图8原理来针对每个着色点的出射光线方向确定近似光线方向。进而远程渲染平台获取针对每个着色点确定的近似光线方向上的镜面光照分量。进一步地,根据结合面片3的各个着色点的纹理贴图参数以及面片3的公共光照分量计算面片3的各个着色点应用纹理贴图后的颜色分量。然后再对各个着色点的颜色分量与获取的针对每个着色点确定的近似光线方向上的镜面光照分量进行加权叠加获得每个着色点的渲染结果。以面片3的着色点5为例,着色点5的颜色分量与针对着色点5确定的近似光线方向上的镜面光照分量加权叠加得到着色点5的渲染结果。再比如,以面片3的着色点6为例,着色点6的颜色分量与针对着色点6确定的近似光线方向上的镜面光照分量加权叠加得到着色点6的渲染结果。图9C中以用户设置第一权重k d和第二权重k s为例。
应理解的是,一个图像帧包括多个像素点,每个像素点可以对应一个或者多个着色点。每个像素点可以根据一个或者多个着色点的渲染结果的平均值等方法来获得。其中,求取平均值的方法可以是加权平均值法,也可以是直接取平均值来确定。从而在获取多个像素 点的渲染结果,即得到一个图像帧的渲染结果。
本申请实施例中,将高精度的贴图部分和低精度的面片的光效部分解耦。只预计算和存储相对精度较低的各个面片的光效结果,高精度的贴图在实时查看时通过简单快速的计算或者近似得到。进而在降低存储量的基础上提高了渲染精度。
本申请实施例还提供一种渲染引擎,参见图10所示,所述渲染引擎用于渲染虚拟场景,所述虚拟场景包括至少一个模型,所述至少一个模型包括多个面片,每个面片包括多个着色点;所述渲染引擎包括处理单元1001和存储单元1002;
所述处理单元1001,用于从存储单元1002获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量,其中,所述第一着色点位于所述目标面片上,所述目标面片的公共光照分量用于计算所述多个着色点的渲染结果,所述第一着色点的镜面光照分量用于指示所述第一着色点的出射光线的光照强度;根据所述目标面片的公共光照分量、所述第一着色点的镜面光照分量和所述第一着色点的纹理贴图参数,计算所述第一着色点的渲染结果。
在一种可能的实施方式中,每个面片对应一个公共光照分量。
在一种可能的实施方式中,所述目标面片包括的每个着色点对应一个镜面光照分量。
在一种可能的实施方式中,所述处理单元1001根据所述目标面片的法线贴图参数,确定与出射光线对应的近似光线。其中,所述出射光线和所述近似光线均经过所述目标面片,所述第一着色点为所述出射光线与所述目标面片的交点。将预先计算的所述近似光线的镜面光照分量确定为所述第一着色点的镜面光照分量。
在一种可能的实施方式中,所述处理单元1001,在获取预先计算的目标面片的公共光照分量和预先计算的第一着色点的镜面光照分量前,基于光线追踪渲染引擎计算所述目标面片的公共光照分量和所述第一着色点的镜面光照分量,其中,计算所述目标面片的公共光照分量利用的元素仅包括所述目标面片上一条或多条入射光的亮度和所述一条或多条入射光与所述目标面片法线方向的夹角。
在一种可能的实施方式中,所述处理单元1001,获取预先计算的第二着色点的镜面光照分量,其中,所述第二着色点位于所述目标面片上,所述第二着色点的镜面光照分量用于指示进入所述第二着色点的光线的光照强度;根据所述目标面片的公共光照分量、所述第二着色点的镜面光照分量和所述第二着色点的纹理贴图参数,计算所述第二着色点的渲染结果。
在一种可能的实施方式中,所述目标面片的公共光照分量包括第一权重,所述第一着色点的镜面光照分量中包括第二权重,所述渲染引擎还包括权重设置接口1003;所述权重设置接口1003,用于接收用户设置的所述第一权重和/或所述第二权重;所述处理单元1001,还用于根据所述第一权重计算所述目标面片的公共光照分量;根据所述第二权重计算所述第一着色点的镜面光照分量。
本申请还提供一种计算设备900。如图11所示,计算设备包括:总线902、处理器904、存储器906和通信接口908。处理器904、存储器906和通信接口908之间通过总线902通信。计算设备900可以是服务器或终端设备。应理解,本申请不限定计算设备900中的处理器、存储器的个数。
总线902可以是外设部件互连标准(peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。总线可以分为 地址总线、数据总线、控制总线等。为便于表示,图11中仅用一条线表示,但并不表示仅有一根总线或一种类型的总线。总线904可包括在计算设备900各个部件(例如,存储器906、处理器904、通信接口908)之间传送信息的通路。
处理器904可以包括中央处理器(central processing unit,CPU)、图形处理器(graphics processing unit,GPU)、微处理器(micro processor,MP)或者数字信号处理器(digital signal processor,DSP)等处理器中的任意一种或多种。
在一些可能的实现方式中,处理器904可以包含一个或多个图形处理器。该处理器904用于执行存储在存储器906中的指令以实现前述图6或者图7对应的实施例所述的方法。
在一些可能的实现方式中,处理器904可以包括一个或多个中央处理器和一个或多个图形处理器。该处理器904用于执行存储在存储器906中的指令以实现前述图6或者图7对应的实施例所述的方法。
存储器906可以包括易失性存储器(volatile memory),例如随机存取存储器(random access memory,RAM)。处理器904还可以包括非易失性存储器(non-volatile memory),例如只读存储器(read-only memory,ROM),快闪存储器,机械硬盘(hard disk drive,HDD)或固态硬盘(solid state drive,SSD)。存储器906中存储有可执行的程序代码,处理器904执行该可执行的程序代码以实现图6或者图7对应的实施例所述的方法。具体的,存储器906上存有渲染节点用于执行图6或者图7对应的实施例所述的方法的指令。
通信接口908使用例如但不限于网络接口卡、收发器一类的收发模块,来实现计算设备900与其他设备或通信网络之间的通信。
本申请实施例还提供了一种计算机可读存储介质。所述计算机可读存储介质可以是计算设备能够存储的任何可用介质或者是包含一个或多个可用介质的数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘)等。该计算机可读存储介质包括指令,所述指令指示计算设备执行上述图6或者图7对应的实施例所述的方法。
本申请实施例还提供了一种包含指令的计算机程序产品。所述计算机程序产品可以是包含指令的,能够运行在计算设备上或被储存在任何可用介质中的软件或程序产品。当所述计算机程序产品在至少一个计算机设备上运行时,使得至少一个计算机设备执行上述图6或者图7对应的实施例所述的方法。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本申请实施例进行各种改动和变型而不脱离本申请实施例的范围。这样,倘若本申请实施例的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (19)

  1. A rendering method, wherein the rendering method is used for rendering a virtual scene, the virtual scene comprises at least one model, each model comprises a plurality of patches, and each patch comprises a plurality of shading points, the method comprising:
    obtaining a precomputed common lighting component of a target patch and a precomputed specular lighting component of a first shading point, wherein the first shading point is located on the target patch, the common lighting component of the target patch is used for calculating rendering results of the plurality of shading points, and the specular lighting component of the first shading point is used for indicating the light intensity of the outgoing light of the first shading point; and
    calculating a rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and texture map parameters of the first shading point.
  2. The method according to claim 1, wherein each patch corresponds to one common lighting component.
  3. The method according to claim 1 or 2, wherein each shading point included in the target patch corresponds to one specular lighting component.
  4. The method according to any one of claims 1 to 3, wherein obtaining the precomputed specular lighting component of the first shading point comprises:
    determining, according to normal map parameters of the target patch, an approximate ray corresponding to an outgoing ray, wherein the outgoing ray and the approximate ray both pass through the target patch, and the first shading point is the intersection of the outgoing ray and the target patch; and
    determining the precomputed specular lighting component of the approximate ray as the specular lighting component of the first shading point.
  5. The method according to any one of claims 1 to 4, wherein before obtaining the precomputed common lighting component of the target patch and the precomputed specular lighting component of the first shading point, the method further comprises:
    calculating, based on a ray tracing method, the common lighting component of the target patch and the specular lighting component of the first shading point, wherein the elements used for calculating the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angles between the one or more incident lights and the normal direction of the target patch.
  6. The method according to any one of claims 1 to 5, further comprising:
    obtaining a precomputed specular lighting component of a second shading point, wherein the second shading point is located on the target patch, and the specular lighting component of the second shading point is used for indicating the light intensity of the outgoing light of the second shading point; and
    calculating a rendering result of the second shading point according to the common lighting component of the target patch, the specular lighting component of the second shading point, and texture map parameters of the second shading point.
  7. The method according to any one of claims 1 to 6, wherein the common lighting component of the target patch includes a first weight and the specular lighting component of the first shading point includes a second weight, and the method further comprises:
    providing a weight setting interface, and receiving the first weight and/or the second weight set by a user;
    calculating the common lighting component of the target patch according to the first weight; and
    calculating the specular lighting component of the first shading point according to the second weight.
  8. The method according to any one of claims 1 to 7, further comprising:
    providing a weight setting interface, and receiving the first weight and/or the second weight set by a user;
    wherein calculating the rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and the texture map parameters of the first shading point comprises:
    calculating the rendering result of the first shading point according to the product of the common lighting component of the target patch and the first weight, the product of the specular lighting component of the first shading point and the second weight, and the texture map parameters of the first shading point.
  9. A rendering engine, wherein the rendering engine is used for rendering a virtual scene, the virtual scene comprises at least one model, the at least one model comprises a plurality of patches, and each patch comprises a plurality of shading points; the rendering engine comprises a processing unit and a storage unit;
    the processing unit is configured to obtain, from the storage unit, a precomputed common lighting component of a target patch and a precomputed specular lighting component of a first shading point, wherein the first shading point is located on the target patch, the common lighting component of the target patch is used for calculating rendering results of the plurality of shading points, and the specular lighting component of the first shading point is used for indicating the light intensity of the outgoing light of the first shading point; and to calculate a rendering result of the first shading point according to the common lighting component of the target patch, the specular lighting component of the first shading point, and texture map parameters of the first shading point.
  10. The rendering engine according to claim 9, wherein each patch corresponds to one common lighting component.
  11. The rendering engine according to claim 9 or 10, wherein each shading point included in the target patch corresponds to one specular lighting component.
  12. The rendering engine according to any one of claims 9 to 11, wherein the processing unit is specifically configured to:
    determine, according to normal map parameters of the target patch, an approximate ray corresponding to an outgoing ray, wherein the outgoing ray and the approximate ray both pass through the target patch, and the first shading point is the intersection of the outgoing ray and the target patch; and
    determine the precomputed specular lighting component of the approximate ray as the specular lighting component of the first shading point.
  13. The rendering engine according to any one of claims 9 to 12, wherein the processing unit is further configured to: before obtaining the precomputed common lighting component of the target patch and the precomputed specular lighting component of the first shading point, calculate, based on a ray tracing rendering engine, the common lighting component of the target patch and the specular lighting component of the first shading point, wherein the elements used for calculating the common lighting component of the target patch include only the brightness of one or more incident lights on the target patch and the angles between the one or more incident lights and the normal direction of the target patch.
  14. The rendering engine according to any one of claims 9 to 13, wherein the processing unit is further configured to:
    obtain a precomputed specular lighting component of a second shading point, wherein the second shading point is located on the target patch, and the specular lighting component of the second shading point is used for indicating the light intensity of the light entering the second shading point; and
    calculate a rendering result of the second shading point according to the common lighting component of the target patch, the specular lighting component of the second shading point, and texture map parameters of the second shading point.
  15. The rendering engine according to any one of claims 9 to 14, wherein the common lighting component of the target patch includes a first weight and the specular lighting component of the first shading point includes a second weight, and the rendering engine further comprises a weight setting interface;
    the weight setting interface is configured to receive the first weight and/or the second weight set by a user; and
    the processing unit is further configured to calculate the common lighting component of the target patch according to the first weight, and to calculate the specular lighting component of the first shading point according to the second weight.
  16. The rendering engine according to any one of claims 9 to 14, wherein the rendering engine further comprises a weight setting interface;
    the weight setting interface is configured to receive the first weight and/or the second weight set by a user; and
    the processing unit is specifically configured to calculate the rendering result of the first shading point according to the product of the common lighting component of the target patch and the first weight, the product of the specular lighting component of the first shading point and the second weight, and the texture map parameters of the first shading point.
  17. A computer program product, wherein the computer program product comprises instructions that, when run by a cluster of computer devices, cause the cluster of computer devices to execute the method according to any one of claims 1 to 8.
  18. A computer-readable storage medium, comprising computer program instructions that, when executed by a cluster of computing devices, cause the cluster of computing devices to execute the method according to any one of claims 1 to 8.
  19. A cluster of computing devices, comprising at least one computing device, each computing device comprising a processor and a memory;
    the processor of the at least one computing device being configured to execute instructions stored in the memory of the at least one computing device, so that the cluster of computing devices executes the method according to any one of claims 1 to 8.
PCT/CN2022/127466 2021-11-22 2022-10-25 Rendering method and apparatus (一种渲染方法及装置) WO2023088047A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111382296 2021-11-22
CN202111382296.4 2021-11-22
CN202210162160.0A CN116152420A (zh) 2021-11-22 2022-02-22 一种渲染方法及装置
CN202210162160.0 2022-02-22

Publications (1)

Publication Number Publication Date
WO2023088047A1 true WO2023088047A1 (zh) 2023-05-25

Family

ID=86354976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/127466 WO2023088047A1 (zh) 2021-11-22 2022-10-25 一种渲染方法及装置

Country Status (2)

Country Link
CN (1) CN116152420A (zh)
WO (1) WO2023088047A1 (zh)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106373179A (zh) * 2016-08-29 2017-02-01 北京像素软件科技股份有限公司 光照场景渲染方法
US20200082572A1 (en) * 2018-09-10 2020-03-12 Disney Enterprises, Inc. Techniques for capturing dynamic appearance of skin
CN111420404A (zh) * 2020-03-20 2020-07-17 网易(杭州)网络有限公司 游戏中对象渲染的方法及装置、电子设备、存储介质

Also Published As

Publication number Publication date
CN116152420A (zh) 2023-05-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22894584

Country of ref document: EP

Kind code of ref document: A1