CN116363288A - Rendering method and device of target object, storage medium and computer equipment - Google Patents

Rendering method and device of target object, storage medium and computer equipment

Info

Publication number: CN116363288A
Application number: CN202211527177.8A
Authority: CN (China)
Prior art keywords: pixel, target object, coefficient, indirect, value
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 赵进, 杨斌
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd; priority to CN202211527177.8A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/02: Non-photorealistic rendering
    • G06T 15/005: General purpose rendering architectures
    • G06T 15/04: Texture mapping
    • G06T 15/50: Lighting effects
    • G06T 15/506: Illumination models
    • G06T 15/80: Shading

Abstract

The embodiment of the application discloses a rendering method and device of a target object, a storage medium, and computer equipment. The method improves the rendering equations and/or rendering flows of diffuse reflection illumination, specular reflection illumination, and/or the diffuse and specular reflection illumination of indirect light in a physically based rendering pipeline, so that real-time light-and-shadow interaction can be generated, the improved rendering pipeline is suitable for stylized rendering, stylized rendering efficiency is improved, and a stylized rendering effect is achieved.

Description

Rendering method and device of target object, storage medium and computer equipment
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and an apparatus for rendering a target object, a computer readable storage medium, and a computer device.
Background
The PBR (Physically Based Rendering) pipeline approximates and fits measurement data obtained in a laboratory through mathematical formulas, finally obtaining a material expression mainly comprising Base Color, Metallic, and Roughness parameters, also called a rendering equation; based on this rendering equation, computer graphics produces a realistic effect that can pass for real. Since the purpose of the rendering equation of the PBR pipeline is to achieve such a lifelike, realistic effect, it is not suitable for rendering strongly stylized pictures, such as sketch-style landscapes, cel animation, or impressionist-style pictures.
A hand-painted material map expresses information such as color, light and shadow, and reflection directly in the color map by hand-painting. The painted effect differs with the art style, and a hand-painted material map can better convey a painterly feel. However, because all of the detailed information such as color, light and shadow, and reflection is recorded in a single color map, real-time light-and-shadow interaction cannot be generated; moreover, when the hand-painted material map is applied to a 3D model and observed from different angles, the appearance of the observed object cannot be guaranteed to always remain within a correct range.
Therefore, there is a need for a rendering method that can be used to render strongly stylized pictures.
Disclosure of Invention
The embodiment of the application provides a rendering method and device of a target object, a computer readable storage medium and computer equipment, which can generate real-time light and shadow interaction and improve the stylized rendering efficiency.
The embodiment of the application provides a rendering method of a target object, which comprises the following steps:
obtaining model information and light source information of a target object, wherein the model information comprises normal information;
determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, wherein the preset diffuse reflection function is differentiable where its function value is zero, and its derivative there is zero;
determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information;
and performing stylized rendering on the target object according to the diffuse reflection illumination value.
The embodiment of the application provides a rendering method of a target object, which comprises the following steps:
obtaining model information and light source information of a target object, wherein the model information comprises normal information;
obtaining texture coordinates, a shadow coefficient, and a highlight color of each pixel in the pixels to be rendered of the target object, and generating a highlight intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object;
determining a specular reflection illumination value of each pixel according to the highlight intensity, the highlight color, the shadow coefficient or a corrected shadow coefficient, and the light source information, wherein the corrected shadow coefficient is obtained by correction processing according to the shadow coefficient of each pixel;
and performing stylized rendering on the target object according to the specular reflection illumination value.
The embodiment of the application provides a rendering method of a target object, which comprises the following steps:
obtaining model information and light source information of a target object, wherein the model information comprises normal information, and the light source information comprises indirect light information;
performing indirect light diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each pixel in the pixels to be rendered of the target object by using roughness and the normal information to obtain an indirect light specular reflection illumination value, so as to decouple the indirect light diffuse reflection processing from the indirect light specular reflection processing;
determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value;
and performing stylized rendering on the target object according to the indirect light illumination value.
The embodiment of the application also provides a rendering device of the target object, which comprises:
the first acquisition module is used for acquiring model information and light source information of a target object, wherein the model information comprises normal information;
the first diffuse reflection module is used for determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, and for determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information, wherein the preset diffuse reflection function is differentiable where its function value is zero, and its derivative there is zero;
and the first rendering module is used for performing stylized rendering on the target object according to the diffuse reflection illumination value.
The embodiment of the application also provides a rendering device of the target object, which comprises:
the second acquisition module is used for acquiring model information and light source information of the target object, wherein the model information comprises normal information, and for obtaining texture coordinates, a shadow coefficient, and a highlight color of each pixel in the pixels to be rendered of the target object;
the second specular reflection module is used for generating a highlight intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object, and for determining a specular reflection illumination value of each pixel according to the highlight intensity, the highlight color, the shadow coefficient or a corrected shadow coefficient, and the light source information, wherein the corrected shadow coefficient is obtained by correction processing according to the shadow coefficient of each pixel;
and the second rendering module is used for performing stylized rendering on the target object according to the specular reflection illumination value.
The embodiment of the application also provides a rendering device of the target object, which comprises:
the third acquisition module is used for acquiring model information and light source information of the target object, wherein the model information comprises normal information, and the light source information comprises indirect light information;
the third indirect light module is used for performing indirect light diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each pixel in the pixels to be rendered of the target object by using the roughness and the normal information to obtain an indirect light specular reflection illumination value, so as to decouple the indirect light diffuse reflection processing from the indirect light specular reflection processing; and for determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value;
and the third rendering module is used for performing stylized rendering on the target object according to the indirect light illumination value.
Embodiments of the present application also provide a computer readable storage medium storing a computer program adapted to be loaded by a processor to perform the steps in the method for rendering an object according to any of the embodiments above.
The embodiment of the application also provides a computer device, which comprises a memory and a processor, wherein the memory stores a computer program, and the processor executes the steps in the method for rendering the target object according to any embodiment by calling the computer program stored in the memory.
According to the target object rendering method, device, and computer readable storage medium of the embodiments of the application, the rendering equations and/or rendering flows of diffuse reflection illumination and/or specular reflection illumination and/or the diffuse and specular reflection illumination of indirect light in a physically based rendering pipeline are improved. For diffuse reflection illumination, a preset diffuse reflection function is obtained by modifying the diffuse reflection function in the physically based rendering pipeline; because the preset diffuse reflection function is differentiable where its value is zero and its derivative there is zero, the hard light-dark boundary line produced by diffuse reflection illumination in the original physically based rendering pipeline is softened. For specular reflection illumination, instead of controlling the specular highlight by the roughness of the target object as in the original pipeline, the highlight intensity is generated from the texture coordinates of the target object. For the diffuse and specular reflection illumination of indirect light, the two kinds of processing are decoupled so that their effects can be varied independently. In this way, real-time light-and-shadow interaction can be generated, the improved rendering pipeline is suitable for stylized rendering, stylized rendering efficiency is improved, and a stylized rendering effect is achieved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a diffuse reflectance function image in a standard PBR rendering pipeline.
FIG. 2 is a schematic diagram of light and dark interfaces in a standard PBR rendering pipeline.
Fig. 3 is a flow chart of a rendering method of a target object according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a light-shade boundary line according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a preset diffuse reflection function image according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a preset diffuse reflection function image under different attenuation coefficients according to an embodiment of the present application.
Fig. 7a is a schematic diagram of a light-dark boundary line corresponding to an attenuation coefficient of unregulated diffuse reflection according to an embodiment of the present application.
Fig. 7b is a schematic diagram of a light-dark boundary line corresponding to the adjusted attenuation coefficient according to the embodiment of the present application.
Fig. 7c is a schematic view of a self-shading display provided in an embodiment of the present application.
Fig. 8 is a flow chart of a rendering method of a target object according to an embodiment of the present application.
Fig. 9 is a schematic diagram of a highlight texture map corresponding to a virtual eyeball according to an embodiment of the present application.
Fig. 10 is a schematic diagram of vertical vectors and horizontal vectors corresponding to virtual eyeballs according to an embodiment of the present application.
Fig. 11 is a schematic diagram of normal directions of two virtual eyeballs according to an embodiment of the present disclosure.
Fig. 12 is a schematic view of projection based on a virtual camera view angle of a virtual eyeball according to an embodiment of the present application.
Fig. 13a and fig. 13b are schematic diagrams illustrating a highlight effect of a virtual eyeball according to an embodiment of the present application.
Fig. 14a is a schematic diagram of original texture coordinates of a virtual hair patch according to an embodiment of the present application.
Fig. 14b is a schematic diagram of the expanded preset texture coordinates according to the embodiment of the present application.
Fig. 15a is a schematic diagram of a plurality of virtual hair pieces and expanded preset texture coordinates according to an embodiment of the present application.
Fig. 15b is a schematic illustration of the root and tip of virtual hair provided in an embodiment of the present application.
Fig. 16a and 16b are schematic diagrams of gaussian function images corresponding to different c values according to the embodiments of the present application.
Fig. 17a is a schematic diagram of a highlight effect of a virtual hair mask according to an embodiment of the present disclosure under a highlight coefficient.
Fig. 17b is a schematic diagram of a highlight effect of the whole virtual hair provided in the embodiment of the present application under a highlight coefficient.
Fig. 18a is a schematic H-shaped highlight shape of a virtual hair mask according to an embodiment of the present disclosure.
Fig. 18b is a schematic H-shaped highlight shape of the entire virtual hair provided in an embodiment of the present application.
Fig. 18c is a schematic diagram of an interference effect obtained after noise processing performed on a certain virtual hair mask according to an embodiment of the present application.
Fig. 19 is a schematic diagram of a highlight texture map of a third object according to an embodiment of the present application.
Fig. 20 is a schematic diagram of a rendering result of H-shaped highlighting on a certain garment according to an embodiment of the present disclosure.
Fig. 21 is another flow chart of a rendering method of a target object according to an embodiment of the present application.
Fig. 22 is a schematic diagram of the Fresnel effect provided in the embodiments of the present application.
Fig. 23 is a schematic diagram of an indirect reflection color of the indirect light specular reflection processing and a reflection color of a target object provided in the embodiment of the present application.
Fig. 24 is a schematic flow chart of a rendering method of a target object according to an embodiment of the present application.
Fig. 25 is a simplified flowchart of a rendering method of a target object according to an embodiment of the present application.
Fig. 26 is another simple flow chart of a rendering method of a target object according to an embodiment of the present application.
Fig. 27 is a schematic structural diagram of a rendering device for a target object according to an embodiment of the present application.
Fig. 28 is a schematic structural diagram of a rendering device for a target object according to an embodiment of the present application.
Fig. 29 is another schematic structural diagram of a rendering device for a target object according to an embodiment of the present application.
Fig. 30 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiment of the application provides a rendering method and device of a target object, a computer readable storage medium, and computer equipment. Specifically, the method for rendering the target object in the embodiment of the present application may be performed by a computer device, where the computer device may be a terminal or a server. The terminal may be terminal equipment such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a personal computer (PC, Personal Computer), an intelligent robot, or a vehicle-mounted computer. The server may be an independent physical server, a server cluster formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services and cloud databases.
Stylized, generally as opposed to realistic, refers to a method an art creator adopts to deviate from the original impression when depicting a thing, using imitation or original techniques to give the picture a special style. Most such styles imitate various hand-painting modes, for example pencil strokes, oil-painting strokes, or watercolor strokes; or they make the whole style look old or worn, or make the modeling exaggerated, deformed, or distorted, and so on.
Rendering includes forward rendering (Forward Rendering) and backward rendering, also referred to as deferred rendering (Deferred Rendering). Forward rendering computes each light source vertex-by-vertex or pixel-by-pixel in the vertex shader or fragment shader stage and directly yields the final illumination result. Backward rendering first stores the data required for illumination calculation and then performs the illumination calculation in screen space. The structure used in backward rendering to store the data needed for illumination is called the Geometry Buffer, or simply G-Buffer, and usually consists of several textures (GPU Texture) with different formats.
The present application relates to the backward rendering process in a Physically Based Rendering (PBR) pipeline, including the rendering equations and the rendering flow of backward rendering. In the present application, stylized rendering refers to rendering a target object so as to obtain a stylized effect.
It should be noted that the solution in the embodiments of the present application is calculated on a pixel basis, involving an improvement of the pixel shader stage.
Because of the changes in rendering equations and rendering flows involved in the backward rendering, the data stored by the G-Buffer is planned as follows.
G-Buffer1: the R channel, the G channel and the B channel store self-luminous parameters, and the A channel is reserved.
G-Buffer2: the R, G, and B channels store the intrinsic color (Base Color), and the A channel is reserved.
G-Buffer3: the R channel stores Roughness, and the A channel stores the ambient occlusion (Ambient Occlusion, AO) coefficient, i.e., the ambient light/indirect light occlusion intensity. If the diffuse reflection illumination uses the same attenuation coefficient for all channels, that coefficient is stored in G-Buffer3; if the attenuation coefficients of the R, G, and B channels are changed individually for the diffuse reflection illumination, the changed per-channel attenuation coefficients are stored in G-Buffer5.
G-Buffer4: used for storing the normal information of the target object, such as the normal vector N. It should be noted that the normal information includes various normals such as vertex normals and/or texture normals.
G-Buffer5: used for storing custom material parameters, such as the changed attenuation coefficients of the R, G, and B channels; if the specular reflection portion is modified, it also stores the specular color, the calculated highlight intensity, and so on.
The storage formats of G-Buffer1, G-Buffer2, G-Buffer3, G-Buffer4, and G-Buffer5 may be determined according to the required storage precision, and may be selected as RGBA8, RGBA32, or RGBA64. G-Buffer1, G-Buffer2, and G-Buffer4 are consistent with those in the PBR rendering pipeline; the application mainly changes G-Buffer3 and G-Buffer5.
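For orientation, the planned layout can be summarized as a small lookup structure. This is a descriptive sketch only: the dictionary and its key names are ours, not part of the application, and in an engine each entry would be a GPU render target rather than a Python object.

```python
# Descriptive sketch of the G-Buffer plan above; names are illustrative only.
G_BUFFER_PLAN = {
    "G-Buffer1": {"RGB": "self-luminous (emissive) parameters", "A": "reserved"},
    "G-Buffer2": {"RGB": "intrinsic color (Base Color)", "A": "reserved"},
    "G-Buffer3": {"R": "roughness", "A": "ambient occlusion (AO) coefficient"},
    "G-Buffer4": {"RGB": "normal information, e.g. the normal vector N"},
    "G-Buffer5": {"RGB": "custom params, e.g. changed per-channel attenuation "
                         "coefficients; specular color / highlight intensity "
                         "if the specular part is modified"},
}
```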
The method, the device, the computer readable storage medium and the computer equipment for rendering the target object according to the embodiments of the present application will be described in detail below based on the data stored in the G-Buffer. The numbers of the following examples are not intended to limit the preferred order of the examples.
The process of diffuse reflection illumination is described first. The diffuse reflection term of the diffuse reflection illumination of the standard PBR rendering pipeline (corresponding to the calculated diffuse reflection illumination intensity/diffuse reflection illumination coefficient) is a function of the dot product N·L between the normal vector N of the target object and the illumination direction vector L, and the function value is 0 when N·L is smaller than 0; the function image is shown in fig. 1. As can be seen from fig. 1, where the function value is 0 the function is not differentiable, which manifests as a very hard light-dark boundary line, as shown in fig. 2. In the stylized rendering of the present application, the light-dark boundary line is required to be softer, so the rendering equation of diffuse reflection illumination in the standard PBR rendering pipeline cannot be used for stylized rendering.
Fig. 3 is a flow chart of a method for rendering an object according to an embodiment of the present application, which mainly relates to improving a rendering equation of diffuse reflection illumination in a standard PBR rendering pipeline, and the method includes the following steps.
101, acquiring model information and light source information of the target object, wherein the model information comprises normal information.
The target object may be any virtual object and/or character and/or animal, etc. The model information of the target object includes the normal information of the model, including the normal vector N, and the texture coordinates (UV coordinates) of each pixel; it should be noted that the UV coordinates are originally defined at the vertices but are interpolated and mapped to each pixel. The texture information includes at least one of the intrinsic color, roughness, attenuation coefficient, reflection color, ambient occlusion intensity, etc. of the target object, and the light source information includes at least one of the light source color, light source intensity, light source direction, transmission ratio of the light source, etc.
102, determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered by using a preset diffuse reflection function according to the normal information and the light source information, wherein the preset diffuse reflection function is differentiable where its function value is zero, and its derivative there is zero.
This application uses a function filtering method: where the diffuse reflection term is zero, the function is differentiable and its derivative is zero, which makes the light-dark boundary line soft, as shown in fig. 4 (compare with fig. 2). The filter function is required to be defined on the domain x ∈ [-1, 1], to have the value range y ∈ [0, 1], and to have a derivative dy/dx approaching 0 as closely as possible at y = 0. The filtering function in this application uses a quadratic function y = k² for the filtering calculation. Let x = N·L and k = (x - 1)/(2λ) + 1, thus obtaining the preset diffuse reflection function of the present application, y = ((x - 1)/(2λ) + 1)², where x = N·L and λ is the attenuation coefficient of the diffuse reflection term, with value range [0, +∞). When λ = 0.5, the image of the preset diffuse reflection function is shown in fig. 5.
After the preset diffuse reflection function is obtained, the attenuation coefficient λ of the R, G, and B channels stored in G-Buffer3 is obtained, and the diffuse reflection term of each pixel, i.e., the diffuse reflection illumination intensity/diffuse reflection illumination coefficient, is determined using the preset diffuse reflection function according to the normal information such as the normal vector, the light source information such as the light source direction, and the attenuation coefficient: the normal vector, the light source direction, and the attenuation coefficient are substituted into the preset diffuse reflection function, and the resulting value is the calculated diffuse reflection illumination coefficient, which may be denoted DiffuseTerm. Because the R, G, and B channels share the same attenuation coefficient here, the diffuse reflection illumination coefficients DiffuseTerm of the three channels also change consistently. Since the preset diffuse reflection function is differentiable where its value is zero and its derivative there is zero, the boundary line at the light-dark transition is soft.
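A minimal sketch of this computation in Python, assuming N and L are already normalized so that x = N·L lies in [-1, 1]; the function and variable names are ours, not from the application:

```python
def diffuse_term(n_dot_l: float, lam: float) -> float:
    # Preset diffuse function: y = ((x - 1) / (2*lam) + 1)^2, with x = N.L.
    k = (n_dot_l - 1.0) / (2.0 * lam) + 1.0
    # At k = 0 both the value and the derivative are zero, which is what
    # softens the light-dark boundary. Clamping k at 0 keeps y at zero for
    # x < 1 - 2*lam (our reading of the max(y, 0) clamp used in the text).
    k = max(k, 0.0)
    return k * k
```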
In one embodiment, the step of determining the diffuse reflection illumination coefficient of each pixel in the pixels to be rendered by using the preset diffuse reflection function according to the normal information and the light source information includes: for each of the R, G, and B channels, obtaining a preset per-channel attenuation coefficient, e.g., obtaining from G-Buffer5 the coefficients λ1, λ2, and λ3; determining, using the preset diffuse reflection function, the diffuse reflection illumination coefficient of each pixel in the R, G, and B channels respectively according to the normal information such as the normal vector, the light source information such as the light source direction, and the per-channel attenuation coefficients, e.g., substituting the normal vector of each pixel, the light source direction, and the attenuation coefficient of the corresponding channel into the preset diffuse reflection function, the resulting value being the diffuse reflection illumination coefficient of that channel; and taking the diffuse reflection illumination coefficients of the R, G, and B channels as the diffuse reflection illumination coefficient DiffuseTerm of each pixel in the pixels to be rendered. In this embodiment, a different attenuation coefficient is set for each channel, and the attenuation rate is controlled per channel.
It suffices that at least one of λ1, λ2, and λ3 differs from the others. Images of the preset diffuse reflection function under different attenuation coefficients are shown in fig. 6: the attenuation coefficients of the three diffuse reflection curves from left to right are 0.8, 0.5, and 0.2, respectively; different attenuation coefficients correspond to different diffuse reflection curves, and different curves correspond to different attenuation rates.
It should be noted that in the standard PBR rendering pipeline, the attenuation of diffuse reflection is not controlled per channel. In the present application it is: by modifying the attenuation coefficients of the R, G, and B channels, the three channels attenuate at different rates, so that a separation of hues occurs alongside the diffuse light-dark variation, increasing the richness of the color changes.
In practical application, the attenuation coefficients of the different channels can be adjusted according to the characteristics of the target object. For example, if the skin is expected to transmit relatively red light, the λ values of the R and G channels can be raised to slow the attenuation; for example, the attenuation coefficient of the R channel can be set to 0.62, that of the G channel to 0.48, and that of the B channel to 0.46, giving relatively reddish skin.
In some cases, different attenuation requirements may be set for different parts of the target object. For example, when the target object is a person, the bony parts of the face transmit relatively little light, while the relatively thin ears transmit relatively more, so a texture may be used for fine control, that is, different attenuation coefficients may be set for the different channels of each pixel.
Correspondingly, different attenuation coefficients can be set for the R, G, and B channels of each different pixel in the pixels to be rendered, and these coefficients can be stored in the G-Buffer through a texture. The attenuation coefficients of the different channels of each pixel are obtained, the diffuse reflection illumination coefficient of each pixel in the R, G, and B channels is determined using the preset diffuse reflection function according to the normal information such as the normal vector of each pixel, the light source information such as the light source direction, and the per-pixel attenuation coefficients, and the per-channel diffuse reflection illumination coefficients of each pixel are taken as the diffuse reflection illumination coefficient DiffuseTerm of that pixel. In this embodiment, a different attenuation coefficient is set for each channel of each pixel to determine the per-channel diffuse reflection illumination coefficient of each pixel, as in the sketch below.
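A per-channel sketch building on the `diffuse_term` function above; the tuple layout and the reuse of the skin-tone coefficients from the example are illustrative:

```python
def diffuse_term_rgb(n_dot_l: float, lam_rgb: tuple) -> tuple:
    # One attenuation coefficient per channel, e.g. (0.62, 0.48, 0.46) for
    # the reddish-skin example; each channel attenuates at its own rate.
    return tuple(diffuse_term(n_dot_l, lam) for lam in lam_rgb)
```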
Because the attenuation of brightness is controlled by attenuation coefficients in this application, light leakage may occur when an attenuation coefficient is set large. Therefore, a transmission ratio parameter is added to the light source; the transmission ratio, also called the transmission coefficient, is the ratio of the transmitted light flux to the incident light flux. It should be noted that light sources in the standard PBR rendering pipeline do not have this parameter.
Thus, in one case, the method further comprises: when the transmission ratio of the light source is zero, setting the attenuation coefficient of each of the R, G, and B channels of each pixel to a default attenuation coefficient, which may be 0.4; when the transmission ratio of the light source is 1, setting the attenuation coefficient of each of the R, G, and B channels of each pixel to the configured attenuation coefficient, i.e., the attenuation coefficient stored in the G-Buffer; and when the transmission ratio of the light source is between 0 and 1, setting the attenuation coefficient of each of the R, G, and B channels to an interpolated attenuation coefficient, obtained by interpolating between the default attenuation coefficient and the configured attenuation coefficient, each channel being interpolated to obtain the coefficient of the corresponding channel. In this embodiment, the R, G, and B channels are set to different attenuation coefficients as the transmission ratio of the light source differs. When the transmission ratio of the light source is 1, the light source is fully transmitted; when it is 0, all attenuation coefficients are set to 0.4, so that the attenuation result is close to the attenuation trend of N·L in the standard PBR rendering pipeline. The transmission ratio of the light source is determined according to the rendering requirement, so that light leakage is avoided.
In another case, the method further comprises: when the transmission ratio of the light source is zero, setting the attenuation coefficients of the different channels of the different pixels to the default attenuation coefficient, which may be 0.4; when the transmission ratio of the light source is 1, setting them to the configured attenuation coefficients, i.e., the attenuation coefficients stored in the G-Buffer in texture form; and when the transmission ratio of the light source is between 0 and 1, setting them to interpolated attenuation coefficients, obtained by interpolating between the default and configured coefficients for the R, G, and B channels of each pixel. In this embodiment, appropriate attenuation coefficients can be set for the different channels of the different pixels according to the transmission ratio of the light source, avoiding light leakage, as in the sketch below.
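A sketch of selecting the effective attenuation coefficient from the light source's transmission ratio, assuming plain linear interpolation between the default value 0.4 and the coefficient stored in the G-Buffer (the interpolation form is our assumption; the text only says the result is interpolated):

```python
def effective_attenuation(transmission: float, stored_lam: float,
                          default_lam: float = 0.4) -> float:
    # transmission = 0 -> default coefficient (0.4);
    # transmission = 1 -> configured coefficient from the G-Buffer;
    # in between      -> interpolated attenuation coefficient.
    return default_lam + (stored_lam - default_lam) * transmission
```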
After adjusting the attenuation coefficient of diffuse reflection, the boundary line of light and dark is moved backward as shown in fig. 7a and 7 b. Fig. 7a is a schematic diagram of a light-dark boundary line corresponding to an attenuation coefficient of unadjusted diffuse reflection, fig. 7b is a schematic diagram of a light-dark boundary line corresponding to an adjusted attenuation coefficient, and it can be seen from fig. 7a and fig. 7b that the light-dark boundary line is significantly shifted backward in fig. 7 b. The shift of the light-dark boundary line results in the display of a self-shadow that is originally hidden in the dark portion, as shown in fig. 7 c.
In an embodiment, after the step of determining the diffuse reflection illumination coefficient for each of the pixels to be rendered, further comprising: and correcting the shadow coefficient of each pixel in the pixels to be rendered to obtain a corrected shadow coefficient, and determining the final diffuse reflection illumination coefficient of each pixel according to the diffuse reflection illumination coefficient of each pixel and the corrected shadow coefficient.
Specifically, the step of performing correction processing on the shadow coefficient of each pixel in the pixels to be rendered to obtain a corrected shadow coefficient includes: acquiring the original shadow coefficient of each pixel, the first diffuse reflection illumination coefficient corresponding to the case where the normal vector is perpendicular to the illumination direction, and the maximum shadow coefficient; and correcting the original shadow coefficient using the first diffuse reflection illumination coefficient and the maximum shadow coefficient to obtain the corrected shadow coefficient. The original shadow coefficient of each pixel refers to the shadow coefficient in the standard PBR rendering pipeline.
As above, the preset diffuse reflection function is y = k² with k = (x - 1)/(2λ) + 1 and x = N·L, and the diffuse reflection illumination coefficient is max(y, 0). When x = 0, that is, when the normal is perpendicular to the illumination direction, the corresponding value of y is used as the first diffuse reflection illumination coefficient, denoted Bound.
Assume the original shadow coefficient of each pixel is Shadow, with Shadow = 0 where there is shadow and Shadow = 1 where there is none, and the maximum shadow coefficient is 1. The step of correcting the original shadow coefficient using the first diffuse reflection illumination coefficient and the maximum shadow coefficient includes: interpolating according to the original shadow coefficient between the first diffuse reflection illumination coefficient and the maximum shadow coefficient to obtain the corrected shadow coefficient. Denoting the corrected shadow coefficient Shadow', the process can be expressed as Shadow' = Mix(Shadow, Bound, 1), where Mix(t, a, b) denotes linear interpolation between a and b according to t.
After the corrected shadow coefficient is obtained, the final diffuse reflection illumination coefficient of each pixel is determined from the diffuse reflection illumination coefficient of each pixel and the corrected shadow coefficient; for example, the smaller of the two is taken as the final diffuse reflection illumination coefficient of the corresponding pixel. The final diffuse reflection illumination coefficient is still denoted DiffuseTerm, and the process can be expressed as DiffuseTerm = min(Shadow', max(y, 0)), as in the sketch below.
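A sketch of the shadow correction and the final DiffuseTerm, following the formulas above; Mix is spelled out, and Bound is evaluated at x = 0 (function names are ours):

```python
def mix(t: float, a: float, b: float) -> float:
    # Mix(t, a, b): linear interpolation between a and b according to t.
    return a + (b - a) * t

def final_diffuse_term(y: float, shadow: float, lam: float) -> tuple:
    # Bound: value of the preset diffuse function at x = 0
    # (normal perpendicular to the illumination direction).
    bound = ((0.0 - 1.0) / (2.0 * lam) + 1.0) ** 2
    shadow_corr = mix(shadow, bound, 1.0)   # Shadow' = Mix(Shadow, Bound, 1)
    # DiffuseTerm = min(Shadow', max(y, 0)); Shadow' is returned as well
    # because the specular path below also uses it.
    return min(shadow_corr, max(y, 0.0)), shadow_corr
```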
It should be noted that, besides modifying the diffuse reflection function of the standard PBR rendering pipeline, the diffuse reflection rendering flow of the PBR pipeline is also modified, including: controlling the attenuation coefficient per channel, taking into account the transmission ratio of the light source, correcting the shadow coefficient, and so on.
103, determining the diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information.
If the backward shift of the light-dark boundary line is not considered, that is, the original shadow coefficient is not corrected, the diffuse reflection illumination value of each pixel is determined from the diffuse reflection illumination coefficient of each pixel, the light source information, and the original shadow coefficient, where the light source information includes the light source color LightColor and the light source intensity LightStr; for example, the diffuse reflection illumination coefficient of each pixel, the light source color, the light source intensity, and the original shadow coefficient of each pixel are multiplied to obtain the diffuse reflection illumination value of each pixel. Denoting the diffuse reflection illumination value of each pixel LightValue1, the process can be expressed as LightValue1 = DiffuseTerm × LightColor × LightStr × Shadow.
If the backward shift of the light-dark boundary line is considered and the original shadow coefficient is corrected, the diffuse reflection illumination value of each pixel is determined from the diffuse reflection illumination coefficient, the light source information, and the corrected shadow coefficient of each pixel; for example, the diffuse reflection illumination coefficient of each pixel, the light source color, the light source intensity, and the corrected shadow coefficient of each pixel are multiplied to obtain the diffuse reflection illumination value of each pixel, which can be expressed as LightValue1 = DiffuseTerm × LightColor × LightStr × Shadow'.
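A sketch of the per-pixel diffuse lighting value; treating DiffuseTerm as a scalar and LightColor as an RGB tuple is a representational choice of ours:

```python
def diffuse_light_value(diffuse_term_val: float, light_color: tuple,
                        light_str: float, shadow: float) -> tuple:
    # LightValue1 = DiffuseTerm * LightColor * LightStr * Shadow(')
    # where Shadow(') is the original or the corrected shadow coefficient.
    return tuple(diffuse_term_val * c * light_str * shadow
                 for c in light_color)
```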
104, performing stylized rendering on the target object according to the diffuse reflection illumination value.
After the diffuse reflection illumination value is obtained, the target object is rendered according to the diffuse reflection illumination value to realize stylized rendering; in this case, only the diffuse reflection illumination of the target object is involved.
In one case, in addition to determining the diffuse reflection illumination value LightValue1 of the target object, it is also necessary to determine the specular reflection illumination value LightValue2 and/or the indirect light diffuse reflection illumination value LightValue3 and/or the indirect light specular reflection illumination value LightValue4 of the target object, and perform stylized rendering on the target object according to the diffuse reflection illumination value LightValue1, the specular reflection illumination value LightValue2 and/or the indirect light diffuse reflection illumination value LightValue3 and/or the indirect light specular reflection illumination value LightValue 4.
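How these contributions combine is not spelled out here beyond "according to"; a plain additive combination is sketched below as one reasonable reading, with every name ours:

```python
def combine_lighting(lv1: tuple, lv2=None, lv3=None, lv4=None) -> tuple:
    # lv1: diffuse; lv2: specular; lv3: indirect diffuse; lv4: indirect
    # specular. Terms not used by the chosen embodiment are passed as None.
    total = list(lv1)
    for lv in (lv2, lv3, lv4):
        if lv is not None:
            total = [a + b for a, b in zip(total, lv)]
    return tuple(total)
```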
The method for determining the specular reflection illumination value of the target object can be performed by adopting any method for determining the specular reflection illumination value, the method for determining the indirect diffuse reflection illumination value of the target object can be performed by adopting any method for determining the indirect diffuse reflection illumination value, and the method for determining the indirect specular reflection illumination value of the target object can be performed by adopting any method for determining the indirect specular reflection illumination value.
In one case, the specular reflection illumination value LightValue2 of the target object may be determined as follows: the specular reflection illumination coefficient SpecTerm of each pixel is determined using the physically based rendering pipeline, and the specular reflection illumination value LightValue2 of each pixel is determined according to the specular reflection illumination coefficient SpecTerm, the corrected shadow coefficient Shadow' of each pixel, and the light source information such as the light source color LightColor and the light source intensity LightStr, where the corrected shadow coefficient is obtained by correction processing according to the original shadow coefficient of each pixel; how the corrected shadow coefficient is obtained is described in detail above and is not repeated here. The process of obtaining the specular reflection illumination value of each pixel can be expressed as LightValue2 = 0.04 × SpecTerm × LightColor × LightStr × Shadow'. In this embodiment, the corrected shadow coefficient is used to calculate the specular reflection illumination value, so the calculated specular highlight is better. The specular reflection illumination coefficient of each pixel determined by the physically based rendering pipeline can be calculated with the GGX algorithm of the standard PBR rendering pipeline, as in the sketch below.
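A sketch of the specular value per the formula above; SpecTerm is assumed to come from the pipeline's GGX computation and is not reimplemented here:

```python
def specular_light_value(spec_term: float, light_color: tuple,
                         light_str: float, shadow_corr: float) -> tuple:
    # LightValue2 = 0.04 * SpecTerm * LightColor * LightStr * Shadow'
    return tuple(0.04 * spec_term * c * light_str * shadow_corr
                 for c in light_color)
```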
In one case, the model information includes the texture coordinates of each vertex of the target object, and the texture coordinates of each vertex are mapped onto each pixel to obtain the texture coordinates of each pixel. The specular reflection illumination value LightValue2 of the target object can then be determined as follows: a highlight intensity of each pixel is generated according to the texture coordinates of each pixel in the pixels to be rendered of the target object, and the specular reflection illumination value of each pixel is determined according to the highlight intensity, the highlight color, the corrected shadow coefficient of each pixel, and the light source information. It should be noted that the size of the highlight spot in the standard PBR rendering pipeline is controlled by the roughness and satisfies the law of conservation of energy, whereas determining the highlight intensity from the texture coordinates of each pixel to be rendered, as in the present application, differs from the roughness-controlled principle of the standard PBR rendering pipeline. This part is described in detail hereinafter; see the corresponding passages.
In one case, illumination processing of indirect light, which may also be referred to as ambient light, is also included; correspondingly, the light source information also includes indirect light information, i.e., ambient light information. The indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value of the target object can be determined as follows: performing indirect light diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each pixel by using the roughness and the normal information to obtain an indirect light specular reflection illumination value, so as to decouple the indirect light diffuse reflection processing from the indirect light specular reflection processing; and determining the indirect light illumination value of each pixel according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value. It should be noted that in the standard PBR rendering pipeline the calculations of the indirect light diffuse reflection illumination value and the indirect light specular reflection illumination value are coupled together, whereas in this embodiment they are calculated separately, decoupling the two processes so as to diversify the indirect light specular and diffuse reflection effects. This part is described in detail hereinafter; see the corresponding passages.
The specular reflection illumination is described next, beginning with the specular illumination processing. In the specular illumination processing of the standard PBR rendering pipeline, the size of the highlight spot (or specular spot) corresponding to the specular illumination is related to the roughness of the target object, and the processing follows the law of conservation of energy, i.e., the integral of the highlight spot is a fixed value: the more concentrated the highlight spot, the brighter it is, and the more dispersed, the darker. In other words, the size and brightness of the highlight spot are coupled and controlled together. However, for stylized rendering it is not always desirable that a larger highlight spot be darker and a smaller one brighter; sometimes a larger and brighter highlight is needed. Therefore, the specular illumination processing of the standard PBR rendering pipeline does not perform well for stylized rendering.
Fig. 8 is a flow chart of a method for rendering an object according to an embodiment of the present application, which mainly relates to improvement of a rendering flow and a rendering equation of specular reflection processing of a standard PBR rendering pipeline, and includes the following steps.
201, acquiring model information and light source information of the target object, wherein the model information comprises normal information.
For the specific content of the model information and the light source information, refer to the description of step 101 above, which is not repeated here.
202, obtaining the texture coordinates, shadow coefficient, and highlight color of each pixel in the pixels to be rendered of the target object, and generating the highlight intensity of each pixel in the pixels to be rendered according to the texture coordinates of each pixel.
It should be noted that the standard PBR rendering pipeline calculates the specular reflection coefficient according to the roughness; it does not calculate a highlight intensity, let alone calculate one from texture coordinates.
In one embodiment, the step of generating the highlight intensity of each pixel in the pixels to be rendered according to the texture coordinates of each pixel includes: determining a highlight coefficient of each pixel in the pixels to be rendered according to the texture coordinates; and generating the highlight intensity of each pixel according to the highlight coefficient and a preset highlight intensity value. The highlight coefficient is a value in [0, 1]. The preset highlight intensity value is a standard highlight intensity value; for example, it may be the highlight intensity when the light source color is white, the light source intensity is 1, and the reflected highlight color is white. In different application scenarios the standard highlight intensity value may be the same or different. From the highlight coefficient and the preset highlight intensity value, the highlight intensity of each pixel under the standard highlight intensity value can be obtained, as in the sketch below.
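A sketch of generating the highlight intensity, assuming the generation is a simple scaling of the preset standard value by the highlight coefficient (the exact combination is our assumption; the text only says the intensity is generated from the two):

```python
def highlight_intensity(highlight_coeff: float,
                        preset_intensity: float) -> float:
    # highlight_coeff is derived from texture coordinates and lies in [0, 1];
    # preset_intensity is the standard highlight intensity value.
    assert 0.0 <= highlight_coeff <= 1.0
    return highlight_coeff * preset_intensity
```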
In one embodiment, the target object includes a first target object, which may be a virtual eyeball or the like, and the step of determining the highlight coefficient of each pixel in the pixels to be rendered according to the texture coordinates includes: acquiring a highlight texture map of the first target object; determining sampling texture coordinates of each pixel in the pixels to be rendered according to the texture coordinates, the normal information in the model information of the first target object, and the positional relationship of the virtual camera; and sampling the highlight texture map according to the sampling texture coordinates to generate the highlight coefficient of each pixel. The virtual camera refers to the virtual camera in the current virtual scene. In this embodiment, offset data between the virtual camera and the first target object is determined according to the texture coordinates, the normal information, and the positional relationship of the virtual camera; the offset data is used as offset data of the texture coordinates to obtain the sampling texture coordinates, and the highlight texture map is sampled according to the sampling texture coordinates to obtain the correct highlight coefficient of each pixel.
The first object is taken as a virtual eyeball as an example for explanation. Correspondingly, the highlight texture map of the first object refers to a highlight texture map corresponding to a virtual eyeball, as shown in fig. 9, which is a schematic diagram of the highlight texture map corresponding to a certain virtual eyeball.
Correspondingly, the step of determining the sampling texture coordinates of each pixel in the pixels to be rendered according to the texture coordinates, the normal information in the model information of the first object and the position relation of the virtual camera comprises the following steps: when the virtual camera looks at the virtual eyeball, a vertical vector in the vertical direction and a horizontal vector in the horizontal direction on the plane on which the virtual eyeball is positioned are obtained; determining the offset between the virtual camera and the virtual eyeball according to the vertical vector, the horizontal vector and the orientation vector of the virtual camera; and determining the sampling texture coordinates of each pixel in the pixels to be rendered according to the texture coordinates, the offset, the moving speed of the highlight point along with the sight line and the scaling data of the highlight point. In the embodiment, the position of the virtual camera relative to the virtual eyeball is determined according to the offset between the virtual camera and the virtual eyeball, so that the position of the highlight point on the virtual eyeball is further determined, and the position of the highlight point on the virtual eyeball can be obtained according to the corresponding sampling texture coordinates and the highlight texture map.
When the virtual camera looks at the virtual eyeball, the vertical vector and the horizontal vector on the plane in which the virtual eyeball lies may be as shown in fig. 10, where the vertical vector may be denoted UP and the horizontal vector Right. Determining the vertical and horizontal vectors amounts to fixing a reference frame, so that subsequent operations are carried out in that frame; this both simplifies the subsequent operations and makes it convenient to determine the relationship between the virtual eyeball and the line of sight (the virtual camera), since the highlight of the virtual eyeball moves as the line of sight moves.
Two different ways can be used to determine the vertical and horizontal vectors. In the first way, if the virtual eyeball is rigged and animated, the bone vectors of the bones bound to the virtual eyeball are obtained, and the vertical vector and the horizontal vector on the plane of the virtual eyeball are determined from the bone vectors. If the virtual eyeball is not rigged or a bone vector cannot be obtained, the second way is used: two identical eyeball models are duplicated to obtain two copies of the virtual eyeball, and the normal vectors of the two copies are set as the vertical vector and the horizontal vector on the plane of the virtual eyeball, as shown in fig. 11. For the second way, the values of the vertical vector and the horizontal vector in the tangent space of the target object's surface can be recorded on a texture by baking a normal map, and restored to world space when the highlight position is calculated.
Let the orientation vector of the virtual camera be V, the obtained vertical vector be UP, and the horizontal vector be Right. Correspondingly, the step of determining the offset between the virtual camera and the virtual eyeball according to the vertical vector, the horizontal vector and the orientation vector of the virtual camera includes: taking the dot product of each of the vertical vector and the horizontal vector with the orientation vector of the virtual camera to obtain two offset values between the virtual camera and the virtual eyeball. This can be expressed by the formulas Offset1 = V·Right and Offset2 = V·UP. When the line of sight (virtual camera) directly faces the virtual eyeball, both Offset1 and Offset2 are 0.
The step of determining the sampling texture coordinates of each pixel according to the texture coordinates, the offset, the speed at which the highlight point moves with the line of sight, and the scaling data of the highlight point includes: offsetting the original texture coordinates of each pixel of the virtual eyeball to obtain offset texture coordinates UV' = UV − 0.5; and determining the sampling texture coordinates of each pixel according to the offset, the offset texture coordinates, the movement speed of the highlight point with the line of sight, and the scaling data of the highlight point. It should be noted that the original texture coordinates of each pixel of the virtual eyeball need to be projected from the viewpoint of a virtual camera facing the front of the virtual eyeball, as shown in fig. 12; the offset is applied on that basis. The scaling data of the highlight point, denoted Scale, controls the size of the highlight point, and the movement speed of the highlight point with the line of sight is denoted Speed. The sampling texture coordinate UVHighLight of each pixel can then be expressed by the formula UVHighLight = UV' − (Vec2(Offset1, Offset2) × Speed + Vec2(BaseOffset1, BaseOffset2)).
It should be noted that in the embodiment of the present application, a separate parameter Scale is used to control the size and scaling of the highlight point, while Offset1, Offset2, BaseOffset1, BaseOffset2 and Speed control the positional shift of the highlight. The size/scaling of the highlight point and its positional shift are thus controlled independently of each other.
After the sampling texture coordinates of each pixel are obtained, the highlight texture map corresponding to the virtual eyeball is sampled at those coordinates to obtain the highlight coefficient of each pixel. The range of the sampling texture coordinates is [0,1], and the resulting highlight coefficient of each pixel is also a value in [0,1]. After the highlight coefficient is obtained, the high light intensity of each pixel is generated from the highlight coefficient and the preset high light intensity value, for example by multiplying the two to obtain the high light intensity SpecStr of each pixel.
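To make the flow above concrete, the following is a minimal Python sketch of the eyeball-highlight sampling. The helper names, the nearest-neighbour sampling, and the remapping of the centred coordinates back into [0,1] before the lookup are illustrative assumptions, not details fixed by the embodiment:

    def dot3(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def sample_highlight(tex, u, v):
        # Nearest-neighbour sample of a row-major grayscale map; coordinates
        # outside [0, 1] read as 0, i.e., no highlight there (an assumption).
        if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
            return 0.0
        h, w = len(tex), len(tex[0])
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        return tex[y][x]

    def eyeball_highlight_intensity(uv, view, up, right, speed,
                                    base_offset, spec_str_preset, tex):
        # Offset1 = V . Right, Offset2 = V . UP; both are 0 when the camera
        # looks straight at the eyeball.
        offset1 = dot3(view, right)
        offset2 = dot3(view, up)
        # UV' = UV - 0.5, then
        # UVHighLight = UV' - (Vec2(Offset1, Offset2) * Speed
        #                      + Vec2(BaseOffset1, BaseOffset2)).
        u = (uv[0] - 0.5) - (offset1 * speed + base_offset[0])
        v = (uv[1] - 0.5) - (offset2 * speed + base_offset[1])
        # Assumed remap back into [0, 1] before sampling the highlight map.
        coeff = sample_highlight(tex, u + 0.5, v + 0.5)  # value in [0, 1]
        return coeff * spec_str_preset                   # SpecStr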
It can be seen that, for a first target object such as a virtual eyeball, the size of the highlight point and its high light intensity are represented by two different parameters, Scale and SpecStr, so the highlight size and the high light intensity are controlled separately during specular reflection processing. They therefore do not need to satisfy the law of conservation of energy as in a standard PBR rendering pipeline, which makes the scheme in the embodiment of the present application suitable for stylized rendering of the virtual eyeball. The resulting highlight effect of the rendered virtual eyeball may be as shown in fig. 13a and 13b.
In one aspect, the target object includes a second target object, which may be a virtual hair object, and the step of determining the highlight coefficient of each pixel according to the texture coordinates includes: determining preset texture coordinates of the second target object, the preset texture coordinates being obtained by unfolding the texture coordinates of the patches corresponding to the second target object into texture coordinates arranged vertically along the V-axis direction; and determining the highlight coefficient of each pixel according to the preset texture coordinates. For example: a pre-built preset function is obtained, the preset function including a highlight position offset coefficient and a highlight thickness coefficient, where the maximum value of the preset function is controllable, its independent-variable range is controllable, and the position of the maximum value within that range is controllable; a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient are set; and the highlight coefficient of each pixel of the second target object is generated with the preset function from the preset texture coordinates, the highlight position offset value and the highlight thickness value.
Taking the second target object as virtual hair as an example: in order to better control the position and size (thickness) of the highlight without affecting the original texture sampling, a separate set of preset texture coordinates is established. The preset texture coordinates use the exact center of the texture coordinate (original texture coordinate) space as the coordinate origin, and the texture coordinates of each virtual hair patch (also called a virtual hair card) in the virtual hair are unfolded into UV blocks arranged vertically along the V-axis direction. After this second set of preset texture coordinates is obtained, it is mapped to pixels to obtain the preset texture coordinates of each pixel, which then enter the calculation of the highlight coefficient.
Fig. 14a is a schematic diagram of the original texture coordinates of a virtual hair patch, and fig. 14b is a schematic diagram of the unfolded preset texture coordinates.
The preset texture coordinates require that the resulting UV blocks be straight even if the virtual hair patch is curved, and that they run from small to large along the V axis in the direction from root to tip. As shown in fig. 15a, the right side shows several virtual hair patches, which are curved, while the left side shows the corresponding preset texture coordinates, which are straight. As shown in fig. 15b, the black part of a virtual hair patch has the smaller V coordinate in the preset texture coordinates, i.e., the root has the smaller V coordinate, while the white part, i.e., the tip, has the larger V coordinate.
Because the virtual hair patches differ in length, the UV blocks in the resulting preset texture coordinates differ in length in the vertical direction. Seen the other way around, setting the texture coordinates of all hairs to run from 0 to 1 would stretch the highlight on long virtual hair patches and compress it on short ones; the texture coordinates are therefore adjusted to obtain the preset texture coordinates, in which a short virtual hair patch corresponds to a shorter UV block and a long one to a longer UV block. Equivalently, because the virtual hair patches differ in length, the preset texture coordinates must satisfy two requirements: first, the starting positions of the UV blocks (also called UV islands) of the virtual hair patches in the V-axis direction are roughly aligned; second, the UV islands are scaled in the V-axis direction according to the length of each virtual hair patch. This ensures that, when the parameters b and c of the preset function (described later) are the same, the position and thickness of the highlight are as intended, with no excessive offset or stretching.
A preset function is constructed with the V value of the adjusted preset texture coordinates as the independent variable. The preset function includes a highlight position offset coefficient and a highlight thickness coefficient (controlling highlight size, scaling, and so on); its maximum value must be controllable, its independent-variable range must be controllable, and the position of the maximum within that range must be controllable. The embodiment of the present application uses a Gaussian function as an example.
The constructed Gaussian function may be f(v) = exp(−(v − b)² / (2c²)).
Here b is the highlight position offset coefficient, c is the highlight thickness coefficient, v is the independent variable, and f(v) is the value of the Gaussian function, namely the highlight coefficient SpecFactor. Thus, in the high light intensity processing for the second target object, the highlight thickness (highlight size scaling) is an independent parameter, and the highlight position offset is likewise an independent parameter.
When b = 0 and c = 0.34, the image of the Gaussian function is as shown in fig. 16a. Modifying the value of b shifts the function image left or right; modifying the value of c widens or narrows it, as shown in fig. 16b. Throughout these shape changes the maximum value remains 1, which keeps the high light intensity under control, so the Gaussian function meets all requirements of the preset function.
A suitable b value and c value are then set, i.e., the highlight position offset value corresponding to the highlight position offset coefficient and the highlight thickness value corresponding to the highlight thickness coefficient. Substituting the V value of the preset texture coordinates into the preset function generates the highlight coefficient of each pixel of the second target object, which may also be called a highlight intensity mask and whose value range is [0,1].
Fig. 17a shows the highlight effect of the highlight coefficient (highlight mask) on a single virtual hair patch, and fig. 17b shows the highlight effect of the highlight coefficient on the whole virtual hair.
After the highlight coefficient of each pixel is obtained, the high light intensity of each pixel is obtained from the highlight coefficient and the preset high light intensity value, for example by multiplying the two to obtain the high light intensity of each pixel relative to the preset high light intensity value.
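A minimal sketch of this computation, assuming the Gaussian form given above; the function names and default parameter values are illustrative only:

    import math

    def hair_highlight_coeff(v, b=0.0, c=0.34):
        # f(v) = exp(-(v - b)^2 / (2 c^2)): peaks at 1 when v = b, so b
        # shifts the highlight along the strand and c controls its thickness.
        return math.exp(-((v - b) ** 2) / (2.0 * c * c))

    def hair_highlight_intensity(v, spec_str_preset, b=0.0, c=0.34):
        # SpecStr = highlight coefficient x preset high light intensity value.
        return hair_highlight_coeff(v, b, c) * spec_str_preset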
In the process above, only the V value of the preset texture coordinates is used. In one case, the U value of the preset texture coordinates may also be used. Correspondingly, the step of setting the highlight thickness value corresponding to the highlight thickness coefficient includes: varying the highlight thickness coefficient with the U value of the preset texture coordinates to obtain a varying highlight thickness value. Correspondingly, the step of generating the highlight coefficient of each pixel of the second target object with the preset function from the preset texture coordinates, the highlight position offset value and the highlight thickness value includes: generating the highlight coefficient of each pixel of the second target object with the preset function from the V value of the preset texture coordinates, the highlight position offset value and the varying highlight thickness value.
Specifically, the c value of the Gaussian function can be varied with the U value of the preset texture coordinates according to some function c = g(U), yielding a varying highlight thickness value and adding detail to the highlight shape. For example, a minimum highlight thickness and a maximum highlight thickness are obtained, and the highlight thickness coefficient in the preset function is interpolated between them with an interpolation function to obtain the varying highlight thickness value.
For example, the following formula may be used: c = Mix((|U − 0.5| × 2)², MinC, MaxC), where U is the U value of the preset texture coordinates, MinC is the minimum highlight thickness, MaxC is the maximum highlight thickness, and Mix(t, min, max) interpolates between min and max according to t. When this formula is used, the preset texture coordinates of a virtual hair patch should be as close to the center in the U-axis direction as possible and have sufficient width. With this formula an H-shaped highlight pattern is obtained: fig. 18a is a schematic diagram of the H-shaped highlight on a single virtual hair patch, and fig. 18b of the H-shaped highlight on the whole virtual hair.
In an embodiment, after the highlight position offset value is obtained, a first noise function may further be applied to it to obtain the final highlight position offset value; and/or, after the varying highlight thickness value is obtained, a second noise function may further be applied to it to obtain the final highlight thickness value. Noise is applied to the highlight position offset value and the highlight thickness value to simulate the interference of individual hair strands with the highlight. Fig. 18c is a schematic diagram of the interference effect obtained by applying noise to a virtual hair patch, where both the highlight position offset value and the highlight thickness value have been noise-processed.
Finally, the highlight coefficient of each pixel of the second target object is generated with the preset function from the V value of the preset texture coordinates, the highlight position offset value and the varying highlight thickness value, and the high light intensity SpecStr of each pixel is generated from the highlight coefficient of each pixel and the preset high light intensity value.
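The following sketch combines the U-varying thickness and the optional noise terms described above; the noise functions are left as caller-supplied callables, and the default MinC/MaxC values are illustrative assumptions:

    import math

    def mix(t, a, b):
        # Linear interpolation between a and b by t, with t clamped to [0, 1].
        t = max(0.0, min(1.0, t))
        return a + (b - a) * t

    def h_shaped_coeff(u, v, b=0.0, min_c=0.05, max_c=0.4,
                       noise_b=None, noise_c=None):
        # c = Mix((|U - 0.5| * 2)^2, MinC, MaxC): thin at the patch centre,
        # thick toward the sides, which yields the H-shaped highlight.
        c = mix((abs(u - 0.5) * 2.0) ** 2, min_c, max_c)
        bb = b + (noise_b(u) if noise_b else 0.0)  # strand jitter on position
        cc = c + (noise_c(u) if noise_c else 0.0)  # strand jitter on thickness
        cc = max(cc, 1e-4)                         # keep the Gaussian defined
        return math.exp(-((v - bb) ** 2) / (2.0 * cc * cc))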
It should be noted that, in the high light intensity processing for the second target object, the highlight thickness (highlight size scaling) and the high light intensity are represented by two different parameters, c and SpecStr, so the highlight size and the high light intensity are controlled separately during specular reflection processing. They therefore do not need to satisfy the law of conservation of energy as in a standard PBR rendering pipeline, which makes the scheme in the embodiment of the present application suitable for stylized rendering of the second target object.
In one aspect, the target object includes a third target object, namely a target object whose UV coordinates cannot be aligned as with virtual hair, such as a relatively complex target object like clothes. The step of determining the highlight coefficient of each pixel according to the texture coordinates includes: acquiring a highlight texture map of the third target object; sampling the highlight texture map according to the texture coordinates of each pixel of the third target object to obtain new texture coordinates; obtaining a pre-built preset function, the preset function including a highlight position offset coefficient and a highlight thickness coefficient, where the maximum value of the preset function is controllable, its independent-variable range is controllable, and the position of the maximum within that range is controllable; setting a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient; and generating the highlight coefficient of each pixel of the third target object with the preset function from the new texture coordinates, the highlight position offset value and the highlight thickness value.
When the target object includes a third target object, the step of determining the highlight coefficient of each pixel according to the texture coordinates differs from the second-target-object case in two steps: acquiring the highlight texture map of the third target object, and sampling that map according to the texture coordinates of each pixel to obtain new texture coordinates. These two distinguishing steps are described below.
The step of acquiring the highlight texture map of the third target object includes: obtaining the normal map of the third target object, importing the normal map into a preset tool such as Blender, marking the highlight region and direction by adding and adjusting curves according to the normal structure, and then rendering the result into the highlight texture map of the third target object, which describes the flow of the highlight coordinates. The highlight texture map of a third target object may be as shown in fig. 19. After the highlight texture map is obtained, the values of its R channel and G channel are sampled according to the texture coordinates of each pixel, yielding two values that are used as the u value and v value of the new texture coordinates, respectively.
After the u value and v value of the new texture coordinates are obtained, they are substituted into the preset function; the subsequent processing is the same as for the second target object and is not repeated here. In short, for the second target object the preset texture coordinates are substituted into the preset function, while for the third target object the new texture coordinates obtained from the highlight texture map are substituted.
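A sketch of the third-target-object branch under the same assumptions (nearest-neighbour sampling of (R, G, B) float texels, and the H-shaped thickness variation reused from the hair case; both are illustrative choices rather than details fixed by the embodiment):

    import math

    def mix(t, a, b):
        t = max(0.0, min(1.0, t))
        return a + (b - a) * t

    def clothes_highlight_coeff(highlight_tex, u, v,
                                b=0.0, min_c=0.05, max_c=0.4):
        # The R and G channels of the repainted highlight map provide the
        # new texture coordinates.
        h, w = len(highlight_tex), len(highlight_tex[0])
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        new_u, new_v = highlight_tex[y][x][0], highlight_tex[y][x][1]
        # The new coordinates feed the same Gaussian used for virtual hair.
        c = max(mix((abs(new_u - 0.5) * 2.0) ** 2, min_c, max_c), 1e-4)
        return math.exp(-((new_v - b) ** 2) / (2.0 * c * c))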
Fig. 20 is a schematic diagram showing a rendering result of H-shaped highlights on a certain garment.
203, determining the specular reflection illumination value of each pixel according to the high light intensity, the highlight color, the shadow coefficient or the corrected shadow coefficient, and the light source information, where the corrected shadow coefficient is obtained by correction processing of the shadow coefficient of each pixel.
After the high light intensity SpecStr of each pixel is obtained, the highlight color SpecColor stored in the G-Buffer is fetched, and the specular reflection illumination value LightValue2 of each pixel is determined together with the shadow coefficient Shadow of each pixel, the light source intensity LightStr and the light source color LightColor. For example, the high light intensity, the highlight color, the light source color, the light source intensity and the shadow coefficient of each pixel are multiplied to obtain the specular reflection illumination value of each pixel. This can be expressed by the formula LightValue2 = SpecStr × SpecColor × LightColor × LightStr × Shadow.
In one case, the corrected shadow coefficient is used instead of the original one; correspondingly, the specular reflection illumination value LightValue2 of each pixel is determined according to the high light intensity SpecStr, the highlight color SpecColor, the corrected shadow coefficient Shadow', the light source intensity LightStr and the light source color LightColor. For example, these quantities are multiplied to obtain the specular reflection illumination value of each pixel, which can be expressed by the formula LightValue2 = SpecStr × SpecColor × LightColor × LightStr × Shadow'. For the corrected shadow coefficient, refer to the calculation method above, which is not repeated here.
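Both variants of the formula reduce to a per-channel product; a minimal sketch, with shadow standing for either the original or the corrected shadow coefficient:

    def specular_illumination(spec_str, spec_color, light_color,
                              light_str, shadow):
        # LightValue2 = SpecStr * SpecColor * LightColor * LightStr * Shadow,
        # evaluated channel-wise over the RGB colours.
        return tuple(spec_str * sc * lc * light_str * shadow
                     for sc, lc in zip(spec_color, light_color))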
204, performing stylized rendering on the target object according to the specular reflection illumination value.
After the specular reflection illumination value is obtained, the target object is rendered accordingly to realize stylized rendering; this case involves only the specular reflection illumination of the target object.
In one case, in addition to the specular reflection illumination value LightValue2 of the target object, it is also necessary to determine the diffuse reflection illumination value LightValue1 and/or the indirect light diffuse reflection illumination value LightValue3 and/or the indirect light specular reflection illumination value LightValue4 of the target object, and to perform stylized rendering on the target object according to LightValue1, LightValue2 and/or LightValue3 and/or LightValue4.
The diffuse reflection illumination value of the target object may be determined as in a standard PBR rendering pipeline; in that case the original shadow coefficient is used when determining the specular reflection illumination value. It may also be determined in the manner, described in the embodiment above, that modifies the standard PBR rendering pipeline; for details refer to the corresponding parts above, which are not repeated here.
In one case, the illumination processing of indirect light, which may also be called ambient light, is also included; correspondingly, the light source information also includes indirect light information, i.e., ambient light information. The indirect light diffuse reflection illumination value and/or indirect light specular reflection illumination value of the target object can be determined as follows: performing indirect light diffuse reflection processing on each pixel of the target object using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each pixel using the roughness and the normal information to obtain an indirect light specular reflection illumination value, so as to decouple the two kinds of processing; and determining the indirect light illumination value of each pixel of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value. This part is described in detail below; refer to the corresponding part.
Indirect light illumination, i.e., the processing of ambient light, is described next. In the indirect light processing of a standard PBR rendering pipeline, the surroundings of the target object are first captured to obtain the ambient light information, i.e., the indirect light information; the indirect light specular reflection illumination value is determined from that information; and the indirect light specular reflection illumination value is then smoothed to yield the indirect light diffuse reflection illumination value. The two values are therefore coupled (if the indirect specular value is red, the indirect diffuse value is red as well); equivalently, the indirect light diffuse reflection processing and the indirect light specular reflection processing are coupled. For stylized rendering, however, the indirect specular part needs to retain rich detail and color variation, while the indirect diffuse part needs to be as low-frequency as possible with a controllable color, so the standard PBR rendering pipeline cannot meet the demands of indirect light processing.
Fig. 21 is another flow chart of the method for rendering a target object provided in an embodiment of the present application; it mainly concerns improvements to the rendering flow and rendering equation of the indirect light reflection processing of a standard PBR rendering pipeline, and includes the following steps.
301, model information and light source information of a target object are acquired, the model information including normal information, and the light source information including indirect light information.
For the model information, refer to the description above; the indirect light information includes the indirect light direction, indirect light color, indirect light shielding intensity (i.e., ambient occlusion, AO), and so on.
302, performing indirect light diffuse reflection processing on each of the pixels to be rendered of the target object using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each of the pixels to be rendered using the roughness and the normal information to obtain an indirect light specular reflection illumination value, so as to decouple the indirect light diffuse reflection processing and the indirect light specular reflection processing.
In the embodiment of the present application, the indirect light diffuse reflection processing and the indirect light specular reflection processing are completely separated, so the indirect light diffuse reflection illumination value and the indirect light specular reflection illumination value are completely decoupled; there is no relation between the two obtained values, which meets the requirements of stylized rendering.
For the indirect light diffuse reflection processing, the step of performing indirect light diffuse reflection processing on each pixel of the target object using the indirect light information and the normal information to obtain the indirect light diffuse reflection illumination value includes: setting a spatial axis, and setting a plurality of indirect light colors and a plurality of indirect light intensities along the spatial axis; determining the indirect light diffuse reflection value of each pixel of the target object according to the spatial axis, the normal information, the plurality of indirect light colors and the plurality of indirect light intensities; and determining the indirect light diffuse reflection illumination value of each pixel according to the indirect light diffuse reflection value, the intrinsic color of each pixel and the indirect light shielding intensity. In this embodiment, control over the indirect light diffuse reflection color is achieved, specifically by arranging a plurality of indirect light colors and indirect light intensities along the spatial axis.
The spatial axis may be represented by a vector and may be determined from the indirect light information of the specific target object. Three indirect light colors and three indirect light intensities are taken as an example. Suppose the picture to be rendered includes the sky, the horizon and the ground, with the sky relatively bright and the horizon next: the light goes from bright at the top of the picture to dark at the bottom. Correspondingly, the spatial axis may be an upward vector (from the ground toward the sky), and three different indirect light colors and three different indirect light intensities are set, corresponding to the ground, the horizon and the sky respectively. As another example, in a studio the ambient light may have a definite direction, e.g., coming from the left and decreasing from left to right; the spatial axis may then be set to a horizontal leftward vector, and three indirect light colors and intensities are likewise set along it.
The spatial axis serves as the reference direction along which the indirect light color and intensity change; it can also be understood as being related to the indirect light direction, so that the several different indirect light colors and intensities vary along it.
The relationship between the normal vector in the normal information and the configured spatial axis is determined, and the plurality of indirect light colors and intensities are interpolated according to that relationship to obtain the indirect light diffuse reflection value of each pixel. Here the relationship between the vertex normal and the spatial axis refers to the dot product of the two.
Suppose the spatial axis is the vector A, the normal vector of each pixel of the target object is N, the three indirect light colors are SkyLightUpColor, SkyLightMidColor and SkyLightDownColor, and the three indirect light intensities are SkyLightUpStr, SkyLightMidStr and SkyLightDownStr. Let Term = |A·N| and Sign = Step(0, A·N); then the indirect light diffuse reflection color of each pixel = Mix(Sign, Mix(1 − Term, SkyLightDownColor, SkyLightMidColor), Mix(Term, SkyLightMidColor, SkyLightUpColor)), where Step(a, b) is 0 when b < a and 1 when b >= a, and Mix(t, a, b) interpolates linearly between a and b according to t. The indirect light diffuse reflection intensity of each pixel is obtained in the same manner. The indirect light diffuse reflection value of each pixel is then obtained from the diffuse reflection color and the diffuse reflection intensity, for example by multiplying the two to obtain the indirect light diffuse reflection value SkyLightColor of each pixel, which can be understood as the value of the indirect light after diffuse reflection illumination processing.
After the indirect light diffuse reflection value of each pixel is obtained, it is multiplied by the intrinsic color of each pixel and the indirect light shielding intensity to obtain the indirect light diffuse reflection illumination value of each pixel. This can be expressed by the formula LightValue3 = SkyLightColor × BaseColor × AO, where SkyLightColor is the indirect light diffuse reflection value, BaseColor the intrinsic color, and AO the indirect light shielding intensity of each pixel.
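A sketch of the whole indirect light diffuse reflection step, assuming three colors and intensities ordered (down, mid, up) and unit-length vectors; Step and Mix follow the definitions above, and the function names are illustrative:

    def step(a, b):
        return 1.0 if b >= a else 0.0

    def mix_s(t, a, b):
        return a + (b - a) * t

    def mix_v(t, a, b):
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    def indirect_diffuse_illumination(normal, axis, colors, strengths,
                                      base_color, ao):
        # Term = |A . N|, Sign = Step(0, A . N).
        d = sum(n * a for n, a in zip(normal, axis))
        term, sign = abs(d), step(0.0, d)
        # Interpolate the colours/intensities set along the spatial axis;
        # colors and strengths are ordered (down, mid, up).
        color = mix_v(sign,
                      mix_v(1.0 - term, colors[0], colors[1]),
                      mix_v(term, colors[1], colors[2]))
        strength = mix_s(sign,
                         mix_s(1.0 - term, strengths[0], strengths[1]),
                         mix_s(term, strengths[1], strengths[2]))
        sky = tuple(c * strength for c in color)  # SkyLightColor
        # LightValue3 = SkyLightColor * BaseColor * AO.
        return tuple(s * b * ao for s, b in zip(sky, base_color))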
The above concerns the indirect light diffuse reflection processing. This processing modifies the standard flow of the PBR rendering pipeline, in which the indirect light diffuse reflection illumination value is derived from the indirect light specular reflection illumination value. Instead, the indirect light direction (related to the spatial axis) is customized directly, several different indirect light colors and intensities are customized across the rendered picture and interpolated to obtain the indirect light diffuse reflection value, and the indirect light diffuse reflection illumination value is then obtained from that value, the intrinsic color of each pixel and the indirect light shielding intensity.
Next, the indirect light specular reflection processing. In a standard PBR rendering pipeline, because of the Fresnel effect, reflection becomes stronger as the viewing angle becomes more parallel to the surface of the target object. For non-metallic materials the target object then gives off a layer of white light; as shown in fig. 22, the two surfaces indicated by the arrows in the line of sight show such a layer, which affects the color stability of the target object. To maintain the color stability of the target object, two processing methods are proposed in the embodiments of the present application.
In the first method, the Fresnel term is removed. Correspondingly, the step of performing indirect light specular reflection processing on each pixel of the target object using the roughness and the normal information to obtain the indirect light specular reflection illumination value includes: determining the indirect light specular reflection value IndirectSpecColor of each pixel of the target object according to the normal information (e.g., the normal vector) and the roughness; and obtaining the reflection color of each pixel of the target object, and determining the indirect light specular reflection illumination value of each pixel according to the indirect light specular reflection value, the reflection color of the target object and the indirect light shielding intensity. Simply put, the indirect light specular reflection illumination value is determined from the indirect light specular reflection value IndirectSpecColor and the reflection color SpecColor of each pixel. In one embodiment, the reflection color SpecColor of each pixel may be the highlight color referred to above, stored in the G-Buffer.
The second method improves the Fresnel term, specifically by treating it according to roughness. Correspondingly, the step of performing indirect light specular reflection processing on each pixel of the target object using the roughness and the normal information to obtain the indirect light specular reflection illumination value includes: determining the indirect light specular reflection value IndirectSpecColor of each pixel according to the normal information (e.g., the normal vector) and the roughness; determining the reflection color of the target object based on the roughness and the intrinsic color of each pixel; and determining the indirect light specular reflection illumination value of each pixel according to the indirect light specular reflection value, the reflection color of the target object and the indirect light shielding intensity. The step of determining the reflection color of the target object based on the roughness and the intrinsic color of each pixel includes: interpolating between a preset reflection color and the intrinsic color according to the roughness to obtain the reflection color of each pixel. This can be expressed by the formula SpecColor = Mix(Clamp(roughness × coefficient, 0, 1), preset reflection color, intrinsic color), where the preset reflection color may be 0.04, the coefficient can be set according to the desired effect, Clamp(x, a, b) takes a when x < a, b when x > b and x otherwise, and Mix(t, a, b) interpolates linearly between a and b according to t.
Determining the indirect light specular reflection value of each pixel from the normal information and the roughness can be understood as follows: like a metal ball whose reflection color differs at every point, the roughness of the target object also affects the indirect light specular reflection value, different roughnesses giving different appearances. The indirect light specular reflection value IndirectSpecColor of each pixel can be understood as the environment color obtained without considering the reflection of the object's own color, i.e., the environment color reflected by a reference-colored (e.g., white) object, as shown in fig. 23; it may also be called the indirect reflection color.
The step of determining the indirect light specular reflection illumination value of each pixel according to the indirect light specular reflection value, the reflection color of the target object and the indirect light shielding intensity includes: multiplying the indirect light specular reflection value, the reflection color of the target object and the indirect light shielding intensity to obtain the indirect light specular reflection illumination value of each pixel. This can be expressed by the formula LightValue4 = IndirectSpecColor × SpecColor × AO.
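A sketch of the second method; the coefficient value of 4.0 and the scalar 0.04 preset reflection color are illustrative assumptions (the embodiment leaves the coefficient to be tuned):

    def indirect_specular_illumination(indirect_spec_color, base_color,
                                       roughness, ao,
                                       coefficient=4.0, preset=0.04):
        # SpecColor = Mix(Clamp(roughness * coefficient, 0, 1),
        #                 preset reflection colour, intrinsic colour).
        t = max(0.0, min(1.0, roughness * coefficient))
        spec_color = tuple(preset + (bc - preset) * t for bc in base_color)
        # LightValue4 = IndirectSpecColor * SpecColor * AO, channel-wise.
        return tuple(isc * sc * ao
                     for isc, sc in zip(indirect_spec_color, spec_color))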
The above concerns the indirect light specular reflection processing, in which the Fresnel term of the standard PBR rendering pipeline is removed or modified with roughness, and the indirect light specular reflection illumination value is finally obtained from the indirect light specular reflection value, the reflection color of each pixel and the indirect light shielding intensity.
303, determining the indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value.
If only the indirect light diffuse reflection illumination value of each pixel is calculated, it is directly taken as the indirect light illumination value; likewise, if only the indirect light specular reflection illumination value is calculated, it is directly taken as the indirect light illumination value. If both are calculated, they may be added to obtain the indirect light illumination value, or weighted and summed to obtain it.
304, performing stylized rendering on the target object according to the indirect light illumination value.
After the indirect light illumination value of each pixel is obtained, the target object is rendered accordingly to realize stylized rendering; this case involves only the indirect light illumination of the target object.
In one case, besides the indirect light illumination value of the target object, the diffuse reflection illumination value and/or the specular reflection illumination value of the target object also need to be determined, and the target object is stylized-rendered according to the indirect light illumination value and/or the diffuse reflection illumination value and/or the specular reflection illumination value.
The diffuse reflection illumination value of the target object may be determined in any existing manner, or in the manner of determining the diffuse reflection illumination value in any embodiment of the present application, which is not detailed here. Likewise, the specular reflection illumination value of the target object may be determined in any existing manner, or in the manner of determining the specular reflection illumination value in any embodiment mentioned above.
When the target object is stylized-rendered according to the diffuse reflection illumination value and/or the specular reflection illumination value and/or the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value, if multiple illumination values are involved, they may be added to obtain the final illumination value of each pixel, or weighted and summed to obtain it, and the stylized rendering is performed according to the final illumination value.
For example, when the diffuse reflection illumination value, the specular reflection illumination value and the indirect light diffuse reflection illumination value are involved, the three may be added and the sum taken as the final illumination value of each pixel; or they may be weighted and the weighted sum taken as the final illumination value; rendering then proceeds according to the final illumination value.
Fig. 24 is a schematic flow chart of a rendering method of an object according to an embodiment of the present application, where the method includes diffuse reflection, specular reflection, diffuse reflection of indirect light, specular reflection of indirect light, and the like.
401, obtaining model information and light source information of a target object, where the model information includes normal information, and the light source information includes first light source information and indirect light information.
The first light source information is used to determine the diffuse reflection illumination value and the specular reflection illumination value; the light source information used for those purposes in the embodiments above is the first light source information, and the description is not repeated. The indirect light information is used to determine the indirect light specular reflection illumination value and the indirect light diffuse reflection illumination value.
402, determining the diffuse reflection illumination coefficient of each of the pixels to be rendered of the target object with a preset diffuse reflection function according to the normal information and the first light source information, and determining the diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the first light source information, where the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative there is zero.
403, obtaining the texture coordinates, highlight color and shadow coefficient of each of the pixels to be rendered of the target object, generating the high light intensity of each pixel according to the texture coordinates, and determining the specular reflection illumination value of each pixel according to the high light intensity, the highlight color, the shadow coefficient or the corrected shadow coefficient, and the first light source information, where the corrected shadow coefficient is obtained by correction processing of the shadow coefficient of each pixel.
404, performing indirect light diffuse reflection processing on each of the pixels to be rendered of the target object using the indirect light information and the normal information to obtain the indirect light diffuse reflection illumination value, and performing indirect light specular reflection processing on each pixel using the roughness and the normal information to obtain the indirect light specular reflection illumination value.
For steps 402 to 404, refer to the descriptions of the corresponding embodiments above, which are not repeated here.
405, performing stylized rendering on the target object according to the diffuse reflection illumination value, the specular reflection illumination value, the indirect light diffuse reflection illumination value and the indirect light specular reflection illumination value.
The diffuse reflection illumination value, the specular reflection illumination value, the indirect light diffuse reflection illumination value and the indirect light specular reflection illumination value may be added to obtain the final illumination value of each pixel, and the target object rendered according to the final illumination value to realize stylized rendering. Alternatively, the four illumination values may be weighted and summed to obtain the final illumination value of each pixel, and the target object rendered accordingly.
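A sketch of the final combination; passing no weights gives the plain sum, and the weight values themselves are left to the caller:

    def final_illumination(light_values, weights=None):
        # light_values holds the per-pixel RGB diffuse, specular, indirect
        # diffuse and indirect specular illumination values; the result is
        # their (optionally weighted) channel-wise sum.
        if weights is None:
            weights = [1.0] * len(light_values)
        return tuple(sum(w * lv[i] for w, lv in zip(weights, light_values))
                     for i in range(3))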
Fig. 25 and 26 are simplified flow diagrams of the rendering method provided in an embodiment of the present application. In fig. 25, the diffuse reflection processing sets different attenuation coefficients for different channels, while the specular reflection processing follows the flow of a standard PBR rendering pipeline; the indirect light diffuse reflection processing and indirect light specular reflection processing adopt the improved manner of the present application. In fig. 26, the diffuse reflection processing sets a uniform attenuation coefficient, the specular reflection processing improves on the standard PBR rendering pipeline, and the indirect light diffuse reflection and specular reflection processing again adopt the improved manner of the present application. These figures are to be understood in conjunction with the parts described above.
In the above method embodiments, the diffuse reflection illumination, specular reflection illumination, indirect light diffuse reflection illumination and indirect light specular reflection illumination of the standard PBR rendering pipeline are improved so as to suit stylized rendering, so that light-and-shadow interaction can be realized and the efficiency of stylized rendering improved.
All the above technical solutions may be combined to form an optional embodiment of the present application, which is not described here in detail.
In order to facilitate better implementation of the method for rendering the target object in the embodiment of the present application, the embodiment of the present application further provides a device for rendering the target object. Referring to fig. 27, fig. 27 is a schematic structural diagram of a rendering device for a target object according to an embodiment of the present application. The object rendering apparatus 500 may include a first acquisition module 501, a first diffuse reflection module 502, and a first rendering module 503.
The first obtaining module 501 is configured to obtain model information and light source information of a target object, where the model information includes normal information.
The first diffuse reflection module 502 is configured to determine, according to the normal information and the light source information, the diffuse reflection illumination coefficient of each of the pixels to be rendered of the target object with a preset diffuse reflection function, where the preset diffuse reflection function is differentiable at the position where its function value is zero and its derivative there is zero; and to determine the diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information.
The first diffuse reflection module 502 includes a diffuse reflection coefficient determining module and a diffuse reflection illumination value determining module. The diffuse reflection coefficient determining module is configured to determine the diffuse reflection illumination coefficient of each of the pixels to be rendered of the target object with the preset diffuse reflection function according to the normal information and the light source information, where the preset diffuse reflection function is differentiable at the position where its function value is zero and its derivative there is zero. The diffuse reflection illumination value determining module is configured to determine the diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information.
In an embodiment, the diffuse reflection coefficient determining module is configured to obtain, for each of the R channel, the G channel, and the B channel, a different attenuation coefficient set in advance; determining the diffuse reflection illumination coefficients of each pixel in the pixels to be rendered in the R channel, the G channel and the B channel respectively by utilizing the preset diffuse reflection function according to the normal information, the light source information and the set different attenuation coefficients; and determining the diffuse reflection illumination coefficients of the R channel, the G channel and the B channel as the diffuse reflection illumination coefficient of each pixel in the pixels to be rendered.
In an embodiment, the light source information includes the transmission ratio of the light source, and the diffuse reflection coefficient determining module is further configured to: when the transmission ratio is 0, set the attenuation coefficient of each of the R, G and B channels of each pixel to a default attenuation coefficient; when the transmission ratio is 1, set it to a preset attenuation coefficient; and when the transmission ratio is between 0 and 1, set it to an interpolated attenuation coefficient obtained by interpolating between the default attenuation coefficient and the preset attenuation coefficient.
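A sketch of the attenuation-coefficient selection described by this module; the per-channel tuples and the linear blend are assumptions consistent with the interpolation described above:

    def channel_attenuation(transmission_ratio, default_k, preset_k):
        # default_k / preset_k are (R, G, B) attenuation coefficients;
        # ratio 0 keeps the default, ratio 1 the preset, and values in
        # between blend linearly (the interpolated attenuation coefficient).
        t = max(0.0, min(1.0, transmission_ratio))
        return tuple(d + (p - d) * t for d, p in zip(default_k, preset_k))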
In an embodiment, after determining the diffuse reflection illumination coefficient of each pixel in the pixels to be rendered, the diffuse reflection coefficient determining module is further configured to perform correction processing on the shadow coefficient of each pixel in the pixels to be rendered to obtain a corrected shadow coefficient; and determining the final diffuse reflection illumination coefficient of each pixel according to the diffuse reflection illumination coefficient of each pixel and the corrected shadow coefficient.
The first rendering module 503 performs stylized rendering on the target object according to the diffuse reflection illumination value.
In an embodiment, as shown in fig. 27, the rendering apparatus 500 of the target object may further include a first specular reflection module 504. The first specular reflection module 504 is configured to determine the specular reflection illumination coefficient of each of the pixels to be rendered in the manner of a physical-based rendering pipeline, and to determine the specular reflection illumination value of each pixel according to the specular reflection illumination coefficient, the shadow coefficient or the corrected shadow coefficient, and the light source information, where the corrected shadow coefficient is obtained by correction processing of the shadow coefficient of each pixel; or to generate the high light intensity of each pixel according to the texture coordinates of each of the pixels to be rendered of the target object, and to determine the specular reflection illumination value of each pixel according to the high light intensity, the highlight color, the corrected shadow coefficient and the light source information. Correspondingly, the first rendering module 503 is configured to perform stylized rendering on the target object according to the diffuse reflection illumination value and the specular reflection illumination value.
In an embodiment, as shown in fig. 27, the rendering device 500 of the target object further includes a first indirect light module 505, where the first indirect light module 505 is configured to perform indirect diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or perform indirect specular reflection processing on each pixel in the pixels to be rendered of the target object by using the roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing; and determining the indirect light illumination value in the pixel to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value. Correspondingly, a first rendering module 503 is configured to perform stylized rendering on the target object according to the diffuse reflection illumination value and the indirect light illumination value; or performing stylized rendering on the target object according to the diffuse reflection illumination value, the specular reflection illumination value, the indirect light diffuse reflection illumination value and/or the indirect specular reflection illumination value.
In an embodiment, as shown in fig. 28, a schematic structural diagram of a rendering device for a target object according to an embodiment of the present application is shown. The object rendering apparatus 600 may include a second acquisition module 601, a second specular reflection module 602, and a second rendering module 603.
A second obtaining module 601, configured to obtain model information and light source information of a target object, where the model information includes normal information; and obtaining texture coordinates, shadow coefficients and highlight colors of each pixel in the pixels to be rendered of the target object.
A second specular reflection module 602, configured to generate a high light intensity of each of the pixels to be rendered according to the texture coordinates of the target object; and determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the shading coefficient or the corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correcting according to the shading coefficient of each pixel.
In an embodiment, the second specular reflection module 602 includes a high light intensity determining module and a specular reflection illumination value determining module, where the high light intensity determining module is configured to generate, according to the texture coordinates of the target object, a high light intensity of each pixel in the pixels to be rendered, and the specular reflection illumination value determining module is configured to determine, according to the high light intensity, the high light color, the shading coefficient or the modified shading coefficient, and the light source information, a specular reflection illumination value of each pixel, where the modified shading coefficient is obtained by performing a modification process according to the shading coefficient of each pixel.
In an embodiment, the high light intensity determining module is specifically configured to determine, according to the texture coordinates, a high light coefficient of each pixel in the pixels to be rendered; and generating the high light intensity of each pixel in the pixels to be rendered according to the high light coefficient and the preset high light intensity value.
In an embodiment, the object includes a first object, and the high light intensity determining module specifically performs, when performing the step of determining the high light coefficient of each of the pixels to be rendered according to the texture coordinates: acquiring a highlight texture map of the first target object; determining sampling texture coordinates of each pixel in the pixels to be rendered according to the texture coordinates, normal line information in the model information of the first target object and the position relation of the virtual camera; and sampling the highlight texture map according to the sampling texture coordinates to generate a high light coefficient of each of the pixels to be rendered.
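As a hedged illustration of this embodiment, the sketch below derives the sampling texture coordinates from the camera orientation projected onto the plane of the object (e.g. an eyeball). The parameters move_speed and scale are hypothetical stand-ins for the moving speed of the highlight point following the line of sight and the scaling data of the highlight point; none of the names come from the application itself.

```python
# Illustrative sketch of sampling texture coordinates for an eyeball-like
# first target object. plane_up / plane_right are the vertical and horizontal
# vectors on the plane of the eyeball; move_speed and scale are hypothetical
# parameters standing in for the moving speed of the highlight point following
# the line of sight and the scaling data of the highlight point.
import numpy as np

def sampling_uv(uv, cam_forward, plane_up, plane_right, move_speed, scale):
    # Project the camera orientation vector onto the two plane axes to obtain
    # a 2D offset, then shift and scale the original texture coordinates.
    offset = np.array([np.dot(cam_forward, plane_right),
                       np.dot(cam_forward, plane_up)])
    return (np.asarray(uv, dtype=float) + move_speed * offset) * scale
```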
In an embodiment, the object includes a second object, and the high light intensity determining module specifically performs, when performing the step of determining the high light coefficient of each of the pixels to be rendered according to the texture coordinates: determining a preset texture coordinate of the second target object, wherein the preset texture coordinate is obtained by expanding the texture coordinate of the surface patch corresponding to the second target object into a texture coordinate vertically arranged along the V-axis coordinate direction; obtaining a preset function which is built in advance, wherein the preset function comprises a highlight position offset coefficient and a highlight thickness coefficient, the maximum value of the preset function is controllable, the independent variable range of the preset function is controllable, and the position of the maximum value in the independent variable range is controllable; setting a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient; and generating a highlight coefficient of each pixel in the pixels to be rendered of the second target object by utilizing the preset function according to the preset texture coordinates, the highlight position offset value and the highlight thickness value.
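For illustration, one function satisfying the three controllability requirements (controllable maximum value, controllable independent variable range, and controllable position of the maximum within that range) is a smoothstep pulse along the V coordinate. This concrete form is an assumption, since the application does not disclose the preset function itself.

```python
# Illustrative sketch of a "preset function" meeting the three requirements:
# controllable maximum value (peak), controllable independent variable range
# (thickness), and controllable position of the maximum (offset). The
# smoothstep-pulse form is an assumption; the application does not disclose
# the concrete function.

def smoothstep(e0, e1, x):
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def highlight_coefficient(v, offset, thickness, peak=1.0):
    # Pulse of height `peak` centred at v = offset (the highlight position
    # offset value), falling smoothly to zero over `thickness` on each side
    # (the highlight thickness value); thickness must be positive.
    rise = smoothstep(offset - thickness, offset, v)
    fall = 1.0 - smoothstep(offset, offset + thickness, v)
    return peak * rise * fall
```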
In an embodiment, the object further includes a third object, and the high light intensity determining module specifically performs, when performing the step of determining the high light coefficient of each of the pixels to be rendered according to the texture coordinates: acquiring a highlight texture map of the third target object; sampling the highlight texture map according to the texture coordinates of each pixel in the pixels to be rendered of the third target object to obtain new texture coordinates; obtaining a preset function which is built in advance, wherein the preset function comprises a highlight position offset coefficient and a highlight thickness coefficient, the maximum value of the preset function is controllable, the independent variable range of the preset function is controllable, and the position of the maximum value in the independent variable range is controllable; setting a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient; and generating a highlight coefficient of each pixel in the pixels to be rendered of the third target object by utilizing the preset function according to the new texture coordinates, the highlight position offset value and the highlight thickness value.
And the second rendering module 603 is configured to perform stylized rendering on the target object according to the specular reflection illumination value.
In an embodiment, the rendering device 600 of the target object may further include a second indirect light module 604 (the same as the first indirect light module), where the second indirect light module is configured to perform indirect diffuse reflection processing on each pixel in the pixel to be rendered of the target object to obtain an indirect diffuse reflection illumination value by using the indirect light information and the normal line information, and/or perform indirect specular reflection processing on each pixel in the pixel to be rendered of the target object to obtain an indirect specular reflection illumination value by using the roughness and the normal line information, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing; and determining the indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value. Correspondingly, the second rendering module 603 is further configured to stylized render the target object according to the specular reflection illumination value and the indirect light illumination value.
In an embodiment, the rendering device 600 of the target object may further include a second diffuse reflection module (the same as the first diffuse reflection module), where the second diffuse reflection module is configured to determine, according to the normal line information and the light source information, a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function, where the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative at that position is zero; and determine the diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information.
In an embodiment, as shown in fig. 29, a schematic structural diagram of a rendering device for a target object according to an embodiment of the present application is shown. The object rendering apparatus 700 may include a third acquisition module 701, a third indirect light module 702, and a third rendering module 703.
The third obtaining module 701 is configured to obtain model information and light source information of the target object, where the model information includes normal line information, and the light source information includes indirect light information.
A third indirect light module 702, configured to perform indirect diffuse reflection processing on each pixel in the pixel to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or perform indirect specular reflection processing on each pixel in the pixel to be rendered of the target object by using the roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing; and determining the indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value. The third indirect light module 702 is identical to the first indirect light module.
In one embodiment, the third indirect light module 702 includes an indirect light diffuse reflection module, an indirect light specular reflection module and an indirect light illumination value determining module. The indirect light diffuse reflection module is configured to perform indirect light diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect light diffuse reflection illumination value. The indirect light specular reflection module is configured to perform indirect light specular reflection processing on each pixel in the pixels to be rendered of the target object by using the roughness and the normal line information to obtain an indirect light specular reflection illumination value. The indirect light illumination value determining module is configured to determine the indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value.
In an embodiment, the indirect light information includes an indirect light color and an indirect light intensity, and the indirect light diffuse reflection module is specifically configured to set a spatial axis, and set a plurality of indirect light colors and a plurality of indirect light intensities along the spatial axis; determine an indirect light diffuse reflection value of each pixel in the pixels to be rendered of the target object according to the spatial axis, the normal information, the plurality of indirect light colors and the plurality of indirect light intensities; and determine the indirect light diffuse reflection illumination value of each pixel according to the indirect light diffuse reflection value, the inherent color of each pixel and the indirect light occlusion intensity.
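A minimal sketch of this axis-based term follows; the choice of the world-space up axis and of linear interpolation along the colour/intensity ladder are both assumptions, as are all names below.

```python
# Illustrative sketch of the axis-based indirect light diffuse term. The
# spatial axis is assumed to be the world-space up axis and the interpolation
# along the colour/intensity ladder is assumed to be linear; both are
# assumptions, as is every name below.
import numpy as np

def indirect_diffuse_value(normal, axis, colors, intensities):
    # Map normal/axis alignment from [-1, 1] onto a ladder position, then
    # linearly interpolate the two neighbouring colour/intensity entries.
    t = (np.dot(normal, axis) + 1.0) * 0.5
    pos = t * (len(colors) - 1)
    i = min(int(pos), len(colors) - 2)
    f = pos - i
    color = (1.0 - f) * np.asarray(colors[i]) + f * np.asarray(colors[i + 1])
    intensity = (1.0 - f) * intensities[i] + f * intensities[i + 1]
    return color * intensity

def indirect_diffuse_illumination(diffuse_value, inherent_color, occlusion):
    # Modulate by the pixel's inherent colour and the indirect light
    # occlusion intensity, as in the module above.
    return diffuse_value * np.asarray(inherent_color) * occlusion
```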
In an embodiment, the indirect light specular reflection module is specifically configured to determine an indirect light specular reflection value of each pixel in the pixels to be rendered of the target object according to the normal information and the roughness; obtain the reflection color of each pixel of the target object; and determine the indirect light specular reflection illumination value of each pixel according to the indirect light specular reflection value, the reflection color of the target object and the indirect light occlusion intensity.
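A hedged sketch of the decoupled indirect specular term is given below. The environment lookup env_probe is a hypothetical prefiltered-probe sampler indexed by reflection direction and roughness, a standard image-based-lighting arrangement that the embodiment is assumed to build on; the application itself does not specify the lookup.

```python
# Illustrative sketch of the decoupled indirect light specular term. The
# environment lookup `env_probe` is a hypothetical prefiltered-probe sampler
# indexed by reflection direction and roughness (a standard image-based
# lighting arrangement); `normal` and `view_dir` are assumed normalized.
import numpy as np

def indirect_specular_illumination(normal, view_dir, roughness,
                                   env_probe, reflect_color, occlusion):
    n = np.asarray(normal, dtype=float)
    v = np.asarray(view_dir, dtype=float)
    r = 2.0 * np.dot(n, v) * n - v              # reflect the view vector about n
    spec = np.asarray(env_probe(r, roughness))  # RGB from the probe
    return spec * np.asarray(reflect_color) * occlusion
```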
And a third rendering module 703, configured to perform stylized rendering on the target object according to the indirect light illumination value of each pixel.
In an embodiment, the rendering device 700 of the target object may further include a third diffuse reflection module, where the third diffuse reflection module is consistent with the first diffuse reflection module, and is not described herein.
In an embodiment, the rendering device 700 of the object may further include a third specular reflection module, where the third specular reflection module is consistent with the first specular reflection module, and is not described herein.
All the above technical solutions may be combined to form an optional embodiment of the present application, which is not described here in detail.
Correspondingly, the embodiment of the application also provides computer equipment, which can be a terminal or a server. As shown in fig. 30, fig. 30 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 800 includes a processor 801 having one or more processing cores, a memory 802 having one or more computer readable storage media, and a computer program stored on the memory 802 and executable on the processor. The processor 801 is electrically connected to the memory 802.
The processor 801 is a control center of the computer device 800, connects various parts of the entire computer device 800 using various interfaces and lines, and performs various functions of the computer device 800 and processes data by running or loading software programs (computer programs) and/or modules stored in the memory 802, and calling data stored in the memory 802, thereby performing overall monitoring of the computer device 800.
In the embodiment of the present application, the processor 801 in the computer device 800 loads the instructions corresponding to the processes of one or more application programs/computer programs into the memory 802, and the processor 801 runs the application programs/computer programs stored in the memory 802 to implement various functions, for example:
obtaining model information and light source information of a target object, wherein the model information comprises normal line information; determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, wherein the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative at that position is zero; determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information; performing stylized rendering on the target object according to the diffuse reflection illumination value;
or, alternatively,
obtaining model information and light source information of a target object, wherein the model information comprises normal line information; obtaining texture coordinates, shadow coefficients and highlight colors of each pixel in the pixels to be rendered of the target object, and generating high light intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object; determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the shading coefficient or the corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correction processing according to the shading coefficient of each pixel; performing stylized rendering on the target object according to the specular reflection illumination value;
or, alternatively,
obtaining model information and light source information of a target object, wherein the model information comprises normal line information, and the light source information comprises indirect light information; performing indirect diffuse reflection processing on each pixel in the pixel to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or performing indirect specular reflection processing on each pixel in the pixel to be rendered of the target object by using roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing; determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value; and performing stylized rendering on the target object according to the indirect light illumination value.
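As a minimal illustration of how the outputs of these three branches could be combined for one pixel, the sketch below uses a plain per-channel sum; this composition is an assumption, since the summary above allows each branch to be used alone or together.

```python
# Illustrative sketch of combining the branch outputs for one pixel. The
# plain per-channel sum is an assumption; the summary above allows each
# branch to be used alone or together.

def shade_pixel(diffuse_rgb, specular_rgb, indirect_rgb):
    # Any disabled branch simply contributes (0.0, 0.0, 0.0).
    return tuple(d + s + i
                 for d, s, i in zip(diffuse_rgb, specular_rgb, indirect_rgb))
```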
The processor may execute the steps/operations in any of the foregoing method embodiments; for the specific implementation of each operation and the beneficial effects achieved, reference may be made to the foregoing embodiments, which are not repeated here.
Optionally, as shown in fig. 30, the computer device 800 further includes: a touch display 803, a radio frequency circuit 804, an audio circuit 805, an input unit 806, and a power supply 807. The processor 801 is electrically connected to the touch display 803, the radio frequency circuit 804, the audio circuit 805, the input unit 806, and the power supply 807, respectively. Those skilled in the art will appreciate that the computer device structure shown in FIG. 30 is not limiting of the computer device and may include more or fewer components than shown, or may be a combination of certain components, or a different arrangement of components.
The touch display 803 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 803 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions that trigger the corresponding programs. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 801 to determine the type of the touch event, and the processor 801 then provides a corresponding visual output on the display panel based on that type. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 803 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions; that is, the touch display 803 may also implement an input function as part of the input unit 806.
In the embodiment of the present application, the touch display 803 is configured to present a graphical user interface and receive an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 804 may be used to transceive radio frequency signals to establish wireless communication with a network device or other computer device via wireless communication.
Audio circuitry 805 may be used to provide an audio interface between a user and the computer device through a speaker and a microphone. The audio circuit 805 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 805 and converted into audio data, and the audio data is processed by the audio output processor 801 and then sent, for example, to another computer device via the radio frequency circuit 804, or output to the memory 802 for further processing. The audio circuitry 805 may also include an earbud jack to provide communication between peripheral headphones and the computer device.
The input unit 806 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
A power supply 807 is used to power the various components of the computer device 800. Optionally, the power supply 807 may be logically connected to the processor 801 through a power management system, so that charging, discharging, and power consumption management are handled through the power management system. The power supply 807 may also include one or more direct current or alternating current power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other such components.
Although not shown in fig. 30, the computer device 800 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of computer programs/instructions capable of being loaded by a processor to perform steps in a method for rendering an object in any of the embodiments provided herein. For example, the computer program may perform the steps of:
Obtaining model information and light source information of a target object, wherein the model information comprises normal line information; determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, wherein the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative at that position is zero; determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information; performing stylized rendering on the target object according to the diffuse reflection illumination value;
or, alternatively,
obtaining model information and light source information of a target object, wherein the model information comprises normal line information; obtaining texture coordinates, shadow coefficients and highlight colors of each pixel in the pixels to be rendered of the target object, and generating high light intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object; determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the shading coefficient or the corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correction processing according to the shading coefficient of each pixel; performing stylized rendering on the target object according to the specular reflection illumination value;
or, alternatively,
obtaining model information and light source information of a target object, wherein the model information comprises normal line information, and the light source information comprises indirect light information; performing indirect diffuse reflection processing on each pixel in the pixel to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or performing indirect specular reflection processing on each pixel in the pixel to be rendered of the target object by using roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing; determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value; and performing stylized rendering on the target object according to the indirect light illumination value.
The specific implementation of each operation and the obtained beneficial effects can be found in the previous embodiments, and will not be described herein.
Wherein the storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The computer program stored in the storage medium can execute the steps in any rendering method of a target object provided in the embodiments of the present application, and can therefore achieve the beneficial effects achievable by any such rendering method; see the foregoing embodiments for details, which are not repeated here.
The foregoing has described in detail the rendering method, apparatus, storage medium and computer device for a target object provided in the embodiments of the present application, and specific examples have been applied herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, a person skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the present application. In view of the above, the content of this description should not be construed as limiting the present application.

Claims (29)

1. A method of rendering an object, comprising:
obtaining model information and light source information of a target object, wherein the model information comprises normal line information;
determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, wherein the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative at that position is zero;
Determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information;
and performing stylized rendering on the target object according to the diffuse reflection illumination value.
2. The method according to claim 1, wherein the preset diffuse reflection function includes an attenuation coefficient of diffuse reflection, each pixel includes an R channel, a G channel and a B channel, and the step of determining, according to the normal information and the light source information, a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object using the preset diffuse reflection function includes:
for each channel of the R channel, the G channel and the B channel, obtaining different attenuation coefficients which are preset;
determining the diffuse reflection illumination coefficients of each pixel in the pixels to be rendered in the R channel, the G channel and the B channel respectively by utilizing the preset diffuse reflection function according to the normal information, the light source information and the set different attenuation coefficients;
and determining the diffuse reflection illumination coefficients of the R channel, the G channel and the B channel as the diffuse reflection illumination coefficient of each pixel in the pixels to be rendered.
3. The method of claim 2, wherein the step of obtaining a different attenuation coefficient set in advance for each of the R channel, the G channel, and the B channel comprises: and acquiring different preset attenuation coefficients for different channels of different pixels.
4. The method of claim 2, wherein the light source information comprises a transmittance ratio of the light source, the method further comprising:
when the transmittance ratio is zero, setting the attenuation coefficient of each of the R channel, the G channel and the B channel of each pixel as a default attenuation coefficient;
when the transmittance ratio is 1, setting the attenuation coefficient of each of the R channel, the G channel and the B channel of each pixel as a preset attenuation coefficient;
when the transmittance ratio is between 0 and 1, setting the attenuation coefficient of each of the R channel, the G channel and the B channel as an interpolated attenuation coefficient, wherein the interpolated attenuation coefficient is obtained by interpolating between the default attenuation coefficient and the preset attenuation coefficient.
5. The method of claim 2, further comprising, after the step of determining the diffuse reflection illumination coefficients for each of the pixels to be rendered:
correcting the shadow coefficient of each pixel in the pixels to be rendered to obtain corrected shadow coefficients;
and determining the final diffuse reflection illumination coefficient of each pixel according to the diffuse reflection illumination coefficient of each pixel and the corrected shadow coefficient.
6. The method of claim 5, wherein the light source information includes a light direction, and the step of correcting the shadow coefficient of each pixel in the pixels to be rendered to obtain a corrected shadow coefficient comprises:
obtaining the shadow coefficient of each pixel, and obtaining a first diffuse reflection illumination coefficient and a maximum shadow coefficient corresponding to the case where the normal information is perpendicular to the light direction;
and correcting the shadow coefficient by using the first diffuse reflection illumination coefficient and the maximum shadow coefficient to obtain a corrected shadow coefficient.
7. The method of claim 1, further comprising, prior to the step of stylized rendering the target object based on the diffuse reflected illumination values:
determining a specular reflection illumination coefficient of each of the pixels to be rendered in the manner used in a physically based rendering pipeline;
determining a specular reflection illumination value of each pixel according to the specular reflection illumination coefficient, a shading coefficient or a corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correcting according to the shading coefficient of each pixel;
Or alternatively, the process may be performed,
generating high light intensity of each pixel in the pixels to be rendered according to the texture coordinates of each pixel in the pixels to be rendered of the target object;
determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the corrected shading coefficient and the light source information;
the step of performing stylized rendering on the target object according to the diffuse reflection illumination value includes:
and performing stylized rendering on the target object according to the diffuse reflection illumination value and the specular reflection illumination value.
8. The method of claim 1 or 7, wherein the light source information further comprises indirect light information, and further comprising, prior to the step of stylized rendering the target object based on the diffuse reflected illumination value:
performing indirect diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or performing indirect specular reflection processing on each pixel in the pixels to be rendered of the target object by using roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing;
And determining the indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value.
9. A method of rendering an object, comprising:
obtaining model information and light source information of a target object, wherein the model information comprises normal line information;
obtaining texture coordinates, shadow coefficients and highlight colors of each pixel in the pixels to be rendered of the target object, and generating high light intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object;
determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the shading coefficient or the corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correction processing according to the shading coefficient of each pixel;
and performing stylized rendering on the target object according to the specular reflection illumination value.
10. The method of claim 9, wherein the step of generating a high light intensity for each of the pixels to be rendered from the texture coordinates of the object comprises:
Determining the high light coefficient of each pixel in the pixels to be rendered according to the texture coordinates;
and generating the high light intensity of each pixel in the pixels to be rendered according to the high light coefficient and the preset high light intensity value.
11. The method of claim 10, wherein the object comprises a first object, and wherein the step of determining the high light coefficient of each of the pixels to be rendered based on the texture coordinates comprises:
acquiring a highlight texture map of the first target object;
determining sampling texture coordinates of each pixel in the pixels to be rendered according to the texture coordinates, normal line information in the model information of the first target object and the position relation of the virtual camera;
sampling the highlight texture map according to the sampling texture coordinates to generate a high light coefficient of each of the pixels to be rendered.
12. The method of claim 11, wherein the first object is a virtual eyeball, and the step of determining the sampling texture coordinates of each pixel to be rendered according to the texture coordinates, the normal information in the model information of the first object, and the positional relationship of the virtual camera includes:
When a virtual camera is looking at the virtual eyeball, acquiring a vertical vector in the vertical direction and a horizontal vector in the horizontal direction on a plane on which the virtual eyeball is positioned;
determining an offset between the virtual camera and the virtual eyeball according to the vertical vector, the horizontal vector and the orientation vector of the virtual camera;
and determining the sampling texture coordinates of each pixel in the pixels to be rendered of the first target object according to the texture coordinates, the offset, the moving speed of the highlight point following the line of sight, and the scaling data of the highlight point.
13. The method according to claim 12, wherein the step of acquiring a vertical vector in a vertical direction and a horizontal vector in a horizontal direction on a plane on which the virtual eyeball is located includes:
acquiring a bone vector of a bone bound with a virtual eyeball;
and determining a vertical vector in the vertical direction and a horizontal vector in the horizontal direction on a plane where the virtual eyeball is located according to the skeleton vector.
14. The method according to claim 12, wherein the step of acquiring a vertical vector in a vertical direction and a horizontal vector in a horizontal direction on a plane on which the virtual eyeball is located includes:
Copying the model information of the virtual eyeballs to obtain two models of the virtual eyeballs;
setting the normal vectors of the two virtual eyeball models as the vertical vector in the vertical direction and the horizontal vector in the horizontal direction on the plane where the virtual eyeball is located.
15. The method of claim 10, wherein the object comprises a second object, and wherein the step of determining the high light coefficient of each of the pixels to be rendered based on the texture coordinates comprises:
determining a preset texture coordinate of the second target object, wherein the preset texture coordinate is obtained by expanding the texture coordinate of the surface patch corresponding to the second target object into a texture coordinate vertically arranged along the V-axis coordinate direction;
obtaining a preset function which is built in advance, wherein the preset function comprises a highlight position offset coefficient and a highlight thickness coefficient, the maximum value of the preset function is controllable, the independent variable range of the preset function is controllable, and the position of the maximum value in the independent variable range is controllable;
setting a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient;
And generating a highlight coefficient of each pixel in the pixels to be rendered of the second target object by utilizing the preset function according to the preset texture coordinates, the highlight position offset value and the highlight thickness value.
16. The method of claim 15, wherein setting the highlight thickness value corresponding to the highlight thickness coefficient comprises:
varying the highlight thickness coefficient in the preset function with the U value in the U-axis direction of the preset texture coordinates to obtain a changed highlight thickness value;
the step of generating the highlight coefficient of each pixel in the pixels to be rendered of the second target object by using the preset function according to the preset texture coordinates, the highlight position offset value and the highlight thickness value includes: and generating the highlight coefficient of each pixel in the pixels to be rendered of the second target object by utilizing the preset function according to the V value of the V axis direction in the preset texture coordinates, the highlight position offset value and the changed highlight thickness value.
17. The method of claim 16, wherein:
after the step of setting the highlight position offset value corresponding to the highlight position offset coefficient, the method further includes: performing noise processing on the highlight position offset value by using a first noise function to obtain a final highlight position offset value;
And/or
After the varying highlight thickness value is obtained, further comprising: and carrying out noise processing on the changed highlight thickness value by using a second noise function so as to obtain a final highlight thickness value.
18. The method of claim 10, wherein the object further comprises a third object, and wherein the step of determining the high light coefficient of each of the pixels to be rendered based on the texture coordinates comprises:
acquiring a highlight texture map of the third target object;
sampling the highlight texture map according to the texture coordinates of each pixel in the pixels to be rendered of the third target object to obtain new texture coordinates;
obtaining a preset function which is built in advance, wherein the preset function comprises a highlight position offset coefficient and a highlight thickness coefficient, the maximum value of the preset function is controllable, the independent variable range of the preset function is controllable, and the position of the maximum value in the independent variable range is controllable;
setting a highlight position offset value corresponding to the highlight position offset coefficient and a highlight thickness value corresponding to the highlight thickness coefficient;
and generating a highlight coefficient of each pixel in the pixels to be rendered of the third target object by utilizing the preset function according to the new texture coordinates, the highlight position offset value and the highlight thickness value.
19. The method of claim 9, wherein the light source information further comprises indirect light information, and further comprising, prior to the step of stylized rendering the target object based on the specular reflected illumination values:
performing indirect diffuse reflection processing on each pixel in the pixels to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or performing indirect specular reflection processing on each pixel in the pixels to be rendered of the target object by using roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing;
determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value;
the step of performing stylized rendering on the target object according to the specular reflection illumination value includes: and performing stylized rendering on the target object according to the specular reflection illumination value and the indirect light illumination value.
20. A method of rendering an object, comprising:
Obtaining model information and light source information of a target object, wherein the model information comprises normal line information, and the light source information comprises indirect light information;
performing indirect diffuse reflection processing on each pixel in the pixel to be rendered of the target object by using the indirect light information and the normal line information to obtain an indirect diffuse reflection illumination value, and/or performing indirect specular reflection processing on each pixel in the pixel to be rendered of the target object by using roughness and the normal line information to obtain an indirect specular reflection illumination value, so as to decouple the indirect diffuse reflection processing and the indirect specular reflection processing;
determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value;
and performing stylized rendering on the target object according to the indirect light illumination value.
21. The method according to claim 20, wherein the indirect light information includes an indirect light color and an indirect light intensity, and the step of performing indirect light diffuse reflection processing on each pixel of the pixels to be rendered of the target object by using the indirect light information and the normal information to obtain an indirect light diffuse reflection illumination value includes:
Setting a spatial axis and setting a plurality of indirect light colors and a plurality of indirect light intensities along the spatial axis;
determining an indirect light diffuse reflection value of each pixel in the pixels to be rendered of the target object according to the spatial axis, the normal information, a plurality of indirect light colors and a plurality of indirect light intensities;
and determining the indirect light diffuse reflection illumination value of each pixel according to the indirect light diffuse reflection value, the inherent color of each pixel of the target object and the indirect light occlusion intensity.
22. The method of claim 21, wherein the step of determining an indirect light diffuse reflection value of each pixel in the pixels to be rendered of the target object according to the spatial axis, the normal information, a plurality of indirect light colors and a plurality of indirect light intensities comprises:
determining a relationship between the normal information and the spatial axis;
and according to the relation, performing interpolation processing on a plurality of indirect light colors and a plurality of indirect light intensities to obtain an indirect light diffuse reflection value of each pixel in the pixels to be rendered of the target object.
23. The method of claim 20, wherein the step of performing indirect specular reflection processing on each of the pixels to be rendered of the target object using roughness and normal information to obtain an indirect specular reflection illumination value includes:
Determining an indirect light specular reflection value of each pixel in the pixels to be rendered of the target object according to the normal line information and the roughness;
obtaining the reflection color of each pixel of the target object;
and determining an indirect light specular reflection illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light specular reflection value, the reflection color of the target object and the indirect light occlusion intensity.
24. The method of claim 23, wherein the step of obtaining a reflected color for each pixel of the target comprises:
and determining the reflection color of each pixel of the target object according to the roughness and the inherent color of each pixel in the pixels to be rendered of the target object.
25. A rendering apparatus for an object, comprising:
the first acquisition module is used for acquiring model information and light source information of a target object, wherein the model information comprises normal line information;
the first diffuse reflection module is used for determining a diffuse reflection illumination coefficient of each pixel in the pixels to be rendered of the target object by using a preset diffuse reflection function according to the normal information and the light source information, and determining a diffuse reflection illumination value of each pixel according to the diffuse reflection illumination coefficient and the light source information, wherein the preset diffuse reflection function is differentiable at the position where its function value is zero, and its derivative at that position is zero;
And the first rendering module is used for performing stylized rendering on the target object according to the diffuse reflection illumination value.
26. A rendering apparatus for an object, comprising:
the second acquisition module is used for acquiring model information and light source information of the target object, wherein the model information comprises normal line information; obtaining texture coordinates, shadow coefficients and highlight colors of each pixel in the pixels to be rendered of the target object;
the second specular reflection module is used for generating high light intensity of each pixel in the pixels to be rendered according to the texture coordinates of the target object; determining a specular reflection illumination value of each pixel according to the high light intensity, the high light color, the shading coefficient or the corrected shading coefficient, and the light source information, wherein the corrected shading coefficient is obtained by correction processing according to the shading coefficient of each pixel;
and the second rendering module is used for performing stylized rendering on the target object according to the specular reflection illumination value.
27. A rendering apparatus for an object, comprising:
the third acquisition module is used for acquiring model information and light source information of the target object, wherein the model information comprises normal line information, and the light source information comprises indirect light information;
The third indirect light module is used for performing indirect light diffuse reflection processing on each pixel in the pixel to be rendered of the target object by utilizing the indirect light information and the normal line information to obtain an indirect light diffuse reflection illumination value, and/or performing indirect light specular reflection processing on each pixel in the pixel to be rendered of the target object by utilizing the roughness and the normal line information to obtain an indirect light specular reflection illumination value so as to decouple the indirect light diffuse reflection processing and the indirect light specular reflection processing; determining an indirect light illumination value of each pixel in the pixels to be rendered of the target object according to the indirect light diffuse reflection illumination value and/or the indirect light specular reflection illumination value;
and the third rendering module is used for performing stylized rendering on the target object according to the indirect light illumination value.
28. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program adapted to be loaded by a processor to perform the steps in the method of rendering an object according to any one of claims 1-24.
29. A computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor performs the steps in the method of rendering an object according to any one of claims 1-24 by invoking the computer program stored in the memory.
CN202211527177.8A 2022-11-30 2022-11-30 Rendering method and device of target object, storage medium and computer equipment Pending CN116363288A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211527177.8A CN116363288A (en) 2022-11-30 2022-11-30 Rendering method and device of target object, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
CN116363288A true CN116363288A (en) 2023-06-30

Family

ID=86940501

Country Status (1)

Country Link
CN (1) CN116363288A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437345A (en) * 2023-12-22 2024-01-23 山东捷瑞数字科技股份有限公司 Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine
CN117437345B (en) * 2023-12-22 2024-03-19 山东捷瑞数字科技股份有限公司 Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine

Similar Documents

Publication Publication Date Title
US11257286B2 (en) Method for rendering of simulating illumination and terminal
WO2021129044A1 (en) Object rendering method and apparatus, and storage medium and electronic device
JP6864449B2 (en) Methods and devices for adjusting the brightness of the image
WO2020125785A1 (en) Hair rendering method, device, electronic apparatus, and storage medium
CN112116692A (en) Model rendering method, device and equipment
US20230120253A1 (en) Method and apparatus for generating virtual character, electronic device and readable storage medium
CN111583379B (en) Virtual model rendering method and device, storage medium and electronic equipment
WO2023066121A1 (en) Rendering of three-dimensional model
CN113240783A (en) Stylized rendering method and device, readable storage medium and electronic equipment
JP2004110597A (en) Image generating information, information storage medium, and image generating device
CN116363288A (en) Rendering method and device of target object, storage medium and computer equipment
CN114581979A (en) Image processing method and device
CN113822981B (en) Image rendering method and device, electronic equipment and storage medium
US9626774B2 (en) Saturation varying color space
CN116228943B (en) Virtual object face reconstruction method, face reconstruction network training method and device
CN111402385B (en) Model processing method and device, electronic equipment and storage medium
US8942476B1 (en) Saturation varying and lighting independent color color control for computer graphics
CN116958344A (en) Animation generation method and device for virtual image, computer equipment and storage medium
CN113888398B (en) Hair rendering method and device and electronic equipment
JP2003168130A (en) System for previewing photorealistic rendering of synthetic scene in real-time
JP2007328458A (en) Image forming program, computer-readable storage medium recording the program, image processor and image processing method
KR100900076B1 (en) Texturing System and Method for Border Lins is Natural
CN116524102A (en) Cartoon second-order direct illumination rendering method, device and system
Ostrovka et al. Development of a method for changing the surface properties of a three-dimensional user avatar
CN117095126A (en) Virtual model generation method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination