CN114419220A - Stylized rendering method and device for target object, electronic equipment and storage medium - Google Patents

Stylized rendering method and device for target object, electronic equipment and storage medium

Info

Publication number
CN114419220A
Authority
CN
China
Prior art keywords: determining, model, light, diffuse reflection, reflection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111601156.1A
Other languages
Chinese (zh)
Inventor
高子雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202111601156.1A
Publication of CN114419220A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G06T15/506 - Illumination models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a stylized rendering method and device for a target object, electronic equipment and a storage medium. The method comprises the following steps: acquiring light source information and the surface color of the target object, wherein the light source information includes scattered light information, reflected light information, and ambient light information; determining a diffuse reflection illumination model according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the ambient light information; determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model; and determining an illumination model by combining the diffuse reflection illumination model and the specular reflection model, and performing stylized rendering on the target object by using the illumination model. The application modifies the original principle of PBR and reduces art production cost: with a small number of intuitive parameters and a highly standardized workflow, non-photorealistic rendering of target objects of many different materials can be achieved quickly, meeting stylized art requirements.

Description

Stylized rendering method and device for target object, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a stylized rendering method and apparatus for a target object, an electronic device, and a storage medium.
Background
In the related art, when scenes or objects in a project require stylized rendering, the general-purpose PBR (physically based rendering) materials of the Unreal Engine are too photorealistic and cannot meet stylized art requirements. Moreover, since the Unreal Engine is highly dependent on the PBR workflow, directly modifying the engine to meet stylized art requirements greatly increases art production cost.
Disclosure of Invention
In view of the above, an object of the present application is to provide a method and an apparatus for stylized rendering of an object, an electronic device, and a storage medium.
In view of the above, in a first aspect, the present application provides a stylized rendering method for an object, comprising:
acquiring light source information and the surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information;
determining a diffuse reflection illumination model according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the environment light information;
determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model;
and combining the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and performing stylized rendering on the target by using the illumination model.
In a possible implementation manner, the determining a diffuse reflection illumination model according to a diffuse reflection fresnel factor, a surface color of the target object, the scattered light information, and the ambient light information further includes:
determining a Lambert model component according to the surface color of the target object and the ambient light information;
analyzing the target object to determine roughness, and determining a reflection component according to the surface color of the target object, the diffuse reflection Fresnel factor and the scattered light information;
determining an incident light related item influencing incident light consumed by reflection and an emergent light related item influencing emergent light consumed by reflection according to the scattered light information;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component, the incident light related item, the emergent light related item and the dot product result of the normal vector and the incident light.
In one possible implementation, the determining a Lambert model component according to the surface color of the object and the ambient light information further includes:
determining the brightness of the environment light according to the environment light information;
and determining the Lambert model component according to the ratio of the surface color of the target object to the ambient light brightness.
In a possible implementation manner, the analyzing the target object to determine a roughness, and determining a reflection component according to a surface color of the target object, the diffuse reflection fresnel factor, and the scattered light information further includes:
determining a diffuse reflection angle according to the scattered light information;
determining a roughness component according to the roughness and the diffuse reflection angle;
determining the reflection component based on the incident light correlation term, the emergent light correlation term, the diffuse reflection Fresnel factor, the surface color of the target object, and the roughness component.
In a possible implementation manner, the determining, according to the scattered light information, an incident light related term affecting incident light consumed by reflection and an outgoing light related term affecting outgoing light consumed by reflection further includes:
acquiring a first included angle between a surface normal and incident light and a second included angle between the surface normal and emergent light according to scattered light information;
and determining the incident light related item according to the first included angle, and determining the emergent light related item according to the second included angle.
In one possible implementation, the determining a roughness component according to the roughness and the diffuse reflection angle further includes:
and determining the roughness component according to the product of the roughness and the cosine value of the diffuse reflection angle.
In a possible implementation manner, the determining the diffuse reflection illumination model according to the Lambert model component and the reflection component further includes:
acquiring the surface gloss and the color tone of the target object in response to the fact that the surface of the target object is detected to be a fluff material;
determining a gloss component from the surface gloss, the hue, and the diffuse reflection angle;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component and the gloss component.
In one possible implementation, the method further includes:
acquiring a light and shade boundary part of the target object;
and sampling texture through a brush touch map to render the light and shade boundary part.
In a possible implementation manner, the determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model further includes:
acquiring half-range vectors of incident light and emergent light according to the reflected light information;
acquiring a third included angle between the half-range vector and the surface normal;
determining a diffuse reflection constant according to the diffuse reflection illumination model;
and determining the specular reflection model according to the half-range vector, the third included angle and the diffuse reflection constant on the basis of a Fresnel equation, a normal distribution function and a micro-surface distribution function.
In a second aspect, the present application provides an apparatus for stylized rendering of an object, comprising:
an acquisition module configured to acquire light source information and a surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information;
a first determination module configured to determine a diffuse reflection illumination model according to a diffuse reflection fresnel factor, a surface color of the target object, the scattered light information, and the ambient light information;
a second determination module configured to determine a specular reflection model from the reflected light information and the diffuse reflection illumination model;
a rendering module configured to combine the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and to stylize the object using the illumination model.
In a third aspect, the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method for stylized rendering of an object according to the first aspect when executing the program.
In a fourth aspect, the present application provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of stylized rendering of an object according to the first aspect.
From the above, the stylized rendering method, apparatus, electronic device and storage medium for a target object provided by the present application are modified based on the original principle of PBR: light source information and the surface color of the target object are obtained, where the light source information may include scattered light information, reflected light information, and ambient light information; a diffuse reflection illumination model is determined according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the ambient light information; a specular reflection model is further determined according to the reflected light information and the diffuse reflection illumination model; and an illumination model is determined by combining the diffuse reflection illumination model and the specular reflection model, with which the target object is stylized. Modifying the original principle of PBR reduces art production cost and converts a strict physical model into a shading model driven mainly by art direction. Owing to this artist-friendly orientation, non-photorealistic rendering of a large number of objects made of different materials can be achieved quickly with a small number of intuitive parameters and a highly standardized workflow, which avoids the problem that the general-purpose PBR material of the Unreal Engine is too photorealistic and meets stylized art requirements.
Drawings
In order to more clearly illustrate the technical solutions in the present application or the related art, the drawings needed in the description of the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 illustrates an exemplary flowchart of a method for stylized rendering of an object according to an embodiment of the present application.
FIG. 2 shows a schematic diagram of a bright-dark boundary according to an embodiment of the present application.
Fig. 3 shows a schematic diagram comparing a target object processed by the specular reflection model with a target object processed by the Unreal Engine according to an embodiment of the application.
Fig. 4 shows a schematic diagram comparing a target object rendered by the illumination model with a target object processed by the Unreal Engine according to an embodiment of the application.
Fig. 5 shows an exemplary structural diagram of a stylized rendering apparatus for an object provided by an embodiment of the present application.
Fig. 6 shows an exemplary structural schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described in detail below with reference to the accompanying drawings in combination with specific embodiments.
It should be noted that technical terms or scientific terms used in the embodiments of the present application should have the general meaning understood by those having ordinary skill in the art to which the present application belongs, unless otherwise defined. The use of "first," "second," and similar terms in the embodiments of the present application does not denote any order, quantity, or importance; rather, these terms are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the described object changes, the relative positional relationships may change accordingly.
As described in the background section, when light shines on the surface of an object, the object reflects, transmits, absorbs, diffracts, refracts, and interferes with the light; the portion absorbed by the object is converted into heat, and the reflected and transmitted light enters the human visual system, enabling us to see the object. To simulate this phenomenon, mathematical models can be established in place of complex physical models; such models are called shading models or illumination models. These models are based on optical physics: the illumination intensity is calculated from how light energy propagates in the real world, taking into account the interaction between light and the surfaces of all objects in the scene, including multiple reflections, transmission, scattering, and the like.
When illumination models of this type are used for stylized rendering of scenes or objects in a project, the general-purpose PBR (physically based rendering) materials of the Unreal Engine are too photorealistic and cannot meet stylized art requirements. Moreover, since the Unreal Engine is highly dependent on the PBR workflow, directly modifying the engine to meet stylized art requirements greatly increases art production cost.
Therefore, the stylized rendering method, device, electronic equipment and storage medium for a target object provided by the application are modified based on the original principle of PBR: light source information and the surface color of the target object are obtained, where the light source information may include scattered light information, reflected light information, and ambient light information; a diffuse reflection illumination model is determined according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the ambient light information; a specular reflection model is further determined according to the reflected light information and the diffuse reflection illumination model; and an illumination model is determined by combining the diffuse reflection illumination model and the specular reflection model, with which the target object is stylized. Modifying the original principle of PBR reduces art production cost and converts a strict physical model into a shading model driven mainly by art direction. Owing to this artist-friendly orientation, non-photorealistic rendering of a large number of objects made of different materials can be achieved quickly with a small number of intuitive parameters and a highly standardized workflow, which avoids the problem that the general-purpose PBR material of the Unreal Engine is too photorealistic and meets stylized art requirements.
The stylized rendering method for an object provided by the embodiment of the present application is specifically described below by using a specific embodiment.
Fig. 1 illustrates an exemplary flowchart of a method for stylized rendering of an object according to an embodiment of the present application.
Referring to fig. 1, a stylized rendering method for an object provided in an embodiment of the present application specifically includes the following steps:
s102: acquiring light source information and the surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information.
S104: and determining a diffuse reflection illumination model according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the environment light information.
S106: and determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model.
S108: and combining the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and performing stylized rendering on the target by using the illumination model.
In step S102, if the model has no illumination, only its texture map can be displayed, with some illumination and shadows faked directly in the map; in that case the color of the object can be seen, but there is no sense of volume. Therefore, a light source needs to be placed in the scene where the target object is located, so that the light source illuminates the scene and the scene and the target object in it appear three-dimensional.
If the color of the light in the scene changes but the appearance of the object surface does not, the result does not match what the human eye expects, so after the light source is set, the color of the light it emits must be taken into account; once the light source color changes, the color of the map on the object surface changes as well. In an illumination model, the light source information needs to be acquired, and since the color and texture of the object surface change once the target object is rendered, the surface color of the target object produced under the light source also needs to be acquired, so that the illumination model can be constructed. Thus, light source information may be obtained from light sources within the scene, which may include scattered light information, reflected light information, incident light information, outgoing light information, ambient light information, and the like. Furthermore, the surface color of the target object produced when it is irradiated by the light source can be acquired.
With respect to step S104, if only the light source is considered when rendering the target object, the brightness of the entire object becomes completely uniform. In the experience of the human eye, however, the back of the object should be dark, the sides dimmer, and the front fully lit, so the shape of the object itself must be taken into account. The normal map of the object accurately expresses its shape, and the angle between the normal and the illumination direction determines how much light the surface receives: the smaller the angle, the stronger the light; the larger the angle, the weaker the light, which matches the behavior of the cosine of the angle. In everyday environments, however, there is usually ambient light, and the back of the target object should not be completely dark; therefore, the diffuse reflection illumination model must be determined by considering not only the surface color of the target object and the scattered light information generated by scattering on its surface, but also the ambient light information.
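As an illustration of the cosine falloff and the ambient floor described above, the following minimal Python sketch (not part of the application; names and the ambient default are illustrative) computes a per-point diffuse intensity:

```python
import numpy as np

def lambert_intensity(normal, light_dir, ambient=0.1):
    """Toy diffuse intensity: the cosine of the normal/light angle drives the lit
    side, and an ambient floor keeps the back face from going fully black."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    n_dot_l = max(float(np.dot(n, l)), 0.0)  # surfaces facing away receive no direct light
    return min(ambient + (1.0 - ambient) * n_dot_l, 1.0)

# Front-facing point is bright, a 45-degree side is dimmer, the back keeps only ambient.
print(lambert_intensity(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 1.0])))   # ~1.0
print(lambert_intensity(np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 1.0])))   # ~0.74
print(lambert_intensity(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, -1.0])))  # 0.1
```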
In diffuse reflection, the diffuse term represents light that is refracted into the surface, scattered, partially absorbed, and re-emitted. Because some of the light is absorbed, the diffuse reflection is tinted by the surface color of the object; generally, any colored portion of a non-metallic material can be considered diffuse reflection. The Unreal Engine in the related art evaluated the Burley diffuse reflection model and found it only slightly different from the Lambert model, while other, more complex diffuse reflection models cannot be applied effectively to image-based or spherical-harmonic lighting. Therefore, in the related art, the Lambert model is applied directly as the diffuse reflection illumination model.
However, the applicant found through research that, when re-designing rough materials to exhibit some Fresnel reflection, grazing reflection on a smooth surface is described by the Fresnel equations, whereas common diffuse reflection models generally do not consider the influence of surface roughness on Fresnel refraction: they either assume the surface is smooth or ignore the Fresnel effect altogether. Thus, the present application adjusts the diffuse reflection Fresnel factor of the diffuse term and also includes diffuse retro-reflection. In general, the BRDF (bidirectional reflectance distribution function) can be defined as the ratio of outgoing radiance to incoming irradiance; it measures the reflection characteristics of the object surface with respect to light and conveys the material appearance of the object.
In the related art, the BRDF of the Lambert model can be expressed as
f(θ) = max(cos θ, 0) = max(L · n, 0)
In some embodiments, adjustments may be made on the basis of the Lambert model, with roughness taken into account. The Lambert model component is determined according to the surface color of the target object and the ambient light information acquired in the preceding step. Specifically, the ambient light brightness may be determined according to the ambient light information, and the Lambert model component may then be determined according to the ratio of the surface color of the target object to the ambient light brightness.
Wherein the Lambert model component can be expressed as
f_Lambert = baseColor / brightness
where baseColor represents the surface color of the target object and brightness represents the ambient light brightness. The brightness acts as the scattering parameter; to make the parameters easy to adjust in concrete rendering work, the method exposes the retained parameters and the added grazing components to rendering artists, and, unlike the Lambert model in the Unreal Engine, it does not follow strict energy conservation.
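A minimal sketch of this Lambert component, assuming the ambient brightness has already been derived from the ambient light information (parameter names are illustrative, not taken from the application):

```python
import numpy as np

def lambert_component(base_color, ambient_brightness):
    """f_Lambert = baseColor / brightness, applied per RGB channel."""
    return np.asarray(base_color, dtype=float) / float(ambient_brightness)

# Example: a warm albedo under an ambient brightness of 2.0.
print(lambert_component([0.8, 0.6, 0.4], 2.0))  # [0.4 0.3 0.2]
```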
In some embodiments, after the Lambert model component is determined, the target object may be analyzed to determine its roughness, and the reflection component may be determined from the surface color of the target object, the diffuse reflection Fresnel factor, and the scattered light information. Specifically, the diffuse reflection angle, the incident light related term, and the outgoing light related term can be determined from the scattered light information. Only a perfectly smooth surface is absolutely flat; the energy available for sub-surface scattering is what remains after the specular (highlight) reflection is cut off. It is therefore necessary to obtain the incident light related term F_L, which accounts for the incident light consumed by reflection. Light is not only reflected and cut off on entry; it is also partly reflected and consumed when the scattered light finally exits, so the outgoing light related term F_V, which accounts for the outgoing light consumed by reflection, also needs to be obtained.
It should be noted that the incident light related term F_L can be determined from the angle between the surface normal and the incident light, and the outgoing light related term F_V from the angle between the surface normal and the outgoing light, where the surface normal can be obtained from the scattered light information. As for how normals are generated: in earlier rendering, because graphics cards were much less capable, normal maps were generally not used and relatively simple normals expressed the shape of the whole model; with current technology, the normals can be generated from a normal map.
Further, the incident light related term F_L can be expressed as
F_L = pow5(1 - cos θ_L)
where pow5 denotes raising to the fifth power and θ_L represents the angle between the surface normal and the incident light, i.e., the first included angle. The outgoing light related term F_V can be expressed as
F_V = pow5(1 - cos θ_V)
where θ_V represents the angle between the surface normal and the outgoing light, i.e., the second included angle.
Still further, after the incident light related term and the outgoing light related term are determined, the roughness component may be determined from the roughness and from the diffuse reflection angle obtained from the scattered light information in the step above. In particular, the roughness component may be determined from the product of the roughness and the cosine of the diffuse reflection angle, where the roughness component R_R can be expressed as
R_R = 2 * roughness * cos²(θ_d)
where roughness represents the roughness and θ_d represents the diffuse reflection angle. When the roughness component R_R is greater than 1, the reflection becomes stronger toward grazing angles; when R_R is less than 1, the reflection becomes weaker toward grazing angles.
It should be noted that the reflection component f_retro-reflection may be determined from the incident light related term, the outgoing light related term, the diffuse reflection Fresnel factor, the surface color of the target object, and the roughness component, and can be expressed as
f_retro-reflection = baseColor / π * R_R * (F_L + F_V + F_L * F_V * (R_R - 1))
where baseColor represents the surface color of the target object, 1/π represents the diffuse reflection Fresnel factor, and R_R represents the roughness component. The rough sub-surface scattering model thus has a roughness-influenced reflection component f_retro-reflection: because of the rough surface, the closer the view direction and the light direction are to retro-reflection, the more pronounced the effect in the grazing-angle direction.
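A scalar-per-channel sketch of the roughness component and the retro-reflection term as written above (assumed helper names; a shader would evaluate this per pixel):

```python
import math

def retro_reflection(base_color, roughness, cos_theta_d, f_l, f_v):
    """R_R = 2 * roughness * cos^2(theta_d);
    f_retro = baseColor / pi * R_R * (F_L + F_V + F_L * F_V * (R_R - 1))."""
    r_r = 2.0 * roughness * cos_theta_d ** 2
    return [c / math.pi * r_r * (f_l + f_v + f_l * f_v * (r_r - 1.0))
            for c in base_color]

# A rougher surface (R_R > 1) strengthens the grazing retro-reflection.
print(retro_reflection([0.8, 0.6, 0.4], 0.9, math.cos(math.radians(20.0)), 0.3, 0.2))
```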
At the micro-facet level, if many micro-facet normals reflect light toward the eye, a highlight is produced. If the normal directions are very random, the light is scattered in all directions and the eye sees diffuse reflection. That is, highlight and diffuse reflection are mutually exclusive, and the highlight plus the diffuse reflection together account for the total incident light.
Thus, in some embodiments, after the Lambert component and the reflection component are determined, the diffuse reflection illumination model may be determined on that basis. In particular, the diffuse reflection illumination model f_d can be expressed as
f_d = f_Lambert * (1 - 0.5 * F_L) * (1 - 0.5 * F_V) + f_retro-reflection + n · L
where f_Lambert represents the Lambert model component, F_L represents the incident light related term accounting for incident light consumed by reflection, F_V represents the outgoing light related term accounting for outgoing light consumed by reflection, f_retro-reflection represents the reflection component, n represents the normal vector, and L represents the incident light direction.
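Taking the expression above literally (including the trailing n · L as an additive term), the diffuse model can be assembled per channel as in the following sketch; inputs are assumed to come from the helpers discussed earlier:

```python
def diffuse_illumination(f_lambert, f_l, f_v, f_retro, n_dot_l):
    """f_d = f_Lambert * (1 - 0.5*F_L) * (1 - 0.5*F_V) + f_retro-reflection + n.L"""
    return [lam * (1.0 - 0.5 * f_l) * (1.0 - 0.5 * f_v) + retro + n_dot_l
            for lam, retro in zip(f_lambert, f_retro)]

base = [0.8, 0.6, 0.4]
f_lam = [c / 2.0 for c in base]      # f_Lambert = baseColor / brightness, brightness = 2.0
f_ret = [0.05, 0.04, 0.03]           # example retro-reflection values
print(diffuse_illumination(f_lam, 0.3, 0.2, f_ret, 0.7))
```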
In some embodiments, if the surface of the target object is detected to be a fluff material, its edges appear brighter because multiple refractions and reflections occur more frequently there. Therefore, the surface gloss and the hue of the target object can be obtained, and a gloss component is determined from the surface gloss, the hue, and the diffuse reflection angle. When the edges of the target object are brightened by these extra refractions and reflections, the gloss component can be added to the diffuse reflection illumination model; that is, the diffuse reflection illumination model can be re-determined from the Lambert model component, the reflection component, and the gloss component. The gloss component F(sheen, θ_d) can be expressed as
F(sheen, θ_d) = sheen * pow5((1 - sheen) + sheen * tint) * (1 - cos θ_d)
where sheen represents the surface gloss and tint represents the hue. An art creator may be given the choice of whether to add the gloss term to the diffuse reflection illumination model by setting a Boolean value.
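A sketch of the gloss (sheen) term, treating tint as a scalar weight for brevity (a per-channel tint color would be applied component-wise); the Boolean gate mentioned above would simply decide whether this value is added to f_d:

```python
import math

def sheen_component(sheen, tint, cos_theta_d):
    """F(sheen, theta_d) = sheen * ((1 - sheen) + sheen * tint)^5 * (1 - cos(theta_d))"""
    return sheen * ((1.0 - sheen) + sheen * tint) ** 5 * (1.0 - cos_theta_d)

print(sheen_component(0.5, 0.8, math.cos(math.radians(70.0))))  # brighter toward grazing angles
```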
FIG. 2 shows a schematic diagram of a bright-dark boundary according to an embodiment of the present application.
In some embodiments, referring to fig. 2, once the diffuse reflection illumination model has been obtained, the light-dark boundary portion may be stylized separately. Since the light-dark boundary of the target object cannot simply be given a hard cut-off line, brush strokes can be applied over the original shading by sampling a brush-stroke texture map and mapping it into the image, so as to render the light-dark boundary portion and stylize it.
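The application does not spell out the exact masking procedure, so the following is only an illustrative sketch of the idea: detect the terminator where n · L is near zero, sample a brush-stroke texture there, and blend it over the shaded color (all parameter names and defaults are assumptions):

```python
import numpy as np

def stylize_terminator(shaded, n_dot_l, brush_texture, uv, width=0.15, strength=0.5):
    """Blend a brush-stroke texel into the shaded color near the light-dark boundary."""
    mask = np.clip(1.0 - abs(n_dot_l) / width, 0.0, 1.0)  # 1 at the terminator, 0 elsewhere
    h, w = brush_texture.shape[:2]
    texel = brush_texture[int(uv[1] * (h - 1)), int(uv[0] * (w - 1))]
    return shaded * (1.0 - strength * mask) + texel * (strength * mask)

brush = np.random.rand(64, 64)            # stand-in for a brush-stroke map
color = np.array([0.6, 0.45, 0.3])
print(stylize_terminator(color, 0.05, brush, (0.3, 0.7)))
```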
In step S106, in some embodiments, in the absence of light the target object is not visible and thus has no color, so in PBR a material has no intrinsic material color. The RGB values of the material express how strongly the target object receives the three RGB channels of diffuse light, specular light, and ambient light; the material's RGB channels scale the RGB colors of the light. For specular reflection, therefore, a specular reflection model can be determined from the reflected light information and the diffuse reflection illumination model, based on the artist-oriented Disney model.
Further, for specular reflection, the reflected energy never actually enters the surface of the object, so this energy is what is left at reflection. For specular reflection, ambient light cannot be described directly by the reflection-model formula of the related art, because ambient light reaches the surface of the target object from many different incident directions, and a hemispherical integral is required to determine what is reflected.
The specular reflection model uses a hemispherical integral because the surface of the object may be uneven; the integral covers only the upper hemisphere, since light from the lower hemisphere is not received. A solid angle is defined on the sphere regardless of the shape that subtends it, and an uneven surface can be simulated well with hemispherical integration. Since the energy emitted in a single direction has to be calculated, the integration must cover the whole hemisphere above the surface point, using dw_i, the direction of the incident ray, as the variable of integration to sweep the entire hemispherical surface.
It should be noted that the determination process of the specular reflection model may include the following steps: acquiring the half-range vector of the incident light and the outgoing light according to the reflected light information, and determining a diffuse reflection constant according to the diffuse reflection illumination model; then determining the specular reflection model from the half-range vector and the diffuse reflection constant based on the Fresnel equation, the normal distribution function, and the micro-surface distribution function. Here the normal distribution function is used to determine where the object produces diffuse reflection and where it produces highlights, and it may be evaluated by hemispherical integration. The specular reflection model f(L, V) can be expressed as
f(L, V) = diffuse + D(θ_h) * F(θ_d) * G(θ_L, θ_V) / (4 * cos θ_L * cos θ_V)
where L represents the incident light, V represents the outgoing light, F() represents the Fresnel equation, D() represents the normal distribution function, G() represents the micro-surface distribution function, diffuse represents the diffuse reflection constant, θ_d represents the diffuse reflection angle, and θ_h represents the angle between the half-range vector and the normal, i.e., the third included angle.
It should be noted that D(), the normal distribution function (NDF), describes the probability distribution of the micro-facet normals, i.e., the concentration of correctly oriented normals: the concentration, per unit surface area, of surface points whose orientation allows them to reflect light from L to V. F(), the Fresnel equation, describes the fraction of light reflected by the surface at different incidence angles. The factor 4 cos θ_L cos θ_V can be understood as a correction factor that converts quantities between the local space of the micro-geometry and the local space of the whole macroscopic surface.
Further, for the NDF, the efficient GGX/Trowbridge-Reitz model can be used. Its overhead relative to the Blinn-Phong model is low, and the distinct, natural appearance produced by its longer tail appeals to designers; the re-parameterized alpha parameter can be adopted at the same time, where alpha = roughness². The GGX model can be expressed as
D_GGX(θ_h) = α² / (π * (cos²θ_h * (α² - 1) + 1)²)
For the Fresnel equation, a slightly modified Schlick approximation can be used, replacing the power term with a spherical-Gaussian approximation; this improves computational efficiency and introduces only an imperceptible difference. The formula can be:
F(v, h) = F_0 + (1 - F_0) * 2^((-5.55473 * (v · h) - 6.98316) * (v · h))
where F_0 denotes the specular reflectivity at normal incidence, and specular here means light that does not enter the object but is reflected directly at its surface.
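A sketch of how these pieces fit together into the microfacet specular lobe. The GGX and Schlick spherical-Gaussian expressions follow the text above; the geometry term G is passed in because its exact form is not spelled out here, and all parameter values in the example call are illustrative:

```python
import math

def d_ggx(n_dot_h, roughness):
    """GGX / Trowbridge-Reitz NDF with alpha = roughness^2."""
    a2 = (roughness ** 2) ** 2
    denom = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom ** 2)

def f_schlick_sg(v_dot_h, f0):
    """Schlick Fresnel with the spherical-Gaussian exponent:
    F = F0 + (1 - F0) * 2^((-5.55473*(v.h) - 6.98316) * (v.h))."""
    return f0 + (1.0 - f0) * 2.0 ** ((-5.55473 * v_dot_h - 6.98316) * v_dot_h)

def specular_lobe(n_dot_h, v_dot_h, cos_theta_l, cos_theta_v, roughness, f0, g_term):
    """D * F * G / (4 * cos(theta_L) * cos(theta_V)), the microfacet specular term."""
    return (d_ggx(n_dot_h, roughness) * f_schlick_sg(v_dot_h, f0) * g_term
            / max(4.0 * cos_theta_l * cos_theta_v, 1e-4))

print(specular_lobe(n_dot_h=0.95, v_dot_h=0.9, cos_theta_l=0.8, cos_theta_v=0.7,
                    roughness=0.4, f0=0.04, g_term=0.5))
```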
Fig. 3 shows a schematic diagram comparing a target object processed by the specular reflection model with a target object processed by the Unreal Engine according to an embodiment of the application.
Referring to fig. 3, the left side shows the adjusted effect using the specular reflection model of the present application, and the right side shows the default effect of the Unreal Engine. The specular reflection model provided by the Unreal Engine approximates Specular with a single value (i.e., the R value of RGB), and its grazing specular reflection is achromatic. Here, the specular color and specular parameters are exposed directly in order to simulate more artist-oriented effects, and the user may supply highlight colors or maps as desired.
In some embodiments, the Split Sum Approximation part of the Unreal Engine can further be extended, where the first step of the derivation can be expressed as
∫_Ω L_i(l) f(l, v) cos θ_l dl ≈ (1/N) Σ_{k=1}^{N} L_i(l_k) f(l_k, v) cos θ_{l_k} / p(l_k, v)
The right side of the equation is the Monte Carlo integral formula, in which p is the probability distribution function (pdf). Note that, for the rendering equation, the pdf is a normalized function, i.e., its integral over the hemisphere domain is 1.
The next step is a further approximation made for performance reasons: the Monte Carlo formula of the first step is split into two separate sums.
(1/N) Σ_{k=1}^{N} L_i(l_k) f(l_k, v) cos θ_{l_k} / p(l_k, v) ≈ ((1/N) Σ_{k=1}^{N} L_i(l_k)) * ((1/N) Σ_{k=1}^{N} f(l_k, v) cos θ_{l_k} / p(l_k, v))
This approximation is exact when the incident light is constant and remains very accurate under common conditions.
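A tiny numerical sketch of the split-sum idea, using plain lists of precomputed per-sample values (this is only an illustration of the approximation, not the engine's implementation):

```python
import random

def split_sum(light_samples, brdf_weights):
    """Approximate the joint Monte Carlo sum by the product of two averages:
    one over the incoming light L_i(l_k), one over f(l_k, v) * cos(theta_k) / pdf."""
    n = len(light_samples)
    return (sum(light_samples) / n) * (sum(brdf_weights) / n)

lights = [2.0] * 8                                   # constant incident light
weights = [random.random() for _ in range(8)]
joint = sum(l * w for l, w in zip(lights, weights)) / len(lights)
print(joint, split_sum(lights, weights))             # equal when the light is constant
```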
Fig. 4 shows a schematic diagram comparing a target object rendered by the illumination model with a target object processed by the Unreal Engine according to an embodiment of the application.
Referring to fig. 4, it can be seen that, after the illumination model is determined in step S108 by combining the diffuse reflection illumination model and the specular reflection model, the target object stylized with this illumination model is noticeably more non-photorealistic than the target object rendered with the Unreal Engine's default illumination model, achieving customized stylized rendering.
From the above, the stylized rendering method, apparatus, electronic device and storage medium for a target object provided by the present application are modified based on the original principle of PBR: light source information and the surface color of the target object are obtained, where the light source information may include scattered light information, reflected light information, and ambient light information; a diffuse reflection illumination model is determined according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the ambient light information; a specular reflection model is further determined according to the reflected light information and the diffuse reflection illumination model; and an illumination model is determined by combining the diffuse reflection illumination model and the specular reflection model, with which the target object is stylized. Modifying the original principle of PBR reduces art production cost and converts a strict physical model into a shading model driven mainly by art direction. Owing to this artist-friendly orientation, non-photorealistic rendering of a large number of objects made of different materials can be achieved quickly with a small number of intuitive parameters and a highly standardized workflow, which avoids the problem that the general-purpose PBR material of the Unreal Engine is too photorealistic and meets stylized art requirements.
It should be noted that the method of the embodiment of the present application may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the multiple devices may only perform one or more steps of the method of the embodiment, and the multiple devices interact with each other to complete the method.
It should be noted that the above describes some embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Fig. 5 shows an exemplary structural diagram of a stylized rendering apparatus for an object provided by an embodiment of the present application.
Based on the same inventive concept, corresponding to the method of any embodiment, the application also provides a stylized rendering device for the target object.
Referring to fig. 5, the stylized rendering apparatus for a target object includes an acquisition module, a first determination module, a second determination module and a rendering module, wherein:
an acquisition module configured to acquire light source information and a surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information;
a first determination module configured to determine a diffuse reflection illumination model according to a diffuse reflection fresnel factor, a surface color of the target object, the scattered light information, and the ambient light information;
a second determination module configured to determine a specular reflection model from the reflected light information and the diffuse reflection illumination model;
a rendering module configured to combine the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and to stylize the object using the illumination model.
In one possible implementation, the first determining module is further configured to:
determining a Lambert model component according to the surface color of the target object and the ambient light information;
analyzing the target object to determine roughness, and determining a reflection component according to the surface color of the target object, the diffuse reflection Fresnel factor and the scattered light information;
determining an incident light related item influencing incident light consumed by reflection and an emergent light related item influencing emergent light consumed by reflection according to the scattered light information;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component, the incident light related item, the emergent light related item and the dot product result of the normal vector and the incident light.
In one possible implementation, the first determining module is further configured to:
determining the brightness of the environment light according to the environment light information;
and determining the Lambert model component according to the ratio of the surface color of the target object to the ambient light brightness.
In one possible implementation, the first determining module is further configured to:
determining a diffuse reflection angle according to the scattered light information;
determining a roughness component according to the roughness and the diffuse reflection angle;
determining the reflection component based on the incident light correlation term, the emergent light correlation term, the diffuse reflection Fresnel factor, the surface color of the target, and the roughness component.
In one possible implementation, the first determining module is further configured to:
acquiring a first included angle between a surface normal and incident light and a second included angle between the surface normal and emergent light according to scattered light information;
and determining the incident light related item according to the first included angle, and determining the emergent light related item according to the second included angle.
In one possible implementation, the first determining module is further configured to:
and determining the roughness component according to the product of the roughness and the cosine value of the diffuse reflection angle.
In one possible implementation, the first determining module is further configured to:
acquiring the surface gloss and the color tone of the target object in response to the fact that the surface of the target object is detected to be a fluff material;
determining a gloss component from the surface gloss, the hue, and the diffuse reflection angle;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component and the gloss component.
In one possible implementation manner, the apparatus further includes: a sampling module;
the sampling module further configured to:
acquiring a light and shade boundary part of the target object;
and sampling texture through a brush touch map to render the light and shade boundary part.
In one possible implementation, the second determining module is further configured to:
acquiring half-range vectors of incident light and emergent light according to the reflected light information;
acquiring a third included angle between the half-range vector and the surface normal;
determining a diffuse reflection constant according to the diffuse reflection illumination model;
and determining the specular reflection model according to the half-range vector, the third included angle and the diffuse reflection constant on the basis of a Fresnel equation, a normal distribution function and a micro-surface distribution function.
For convenience of description, the above apparatus is described as being divided into various modules by function, which are described separately. Of course, when the present application is implemented, the functionality of the various modules may be implemented in one or more pieces of software and/or hardware.
The apparatus of the foregoing embodiment is used to implement the stylized rendering method for an object corresponding to any one of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Fig. 6 shows an exemplary structural schematic diagram of an electronic device provided in an embodiment of the present application.
Based on the same inventive concept, corresponding to the method of any embodiment described above, the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the program, the stylized rendering method for an object described in any embodiment above is implemented. Referring to fig. 6, fig. 6 shows a more specific hardware structure diagram of an electronic device provided in this embodiment, where the device may include: a processor 610, a memory 620, an input/output interface 630, a communication interface 640, and a bus 650. Wherein the processor 610, memory 620, input/output interface 630, and communication interface 640 are communicatively coupled to each other within the device via a bus 650.
The processor 610 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present specification.
The Memory 620 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 620 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 620 and called by the processor 610 to be executed.
The input/output interface 630 is used for connecting an input/output module to realize information input and output. The input/output module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The communication interface 640 is used for connecting a communication module (not shown in the figure) to realize communication interaction between the device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
Bus 650 includes a pathway to transfer information between various components of the device, such as processor 610, memory 620, input/output interface 630, and communication interface 640.
It should be noted that although the above-mentioned devices only show the processor 610, the memory 620, the input/output interface 630, the communication interface 640 and the bus 650, in a specific implementation, the devices may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the above embodiment is used to implement the stylized rendering method for the target object in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to any of the above-described embodiment methods, the present application also provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method for stylized rendering of an object as described in any of the above embodiments.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
The computer instructions stored in the storage medium of the above embodiment are used to enable the computer to execute the stylized rendering method for an object according to any of the above embodiments, and have the beneficial effects of corresponding method embodiments, which are not described herein again.
Those of ordinary skill in the art will understand that: the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the context of the present application, features from the above embodiments or from different embodiments may also be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present application as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the application. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the application, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the application are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that the embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
The present embodiments are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present application are intended to be included within the scope of the present application.

Claims (12)

1. A method of stylized rendering of an object, comprising:
acquiring light source information and the surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information;
determining a diffuse reflection illumination model according to the diffuse reflection Fresnel factor, the surface color of the target object, the scattered light information and the environment light information;
determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model;
and combining the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and performing stylized rendering on the target by using the illumination model.
2. The method of claim 1, wherein determining a diffuse reflectance illumination model from a diffuse reflectance fresnel factor, a surface color of the target object, the scattered light information, and the ambient light information, further comprises:
determining a Lambert model component according to the surface color of the target object and the ambient light information;
analyzing the target object to determine roughness, and determining a reflection component according to the surface color of the target object, the diffuse reflection Fresnel factor and the scattered light information;
determining, according to the scattered light information, an incident light related term that affects the incident light consumed by reflection and an emergent light related term that affects the emergent light consumed by reflection;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component, the incident light related term, the emergent light related term, and the dot product of the normal vector and the incident light.
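(Illustrative note: one possible way to assemble the factors listed in claim 2. The weighting below is an assumption made only for the sketch; the claim recites the inputs, not this expression.)

```python
def diffuse_illumination(lambert, reflection, incident_term, emergent_term, n_dot_l):
    """Combine the Lambert component, the reflection component, the two light-related
    terms and the N.L dot product; the exact weighting is not recited at claim level."""
    n_dot_l = max(n_dot_l, 0.0)                 # back-facing light contributes nothing
    return (lambert + reflection * incident_term * emergent_term) * n_dot_l

# hypothetical values for one shading point
print(diffuse_illumination(lambert=0.6, reflection=0.2,
                           incident_term=0.9, emergent_term=0.8, n_dot_l=0.7))
```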
3. The method of claim 2, wherein determining a Lambert model component according to the surface color of the target object and the ambient light information further comprises:
determining the brightness of the ambient light according to the ambient light information;
and determining the Lambert model component according to the ratio of the surface color of the target object to the ambient light brightness.
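(Illustrative note: the ratio recited in claim 3, written out with hypothetical values.)

```python
def lambert_component(surface_color_rgb, ambient_brightness):
    """Per-channel ratio of the surface color to the ambient-light brightness."""
    return tuple(c / ambient_brightness for c in surface_color_rgb)

print(lambert_component((0.8, 0.4, 0.2), ambient_brightness=2.0))   # (0.4, 0.2, 0.1)
```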
4. The method of claim 2, wherein analyzing the target object to determine roughness, and determining a reflection component according to the surface color of the target object, the diffuse reflection Fresnel factor, and the scattered light information further comprises:
determining a diffuse reflection angle according to the scattered light information;
determining a roughness component according to the roughness and the diffuse reflection angle;
and determining the reflection component according to the incident light related term, the emergent light related term, the diffuse reflection Fresnel factor, the surface color of the target object, and the roughness component.
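(Illustrative note: claim 4 names the inputs of the reflection component but not a formula. The sketch below borrows a Burley-style diffuse retro-reflection term as a stand-in; the roughness term uses the product of the roughness and the cosine of the diffuse reflection angle from claim 6, and everything else is an assumption.)

```python
import math

def reflection_component(cos_theta_i, cos_theta_o, cos_theta_d,
                         roughness, surface_color_rgb):
    roughness_term = roughness * cos_theta_d            # product recited in claim 6
    fd90 = 0.5 + 2.0 * roughness_term                   # stand-in for the diffuse Fresnel factor
    incident_term = 1.0 + (fd90 - 1.0) * (1.0 - cos_theta_i) ** 5
    emergent_term = 1.0 + (fd90 - 1.0) * (1.0 - cos_theta_o) ** 5
    return tuple(c / math.pi * incident_term * emergent_term for c in surface_color_rgb)

# cosines of hypothetical incident, emergent and diffuse-reflection angles
print(reflection_component(0.7, 0.6, 0.8, roughness=0.5, surface_color_rgb=(0.8, 0.4, 0.2)))
```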
5. The method of claim 2, wherein determining, according to the scattered light information, an incident light related term that affects the incident light consumed by reflection and an emergent light related term that affects the emergent light consumed by reflection further comprises:
acquiring a first included angle between the surface normal and the incident light and a second included angle between the surface normal and the emergent light according to the scattered light information;
and determining the incident light related term according to the first included angle, and determining the emergent light related term according to the second included angle.
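(Illustrative note: claim 5 derives the two light-related terms only from the two included angles; the Schlick-style (1 - cos θ)^5 falloff below is an assumed choice for the sketch.)

```python
import math

def light_related_terms(first_angle_deg, second_angle_deg):
    """Incident- and emergent-light related terms derived from the two included angles."""
    wi = (1.0 - math.cos(math.radians(first_angle_deg))) ** 5    # incident light related term
    wo = (1.0 - math.cos(math.radians(second_angle_deg))) ** 5   # emergent light related term
    return wi, wo

print(light_related_terms(45.0, 30.0))
```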
6. The method of claim 4, wherein determining a roughness component according to the roughness and the diffuse reflection angle further comprises:
and determining the roughness component according to the product of the roughness and the cosine value of the diffuse reflection angle.
7. The method of claim 4, wherein determining the diffuse reflection illumination model according to the Lambert model component and the reflection component further comprises:
acquiring the surface gloss and the hue of the target object in response to detecting that the surface of the target object is a fluff material;
determining a gloss component according to the surface gloss, the hue, and the diffuse reflection angle;
and determining the diffuse reflection illumination model according to the Lambert model component, the reflection component and the gloss component.
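(Illustrative note: for the fluff case in claim 7, the sketch below adds a sheen-like gloss component. The (1 - cos θ)^5 edge falloff is an assumption, chosen only because it brightens grazing angles the way fluff tends to; the claim does not disclose this formula.)

```python
def gloss_component(surface_gloss, hue_rgb, cos_theta_d):
    """Gloss (sheen-like) term from the surface gloss, the hue and the diffuse reflection angle."""
    edge_weight = (1.0 - cos_theta_d) ** 5
    return tuple(surface_gloss * h * edge_weight for h in hue_rgb)

print(gloss_component(surface_gloss=0.6, hue_rgb=(1.0, 0.9, 0.8), cos_theta_d=0.3))
```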
8. The method of claim 1, further comprising:
acquiring a light and shade boundary part of the target object;
and sampling a texture through a brush stroke map to render the light and shade boundary part.
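(Illustrative note: one way to confine a brush stroke map to the light and shade boundary. The threshold and band width are arbitrary values for the sketch; a shader would use the resulting weight to blend in the sampled stroke texture.)

```python
def boundary_weight(n_dot_l, threshold=0.0, width=0.15):
    """1.0 exactly at the light/dark boundary, falling to 0.0 outside a narrow band."""
    return max(0.0, 1.0 - abs(n_dot_l - threshold) / width)

for n_dot_l in (-0.3, -0.1, 0.0, 0.1, 0.3):
    print(n_dot_l, round(boundary_weight(n_dot_l), 3))
```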
9. The method of claim 5, wherein determining a specular reflection model according to the reflected light information and the diffuse reflection illumination model further comprises:
acquiring a half vector between the incident light and the emergent light according to the reflected light information;
acquiring a third included angle between the half vector and the surface normal;
determining a diffuse reflection constant according to the diffuse reflection illumination model;
and determining the specular reflection model according to the half vector, the third included angle, and the diffuse reflection constant on the basis of a Fresnel equation, a normal distribution function, and a microfacet distribution function.
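(Illustrative note: claim 9 names a half vector, its angle with the normal, a diffuse reflection constant, a Fresnel equation, a normal distribution function and a microfacet term. The Cook-Torrance-shaped sketch below is only one common way to put such ingredients together; the specific functions chosen here and the (1 - k) energy split are assumptions.)

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def specular_model(light_dir, view_dir, normal, roughness, f0, k_diffuse):
    h = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))     # half vector
    n_dot_h = max(dot(normal, h), 0.0)                                   # cos of the third included angle
    n_dot_l = max(dot(normal, light_dir), 1e-4)
    n_dot_v = max(dot(normal, view_dir), 1e-4)
    a2 = roughness ** 4
    ndf = a2 / (math.pi * ((n_dot_h ** 2) * (a2 - 1.0) + 1.0) ** 2)      # GGX normal distribution
    fresnel = f0 + (1.0 - f0) * (1.0 - max(dot(h, view_dir), 0.0)) ** 5  # Schlick Fresnel
    k = (roughness + 1.0) ** 2 / 8.0
    geom = (n_dot_l / (n_dot_l * (1.0 - k) + k)) * (n_dot_v / (n_dot_v * (1.0 - k) + k))
    return (1.0 - k_diffuse) * ndf * fresnel * geom / (4.0 * n_dot_l * n_dot_v)

print(specular_model(normalize((0.0, 1.0, 1.0)), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0),
                     roughness=0.4, f0=0.04, k_diffuse=0.5))
```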
10. An apparatus for stylized rendering of a target object, comprising:
an acquisition module configured to acquire light source information and a surface color of the target object; wherein the light source information includes: scattered light information, reflected light information, and ambient light information;
a first determination module configured to determine a diffuse reflection illumination model according to a diffuse reflection fresnel factor, a surface color of the target object, the scattered light information, and the ambient light information;
a second determination module configured to determine a specular reflection model from the reflected light information and the diffuse reflection illumination model;
a rendering module configured to combine the diffuse reflection illumination model and the specular reflection model to determine an illumination model, and to perform stylized rendering on the target object by using the illumination model.
11. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 9 when executing the program.
12. A non-transitory computer readable storage medium storing computer instructions for causing a computer to implement the method of any one of claims 1 to 9.
CN202111601156.1A 2021-12-24 2021-12-24 Stylized rendering method and device for target object, electronic equipment and storage medium Pending CN114419220A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111601156.1A CN114419220A (en) 2021-12-24 2021-12-24 Stylized rendering method and device for target object, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111601156.1A CN114419220A (en) 2021-12-24 2021-12-24 Stylized rendering method and device for target object, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114419220A true CN114419220A (en) 2022-04-29

Family

ID=81269463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111601156.1A Pending CN114419220A (en) 2021-12-24 2021-12-24 Stylized rendering method and device for target object, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114419220A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091684A (en) * 2023-04-06 2023-05-09 杭州片段网络科技有限公司 WebGL-based image rendering method, device, equipment and storage medium
CN116883567A (en) * 2023-07-07 2023-10-13 上海散爆信息技术有限公司 Fluff rendering method and device

Similar Documents

Publication Publication Date Title
WO2021129044A1 (en) Object rendering method and apparatus, and storage medium and electronic device
CA3150045A1 (en) Dynamically estimating light-source-specific parameters for digital images using a neural network
CN114419220A (en) Stylized rendering method and device for target object, electronic equipment and storage medium
CA2817497C (en) Method and system for efficient modeling of specular reflection
US20190005710A1 (en) System and method of rendering a graphical object with modification in structure
CN112819941B (en) Method, apparatus, device and computer readable storage medium for rendering water surface
CN113052947B (en) Rendering method, rendering device, electronic equipment and storage medium
KR102173546B1 (en) Apparatus and method of rendering game objects
EP3057067B1 (en) Device and method for estimating a glossy part of radiation
US9659404B2 (en) Normalized diffusion profile for subsurface scattering rendering
CN112489179B (en) Target model processing method and device, storage medium and computer equipment
US11657478B1 (en) Systems and methods for dynamically rendering three-dimensional images with varying detail to emulate human vision
CN116091684B (en) WebGL-based image rendering method, device, equipment and storage medium
CN113888398A (en) Hair rendering method and device and electronic equipment
CN116758208A (en) Global illumination rendering method and device, storage medium and electronic equipment
WO2017025446A1 (en) Method and apparatus for real-time rendering of images of specular surfaces
Gotanda Beyond a simple physically based Blinn-Phong model in real-time
CN115205440A (en) Image rendering method and device
Tandianus et al. Spectral caustic rendering of a homogeneous caustic object based on wavelength clustering and eye sensitivity
Spicker et al. Quantifying visual abstraction quality for computer-generated illustrations
US20190371049A1 (en) Transform-based shadowing of object sets
Abbas et al. Gaussian radial basis function for efficient computation of forest indirect illumination
CN116421970B (en) Method, device, computer equipment and storage medium for externally-installed rendering of virtual object
CN117078838B (en) Object rendering method and device, storage medium and electronic equipment
US10789757B2 (en) Ray-mediated illumination control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination