CN111862290B - Radial blur-based fluff rendering method and device and storage medium

Radial blur-based fluff rendering method and device and storage medium

Info

Publication number
CN111862290B
CN111862290B (application CN202010631548.1A)
Authority
CN
China
Prior art keywords
fluff
rendered
space
rendering
attribute information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010631548.1A
Other languages
Chinese (zh)
Other versions
CN111862290A (en)
Inventor
陈志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202010631548.1A
Publication of CN111862290A
Priority to PCT/CN2020/130324
Application granted
Publication of CN111862290B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G06T15/005 General purpose rendering architectures
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2012 Colour editing, changing, or manipulating; Use of colour codes

Abstract

The invention provides a radial blur-based fluff rendering method, device and storage medium. The method includes: acquiring fluff rendering related resources of an object to be rendered; analyzing, according to the fluff rendering related resources, the fluff direction in the screen space of the object to be rendered and other fluff attribute information, caching the other fluff attribute information in a first buffer area and caching the screen-space fluff direction in a second buffer area; performing an illumination calculation based on the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered; and performing radial blur processing on the object to be rendered based on the scene color information and the screen-space fluff direction cached in the second buffer area to obtain the rendered fluff effect. The embodiment of the invention can thus render a realistic fluff effect of the object to be rendered quickly and conveniently without rendering the object multiple times, which effectively improves fluff rendering efficiency.

Description

Radial blur-based fluff rendering method and device and storage medium
Technical Field
The invention relates to the technical field of scene rendering, in particular to a method and a device for rendering fluff based on radial blur and a storage medium.
Background
With the rapid development of computer graphics technology, virtual objects are increasingly used in the production of movies, games and animations. Achieving a realistic virtual object requires several key technologies, such as skeletal animation, facial expression and fluff simulation. A fluff scheme commonly adopted in the prior art is the FurShell scheme, in which the fluff is rendered in multiple layers; to improve the rendering effect, the number of rendering passes of the model must be increased, that is, the same model is rendered multiple times, so the rendering efficiency is low.
Disclosure of Invention
In view of the above, the present invention provides a radial blur-based fluff rendering method, apparatus and storage medium that overcome, or at least partially solve, the above problems. They can render a realistic fluff effect of an object to be rendered quickly and conveniently without rendering the object multiple times, which is particularly useful when fluff is rendered for several objects to be rendered under the same camera view angle in a game scene, and can effectively improve fluff rendering efficiency.
According to an aspect of the embodiments of the present invention, there is provided a radial blur-based fluff rendering method, including:
acquiring fluff rendering related resources of an object to be rendered;
analyzing the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, caching the other fluff attribute information to a first buffer area, and caching the fluff direction in the screen space to a second buffer area;
performing illumination calculation based on the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered;
and performing radial blur processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached in the second buffer area, to obtain a fluff pixel value of the object to be rendered.
Optionally, the fluff rendering related resources of the object to be rendered include:
at least one of a model basic color map resource, a fluff noise map resource, a fluff direction map resource in the model tangent space, a normal map resource in the model tangent space, a model roughness-metallization map resource, and a model resource containing normal and tangent data.
Optionally, analyzing the fluff direction in the screen space of the object to be rendered according to the fluff rendering related resources includes:
extracting RG channel values and B channel values from the model tangent space normal map resources;
calculating to obtain the fluff direction in a tangent space based on the extracted RG channel value;
multiplying the fluff direction in the tangent space by the extracted B channel value and a preset fluff length coefficient to obtain the real fluff direction in the tangent space;
and converting the real fluff direction in the tangent space to the fluff direction in the screen space.
Optionally, converting the true fluff direction in the tangential space to the fluff direction in the screen space comprises:
converting the real fluff direction in the tangent space to the real fluff direction in the world space, and then converting the real fluff direction in the world space to the real fluff direction in the projection space;
and converting the real fluff direction in the projection space to the fluff direction in the screen space by adopting perspective division.
Optionally, the other fluff attribute information includes a basic color of the object to be rendered, and analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resources includes:
extracting basic color information from the model basic color mapping resources, and extracting noise color information from the noise map resources of the fluff;
and performing mixed calculation on the basic color information and the noise color information to obtain the basic color of the object to be rendered.
Optionally, the other fluff attribute information further includes a world space normal direction, and analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resource includes:
and extracting the normal direction of the tangent space from the model tangent space normal map, and converting the normal direction of the tangent space into the normal direction of the world space.
Optionally, the other fluff attribute information further includes the roughness and metallization degree of the object to be rendered, and analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resources includes:
analyzing the roughness and the metallization degree of the object to be rendered from the model roughness-metallization map resource.
Optionally, performing radial blurring processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached by the second buffer to obtain a fluff pixel value of the object to be rendered, including:
determining a radial blurring direction according to the fluff direction in the screen space;
starting from any pixel of the current screen, stepping along the radial blur direction at a preset distance interval for a preset number of sampling steps, to obtain sampling points of the preset number;
and acquiring scene color information corresponding to the sampling points, averaging the acquired scene color information, and determining a fluff pixel value of the object to be rendered according to the average value.
Optionally, the preset distance interval includes: the product of a preset image blur distance coefficient and the length of the fluff in the screen space.
According to another aspect of the embodiments of the present invention, there is also provided a radial blur-based fluff rendering apparatus, including:
the acquisition module is suitable for acquiring fluff rendering related resources of an object to be rendered;
the analysis module is suitable for analyzing the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, caching the other fluff attribute information to a first buffer area, and caching the fluff direction in the screen space to a second buffer area;
the calculation module is suitable for performing illumination calculation on the basis of the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered;
and the processing module is adapted to perform radial blur processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached in the second buffer area to obtain a fluff pixel value of the object to be rendered.
According to yet another aspect of the embodiments of the present invention, there is also provided a computer storage medium storing computer program code which, when run on a computing device, causes the computing device to perform the radial blur-based fluff rendering method of any of the above embodiments.
According to still another aspect of the embodiments of the present invention, there is also provided a computing device including: a processor; and a memory storing computer program code which, when executed by the processor, causes the computing device to perform the radial blur-based fluff rendering method of any of the above embodiments.
According to the embodiment of the invention, after the fluff direction in the screen space of the object to be rendered and the other fluff attribute information are obtained by analyzing the fluff rendering related resources and are cached in separate buffer areas, an illumination calculation can be performed based on the cached other fluff attribute information to obtain the scene color information of the object to be rendered, and the object to be rendered can then be radially blurred based on the scene color information and the cached screen-space fluff direction to obtain its fluff pixel values. In other words, the embodiment of the invention can add an extra output, the screen-space fluff direction derived from the fluff rendering related resources, in the base pass (BasePass) of the deferred rendering process, cache it in a dedicated second buffer area, and add a render pass that radially blurs the object to be rendered according to the screen-space fluff direction in the second buffer area and the scene color information, thereby rendering a realistic fluff effect of the object to be rendered quickly and conveniently.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
The above and other objects, advantages and features of the present invention will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic flow chart of a radial blur-based fluff rendering method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the process of analyzing the fluff direction in the screen space of an object to be rendered according to an embodiment of the invention;
FIG. 3 is a schematic diagram illustrating a process of performing radial blurring on an object to be rendered according to an embodiment of the present invention;
fig. 4 shows a schematic structural diagram of a radial blur-based fluff rendering apparatus according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to solve the above technical problem, an embodiment of the present invention provides a radial blur-based fluff rendering method, and fig. 1 illustrates a schematic flow chart of the radial blur-based fluff rendering method according to an embodiment of the present invention. Referring to fig. 1, the method includes steps S102 to S108.
Step S102, acquiring fluff rendering related resources of the object to be rendered.
The object to be rendered in the embodiment of the invention may comprise a plurality of objects in a game scene that require fluff rendering, that is, fluff rendering may be performed for a plurality of objects to be rendered.
And step S104, analyzing the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, caching the other fluff attribute information to the first buffer area, and caching the fluff direction in the screen space to the second buffer area.
The other fluff attribute information in the embodiment of the present invention may include the basic color, roughness, metallization degree, world-space normal direction, and the like of the object to be rendered, which is not specifically limited in the embodiment of the present invention. In addition, the screen-space fluff direction cached in the second buffer area is actually obtained by converting the tangent-space fluff direction through a series of space conversions, which will be described in detail in the following embodiments.
And step S106, performing illumination calculation based on other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered.
And S108, performing radial blur processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached in the second buffer area to obtain a fluff pixel value of the object to be rendered.
According to the embodiment of the invention, an additional output, the fluff direction in the screen space of the object to be rendered, can be written in the BasePass (base pass) of the deferred rendering process based on the fluff rendering related resources of the object to be rendered; the screen-space fluff direction is cached in a dedicated second buffer area, and an added render pass performs radial blur processing on the object to be rendered according to the screen-space fluff direction in the second buffer area and the scene color information, so that a realistic fluff effect of the object to be rendered is rendered quickly and conveniently.
Referring to step S102, in an embodiment of the present invention, the fluff rendering related resources obtained for the object to be rendered may include a model basic color map resource (Albedo), a fluff noise map resource (FurNoise), a fluff direction map resource in the model tangent space (FurDirection), a model tangent-space normal map resource (Normal), a model roughness-metallization map resource (RoughnessMetallic), a model resource containing normal and tangent data, and the like. In the fluff direction map in the model tangent space, the RG channels record the fluff direction and the B channel records the fluff length, which can also mark the fluff area of the object to be rendered. The fluff noise map may be a black-and-white noise map used to produce the fluff shading effect of the object to be rendered. Of course, the fluff rendering related resources may also include other resources, which is not specifically limited in this embodiment of the present invention.
Referring to step S104 above, in the embodiment of the present invention, the analyzed fluff direction in the screen space of the object to be rendered is cached in the second buffer (a GBuffer), and the other fluff attribute information is cached in the first buffer (the BasePass GBuffer), where a GBuffer may also be referred to as a geometry buffer. The other fluff attribute information cached in the first buffer may be used for the illumination calculation, and the screen-space fluff direction cached in the second buffer may be used for the subsequent radial blur processing of the object to be rendered.
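For illustration only, the two buffers can be pictured as the per-pixel structures sketched below in Python. The class and field names are assumptions introduced for this sketch and are not taken from the patent or from any particular engine's G-buffer layout.

from dataclasses import dataclass
import numpy as np

@dataclass
class BasePassGBuffer:
    """First buffer: the other fluff attribute information used for the illumination calculation."""
    basic_color: np.ndarray    # RGB basic color after blending the albedo with the fluff noise map
    world_normal: np.ndarray   # world-space normal direction
    roughness: float
    metallization: float

@dataclass
class FluffDirectionBuffer:
    """Second buffer: consumed later by the radial blur pass."""
    screen_dir: np.ndarray     # fluff direction in screen (NDC) space, limited to the range -1 to 1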
In an embodiment of the present invention, referring to fig. 2, the process of analyzing the fluff direction in the screen space of the object to be rendered according to the fluff rendering related resources may include steps S1041 to S1044.
Step S1041, extracting RG channel value and B channel value from the model tangent space normal map resource.
Step S1042, calculating a fluff direction in the tangent space based on the extracted RG channel value.
In the embodiment of the invention, the RG channels of the fluff direction map in the model tangent space record the fluff direction, so the fluff direction in the tangent space can be effectively calculated from the RG channel values. The tangent-space fluff direction has three components XYZ, where XY = RG * 2 - 1 and Z = sqrt(1 - XY·XY).
Step S1043, multiplying the fluff direction in the tangent space by the extracted B channel value and a preset fluff length coefficient to obtain the real fluff direction in the tangent space.
The preset fluff length coefficient in the embodiment of the present invention may be set by a user, and the embodiment of the present invention does not limit a specific numerical value of the fluff length coefficient.
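As a concrete illustration of steps S1041 to S1043, the Python sketch below decodes a tangent-space fluff direction from sampled RG and B channel values and scales it by the preset fluff length coefficient. The function name, the clamping, and the treatment of the inputs as values in [0, 1] are assumptions made for this sketch.

import numpy as np

def decode_fluff_direction(rg, b, fluff_length_coeff):
    # Remap the RG channels from [0, 1] to [-1, 1] to obtain the XY components.
    xy = np.asarray(rg, dtype=float) * 2.0 - 1.0
    # Complete the Z component so that (X, Y, Z) forms a unit direction.
    z = np.sqrt(max(0.0, 1.0 - float(np.dot(xy, xy))))
    direction = np.array([xy[0], xy[1], z])
    # Scale by the B channel (fluff length) and the preset fluff length coefficient
    # to obtain the "real" fluff direction in tangent space.
    return direction * float(b) * fluff_length_coeff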
Step S1044, converting the real fluff direction in the tangent space to the fluff direction in the screen space.
In an embodiment of the present invention, the fluff direction in screen space may also be referred to as the fluff direction in NDC (Normalized Device Coordinates) space. In the process of executing step S1044 to convert the real fluff direction in the tangent space to the fluff direction in the screen space, several space conversions need to be applied to the real fluff direction in the tangent space so that the result is limited to the range of -1 to 1, which facilitates the subsequent radial blur processing.
First, the real fluff direction in the tangent space is converted to the real fluff direction in the world space. In this embodiment, a TBN matrix composed of the tangent, bitangent and normal vectors of the tangent space may be used to convert the real fluff direction in the tangent space to the real fluff direction in the world space. Then, the real fluff direction in the world space is converted to the real fluff direction in the projection space. In this embodiment, the real fluff direction in the world space may be multiplied by the view matrix and the projection matrix to obtain the real fluff direction in the projection space. Finally, the real fluff direction in the projection space is converted to the fluff direction in the screen space by perspective division. The fluff direction finally converted into the screen space is limited to the range of -1 to 1.
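The chain of conversions just described can be sketched as follows. The matrix conventions (column vectors, a direction transformed with w = 0) and the guard against a near-zero w are assumptions for this sketch; a concrete engine may use different conventions.

import numpy as np

def fluff_direction_to_screen(dir_tangent, tangent, bitangent, normal, view, proj):
    # Tangent space -> world space using the TBN matrix built from the
    # tangent, bitangent and normal vectors.
    tbn = np.column_stack([tangent, bitangent, normal])
    dir_world = tbn @ dir_tangent

    # World space -> projection space: multiply by the view and projection matrices.
    dir_world_h = np.append(dir_world, 0.0)          # w = 0: a direction, not a position
    dir_proj = proj @ (view @ dir_world_h)

    # Projection space -> screen (NDC) space by perspective division; the result
    # is expected to lie roughly within the range -1 to 1.
    w = dir_proj[3] if abs(dir_proj[3]) > 1e-6 else 1.0
    return dir_proj[:2] / w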
In an embodiment of the present invention, when step S104 is executed to analyze the other fluff attribute information of the object to be rendered according to the fluff rendering related resources, if the other fluff attribute information includes the basic color of the object to be rendered, the basic color information may first be extracted from the model basic color map resource and the noise color information may be extracted from the fluff noise map resource. Then, the basic color information and the noise color information are blended to obtain the basic color (BaseColor) of the object to be rendered. For example, the basic color information and the noise color information may be multiplied together to obtain the basic color of the object to be rendered.
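A minimal sketch of this blend, assuming the multiplicative mix given as the example above; the function name is illustrative, and other blend modes would equally satisfy the description.

import numpy as np

def blend_basic_color(albedo_rgb, fluff_noise_rgb):
    # Multiply the sampled basic color by the sampled fluff noise color.
    return np.asarray(albedo_rgb, dtype=float) * np.asarray(fluff_noise_rgb, dtype=float)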
In another embodiment of the present invention, when step S104 is executed to analyze the other fluff attribute information of the object to be rendered according to the fluff rendering related resources, if the other fluff attribute information includes the world-space normal direction, the tangent-space normal direction may be extracted from the model tangent-space normal map. The light sources placed in a game scene are usually defined in world space, so a correct illumination calculation requires the tangent-space normal direction to be converted into the world-space normal direction; accordingly, the tangent-space normal direction is converted to the world-space normal direction.
In another embodiment of the present invention, if the other fluff attribute information includes the roughness and metallization degree of the object to be rendered, when step S104 is executed to analyze the other fluff attribute information of the object to be rendered according to the fluff rendering related resources, the roughness and metallization degree of the object to be rendered can be analyzed directly from the model roughness-metallization map resource.
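The attributes cached in the first buffer (basic color, world-space normal, roughness, metallization degree) feed the illumination calculation of step S106. The patent does not specify a lighting model, so the sketch below uses a generic single-light metallic-workflow evaluation (Lambert diffuse plus GGX specular) purely as an assumed example of how such attributes could yield the scene color.

import numpy as np

def shade_pixel(basic_color, n, roughness, metallization, l, v, light_color):
    # Normalize the geometric vectors (surface normal, light and view directions).
    n = n / np.linalg.norm(n)
    l = l / np.linalg.norm(l)
    v = v / np.linalg.norm(v)
    h = (l + v) / np.linalg.norm(l + v)

    n_dot_l = max(float(n @ l), 0.0)
    n_dot_v = max(float(n @ v), 1e-4)
    n_dot_h = max(float(n @ h), 0.0)
    v_dot_h = max(float(v @ h), 0.0)

    # GGX normal distribution, Smith geometry and Schlick Fresnel terms.
    a2 = (roughness * roughness) ** 2
    d = a2 / (np.pi * ((n_dot_h * n_dot_h) * (a2 - 1.0) + 1.0) ** 2)
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_l / (n_dot_l * (1.0 - k) + k)) * (n_dot_v / (n_dot_v * (1.0 - k) + k))
    f0 = 0.04 * (1.0 - metallization) + np.asarray(basic_color) * metallization
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

    specular = d * g * f / max(4.0 * n_dot_l * n_dot_v, 1e-4)
    diffuse = (1.0 - metallization) * np.asarray(basic_color) / np.pi
    return (diffuse + specular) * np.asarray(light_color) * n_dot_l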
Referring to step S108 above, in an embodiment of the present invention, referring to fig. 3, the process of performing the radial blurring processing on the object to be rendered based on the scene color (SceneColor) information and the direction of the fluff in the screen space includes steps S1081 to S1083.
Step S1081, determining a radial blurring direction according to a fluff direction in a screen space.
In the embodiment of the present invention, the direction of the current fluff on the screen can be sampled from the directions of the fluff in the cached screen space, and the sampled direction is used as the radial blurring direction.
Step S1082, starting from any pixel of the current screen, stepping along the radial blur direction at the preset distance interval for a preset number of sampling steps, to obtain sampling points of the preset number.
In this step, the preset number of samples, that is, the preset radial blur sample count, may be set in advance by the user, and the embodiment of the present invention does not limit its specific value.
In an embodiment of the present invention, the preset distance interval may be the product of a preset image blur distance coefficient and the length of the fluff in the screen space. The preset image blur distance coefficient may also be set in advance by the user, and the embodiment of the present invention does not limit its specific value.
And step S1083, scene color information corresponding to the sampling point is obtained, the obtained scene color information is averaged, and a fluff pixel value of the object to be rendered is determined according to the average value.
In the embodiment of the present invention, since there are the preset number of sampling points, the preset number of scene color samples are acquired. These colors are averaged, the fluff pixel value of the object to be rendered is determined from the average value, and the obtained fluff pixel value embodies the radial blur result of the fluff. Therefore, the embodiment of the invention can render the fluff effect of the object to be rendered effectively and quickly.
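Putting steps S1081 to S1083 together, one possible per-pixel sketch is given below. Treating the screen-space fluff direction in pixel units, the boundary clamping, and the default parameter values are assumptions made only so that the sketch is self-contained; in a real deferred pipeline this work would run in the added render pass over the full screen.

import numpy as np

def radial_blur_fluff_pixel(scene_color, fluff_dir_screen, pixel_xy,
                            num_samples=8, blur_distance_coeff=1.0):
    # scene_color: H x W x 3 array of lit scene colors (output of the illumination calculation).
    # fluff_dir_screen: screen-space fluff direction at this pixel, used as the radial blur direction.
    h, w, _ = scene_color.shape
    fluff_dir = np.asarray(fluff_dir_screen, dtype=float)
    fluff_len = float(np.linalg.norm(fluff_dir))
    if fluff_len < 1e-6:
        # No fluff at this pixel: keep the lit scene color unchanged.
        return scene_color[pixel_xy[1], pixel_xy[0]]

    # Preset distance interval = preset image blur distance coefficient x on-screen fluff length,
    # applied along the normalized radial blur direction.
    step = (fluff_dir / fluff_len) * (blur_distance_coeff * fluff_len)

    accum = np.zeros(3)
    for i in range(1, num_samples + 1):
        pos = np.asarray(pixel_xy, dtype=float) + i * step
        x = int(np.clip(pos[0], 0, w - 1))
        y = int(np.clip(pos[1], 0, h - 1))
        accum += scene_color[y, x]

    # The averaged scene colors give the fluff pixel value of the object to be rendered.
    return accum / num_samples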
Based on the same inventive concept, the embodiment of the invention also provides a fluff rendering device based on radial blurring. Fig. 4 shows a schematic structural diagram of a radial blur-based fluff rendering apparatus according to an embodiment of the present invention, and referring to fig. 4, the radial blur-based fluff rendering apparatus includes an obtaining module 410, an analyzing module 420, a calculating module 430, and a processing module 440.
An obtaining module 410 adapted to obtain a fluff rendering related resource of an object to be rendered;
the analysis module 420 is adapted to analyze the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, cache the other fluff attribute information to the first buffer area, and cache the fluff direction in the screen space to the second buffer area;
the calculating module 430 is adapted to perform illumination calculation based on the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered;
the processing module 440 is adapted to perform radial blurring processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached by the second buffer to obtain a fluff pixel value of the object to be rendered.
In an embodiment of the present invention, the fluff rendering related resources of the object to be rendered include at least one of a model basic color map resource, a fluff noise map resource, a fluff direction map resource in the model tangent space, a model tangent-space normal map resource, a model roughness-metallization map resource, and a model resource containing normal and tangent data.
In an embodiment of the present invention, the analysis module 420 is further adapted to: extract RG channel values and B channel values from the model tangent-space normal map resource; calculate the fluff direction in the tangent space based on the extracted RG channel values; multiply the fluff direction in the tangent space by the extracted B channel value and a preset fluff length coefficient to obtain the real fluff direction in the tangent space; and convert the real fluff direction in the tangent space to the fluff direction in the screen space.
In an embodiment of the present invention, the analysis module 420 is further adapted to: after the real fluff direction in the tangent space is converted into the real fluff direction in the world space, the real fluff direction in the world space is converted into the real fluff direction in the projection space; and converting the real fluff direction in the projection space into the fluff direction in the screen space by adopting perspective division.
In an embodiment of the present invention, the other fluff attribute information includes a basic color of the object to be rendered, and the analysis module 420 is further adapted to: extract basic color information from the model basic color map resource and noise color information from the fluff noise map resource; and blend the basic color information and the noise color information to obtain the basic color of the object to be rendered.
In an embodiment of the present invention, the other fluff attribute information further includes a world-space normal direction, and the analysis module 420 is further adapted to: extract the tangent-space normal direction from the model tangent-space normal map and convert it to the world-space normal direction.
In an embodiment of the present invention, the other fluff attribute information further includes the roughness and metallization degree of the object to be rendered, and the analysis module 420 is further adapted to: analyze the roughness and the metallization degree of the object to be rendered from the model roughness-metallization map resource.
In an embodiment of the present invention, the processing module 440 is further adapted to: determine a radial blur direction according to the fluff direction in the screen space; starting from any pixel of the current screen, step along the radial blur direction at a preset distance interval for a preset number of sampling steps to obtain sampling points of the preset number; and acquire the scene color information corresponding to the sampling points, average the acquired scene color information, and determine a fluff pixel value of the object to be rendered according to the average value.
In an embodiment of the present invention, the preset distance interval includes: the product of a preset image blur distance coefficient and the length of the fluff in the screen space.
Based on the same inventive concept, embodiments of the present invention also provide a computer storage medium storing computer program code which, when run on a computing device, causes the computing device to execute the radial blur-based fluff rendering method of any of the above embodiments.
Based on the same inventive concept, an embodiment of the present invention further provides a computing device, including: a processor; and a memory storing computer program code which, when executed by the processor, causes the computing device to perform the radial blur-based fluff rendering method of any of the above embodiments.
It is clear to those skilled in the art that the specific working processes of the above-described systems, devices, modules and units may refer to the corresponding processes in the foregoing method embodiments, and for the sake of brevity, further description is omitted here.
In addition, the functional units in the embodiments of the present invention may be physically independent of each other, two or more functional units may be integrated together, or all the functional units may be integrated in one processing unit. The integrated functional units may be implemented in the form of hardware, or in the form of software or firmware.
Those of ordinary skill in the art will understand that: the integrated functional units, if implemented in software and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computing device (e.g., a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention when the instructions are executed. And the aforementioned storage medium includes: u disk, removable hard disk, Read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disk, and other various media capable of storing program code.
Alternatively, all or part of the steps of implementing the foregoing method embodiments may be implemented by hardware (such as a computing device, e.g., a personal computer, a server, or a network device) associated with program instructions, which may be stored in a computer-readable storage medium, and when the program instructions are executed by a processor of the computing device, the computing device executes all or part of the steps of the method according to the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments can be modified or some or all of the technical features can be equivalently replaced within the spirit and principle of the present invention; such modifications or substitutions do not depart from the scope of the present invention.

Claims (12)

1. A radial blur-based fluff rendering method, comprising the following steps:
acquiring fluff rendering related resources of an object to be rendered;
analyzing the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, caching the other fluff attribute information to a first buffer area, and caching the fluff direction in the screen space to a second buffer area; wherein the other fluff attribute information comprises at least one of basic color, world space normal direction, roughness and metallization degree of the object to be rendered;
performing illumination calculation based on the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered;
and performing radial blur processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached in the second buffer area to obtain a fluff pixel value of the object to be rendered.
2. The method of claim 1, wherein the fluff rendering related resources of the object to be rendered comprise:
at least one of a model basic color map resource, a fluff noise map resource, a fluff direction map resource in the model tangent space, a normal map resource in the model tangent space, a model roughness-metallization map resource, and a model resource containing normal and tangent data.
3. The method of claim 2, wherein analyzing the fluff direction in the screen space of the object to be rendered according to the fluff rendering related resources comprises:
extracting RG channel values and B channel values from the model tangent space normal map resources;
calculating to obtain the fluff direction in a tangent space based on the extracted RG channel value;
multiplying the fluff direction in the tangent space by the extracted B channel value and a preset fluff length coefficient to obtain the real fluff direction in the tangent space;
and converting the real fluff direction in the tangent space to the fluff direction in the screen space.
4. The method of claim 3, wherein converting the real fluff direction in the tangent space to the fluff direction in the screen space comprises:
converting the real fluff direction in the tangent space to the real fluff direction in the world space, and converting the real fluff direction in the world space to the real fluff direction in the projection space;
and converting the real fluff direction in the projection space to the fluff direction in the screen space by adopting perspective division.
5. The method according to any one of claims 2-4, wherein if the other fluff attribute information includes a basic color of the object to be rendered, analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resources includes:
extracting basic color information from the model basic color mapping resources, and extracting noise color information from the noise map resources of the fluff;
and performing mixed calculation on the basic color information and the noise color information to obtain the basic color of the object to be rendered.
6. The method according to any one of claims 2-4, wherein if the other fluff attribute information includes a world space normal direction, analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resources includes:
and extracting the normal direction of the tangent space from the model tangent space normal map, and converting the normal direction of the tangent space into the normal direction of the world space.
7. The method according to any one of claims 2 to 4, wherein if the other fluff attribute information includes the roughness and metallization degree of the object to be rendered, analyzing the other fluff attribute information of the object to be rendered according to the fluff rendering related resources includes:
analyzing the roughness and the metallization degree of the object to be rendered from the model roughness-metallization map resource.
8. The method according to any one of claims 1 to 4, wherein performing radial blurring on an object to be rendered based on the scene color information and a fluff direction in the screen space cached by the second buffer to obtain a fluff pixel value of the object to be rendered comprises:
determining a radial blurring direction according to the fluff direction in the screen space;
starting from any pixel of the current screen, stepping along the radial blur direction at a preset distance interval for a preset number of sampling steps, to obtain sampling points of the preset number;
and acquiring scene color information corresponding to the sampling points, averaging the acquired scene color information, and determining a fluff pixel value of the object to be rendered according to the average value.
9. The method of claim 8, wherein,
the preset distance interval includes: the product of a preset image blur distance coefficient and the length of the fluff in the screen space.
10. A radial blur based fluff rendering apparatus, comprising:
the acquisition module is used for acquiring fluff rendering related resources of an object to be rendered;
the analysis module is used for analyzing the fluff direction and other fluff attribute information in the screen space of the object to be rendered according to the fluff rendering related resources, caching the other fluff attribute information to a first buffer area, and caching the fluff direction in the screen space to a second buffer area; wherein the other fluff attribute information comprises at least one of basic color, world space normal direction, roughness and metallization degree of the object to be rendered;
the calculation module is used for performing illumination calculation on the basis of the other fluff attribute information cached in the first buffer area to obtain scene color information of the object to be rendered;
and the processing module is used for carrying out radial fuzzy processing on the object to be rendered based on the scene color information and the fluff direction in the screen space cached by the second buffer area to obtain a fluff pixel value of the object to be rendered.
11. A computer storage medium storing computer program code which, when run on a computing device, causes the computing device to perform the radial blur based fluff rendering method of any of claims 1-9.
12. A computing device, comprising: a processor; a memory storing computer program code; the computer program code, when executed by the processor, causes the computing device to perform the radial blur-based fluff rendering method according to any of claims 1-9.
CN202010631548.1A 2020-07-03 2020-07-03 Radial fuzzy-based fluff rendering method and device and storage medium Active CN111862290B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010631548.1A CN111862290B (en) 2020-07-03 2020-07-03 Radial fuzzy-based fluff rendering method and device and storage medium
PCT/CN2020/130324 WO2022000953A1 (en) 2020-07-03 2020-11-20 Fluff rendering method and device based on radial blurring, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010631548.1A CN111862290B (en) 2020-07-03 2020-07-03 Radial fuzzy-based fluff rendering method and device and storage medium

Publications (2)

Publication Number Publication Date
CN111862290A CN111862290A (en) 2020-10-30
CN111862290B true CN111862290B (en) 2021-05-11

Family

ID=73152111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010631548.1A Active CN111862290B (en) 2020-07-03 2020-07-03 Radial fuzzy-based fluff rendering method and device and storage medium

Country Status (2)

Country Link
CN (1) CN111862290B (en)
WO (1) WO2022000953A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111862290B (en) * 2020-07-03 2021-05-11 完美世界(北京)软件科技发展有限公司 Radial fuzzy-based fluff rendering method and device and storage medium
CN114693856B (en) * 2022-05-30 2022-09-09 腾讯科技(深圳)有限公司 Object generation method and device, computer equipment and storage medium
CN116883567A (en) * 2023-07-07 2023-10-13 上海散爆信息技术有限公司 Fluff rendering method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982575A (en) * 2012-11-29 2013-03-20 杭州挪云科技有限公司 Hair rendering method based on ray tracking
CN104268922A (en) * 2014-09-03 2015-01-07 广州博冠信息科技有限公司 Image rendering method and device
CN104574479A (en) * 2015-01-07 2015-04-29 北京科艺有容科技有限责任公司 Rapid generating method for bird single feathers in three-dimensional animation
CN107204036A (en) * 2016-03-16 2017-09-26 腾讯科技(深圳)有限公司 The method and apparatus for generating hair image

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758046A (en) * 1995-12-01 1998-05-26 Lucas Digital, Ltd. Method and apparatus for creating lifelike digital representations of hair and other fine-grained images
CN102682420A (en) * 2012-03-31 2012-09-19 北京百舜华年文化传播有限公司 Method and device for converting real character image to cartoon-style image
US10365716B2 (en) * 2013-03-15 2019-07-30 Interaxon Inc. Wearable computing apparatus and method
CN106575445B (en) * 2014-09-24 2021-02-05 英特尔公司 Fur avatar animation
CN108510500B (en) * 2018-05-14 2021-02-26 深圳市云之梦科技有限公司 Method and system for processing hair image layer of virtual character image based on human face skin color detection
CN110060321B (en) * 2018-10-15 2022-11-25 叠境数字科技(上海)有限公司 Real material based quick real-time hair rendering method
CN110136238B (en) * 2019-04-02 2023-06-23 杭州小影创新科技股份有限公司 AR drawing method combined with physical illumination model
CN110648386B (en) * 2019-07-23 2024-01-09 完美世界(北京)软件科技发展有限公司 Method and system for antialiasing of primitives
CN111862290B (en) * 2020-07-03 2021-05-11 完美世界(北京)软件科技发展有限公司 Radial fuzzy-based fluff rendering method and device and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982575A (en) * 2012-11-29 2013-03-20 杭州挪云科技有限公司 Hair rendering method based on ray tracking
CN104268922A (en) * 2014-09-03 2015-01-07 广州博冠信息科技有限公司 Image rendering method and device
CN104574479A (en) * 2015-01-07 2015-04-29 北京科艺有容科技有限责任公司 Rapid generating method for bird single feathers in three-dimensional animation
CN107204036A (en) * 2016-03-16 2017-09-26 腾讯科技(深圳)有限公司 The method and apparatus for generating hair image

Also Published As

Publication number Publication date
CN111862290A (en) 2020-10-30
WO2022000953A1 (en) 2022-01-06

Similar Documents

Publication Publication Date Title
CN111862290B (en) Radial fuzzy-based fluff rendering method and device and storage medium
WO2018176938A1 (en) Method and device for extracting center of infrared light spot, and electronic device
CN107154063B (en) Method and device for setting shape of image display area
CN111311523B (en) Image processing method, device and system and electronic equipment
US11748986B2 (en) Method and apparatus for recognizing key identifier in video, device and storage medium
US11776202B2 (en) Image processing method and apparatus, computer storage medium, and electronic device
CN112652046B (en) Game picture generation method, device, equipment and storage medium
CN112019827B (en) Method, device, equipment and storage medium for enhancing video image color
KR20220113686A (en) Image processing method and apparatus, image processing model training method and apparatus
CN111353955A (en) Image processing method, device, equipment and storage medium
CN114719966A (en) Light source determination method and device, electronic equipment and storage medium
CN110363837B (en) Method and device for processing texture image in game, electronic equipment and storage medium
WO2020231016A1 (en) Image optimization method, apparatus, device and storage medium
KR101215666B1 (en) Method, system and computer program product for object color correction
Zhao et al. Efficient image decolorization with a multimodal contrast-preserving measure
CN111429371A (en) Image processing method and device and terminal equipment
Choi et al. A method for fast multi-exposure image fusion
CN109886864B (en) Privacy mask processing method and device
CN108734712B (en) Background segmentation method and device and computer storage medium
Zhang et al. A ViBe based moving targets edge detection algorithm and its parallel implementation
CN113506305A (en) Image enhancement method, semantic segmentation method and device for three-dimensional point cloud data
CN111882498A (en) Image processing method, image processing device, electronic equipment and storage medium
JP2006050070A (en) Image processing method, apparatus and program thereof
CN111754417A (en) Noise reduction method and device for video image, video matting method and device and electronic system
Jia et al. A multi-scale patch-wise algorithm for multi-exposure image fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant