CN111951369A - Method and device for processing detail texture - Google Patents

Method and device for processing detail texture

Info

Publication number
CN111951369A
CN111951369A
Authority
CN
China
Prior art keywords
texture
mask
detail
textures
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010905793.7A
Other languages
Chinese (zh)
Other versions
CN111951369B (en)
Inventor
马克思米兰·罗兹勒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010905793.7A priority Critical patent/CN111951369B/en
Publication of CN111951369A publication Critical patent/CN111951369A/en
Application granted granted Critical
Publication of CN111951369B publication Critical patent/CN111951369B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques

Abstract

An embodiment of the invention provides a method and a device for processing detail textures. The method includes: acquiring a target detail texture from a plurality of detail textures stored in a preset texture array; determining a target mask texture from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relation, where the preset gray scale intervals of the mask textures respectively correspond to sub-intervals of a single texture channel's gray scale range; and fusing the target detail texture, the target mask texture, and a preset main texture. By using several different detail textures, the surface detail of a model can be represented more faithfully; mask textures stored in a single texture channel define where each detail texture is applied, and a blending range defines how strongly it is applied, which helps keep sampler usage and memory within budget.

Description

Method and device for processing detail texture
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for processing a detail texture.
Background
Rendering is the final stage of CG (Computer Graphics), in which an image or model is made consistent with its 3D scene. In real-time rendering it is common to use a detail texture to improve the close-up appearance of a model; essentially all major game engines, such as Unity, UDK, Godot, Unreal, Lumberyard, and CryEngine, provide built-in support for detail textures.
Under a PBR (Physically Based Rendering) shading model, artists typically tile a detail texture many times across the surface of the model, so that the tiled detail texture is blended with the original main texture; this enriches the surface detail visible at close range and improves the visual effect. However, implementing a detail-texture mask requires an additional texture channel to define the area where the detail texture should be applied; most engines do not provide a mask layer to define the location and strength of the applied detail texture, and all of them support only a single layer of detail texture.
Because of these performance limitations, artists generally settle for one generic detail texture applied across the entire model. A model, however, usually combines several materials and surface types, such as metal, wood, paint, rubber, plastic, or textile; these surface structures and material properties differ widely, and a generic detail texture may be incompatible with some of them.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a method for processing a detail texture and a corresponding device for processing a detail texture that overcome or at least partially solve the above problems.
In order to solve the above problem, an embodiment of the present invention discloses a method for processing a detail texture, where the method includes:
acquiring a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
determining a target mask texture from a plurality of preset mask textures according to a first index and a preset mapping relation of the target detail texture; the preset gray scale intervals of the mask textures respectively correspond to the sub-gray scale intervals of the single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals; the preset mapping relation is the mapping relation between the second index and the first index; the second index of the target mask texture matches the first index of the target detail texture;
and fusing the target detail texture, the target mask texture and a preset main texture.
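Illustratively, the three steps above can be sketched as follows; this is a minimal sketch under assumed names and a simple linear blend, not the patent's actual shader code:

```python
# Sketch of the claimed flow: (1) fetch the target detail texture by its
# first index, (2) map the first index to a second index to find the
# target mask texture, (3) fuse detail, mask, and main texture.
# All names and the linear blend are illustrative assumptions.

def process_detail_texture(texture_array, mask_textures, index_map,
                           first_index, main_texel):
    detail_texel = texture_array[first_index]   # step 1: target detail texture
    second_index = index_map[first_index]       # preset first -> second mapping
    mask_value = mask_textures[second_index]    # step 2: target mask texture (0..1)
    # step 3: the mask value acts as the blend weight between main and detail
    return main_texel * (1.0 - mask_value) + detail_texel * mask_value
```

In a real shader the lookups would be texture samples per pixel; here scalars stand in for texels to keep the sketch self-contained.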
Optionally, the sub-gray scale interval is obtained by dividing the gray scale interval of the single texture channel based on the number of the mask textures.
Optionally, the fusing the target detail texture, the target mask texture, and a preset main texture includes:
determining a mixing range according to the target mask texture;
and fusing the target detail texture and the preset main texture by adopting the mixing range.
Optionally, the fusing the target detail texture, the target mask texture, and a preset main texture includes:
determining a mixing range and texture intensity according to the target mask texture;
and fusing the target detail texture and a preset main texture by adopting the mixing range and the texture intensity.
Optionally, the fusing the target detail texture, the target mask texture, and a preset main texture includes:
and expanding the sub-gray scale interval corresponding to the target mask texture into a preset gray scale interval, and fusing the target detail texture, the target mask texture with the preset gray scale interval and a preset main texture.
The embodiment of the invention also discloses a method for processing the detail texture, which comprises the following steps:
acquiring a plurality of detail textures, and storing the detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
obtaining a plurality of mask textures, and establishing corresponding relations between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals;
and establishing a mapping relation between the first index and the second index so as to be used for fusing the detail texture, the mask texture and a preset main texture respectively according to the mapping relation.
Optionally, the establishing a correspondence between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel includes:
dividing the gray level interval corresponding to the single texture channel into a plurality of sub-gray level intervals based on the number of the mask textures;
and establishing corresponding relations between the plurality of mask textures and the plurality of sub-gray scale intervals respectively.
Optionally, the dividing the gray scale interval corresponding to the single texture channel into a plurality of sub-gray scale intervals based on the number of the mask textures includes:
obtaining a division interval value for dividing the gray scale interval, by dividing the gray scale interval corresponding to the single texture channel by the number of the mask textures;
and dividing the gray level interval by adopting the division interval value to obtain a plurality of sub gray level intervals.
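For instance, assuming an even division (the function name and the even split are illustrative assumptions, not mandated by the claims), dividing a channel's 0-255 range among N mask textures might look like:

```python
def divide_gray_interval(num_masks, gray_max=255):
    # division interval value: width of each sub-interval for an even split
    step = (gray_max + 1) // num_masks
    # one (low, high) sub-gray-scale interval per mask texture
    return [(i * step, (i + 1) * step - 1) for i in range(num_masks)]
```

With four mask textures this yields the sub-intervals 0-63, 64-127, 128-191, and 192-255.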
Optionally, the method further comprises:
determining a proportional relation between a sub-gray level interval corresponding to any one of the plurality of mask textures and a gray level interval of pixels contained in the mask texture;
and adjusting the gray scale of the pixels contained in the corresponding mask texture based on the proportional relation to obtain the adjusted mask texture.
Optionally, the method further comprises:
and determining a second index of the corresponding mask texture according to the sub-gray level interval corresponding to the pixel contained in any adjusted mask texture.
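Assuming an even division of the 0-255 range, a mask texture's second index could be recovered from the sub-interval its pixels occupy; the helper below is hypothetical, not taken from the patent:

```python
def second_index_from_gray(gray_value, num_masks, gray_max=255):
    # with an even split, each sub-gray-scale interval has this width
    step = (gray_max + 1) // num_masks
    # the sub-interval containing the gray value identifies the second index
    return min(gray_value // step, num_masks - 1)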
The embodiment of the invention also discloses a device for processing the detail texture, which comprises:
the target detail texture acquisition module is used for acquiring a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
the target mask texture determining module is used for determining a target mask texture from a plurality of preset mask textures according to the first index of the target detail texture and a preset mapping relation; the preset gray scale intervals of the plurality of mask textures respectively correspond to the plurality of sub-gray scale intervals of the single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals; the preset mapping relation is the mapping relation between the second index and the first index; the second index of the target mask texture matches the first index of the target detail texture;
and the texture fusion module is used for fusing the target detail texture, the target mask texture and a preset main texture.
Optionally, the sub-gray scale interval is obtained by dividing the gray scale interval of the single texture channel based on the number of the mask textures.
Optionally, the texture fusion module comprises:
a blending range determining submodule for determining a blending range according to the target mask texture;
and the first texture fusion submodule is used for fusing the target detail texture and the preset main texture by adopting the mixing range.
Optionally, the texture fusion module comprises:
the texture intensity determining submodule is used for determining a mixing range and texture intensity according to the target mask texture;
and the second texture fusion sub-module is used for fusing the target detail texture and the preset main texture by adopting the mixing range and the texture intensity.
Optionally, the texture fusion module comprises:
and the third texture fusion submodule is used for expanding the sub-gray scale interval corresponding to the target mask texture into a preset gray scale interval and fusing the target detail texture, the target mask texture with the preset gray scale interval and a preset main texture.
The embodiment of the invention also discloses a device for processing the detail texture, which comprises:
the detail texture storage module is used for acquiring a plurality of detail textures and storing the detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
the corresponding relation establishing module is used for acquiring a plurality of mask textures and establishing corresponding relations between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals;
and the mapping relation establishing module is used for establishing the mapping relation between the first index and the second index so as to respectively fuse the detail texture, the mask texture and a preset main texture according to the mapping relation.
Optionally, the correspondence relationship establishing module includes:
a sub-gray interval division submodule for dividing the gray interval corresponding to the single texture channel into a plurality of sub-gray intervals based on the number of the mask textures;
and the corresponding relation establishing submodule is used for establishing corresponding relations between the plurality of mask textures and the plurality of sub-gray scale intervals respectively.
Optionally, the sub-gray scale interval division sub-module includes:
a division interval value determining unit, configured to obtain a division interval value used for dividing the gray interval by using a divisor of the number of gray intervals and mask textures corresponding to the single texture channel;
and the sub-gray interval dividing unit is used for dividing the gray interval by adopting the dividing interval value to obtain a plurality of sub-gray intervals.
Optionally, the corresponding relationship establishing module further includes:
the proportion relation determining submodule is used for determining the proportion relation between a sub-gray level interval corresponding to any mask texture in the plurality of mask textures and a gray level interval of pixels contained in the mask texture;
and the pixel gray level adjusting submodule is used for adjusting the gray level of the pixel contained in the corresponding mask texture based on the proportional relation to obtain the adjusted mask texture.
Optionally, the corresponding relationship establishing module further includes:
and the second index determining submodule is used for determining a second index of the corresponding mask texture according to the sub-gray level interval corresponding to the pixel contained in any adjusted mask texture.
The embodiment of the invention also discloses an electronic device, which comprises: a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing the steps of the method of processing the detail texture.
The embodiment of the invention also discloses a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and the computer program realizes the steps of the processing method of the detail texture when being executed by a processor.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, a target detail texture is obtained from a plurality of detail textures stored in a preset texture array, and the target mask texture is determined from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relation, wherein the gray scale intervals of the plurality of preset mask textures can respectively correspond to a plurality of sub-gray scale intervals of a single texture channel, the preset mapping relation refers to the mapping relation between the first index of the detail texture and a second index of the mask texture, and finally the target detail texture, the target mask texture and a preset main texture can be fused. The method has the advantages that the surface details of the model are better represented by using a plurality of different detail textures, so that the visual effect and consistency of the 3D real-time model are greatly improved, the mask textures in a single texture channel are used for defining the application area of the detail textures, the application intensity of the detail textures is defined by adopting a detail mixing mask range, the occupied memory of the single texture channel is small, and the performance of the sampler is favorably ensured.
Drawings
FIG. 1 is a flowchart illustrating a method for processing a detail texture according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a detail texture with no intensity variation applied in an embodiment of the present invention;
FIG. 3 is a diagram illustrating fusion of detail textures in the related art;
fig. 4A to 4B are schematic diagrams illustrating a process of restoring a sub-gray scale interval according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a method for processing a detail texture according to another embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating the division of sub-gray scale intervals according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating the determination of a second index according to an embodiment of the present invention;
FIG. 8 is a process diagram of a method for processing detail textures in an embodiment of the invention;
FIG. 9 is a block diagram of a detail texture processing apparatus according to an embodiment of the present invention;
FIG. 10 is a block diagram of a detail texture processing apparatus according to another embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
To facilitate the understanding of the present invention by the user, the following explanation is made of terms or nouns involved in the following embodiments of the present invention:
3D model: mathematical representation of the surface of a three-dimensional object;
mesh, geometry, 3D surface: a set of terms used to describe the vertex data and triangle indices of the 3D model;
vertex: a point in 3D space containing information about the normal direction, texture coordinates and other mesh properties;
triangular surface: a triangular face is a 3D face formed by a set of three vertices;
texture mapping: a group of pixels for projection onto a 3D surface in real-time rendering;
frame rate: similar to the concept in movies, the number of images (frames) displayed per second;
masking textures: the texture used to define the texture application range and intensity. The texture is a black and white map, and only gray values are used for representing areas and intensity;
coloring model: simulating the effect of illumination on the surface of the object;
a shader: one technique for rendering graphics is to customize the algorithm of the graphics card rendering picture with code, or to use code to tell the GPU how to draw the vertex or pixel color of the model, so that the picture achieves the effect we want;
normal mapping: and describing the normal information of the object by mapping, thereby realizing the light and shadow effect of the details of the object.
In real-time rendering, artists typically use texture mapping to add high frequency details to a 3D model that cannot be geometrically represented. Such small details of a 3D model cannot be made in a conventional rendering pipeline, as this adds much extra work to the graphics pipeline and memory bandwidth, and in practice it is not feasible to render millions of vertices in a single model to achieve a reasonably real-time frame rate without stuttering.
Using texture mapping can solve this problem, but the more textures used, the higher the definition, and the larger the memory. Therefore, the developer must select a texture resolution that balances memory requirements and visual fidelity. However, since players are usually free to move around in the game, viewing the 3D model from various angles and distances, typical texture resolutions often fail to guarantee effectiveness at close distances. At this point, we need the detail texture to increase the level of visual detail at close distances.
Existing implementations rely primarily on the use of a single detail texture, including primarily the size, intensity, mask, detail normal of the detail texture, and various PBR parameters such as albedo color, coarseness/smoothness, and metallization, among others, primarily through the built-in Shader shaders or Shader-graphs of the game engine.
The prior art relies on the use of a single detail texture, which is mainly achieved by multiple tiling of the single detail texture and blending with the original texture. Specifically, the surface effect of the model in the near-viewing process is improved by creating independent detail albedo and normal line mapping textures, namely, the detail textures are sampled by a Shader through an original mapping scaled version or texture coordinates generated by programming, and then the detail textures obtained through sampling are mixed with the original textures, so that the richness of details on the surface of the model in the near-viewing process is increased, and the visual effect is improved. There may be differences between different game engines in the details of the algorithm that mixes albedo and normal mapping, and in the functions that can affect the available for the detail texture settings.
However, the near-view appearance of the model is improved by relying on a single detail texture, when a large number of single detail textures are required, the large number of detail textures occupy more memory, more texture samplers are required in the Shader, and in order to implement the detail texture mask, an additional texture channel is required to be used for defining a detail texture application area to be masked. Because of the limitations described above, existing detailed texture implementations only allow a single detailed texture to be used on all surfaces of a model, but because a model typically contains a variety of different surface types, a single detailed texture can only represent one type of surface, i.e., all surfaces of a model can only use one texture and material. Therefore, the single detail texture that is more common is typically selected on the detail textures for all surfaces in the model, but the common single detail texture may not be compatible with a certain surface.
Referring to fig. 1, a flowchart illustrating steps of a method for processing a detail texture according to an embodiment of the present invention is shown, and the method focuses on a process of using the detail texture (i.e., texture fusion), and may be applied to an engine related to rendering, and specifically may include the following steps:
step 101, obtaining a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
in one embodiment of the present invention, in order to increase the richness of details on the surface of the PBR shading model and to enable different detail textures to be used on each material, the target detail texture may be obtained from a plurality of detail textures stored in a preset texture array.
The PBR rendering model may be a 3D model and the use of detail texture to increase the surface detail of the model may be achieved by texture mapping, which refers to a technique that projects a 2D image onto the 3D model. This process requires that the 3D model have texture coordinates that can create a 2D representation of the 3D model, the textures being used to store high frequency detail and surface information such as color, transparency, and other lighting information, where the 2D image may refer to a detail texture.
Among them, the detail texture is a method capable of increasing the detail of the near-distance surface by mixing a single layer texture with the main texture. A detail texture is a tile texture, i.e. the texture is repeated multiple times on the 3D model, whereas a main texture is, in contrast, usually mapped only once onto the 3D model.
Step 102, determining a target mask texture from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relation;
in one embodiment of the present invention, in addition to acquiring the detail textures, a plurality of mask textures are acquired, so that the application range and the intensity of the detail textures on the model surface are defined by the mask textures.
Specifically, the target mask texture may be determined from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relationship, where the preset mapping relationship may refer to a mapping relationship between a second index and the first index, and the second index of the target mask texture may be matched with the first index of the target detail texture.
The preset gray scale intervals of a plurality of mask textures can respectively correspond to a plurality of sub-gray scale intervals of a single texture channel; the plurality of mask textures may have a second index determined based on the corresponding sub-gray scale interval; the sub-gray scale intervals respectively corresponding to the plurality of mask textures may be obtained by dividing the gray scale interval of a single texture channel based on the number of mask textures.
And 103, fusing the target detail texture, the target mask texture and a preset main texture.
In an embodiment of the present invention, after the target detail texture is obtained, the corresponding second index is determined according to the first index and the mapping relationship, and the corresponding target mask texture for defining the application intensity and range when the detail texture is fused is determined through the second index, the target detail texture, the target mask texture, and the preset main texture may be fused.
The predetermined main texture may refer to a texture different from the entire surface of the PBR colored pattern, and may include metal, wood, paint, rubber, plastic, or fabric, etc., which is not limited by the embodiments of the present invention.
In one case, step 103 may include sub-steps S11 and S12 as follows:
a substep S11 of determining a blending range according to the target mask texture;
and a substep S12 of fusing the target detail texture and a preset main texture using the blending range.
In practical applications, there is no smooth transition in the application between different detail textures, and after sampling the corresponding target detail texture from the texture array and determining the target mask texture, and applying the two textures, the detail texture is applied without intensity change, i.e. without any fusion. Referring to FIG. 2, a schematic diagram of an embodiment of the present invention is shown in which a detail texture without intensity change is applied, and the detail texture is directly cut off when going from one detail texture to another.
At this time, a blending range corresponding to the target mask texture may be determined, and the target detail texture and the preset main texture are fused by using the blending range. The blending range of the target mask texture may be determined according to the gray value of the pixel included in the target mask texture, and the gray value of the pixel included in the target mask texture is positively correlated with the transparency, that is, the higher the gray value of the pixel included in the target mask texture is, the lower the transparency is, and the more obvious the pixel region with low transparency is when the target mask texture is fused with the selected detail texture and the preset main texture.
In another case, step 103 may include sub-steps S13 and S14 as follows:
a substep S13 of determining a blending range and a texture strength according to the target mask texture;
and a substep S14 of fusing the target detail texture and a preset main texture using the blending range and the texture intensity.
In a preferred embodiment, while establishing the corresponding relationship between the plurality of mask textures and the plurality of sub-gray scale intervals, a plurality of blending ranges for the plurality of mask textures can be set, so that when the target mask texture is used, the application range and the application intensity of the target mask texture can be defined through the blending ranges; for convenience of definition, each mask texture may be defined with each blending range.
Specifically, in the related art, when the detail texture is used, it is generally required to be mixed with albedo, normal and roughness textures of the main texture, and a mask texture may also be used to define where the detail texture will appear and the intensity of the detail texture. Referring to fig. 3, a fusion diagram of detail texture in the related art is shown, and in detail masking (i.e., fusion of detail texture and mask texture), the fusion is performed using the entire gray scale interval (0-255) corresponding to a single texture channel.
However, in order to apply the plurality of mask textures using the entire gray scale interval of the single texture channel, i.e., if the plurality of mask textures are required to be applied within the same gray scale interval 0-255, the entire gray scale interval may be divided into smaller sub-gray scale intervals to correspond to the plurality of mask textures, respectively.
In order to respectively correspond the plurality of mask textures to the plurality of sub-gray scale intervals, a proportional relationship between the sub-gray scale interval corresponding to any one of the plurality of mask textures and the gray scale interval of the pixels included in the mask texture can be determined; then based on the proportional relation, adjusting the gray scale of pixels contained in the corresponding mask texture to obtain the adjusted mask texture; the adjusted mask texture has a corresponding relation with the sub-gray scale interval corresponding to the proportional relation. For example, assuming that the gray scale interval of the pixels included in the target mask texture may be 0-255, and the second sub-gray scale interval 65-128 may be selected as the sub-gray scale interval corresponding to any one of the mask textures, the gray scale interval 0-255 may be calculated at this time, the proportional relationship between the gray scale interval and the second sub-gray scale interval 65-128 is 4:1, and the pixels included in the target mask texture are compressed to 1/4 based on the proportional relationship, so that the pixels included in the compressed target mask texture may have a corresponding relationship with the second sub-gray scale interval.
In order to realize a smooth transition between the application of different detail textures, a blend mask can be reconstructed within each sub-interval value range; that is, the sub-gray scale interval corresponding to the target mask texture can be expanded into a preset gray scale interval, and the target detail texture, the target mask texture with the preset gray scale interval, and a preset main texture are fused.
Referring to fig. 4A to 4B, which show schematic diagrams of the process of restoring a sub-gray scale interval in an embodiment of the present invention: after the pixels included in each mask texture and the interval value of each sub-gray scale interval (64 in this example) are obtained, the sub-gray scale interval may be recalculated in the Shader and expanded back to the entire gray scale interval (0-255), and the calculated result is used to fuse the mask textures, so that the result is the same as when a single detail texture is used; that is, the pixels of the target mask texture that were compressed to 1/4 in the above example are re-expanded to their original gray values.
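The restoring step is the inverse of the compression; `expand_mask` below is an illustrative sketch (the real operation runs in the Shader on normalized values):

```python
def expand_mask(pixels, sub_lo, sub_hi, full_lo=0, full_hi=255):
    # Inverse of the compression: stretch [sub_lo, sub_hi] back to 0-255.
    ratio = (full_hi - full_lo) / (sub_hi - sub_lo)
    return [round(full_lo + (p - sub_lo) * ratio) for p in pixels]

# Gray values previously compressed into the second sub-interval 65-128
# are restored to approximately their original 0-255 values; rounding
# during compression accounts for the slight quality loss.
restored = expand_mask([65, 97, 128], sub_lo=65, sub_hi=128)  # -> [0, 130, 255]
```

A mid-gray pixel (128) compressed to 97 comes back as 130 rather than 128, which is the small, visually insignificant loss this scheme accepts.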
It should be noted that adjusting the gray values of the pixels included in the mask textures according to the proportional relationship allows the gray scale interval of a single texture channel to correspond to a plurality of mask textures, reducing memory usage while preserving performance; expanding the gray values of the adjusted mask texture back to the original gray scale interval allows the pixels included in the mask texture to be used for fusion.
In a preferred embodiment, the more mask textures (i.e., sub-gray scale intervals) correspond to one gray scale interval, the more the interval available for blending each detail texture is compressed. When the whole gray scale interval is divided into several small segments (i.e., sub-gray scale intervals), even if each segment is expanded again, the final effect still loses slightly in quality. However, this loss of blending effect is not visually apparent, since the detail texture is only used to represent the fine details of an object, not the main texture. Secondly, individual detail textures cannot be fused with each other directly; each detail texture must be fused with the main texture before being combined with another detail texture.
In the embodiment of the invention, a target detail texture is obtained from a plurality of detail textures stored in a preset texture array, and the target mask texture is determined from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relation, wherein the gray scale intervals of the plurality of preset mask textures can respectively correspond to a plurality of sub-gray scale intervals of a single texture channel, the preset mapping relation refers to the mapping relation between the first index of the detail texture and a second index of the mask texture, and finally the target detail texture, the target mask texture and a preset main texture can be fused. The method has the advantages that the surface details of the model are better represented by the aid of the different detail textures, visual effect and consistency of the 3D real-time model are greatly improved, the application area of the detail textures is defined by the aid of the mask textures in the single texture channel, the application strength of the detail textures is defined by the aid of a mixing range, the occupied memory of the single texture channel is small, and performance of the sampler is guaranteed.
Referring to fig. 5, a flowchart illustrating steps of a method for processing a detail texture according to another embodiment of the present invention is shown, and focusing on a processing procedure of the detail texture (i.e. before performing texture fusion), the method may be applied to rendering-related software used by a rendering person, and specifically may include the following steps:
step 501, obtaining a plurality of detail textures, and storing the detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
in one embodiment of the present invention, in order to increase the richness of details on the surface of the PBR shading model and enable different detail textures to be used on each material, a plurality of detail textures for increasing details on different model surfaces may be first obtained, and a texture array for storing the plurality of detail textures may be created, so that the detail textures to be used may be extracted through the texture array.
Where a texture array refers to a collection of multiple individual textures of the same size and format. The GPU will process them as one object, thereby greatly increasing efficiency. Each texture in the texture array has a separate index ID parameter, which can be referred to as a first index in the embodiment of the invention, and the Shader sampler only needs to use one ID at a time to sample a single texture; and when multiple detail textures are used, the multiple detail textures can be efficiently managed and distributed by using the texture array.
Step 502, obtaining a plurality of mask textures, and establishing corresponding relations between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals;
in one embodiment of the present invention, in addition to acquiring the detail textures, a plurality of mask textures are acquired, so that the application range and the intensity of the detail textures on the model surface are defined by the mask textures.
In practical application, mask textures may be applied through texture channels. Mask textures are generally applied through a plurality of texture channels; however, the setup of multiple texture channels affects Shader performance, and defining the application of the detail textures through many mask textures affects memory. To avoid this, the plurality of mask textures may instead be applied through a single texture channel.
A single texture channel has a corresponding gray scale interval, and the gray scale interval of one texture channel generally corresponds to one mask texture. The gray scale interval can instead be divided into a plurality of sub-gray scale intervals that correspond to the plurality of mask textures respectively, and the corresponding relationship between the sub-gray scale intervals and the mask textures can be established, so that the mask texture to be used can be obtained through the established corresponding relationship.
In one embodiment of the present invention, step 502 may include sub-steps S21 and S22 as follows:
a substep S21 of dividing the gray level interval corresponding to the single texture channel into a plurality of sub-gray level intervals based on the number of the mask textures;
in practical application, the gray scale interval corresponding to a single texture channel can be divided based on the number of the mask textures, specifically, a division interval value for dividing the gray scale interval can be obtained by adopting the gray scale interval corresponding to the single texture channel and the divisor of the number of the mask textures; the gray scale interval may then be divided by a division interval value to obtain a plurality of sub-gray scale intervals.
Referring to fig. 6, which shows a schematic diagram of dividing the sub-gray scale intervals in an embodiment of the present invention: assuming that 4 detail textures are to be used, i.e., the number of detail textures is 4, and each sub-gray scale interval is applied to one detail texture, the number of sub-gray scale intervals is the same as the number of detail textures, and the number of mask textures corresponding to the sub-gray scale intervals is likewise 4. A division interval value for each sub-gray scale interval can then be obtained by dividing the gray scale interval (0-255) corresponding to a single texture channel by the number of mask textures, giving a division interval value of 64. Dividing the gray scale interval by this value yields a plurality of sub-gray scale intervals corresponding to the plurality of mask textures respectively: a first sub-gray scale interval 1-64, a second sub-gray scale interval 65-128, a third sub-gray scale interval 129-192 and a fourth sub-gray scale interval 193-255. The sub-gray scale interval 0-1 may represent the places where unnecessary detail texture is masked out, i.e., the transparency of pixels in this interval when merging with the detail texture is low.
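The division described above can be sketched as follows; `divide_gray_interval` is an illustrative helper, assuming 256 gray levels and reserving 0-1 for "no detail":

```python
def divide_gray_interval(num_masks, levels=256):
    # Division interval value: 256 / 4 = 64 in the example above.
    step = levels // num_masks
    bounds, lo = [], 1            # gray values 0-1 mask out all detail
    for i in range(num_masks):
        hi = min((i + 1) * step, levels - 1)
        bounds.append((lo, hi))
        lo = hi + 1
    return step, bounds

step, bounds = divide_gray_interval(4)
# step = 64; bounds = [(1, 64), (65, 128), (129, 192), (193, 255)]
```

The returned bounds match the four sub-intervals of fig. 6 exactly, with the last interval clipped at 255.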
Substep S22, establishing a corresponding relationship between the plurality of mask textures and the plurality of sub-gray scale intervals, respectively;
in an embodiment of the present invention, after the gray scale interval corresponding to a single texture channel is divided based on the number of the mask textures to obtain a plurality of sub-gray scale intervals, in order to respectively correspond the plurality of mask textures to the plurality of sub-gray scale intervals, a corresponding relationship between the plurality of mask textures and the plurality of sub-gray scale intervals may be further established.
In practical application, the proportional relation between the sub-gray scale interval corresponding to any one of the plurality of mask textures and the gray scale interval of the pixels contained in the mask textures can be determined; and then, based on the proportional relation, adjusting the gray scale of the pixels contained in the corresponding mask texture to obtain the adjusted mask texture.
As an example, assume that there are mask texture 1, mask texture 2, mask texture 3 and mask texture 4, and a first, second, third and fourth sub-gray scale interval. A corresponding relationship between mask texture 1 and the second sub-gray scale interval may be established. The gray scale interval of the pixels included in mask texture 1 may be 0-255, and the second sub-gray scale interval corresponding to mask texture 1 according to the correspondence relationship may be 65-128. The proportional relationship between the gray scale interval 0-255 and the second sub-gray scale interval 65-128 is then calculated as 4:1, and the gray values of the pixels of mask texture 1 are compressed to 1/4 of their original values based on this ratio, so that the pixels included in the compressed mask texture correspond to the second sub-gray scale interval.
In the sub-step S23, a second index of the corresponding mask texture is determined according to the sub-gray level interval corresponding to the pixel included in any adjusted mask texture.
In an embodiment of the present invention, each detail texture in the texture array has a first index corresponding to one another, so that the corresponding detail texture is extracted and used according to the first index, at this time, a plurality of mask textures may be indexed, that is, each mask texture may also have a second index corresponding to one another, so that the corresponding mask texture may be extracted and used through the second index.
Specifically, the second index of the corresponding mask texture may be determined according to the sub-gray level interval corresponding to the mask texture.
In a preferred embodiment, the second index of a mask texture is determined according to the sub-gray scale interval, and may be determined through the pixels included in the mask texture. The gray value of each pixel included in each mask texture may first be calculated, then the sub-gray scale interval among the plurality of sub-gray scale intervals in which that gray value falls may be determined, and the second index of each mask texture may be determined according to the sub-gray scale interval in which it falls.
In practical applications, in order to use the values in the plurality of mask textures to define the application range of the detail textures, the gray value of each pixel in the plurality of mask textures may be calculated by a formula, which may be ((gray value of the pixel - 0.01) × number of detail textures), with the result rounded down, i.e., the decimal part is discarded and only the integer part is kept. For example, if the calculated result is 1.8, rounding down gives 1; the rounded-down result then determines the closest sub-gray scale interval and thereby the second index.
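On normalized gray values (0.0-1.0, as in fig. 7) the formula can be sketched as follows; `mask_index` and the epsilon parameter name are illustrative:

```python
import math

def mask_index(gray01, num_details, epsilon=0.01):
    # (gray value - 0.01) * number of detail textures, rounded down;
    # the epsilon keeps a value sitting exactly on an upper boundary
    # (e.g. 0.25) inside the sub-interval below it.
    return math.floor((gray01 - epsilon) * num_details)

ids = [mask_index(g, 4) for g in (0.2, 0.25, 0.5, 0.75, 1.0)]
# -> [0, 0, 1, 2, 3]
```

A raw result of 1.8 floors to 1, matching the worked example in the text.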
As an example, a custom Shader can compute the pixel index value ID from the plurality of mask textures. If 4 detail textures are used, the entire gray scale interval covers 256 values (0-255, including 0); dividing it by 4 gives the interval value of each sub-gray scale interval: 256.0 / 4.0 = 64.0. The second index may then be calculated by rounding the gray value of the pixel to the nearest interval, with the following cases:
0.000-1.000 = none
1.000-64.00 = sub-gray scale interval ID 0
65.00-128.0 = sub-gray scale interval ID 1
129.0-192.0 = sub-gray scale interval ID 2
193.0-255.0 = sub-gray scale interval ID 3
Where the range of 0.0-1.0 is used to mask out the texture without detail.
After calculation by the formula, and referring to fig. 7, which shows a schematic diagram of determining the second index in an embodiment of the present invention: assuming the normalized gray value for ID 0 is 0.0-0.25, its corresponding sub-gray scale interval may be the first sub-gray scale interval 1-64, meaning that any pixel whose value falls in that sub-interval has ID 0; the gray value for ID 1 is 0.25-0.5, corresponding to the second sub-gray scale interval 65-128, meaning any pixel within it has ID 1; the gray value for ID 2 is 0.5-0.75, corresponding to the third sub-gray scale interval 129-192, meaning any pixel within it has ID 2; and the gray value for ID 3 is 0.75-1.0, corresponding to the fourth sub-gray scale interval 193-255, meaning any pixel within it has ID 3.
Step 503, establishing a mapping relationship between the first index and the second index, so as to respectively fuse the detail texture, the mask texture and the preset main texture according to the mapping relationship.
In practical application, when a detail texture, a mask texture and a preset main texture are fused, a corresponding detail texture may be extracted from a texture array through a first index, and an application range and an application strength of the detail texture need to be defined through the corresponding mask texture, at this time, a mapping relationship between the first index of each detail texture and a second index of each mask texture may be set, so that the mask texture may be determined through the mapping relationship (i.e., according to the second index corresponding to the first index) while the detail texture is acquired through the first index.
As an example, assume that the number of detail textures stored in the texture array is 4 and their first indexes are ID1-0, ID1-1, ID1-2 and ID1-3, while the second indexes of the mask textures corresponding to the plurality of sub-gray scale intervals are ID2-0, ID2-1, ID2-2 and ID2-3. A mapping relationship between the first indexes and the second indexes may then be established, for example mapping the first index ID1-0 to the second index ID2-2. Then, while the corresponding detail texture 1 is retrieved through ID1-0, the second index ID2-2 is determined through the mapping of ID1-0, and mask texture 2, which defines the application range and intensity of detail texture 1, is acquired through the second index ID2-2.
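The index mapping can be sketched as a plain lookup; all texture names below are illustrative placeholders, and only the 0 -> 2 pairing mirrors the ID1-0 -> ID2-2 example, the other pairs being arbitrary:

```python
# Detail textures keyed by first index, mask textures by second index.
detail_textures = {0: "detail_0", 1: "detail_1", 2: "detail_2", 3: "detail_3"}
mask_textures = {0: "mask_0", 1: "mask_1", 2: "mask_2", 3: "mask_3"}

# First-index -> second-index mapping.
index_mapping = {0: 2, 1: 0, 2: 3, 3: 1}

def fetch_pair(first_index):
    # Retrieve the detail texture and, via the mapping, its mask texture.
    second_index = index_mapping[first_index]
    return detail_textures[first_index], mask_textures[second_index]

pair = fetch_pair(0)  # -> ("detail_0", "mask_2")
```

In the Shader this lookup would be a small constant array or material property rather than a dictionary, but the dataflow is the same.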
In a preferred embodiment, referring to fig. 8, which shows a process diagram of a method for processing a detail texture in an embodiment of the present invention: in the process of fusing the main texture, the detail textures and the mask textures, for the main texture and a plurality of detail masks, the second index is calculated first, and the detail texture is sampled from the texture array using the first index corresponding to that second index; this may be represented by obtaining the corresponding detail texture 1 through ID1-0 and the corresponding mask texture 2 through ID2-2. Then, the sub-gray scale interval of mask texture 2 corresponding to the second index ID2-2 can be expanded back to the preset gray scale interval by remapping, and used as the blend mask in the final step.
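Per pixel, the final fusion step can be sketched as a linear blend whose weight is the mask value expanded out of its sub-interval; `fuse_pixel` is an illustrative single-channel sketch, not the patent's Shader code:

```python
def fuse_pixel(main, detail, mask_gray, sub_lo, sub_hi):
    # Expand the mask's sub-interval into a 0-1 blend weight, then
    # linearly blend the detail-modified value over the main texture.
    weight = (mask_gray - sub_lo) / (sub_hi - sub_lo)
    weight = min(1.0, max(0.0, weight))
    return main + weight * (detail - main)

# A mask value at the top of its sub-interval applies the detail fully;
# at the bottom of the sub-interval, the main texture is unchanged.
full = fuse_pixel(0.2, 0.8, 128, 65, 128)
none = fuse_pixel(0.2, 0.8, 65, 65, 128)
```

Gray values falling outside the sub-interval clamp to 0 or 1, so other masks' sub-intervals do not leak into this detail texture's blend.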
In an actual workflow, a separate layer is created for each ID, and the correct color and blending values are set using Photoshop layer styles. In this way, the mask is not lost when the layers are edited and modified afterwards. However, when copying the RGB values into the alpha channel, the values are converted from the RGB color space to the grayscale space. Photoshop's default conversion settings do not produce accurate gray values and also add noise in the process; to obtain accurate results, the Photoshop settings need to be adjusted. The detail texture includes a normal map and two textures that adjust albedo and smoothness. To pack this texture, the R/G channels of the original normal map need to be copied into the A/G channels of the detail texture (the A/G channels are slightly more accurate than R/B). The R/B channels are used to store textures that can brighten or darken the original albedo and smoothness textures. The average brightness must be 127: above 127 the result becomes brighter, below it darker. The R channel is used for the albedo change and the B channel for smoothness.
In the Shader, the normal maps can be blended using the UDN or whiteout blending algorithm. The detail albedo texture may be a color texture or a high-pass-processed texture with an average value of 0.5 (127), which allows the texture to be superimposed directly on the underlying original texture map.
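The blends can be sketched as follows; `udn_blend` follows the standard UDN formula (add the detail normal's x/y perturbation, keep the base z, renormalize), while `apply_detail_albedo` is only an assumed stand-in for the brighten/darken overlay, modelled as a signed offset around mid-gray:

```python
import math

def udn_blend(base, detail):
    # UDN blend of two tangent-space unit normals in [-1, 1] space.
    x = base[0] + detail[0]
    y = base[1] + detail[1]
    z = base[2]
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

def apply_detail_albedo(base, detail):
    # Detail centred on 0.5 (127/255): above 0.5 brightens the base
    # albedo, below 0.5 darkens it. The exact overlay operator used in
    # the patent's Shader is not specified; this offset is an assumption.
    return min(1.0, max(0.0, base + (detail - 0.5)))

flat = udn_blend((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))  # unperturbed: (0, 0, 1)
```

A flat detail normal leaves the base normal unchanged, and a mid-gray (0.5) detail value leaves the albedo unchanged, which is exactly the neutral behavior the packing convention above requires.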
In the embodiment of the invention, artists can better represent the details of the model surface by using a plurality of different detail textures, thereby greatly improving the visual effect and consistency of the 3D real-time model. The technique is suitable for any 3D model, whether a large surface object or a small character model; as long as there is a close-range shot, the method for processing detail textures in the embodiment of the present invention can be applied. Moreover, it occupies little memory and the Shader performance is excellent. It greatly improves the visual performance, enriches the artistic effect and provides more artistic functionality. In addition, the method for processing detail textures in the embodiment of the present invention can also be conveniently added to an existing Shader.
In the embodiment of the invention, a plurality of detail textures are stored in a texture array, a plurality of mask textures are respectively corresponding to a plurality of sub-gray scale intervals based on a single texture channel, a mapping relation between a first index of the detail texture and a second index of the mask texture is established so as to obtain a target detail texture from the plurality of detail textures, the target mask texture is determined from the plurality of mask textures according to the first index and the mapping relation of the target detail texture, and the target detail texture, the target mask texture and a preset main texture are fused. The method has the advantages that the surface details of the model are better represented by the aid of the different detail textures, visual effect and consistency of the 3D real-time model are greatly improved, the application area of the detail textures is defined by the aid of the mask textures in the single texture channel, the application strength of the detail textures is defined by the aid of a mixing range, the occupied memory of the single texture channel is small, and performance of the sampler is guaranteed.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 9, a block diagram of a detailed texture processing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
a target detail texture obtaining module 901, configured to obtain a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
a target mask texture determining module 902, configured to determine a target mask texture from a plurality of preset mask textures according to a first index of a target detail texture and a preset mapping relationship; the preset gray scale intervals of the plurality of mask textures respectively correspond to the plurality of sub-gray scale intervals of the single texture channel; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals; the preset mapping relation is the mapping relation between the second index and the first index; the second index of the target mask texture matches the first index of the target detail texture;
and a texture fusion module 903, configured to fuse the target detail texture, the target mask texture, and a preset main texture.
In an embodiment of the invention, the sub-gray scale interval is obtained by dividing the gray scale interval of the single texture channel based on the number of the mask textures.
In one embodiment of the invention, the texture fusion module 903 may include the following sub-modules:
a blending range determining submodule for determining a blending range according to the target mask texture;
and the first texture fusion submodule is used for fusing the target detail texture and the preset main texture by adopting the mixing range.
In one embodiment of the invention, the texture fusion module 903 may include the following sub-modules:
the texture intensity determining submodule is used for determining a mixing range and texture intensity according to the target mask texture;
and the second texture fusion sub-module is used for fusing the target detail texture and the preset main texture by adopting the mixing range and the texture intensity.
In one embodiment of the invention, the texture fusion module 903 may include the following sub-modules:
and the third texture fusion submodule is used for expanding the sub-gray scale interval corresponding to the target mask texture into a preset gray scale interval and fusing the target detail texture, the target mask texture with the preset gray scale interval and a preset main texture.
Referring to fig. 10, a block diagram of a detail texture processing apparatus according to another embodiment of the present invention is shown, which may specifically include the following modules:
a detail texture storage module 1001 configured to obtain a plurality of detail textures, and store the plurality of detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
a corresponding relation establishing module 1002, configured to obtain a plurality of mask textures, and establish corresponding relations between the plurality of mask textures and a plurality of sub-gray scale intervals based on a single texture channel, respectively; the plurality of mask textures have second indices determined based on corresponding sub-gray intervals;
a mapping relationship establishing module 1003, configured to establish a mapping relationship between the first index and the second index, so as to respectively fuse the detail texture, the mask texture, and a preset main texture according to the mapping relationship.
In an embodiment of the present invention, the correspondence relationship establishing module 1002 may include the following sub-modules:
a sub-gray interval division submodule for dividing the gray interval corresponding to the single texture channel into a plurality of sub-gray intervals based on the number of the mask textures;
and the corresponding relation establishing submodule is used for establishing corresponding relations between the plurality of mask textures and the plurality of sub-gray scale intervals respectively.
In an embodiment of the present invention, the sub-gray interval division sub-module may include the following units:
a division interval value determining unit, configured to obtain a division interval value for dividing the gray scale interval by dividing the gray scale interval corresponding to the single texture channel by the number of mask textures;
and the sub-gray interval dividing unit is used for dividing the gray interval by adopting the dividing interval value to obtain a plurality of sub-gray intervals.
In an embodiment of the present invention, the corresponding relationship establishing module 1002 may further include the following sub-modules:
the proportion relation determining submodule is used for determining the proportion relation between a sub-gray level interval corresponding to any mask texture in the plurality of mask textures and a gray level interval of pixels contained in the mask texture;
and the pixel gray level adjusting submodule is used for adjusting the gray level of the pixel contained in the corresponding mask texture based on the proportional relation to obtain the adjusted mask texture.
In an embodiment of the present invention, the corresponding relationship establishing module 1002 may further include the following sub-modules:
and the second index determining submodule is used for determining a second index of the corresponding mask texture according to the sub-gray level interval corresponding to the pixel contained in any adjusted mask texture.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, and when being executed by the processor, the computer program implements each process of the above-mentioned detailed texture processing method embodiment, and can achieve the same technical effect, and is not described herein again to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements each process of the above embodiment of the method for processing a detail texture, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method and the apparatus for processing a detail texture provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the above embodiments is intended only to aid understanding of the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, vary the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (14)

1. A method for processing detail texture, the method comprising:
acquiring a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
determining a target mask texture from a plurality of preset mask textures according to a first index of the target detail texture and a preset mapping relation; the preset gray scale intervals of the mask textures respectively correspond to sub-gray scale intervals of a single texture channel; the plurality of mask textures have second indexes determined based on the corresponding sub-gray scale intervals; the preset mapping relation is a mapping relation between the second indexes and the first indexes; the second index of the target mask texture matches the first index of the target detail texture;
and fusing the target detail texture, the target mask texture and a preset main texture.
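The method of claim 1 can be sketched in miniature as follows. This is purely an illustration: all names are hypothetical, and each texture is collapsed to a single sample value (a full implementation would operate per pixel, typically in a shader).

```python
def fuse_detail(main_rgb, detail_rgb, mask_value):
    """Blend one detail texture over the main texture using a
    mask weight in [0, 1] (simple linear interpolation)."""
    return tuple(m + (d - m) * mask_value for m, d in zip(main_rgb, detail_rgb))

def process_detail_texture(first_index, index_map, detail_array, mask_textures, main_rgb):
    """Claim-1 pipeline sketch: look up the detail texture by its first
    index, find the mask texture whose second index matches it via the
    preset mapping relation, then fuse all three textures."""
    second_index = index_map[first_index]       # preset mapping relation
    detail_rgb = detail_array[first_index]      # from the preset texture array
    mask_value = mask_textures[second_index]    # target mask texture sample
    return fuse_detail(main_rgb, detail_rgb, mask_value)
```

For instance, with a red detail texture, a mask weight of 0.5, and a black main texture, the fused sample is half-strength red.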
2. The processing method according to claim 1,
and the sub-gray scale intervals are obtained by dividing the gray scale interval of the single texture channel based on the number of the mask textures.
3. The processing method according to claim 1, wherein said fusing the target detail texture, the target mask texture and a preset main texture comprises:
determining a mixing range according to the target mask texture;
and fusing the target detail texture and the preset main texture by adopting the mixing range.
4. The processing method according to claim 1, wherein said fusing the target detail texture, the target mask texture and a preset main texture comprises:
determining a mixing range and texture intensity according to the target mask texture;
and fusing the target detail texture and a preset main texture by adopting the mixing range and the texture intensity.
5. The processing method according to claim 1, wherein said fusing the target detail texture, the target mask texture and a preset main texture comprises:
and expanding the sub-gray scale interval corresponding to the target mask texture into a preset gray scale interval, and fusing the target detail texture, the target mask texture with the preset gray scale interval and a preset main texture.
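The expansion step of claim 5 — stretching a mask's sub-gray scale interval back over the full preset interval before blending — might be sketched as follows. The function name and the 8-bit range [0, 255] are assumptions for illustration, not taken from the patent text.

```python
def expand_sub_interval(pixel, lo, hi):
    """Linearly remap a gray value from the sub-gray scale interval
    [lo, hi] to the full preset interval [0, 255], clamping values
    that fall outside the sub-interval."""
    if pixel <= lo:
        return 0
    if pixel >= hi:
        return 255
    return round((pixel - lo) * 255 / (hi - lo))
```

After expansion, the mask again uses the channel's full dynamic range, so it can serve directly as a blend weight when fusing the target detail texture with the main texture.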
6. A method for processing detail texture, the method comprising:
acquiring a plurality of detail textures, and storing the detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
obtaining a plurality of mask textures, and establishing correspondences between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel; the plurality of mask textures have second indexes determined based on the corresponding sub-gray scale intervals;
and establishing a mapping relation between the first index and the second index so as to be used for fusing the detail texture, the mask texture and a preset main texture respectively according to the mapping relation.
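The preparation stage of claim 6 can be illustrated roughly as below. All names are hypothetical; the particular pairing of first indexes to second indexes is an assumption for the sketch — the claim only requires that some mapping relation between the two index sets be established.

```python
def build_texture_mappings(num_details, num_masks, gray_max=255):
    """Claim-6 sketch: derive sub-gray scale intervals (whose order gives
    each mask texture its second index) and a mapping relation from the
    detail textures' first indexes to the masks' second indexes."""
    step = (gray_max + 1) / num_masks
    # second index = position of the mask's sub-gray scale interval
    sub_intervals = [(round(i * step), round((i + 1) * step) - 1)
                     for i in range(num_masks)]
    # illustrative mapping relation: pair indexes in order, wrapping
    mapping = {first: first % num_masks for first in range(num_details)}
    return sub_intervals, mapping
```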
7. The processing method according to claim 6, wherein said establishing correspondences between the mask textures and the sub-gray scale intervals based on a single texture channel comprises:
dividing the gray level interval corresponding to the single texture channel into a plurality of sub-gray level intervals based on the number of the mask textures;
and establishing corresponding relations between the plurality of mask textures and the plurality of sub-gray scale intervals respectively.
8. The processing method according to claim 7, wherein the dividing the gray scale interval corresponding to the single texture channel into a plurality of sub-gray scale intervals based on the number of mask textures comprises:
dividing the gray scale interval corresponding to the single texture channel by the number of the mask textures to obtain a division interval value;
and dividing the gray level interval by adopting the division interval value to obtain a plurality of sub gray level intervals.
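The division of claim 8 can be made concrete: for an 8-bit channel the gray scale interval [0, 255] holds 256 values, so the division interval value is 256 divided by the number of mask textures. The function below is an illustrative sketch with assumed names.

```python
def divide_gray_interval(num_masks, gray_max=255):
    """Split the single channel's gray scale interval [0, gray_max]
    into num_masks equal sub-gray scale intervals."""
    step = (gray_max + 1) / num_masks  # the division interval value
    return [(round(i * step), round((i + 1) * step) - 1)
            for i in range(num_masks)]
```

With four mask textures this yields the sub-intervals [0, 63], [64, 127], [128, 191], and [192, 255].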
9. The processing method of claim 7, further comprising:
determining a proportional relation between the sub-gray scale interval corresponding to any one of the plurality of mask textures and the gray scale interval of the pixels contained in that mask texture;
and adjusting the gray scale of the pixels contained in the corresponding mask texture based on the proportional relation to obtain the adjusted mask texture.
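The adjustment of claim 9 — compressing a mask's original pixel grays into its assigned sub-gray scale interval by the proportional relation — might look like the following sketch (names and the 8-bit source range are assumptions):

```python
def adjust_mask_pixels(pixels, sub_lo, sub_hi, src_lo=0, src_hi=255):
    """Scale a mask's pixel grays from their original interval
    [src_lo, src_hi] into its assigned sub-gray scale interval
    [sub_lo, sub_hi]."""
    scale = (sub_hi - sub_lo) / (src_hi - src_lo)  # the proportional relation
    return [round(sub_lo + (p - src_lo) * scale) for p in pixels]
```

For example, a full-range mask assigned the sub-interval [64, 127] maps black (0) to 64 and white (255) to 127, so each mask occupies a disjoint slice of the shared channel.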
10. The processing method of claim 9, further comprising:
and determining the second index of the corresponding mask texture according to the sub-gray scale interval corresponding to the pixels contained in any adjusted mask texture.
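Claim 10's recovery of the second index from an adjusted pixel is the inverse of the division step: find which sub-gray scale interval contains the pixel's gray value. A minimal sketch, again with assumed names and an 8-bit channel:

```python
def second_index_of(pixel, num_masks, gray_max=255):
    """Recover a mask texture's second index from an adjusted pixel
    gray: the position of the sub-gray scale interval containing it."""
    step = (gray_max + 1) / num_masks
    # clamp so gray_max itself falls in the last sub-interval
    return min(int(pixel // step), num_masks - 1)
```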
11. An apparatus for processing a detail texture, the apparatus comprising:
the target detail texture acquisition module is used for acquiring a target detail texture from a plurality of detail textures stored in a preset texture array; the detail textures respectively have corresponding first indexes;
the target mask texture determining module is used for determining a target mask texture from a plurality of preset mask textures according to the first index of the target detail texture and a preset mapping relation; the preset gray scale intervals of the mask textures respectively correspond to sub-gray scale intervals of a single texture channel; the plurality of mask textures have second indexes determined based on the corresponding sub-gray scale intervals; the preset mapping relation is a mapping relation between the second indexes and the first indexes; the second index of the target mask texture matches the first index of the target detail texture;
and the texture fusion module is used for fusing the target detail texture, the target mask texture and a preset main texture.
12. An apparatus for processing a detail texture, the apparatus comprising:
the detail texture storage module is used for acquiring a plurality of detail textures and storing the detail textures to a preset texture array; the detail textures respectively have corresponding first indexes;
the corresponding relation establishing module is used for acquiring a plurality of mask textures and establishing correspondences between the mask textures and a plurality of sub-gray scale intervals based on a single texture channel; the plurality of mask textures have second indexes determined based on the corresponding sub-gray scale intervals;
and the mapping relation establishing module is used for establishing the mapping relation between the first index and the second index so as to respectively fuse the detail texture, the mask texture and a preset main texture according to the mapping relation.
13. An electronic device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for processing a detail texture according to any one of claims 1 to 5 or claims 6 to 10.
14. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method for processing a detail texture according to any one of claims 1 to 5 or claims 6 to 10.
CN202010905793.7A 2020-09-01 2020-09-01 Detail texture processing method and device Active CN111951369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010905793.7A CN111951369B (en) 2020-09-01 2020-09-01 Detail texture processing method and device


Publications (2)

Publication Number Publication Date
CN111951369A true CN111951369A (en) 2020-11-17
CN111951369B CN111951369B (en) 2023-05-23

Family

ID=73367245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010905793.7A Active CN111951369B (en) 2020-09-01 2020-09-01 Detail texture processing method and device

Country Status (1)

Country Link
CN (1) CN111951369B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419233A (en) * 2021-12-31 2022-04-29 网易(杭州)网络有限公司 Model generation method and device, computer equipment and storage medium
WO2023066098A1 (en) * 2021-10-22 2023-04-27 华为技术有限公司 Rendering processing method and apparatus, and device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5613048A (en) * 1993-08-03 1997-03-18 Apple Computer, Inc. Three-dimensional image synthesis using view interpolation
EP1434171A2 (en) * 1995-08-04 2004-06-30 Microsoft Corporation Method and system for texture mapping a source image to a destination image
CN103927395A (en) * 2014-05-05 2014-07-16 曾志明 Data structure and rendering method for SSI three-dimensional geometry and material textures
US9007374B1 (en) * 2011-07-20 2015-04-14 Autodesk, Inc. Selection and thematic highlighting using terrain textures
CN111489426A (en) * 2020-04-09 2020-08-04 腾讯科技(深圳)有限公司 Expression generation method, device, equipment and storage medium



Also Published As

Publication number Publication date
CN111951369B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
US11967046B2 (en) Methods and apparatus for enhancing optical images and parametric databases
CN112316420B (en) Model rendering method, device, equipment and storage medium
CN109377546B (en) Virtual reality model rendering method and device
US7545384B2 (en) Image generation method and apparatus
US8411113B1 (en) Layered digital image data reordering and related digital image rendering engine
Schütz et al. Real-time continuous level of detail rendering of point clouds
GB2377870A (en) Generating confidence data when rendering surfaces of a 3D object
CN111508052A (en) Rendering method and device of three-dimensional grid body
US20030107572A1 (en) Method and apparatus for reducing the polygon count of a textured, three dimensional model of an object
US10089782B2 (en) Generating polygon vertices using surface relief information
CN111951369B (en) Detail texture processing method and device
Drago et al. Perceptual evaluation of tone mapping operators with regard to similarity and preference
US8942476B1 (en) Saturation varying and lighting independent color color control for computer graphics
CN110969688A (en) Real-time color homogenizing method for real-scene three-dimensional model
CN103514593B (en) Image processing method and device
JP2004533678A (en) System and method for determining spatial hierarchy for polygon data by using cubic root scaling
Jobst et al. Mechanisms on graphical core variables in the design of cartographic 3D city presentations
JP7387029B2 (en) Single-image 3D photography technology using soft layering and depth-aware inpainting
CN115761121A (en) Cloud and mist generation method and device and electronic equipment
CN114288671A (en) Method, device and equipment for making map and computer readable medium
CN115131480A (en) Method and device for manufacturing special effect of horse race lamp and electronic equipment
US9514566B2 (en) Image-generated system using beta distribution to provide accurate shadow mapping
Callieri et al. A realtime immersive application with realistic lighting: The Parthenon
Jaiswal Fundamental of Interactive Computer Graphics and Quality Assessment
US11928757B2 (en) Partially texturizing color images for color accessibility

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant