CN117671114A - Model rendering method and device, storage medium and electronic device - Google Patents


Info

Publication number
CN117671114A
Authority
CN
China
Prior art keywords
texture
sampling
result
map
offset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311366639.7A
Other languages
Chinese (zh)
Inventor
潘昕宇
方敬儒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202311366639.7A
Publication of CN117671114A
Legal status: Pending

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Image Generation (AREA)

Abstract

The application discloses a model rendering method and device, a storage medium, and an electronic device. The method comprises the following steps: sampling a noise map and a first material capture map based on a hashing algorithm to obtain first normal information, where the first material capture map is a normal texture map used to determine the surface reflection effect of a virtual model; calculating a target observation matrix based on the position matrix and the initial observation matrix of a virtual camera; performing texture sampling on a second material capture map using the first normal information and the target observation matrix to obtain a sampling result, where the second material capture map is used to determine the sphere reflection color of the virtual model; and rendering the virtual model with the sampling result to obtain a rendering result. This solves the technical problems that the material reflection effects obtained by existing material capture schemes exhibit high texture repeatability and a poor dynamic variation effect.

Description

Model rendering method and device, storage medium and electronic device
Technical Field
The present application relates to the field of computer technology, and in particular to a model rendering method and device, a storage medium, and an electronic device.
Background
In game scene production, parts of a scene surface need to exhibit a color reflection effect that changes dynamically as the scene camera rotates (e.g., an opal-like ground effect). In one related-art method, a material system in the rendering engine computes depth information through parallax occlusion mapping and samples it multiple times to reduce texture repeatability; however, this method carries a large performance overhead and is difficult to apply on mobile devices. In another method, the dynamic color reflection effect is simulated with a material capture (MatCap) scheme, but this still suffers from high texture repeatability and a poor dynamic variation effect.
In view of the above problems, no effective solution has been proposed at present.
It should be noted that the information disclosed in the foregoing background section is only intended to enhance understanding of the background of the present application, and may therefore include information that does not constitute prior art already known to a person of ordinary skill in the art.
Disclosure of Invention
At least some embodiments of the present application provide a model rendering method, a device, a storage medium, and an electronic device, so as to at least solve the technical problems of high texture repeatability and poor dynamic change effect of a material reflection effect obtained by an existing material capturing scheme.
According to one embodiment of the present application, there is provided a model rendering method, including: sampling the noise map and the first material capturing map based on a hashing algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map for determining the surface reflection effect of the virtual model; calculating a target observation matrix based on a position matrix and an initial observation matrix of a virtual camera, wherein the virtual camera is used for observing a virtual model in a virtual scene; texture sampling is carried out on the second material capturing mapping by utilizing the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing mapping is used for determining the sphere reflection color of the virtual model; and rendering the virtual model by adopting the sampling result to obtain a rendering result.
According to one embodiment of the present application, there is also provided a model rendering apparatus including: the first sampling module is used for sampling the noise mapping and the first material capturing mapping based on a hashing algorithm to obtain first normal information, wherein the first material capturing mapping is a normal texture mapping used for determining the surface reflection effect of the virtual model; the computing module is used for computing to obtain a target observation matrix based on the position matrix and the initial observation matrix of the virtual camera, wherein the virtual camera is used for observing the virtual model in the virtual scene; the second sampling module is used for performing texture sampling on the second material capturing mapping by utilizing the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing mapping is used for determining the sphere reflection color of the virtual model; and the rendering module is used for rendering the virtual model by adopting the sampling result to obtain a rendering result.
According to an embodiment of the present application, there is also provided a computer readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the model rendering method of any one of the above when run.
According to one embodiment of the present application, there is also provided an electronic device, comprising a memory in which a computer program is stored and a processor arranged to run the computer program to perform the model rendering method of any of the above.
In at least some embodiments of the present application, firstly, sampling a noise map and a first material capturing map based on a hashing algorithm to obtain first normal information, where the first material capturing map is a normal texture map for determining a surface reflection effect of a virtual model; further based on the position matrix and the initial observation matrix of the virtual camera, calculating to obtain a target observation matrix, wherein the virtual camera is used for observing a virtual model in a virtual scene; and performing texture sampling on the second material capturing mapping by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing mapping is used for determining the sphere reflection color of the virtual model, and rendering the virtual model by adopting the sampling result on the basis to obtain a rendering result. Therefore, the purpose of rendering the color reflection effect of the virtual model with lower texture repeatability and dynamic change along with the rotation and translation of the virtual camera is achieved, the technical effects of reducing the texture repeatability of the material reflection effect and improving the dynamic change effect of the color reflection are achieved, and the technical problems that the texture repeatability of the material reflection effect obtained by the existing material capturing scheme is high and the dynamic change effect is poor are solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a block diagram of a hardware architecture of a mobile terminal according to a model rendering method according to one embodiment of the present application;
FIG. 2 is a flow chart of a model rendering method according to one embodiment of the present application;
FIG. 3 is a schematic diagram of an alternative rendering result according to one embodiment of the present application;
FIG. 4 is a schematic illustration of an alternative first texture capture map according to one embodiment of the present application;
FIG. 5 is a schematic illustration of an alternative second texture capture map according to one embodiment of the present application;
FIG. 6 is a block diagram of a model rendering device according to one embodiment of the present application;
FIG. 7 is a block diagram of an alternative model rendering device according to one embodiment of the present application;
fig. 8 is a schematic diagram of an electronic device according to one embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the description of the present application, the term "for example" is used to mean "serving as an example, illustration, or description". Any embodiment described herein as "for example" is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known structures and processes have not been shown in detail to avoid obscuring the description of the present application with unnecessary detail. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
In describing embodiments of the present application, partial terms or terms that appear are used in the following explanation:
sader Shader: an image for rendering a virtual scene, instead of an editable program of a fixed rendering pipeline.
Parallax occlusion mapping (Parallax Occlusion Mapping, POM): a technique for giving a material the appearance of depth or parallax. The result of POM is similar to that achieved by displacement on a tessellated mesh, but it is designed to run in the pixel shader instead of the vertex shader.
Texture tiling (Repeat): a texture image is tiled repeatedly over a plane or object surface until the entire surface is covered.
Material capture (Material Capture, MatCap): light-source and material information are baked offline into a material-sphere map; at render time the map is used directly, and the MatCap texture information is mapped onto the model through a view-space (View) transformation.
Texture coordinates (Texture Coordinate, TexCoord): by default, each component of the two-dimensional TexCoord vector ranges from 0 to 1. In common application programming interfaces, coordinate (0, 0) corresponds to the upper-left corner of the texture and coordinate (1, 1) to the lower-right corner. When rendering an object surface, the coordinates of pixels on the object are computed from texture coordinates.
Screen coordinates (screen UV): two-dimensional coordinates based on the device screen (e.g., a phone or computer display). As with texture coordinates, common application programming interfaces call the horizontal component the U-axis or X-axis and the vertical component the V-axis or Y-axis.
Texture sampling (Texture Sample): the process of retrieving texture colors using texture coordinates. Lookup proceeds from the lower-left corner (0, 0) of the texture image to its upper-right corner (1, 1).
Matrix transformation (MVP Transform): the MVP matrix is the collective term for the Model, View, and Projection matrices. Together they transform object vertex coordinates from local space to clip space and finally to screen coordinates. The model matrix transforms vertex coordinates from the object's own local space (Local Space) to world space (World Space); the observation (view) matrix transforms world space to view space (View Space); the projection matrix transforms view space to clip space (Clip Space).
Linear interpolation (Lerp): interpolation is computed according to the formula X = A + (B - A) × alpha, where X is the interpolation result, A the original value, B the target value, and alpha the transition intensity between A and B.
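As a concrete illustration (not taken from the patent itself), the formula corresponds directly to the lerp intrinsic of common shading languages:

float3 lerp_manual(float3 A, float3 B, float alpha)
{
    return A + (B - A) * alpha; // identical to the intrinsic lerp(A, B, alpha)
}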
In one possible implementation of the present application, the inventors studied and practiced the material-system and material-capture schemes commonly used in rendering engines for producing dynamically changing color reflection effects, and found that they suffer from high texture repeatability and a poor dynamic variation effect.
On this basis, an embodiment of the present application provides a model rendering method whose technical conception is as follows: a hashing algorithm is used to sample a noise map and a first material capture map to obtain first normal information; a target observation matrix is then computed based on the position matrix and the initial observation matrix of the virtual camera; texture sampling is performed on the second material capture map using the first normal information and the target observation matrix to obtain a sampling result; and the virtual model is rendered with the sampling result to obtain a rendering result. This reduces the texture repeatability of the material reflection effect and improves the dynamic variation of the color reflection, thereby solving the technical problems of high texture repeatability and poor dynamic variation in existing material capture schemes.
The scenarios to which the embodiments of the present application apply may include, but are not limited to: game development scenes (e.g., creating game characters, production scenes, and special effects), movie production scenes (e.g., creating special effects, animations, and virtual backgrounds), architectural design and visualization scenes, virtual reality and augmented reality scenes, computer-aided design scenes (e.g., creating and displaying product models, engineering models, and prototypes), entertainment and experience scenes (e.g., creating immersive virtual experiences in entertainment and experience venues such as theme parks, museums, and exhibitions), educational and training scenes (e.g., creating interactive virtual learning environments and simulating experimental scenes), medical and medical scenes (e.g., creating mannequins, surgical simulations, medical image processing, etc.). In particular, the game types targeted by the game development scenario may be action types, adventure types, simulation types, role playing types, leisure types, and the like.
The above-described method embodiments referred to in the present application may be performed in a terminal device (e.g., a mobile terminal, a computer terminal, or similar computing device). Taking the mobile terminal as an example, the mobile terminal can be a terminal device such as a smart phone, a tablet computer, a palm computer, a mobile internet device, a game machine and the like.
Fig. 1 is a block diagram of a hardware structure of a mobile terminal according to a model rendering method according to one embodiment of the present application. As shown in fig. 1, a mobile terminal may include one or more (only one shown in fig. 1) processors 102, memory 104, transmission devices 106, input output devices 108, and display devices 110. Taking the example that the model rendering method is applied to the electronic game scene through the mobile terminal, the processor 102 invokes and runs the computer program stored in the memory 104 to execute the model rendering method, and the generated rendering result of the virtual model in the electronic game scene is transmitted to the input/output device 108 and/or the display device 110 through the transmission device 106, so as to provide the rendering result of the virtual model to the player.
As also shown in fig. 1, the processor 102 may include, but is not limited to: a central processor (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a digital signal processing (Digital Signal Processing, DSP) chip, a microprocessor (Microcontroller Unit, MCU), a programmable logic device (Field Programmable Gate Array, FPGA), a Neural network processor (Neural-Network Processing Unit, NPU), a tensor processor (Tensor Processing Unit, TPU), an artificial intelligence (Artificial Intelligence, AI) type processor, and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
In some optional embodiments based on game scenes, the terminal device may further provide a human-computer interaction interface with a touch-sensitive surface that senses finger contacts and/or gestures for interacting with a graphical user interface (Graphical User Interface, GUI). The human-computer interaction functions may include, for example, creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, sending and receiving e-mail, call interfaces, playing digital video, playing digital music, and/or web browsing. Executable instructions for performing these functions are configured/stored in a computer program product or readable storage medium executable by one or more processors.
The above method embodiments related to the present application may also be executed in a server. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content distribution network (Content Delivery Network, CDN), basic cloud computing services such as big data and an artificial intelligent platform. Taking the example of the model rendering method being applied to the electronic game scene by the electronic game server, the electronic game server may generate a rendering result of the virtual model in the electronic game scene based on the model rendering method and provide the rendering result of the virtual model to the player (for example, may be rendered for display on a display screen of the player terminal, or provided to the player by holographic projection, etc.).
According to one embodiment of the present application, an embodiment of a model rendering method is provided, it being noted that the steps shown in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
In this embodiment, a model rendering method running on the mobile terminal is provided, and fig. 2 is a flowchart of a model rendering method according to one embodiment of the present application, as shown in fig. 2, and the method includes the following steps:
in step S21, sampling is performed on the noise map and the first material capture map based on the hashing algorithm to obtain first normal information, where the first material capture map is a normal texture map for determining a surface reflection effect of the virtual model.
The hash algorithm above transforms an input of arbitrary length into an output of fixed length; the present application does not limit its specific formula. In one application scenario, it may be any commonly used hash algorithm. The noise map is a preset map used to generate random normal textures, and the first material capture map is a pre-made normal texture map used to determine the surface reflection effect of the virtual model. Based on the hashing algorithm, an index can be randomly mapped to a texture-coordinate location on these textures (the noise map and the first material capture map) during texture tiling. On this basis, the first normal information characterizes a randomly sampled normal texture.
Further, tiled-texture repeatability is reduced by using different hash values for texture sampling. Thus, through step S21, the method provided by the present application can reduce texture repeatability during texture tiling and improve the visual quality of the virtual scene.
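The patent does not disclose a concrete hash formula; as an illustration only, a common shader-style hash that maps a coordinate to a pseudo-random two-dimensional value looks like this (the constants are conventional in shader code, not from the patent):

float2 hash22(float2 p)
{
    // Classic sine-based hash widely used in shaders for cheap pseudo-randomness.
    float2 q = float2(dot(p, float2(127.1, 311.7)),
                      dot(p, float2(269.5, 183.3)));
    return frac(sin(q) * 43758.5453);
}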
Step S22, calculating a target observation matrix based on the position matrix and the initial observation matrix of the virtual camera, wherein the virtual camera is used for observing the virtual model in the virtual scene.
The related art lacks a technical scheme that achieves a color reflection effect changing dynamically with both the rotation and the translation of the virtual camera; in particular, the MatCap scheme has difficulty producing a color reflection effect that changes as the virtual camera translates. Therefore, in step S22, a new target observation matrix is reconstructed based on the position matrix and the initial observation matrix of the virtual camera. This target observation matrix can represent both the rotational and the translational changes of the virtual camera, so that sampling and rendering performed with it become associated with those changes, realizing a color reflection effect that changes dynamically with the rotation and translation of the virtual camera.
And S23, performing texture sampling on the second material capturing map by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model.
The second material capturing map is a pre-manufactured map, and is used for determining the color of the virtual model so as to simulate the sphere reflection effect. The first normal information can represent random texture tiling effect in the virtual scene, the target observation matrix can represent dynamic effect of rotation transformation and translation transformation of color reflection effect along with the virtual camera in the virtual scene, and on the basis, texture sampling is carried out on the second material capturing mapping, so that texture color sampling results with random texture tiling effect and dynamic effect can be obtained.
And step S24, rendering the virtual model by adopting the sampling result to obtain a rendering result.
And rendering the virtual model by adopting the sampling result, wherein the surface of the virtual model in the obtained rendering result presents texture colors with random texture tiling effect and dynamic effect. The rendering result may be an image or video or the like on which the virtual model is displayed. The dynamic effect in the texture color is expressed as that the texture color can be changed along with the rotation and translation change of the virtual camera, and the color change, the brightness change, the reflection effect change and the like are presented.
The virtual model may be a model of the ground, terrain, objects, characters, etc. in the game scene. The game types corresponding to the game scene may be: action classes (e.g., first or third person shooter games, two-or three-dimensional combat games, war action games, sports action games, etc.), adventure classes (e.g., adventure games, collection games, puzzle games, etc.), simulation classes (e.g., simulated sand table games, simulated foster games, strategy simulation games, city building simulation games, business simulation games, etc.), role playing classes and leisure classes (e.g., chess and card game games, recreation game games, music rhythm games, trade foster games, etc.), etc.
In the embodiment of the application, firstly, sampling processing is performed on a noise map and a first material capturing map based on a hash algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map for determining a surface reflection effect of a virtual model; further based on the position matrix and the initial observation matrix of the virtual camera, calculating to obtain a target observation matrix, wherein the virtual camera is used for observing a virtual model in a virtual scene; and performing texture sampling on the second material capturing mapping by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing mapping is used for determining the sphere reflection color of the virtual model, and rendering the virtual model by adopting the sampling result on the basis to obtain a rendering result. Therefore, the purpose of rendering the color reflection effect of the virtual model with lower texture repeatability and dynamic change along with the rotation and translation of the virtual camera is achieved, the technical effects of reducing the texture repeatability of the material reflection effect and improving the dynamic change effect of the color reflection are achieved, and the technical problems that the texture repeatability of the material reflection effect obtained by the existing material capturing scheme is high and the dynamic change effect is poor are solved.
The above-described methods of embodiments of the present application are further described below.
Optionally, in an application scenario, the preset color reflection effect is an opal simulation effect.
In an application scenario, the method provided by the embodiments of the present application is used to render a simulated-opal color reflection effect for the ground of a virtual scene; the rendering result is shown in fig. 3. The method thus yields a rendering result with low repeatability, and the color and reflection effect of the ground change dynamically with the rotation and translation of the virtual camera.
In the application scenario, the MatCap-based processing procedure comprises two stages: the first stage reduces the repeatability of the normal reflection map; the second stage addresses the lack of a dynamic effect in the normal reflection as the virtual camera translates.
Optionally, in step S21, the noise map and the first texture capture map are sampled based on a hashing algorithm to obtain first normal information, which may include the following steps:
step S211, determining a texture drawing index and a first interpolation coefficient by using the noise map;
step S212, performing texture sampling and scaling processing on a preset normal map to obtain a first texture coordinate;
Step S213, carrying out random offset processing on the first texture coordinates by using the texture drawing index to obtain second texture coordinates;
step S214, based on a hashing algorithm and the second texture coordinates, obtaining color information corresponding to the first material capturing map;
step S215, interpolation smoothing processing is carried out on the color information based on the first interpolation coefficient, and a target color vector is obtained, wherein the target color vector is used for representing the first normal information.
In one application scenario, in the first stage of the MatCap processing procedure, a pre-made noise map (denoted TexNorNoise) is sampled and the resulting value is used as the texture drawing index (index). Specifically, the noise map is scaled by multiplying its UV coordinates by the scaling factor corresponding to TexNorNoise (denoted TexNorNoiseScale). Texture sampling the updated noise map yields a vector in a first format, one component of which (e.g., the x component) is taken as the texture drawing index (index).
It should be noted that the first format may be a four-dimensional floating-point data type composed of four floating-point numbers, each a single-precision value of 4 bytes (32 bits) comprising a sign bit, an 8-bit binary exponent, and a 23-bit mantissa. This four-dimensional floating-point data type covers a range of approximately -3.4E+38 to 3.4E+38.
Further, a pre-made normal map is sampled to obtain UV coordinates, and the UV coordinates are multiplied by the scaling factor corresponding to the normal map (denoted InNormalScale) to obtain the first texture coordinate (denoted normal_uv1).
The scaling factor TexNorNoiseScale may be determined from the noise map or preset by a technician; likewise, the scaling factor InNormalScale corresponding to the pre-made normal map may be determined from the normal map or preset by a technician.
It should be noted that, when the first texture coordinate (normal_uv1) is used, its partial derivatives in the x and y directions (i.e., dx and dy) may be computed with the partial-derivative functions ddx and ddy to characterize the texture-coordinate offset.
Further, the first texture coordinate (normal_uv1) is randomly offset based on the texture drawing index (index) to obtain the offset-updated second texture coordinate (normal_uv2). Based on the hash algorithm and the second texture coordinate (normal_uv2), hash filling is performed with the first material capture map (denoted Tex1), and the color information corresponding to the first material capture map is obtained; this color information includes the sharpness levels, color values, and so on of a plurality of pixels. On this basis, the target color vector is obtained by interpolation smoothing of the color information with the first interpolation coefficient f1 determined from the noise map, and the first normal information is thereby determined.
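The entry into this flow can be sketched in shader-style pseudocode as follows; the sampler setup and the incoming uv are assumptions, and only the names TexNorNoise, TexNorNoiseScale, InNormalScale, index, and normal_uv1 come from the description:

// Hedged sketch of steps S211-S212.
float4 noise = tex2D(TexNorNoise, uv * TexNorNoiseScale); // sample the scaled noise map
float  index = noise.x;                                   // texture drawing index
float2 normal_uv1 = uv * InNormalScale;                   // first texture coordinate
float2 dx = ddx(normal_uv1);                              // partial derivatives, used later
float2 dy = ddy(normal_uv1);                              // for mip-level selection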
Optionally, in step S213, performing a random offset process on the first texture coordinate by using the texture drawing index to obtain the second texture coordinate may include the following steps:
step S2131, determining a sampling offset value based on the texture rendering index;
step S2132, generating a target offset by using the rounding result corresponding to the sampling offset value;
in step S2133, the first texture coordinates are offset based on the target offset to obtain second texture coordinates.
In the above application scenario, in the first stage of the MatCap-based processing procedure, the three RGB channels of the noise map are considered, giving 2^3 = 8 indexes, expressed as: 0000 = 0; 0001 = 1; 0010 = 2; …; 0111 = 7. The texture drawing index can therefore be regarded as ranging from 0 to 7. The texture drawing index (index) is multiplied by 8 to obtain the sampling offset value, a floating-point number denoted L, which represents the offset corresponding to a channel of the noise map. The fractional part of the sampling offset value L is obtained with the frac function and assigned to the first interpolation coefficient f1.
Further, the sampling offset value L is rounded using rounding functions to obtain the target offsets, and the first texture coordinate (normal_uv1) is offset based on the target offsets to obtain the offset-updated second texture coordinate (normal_uv2).
Optionally, the target offset includes a first offset and a second offset, and in step S2132, generating the target offset using the rounding result corresponding to the sampling offset value may include the following steps:
step S21321, rounding the sampling offset value to obtain a first rounding result and a second rounding result, wherein the rounding directions of the first rounding result and the second rounding result are opposite;
in step S21322, trigonometric function multiplication is performed on the first rounding result and the first random number to obtain a first offset, and trigonometric function multiplication is performed on the second rounding result and the second random number to obtain a second offset.
The first random number is generated by a preset random generation algorithm and a first preset threshold value, and the second random number is generated by a preset random generation algorithm and a second preset threshold value.
The above random generation algorithm may be any algorithm commonly used in computer graphics (Computer Graphics, CG) image processing, such as: pseudo-random number generators (e.g., uniformly distributed or Gaussian-distributed random number generation algorithms); noise generation algorithms (e.g., Perlin noise, Simplex noise, or Cellular noise); or random sampling algorithms (e.g., Monte Carlo sampling, random dithering, or random perturbation). In the application scenario, in the first stage of the MatCap-based processing procedure, the floor function is used to round the sampling offset value L down, giving the first rounding result ia, and the ceil function is used to round L up, giving the second rounding result ib.
Further, based on the first and second preset thresholds (in this example, thresholds in the range 3.0 to 7.0 are used), the floating-point first random number float2(3.0, 7.0) and second random number float2(3.0, 7.0) are generated with any of the random generation algorithms above. On this basis, the first offset offa is computed with a sine function according to formula (1) below, and the second offset offb according to formula (2).
offa = sin(float2(3.0, 7.0) × ia)    Formula (1)
offb = sin(float2(3.0, 7.0) × ib)    Formula (2)
By the method steps above, random offsets are generated and used as the coordinate offsets for subsequent texture sampling, which reduces texture repeatability during texture tiling and enhances the visual effect of the virtual scene.
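Putting steps S2131 through S2133 together, a minimal sketch follows; only the factor 8, the frac/floor/ceil operations, and formulas (1) and (2) are given by the description, and adding the offset directly to normal_uv1 is an assumption:

float  L    = index * 8.0;                 // sampling offset value
float  f1   = frac(L);                     // first interpolation coefficient
float  ia   = floor(L);                    // first rounding result (rounded down)
float  ib   = ceil(L);                     // second rounding result (rounded up)
float2 offa = sin(float2(3.0, 7.0) * ia);  // first offset, formula (1)
float2 offb = sin(float2(3.0, 7.0) * ib);  // second offset, formula (2)
float2 uv2a = normal_uv1 + offa;           // offset-updated second texture coordinates
float2 uv2b = normal_uv1 + offb;           // (normal_uv2); additive offset is assumed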
Optionally, in step S214, based on the hashing algorithm and the second texture coordinates, obtaining color information corresponding to the first texture capture map may include the following steps:
step S2141, calculating a second texture coordinate, a target offset corresponding to the second texture coordinate and a preset tiling repetition parameter according to a hashing algorithm to obtain a sampling position;
Step S2142, determining a definition level of the first material capture map according to the sampling position;
in step S2143, color information corresponding to the first material capture map is obtained at the sharpness level.
In the above application scenario, the first material capture map (Tex1) may be as shown in fig. 4; it is a custom normal texture map used to produce the surface reflection effect of the opal. When the first material capture map (Tex1) is sampled, the R and G channels of the texture serve as the texture coordinates of the MatCap texture and affect the reflective part of the opal surface reflection effect.
From the first material capture map (Tex1), the mipmap level of the texture at the target position is calculated with the get_miplevel function. The target position is the second texture coordinate (normal_uv2) plus a preset tiling repetition parameter (denoted DistortScale) multiplied by the first offset offa or the second offset offb. The get_miplevel function computes the mipmap level (mipLevel) of the texture from the partial derivatives of the texture coordinates (i.e., dx and dy); the mipmap levels of a texture are the images of different resolutions produced as the texture is scaled down. During rendering, a suitable mipmap level is selected for sampling according to the size and distance of an object on screen, improving both rendering efficiency and image quality: distant objects can use lower-resolution (higher-level) mipmaps, while nearby objects need higher-resolution (lower-level) mipmaps to preserve detail.
Specifically, determining the sharpness level of the first material capture map from the sampling position may proceed as follows: first, compute d, the squared length of the partial derivative of the texture coordinate at the sampling position (i.e., the target position) in the first material capture map, and convert d into a floating-point mipmap level with the log2 function; then round that floating-point number down with the floor function and assign the result to mipLevel, yielding the sharpness level.
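A sketch of this get_miplevel computation follows; the 0.5 factor (i.e., taking log2 of the length rather than the squared length) and the max over the two derivatives follow the usual level-of-detail convention and are assumptions beyond the text:

float get_miplevel(float2 dx, float2 dy)
{
    float d = max(dot(dx, dx), dot(dy, dy)); // squared length of the larger derivative
    return floor(0.5 * log2(d));             // 0.5 * log2(d) == log2(sqrt(d))
}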
In the above application scenario, at the sharpness level corresponding to each sampling position based on the first offset offa, the color value of the first material capture map (Tex1) at that sampling position is obtained and assigned to the cola variable. Correspondingly, at the sharpness level corresponding to each sampling position based on the second offset offb, the color value of the first material capture map (Tex1) at that position is obtained and assigned to the colb variable.
Optionally, the color information includes a first color value and a second color value, wherein the first color value is determined by a first offset and the second color value is determined by a second offset; in step S215, interpolation smoothing is performed on the color information based on the first interpolation coefficient to obtain a target color vector, which may include the following steps:
Step S2151, the difference between the first color value and the second color value is summed up to obtain a first calculation result;
step S2152, performing multiplication mixed calculation on the first interpolation coefficient, the first calculation result and the preset scaling factor to obtain a second calculation result;
step S2153, based on the second calculation result, performing smooth interpolation processing to obtain a second interpolation coefficient;
in step S2154, the first color value and the second color value are linearly interpolated by using the second interpolation coefficient to obtain the target color vector.
In the application scenario, in the first stage of the MatCap processing procedure, the color value in the cola variable is the first color value and the color value in the colb variable is the second color value. The sum of the differences between the first and second color values over the sampling positions is calculated with the sum function, giving the first calculation result sum(cola - colb); the second calculation result R is then computed according to formula (3):

R = f1 - 0.1 × sum(cola - colb)    Formula (3)
In formula (3), 0.1 is a preset scaling factor, typically an empirical value chosen by a technician. The sum function outputs the sum of the three components of a three-dimensional vector, for example: sum(v) = v.x + v.y + v.z.
Further, the second calculation result R is smoothly interpolated between 0.2 and 0.8 with a smooth step function (smoothstep) to obtain the second interpolation coefficient f2, as shown in formula (4):

f2 = smoothstep(0.2, 0.8, f1 - 0.1 × sum(cola - colb))    Formula (4)
Further, cola and colb are interpolated with a linear interpolation function (lerp) to obtain the target color vector (denoted normaltex).
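Steps S2151 through S2154 can be condensed into the following sketch; sum3 is a hypothetical helper implementing the sum definition above, and cola and colb are the two hash-sampled color values:

float sum3(float3 v) { return v.x + v.y + v.z; }  // per the sum definition above

float  R  = f1 - 0.1 * sum3(cola.rgb - colb.rgb); // formula (3); 0.1 is the preset scale factor
float  f2 = smoothstep(0.2, 0.8, R);              // formula (4)
float3 normaltex = lerp(cola.rgb, colb.rgb, f2);  // target color vector (first normal info)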
Thus, the method steps provided by the present application solve the problem of high texture-tiling repeatability in the MatCap scheme at a low performance cost; the generated target color vector exhibits a highly random texture-tiling effect, ensuring that the rendered virtual scene has better visual quality.
Optionally, in step S22, the calculating a target observation matrix based on the position matrix and the initial observation matrix of the virtual camera may include the following steps:
step S221, calculating to obtain a position matrix according to the position coordinates of the virtual camera in the virtual scene and a periodic factor, wherein the periodic factor is used for determining the conversion relation of translation and rotation of the virtual camera in the virtual scene;
Step S222, performing multiplication calculation on the position matrix and the initial observation matrix to obtain a target observation matrix.
In the above application scenario, in the second stage of the MatCap processing procedure, three components X, Y, and Z are computed from the position coordinates (CameraPosition) of the virtual camera and the periodicity factor (InNormalPeriod), and are used to construct a 3×3 matrix (CameraPosMatrix). For example, the three components may be computed as follows:
FLOAT X=abs(sin(InNormalPeriod*CameraPosition.x));
FLOAT Y=abs(sin(InNormalPeriod*CameraPosition.y));
FLOAT Z=abs(sin(InNormalPeriod*CameraPosition.z));
the implementation process of constructing a 3×3 matrix (CameraPosMatrix) is as follows:
FLOAT3 p1=FLOAT3(X,Y,Z);
FLOAT3 p2=FLOAT3(Y,Z,X);
FLOAT3 p3=FLOAT3(Z,X,Y);
FLOAT3x3 CameraPosMatrix = FLOAT3x3(p1, p2, p3);
It should be noted that the above-mentioned periodicity factor is a uniform public variable used to represent the rotation of the virtual camera in the virtual scene, that is, how much rotation one unit of translation of the virtual camera is converted into.
Further, the pre-computed initial observation matrix of the virtual camera (ViewMatrix, i.e., the V matrix) is multiplied by the above 3×3 matrix (CameraPosMatrix) to obtain the target observation matrix (ViewNew). Thereby, the translational change of the virtual camera in the virtual scene is converted into a rotational change.
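Continuing the code fragment above, the reconstruction might read as follows; taking only the rotation part of ViewMatrix and the mul order are assumptions, since the description states only that the two matrices are multiplied:

FLOAT3x3 ViewRot = (FLOAT3x3)ViewMatrix;          // rotation part of the initial observation matrix
FLOAT3x3 ViewNew = mul(CameraPosMatrix, ViewRot); // target observation matrix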
Optionally, in step S23, texture sampling is performed on the second material capture map using the first normal information and the target observation matrix to obtain a sampling result, which may include the following steps:
Step S231, performing space transformation and normalization processing on the first normal line information by using the target observation matrix to obtain second normal line information, wherein the second normal line information corresponds to the view space;
step S232, performing space transformation and normalization processing on model normal information of the virtual model by using the initial observation matrix to obtain third normal information, wherein the third normal information corresponds to the view space;
step S233, linear interpolation processing is carried out on the second normal line information and the third normal line information by adopting preset normal line reflection intensity parameters, and an interpolation result is obtained;
step S234, texture sampling is performed on the second material capture map based on the interpolation result, so as to obtain a sampling result.
In the application scenario, in the second stage of the MatCap-based processing procedure, the target color vector (normaltex) output by the first stage is first transformed into world space (including the M-matrix transformation and normalization) according to the pre-computed world inverse-transpose matrix (Input.WorldInv_Ins, i.e., the M matrix).
Further, the target observation matrix is used to transform and normalize this world-space result, yielding the second normal information in view space (denoted normaltex_view). The model normal information of the virtual model (denoted Input.Normal) is transformed and normalized with the initial observation matrix (ViewMatrix) to obtain the third normal information in view space (denoted normal_view).
In the application scenario, the second normal information (normaltex_view) and the third normal information (normal_view) are linearly interpolated according to the custom parameter InNormalIntensity (which controls the intensity of the reflection normal) to obtain the interpolation result (denoted here nrm_view to distinguish it from the inputs). That is, the vertex normal information of the virtual model is blended with the second normal information in the newly computed view space to obtain the normal information finally used for sampling.
Further, texture sampling is performed on the second material capture map based on the interpolation result to obtain the sampling result. In the above application scenario, the second material capture map (Tex2) is as shown in fig. 5: a sphere texture coordinate (Sphere_TexCoord) is computed from the interpolation result, and the second material capture map (Tex2) is sampled with it as the texture coordinate to obtain the sphere reflection color (Sphere_Reflection), i.e., the sampling result. The ground of the virtual scene rendered from this sampling result is shown in fig. 3.
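A hedged end-to-end sketch of steps S231 through S234 follows; the placeholder names Input_Normal and WorldInvTranspose, the half-offset matcap mapping, and the sampler call are assumptions, as the description gives only the step order and variable names:

float3 n_world        = normalize(mul((float3x3)WorldInvTranspose, normaltex));   // world-space step
float3 normaltex_view = normalize(mul(ViewNew, n_world));                         // second normal information
float3 normal_view    = normalize(mul((float3x3)ViewMatrix, Input_Normal));       // third normal information
float3 nrm_view       = lerp(normal_view, normaltex_view, InNormalIntensity);     // interpolation result
float2 Sphere_TexCoord   = nrm_view.xy * 0.5 + 0.5;       // conventional matcap mapping (assumption)
float4 Sphere_Reflection = tex2D(Tex2, Sphere_TexCoord);  // sphere reflection color (sampling result)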
It is easy to note that, in the second stage of the MatCap-based processing procedure described above, reconstructing the View matrix and converting the translational transformation of the virtual camera into a rotational transformation solves the prior-art problem that the normal reflection effect lacks a dynamic response to camera translation.
Optionally, the model rendering method may further include at least one of the following steps:
step S25, performing upper limit adjustment on the interpolation result, and updating the interpolation result;
and S26, mapping components of the interpolation result to the target value interval respectively, and updating the interpolation result.
In the second stage of the MatCap-based processing procedure, after the interpolation result (nrm_view) is obtained, its value may be clamped to an upper limit so that it does not exceed a preset value, preventing the opal reflection of some pixels from appearing too bright.
In the second stage of the MatCap-based processing procedure, after the sphere texture coordinate (Sphere_TexCoord) is computed from the interpolation result, its components may be mapped into target value intervals before the second material capture map (Tex2) is sampled. For example, the x component of nrm_view is mapped into the [0, 1] range and the y component into the [-1, 0] range.
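For instance, the component mapping described above could be written as follows (a sketch only; it assumes the components start in the [-1, 1] range):

Sphere_TexCoord.x = nrm_view.x * 0.5 + 0.5;  // map x from [-1, 1] into [0, 1]
Sphere_TexCoord.y = nrm_view.y * 0.5 - 0.5;  // map y from [-1, 1] into [-1, 0]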
In summary, the method provided in the present application automatically changes the texture coordinates of the MatCap as the virtual camera translates; visually, the resulting color reflection effect changes dynamically with the translation and rotation of the virtual camera (e.g., the glittering effect of a brilliant crystal).
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. a magnetic disc, an optical disc), including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method described in the embodiments of the present application.
The embodiment also provides a model rendering device, which is used for implementing the above embodiment and the preferred implementation manner, and is not described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 6 is a block diagram of a model rendering apparatus according to one embodiment of the present application. As shown in fig. 6, the apparatus includes: the first sampling module 601, configured to sample the noise map and the first material capture map based on a hashing algorithm to obtain first normal information, where the first material capture map is a normal texture map used to determine the surface reflection effect of the virtual model; the calculation module 602, configured to calculate a target observation matrix based on the position matrix and the initial observation matrix of the virtual camera, where the virtual camera is used to observe the virtual model in the virtual scene; the second sampling module 603, configured to perform texture sampling on the second material capture map using the first normal information and the target observation matrix to obtain a sampling result, where the second material capture map is used to determine the sphere reflection color of the virtual model; and the rendering module 604, configured to render the virtual model with the sampling result to obtain a rendering result.
Optionally, the first sampling module 601 is further configured to: determining a texture drawing index and a first interpolation coefficient by using the noise map; performing texture sampling and scaling on a preset normal map to obtain a first texture coordinate; carrying out random offset processing on the first texture coordinate by using the texture drawing index to obtain a second texture coordinate; acquiring color information corresponding to the first material capturing map based on a hash algorithm and the second texture coordinates; and carrying out interpolation smoothing processing on the color information based on the first interpolation coefficient to obtain a target color vector, wherein the target color vector is used for representing the first normal information.
Optionally, the first sampling module 601 is further configured to: determining a sampling offset value based on the texture rendering index; generating a target offset by using a rounding result corresponding to the sampling offset value; and performing offset processing on the first texture coordinate based on the target offset to obtain a second texture coordinate.
Optionally, the target offset includes a first offset and a second offset, and the first sampling module 601 is further configured to: rounding the sampling offset value to obtain a first rounding result and a second rounding result, wherein the rounding directions of the first rounding result and the second rounding result are opposite; and performing trigonometric function multiplication on the first rounding result and the first random number to obtain a first offset, and performing trigonometric function multiplication on the second rounding result and the second random number to obtain a second offset.
Optionally, the first sampling module 601 is further configured to: calculating a second texture coordinate, a target offset corresponding to the second texture coordinate and a preset tiling repetition parameter according to a hashing algorithm to obtain a sampling position; determining a definition level of the first material capture map according to the sampling position; and under the definition level, acquiring color information corresponding to the first material capturing map.
Optionally, the color information includes a first color value and a second color value, wherein the first color value is determined by the first offset and the second color value is determined by the second offset; the first sampling module 601 is further configured to: sum the differences between the first color value and the second color value to obtain a first calculation result; perform multiply-and-mix calculation on the first interpolation coefficient, the first calculation result, and a preset scaling factor to obtain a second calculation result; perform smooth interpolation processing on the second calculation result to obtain a second interpolation coefficient; and linearly interpolate the first color value and the second color value by using the second interpolation coefficient to obtain the target color vector.
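A sketch of this smoothing step, with the smoothstep edges (0.2/0.8) and the scaling factor (0.1) as assumed values:

```python
import numpy as np

def smoothstep(edge0, edge1, x):
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def blend_colors(col1, col2, f, scale=0.1):
    diff_sum = float(np.sum(col1 - col2))   # first calculation result
    t = f - scale * diff_sum                # second calculation result
    w = smoothstep(0.2, 0.8, t)             # second interpolation coefficient
    return (1.0 - w) * col1 + w * col2      # target color vector
```

Biasing the interpolation coefficient by the summed color difference before the smoothstep makes the transition between the two hash-offset samples settle quickly onto one sample where they disagree, which hides the blend seam.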
Optionally, the calculation module 602 is further configured to: calculate the position matrix according to the position coordinates of the virtual camera in the virtual scene and a periodic factor, wherein the periodic factor is used to determine the conversion relationship between translation and rotation of the virtual camera in the virtual scene; and multiply the position matrix by the initial observation matrix to obtain the target observation matrix.
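One possible realization of this matrix step is sketched below. How exactly the periodic factor couples camera translation to a rotation angle is not specified here, so the coupling used is an assumption for illustration.

```python
import numpy as np

def target_view_matrix(initial_view, cam_pos, periodic=0.25):
    # Derive a rotation angle from the camera position via the
    # periodic factor (this particular coupling is an assumption).
    angle = periodic * float(np.sum(cam_pos))
    c, s = np.cos(angle), np.sin(angle)
    position_matrix = np.array([[c, -s, 0.0],
                                [s,  c, 0.0],
                                [0.0, 0.0, 1.0]])
    # Target observation matrix = position matrix x initial matrix.
    return position_matrix @ initial_view
```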
Optionally, the second sampling module 603 is further configured to: perform space transformation and normalization processing on the first normal information by using the target observation matrix to obtain second normal information, wherein the second normal information corresponds to the view space; perform space transformation and normalization processing on the model normal information of the virtual model by using the initial observation matrix to obtain third normal information, wherein the third normal information also corresponds to the view space; perform linear interpolation processing on the second normal information and the third normal information by using a preset normal reflection intensity parameter to obtain an interpolation result; and perform texture sampling on the second material capturing map based on the interpolation result to obtain the sampling result.
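A sketch of this view-space sampling step; the uv = n.xy * 0.5 + 0.5 lookup is the common material-capture (matcap) convention and is assumed here rather than quoted from the text.

```python
import numpy as np

def sample_matcap(matcap, n_first, n_model, target_view, initial_view, k):
    n2 = target_view @ n_first        # second normal information
    n2 = n2 / np.linalg.norm(n2)      # normalized, in view space
    n3 = initial_view @ n_model       # third normal information
    n3 = n3 / np.linalg.norm(n3)      # normalized, in view space
    n = (1.0 - k) * n3 + k * n2       # interpolation result, k = intensity
    uv = n[:2] * 0.5 + 0.5            # assumed matcap lookup convention
    h, w = matcap.shape[:2]
    return matcap[int(uv[1] * (h - 1)), int(uv[0] * (w - 1))]
```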
Optionally, fig. 7 is a block diagram of an alternative model rendering apparatus according to one embodiment of the present application. As shown in fig. 7, in addition to all the modules shown in fig. 6, the apparatus further includes an update module 605, configured to: perform value upper-limit adjustment on the interpolation result to update the interpolation result; and map the components of the interpolation result to the target value interval, respectively, to update the interpolation result.
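Both optional updates can be sketched in a few lines; the [0, 1] target interval and the 0.5-scale remap are assumed values, since the text only states that components are mapped to a target interval.

```python
import numpy as np

def update_interpolation_result(n, upper=1.0):
    n = np.minimum(n, upper)    # value upper-limit adjustment
    return n * 0.5 + 0.5        # per-component remap into the assumed [0, 1] interval
```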
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; alternatively, the above modules may be located in different processors in any combination.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
S1, sampling a noise map and a first material capturing map based on a hashing algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map used for determining a surface reflection effect of a virtual model;
S2, calculating a target observation matrix based on a position matrix and an initial observation matrix of a virtual camera, wherein the virtual camera is used for observing a virtual model in a virtual scene;
S3, performing texture sampling on the second material capturing map by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model;
and S4, rendering the virtual model by adopting the sampling result to obtain a rendering result.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a texture drawing index and a first interpolation coefficient by using the noise map; performing texture sampling and scaling on a preset normal map to obtain a first texture coordinate; carrying out random offset processing on the first texture coordinate by using the texture drawing index to obtain a second texture coordinate; acquiring color information corresponding to the first material capturing map based on a hash algorithm and the second texture coordinates; and carrying out interpolation smoothing processing on the color information based on the first interpolation coefficient to obtain a target color vector, wherein the target color vector is used for representing the first normal information.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a sampling offset value based on the texture drawing index; generating a target offset by using a rounding result corresponding to the sampling offset value; and performing offset processing on the first texture coordinate based on the target offset to obtain a second texture coordinate.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: the target offset comprises a first offset and a second offset, and the sampling offset value is rounded to obtain a first rounding result and a second rounding result, wherein the rounding directions of the first rounding result and the second rounding result are opposite; and performing trigonometric function multiplication on the first rounding result and the first random number to obtain a first offset, and performing trigonometric function multiplication on the second rounding result and the second random number to obtain a second offset.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: calculating a second texture coordinate, a target offset corresponding to the second texture coordinate and a preset tiling repetition parameter according to a hashing algorithm to obtain a sampling position; determining a definition level of the first material capture map according to the sampling position; and under the definition level, acquiring color information corresponding to the first material capturing map.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: the color information includes a first color value and a second color value, wherein the first color value is determined by a first offset and the second color value is determined by a second offset; summing the differences between the first color value and the second color value to obtain a first calculation result; multiplying and mixing calculation is carried out on the first interpolation coefficient, the first calculation result and a preset scaling factor to obtain a second calculation result; performing smooth interpolation processing on the second calculation result to obtain a second interpolation coefficient; and linearly interpolating the first color value and the second color value by using the second interpolation coefficient to obtain a target color vector.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: calculating to obtain a position matrix according to the position coordinates of the virtual camera in the virtual scene and a periodic factor, wherein the periodic factor is used for determining the conversion relation of translation and rotation of the virtual camera in the virtual scene; and multiplying the position matrix and the initial observation matrix to obtain a target observation matrix.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: performing space transformation and normalization processing on the first normal information by using a target observation matrix to obtain second normal information, wherein the second normal information corresponds to a view space; performing space transformation and normalization processing on model normal information of the virtual model by using the initial observation matrix to obtain third normal information, wherein the third normal information corresponds to view space; performing linear interpolation processing on the second normal line information and the third normal line information by adopting preset normal line reflection intensity parameters to obtain an interpolation result; and performing texture sampling on the second material capturing map based on the interpolation result to obtain a sampling result.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: performing value upper limit adjustment on the interpolation result, and updating the interpolation result; and mapping the components of the interpolation result to the target value interval respectively, and updating the interpolation result.
In the computer-readable storage medium of the above embodiment, a technical solution for implementing a model rendering method is provided. First, a noise map and a first material capturing map are sampled based on a hashing algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map used for determining a surface reflection effect of a virtual model; a target observation matrix is then calculated based on the position matrix and the initial observation matrix of the virtual camera, wherein the virtual camera is used for observing the virtual model in a virtual scene; texture sampling is performed on the second material capturing map by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model; and on this basis, the virtual model is rendered by using the sampling result to obtain a rendering result. In this way, a color reflection effect that has low texture repeatability and that changes dynamically with the rotation and translation of the virtual camera is rendered for the virtual model, which reduces the texture repeatability of the material reflection effect, improves the dynamic variation of the color reflection, and solves the technical problems that the texture repeatability of the material reflection effect obtained by existing material capture schemes is high and the dynamic variation effect is poor.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a computer-readable storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to perform the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, a program product capable of implementing the method described above in this embodiment is stored on a computer-readable storage medium. In some possible implementations, the various aspects of the embodiments of the present application may also be implemented in the form of a program product, which includes program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the present application described in the "exemplary methods" section above, when the program product is run on the terminal device.
A program product for implementing the above method according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the embodiments of the present application is not limited thereto; in the embodiments of the present application, the computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Any combination of one or more computer-readable media may be employed by the program product described above. The computer-readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Embodiments of the present application also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic device may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
S1, sampling a noise map and a first material capturing map based on a hashing algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map used for determining a surface reflection effect of a virtual model;
S2, calculating a target observation matrix based on a position matrix and an initial observation matrix of a virtual camera, wherein the virtual camera is used for observing a virtual model in a virtual scene;
S3, performing texture sampling on the second material capturing map by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model;
and S4, rendering the virtual model by adopting the sampling result to obtain a rendering result.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a texture drawing index and a first interpolation coefficient by using the noise map; performing texture sampling and scaling on a preset normal map to obtain a first texture coordinate; carrying out random offset processing on the first texture coordinate by using the texture drawing index to obtain a second texture coordinate; acquiring color information corresponding to the first material capturing map based on a hash algorithm and the second texture coordinates; and carrying out interpolation smoothing processing on the color information based on the first interpolation coefficient to obtain a target color vector, wherein the target color vector is used for representing the first normal information.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a sampling offset value based on the texture drawing index; generating a target offset by using a rounding result corresponding to the sampling offset value; and performing offset processing on the first texture coordinate based on the target offset to obtain a second texture coordinate.
Optionally, the above processor may be further configured to perform the following steps by a computer program: the target offset comprises a first offset and a second offset, and the sampling offset value is rounded to obtain a first rounding result and a second rounding result, wherein the rounding directions of the first rounding result and the second rounding result are opposite; and performing trigonometric function multiplication on the first rounding result and the first random number to obtain a first offset, and performing trigonometric function multiplication on the second rounding result and the second random number to obtain a second offset.
Optionally, the above processor may be further configured to perform the following steps by a computer program: calculating a second texture coordinate, a target offset corresponding to the second texture coordinate and a preset tiling repetition parameter according to a hashing algorithm to obtain a sampling position; determining a definition level of the first material capture map according to the sampling position; and under the definition level, acquiring color information corresponding to the first material capturing map.
Optionally, the above processor may be further configured to perform the following steps by a computer program: the color information includes a first color value and a second color value, wherein the first color value is determined by a first offset and the second color value is determined by a second offset; summing the differences between the first color value and the second color value to obtain a first calculation result; multiplying and mixing calculation is carried out on the first interpolation coefficient, the first calculation result and a preset scaling factor to obtain a second calculation result; performing smooth interpolation processing on the second calculation result to obtain a second interpolation coefficient; and linearly interpolating the first color value and the second color value by using the second interpolation coefficient to obtain a target color vector.
Optionally, the above processor may be further configured to perform the following steps by a computer program: calculating to obtain a position matrix according to the position coordinates of the virtual camera in the virtual scene and a periodic factor, wherein the periodic factor is used for determining the conversion relation of translation and rotation of the virtual camera in the virtual scene; and multiplying the position matrix and the initial observation matrix to obtain a target observation matrix.
Optionally, the above processor may be further configured to perform the following steps by a computer program: performing space transformation and normalization processing on the first normal information by using a target observation matrix to obtain second normal information, wherein the second normal information corresponds to a view space; performing space transformation and normalization processing on model normal information of the virtual model by using the initial observation matrix to obtain third normal information, wherein the third normal information corresponds to view space; performing linear interpolation processing on the second normal line information and the third normal line information by adopting preset normal line reflection intensity parameters to obtain an interpolation result; and performing texture sampling on the second material capturing map based on the interpolation result to obtain a sampling result.
Optionally, the above processor may be further configured to perform the following steps by a computer program: performing value upper limit adjustment on the interpolation result, and updating the interpolation result; and mapping the components of the interpolation result to the target value interval respectively, and updating the interpolation result.
In the electronic device of the above embodiment, a technical solution for implementing a model rendering method is provided. First, a noise map and a first material capturing map are sampled based on a hashing algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map used for determining a surface reflection effect of a virtual model; a target observation matrix is then calculated based on the position matrix and the initial observation matrix of the virtual camera, wherein the virtual camera is used for observing the virtual model in a virtual scene; texture sampling is performed on the second material capturing map by using the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model; and on this basis, the virtual model is rendered by using the sampling result to obtain a rendering result. In this way, a color reflection effect that has low texture repeatability and that changes dynamically with the rotation and translation of the virtual camera is rendered for the virtual model, which reduces the texture repeatability of the material reflection effect, improves the dynamic variation of the color reflection, and solves the technical problems that the texture repeatability of the material reflection effect obtained by existing material capture schemes is high and the dynamic variation effect is poor.
Fig. 8 is a schematic diagram of an electronic device according to one embodiment of the present application. As shown in fig. 8, the electronic device 800 is merely an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 8, the electronic device 800 is embodied in the form of a general-purpose computing device. Components of the electronic device 800 may include, but are not limited to: at least one processor 810, at least one memory 820, a bus 830 connecting the various system components (including the memory 820 and the processor 810), and a display 840.
The memory 820 stores program code that can be executed by the processor 810, so that the processor 810 performs the steps according to the various exemplary implementations of the present application described in the method section above.
The memory 820 may include readable media in the form of volatile memory units, such as a random access memory (RAM) 8201 and/or a cache memory 8202, may further include a read-only memory (ROM) 8203, and may also include nonvolatile memory, such as one or more magnetic storage devices, flash memory, or other nonvolatile solid-state memory.
In some examples, memory 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. The memory 820 may further include memory remotely located relative to the processor 810, which may be connected to the electronic device 800 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The bus 830 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, or a local bus using any of a variety of bus architectures.
The display 840 may be, for example, a touch screen type liquid crystal display (Liquid Crystal Display, LCD) that may enable a user to interact with a user interface of the electronic device 800.
Optionally, the electronic device 800 may also communicate with one or more external devices 900 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 850. Moreover, the electronic device 800 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 860. As shown in fig. 8, the network adapter 860 communicates with the other modules of the electronic device 800 over the bus 830. It should be appreciated that, although not shown in fig. 8, other hardware and/or software modules may be used in connection with the electronic device 800, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, redundant arrays of independent disks (RAID) systems, tape drives, data backup storage systems, and the like.
The electronic device 800 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 8 is merely illustrative and does not limit the configuration of the electronic device described above. For example, the electronic device 800 may include more or fewer components than shown in fig. 8, or have a configuration different from that shown in fig. 8. The memory 820 may be used to store a computer program and its data, such as the computer program and data corresponding to the model rendering method in the embodiments of the present application. The processor 810 executes the computer program stored in the memory 820 to perform various functional applications and data processing, i.e., to implement the model rendering method described above.
The foregoing embodiment numbers of the present application are merely for description and do not imply the superiority or inferiority of the embodiments.
In the foregoing embodiments of the present application, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary. For example, the division of the units is merely a logical function division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between units or modules may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as standalone products, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing program code.
The foregoing descriptions are merely preferred embodiments of the present application. It should be noted that several modifications and refinements may be made by those of ordinary skill in the art without departing from the principles of the present application, and such modifications and refinements shall also fall within the protection scope of the present application.

Claims (12)

1. A model rendering method, comprising:
sampling the noise map and the first material capturing map based on a hash algorithm to obtain first normal information, wherein the first material capturing map is a normal texture map for determining the surface reflection effect of the virtual model;
calculating a target observation matrix based on a position matrix and an initial observation matrix of a virtual camera, wherein the virtual camera is used for observing the virtual model in a virtual scene;
texture sampling is carried out on a second material capturing mapping by utilizing the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing mapping is used for determining the sphere reflection color of the virtual model;
and rendering the virtual model by adopting the sampling result to obtain a rendering result.
2. The model rendering method of claim 1, wherein sampling the noise map and the first material capture map based on the hashing algorithm to obtain the first normal information includes:
determining a texture drawing index and a first interpolation coefficient by using the noise map;
performing texture sampling and scaling on a preset normal map to obtain a first texture coordinate;
performing random offset processing on the first texture coordinates by using the texture drawing index to obtain second texture coordinates;
acquiring color information corresponding to the first material capturing map based on the hash algorithm and the second texture coordinates;
and carrying out interpolation smoothing processing on the color information based on the first interpolation coefficient to obtain a target color vector, wherein the target color vector is used for representing the first normal information.
3. The model rendering method of claim 2, wherein randomly shifting the first texture coordinates using the texture rendering index to obtain the second texture coordinates comprises:
determining a sampling offset value based on the texture drawing index;
generating a target offset by using a rounding result corresponding to the sampling offset value;
and performing offset processing on the first texture coordinate based on the target offset to obtain the second texture coordinate.
4. A model rendering method according to claim 3, wherein the target offset includes a first offset and a second offset, and generating the target offset using the rounding result corresponding to the sampling offset value includes:
rounding the sampling offset value to obtain a first rounding result and a second rounding result, wherein the rounding directions of the first rounding result and the second rounding result are opposite;
and performing trigonometric function multiplication on the first rounding result and the first random number to obtain the first offset, and performing trigonometric function multiplication on the second rounding result and the second random number to obtain the second offset.
5. The model rendering method of claim 2, wherein obtaining the color information corresponding to the first material capture map based on the hashing algorithm and the second texture coordinates comprises:
calculating the second texture coordinates, target offset corresponding to the second texture coordinates and preset tiling repetition parameters according to the hashing algorithm to obtain sampling positions;
determining a definition level of the first material capture map according to the sampling position;
and under the definition level, acquiring color information corresponding to the first material capturing map.
6. The model rendering method of claim 2, wherein the color information includes a first color value and a second color value, wherein the first color value is determined by a first offset and the second color value is determined by a second offset;
and wherein performing interpolation smoothing processing on the color information based on the first interpolation coefficient to obtain the target color vector comprises:
summing the differences between the first color value and the second color value to obtain a first calculation result;
multiplying and mixing calculation is carried out on the first interpolation coefficient, the first calculation result and a preset scaling factor to obtain a second calculation result;
performing smooth interpolation processing on the second calculation result to obtain a second interpolation coefficient;
and linearly interpolating the first color value and the second color value by using the second interpolation coefficient to obtain the target color vector.
7. The model rendering method of claim 1, wherein calculating the target observation matrix based on the position matrix and the initial observation matrix of the virtual camera comprises:
calculating to obtain the position matrix according to the position coordinates of the virtual camera in the virtual scene and a periodic factor, wherein the periodic factor is used for determining the conversion relation of translation and rotation of the virtual camera in the virtual scene;
and multiplying the position matrix and the initial observation matrix to obtain the target observation matrix.
8. The model rendering method of claim 1, wherein performing texture sampling on the second material capture map by using the first normal information and the target observation matrix to obtain the sampling result comprises:
performing space transformation and normalization processing on the first normal information by using the target observation matrix to obtain second normal information, wherein the second normal information corresponds to a view space;
performing space transformation and normalization processing on model normal information of the virtual model by using the initial observation matrix to obtain third normal information, wherein the third normal information corresponds to view space;
performing linear interpolation processing on the second normal information and the third normal information by adopting preset normal reflection intensity parameters to obtain an interpolation result;
and performing texture sampling on the second material capturing map based on the interpolation result to obtain the sampling result.
9. The model rendering method of claim 8, further comprising at least one of:
performing value upper limit adjustment on the interpolation result, and updating the interpolation result;
and mapping the components of the interpolation result to a target value interval, respectively, and updating the interpolation result.
10. A model rendering apparatus, characterized by comprising:
the first sampling module is used for sampling the noise mapping and the first material capturing mapping based on a hashing algorithm to obtain first normal information, wherein the first material capturing mapping is a normal texture mapping used for determining the surface reflection effect of the virtual model;
the computing module is used for computing a target observation matrix based on a position matrix and an initial observation matrix of a virtual camera, wherein the virtual camera is used for observing the virtual model in a virtual scene;
the second sampling module is used for performing texture sampling on a second material capturing map by utilizing the first normal information and the target observation matrix to obtain a sampling result, wherein the second material capturing map is used for determining the sphere reflection color of the virtual model;
and the rendering module is used for rendering the virtual model by adopting the sampling result to obtain a rendering result, wherein the rendering result is used for displaying the preset color reflection effect of the virtual model.
11. A computer readable storage medium, characterized in that a computer program is stored in the computer readable storage medium, wherein the computer program is arranged to perform the model rendering method of any one of claims 1 to 9 when being run by a processor.
12. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the model rendering method as claimed in any of claims 1 to 9.
CN202311366639.7A 2023-10-20 2023-10-20 Model rendering method and device, storage medium and electronic device Pending CN117671114A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311366639.7A CN117671114A (en) 2023-10-20 2023-10-20 Model rendering method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311366639.7A CN117671114A (en) 2023-10-20 2023-10-20 Model rendering method and device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN117671114A true CN117671114A (en) 2024-03-08

Family

ID=90063026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311366639.7A Pending CN117671114A (en) 2023-10-20 2023-10-20 Model rendering method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN117671114A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination