CN118334232A - Method and device for realizing highlight effect and electronic equipment - Google Patents
- Publication number
- CN118334232A (application CN202410382877.5A)
- Authority
- CN
- China
- Prior art keywords
- highlight
- virtual object
- pixel point
- anisotropic
- direction vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Generation (AREA)
Abstract
The application discloses a method, an apparatus, an electronic device and a computer-readable storage medium for realizing a highlight effect. The method comprises the following steps: in response to a selection operation for a virtual object, acquiring an anisotropic highlight offset map corresponding to the virtual object; based on an initial material shader, generating first color increment information for each pixel point of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point; and rendering the virtual object based on the first color increment information for each pixel point, to obtain a virtual object with a first highlight effect. The method solves the technical problem in the prior art that, because a highlight effect cannot be rendered on virtual objects in dark parts and shadow areas, those virtual objects lack detail texture and have a poor expressive effect.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for implementing a highlight effect, an electronic device, and a computer readable storage medium.
Background
The highlight effect is an indispensable rendering effect in virtual business projects such as games; it enhances the stereoscopic impression and realism of scenes and objects. For materials with a pronounced grain direction (e.g., hair), the highlight is affected by both the observer's line-of-sight direction and the light-source direction, and is therefore anisotropic.
Current anisotropic highlight rendering methods cannot render a highlight on virtual objects in dark parts and shadow areas, mainly because those objects receive no illumination; as a result, virtual objects in dark parts and shadow areas lack detail texture.
The prior art therefore suffers from the technical problem that, because a highlight effect cannot be rendered on virtual objects in dark parts and shadow areas, those objects lack detail texture and have a poor expressive effect.
Disclosure of Invention
The application provides a method, an apparatus, an electronic device and a computer-readable storage medium for realizing a highlight effect, to solve the technical problem in the prior art that virtual objects in dark parts and shadow areas lack detail texture and have a poor expressive effect because a highlight effect cannot be rendered on them.
In a first aspect, an embodiment of the present application provides a method for implementing a highlight effect. The method includes: in response to a selection operation for a virtual object, acquiring an anisotropic highlight offset map corresponding to the virtual object, where the anisotropic highlight offset map is used to represent the texture direction of the virtual object; and, based on an initial material shader, generating first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point. The initial material shader includes a highlight model, a first highlight parameter and a second highlight parameter; the highlight model calculates an offset parameter for the first highlight parameter according to the line-of-sight direction vector, the second highlight parameter and the anisotropic highlight offset map; the offset parameter is used to perform offset correction on the first highlight parameter so as to generate the first color increment information; and the line-of-sight direction vector is determined by the direction from a virtual camera to the pixel point. The method further includes rendering the virtual object based on the first color increment information corresponding to each pixel point, to obtain a virtual object with a first highlight effect.
In a second aspect, an embodiment of the present application provides a method for implementing a highlight effect. The method includes: in response to a rendering instruction for a virtual object, taking the direction from a virtual camera to each pixel point of the virtual object at the current rendering time as the line-of-sight direction, and determining a line-of-sight direction vector corresponding to each pixel point; based on a material shader, generating color increment information corresponding to each pixel point according to an anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel point, where the anisotropic highlight offset map is used to represent the texture direction of the virtual object; and performing color rendering on the virtual object according to the color increment information corresponding to each pixel point, to generate the highlight effect of the virtual object at the current rendering time.
In a third aspect, an embodiment of the present application provides an apparatus for implementing a highlight effect. The apparatus includes an acquisition unit, a generation unit, a rendering unit and a display unit. The acquisition unit is configured to, in response to a selection operation for a virtual object, acquire an anisotropic highlight offset map corresponding to the virtual object, where the anisotropic highlight offset map is used to represent the texture direction of the virtual object. The generation unit is configured to generate, based on an initial material shader, first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point; the initial material shader includes a highlight model, a first highlight parameter and a second highlight parameter, the highlight model calculates an offset parameter for the first highlight parameter according to the line-of-sight direction vector, the second highlight parameter and the anisotropic highlight offset map, the offset parameter is used to perform offset correction on the first highlight parameter so as to generate the first color increment information, and the line-of-sight direction vector is determined by the direction from a virtual camera to the pixel point. The rendering unit is configured to render the virtual object based on the first color increment information corresponding to each pixel point, to obtain a virtual object with a first highlight effect.
In a fourth aspect, an embodiment of the present application provides an apparatus for implementing a highlight effect. The apparatus includes a line-of-sight direction determining unit, a color increment generating unit and a highlight effect rendering unit. The line-of-sight direction determining unit is configured to, in response to a rendering instruction for a virtual object, take the direction from a virtual camera to each pixel point of the virtual object at the current rendering time as the line-of-sight direction, and determine a line-of-sight direction vector corresponding to each pixel point. The color increment generating unit is configured to generate, based on a material shader, color increment information corresponding to each pixel point according to the anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel point, where the anisotropic highlight offset map is used to represent the texture direction of the virtual object. The highlight effect rendering unit is configured to perform color rendering on the virtual object according to the color increment information corresponding to each pixel point, to generate the highlight effect of the virtual object at the current rendering time.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a memory, a processor; the memory is used for storing one or more computer instructions; the processor is configured to execute the one or more computer instructions to implement the method described above.
In a sixth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon one or more computer instructions which, when executed by a processor, perform the above-described method.
Compared with the prior art, the method provided by the application generates, based on the initial material shader, first color increment information for each pixel point of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point, and performs highlight rendering of the virtual object based on that information. In this method, the highlight model generates the offset parameter from the line-of-sight direction vector, the second highlight parameter and the anisotropic highlight offset map, and then performs offset correction on the first highlight parameter to generate the first color increment information. No illumination information is involved in generating the color increment information, so an anisotropic highlight effect can be rendered even for a virtual object in a dark part or shadow area that receives no light. This solves the technical problem in the prior art that virtual objects in dark parts and shadow areas lack detail texture and have a poor expressive effect because a highlight effect cannot be rendered on them.
Drawings
Fig. 1 is an application system diagram of a method for implementing a highlight effect according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for implementing a highlight effect according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a selection operation for a virtual object according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an anisotropic highlight offset map according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a parameter setting interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a highlight effect provided by an embodiment of the present application;
FIG. 7 is a flow chart of a method for implementing a highlight effect according to another embodiment of the present application;
FIG. 8 is a schematic structural diagram of an apparatus for realizing a highlight effect according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an apparatus for realizing a highlight effect according to another embodiment of the present application;
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the present application may be embodied in many forms other than those described herein, and those skilled in the art may make similar adaptations without departing from its spirit; therefore, the present application is not limited to the specific embodiments disclosed below.
For ease of understanding, technical terms that may be involved in embodiments of the present application will first be briefly described.
1. Highlight effect
The highlight effect is an indispensable special effect in three-dimensional rendering and image editing. It simulates the bright reflective area formed where light strikes the surface of an object, and can greatly enhance the stereoscopic impression and realism of a scene or object. In other words, the highlight effect is an important technique that locally brightens an image by accurately simulating real-world illumination, thereby improving visual impact and the perception of depth.
2. Anisotropy
Anisotropy (Anisotropy) refers to the property that all or some of the chemical, physical or other characteristics of a substance vary with direction, exhibiting differences in different directions. In illumination rendering, anisotropy refers to the phenomenon that the reflection characteristics of an object's surface differ by direction. Anisotropy affects the appearance of highlights: when a model uses a material with anisotropic reflective properties, the highlight varies with the viewing angle and with the direction of the light source relative to the surface texture of the object.
3. Direct light
Direct light (Direct Light) refers to light that travels straight from a light source to the surface of an object without being reflected, scattered or otherwise indirectly propagated by another medium. In a natural environment, sunlight is a typical direct light source: on a clear, cloudless day it reaches the earth's surface without being scattered by particles in the atmosphere. In a virtual engine, direct light generally refers to virtual sunlight, the main light source in a virtual scene, modeled as parallel (directional) light.
4. Tangential space
Tangent space (Tangent Space) is a concept in computer graphics that plays an important role in rendering and texture mapping. It is a coordinate system defined locally on a curved surface, associated with a vertex of the model surface, and is generally described by three mutually perpendicular vectors: the tangent, the normal and the bitangent (also called the binormal).
5. Tangent line
The tangent (Tangent, T) is the vector along the surface in the direction in which the texture coordinate U changes fastest. For 3D models, especially under texture mapping, the tangent is typically aligned with the U-axis of the texture coordinates. A tangent can also be understood as a straight line that just touches a point on a curve.
6. Normal line
The normal (Normal, N) is a unit vector perpendicular to the surface, representing the orientation of the surface at the vertex. In illumination calculations, normals are used to determine the angular relationship between a ray and the surface. A normal can also be understood as a straight line perpendicular to a given plane.
7. Bitangent (binormal)
The bitangent (Binormal, B) is a vector perpendicular to the tangent and lying in the tangent plane, aligned with the V-axis of the texture coordinates. Its direction is usually obtained as the cross product of the normal and the tangent (the handedness convention varies by engine). It can also be understood as a straight line perpendicular to both the tangent and the normal.
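As a small illustration of the cross-product construction above, the following plain-Python sketch builds the bitangent from a hypothetical tangent and normal (the vectors are made-up examples; real engines take them from vertex data, and the handedness convention varies by engine):

```python
import math

def normalize(v):
    # Scale a 3-component vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    # Cross product of two 3-component vectors.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Hypothetical per-vertex data: tangent along U, normal perpendicular to the surface.
T = normalize((1.0, 0.0, 0.0))
N = normalize((0.0, 0.0, 1.0))

# The bitangent completes the orthonormal tangent-space basis.
B = cross(N, T)
print(B)  # (0.0, 1.0, 0.0)
```

Together, (T, B, N) form the local coordinate frame in which tangent-space normal maps and tangent offsets are expressed.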
8. Half angle vector
The half-angle vector (Half-Angle Vector) is the normalized sum of the line-of-sight direction vector and the light direction vector, i.e., the unit vector lying halfway between the two. The half-angle vector is a key quantity for calculating specular reflection intensity: it combines the light direction with the observer direction and is used to simulate the specular reflection effect.
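A minimal sketch of the half-angle computation (plain Python; the light and view directions are hypothetical unit vectors):

```python
import math

def normalize(v):
    # Scale a 3-component vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

# Hypothetical unit vectors: L toward the light source, V toward the viewer.
L = normalize((0.0, 0.0, 1.0))
V = normalize((1.0, 0.0, 1.0))

# Half-angle vector: normalized sum of the light and view directions.
H = normalize(tuple(l + v for l, v in zip(L, V)))
```

Here L and V are 45 degrees apart, so H bisects them at 22.5 degrees from each.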
9. Kajiya-Kay model
The Kajiya-Kay model is a classic computer-graphics method for simulating and rendering the shading of anisotropic hair (and similar fibrous structures). The model focuses on how light interacts with surfaces that have a pronounced directional characteristic. Real hair fibers are anisotropic: their reflection, absorption and transmission of light vary with the viewing angle, and in particular they exhibit different optical behavior perpendicular to the hair axis than along it. The Kajiya-Kay model exploits this by computing the reflected color and intensity of hair from an angular distribution function that depends on the angle between the incident light and the hair axis. This distribution function typically takes a Gaussian or similar form to approximate the effect of the hair's microstructure on light.
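The model's core specular term can be sketched in a few lines. This is a hedged plain-Python illustration of the standard Kajiya-Kay formulation (intensity from the angle between the hair tangent and the half-angle vector), not the shader code of this application, and the exponent is a hypothetical example value:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def kajiya_kay_specular(T, H, exponent):
    # Kajiya-Kay specular term: intensity depends on the angle between the
    # hair tangent T and the half-angle vector H, not on a surface normal.
    t_dot_h = dot(normalize(T), normalize(H))
    sin_th = math.sqrt(max(0.0, 1.0 - t_dot_h * t_dot_h))
    return sin_th ** exponent

# Hypothetical inputs: tangent along the strand, half vector perpendicular to it.
T = (1.0, 0.0, 0.0)
H = (0.0, 0.0, 1.0)
print(kajiya_kay_specular(T, H, 32.0))  # perpendicular -> maximum intensity 1.0
```

The sine term peaks when H is perpendicular to the strand, which is what stretches the highlight along the hair's texture direction.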
With the development of computer technology, virtual business projects such as games are judged not only on usability but also on visual quality, and the highlight effect is an indispensable rendering effect for improving that quality: scenes and objects with highlights give users a more stereoscopic, more realistic visual perception. For materials with a pronounced texture direction (e.g., hair), the highlight is affected by the observer's line-of-sight direction and the light direction, and is therefore anisotropic.
At present, the anisotropic highlight effect is computed with the Kajiya-Kay model, which takes the half-angle vector as one of its inputs and calculates the color increment of the virtual object under different light and line-of-sight directions, thereby rendering the corresponding highlight. Because the half-angle vector is the normalized sum of the line-of-sight direction vector and the light direction vector, virtual objects in dark parts and shadow areas, which lack illumination, cannot receive a highlight; they therefore lack detail texture and have a poor expressive effect. For black virtual objects in particular (e.g., black hair), dark and shadow areas render as a featureless black mass, giving the user a poor visual experience.
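The shortcoming can be made concrete with a trivial numeric example: in the classic light-dependent formulation the specular term is ultimately scaled by the light reaching the pixel, so an unlit pixel contributes nothing (illustrative values only):

```python
# Classic light-dependent highlight: the specular term is multiplied by the
# light intensity arriving at the pixel. In a dark part or shadow area that
# intensity is zero, so the highlight -- and the detail texture it carries --
# vanishes entirely.
specular_term = 0.8       # hypothetical sin(T, H)^exponent value
light_intensity = 0.0     # pixel lies in a dark part / shadow area
classic_highlight = specular_term * light_intensity
print(classic_highlight)  # 0.0
```

This is precisely the failure mode the view-direction-based method of this application avoids.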
In view of this, the present application provides a method for implementing a highlight effect that, based on an initial material shader, generates first color increment information for each pixel point of the virtual object from the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point, and renders the highlight of the virtual object from that information. The highlight model generates the offset parameter from the line-of-sight direction vector, the second highlight parameter and the anisotropic highlight offset map, and then performs offset correction on the first highlight parameter to generate the first color increment information. No illumination information is involved in generating the color increment information, so an anisotropic highlight can be rendered even for a virtual object in an unlit dark part or shadow area.
The method, apparatus, electronic device, and computer-readable storage medium for realizing the highlight effect according to the present application are described in further detail below with reference to specific embodiments and drawings.
Fig. 1 is an application system diagram of a method for implementing a highlight effect according to an embodiment of the present application.
As shown in fig. 1, the system includes a user terminal 101 and a server 102. The user terminal 101 may be any device such as a smart phone, tablet computer, notebook computer, desktop computer or personal digital assistant (PDA). The server 102 may be a processing module inside the user terminal 101, a processing device electrically connected to the user terminal 101, or a server communicatively connected to multiple user terminals 101. The method for realizing the highlight effect provided by the embodiment of the present application is deployed on the server 102; when a user sends a rendering request for a virtual object through the user terminal 101, the method is executed to render the virtual object and present it to the user through the user terminal 101.
An embodiment of the application provides a method for realizing a highlight effect.
Fig. 2 is a flowchart of a method for implementing the highlight effect provided in the present embodiment. The method for realizing the highlight effect provided in this embodiment is described in detail below with reference to fig. 2. The examples referred to in the following description are for explaining the technical solution of the present application and are not intended to be limiting in practical use.
As shown in fig. 2, the implementation method of the highlight effect provided in the present embodiment includes the following steps S210 to S230.
Step S210, in response to a selection operation for a virtual object, acquiring an anisotropic highlight offset map corresponding to the virtual object, where the anisotropic highlight offset map is used to represent the texture direction of the virtual object.
Step S220, based on an initial material shader, generating first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel point; the initial material shader includes a highlight model, a first highlight parameter and a second highlight parameter, where the highlight model calculates an offset parameter for the first highlight parameter according to the line-of-sight direction vector, the second highlight parameter and the anisotropic highlight offset map, the offset parameter is used to perform offset correction on the first highlight parameter so as to generate the first color increment information, and the line-of-sight direction vector is determined by the direction from a virtual camera to the pixel point.
Step S230, rendering the virtual object based on the first color increment information corresponding to each pixel point of the virtual object, to obtain a virtual object with a first highlight effect.
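The three steps above can be sketched end to end. The following plain-Python sketch is a hedged reconstruction under assumptions: the sampled offset-map value shifts the tangent along the normal, the intensity comes from the tangent and the view direction alone (no light vector), and the shift scale, exponent and highlight color are hypothetical stand-ins for the first and second highlight parameters, whose exact roles the shader defines:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def color_increment(tangent, normal, view, offset_sample,
                    shift_scale, exponent, highlight_color):
    # Shift the tangent along the normal by the sampled offset-map value
    # (offset_sample in [0, 1], remapped to [-0.5, 0.5]); this is what makes
    # the highlight follow the painted texture direction.
    shift = (offset_sample - 0.5) * shift_scale
    t_shifted = normalize(tuple(t + shift * n for t, n in zip(tangent, normal)))
    # Intensity from the tangent vs. the VIEW direction only -- no light
    # vector enters, so pixels in shadow still receive a highlight increment.
    t_dot_v = dot(t_shifted, normalize(view))
    intensity = math.sqrt(max(0.0, 1.0 - t_dot_v * t_dot_v)) ** exponent
    return tuple(intensity * c for c in highlight_color)

# Hypothetical pixel: tangent along X, camera looking straight down +Z.
rgb = color_increment(tangent=(1.0, 0.0, 0.0), normal=(0.0, 0.0, 1.0),
                      view=(0.0, 0.0, 1.0), offset_sample=0.5,
                      shift_scale=1.0, exponent=16.0,
                      highlight_color=(1.0, 0.9, 0.8))
```

Because no light direction enters the computation, the same increment is produced whether the pixel is lit or in shadow, which is the point of the method.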
Virtual business projects are business activities created, managed and operated in a digital environment; they do not depend on a conventional physical presence but are implemented through the internet and digital technologies, for example virtual games, virtual conferences, virtual-reality projects and augmented-reality projects. Because virtual business projects lack real lighting, developers must simulate realistic lighting effects through programming or specific graphics tools. Therefore, in a specific implementation, the method provided in this embodiment may be applied in the development stage of a virtual business project, allowing a developer to render a highlight effect on a virtual object, including virtual objects in dark parts or shadow areas of the virtual scene.
The following describes the above steps in detail:
For step S210:
In this step, an anisotropic highlight offset map corresponding to the virtual object is acquired in response to the selection operation for the virtual object.
The virtual object is any preset object in the virtual scene. It may lie in an illuminated area or in a dark part or shadow area, and may be an independent entity or a part of a larger body. In one embodiment, the virtual object is the object for which the developer currently intends to achieve a highlight effect, located in a dark part or shadow area of the virtual scene. In general, before the development stage begins, the project designer has already produced a project plan for the virtual business project; from this plan the developer knows which virtual objects are, or may be, in dark parts and shadow areas of the virtual scene, treats them as the objects on which a highlight effect is to be achieved, and applies the method of this embodiment to them in turn. Note that a virtual object in an illuminated area receives light, so its highlight effect may be achieved either with the prior art or with the method of this embodiment; no limitation is placed here.
The selection operation may be understood as an operation in which a developer selects a virtual object currently achieving a highlight effect in a project development tool, such as: the game developer operates in the game engine to select the hair of the character a currently achieving the highlight effect.
Fig. 3 is a schematic diagram of a selection operation for a virtual object provided in this embodiment.
As shown in fig. 3, the virtual object selection interface 30 of the game engine includes multiple candidate virtual objects, such as body03, body02, hair, eye and skin. These candidates may all belong to the same virtual character, so a character to be processed, such as virtual character A, may first be selected in the virtual character selection box 31, after which the candidate virtual objects of character A are displayed in the candidate virtual object display box 32. When the developer intends to apply a highlight effect to the hair of virtual character A, the candidate virtual object "hair" is selected, making it the virtual object on which the highlight effect will be realized.
The anisotropic highlight offset map represents the texture direction of the virtual object and can be understood as a texture map consistent with that direction. From it, the different highlight appearances of a given material's surface at different angles can be simulated; specifically, the shape and direction of the highlight region change as the viewing angle changes. Anisotropic highlight offset maps typically contain one or more texture-coordinate channels that control the directionality of the highlight produced by the object surface, for example producing a stretched, jagged highlight along the hair-strand texture.
In an alternative implementation, an initial anisotropic highlight offset map may be provided for all virtual objects of the same material class, representing the initial texture direction of that class. Once the virtual object is determined, the corresponding initial anisotropic highlight offset map is retrieved according to the object's material class, and its shape is then adjusted to match the texture direction preset for the virtual object, forming the anisotropic highlight offset map corresponding to that object.
Based on this, the obtaining of the anisotropic highlight offset map corresponding to the virtual object may include the following steps S211 to S212:
step S211, in response to the selection operation for the virtual object, acquiring an initial anisotropic highlight offset map corresponding to the virtual object.
Step S212, adjusting the initial anisotropic highlight offset map according to the texture direction preset for the virtual object, so as to form the anisotropic highlight offset map.
Fig. 4 is a schematic diagram of an anisotropic highlight offset map according to the present embodiment.
Assume that the object to be processed is the hair of virtual character A in a virtual game, and that the texture direction of this hair is curled. When the game developer selects the hair of virtual character A as the virtual object in the game engine, an initial anisotropic highlight offset map pre-drawn by artists for the hair material can be obtained; since vertical hair is easier to draw, the initial anisotropic highlight offset map may be a texture map representing a vertical texture direction, as shown in fig. 4 (a). After the initial anisotropic highlight offset map is obtained, its shape is adjusted according to the curled texture direction preset for the hair of virtual character A, so that the vertical hair takes on the preset curled state, forming the anisotropic highlight offset map representing the curled texture direction shown in fig. 4 (b).
For step S220:
In this step, the first color increment information corresponding to each pixel of the virtual object is generated based on the initial texture shader, the anisotropic highlight offset map, and the line-of-sight direction vector corresponding to each pixel of the virtual object.
The initial texture shader may be understood as a texture shader whose parameters have not yet been optimized; it includes a highlight model, a first highlight parameter, and a second highlight parameter.
In an alternative implementation, the first highlight parameter includes highlight color information, and the second highlight parameter includes at least one of the following: highlight offset information, highlight range information, and highlight intensity information. The highlight color information controls the color of the highlight region seen by the observer; depending on the physical properties of the material itself, the highlight color can be white, metallic, or another specific color. For example, metallic materials may exhibit highlights of their own color, while non-metallic materials may exhibit highlights closer to the color of the light source in the virtual scene. The highlight offset information controls the degree of displacement of the highlight position relative to the normal direction; it can be used to simulate the phenomenon that on certain materials the highlight region is not reflected strictly along the normal direction, such as anisotropic highlights. By adjusting the highlight offset information, the position of the highlight region can change with the surface microstructure. The highlight range information describes the spread, or gloss, of the highlight region; a high gloss value means the highlight region is smaller and more concentrated, while a low gloss value makes it larger and more blurred, mainly affecting the speed at which the highlight decays.
The highlight intensity information defines the brightness level of the highlight effect on the object surface; it determines the brightness of the highlight region relative to the ambient illumination and the other illumination contributions of the virtual scene. A higher highlight intensity makes the material appear more reflective and shiny, while a lower highlight intensity reduces the effect and makes the material appear more matte.
The highlight model is used for calculating an offset parameter aiming at a first highlight parameter according to the sight line direction vector, a second highlight parameter and the anisotropic highlight offset map, so that offset correction is carried out on the first highlight parameter based on the offset parameter, and first color increment information is generated.
In a specific implementation, the highlight model is a reconstructed Kajiya-Kay model. The traditional Kajiya-Kay model uses a half-angle vector as one item of its input data to calculate the offset parameter, and the half-angle vector is the normalized sum of the light direction vector and the sight direction vector; therefore, the traditional Kajiya-Kay model cannot render the highlight effect on virtual objects in dark portions and shadow regions that cannot be reached by light. Based on this, in the method provided in this embodiment, the traditional Kajiya-Kay model is reconstructed, and its input data is changed from the half-angle vector to the line-of-sight direction vector, thereby eliminating the dependence of the highlight effect on illumination.
The offset parameter can be understood as a correction parameter for correcting the highlight color information, i.e. an anisotropy value output by the Kajiya-Kay model for different sight directions; the highlight color information corrected by the offset parameter exhibits an anisotropic visual effect.
The color increment refers to anisotropic color information generated after the highlight color information is corrected by adopting an offset parameter. Note that, the highlight effect is actually an effect added on the basis of the original rendering effect of the virtual object, and therefore, color information for rendering the highlight effect is defined as color delta information, that is, a color added on top of the original rendering effect. In this embodiment, the color increment information generated based on the initial texture shader is defined as the first color increment information.
The line-of-sight direction vector is determined according to the direction from the virtual camera to the pixel point. The virtual camera is a virtual tool for presenting a virtual scene, and it usually presents the scene from the observer's viewpoint; therefore, the direction from the virtual camera to a pixel point of the virtual object is the observer's line-of-sight direction, and the line-of-sight direction vector can be calculated based on it.
In a specific implementation, once the position of the observer (virtual camera) and the position of a pixel point of the virtual object are determined, the line-of-sight direction vector is calculated with the Normalize function, specifically by the following expression:
ViewDirection=Normalize(PixelPosition-EyePosition)
Where ViewDirection denotes the line-of-sight direction vector, PixelPosition denotes the position of the pixel point, EyePosition denotes the position of the observer (virtual camera), and Normalize() is the function that computes the line-of-sight direction vector; it ensures that the result is a vector of length 1, representing the exact direction from the observer position to the pixel position rather than just the distance.
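Under the assumption of simple tuple-based 3D vectors, the expression above can be sketched in Python as follows (the names ViewDirection, PixelPosition, and EyePosition are mirrored from the text; this is an illustration, not the patent's shader code):

```python
import math

def normalize(v):
    """Return the unit-length version of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def view_direction(pixel_position, eye_position):
    """ViewDirection = Normalize(PixelPosition - EyePosition):
    unit vector pointing from the observer (virtual camera) toward the pixel."""
    delta = tuple(p - e for p, e in zip(pixel_position, eye_position))
    return normalize(delta)

# Example: camera at the origin, pixel 3 units along x and 4 along y.
v = view_direction((3.0, 4.0, 0.0), (0.0, 0.0, 0.0))
```

Because the result is normalized, only the direction survives: moving the pixel twice as far along the same ray yields the same vector.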
In an alternative implementation, based on the initial texture shader, generating the first color increment information corresponding to each pixel point of the virtual object may include the following steps S221 to S224:
Step S221, sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object.
The anisotropic highlight offset map records the texture direction of the virtual object, which is usually represented in tangent space. The tangent space is a local coordinate system composed of three mutually perpendicular basis vectors: the tangent is the vector along the texture U-axis, the secondary normal (binormal) is the vector along the texture V-axis, and the normal is the vector perpendicular to both the tangent and the secondary normal, pointing outward from the object surface. By converting the direction information of the surface into the tangent space based on the local texture coordinates of the material surface, the highlight calculation is not affected by global coordinate transformations of the model, so that the highlight effect of the anisotropic material in different sight directions can be accurately simulated.
The normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object are obtained by sampling the anisotropic highlight offset map. Specifically, the texture coordinates of each vertex of the virtual object can be projected onto the anisotropic highlight offset map, and the map is sampled at the texture coordinate corresponding to the position of each pixel point of the virtual object; this yields the degree of anisotropic offset of the highlight at each pixel point, from which the normal direction vector and the secondary normal direction vector corresponding to each pixel point are obtained.
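The patent does not specify how the direction vectors are encoded in the map's channels. A common convention, shown as a hedged sketch below, stores each vector component in the [0, 1] color range and remaps it to [-1, 1] when sampling (the helper name decode_direction is hypothetical):

```python
def decode_direction(rgb):
    """Remap a sampled texel from [0, 1] color space to a [-1, 1] direction
    vector, the usual encoding for direction data stored in a texture.
    Assumes one direction vector per RGB triplet; the exact channel layout
    of the patent's offset map is not specified."""
    return tuple(2.0 * c - 1.0 for c in rgb)

# A texel of (0.5, 0.5, 1.0) decodes to the "straight up" tangent-space normal.
n = decode_direction((0.5, 0.5, 1.0))
```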
Step S222, generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the auxiliary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter.
Specifically, the normal direction vector corresponding to each pixel point is adjusted according to the second highlight parameter, and then the tangential direction vector corresponding to each pixel point is generated according to the auxiliary normal direction vector corresponding to each pixel point and the adjusted normal direction vector.
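The exact adjustment formula is not given in the text. One plausible reading, sketched below under that assumption, follows the common hair-shading trick of shifting along the normal before normalizing; the helper name shifted_tangent and the use of the highlight-offset value as the shift amount are assumptions:

```python
import math

def normalize(v):
    """Return the unit-length version of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def shifted_tangent(normal, secondary_normal, highlight_shift):
    """Step S222 sketch (an assumption, not the patent's exact formula):
    adjust the per-pixel direction by the highlight-shift parameter, then
    combine the secondary normal with the adjusted normal to form the
    tangent direction vector used by the highlight model."""
    shifted = tuple(b + highlight_shift * n
                    for b, n in zip(secondary_normal, normal))
    return normalize(shifted)

# With zero shift the tangent is simply the normalized secondary normal.
t = shifted_tangent((0.0, 0.0, 1.0), (0.0, 1.0, 0.0), 0.0)
```

Varying highlight_shift per pixel (e.g. from a noise channel of the offset map) is what breaks the highlight into the jagged strand-following streaks described earlier.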
Step S223, inputting the tangential direction vector, the sight direction vector and the second highlight parameter corresponding to each pixel of the virtual object into the highlight model, so as to obtain an offset parameter corresponding to each pixel of the virtual object output by the highlight model.
Specifically, the tangential direction vector, the line of sight direction vector and the second highlight parameter corresponding to each pixel point are taken as input data, and the input data is input into a highlight model, and the highlight model outputs the offset parameter corresponding to each pixel point.
Step S224, performing offset correction on the first highlight parameter corresponding to each pixel of the virtual object by using the offset parameter corresponding to each pixel of the virtual object, so as to generate the first color increment information corresponding to each pixel of the virtual object.
Specifically, the product of the offset parameter corresponding to each pixel point and the first highlight parameter (highlight color information) is used as the first color increment information corresponding to each pixel point.
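Steps S223 to S224 can be sketched as follows. This is a hedged Python illustration of a view-direction Kajiya-Kay term, not the patent's exact shader code; using highlight_range as the gloss exponent and highlight_strength as a scale factor are assumptions about how the second highlight parameter enters the model:

```python
import math

def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def kajiya_kay_offset(tangent, view_dir, highlight_range, highlight_strength):
    """Step S223 sketch: the reconstructed Kajiya-Kay term, with the view
    direction vector replacing the classic half-angle vector so that no
    light information is needed. sin(T, V) peaks when the view is
    perpendicular to the strand; highlight_range acts as the gloss
    exponent and highlight_strength scales the result (assumptions)."""
    t_dot_v = max(-1.0, min(1.0, dot(tangent, view_dir)))
    sin_tv = math.sqrt(1.0 - t_dot_v * t_dot_v)
    return highlight_strength * sin_tv ** highlight_range

def color_increment(offset_param, highlight_color):
    """Step S224: the first color increment is the product of the offset
    parameter and the highlight color information, added on top of the
    base shading."""
    return tuple(offset_param * c for c in highlight_color)

# Viewing perpendicular to the strand gives the maximum offset parameter.
p = kajiya_kay_offset((0.0, 1.0, 0.0), (0.0, 0.0, -1.0), 16.0, 1.0)
delta = color_increment(p, (1.0, 1.0, 1.0))
```

Note that no light direction appears anywhere in the computation, which is precisely why the highlight survives in dark and shadowed regions.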
For step S230:
In this step, the virtual object is rendered based on the first color increment information corresponding to each pixel point of the virtual object, and the virtual object with the first highlight effect is obtained.
Specifically, the initial material shader is applied to the virtual object so that the virtual object can be rendered. In addition to the program that realizes the highlight effect, the initial material shader contains a program that realizes the original effect of the virtual object. For example, if the virtual object is the hair of character A, the initial material shader contains, besides the program for rendering the highlight effect on that hair, a program for rendering its original effect; the original color information corresponding to each pixel point of the hair is calculated according to that program, and the original effect of the hair, such as black hair, purple hair, or colored hair, can then be rendered from the original color information of each pixel point. The highlight effect can then be added on this basis using the color increment information, for example a white highlight effect on black hair, a light purple highlight effect on purple hair, and so on.
The method provided by the embodiment is mainly applied to the development stage of the virtual business project, so that the obtained virtual object with the first highlight effect can be displayed on the terminal equipment of the developer, and the developer can view the highlight effect rendered by the initial material shader.
The first highlight effect is a highlight effect rendered based on the initial material shader, and can be also understood to be a highlight effect rendered based on the first color increment information obtained by calculation of the initial highlight parameters in the initial material shader.
In general, this first rendering of the highlight effect does not yet satisfy the developer. Based on this, an alternative implementation provided in this embodiment further provides a step of changing the highlight parameters in the initial texture shader, so as to optimize the initial texture shader until a highlight effect that satisfies the developer is obtained.
In step S240, the initial texture shader is used as an optimized texture shader in response to the confirmation operation for the first highlight effect.
If the developer is satisfied with the first highlight effect rendered by the initial texture shader, a confirmation operation for the first highlight effect can be performed on the terminal device of the developer, for example, a "confirm" button is clicked in a highlight effect viewing interface, for example, a parameter setting interface is opened after viewing, a "save" button is clicked in the parameter setting interface, and the like. After confirmation, the initial texture shader may be used as an optimized texture shader in the application phase of the virtual business project.
Step S250, in response to the changing operation for the first highlight parameter and/or the second highlight parameter, changing the initial texture shader into a texture shader to be optimized; generating second color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight line direction vector corresponding to each pixel point of the virtual object based on the material shader to be optimized; and rendering the virtual object based on the second color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the second highlight effect.
In most cases, the first highlight effect rendered based on the initial texture shader does not meet the planning requirements. The developer then opens the parameter setting interface to adjust the first highlight parameter and/or the second highlight parameter, for example by changing the highlight color information, the highlight offset information, the highlight range information, or the highlight intensity information.
Fig. 5 is a schematic diagram of a parameter setting interface provided in the present embodiment.
As shown in fig. 5, the parameter setting interface 50 includes an adjustment frame 51 for the highlight color (Addition_HighlightColor) information, an adjustment frame 52 for the highlight offset (Addition_HighlightShift) information, an adjustment frame 53 for the highlight range (Addition_HighlightRange) information, and an adjustment frame 54 for the highlight intensity (Addition_HighlightStrength) information. By entering specific numerical values in the adjustment frames 51, 52, 53, and 54, changes to the highlight color information, the highlight offset information, the highlight range information, and the highlight intensity information can be applied.
After the first highlight parameter and/or the second highlight parameter in the initial texture shader are changed, the initial texture shader is changed, and in this embodiment, the texture shader in the parameter adjustment stage is defined as the texture shader to be optimized.
Based on the material shader to be optimized, the highlight effect rendered on the virtual object can be displayed on the terminal device of the developer according to the steps S220 to S230, and in this embodiment, the highlight effect rendered based on the material shader to be optimized is defined as the second highlight effect. The specific rendering process may refer to the detailed descriptions in the above steps S220 to S230, and will not be described herein.
In step S260, the texture shader to be optimized is used as an optimized texture shader in response to the confirmation operation for the second highlight effect.
If the developer is satisfied with the second highlight effect rendered by the texture shader to be optimized, the confirmation operation of the second highlight effect can be performed on the terminal equipment of the developer. After confirmation, the material shader to be optimized can be used as an optimized material shader in the application stage of the virtual business project.
Of course, if the second highlight effect rendered by the material shader to be optimized still cannot meet the planning requirement, the first highlight parameter and/or the second highlight parameter in the material shader to be optimized can be repeatedly adjusted based on the steps until the highlight effect satisfied by the developer is rendered.
The steps S210 to S260 provide a method for implementing a highlight effect, which can render a highlight effect on a virtual object belonging to an anisotropic characteristic material in a dark portion and a shadow region of a virtual scene.
In a specific application scenario, the virtual object is the hair of a virtual character. The microstructure of hair is composed of many elongated and regularly arranged fibers, which exhibit different optical properties in different directions, i.e. have anisotropic characteristics. The highlight effect of hair is usually elliptical along the direction of the fibers, and bright stripes running along the hair line can be observed as the viewing angle changes. If hair in a dark portion or shadow region has no highlight effect, its detailed texture is not displayed, so the hair looks flat and the visual effect is poor. With the method provided by this embodiment, the highlight effect can be displayed whether the hair is in a dark portion or in a shadow region, improving the overall quality of the virtual character.
Fig. 6 is a schematic view of the highlight effect provided by the present embodiment.
As shown in fig. 6, the virtual character hair 61 in the virtual scene is in a dark portion and cannot be reached by light, yet the anisotropic highlight effect is achieved by the method provided in this embodiment, such as the highlight effect displayed in hair region 611 and the highlight effect displayed in hair region 613. If the highlight effect were not rendered on the virtual character hair 61 in the dark portion, the entire hair would show the rendering effect of hair region 612: no detailed texture, a flat look, and a poor visual effect.
The embodiment provides a method for realizing a highlight effect, which is based on an initial material shader, generates first color increment information corresponding to each pixel point of a virtual object according to an anisotropic highlight offset map and a sight line direction vector corresponding to each pixel point of the virtual object, and realizes highlight rendering of the virtual object based on the first color increment information corresponding to each pixel point. In the method, the highlight model generates the offset parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, and then offset correction is performed on the first highlight parameter based on the offset parameter to generate the first color increment information, so that the method does not relate to any illumination information in the color increment information generation process, namely, the anisotropic highlight effect can be rendered for the virtual object in the dark part or the shadow area which is not irradiated by light.
It should be noted that, the examples in the present embodiment are only for explaining the method of the present application, and are not limited to practical use, and the implementation method of the highlight effect provided by the present application includes, but is not limited to, the method of the present embodiment.
Another embodiment of the present application provides a method for implementing a highlight effect.
In a specific implementation, the method for realizing the highlight effect provided by this embodiment of the application is applied in the application stage of a virtual business project; that is, after the developer has realized the highlight effect of a virtual object based on the method provided by the previous embodiment, the highlight effect of the virtual object is rendered in real time in the application stage based on the optimized material shader.
Fig. 7 is a flowchart of a method for realizing the highlight effect provided in the present embodiment. The method for realizing the highlight effect provided in this embodiment is described in detail below with reference to fig. 7. The examples referred to in the following description are for explaining the technical solution of the present application and are not intended to be limiting in practical use.
As shown in fig. 7, the implementation method of the highlight effect provided in the present embodiment includes the following steps S710 to S730:
In step S710, in response to a rendering instruction for a virtual object, a direction from a virtual camera to each pixel of the virtual object at a current rendering time is taken as a line-of-sight direction, and a line-of-sight direction vector corresponding to each pixel of the virtual object is determined.
The rendering instruction for the virtual object may be understood as request information for performing highlight rendering on the virtual object, generated by the user terminal 101 in response to a user operation as shown in fig. 1. For example: the user logs in to the game and controls the movement of a virtual character so that the virtual character's hair faces the user and lies in a shadow area. After generating the rendering instruction, the user terminal 101 sends it to the server terminal 102, and the server terminal 102 determines the line-of-sight direction vector corresponding to each pixel point according to the direction from the virtual camera to each pixel point of the virtual object at the current rendering time. For the method of calculating the line-of-sight direction vector, refer to the detailed description of step S220 in the above embodiment, which is not repeated here.
It should be noted that, the implementation method of the highlight effect provided in this embodiment performs the rendering of the highlight effect on the virtual object in real time in the project operation process, specifically, renders the highlight effect in real time according to the real-time change of the line of sight direction between the observer (virtual camera) and the virtual object.
The virtual object may be an individual in the virtual scene or may be a part of an individual in the virtual scene, such as the hair of a virtual character.
In an optional implementation manner provided in this embodiment, the virtual object is a hair of a virtual character. The microstructure of hair is composed of a number of elongated and regularly arranged fibres which exhibit different optical properties in different directions, i.e. have anisotropic characteristics. The highlight effect of hair is usually elliptical along the direction of the fibers, with bright stripes running along the hair line being observed with varying viewing angles. If the hair in the dark or shadow area has no highlight effect, the detail texture of the hair is not displayed, so that the hair looks flat and has poor effect. By adopting the method provided by the embodiment, the highlight effect can be rendered no matter whether the hair is in the illumination area or in the dark part or the shadow area, and the overall quality of the virtual character is improved.
Step S720, based on the texture shader, generating color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel of the virtual object, where the anisotropic highlight offset map is used to characterize the texture direction of the virtual object.
In an alternative implementation, the texture shader is an optimized texture shader determined by the implementation method of the highlight effect provided in the previous embodiment, and the highlight effect meeting the planning requirement can be rendered based on the texture shader.
In one embodiment, the texture shader includes a highlight model and preset first and second highlight parameters. The first highlight parameter includes highlight color information, and the second highlight parameter includes highlight offset information, highlight range information, and highlight intensity information. The first highlight parameter and the second highlight parameter are the parameters optimized in the previous embodiment, i.e. the parameters included in the optimized texture shader.
The highlight model is used for calculating an offset parameter aiming at the first highlight parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, so that offset correction is carried out on the first highlight parameter based on the offset parameter, and color increment information is generated.
In a specific implementation, the highlight model is a reconstructed Kajiya-Kay model. The traditional Kajiya-Kay model uses a half-angle vector as one item of its input data to calculate the offset parameter, and the half-angle vector is the normalized sum of the light direction vector and the sight direction vector; therefore, the traditional Kajiya-Kay model cannot render the highlight effect on virtual objects in dark portions and shadow regions that cannot be illuminated. Based on this, in the method provided in this embodiment, the traditional Kajiya-Kay model is reconstructed, and its input data is changed from the half-angle vector to the line-of-sight direction vector, thereby eliminating the dependence of highlight effect rendering on illumination.
The offset parameter can be understood as a correction parameter for correcting the highlight color information, i.e. an anisotropy value output by the Kajiya-Kay model for different sight directions; the highlight color information corrected by the offset parameter exhibits an anisotropic visual effect.
The color increment is the anisotropic color information generated by correcting the highlight color information (the first highlight parameter) with the offset parameter. Note that the highlight effect is actually an effect added on the basis of the original rendering effect of the virtual object; therefore, the color information used for rendering the highlight effect may be defined as color increment information, that is, a color added on top of the original rendering effect.
In a specific implementation manner, based on the texture shader, generating the color increment information corresponding to each pixel point of the virtual object may include the following steps S721 to S724:
step S721, sampling the anisotropic highlight offset map, and obtaining a normal direction vector and a sub-normal direction vector corresponding to each pixel of the virtual object.
The anisotropic highlight offset map records the texture direction of the virtual object, which is usually represented in tangent space. The tangent space is a local coordinate system composed of three mutually perpendicular basis vectors: the tangent is the direction vector along the texture U-axis, the secondary normal (binormal) is the direction vector along the texture V-axis, and the normal is the direction vector perpendicular to both the tangent and the secondary normal. By converting the direction information of the surface into the tangent space based on the local texture coordinates of the material surface, the highlight calculation is not affected by global coordinate transformations of the object to be processed, so that the highlight effect of the anisotropic material in different sight directions can be accurately simulated.
The normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object are obtained by sampling the anisotropic highlight offset map. Specifically, the texture coordinates of each vertex of the virtual object can be projected onto the anisotropic highlight offset map, and the map is sampled at the texture coordinate corresponding to the position of each pixel point of the virtual object; this yields the degree of anisotropic offset of the highlight at each pixel point, from which the normal direction vector and the secondary normal direction vector corresponding to each pixel point are obtained.
Step S722, generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the auxiliary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter.
Specifically, the normal direction vector corresponding to each pixel point is adjusted according to the second highlight parameter, and then the tangential direction vector corresponding to each pixel point is generated according to the auxiliary normal direction vector corresponding to each pixel point and the adjusted normal direction vector.
Step S723, inputting the tangential direction vector, the line of sight direction vector and the second highlight parameter corresponding to each pixel of the virtual object into the highlight model, to obtain an offset parameter corresponding to each pixel of the virtual object output by the highlight model.
Specifically, the tangential direction vector, the line of sight direction vector and the second highlight parameter corresponding to each pixel point are taken as input data, and the input data is input into a highlight model, and the highlight model outputs the offset parameter corresponding to each pixel point.
Step S724, performing offset correction on the first highlight parameter corresponding to each pixel of the virtual object by using the offset parameter corresponding to each pixel of the virtual object, so as to generate the color increment information corresponding to each pixel of the virtual object.
Specifically, the product of the offset parameter corresponding to each pixel point and the first highlight parameter (highlight color information) is used as the color increment information corresponding to each pixel point.
Step S730, performing color rendering on the virtual object according to the color increment information corresponding to each pixel point of the virtual object, so as to generate a highlight effect of the virtual object at the current rendering time.
Specifically, the material shader is assigned to the virtual object so that the virtual object can be rendered. In addition to the program implementing the highlight effect, the material shader contains a program implementing the virtual object's original effect. Performing color rendering on the virtual object according to the color increment information corresponding to each pixel point therefore adds the highlight effect on top of the original rendering effect without affecting the original rendering result.
The rendered highlight effect is the highlight effect of the virtual object at the current rendering time. If the line-of-sight direction vector corresponding to each pixel point is unchanged at the next rendering time, the color increment information does not need to be recalculated; if it has changed, the color increment information is recalculated from the current line-of-sight direction vector.
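A minimal sketch of this recalculation rule: cache the last line-of-sight direction vector and recompute the colour increment only when it changes between rendering times. The class and its interface are hypothetical.

```python
class HighlightCache:
    # Hypothetical cache: the colour increment is recomputed only when
    # the line-of-sight direction vector changes between rendering times.
    def __init__(self, compute_fn):
        self._compute = compute_fn      # view vector -> colour increment
        self._last_view = None
        self._last_result = None

    def increment_for(self, view_dir):
        if view_dir != self._last_view:
            self._last_view = view_dir
            self._last_result = self._compute(view_dir)
        return self._last_result
```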
The present embodiment provides an optional method for implementing a highlight effect. It should be noted that the examples in this embodiment serve only to illustrate the method described in the present application and do not limit its practical use; the method for implementing a highlight effect provided in the present application includes, but is not limited to, the method described in this embodiment.
An embodiment of the application provides a device for realizing a highlight effect. Fig. 8 is a schematic structural diagram of a device for realizing a highlight effect according to the present embodiment.
As shown in fig. 8, the implementation apparatus for a highlight effect provided in this embodiment includes: an acquisition unit 801, a generation unit 802, and a rendering unit 803.
The obtaining unit 801 is configured to obtain, in response to a selection operation for a virtual object, an anisotropic highlight offset map corresponding to the virtual object, where the anisotropic highlight offset map is used to characterize a texture direction of the virtual object.
Optionally, the responding to the selection operation for the virtual object obtains an anisotropic highlight offset map corresponding to the virtual object, including:
In response to the selection operation for the virtual object, acquiring an initial anisotropic highlight offset map corresponding to the virtual object, wherein the initial anisotropic highlight offset map is used to characterize the initial texture direction of the material class to which the virtual object belongs;
And adjusting the initial anisotropic highlight offset map according to the texture direction preset for the virtual object to form the anisotropic highlight offset map.
The generating unit 802 is configured to generate, based on an initial texture shader, first color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map and a line-of-sight direction vector corresponding to each pixel of the virtual object; the initial texture shader comprises a highlight model, a first highlight parameter and a second highlight parameter, wherein the highlight model is used for calculating an offset parameter aiming at the first highlight parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, the offset parameter is used for carrying out offset correction on the first highlight parameter so as to generate the first color increment information, and the sight line direction vector is determined according to the direction from a virtual camera to the pixel point.
Optionally, the generating, based on the initial texture shader, the first color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel of the virtual object includes:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameters corresponding to each pixel point of the virtual object by using the offset parameters corresponding to each pixel point of the virtual object, and generating the first color increment information corresponding to each pixel point of the virtual object.
Optionally, the first highlight parameter includes highlight color information;
The second highlight parameter comprises at least one of the following information: highlight shift information, highlight range information, and highlight intensity information.
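For illustration, the two parameter groups could be bundled as follows; the field names, defaults, and per-field semantics are assumptions based on the enumeration above, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class HighlightParams:
    # First highlight parameter: the highlight colour (assumed RGB in 0-1).
    color: tuple = (1.0, 1.0, 1.0)
    # Second highlight parameter, per the enumeration above; the exact
    # semantics of each field are assumptions.
    shift: float = 0.0      # highlight shift: moves the band along strands
    range: float = 32.0     # highlight range: narrows or widens the band
    intensity: float = 1.0  # highlight intensity: scales the brightness
```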
The rendering unit 803 is configured to render the virtual object based on the first color increment information corresponding to each pixel point of the virtual object, to obtain a virtual object with a first highlight effect.
Optionally, the apparatus further includes a material shader optimizing unit:
the material shader optimizing unit is configured to, in response to a confirmation operation for the first highlight effect, take the initial material shader as the optimized material shader.
Optionally, the apparatus is further configured to:
In response to a change operation for the first highlight parameter and/or the second highlight parameter, changing the initial texture shader to a texture shader to be optimized;
generating second color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight line direction vector corresponding to each pixel point of the virtual object based on the material shader to be optimized;
and rendering the virtual object based on the second color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the second highlight effect.
Optionally, the apparatus is further configured to:
in response to a confirmation operation for the second highlight effect, take the material shader to be optimized as the optimized material shader.
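The confirm-or-adjust loop described by the optimizing unit can be sketched as follows; `render`, `confirm`, and `change_params` are hypothetical callbacks standing in for the rendering step and the user's confirmation or change operations.

```python
def tune_shader(params, render, confirm, change_params):
    # Hypothetical sketch of the confirm-or-adjust loop: render a preview
    # with the current parameters; on confirmation the current shader
    # state becomes the optimized material shader, otherwise apply the
    # requested parameter change and preview again.
    while True:
        preview = render(params)
        if confirm(preview):
            return params               # accepted: the optimized shader
        params = change_params(params)
```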
Optionally, the virtual object is a hair of a virtual character.
Another embodiment of the present application provides a device for implementing a highlight effect. Fig. 9 is a schematic structural diagram of a device for realizing a highlight effect according to the present embodiment.
As shown in fig. 9, the implementation apparatus for a highlight effect provided in this embodiment includes: a line-of-sight direction determination unit 901, a color increment generation unit 902, and a highlight effect rendering unit 903.
The line-of-sight direction determining unit 901 is configured to determine, in response to a rendering instruction for a virtual object, a line-of-sight direction vector corresponding to each pixel point of the virtual object with a direction from a virtual camera to each pixel point of the virtual object at a current rendering time as a line-of-sight direction.
The color increment generation unit 902 is configured to generate color increment information corresponding to each pixel of the virtual object according to an anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel of the virtual object, where the anisotropic highlight offset map is used to characterize a texture direction of the virtual object.
Optionally, the texture shader includes a highlight model, and a first highlight parameter and a second highlight parameter that are preset;
The generating, based on a texture shader, color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel of the virtual object includes:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameter corresponding to each pixel point of the virtual object by using the offset parameter corresponding to each pixel point of the virtual object, and generating the color increment information corresponding to each pixel point of the virtual object.
The highlight effect rendering unit 903 is configured to perform color rendering on the virtual object according to the color increment information corresponding to each pixel point of the virtual object, so as to generate a highlight effect of the virtual object at the current rendering time.
Optionally, the virtual object is a hair of a virtual character.
An embodiment of the present application provides an electronic device. Fig. 10 is a schematic structural diagram of the electronic device provided in this embodiment.
As shown in fig. 10, the electronic device provided in this embodiment includes a memory 1001 and a processor 1002;
the memory 1001 is configured to store computer instructions for executing a method for implementing a highlight effect and/or a method for rendering a highlight effect;
the processor 1002 is configured to execute computer instructions stored in the memory 1001 to perform the following operations:
Responding to a selection operation for a virtual object, and acquiring an anisotropic highlight offset map corresponding to the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
Generating first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight direction vector corresponding to each pixel point of the virtual object based on an initial material shader; the initial material shader comprises a highlight model, a first highlight parameter and a second highlight parameter, wherein the highlight model is used for calculating an offset parameter aiming at the first highlight parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, and the offset parameter is used for carrying out offset correction on the first highlight parameter so as to generate the first color increment information, and the sight line direction vector is determined according to the direction from a virtual camera to the pixel point;
And rendering the virtual object based on the first color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the first highlight effect.
Optionally, the following operations are further performed:
in response to a confirmation operation for the first highlight effect, taking the initial material shader as the optimized material shader.
Optionally, the following operations are further performed:
In response to a change operation for the first highlight parameter and/or the second highlight parameter, changing the initial texture shader to a texture shader to be optimized;
generating second color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight line direction vector corresponding to each pixel point of the virtual object based on the material shader to be optimized;
and rendering the virtual object based on the second color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the second highlight effect.
Optionally, the following operations are further performed:
in response to a confirmation operation for the second highlight effect, taking the material shader to be optimized as the optimized material shader.
Optionally, the responding to the selection operation for the virtual object obtains an anisotropic highlight offset map corresponding to the virtual object, including:
In response to the selection operation for the virtual object, acquiring an initial anisotropic highlight offset map corresponding to the virtual object, wherein the initial anisotropic highlight offset map is used to characterize the initial texture direction of the material class to which the virtual object belongs;
And adjusting the initial anisotropic highlight offset map according to the texture direction preset for the virtual object to form the anisotropic highlight offset map.
Optionally, the generating, based on the initial texture shader, the first color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map and the line-of-sight direction vector corresponding to each pixel of the virtual object includes:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameters corresponding to each pixel point of the virtual object by using the offset parameters corresponding to each pixel point of the virtual object, and generating the first color increment information corresponding to each pixel point of the virtual object.
Optionally, the first highlight parameter includes highlight color information;
The second highlight parameter comprises at least one of the following information: highlight shift information, highlight range information, and highlight intensity information.
Optionally, the virtual object is a hair of a virtual character.
Or, the following operations are performed:
Responding to a rendering instruction aiming at a virtual object, taking the direction from a virtual camera to each pixel point of the virtual object at the current rendering time as a sight direction, and determining a sight direction vector corresponding to each pixel point of the virtual object;
Based on a material shader, generating color increment information corresponding to each pixel point of the virtual object according to an anisotropic highlight offset map corresponding to the virtual object and the sight line direction vector corresponding to each pixel point of the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
And performing color rendering on the virtual object according to the color increment information corresponding to each pixel point of the virtual object, and generating a highlight effect of the virtual object at the current rendering moment.
Optionally, the texture shader includes a highlight model, and a first highlight parameter and a second highlight parameter that are preset;
The generating, based on a texture shader, color increment information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel of the virtual object includes:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameter corresponding to each pixel point of the virtual object by using the offset parameter corresponding to each pixel point of the virtual object, and generating the color increment information corresponding to each pixel point of the virtual object.
Optionally, the virtual object is a hair of a virtual character.
Another embodiment of the application provides a computer-readable storage medium comprising computer instructions which, when executed by a processor, implement the methods described in the embodiments of the application.
It is noted that relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Furthermore, the terms "comprise," "have," "include," and similar terms are intended to be inclusive and open-ended: listing one or more items after any of these terms does not mean the list is exhaustive or limited to only the items listed.
As used herein, unless expressly stated otherwise, the term "or" covers all possible combinations except where a combination is impossible. For example, if it is stated that a database may include A or B, then, unless specifically stated otherwise or impossible, the database may include A, or B, or A and B. As a second example, if it is stated that a database may include A, B, or C, then, unless specifically stated otherwise or impossible, the database may include A, or B, or C, or A and B, or A and C, or B and C, or A and B and C.
It is noted that the above-described embodiments may be implemented by hardware or software (program code), or a combination of hardware and software. If implemented by software, it may be stored in the computer-readable medium described above. The software, when executed by a processor, may perform the methods disclosed above. The computing units and other functional units described in this disclosure may be implemented by hardware or software, or a combination of hardware and software. Those of ordinary skill in the art will also appreciate that the above-described modules/units may be combined into one module/unit, and each of the above-described modules/units may be further divided into a plurality of sub-modules/sub-units.
In the foregoing detailed description, embodiments have been described with reference to numerous specific details that may vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made, and other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. The specification and examples are for illustrative purposes only, with the true scope and spirit of the application being indicated by the following claims. The order of steps shown in the figures is likewise for illustrative purposes only and is not meant to limit the method to any particular order of steps; those skilled in the art will recognize that the steps of the same method may be performed in a different order.
In the drawings and detailed description of the application, exemplary embodiments are disclosed. Many variations and modifications may be made to these embodiments. Accordingly, although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation.
Claims (15)
1. A method for implementing a highlight effect, the method comprising:
Responding to a selection operation for a virtual object, and acquiring an anisotropic highlight offset map corresponding to the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
Generating first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight direction vector corresponding to each pixel point of the virtual object based on an initial material shader; the initial material shader comprises a highlight model, a first highlight parameter and a second highlight parameter, wherein the highlight model is used for calculating an offset parameter aiming at the first highlight parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, and the offset parameter is used for carrying out offset correction on the first highlight parameter so as to generate the first color increment information, and the sight line direction vector is determined according to the direction from a virtual camera to the pixel point;
And rendering the virtual object based on the first color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the first highlight effect.
2. The method according to claim 1, wherein the method further comprises:
In response to a confirmation operation for the first highlight effect, taking the initial material shader as the optimized material shader.
3. The method according to claim 1, wherein the method further comprises:
In response to a change operation for the first highlight parameter and/or the second highlight parameter, changing the initial texture shader to a texture shader to be optimized;
generating second color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight line direction vector corresponding to each pixel point of the virtual object based on the material shader to be optimized;
and rendering the virtual object based on the second color increment information corresponding to each pixel point of the virtual object to obtain the virtual object with the second highlight effect.
4. A method according to claim 3, characterized in that the method further comprises:
In response to the confirmation operation for the second highlight effect, taking the material shader to be optimized as the optimized material shader.
5. The method of claim 1, wherein the obtaining, in response to a selection operation for a virtual object, an anisotropic highlight offset map corresponding to the virtual object comprises:
In response to the selection operation for the virtual object, acquiring an initial anisotropic highlight offset map corresponding to the virtual object, wherein the initial anisotropic highlight offset map is used to characterize the initial texture direction of the material class to which the virtual object belongs;
And adjusting the initial anisotropic highlight offset map according to the texture direction preset for the virtual object to form the anisotropic highlight offset map.
6. The method of claim 1, wherein the generating, based on the initial texture shader, the first color delta information corresponding to each pixel of the virtual object according to the anisotropic highlight offset map and the gaze direction vector corresponding to each pixel of the virtual object, comprises:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameters corresponding to each pixel point of the virtual object by using the offset parameters corresponding to each pixel point of the virtual object, and generating the first color increment information corresponding to each pixel point of the virtual object.
7. The method of claim 1, wherein:
The first highlight parameter comprises highlight color information;
The second highlight parameter comprises at least one of the following information: highlight shift information, highlight range information, and highlight intensity information.
8. The method of any one of claims 1-7, wherein the virtual object is a virtual character's hair.
9. A method for implementing a highlight effect, the method comprising:
Responding to a rendering instruction aiming at a virtual object, taking the direction from a virtual camera to each pixel point of the virtual object at the current rendering time as a sight direction, and determining a sight direction vector corresponding to each pixel point of the virtual object;
Based on a material shader, generating color increment information corresponding to each pixel point of the virtual object according to an anisotropic highlight offset map corresponding to the virtual object and the sight line direction vector corresponding to each pixel point of the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
And performing color rendering on the virtual object according to the color increment information corresponding to each pixel point of the virtual object, and generating a highlight effect of the virtual object at the current rendering moment.
10. The method of claim 9, wherein the material shader comprises a highlight model, and a preset first highlight parameter and a preset second highlight parameter;
the generating, based on the material shader, the color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map corresponding to the virtual object and the line-of-sight direction vector corresponding to each pixel point of the virtual object includes:
Sampling the anisotropic highlight offset map to obtain a normal direction vector and a secondary normal direction vector corresponding to each pixel point of the virtual object;
generating a tangential direction vector corresponding to each pixel point of the virtual object according to the normal direction vector and the secondary normal direction vector corresponding to each pixel point of the virtual object and the second highlight parameter;
Inputting the tangential direction vector, the sight line direction vector and the second highlight parameters corresponding to each pixel point of the virtual object into the highlight model to obtain offset parameters corresponding to each pixel point of the virtual object output by the highlight model;
and carrying out offset correction on the first highlight parameter corresponding to each pixel point of the virtual object by using the offset parameter corresponding to each pixel point of the virtual object, and generating the color increment information corresponding to each pixel point of the virtual object.
11. The method according to claim 9 or 10, wherein the virtual object is a virtual character's hair.
12. A device for achieving a highlight effect, the device comprising: an acquisition unit, a generation unit, and a rendering unit;
The obtaining unit is used for responding to the selection operation of the virtual object, obtaining an anisotropic highlight offset map corresponding to the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
The generating unit is used for generating first color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map and the sight direction vector corresponding to each pixel point of the virtual object based on an initial material shader; the initial material shader comprises a highlight model, a first highlight parameter and a second highlight parameter, wherein the highlight model is used for calculating an offset parameter aiming at the first highlight parameter according to the sight line direction vector, the second highlight parameter and the anisotropic highlight offset map, and the offset parameter is used for carrying out offset correction on the first highlight parameter so as to generate the first color increment information, and the sight line direction vector is determined according to the direction from a virtual camera to the pixel point;
The rendering unit is configured to render the virtual object based on the first color increment information corresponding to each pixel point of the virtual object, so as to obtain a virtual object with a first highlight effect.
13. A device for achieving a highlight effect, the device comprising: a viewing direction determining unit, a color increment generating unit, and a highlight effect rendering unit;
The visual line direction determining unit is used for responding to a rendering instruction aiming at the virtual object, taking the direction from the virtual camera to each pixel point of the virtual object at the current rendering moment as a visual line direction, and determining a visual line direction vector corresponding to each pixel point of the virtual object;
The color increment generation unit is used for generating, based on a material shader, color increment information corresponding to each pixel point of the virtual object according to the anisotropic highlight offset map corresponding to the virtual object and the sight line direction vector corresponding to each pixel point of the virtual object, wherein the anisotropic highlight offset map is used for representing the texture direction of the virtual object;
the highlight effect rendering unit is used for performing color rendering on the virtual object according to the color increment information corresponding to each pixel point of the virtual object, and generating the highlight effect of the virtual object at the current rendering time.
14. An electronic device, comprising: a memory, a processor;
The memory is configured to store one or more computer instructions;
the processor is configured to execute the one or more computer instructions to implement the method of any of claims 1-11.
15. A computer-readable storage medium having stored thereon one or more computer instructions which, when executed by a processor, implement the method of any of claims 1-11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410382877.5A CN118334232A (en) | 2024-03-29 | 2024-03-29 | Method and device for realizing highlight effect and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118334232A true CN118334232A (en) | 2024-07-12 |
Family
ID=91776629
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410382877.5A Pending CN118334232A (en) | 2024-03-29 | 2024-03-29 | Method and device for realizing highlight effect and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118334232A (en) |
- 2024-03-29: CN patent application CN202410382877.5A filed; status: active, Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109685869B (en) | Virtual model rendering method and device, storage medium and electronic equipment | |
US6876362B1 (en) | Omnidirectional shadow texture mapping | |
Deussen et al. | Computer-generated pen-and-ink illustration of trees | |
Sheffer et al. | Seamster: inconspicuous low-distortion texture seam layout | |
d'Eon et al. | Efficient rendering of human skin | |
US7583264B2 (en) | Apparatus and program for image generation | |
US6593924B1 (en) | Rendering a non-photorealistic image | |
Lu et al. | Illustrative interactive stipple rendering | |
JP4864972B2 (en) | 2D editing metaphor for 3D graphics (METAPHOR) | |
CN113822981B (en) | Image rendering method and device, electronic equipment and storage medium | |
US9905045B1 (en) | Statistical hair scattering model | |
US7064753B2 (en) | Image generating method, storage medium, image generating apparatus, data signal and program | |
CN117745915B (en) | Model rendering method, device, equipment and storage medium | |
US7133052B1 (en) | Morph map based simulated real-time rendering | |
CN116363288A (en) | Rendering method and device of target object, storage medium and computer equipment | |
CN117437345B (en) | Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine | |
Huang et al. | Image-space texture-based output-coherent surface flow visualization | |
Levene | A framework for non-realistic projections | |
US9665955B1 (en) | Pose-space shape fitting | |
CN118334232A (en) | Method and device for realizing highlight effect and electronic equipment | |
CN116137051A (en) | Water surface rendering method, device, equipment and storage medium | |
CN117671110B (en) | Real-time rendering system and method based on artificial intelligence | |
CN117058301B (en) | Knitted fabric real-time rendering method based on delayed coloring | |
Zeng et al. | 3D plants reconstruction based on point cloud | |
Zhang | Interactive Texture Mapping Based on Differentiable Rendering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||