CN115082614A - Highlight generation method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN115082614A
Authority
CN
China
Prior art keywords
highlight
target model
information
disturbance
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210662721.3A
Other languages
Chinese (zh)
Inventor
冷晨
Current Assignee
Beijing Datianmian White Sugar Technology Co ltd
Original Assignee
Beijing Datianmian White Sugar Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Datianmian White Sugar Technology Co ltd
Priority to CN202210662721.3A
Publication of CN115082614A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a highlight generation method, apparatus, computer device, and storage medium. The method includes: acquiring a target model to which a highlight effect is to be added, and determining highlight intensity information of the target model, the highlight intensity information representing the highlight intensity corresponding to each vertex of the target model; acquiring a tangent perturbation image corresponding to the target model, and determining, based on the tangent perturbation image, highlight perturbation information for controlling highlight texture, the tangent perturbation image representing texture information of a real object corresponding to the target model; and generating a target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information.

Description

Highlight generation method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a highlight generation method and apparatus, a computer device, and a storage medium.
Background
When building a three-dimensional model, highlights generally need to be added to the hair of a character model to make the model's display effect more realistic.
In the related art, producing a realistic hair highlight requires modeling each hair strand separately, so a corresponding highlight must be generated for every strand, which imposes a heavy performance burden. Alternatively, to reduce that burden, the hair can be divided into a number of patches and a highlight generated per patch, but the resulting effect is not very realistic. How to generate a realistic highlight effect while keeping the performance cost low is therefore an urgent problem to be solved.
Disclosure of Invention
The embodiment of the disclosure at least provides a highlight generation method and device, computer equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a highlight generation method, including:
acquiring a target model to which a highlight effect is to be added, and determining highlight intensity information of the target model, the highlight intensity information representing the highlight intensity corresponding to each vertex of the target model;
acquiring a tangent perturbation image corresponding to the target model, and determining, based on the tangent perturbation image, highlight perturbation information for controlling highlight texture, the tangent perturbation image representing texture information of a real object corresponding to the target model;
and generating a target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information.
In this method, the highlight intensity information of the target model is determined first, which fixes the range over which highlights are generated on the target model. Highlight perturbation information for controlling highlight texture is then determined from the tangent perturbation image; this information drives the target model to reproduce the highlight texture of real hair while occupying few computing resources. Finally, a highlight effect is generated from the highlight intensity information and the highlight perturbation information together, making the generated highlight effect more vivid.
In a possible embodiment, the determining highlight intensity information of the target model includes:
obtaining a light vector of the scene where the target model is located and a view vector of the current viewing angle;
generating, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction;
determining a bitangent (secondary tangent) vector of each vertex of the target model based on the tangent vector and the normal vector of that vertex;
and determining the highlight intensity information of the target model based on the highlight direction vector and the bitangent vectors of the respective vertices of the target model.
In a possible embodiment, the determining highlight perturbation information for controlling highlight texture based on the tangent perturbation image includes:
obtaining a light vector of the scene where the target model is located and a view vector of the current viewing angle, and generating, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction; and
determining a perturbed normal vector of the target model based on the tangent perturbation image and the normal vectors of the vertices of the target model;
and determining the highlight perturbation information of the target model based on the highlight direction vector and the perturbed normal vector.
With this method, highlight perturbation information for controlling highlight texture is generated from real texture information, so the target highlight effect generated from it carries a real texture.
In a possible embodiment, the determining highlight intensity information of the target model includes:
determining highlight intensity information of a hair region of the target model;
and the acquiring the tangent perturbation image corresponding to the target model includes:
acquiring an image of real hair;
and decoloring (converting to grayscale) the hair image to obtain the tangent perturbation image.
With this method, real texture information is obtained, so the target highlight effect subsequently generated from the tangent perturbation image carries a real texture.
In a possible embodiment, the generating the target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information includes:
determining a highlight color range of the target model at the current viewing angle, the highlight color range representing the color of each vertex of the target model as affected by the highlight at the current viewing angle;
and generating the target highlight effect of the target model based on the highlight intensity information, the highlight perturbation information, and the highlight color range.
With this method, the hue of the generated target highlight effect can be adjusted through the highlight color range, so the effect blends better with the colors of the target model and the lighting of its scene, making it more vivid.
In a possible embodiment, the determining the highlight color range of the target model at the current viewing angle includes:
acquiring a view vector of the current viewing angle;
determining a viewing-direction light range based on the view vector and the normal vectors of the vertices of the target model, the viewing-direction light range representing the highlight intensity corresponding to each vertex of the target model at the current viewing angle;
and determining the highlight color range based on the viewing-direction light range and a preset model color.
With this method, the viewing-direction light range gives the generated highlight a polished sheen, and blending in the model color makes that sheen match the base color of the target model, so the display effect is more natural.
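A minimal pure-Python sketch of this computation; clamping the dot product of the view vector and the vertex normal to [0, 1], and tinting it by channel-wise multiplication with the preset model color, are assumptions, since the text does not give exact formulas, and the function name is illustrative.

```python
def highlight_color_range(view_vec, normal_vec, model_color):
    # Viewing-direction light range at one vertex: dot product of the
    # view vector and the vertex normal, clamped to [0, 1].
    d = sum(a * b for a, b in zip(view_vec, normal_vec))
    facing = max(0.0, min(1.0, d))
    # Tint by the preset model color, channel by channel.
    return tuple(facing * c for c in model_color)
```

A vertex whose normal faces the viewer keeps the full model color; a vertex facing away contributes no highlight color.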
In a possible embodiment, the generating the target highlight effect of the target model based on the highlight intensity information, the highlight perturbation information, and the highlight color range includes:
multiplying the highlight intensity information by the highlight perturbation information to obtain first highlight information, the first highlight information representing the highlight under the texture information;
and superposing the first highlight information and the highlight color range to generate the target highlight effect of the target model.
Here, the first highlight information carries both texture information and highlight brightness, while the highlight color range carries color information. Controlling the target highlight effect jointly through the first highlight information and the highlight color range combines these factors, so the generated effect has both a real texture and a natural lighting effect.
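The multiply-then-superpose combination can be sketched as follows; treating "superposing" as a per-channel addition of the first highlight information to the highlight color range is an assumption, and the names are illustrative.

```python
def target_highlight(intensity, perturbation, color_range):
    # First highlight information: per-vertex highlight intensity
    # modulated by the texture perturbation.
    first = intensity * perturbation
    # Superpose it on the view-tinted color range, channel by channel.
    return tuple(first + c for c in color_range)
```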
In a second aspect, embodiments of the present disclosure further provide a highlight generating device, including:
a first determining module, configured to acquire a target model to which a highlight effect is to be added and to determine highlight intensity information of the target model, the highlight intensity information representing the highlight intensity corresponding to each vertex of the target model;
a second determining module, configured to acquire a tangent perturbation image corresponding to the target model and to determine, based on the tangent perturbation image, highlight perturbation information for controlling highlight texture, the tangent perturbation image representing texture information of a real object corresponding to the target model;
and a generating module, configured to generate a target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information.
In a possible embodiment, the first determining module, when determining the highlight intensity information of the target model, is configured to:
obtain a light vector of the scene where the target model is located and a view vector of the current viewing angle;
generate, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction;
determine a bitangent (secondary tangent) vector of each vertex of the target model based on the tangent vector and the normal vector of that vertex;
and determine the highlight intensity information of the target model based on the highlight direction vector and the bitangent vectors of the respective vertices of the target model.
In a possible embodiment, the second determining module, when determining highlight perturbation information for controlling highlight texture based on the tangent perturbation image, is configured to:
obtain a light vector of the scene where the target model is located and a view vector of the current viewing angle, and generate, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction; and
determine a perturbed normal vector of the target model based on the tangent perturbation image and the normal vectors of the vertices of the target model;
and determine the highlight perturbation information of the target model based on the highlight direction vector and the perturbed normal vector.
In a possible embodiment, the first determining module, when determining the highlight intensity information of the target model, is configured to:
determine highlight intensity information of a hair region of the target model;
and the second determining module, when acquiring the tangent perturbation image corresponding to the target model, is configured to:
acquire an image of real hair;
and decolor (convert to grayscale) the hair image to obtain the tangent perturbation image.
In a possible embodiment, the generating module, when generating the target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information, is configured to:
determine a highlight color range of the target model at the current viewing angle, the highlight color range representing the color of each vertex of the target model as affected by the highlight at the current viewing angle;
and generate the target highlight effect of the target model based on the highlight intensity information, the highlight perturbation information, and the highlight color range.
In a possible embodiment, the generating module, when determining the highlight color range of the target model at the current viewing angle, is configured to:
acquire a view vector of the current viewing angle;
determine a viewing-direction light range based on the view vector and the normal vectors of the vertices of the target model, the viewing-direction light range representing the highlight intensity corresponding to each vertex of the target model at the current viewing angle;
and determine the highlight color range based on the viewing-direction light range and a preset model color.
In a possible embodiment, the generating module, when generating the target highlight effect of the target model based on the highlight intensity information, the highlight perturbation information, and the highlight color range, is configured to:
multiply the highlight intensity information by the highlight perturbation information to obtain first highlight information, the first highlight information representing the highlight under the texture information;
and superpose the first highlight information and the highlight color range to generate the target highlight effect of the target model.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect described above, or any possible implementation of the first aspect.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the first aspect described above or of any possible implementation of the first aspect.
For a description of the effects of the highlight generation apparatus, the computer device, and the computer-readable storage medium, reference is made to the description of the highlight generation method above; details are not repeated here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the technical aspects of the disclosure.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
To illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings needed in the embodiments are briefly described below. The drawings here are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those of ordinary skill in the art can derive further related drawings from them without inventive effort.
Fig. 1 is a flowchart of a highlight generation method provided by an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a region for generating highlights provided by an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of a tangent perturbation image provided by an embodiment of the present disclosure;
Fig. 4 is a schematic architecture diagram of a highlight generation apparatus provided by an embodiment of the present disclosure;
Fig. 5 is a schematic structural diagram of a computer device provided by an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments, as generally described and illustrated in the figures here, can be arranged and designed in a wide variety of configurations. The following detailed description is therefore not intended to limit the scope of the claimed disclosure but merely represents selected embodiments; all other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present disclosure.
When building a three-dimensional model, highlights generally need to be added to the hair of a character model to make the model's display effect more realistic.
In the related art, producing a realistic hair highlight requires modeling each hair strand separately, so a corresponding highlight must be generated for every strand, which imposes a heavy performance burden. Alternatively, to reduce that burden, the hair can be divided into a number of patches and a highlight generated per patch, but the resulting effect is not very realistic. How to generate a realistic highlight effect while keeping the performance cost low is therefore an urgent problem to be solved.
Based on the above research, the present disclosure provides a highlight generation method, apparatus, computer device, and storage medium. The method first determines highlight intensity information of the target model, thereby fixing the range over which highlights are generated on the model; it then determines, from a tangent perturbation image, highlight perturbation information for controlling highlight texture, which drives the target model to reproduce the highlight texture of real hair at a low computing cost; finally, it generates the highlight effect from the highlight intensity information and the highlight perturbation information together, making the generated highlight effect more vivid.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an association relationship and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that A and B both exist, or that B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set consisting of A, B, and C.
To facilitate understanding of the present embodiment, the highlight generation method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method is generally a computer device with a certain computing capability, for example a client, a server, or another electronic device; the client may be a personal computer, a tablet computer, a smartphone, or the like. In some possible implementations, the highlight generation method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, which shows a flowchart of a highlight generation method provided by an embodiment of the present disclosure, the method includes steps 101 to 103:
Step 101: acquire a target model to which a highlight effect is to be added, and determine highlight intensity information of the target model, the highlight intensity information representing the highlight intensity corresponding to each vertex of the target model.
Step 102: acquire a tangent perturbation image corresponding to the target model, and determine, based on the tangent perturbation image, highlight perturbation information for controlling highlight texture, the tangent perturbation image representing texture information of a real object corresponding to the target model.
Step 103: generate a target highlight effect of the target model based on the highlight intensity information and the highlight perturbation information.
The following is a detailed description of the above steps:
For step 101:
The target model may be, for example, a human model or an animal model. The target model includes a plurality of vertices, which may be, for example, the vertices of each patch in the target model or the vertices of each mesh cell of the target model.
In a possible embodiment, when determining the highlight intensity information of the target model, the highlight intensity information of the hair region of the target model may be determined. The hair may be, for example, the hair of a human model or the fur of an animal model.
In one possible embodiment, the highlight intensity information of the target model may be determined through the following steps:
A1, obtaining a light vector of the scene where the target model is located and a view vector of the current viewing angle.
The light vector of the scene is the direction of the light in the scene where the target model is located; it is a preset value of the rendered scene and may be a unit vector. The view vector of the current viewing angle points from the current observation point toward the target model and is updated as the observation point and the observation angle change.
A2, generating, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction.
Specifically, since the highlight direction depends on the observation position and angle as well as on the light direction in the scene where the target model is located, the light vector and the view vector may be added to obtain the highlight direction vector.
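A minimal pure-Python sketch of steps A1 to A2; the function names are illustrative, and normalizing the sum (which turns it into the classic Blinn-Phong half vector) is an assumption, since the text only specifies the addition.

```python
def add3(a, b):
    # Component-wise sum of two 3-D vectors.
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def normalize(v):
    # Scale a vector to unit length so later dot products stay in a
    # predictable range.
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / length, v[1] / length, v[2] / length)

def highlight_direction(light_vec, view_vec):
    # Add the light vector and the view vector; normalizing the sum
    # yields a unit highlight direction vector.
    return normalize(add3(light_vec, view_vec))
```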
A3, determining a bitangent (secondary tangent) vector of each vertex of the target model based on the tangent vector and the normal vector of that vertex.
The tangent vector and the normal vector of each vertex of the target model are fixed values. For any vertex, the normal vector is perpendicular to the tangent plane at that vertex, the tangent direction is perpendicular to the normal vector, and the bitangent vector is perpendicular to both the tangent vector and the normal vector; the bitangent of each vertex can therefore be obtained by taking the cross product of its tangent vector and normal vector.
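The cross product described above can be sketched in a few lines (pure Python, illustrative name):

```python
def cross(n, t):
    # Cross product of the normal and tangent vectors; the result is
    # perpendicular to both, i.e. the vertex's bitangent.
    return (n[1] * t[2] - n[2] * t[1],
            n[2] * t[0] - n[0] * t[2],
            n[0] * t[1] - n[1] * t[0])
```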
A4, determining the highlight intensity information of the target model based on the highlight direction vector and the bitangent vectors of the vertices of the target model.
Specifically, the dot product of the highlight direction vector and the bitangent vector of each vertex may be computed; that is, the highlight is projected onto the bitangent direction of each vertex to obtain the highlight intensity at that vertex. The highlight intensity information is thus a per-vertex intensity value, and, as shown in fig. 2, it indicates the regions of the target model where highlights need to be generated: if the highlight intensity of point A is 0.2, that of point B is 0, and that of point C is 0.8, then highlights need to be generated at points A and C but not at point B.
In a possible implementation, when determining the highlight intensity information of the target model, the dot product of the highlight direction vector and the bitangent vector of each vertex may be computed, and the resulting first result then raised to a power to obtain the highlight intensity information.
Specifically, the exponentiation uses the first result computed at each vertex as the base and a first preset number as the exponent. Because the highlight intensity at the highlight's edge is low, exponentiation pushes the intensity at edge vertices close to 0, while the intensity at center vertices remains high even after exponentiation. The edges of the highlight generated from this intensity information therefore become more distinct, improving the display effect of the generated highlight.
For example, the highlight intensity of point A is 0.2, and the exponentiation gives 0.2 × 0.2 × 0.2 = 0.008; a highlight generated from an intensity of 0.008 is barely visible. The highlight intensity of point C is 0.8, and 0.8 × 0.8 × 0.8 = 0.512; a highlight generated from an intensity of 0.512 remains clearly visible.
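The exponentiation step can be sketched as follows; the input is the dot product of the highlight direction vector and a vertex's bitangent, the clamp to [0, 1] is an assumption, and the default exponent of 3 matches the worked example above.

```python
def highlight_intensity(h_dot_b, sharpen_power=3):
    # Clamp the dot product to [0, 1], then raise it to a power so
    # faint edge values fall toward 0 while bright center values
    # stay high, sharpening the highlight's edge.
    base = max(0.0, min(1.0, h_dot_b))
    return base ** sharpen_power
```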
For step 102:
The tangent perturbation image may be a highlight texture image of real hair. In a possible implementation, when acquiring the tangent perturbation image corresponding to the target model, an image of real hair may be acquired first and then decolored (converted to grayscale) to obtain the tangent perturbation image. The tangent perturbation image obtained this way is a grayscale map of real hair texture, as shown in fig. 3.
With this method, real texture information is obtained, so the target highlight effect subsequently generated from the tangent perturbation image carries a real texture.
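A minimal sketch of the decoloring step, assuming the common Rec. 601 luminance weights (the text does not specify which grayscale conversion is used) and RGB values already normalized to [0, 1]; the function name is illustrative.

```python
def to_tangent_perturbation(rgb_pixels):
    # Convert a list of (r, g, b) pixels from a hair photograph into a
    # grayscale brightness map using Rec. 601 luminance weights.
    return [0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in rgb_pixels]
```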
In one possible implementation, the highlight perturbation information for controlling highlight texture may be determined from the tangent perturbation image through the following steps (B1 and B2 may be executed in either order):
B1, obtaining a light vector of the scene where the target model is located and a view vector of the current viewing angle, and generating, based on the light vector and the view vector, a highlight direction vector characterizing the highlight direction.
Here, the highlight direction vector is generated in the same way as in A1 to A2 above, and the description is not repeated.
B2, determining a perturbed normal vector of the target model based on the tangent perturbation image and the normal vectors of the vertices of the target model.
Specifically, the tangent perturbation image contains the brightness of each pixel, and the normal vector of each vertex of the target model is a three-dimensional vector in a three-dimensional coordinate system, such as (1, 2, 3), where the x axis runs left-right, the y axis up-down, and the z axis in-out. Since hair runs in the up-down direction, that is, the real texture of a hair highlight runs up-down, the y component of each vertex's normal vector can be multiplied by the brightness of that vertex's corresponding pixel in the tangent perturbation image to obtain the perturbed normal vector of each vertex.
In a possible implementation manner, after the y-axis numerical value of the normal vector of each vertex of the target model is multiplied by the brightness information of the corresponding pixel point of each vertex in the tangent disturbance image, the multiplied second result may be subjected to remapping processing to obtain the disturbance normal vector.
Specifically, in order to control the value range of the disturbance normal vector of each vertex, when remapping the multiplied second result, the second result of each vertex may be multiplied by a first preset weight to obtain the disturbance normal vector. Illustratively, when the second result of each vertex falls between 0 and 4, the second result of each vertex may be multiplied by 0.25 so that the disturbance normal vector of each vertex falls between 0 and 1.
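The per-vertex computation of B2 can be sketched as follows; the weight value 0.25 and the y-axis layout follow the example above, and the function is a sketch rather than the patent's exact shader code:

```python
def perturb_normal(normal, brightness, weight=0.25):
    # B2 sketch: scale the y component (the along-hair direction in
    # the example) by the brightness sampled from the tangent
    # disturbance image, then remap the product ("second result") by
    # a first preset weight to control its value range.
    x, y, z = normal
    second_result = y * brightness
    return (x, second_result * weight, z)
```

With the example normal (1, 2, 3) and a sampled brightness of 2, the y component becomes 2 x 2 x 0.25 = 1 while x and z pass through unchanged.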
And B3, determining highlight disturbance information of the target model based on the highlight direction vector and the disturbance normal vector.
Specifically, in order to combine the highlight effect with the texture information, a dot product may be computed between the highlight direction vector and the disturbance normal vector of each vertex, mapping the highlight effect onto the disturbance normal vector and thereby obtaining highlight disturbance information that reflects the highlight texture information.
By adopting this method, the highlight disturbance information for controlling the highlight texture is generated from real texture information, so that the highlight effect subsequently generated from the highlight disturbance information has a realistic textured appearance.
In a possible implementation manner, when determining the highlight disturbance information of the target model based on the highlight direction vector and the disturbance normal vector, a dot product may first be computed between the highlight direction vector and the disturbance normal vector, and the calculated third result may then be subjected to a power calculation to obtain the highlight disturbance information.
Specifically, when performing the power calculation on the third result, the third result calculated at each vertex may be used as the base and a third preset value as the exponent. In this way, a small third result becomes smaller while a large third result remains large, so that after the highlight disturbance information is generated from the third result, the texture generated from the highlight disturbance information is brighter in bright places and darker in dark places, making the generated highlight texture more pronounced.
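The dot-product-then-power step of B3 can be sketched as follows; the exponent value 8 is an illustrative choice, not taken from the patent:

```python
def highlight_perturbation(h_dir, perturbed_normal, exponent=8):
    # B3 sketch: dot the highlight direction vector with the
    # disturbance normal vector, then raise the (clamped) third
    # result to a preset power, so bright vertices stay bright
    # while dim vertices fall off quickly.
    d = sum(a * b for a, b in zip(h_dir, perturbed_normal))
    return max(d, 0.0) ** exponent
```

Clamping negative dot products to zero keeps vertices facing away from the highlight direction dark; the power then sharpens the contrast between bright and dim vertices.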
For step 103:
Specifically, the highlight intensity information and the highlight disturbance information may be multiplied to obtain a target highlight effect of the target model.
Here, the highlight intensity information controls whether a highlight exists at each vertex: after multiplication by the highlight disturbance information, a vertex whose highlight intensity information is 0 still yields 0, so no highlight is generated there, and a vertex whose highlight intensity information is very low yields a brightness value too low to be visible. The highlight intensity information therefore determines the generation range of the highlight.
Similarly, after the highlight intensity information and the highlight disturbance information are multiplied, the relative brightness of each vertex in the highlight disturbance information is preserved (for example, vertices at hair gaps remain dark while vertices on the hair surface remain bright), so the generated target highlight effect still retains the texture effect.
The result of multiplying the highlight intensity information by the highlight disturbance information is the highlight brightness value of each vertex of the target model, and a grayscale target highlight effect can be generated based on these per-vertex highlight brightness values.
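The combination in step 103 can be sketched as a per-vertex product over lists of vertex values (a sketch; the names are illustrative):

```python
def grayscale_highlight(intensity, perturbation):
    # Step 103 sketch: per-vertex product of highlight intensity and
    # highlight disturbance. Zero-intensity vertices yield zero, so
    # the intensity map bounds where highlights appear, while the
    # disturbance values preserve the brightness contrast of the
    # texture.
    return [i * p for i, p in zip(intensity, perturbation)]
```

A vertex with intensity 0 stays black regardless of its disturbance value, illustrating how the intensity information determines the generation range of the highlight.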
In a possible implementation manner, the highlight intensity information and the highlight disturbance information only describe the highlight intensity of each vertex of the target model and carry no color. Therefore, in order to make the generated target highlight effect more natural, when generating the target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information, a highlight color range of the target model at the current observation angle may first be determined, where the highlight color range represents the color of each vertex of the target model under the influence of the highlight at the current observation angle; the target highlight effect of the target model is then generated based on the highlight intensity information, the highlight disturbance information, and the highlight color range.
By adopting this method, the hue of the generated target highlight effect can be adjusted through the highlight color range, so that the target highlight effect blends better with the color of the target model and the lighting of the scene where the target model is located, making the target highlight effect more vivid.
In a possible implementation manner, when determining the highlight color range of the target model at the current observation angle, a sight line vector of the current observation angle may be obtained first; a viewing angle direction light range may then be determined based on the sight line vector and the normal vectors of the vertices of the target model, where the viewing angle direction light range represents the highlight intensity of each vertex of the target model at the current observation angle; finally, the highlight color range may be determined based on the viewing angle direction light range and a preset model color.
The preset model color may be the base color of the region of the target model to which the highlight is to be added; for example, when a highlight is added to the hair of a character model, the model color may be the base color of the hair. Specifically, when determining the viewing angle direction light range based on the sight line vector and the normal vectors of the target model, the light range may be generated by taking the dot product of the sight line vector with the normal vector of each vertex of the target model, that is, by mapping the sight line vector onto the direction of each vertex normal.
In a possible implementation manner, after the dot product of the sight line vector and the normal vector of each vertex of the target model is taken, the resulting fourth result may be subjected to a power calculation, and the fourth result after the power calculation may be remapped to obtain the viewing angle direction light range.
Specifically, when performing the power calculation on the fourth result, the fourth result calculated at each vertex may be used as the base and a fourth preset value as the exponent. In this way, a small fourth result becomes smaller while a large fourth result remains large; this adjusts the highlight brightness of each vertex within the viewing angle direction light range, makes the highlight edge at this viewing angle clearer, and allows the size of the highlight area at this viewing angle to be controlled.
Then, in order to control the value range of the viewing angle direction light range, when remapping the fourth result obtained by the power calculation, the fourth result of each vertex may be multiplied by a second preset weight (e.g., 0.3) to obtain the viewing angle direction light range.
When determining the highlight color range based on the viewing angle direction light range and a preset model color, the model color may be an RGB color, and the highlight color range may be obtained by multiplying the viewing angle direction light range by the model color.
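The view-dependent color computation described above can be sketched as follows; the weight value 0.3 follows the example, while the exponent value 4 is an illustrative assumption:

```python
def highlight_color_range(sight_vec, normals, model_color,
                          exponent=4, weight=0.3):
    # Sketch of the highlight color range: dot the sight line vector
    # with each vertex normal, raise the clamped fourth result to a
    # preset power to tighten the lit region, remap it by a second
    # preset weight, then tint by the model's base RGB color.
    colors = []
    for n in normals:
        d = max(sum(v * c for v, c in zip(sight_vec, n)), 0.0)
        light = (d ** exponent) * weight
        colors.append(tuple(light * c for c in model_color))
    return colors
```

A vertex whose normal faces the viewer receives the full remapped weight times the model color; vertices turned away receive progressively less.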
By adopting this method, the viewing angle direction light range gives the generated highlight a lit, glossy appearance, and blending in the model color makes the generated highlight color fit the base color of the target model better, so the display effect is more natural.
In a possible implementation manner, when generating the target highlight effect of the target model based on the highlight intensity information, the highlight disturbance information, and the highlight color range, the highlight intensity information and the highlight disturbance information may be multiplied to obtain first highlight information, where the first highlight information represents the highlight information under the texture information; the first highlight information is then superposed with the highlight color range to generate the target highlight effect of the target model.
The result of superposing the first highlight information and the highlight color range is a color value (such as an RGB value) of the highlight at each vertex, and a colored target highlight effect can be generated based on these per-vertex highlight color values.
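The final combination can be sketched as follows; per-channel addition (clamped to 1.0) is one plausible reading of "superposing", since the patent does not spell out the operator:

```python
def target_highlight(first_highlight, color_range):
    # Final combination sketch: superpose the grayscale first
    # highlight value of each vertex onto its highlight color range
    # entry to get a per-vertex RGB highlight. The additive, clamped
    # interpretation of "superpose" is an assumption.
    return [tuple(min(f + c, 1.0) for c in rgb)
            for f, rgb in zip(first_highlight, color_range)]
```

Each vertex thus ends up with an RGB value combining its texture-driven brightness and its view-dependent highlight color.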
Here, the first highlight information incorporates both texture information and highlight brightness information, while the highlight color range incorporates color information and highlight brightness information. Controlling the target highlight effect jointly through the first highlight information and the highlight color range allows the generated target highlight effect to combine multiple factors, possessing both a realistic texture and a natural lighting effect.
The highlight generation method provided by the embodiments of the present disclosure can determine the highlight intensity information of the target model, and thereby the generation range of the highlight on the target model; highlight disturbance information for controlling the highlight texture is then determined from the tangent disturbance image, which enables the target model to exhibit the highlight texture of real hair while occupying only limited computing resources; finally, the highlight effect is generated based on the highlight intensity information and the highlight disturbance information, making the generated highlight effect more vivid.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation process; the specific order of execution of the steps should be determined by their function and possible internal logic.
Based on the same inventive concept, a highlight generating device corresponding to the highlight generating method is also provided in the embodiments of the present disclosure, and as the principle of solving the problem of the device in the embodiments of the present disclosure is similar to the highlight generating method in the embodiments of the present disclosure, the implementation of the device may refer to the implementation of the method, and repeated details are not repeated.
Referring to fig. 4, a schematic diagram of the architecture of a highlight generating device provided in an embodiment of the present disclosure is shown. The device includes: a first determining module 401, a second determining module 402, and a generating module 403; wherein:
the first determining module 401 is configured to obtain a target model to be added with a highlight effect, and determine highlight intensity information of the target model; the high light intensity information is used for representing high light intensity corresponding to each vertex in the target model;
a second determining module 402, configured to obtain a tangent disturbance image corresponding to the target model, and determine highlight disturbance information for controlling highlight texture based on the tangent disturbance image; the tangent line disturbance image is used for representing texture information of a real object corresponding to the target model;
a generating module 403, configured to generate a target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information.
In a possible implementation, the first determining module 401, when determining the high light intensity information of the target model, is configured to:
obtaining a light vector of a scene where the target model is located and a sight line vector of a current observation visual angle;
generating a highlight direction vector for characterizing a highlight direction based on the ray vector and the sight line vector;
determining a secondary tangent vector of each vertex of the target model based on the tangent vector and the normal vector of each vertex of the target model;
determining highlight intensity information of the target model based on the highlight direction vectors and sub-tangent vectors of respective vertices of the target model.
In a possible implementation, the second determining module 402, when determining highlight perturbation information for controlling highlight texture based on the tangent perturbation image, is configured to:
obtaining a light vector of the scene where the target model is located and a sight line vector of the current observation angle, and generating a highlight direction vector for characterizing the highlight direction based on the light vector and the sight line vector; and,
determining a disturbance normal vector of the target model based on the tangent disturbance image and normal vectors of all vertexes of the target model;
and determining highlight disturbance information of the target model based on the highlight direction vector and the disturbance normal vector.
In a possible implementation, the first determining module 401, when determining the high light intensity information of the target model, is configured to:
determining high light intensity information of a hair region of the target model;
the second determining module 402, when acquiring the tangent disturbance image corresponding to the target model, is configured to:
acquiring a real hair image;
and performing decolorizing treatment on the hair image to obtain the tangent line disturbance image.
In a possible implementation, the generating module 403, when generating the target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information, is configured to:
determining a highlight color range of the target model under the current observation visual angle; wherein the highlight color range is used for representing the color of each vertex in the target model under the influence of highlight under the current observation visual angle;
generating a target highlight effect of the target model based on the highlight intensity information, the highlight disturbance information and the highlight color range.
In a possible embodiment, the generating module 403, when determining the highlight color range of the target model at the current viewing angle, is configured to:
acquiring a sight vector of a current observation visual angle;
determining a viewing angle direction light range based on the sight line vector and normal vectors of each vertex of the target model; the viewing angle direction light range is used for representing high light intensity corresponding to each vertex of the target model under the current observation viewing angle;
and determining the highlight color range based on the visual angle direction light range and a preset model color.
In a possible implementation, the generating module 403, when generating the target highlight effect of the target model based on the highlight intensity information, the highlight disturbance information and the highlight color range, is configured to:
multiplying the highlight intensity information and the highlight disturbance information to obtain first highlight information; wherein the first highlight information is used for representing highlight information under the texture information;
and superposing the first highlight information and the highlight color range to generate a target highlight effect of the target model.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, an embodiment of the present disclosure also provides a computer device. Referring to fig. 5, a schematic structural diagram of a computer device 500 provided in an embodiment of the present disclosure is shown, including a processor 501, a memory 502, and a bus 503. The memory 502 is used for storing execution instructions and includes an internal memory 5021 and an external memory 5022; the internal memory 5021 temporarily stores operation data for the processor 501 and data exchanged with the external memory 5022, such as a hard disk. The processor 501 exchanges data with the external memory 5022 through the internal memory 5021, and when the computer device 500 runs, the processor 501 communicates with the memory 502 through the bus 503, so that the processor 501 executes the following instructions:
acquiring a target model to be added with highlight effect, and determining highlight intensity information of the target model; the high light intensity information is used for representing high light intensity corresponding to each vertex in the target model;
acquiring a tangent disturbance image corresponding to the target model, and determining highlight disturbance information for controlling highlight texture based on the tangent disturbance image; the tangent line disturbance image is used for representing texture information of a real object corresponding to the target model;
and generating a target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information.
The embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the highlight generation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure further provide a computer program product, where the computer program product bears a program code, and instructions included in the program code may be used to execute the steps of the highlight generation method in the foregoing method embodiments, which may be specifically referred to in the foregoing method embodiments and are not described herein again.
The computer program product may be implemented by hardware, software, or a combination thereof. In an optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described again here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, device, and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only one logical division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connections shown or discussed may be indirect couplings or communication connections between devices or units through communication interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used for illustrating its technical solutions rather than limiting them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art can, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present disclosure and shall all be covered by its protection scope. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
If the technical solution of the present application involves personal information, a product applying the technical solution clearly informs users of the personal information processing rules and obtains their individual consent before processing the personal information. If the technical solution involves sensitive personal information, a product applying the technical solution obtains individual consent before processing the sensitive personal information and additionally satisfies the requirement of "express consent". For example, at a personal information collection device such as a camera, a clear and prominent notice is posted to inform users that they are entering the personal information collection range and that personal information will be collected; if a person voluntarily enters the collection range, this is regarded as consent to the collection of their personal information. Alternatively, on a device that processes personal information, personal authorization is obtained under a prominent notice of the personal information processing rules, through a pop-up message or by asking the person to upload their personal information themselves. The personal information processing rules may include information such as the personal information processor, the purpose of processing, the processing method, and the types of personal information to be processed.

Claims (10)

1. A highlight generation method, comprising:
acquiring a target model to be added with highlight effect, and determining highlight intensity information of the target model; the high light intensity information is used for representing high light intensity corresponding to each vertex in the target model;
acquiring a tangent disturbance image corresponding to the target model, and determining highlight disturbance information for controlling highlight texture based on the tangent disturbance image; the tangent line disturbance image is used for representing texture information of a real object corresponding to the target model;
and generating a target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information.
2. The method of claim 1, wherein the determining high light intensity information for the target model comprises:
obtaining a light vector of a scene where the target model is located and a sight line vector of a current observation visual angle;
generating a highlight direction vector for characterizing a highlight direction based on the ray vector and the sight line vector;
determining a secondary tangent vector of each vertex of the target model based on the tangent vector and the normal vector of each vertex of the target model;
determining highlight intensity information of the target model based on the highlight direction vectors and sub-tangent vectors of respective vertices of the target model.
3. The method according to claim 1 or 2, wherein the determining highlight perturbation information for controlling highlight texture based on the tangent perturbation image comprises:
obtaining a light vector of the scene where the target model is located and a sight line vector of the current observation angle, and generating a highlight direction vector for characterizing the highlight direction based on the light vector and the sight line vector; and,
determining a disturbance normal vector of the target model based on the tangent disturbance image and normal vectors of all vertexes of the target model;
and determining highlight disturbance information of the target model based on the highlight direction vector and the disturbance normal vector.
4. The method according to any one of claims 1 to 3, wherein the determining high light intensity information of the target model comprises:
determining high light intensity information of a hair region of the target model;
the acquiring of the tangent disturbance image corresponding to the target model includes:
acquiring a real hair image;
and performing decolorizing treatment on the hair image to obtain the tangent line disturbance image.
5. The method according to any one of claims 1 to 4, wherein the generating the target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information comprises:
determining a highlight color range of the target model under the current observation visual angle; wherein the highlight color range is used for representing the color of each vertex in the target model under the influence of highlight under the current observation visual angle;
generating a target highlight effect of the target model based on the highlight intensity information, the highlight disturbance information and the highlight color range.
6. The method of claim 5, wherein determining a highlight color range for the target model at the current viewing perspective comprises:
acquiring a sight vector of a current observation visual angle;
determining a viewing angle direction light range based on the sight line vector and normal vectors of each vertex of the target model; the viewing angle direction light range is used for representing high light intensity corresponding to each vertex of the target model under the current observation viewing angle;
and determining the highlight color range based on the visual angle direction light range and a preset model color.
7. The method of claim 5 or 6, wherein generating the target highlight effect of the target model based on the highlight intensity information, the highlight disturbance information, and the highlight color range comprises:
multiplying the highlight intensity information and the highlight disturbance information to obtain first highlight information; wherein the first highlight information is used for representing highlight information under the texture information;
and superposing the first highlight information and the highlight color range to generate a target highlight effect of the target model.
8. A highlight generating device, comprising:
the first determination module is used for acquiring a target model to be added with highlight effect and determining highlight intensity information of the target model; the high light intensity information is used for representing high light intensity corresponding to each vertex in the target model;
the second determining module is used for acquiring a tangent disturbance image corresponding to the target model and determining highlight disturbance information for controlling highlight texture based on the tangent disturbance image; the tangent line disturbance image is used for representing texture information of a real object corresponding to the target model;
and the generating module is used for generating a target highlight effect of the target model based on the highlight intensity information and the highlight disturbance information.
9. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when a computer device is run, the machine-readable instructions when executed by the processor performing the steps of the highlight generation method according to any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, performs the steps of the highlight generation method according to any of the claims 1 to 7.
CN202210662721.3A 2022-06-13 2022-06-13 Highlight generation method and device, computer equipment and storage medium Pending CN115082614A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210662721.3A CN115082614A (en) 2022-06-13 2022-06-13 Highlight generation method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210662721.3A CN115082614A (en) 2022-06-13 2022-06-13 Highlight generation method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115082614A true CN115082614A (en) 2022-09-20

Family

ID=83251975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210662721.3A Pending CN115082614A (en) 2022-06-13 2022-06-13 Highlight generation method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115082614A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115861520A (en) * 2023-02-02 2023-03-28 深圳思谋信息科技有限公司 Highlight detection method and device, computer equipment and storage medium
CN115861520B (en) * 2023-02-02 2023-04-28 深圳思谋信息科技有限公司 Highlight detection method, highlight detection device, computer equipment and storage medium
CN116342848A (en) * 2023-03-28 2023-06-27 云阳县优多科技有限公司 Intelligent manufacturing method and system for toy
CN116342848B (en) * 2023-03-28 2024-02-02 云阳县优多科技有限公司 Intelligent manufacturing method and system for toy

Similar Documents

Publication Publication Date Title
CN107154030B (en) Image processing method and device, electronic equipment and storage medium
CN115082614A (en) Highlight generation method and device, computer equipment and storage medium
CN111784821B (en) Three-dimensional model generation method and device, computer equipment and storage medium
CN107564080B (en) Face image replacement system
CN113838176B (en) Model training method, three-dimensional face image generation method and three-dimensional face image generation equipment
CN109636890B (en) Texture fusion method and device, electronic equipment, storage medium and product
CN113205568A (en) Image processing method, image processing device, electronic equipment and storage medium
CN109005368A (en) A kind of generation method of high dynamic range images, mobile terminal and storage medium
CN109410309B (en) Relighting method and device, electronic equipment and computer storage medium
CN111861632A (en) Virtual makeup trial method and device, electronic equipment and readable storage medium
CN112419144A (en) Face image processing method and device, electronic equipment and storage medium
CN111653175B (en) Virtual sand table display method and device
CN111383311B (en) Normal map generation method, device, equipment and storage medium
CN114529657A (en) Rendering image generation method and device, computer equipment and storage medium
CN108665498B (en) Image processing method, device, electronic equipment and storage medium
CN111260767B (en) Rendering method, rendering device, electronic device and readable storage medium in game
CN110310357B (en) Model interleaving processing method and device, computing equipment and storage medium
US10754498B2 (en) Hybrid image rendering system
CN110838167A (en) Model rendering method and device and storage medium
CN114581592A (en) Highlight rendering method and device, computer equipment and storage medium
CN115359170A (en) Scene data generation method and device, electronic equipment and storage medium
CN115063330A (en) Hair rendering method and device, electronic equipment and storage medium
CN114612614A (en) Human body model reconstruction method and device, computer equipment and storage medium
CN114529656A (en) Shadow map generation method and device, computer equipment and storage medium
CN114627225A (en) Method and device for rendering graphics and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination