CN110992248B - Lip makeup special effect display method, device, equipment and storage medium

Info

Publication number
CN110992248B
CN110992248B
Authority
CN
China
Prior art keywords
lip
color
special effect
dyeing
virtual character
Legal status
Active
Application number
CN201911183742.1A
Other languages
Chinese (zh)
Other versions
CN110992248A (en)
Inventor
刘电
屈禹呈
化超煜
李薇薇
陆佳能
丁程峰
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201911183742.1A
Priority to CN202011098254.3A
Publication of CN110992248A
Application granted
Publication of CN110992248B
Legal status: Active

Classifications

    • G06T3/04
    • G06T5/77
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Abstract

The application discloses a method, apparatus, device and storage medium for displaying a lip makeup special effect, belonging to the field of human-computer interaction. The method comprises the following steps: obtaining a custom lip makeup setting corresponding to a virtual character; determining the generation parameters of the lip makeup special effect according to the custom lip makeup setting; and displaying the lip makeup special effect of the virtual character according to the generation parameters, wherein the lip makeup special effect comprises at least one of a jelly texture special effect and a paillette (glitter) particle special effect. The application thereby realizes a realistic simulation of the jelly-lip effect and the glitter effect of lip gloss.

Description

Lip makeup special effect display method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the field of human-computer interaction, in particular to a method, a device, equipment and a storage medium for displaying a lip makeup special effect.
Background
Applications such as online games, simulated makeup, and simulated life provide female virtual characters. As requirements for realism and freedom of customization rise, customization of a virtual character is no longer limited to face shape, feature positions, skin color and the like; lip makeup customization is also supported.
In the related art, a lip makeup special effect such as lip balm, lipstick, lip gloss or lip glaze can be added to the virtual character. The lip makeup special effect is realized as follows: a diffuse reflection map, a normal map, a roughness parameter and a metalness parameter are set for the lip, and then the diffuse reflection map and the normal map are rendered with a physical illumination model according to the roughness and metalness, obtaining a lip with the lip makeup special effect.
Although the color of the lip makeup can be defined through the diffuse reflection map, and the color and roughness can be varied, this is in essence a surface-based illumination model solution and cannot simulate real-world lip gloss in a realistic manner.
Disclosure of Invention
The embodiments of the application provide a method, apparatus, device and storage medium for displaying a lip makeup special effect, which can solve the problem in the related art that only the color of the lip makeup can be defined and real-world lip color cannot be simulated realistically. The technical solution is as follows:
according to one aspect of the present application, there is provided a method of displaying a lip cosmetic effect, the method comprising:
obtaining basic highlight color, main light source direction dyeing and sight line direction dyeing of lip pixel points of the virtual character;
mixing the basic highlight color, the main light source direction dyeing and the sight line direction dyeing according to different directions to obtain a mixed highlight color;
and inputting the mixed highlight color into an illumination model for rendering, and displaying the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a jelly texture.
According to one aspect of the present application, there is provided a lip makeup effect display device, the device comprising:
an obtaining module, configured to obtain a generation parameter of the lip makeup special effect of the virtual character, where the generation parameter of the lip makeup special effect includes: basic highlight color, main light source direction dyeing and sight line direction dyeing of lip pixel points of the virtual character;
the mixing module is used for mixing the basic highlight color, the main light source direction dyeing and the sight line direction dyeing according to different directions to obtain a mixed highlight color;
and the display module is used for inputting the mixed highlight color into an illumination model for rendering, and displaying the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a jelly texture.
According to one aspect of the present application, there is provided a method of displaying a lip cosmetic effect, the method comprising:
obtaining a first highlight color of a lip pixel point of a virtual character;
acquiring a random color value when the lip pixel point belongs to a paillette area in a paillette map, wherein the paillette map is a map used for indicating the paillette areas and non-paillette areas of the lip of the virtual character;
mixing the first highlight color and the random color value of the lip pixel point belonging to the paillette area to obtain a mixed second highlight color;
and inputting the first highlight color of the lip pixel points belonging to the non-paillette area and the second highlight color of the lip pixel points belonging to the paillette area into an illumination model for rendering, and displaying the lip special effect of the virtual character, wherein the lip special effect has a paillette particle texture.
According to one aspect of the present application, there is provided a lip makeup effect display device, the device comprising:
the acquisition module is used for acquiring a first highlight color of a lip pixel point of the virtual character;
the obtaining module is further configured to acquire a random color value when the lip pixel point belongs to a paillette area in a paillette map, where the paillette map is a map used for indicating the paillette areas and non-paillette areas of the lip of the virtual character;
the mixing module is used for mixing the first highlight color and the random color value of the lip pixel point belonging to the paillette area to obtain a mixed second highlight color;
and the display module is used for inputting the first highlight color of the lip pixel point belonging to the non-paillette area and the second highlight color of the lip pixel point belonging to the paillette area into an illumination model for rendering, so that the lip special effect of the virtual character is displayed, and the lip special effect has paillette particle texture.
According to one aspect of the present application, there is provided a method of displaying a lip cosmetic effect, the method comprising:
displaying a lip makeup setting interface corresponding to the virtual character, wherein the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
responding to a setting operation, customizing generation parameters of the lip makeup special effect, wherein the generation parameters comprise: at least one of a basic highlight color, a main light source direction coloring and a sight line direction coloring of lip pixel points of the virtual character;
displaying a preview picture of the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a jelly texture.
According to one aspect of the present application, there is provided a method of displaying a lip cosmetic effect, the method comprising:
displaying a lip makeup setting interface corresponding to the virtual character, wherein the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
responding to a setting operation, customizing the generation parameters of the lip makeup special effect, wherein the generation parameters comprise: at least one of a basic highlight color of a lip pixel point of the virtual character and a map parameter of the paillette map;
displaying a preview picture of the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a paillette particle texture.
According to another aspect of the present application, there is provided a computer device comprising a memory and a processor; the memory stores at least one program that is loaded and executed by the processor to implement the lip makeup special effect display method as described above.
According to another aspect of the present application, there is provided a computer readable storage medium having at least one program stored therein, the at least one program being loaded and executed by a processor to implement the lip makeup special effect display method as described above.
According to another aspect of the present application, there is provided a computer program product having at least one program stored therein, the at least one program being loaded and executed by a processor to implement the method of displaying a lip cosmetic effect as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
obtaining the basic highlight color, main-light-source-direction dyeing and sight-direction dyeing of lip pixel points of the virtual character; mixing the basic highlight color, main-light-source-direction dyeing and sight-direction dyeing according to the different directions to obtain a mixed highlight color; and inputting the mixed highlight color into the illumination model for rendering, displaying the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a jelly texture. This realizes a realistic simulation of the jelly-lip effect of lip gloss, and in particular of the visual special effect of jelly lips observed from different sight directions.
Obtaining a first highlight color of a lip pixel point of the virtual character; acquiring a random color value when the lip pixel point belongs to a paillette area in a paillette map; mixing the first highlight color and the random color value of the lip pixel points belonging to the paillette area to obtain a mixed second highlight color; and inputting the first highlight color of the lip pixel points belonging to the non-paillette area and the second highlight color of the lip pixel points belonging to the paillette area into an illumination model for rendering, displaying the lip special effect of the virtual character, wherein the lip special effect has a paillette particle texture. This realizes a realistic simulation of the paillette effect of lip gloss, and in particular of the paillette particles flashing in many colors in a paillette lip gloss.
Because the rendering pipeline is not affected, the scheme can run on a mobile terminal platform (such as a mobile phone or a tablet computer) at a relatively low rendering cost. In addition, the parameterized implementation reduces the number of maps required for customization; the effect of 'a thousand people, a thousand lips' can be achieved with only a few custom parameters.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a lip makeup effect display method provided in accordance with another exemplary embodiment of the present application;
FIG. 4 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a lip makeup effect display method provided in accordance with another exemplary embodiment of the present application;
FIG. 7 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 8 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of a paillette map provided by another exemplary embodiment of the present application;
FIG. 10 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 11 is a schematic diagram of a lip makeup effect display method provided in accordance with another exemplary embodiment of the present application;
FIG. 12 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 13 is a schematic diagram of a lip makeup effect display method provided in accordance with another exemplary embodiment of the present application;
FIG. 14 is a flow chart of a method for displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 15 is an interface schematic of a method of displaying a lip makeup effect according to another exemplary embodiment of the present application;
FIG. 16 is an interface schematic of a method of displaying a lip cosmetic effect provided by another exemplary embodiment of the present application;
FIG. 17 is a flow chart of a method for displaying a lip makeup effect according to another exemplary embodiment of the present application;
FIG. 18 is an effect diagram of different lip makeup special effects on different characters provided by another exemplary embodiment of the present application;
FIG. 19 is an effect diagram of different lip makeup special effects on the same character provided by another exemplary embodiment of the present application;
FIG. 20 is a comparison of a lip makeup special effect with a real photograph, provided by another exemplary embodiment of the present application;
FIG. 21 is a block diagram of a lip makeup effect display device provided in accordance with another exemplary embodiment of the present application;
FIG. 22 is a block diagram of a lip makeup effect display device provided in accordance with another exemplary embodiment of the present application;
FIG. 23 is a block diagram of a computer device provided in another exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
Although the following description uses the terms first, second, etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first direction may be referred to as a second direction, and similarly, a second direction may be referred to as a first direction, without departing from the scope of the various described examples. Both the first direction and the second direction are directions and, in some cases, may be separate and different directions.
The terminology used in the description of the various described examples herein is for the purpose of describing particular examples only and is not intended to be limiting. As used in the description of the various described examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Depending on the context, the term "if" may be interpreted to mean "when" or "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining" or "in response to determining" or "upon detecting [a stated condition or event]" or "in response to detecting [a stated condition or event]", depending on the context.
Several nouns of the present application are first introduced:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment, also called a virtual world. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment. The virtual environment related to the embodiment of the application comprises a virtual environment in the time of no competitive fight and a virtual environment in which competitive fight is carried out.
Virtual character: refers to a movable object in a virtual environment. The movable object may be a virtual person, a virtual animal, an animation character, etc., such as a person or animal displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional volumetric model created based on skeletal animation techniques. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a portion of its space. For example, the virtual character in the embodiments of the present application may be a virtual character with a lip makeup special effect, such as a female virtual character.
Viewing angle: refers to an observation angle when the virtual character is observed in the virtual world from the first person perspective or the third person perspective. Optionally, in an embodiment of the present application, the perspective is a perspective when the virtual character is observed by the camera model in the virtual world.
Optionally, the camera model automatically follows the virtual character in the virtual world, that is, when the position of the virtual character in the virtual world changes, the camera model changes while following the position of the virtual character in the virtual world, and the camera model is always within the preset distance range of the virtual character in the virtual world. Optionally, the relative positions of the camera model and the virtual character do not change during the automatic following process.
Camera model: a three-dimensional model positioned around the virtual character in the virtual world. When a first-person view is adopted, the camera model is located near or at the head of the virtual character. When a third-person view is adopted, the camera model may be located behind the virtual character and bound to it, or at any position a preset distance away from the virtual character; the virtual character in the virtual world can then be observed from different angles through the camera model. Optionally, when the third-person view is an over-the-shoulder view, the camera model is located behind the virtual character (for example, behind the head and shoulders). Optionally, besides the first-person and third-person views, other views such as a top view are possible; when a top view is used, the camera model may be located above the virtual character's head, looking down into the virtual world. Optionally, the camera model is not actually displayed in the virtual world, i.e. it is not shown in the virtual world displayed by the user interface.
To illustrate an example where the camera model is located at any position away from the virtual character by a preset distance, optionally, one virtual character corresponds to one camera model, and the camera model may rotate with the virtual character as a rotation center, for example: the camera model is rotated with any point of the virtual character as a rotation center, the camera model rotates not only angularly but also shifts in displacement during the rotation, and the distance between the camera model and the rotation center is kept constant during the rotation, that is, the camera model rotates on the surface of a sphere with the rotation center as the sphere center, wherein any point of the virtual character can be the head, the trunk or any point around the virtual character, which is not limited in the embodiment of the present application. Optionally, when the virtual character is observed by the camera model, the center of the view angle of the camera model points to the direction in which the point of the spherical surface on which the camera model is located points to the center of the sphere.
The map materials required by lip makeup rendering comprise at least one of the following: a diffuse reflection map (diffuse map), a normal map (normal map), and a paillette map.
Diffuse reflection map: a map used in the virtual environment to represent the reflection and surface color of an object. In other words, it expresses the color and intensity that an object presents under illumination. In the embodiments of the application, the diffuse reflection map mainly affects the appearance of the lips, such as a small cherry mouth, red lips or crimson lips.
Normal map: a map that affects the shading on the model surface to create a bump effect. The normal map does not change the shape of the model, but defines the normal, and thus the slope, of a surface.
Paillette map: a map used to simulate the influence of sequins or fine glitter powder on the light-and-shadow effect. The paillette map produces deflections of the lip normals and slight variations in metalness.
The generation parameters needed for lip makeup rendering include at least one of: diffuse reflection dyeing, metalness, roughness, metalness compensation, highlight color, main-light-source-direction dyeing, sight-direction dyeing and the color shift parameter.
Diffuse reflection dyeing: a coefficient for influencing the color of the diffuse reflection map. It shifts the original inherent color; for example, the diffuse reflection map may show a red lip, but diffuse reflection dyeing can turn it into a blue lip.
Metalness (metal degree): a parameter for influencing the ratio of specular to diffuse reflection. Metalness strongly affects how different lipstick textures come across: the metalness of a common organic material is 0, the diffuse proportion is large, the highlight has a reflectivity of 0.02-0.04, and the highlight color is gray; when the metalness is 1, the diffuse proportion is 0, the highlight reflectivity is greatly increased, and the highlight can take on various colors. Metalness is a basic parameter of physically based rendering and is controlled numerically, so it offers a larger adjustment space than the diffuse reflection map alone.
Metal degree compensation: because the diffuse color approaches black when the metalness is too high, the lips become dark; artists and users can decide whether to lighten the lip color through metal degree compensation.
Roughness: a parameter for influencing the roughness of the object surface.
Basic highlight color: when the metalness is not 0, the highlight color can vary, so a separate highlight color parameter is used to control it.
Main-light-source-direction dyeing: a color based on the main light source direction, used to represent the refracted color when the main light source shines on the semi-transparent lip makeup. A translucent lip makeup shows different colors when lit from different directions.
Sight-direction dyeing: a color based on the camera's sight direction, used to represent the refracted color when the semi-transparent lip makeup is observed along the sight direction. A translucent lip makeup shows different colors when viewed from different directions.
Color shift parameter: a parameter for varying the random color values reflected by the paillettes in a paillette lip makeup.
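
For concreteness, these generation parameters can be pictured as a single parameter record. The following minimal sketch (in Python) groups them into one structure; all field names, types and defaults are illustrative assumptions, not values specified by the application:

    from dataclasses import dataclass

    @dataclass
    class LipMakeupParams:
        # Illustrative container; names and defaults are assumptions.
        diffuse_tint: tuple = (1.0, 1.0, 1.0)     # diffuse reflection dyeing
        metalness: float = 0.0                    # specular-to-diffuse ratio
        metal_compensation: float = 0.0           # lightens dark high-metalness lips
        roughness_coeff: float = 1.0              # multiplies the stored roughness
        base_highlight: tuple = (1.0, 1.0, 1.0)   # basic highlight color S
        main_light_tint: tuple = (1.0, 1.0, 1.0)  # main-light-source-direction dyeing F
        view_tint: tuple = (1.0, 1.0, 1.0)        # sight-direction dyeing V
        color_shift: float = 0.0                  # paillette color shift parameter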
The related art is limited to a surface-based illumination model; it cannot represent the translucency of lips or the glittering look of some lip makeup, so the rendered lips are neither realistic nor colorful. Real-world lip gloss, however, is usually glossy or translucent. In particular, a translucent lip gloss applied with a certain thickness (a "thick coat") scatters the light it receives, and this scattering raises the color saturation in the dark regions of the lip.
The embodiments of the application provide a rendering scheme for the lip makeup special effect, which not only covers the lip makeup effects of the traditional scheme, but can also realize the paillette effect and the semi-transparent jelly-lip effect, providing users with a more realistic lip special effect with a higher degree of freedom. The rendering scheme can be applied to lip makeup customization of a female virtual character in a game, and also to a lip makeup try-on simulator in makeup software.
FIG. 1 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, an MOBA game and a multi-player gun battle type survival game. The first terminal 120 is a terminal used by a first user who uses the first terminal 120 to control a first virtual object located in a virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, saber attack, shooting, parachuting, creating a community, joining a community. Illustratively, the first virtual object is a first virtual character, such as a simulated character object or an animated character object.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Illustratively, the server 140 includes a processor 144 and a memory 142, the memory 142 in turn including a display module 1421, a control module 1422, and a receiving module 1423. The server 140 is used to provide background services for applications that support a three-dimensional virtual environment. Alternatively, the server 140 undertakes primary computational work and the first and second terminals 120, 160 undertake secondary computational work; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, an MOBA game and a multi-player gun battle type survival game. The second terminal 160 is a terminal used by a second user who uses the second terminal 160 to control a second virtual object located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, saber attack, shooting, parachuting, creating a community, joining a community. Illustratively, the second virtual object is a second virtual character, such as a simulated character object or an animated character object. Optionally, the second virtual object is a member of the same community as the first virtual object, or a member of a different community.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication authority.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application of different control system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals, and this embodiment is only illustrated by the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include: at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer. The following embodiments are illustrated with the terminal comprising a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Illustratively, the lip of the virtual character has a lip special effect, and the lip special effect supports user customization.
Fig. 2 is a flowchart illustrating a method for displaying a lip makeup special effect according to an exemplary embodiment of the present application. The method may be performed by the terminal shown in fig. 1. The method comprises the following steps:
Step 201, obtaining a custom lip makeup setting corresponding to the virtual character;
an application program runs in the terminal. The application is provided with a virtual environment (or virtual world). One or more avatars are present in the virtual environment, for example the avatars may be female avatars.
The application program is provided with a custom lip makeup setting interface corresponding to the virtual character. The custom lip makeup setting interface is used for the user to customize the generation parameters of the lip makeup special effect. The user obtains a custom lip makeup setting by adjusting at least one control on the custom lip makeup setting interface, as schematically shown in fig. 3.
After the user finishes the custom lip makeup setting, the application program stores the custom lip makeup setting in a local or server, and the custom lip makeup setting is read in subsequent operation.
Step 202, determining a generation parameter of a lip makeup special effect according to the custom lip makeup setting;
at least one setting item exists in the custom lip makeup setting. When the setting item belongs to a packaging setting item (or called an integrated setting item) of a plurality of generation parameters, the application program determines a plurality of generation parameters of the lip makeup special effect according to the packaging setting item; when the setting item belongs to a single setting item of a single generation parameter, the application program determines a certain generation parameter of the lip makeup special effect according to the single setting item.
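
As an illustration of the two kinds of setting items, the following sketch resolves a custom lip makeup setting into generation parameters, expanding packaged items and applying single items on top. All names and preset values are hypothetical; the application does not publish this logic:

    # Hypothetical packaged (integrated) setting item: one choice that
    # expands into several generation parameters at once.
    PACKAGED_ITEMS = {
        "jelly_red": {"diffuse_tint": (0.9, 0.2, 0.2),
                      "metalness": 0.3,
                      "base_highlight": (1.0, 0.8, 0.8)},
    }

    def resolve_settings(packaged_selected, single_items):
        # Expand packaged items, then apply single setting items on top.
        params = {}
        for name in packaged_selected:
            params.update(PACKAGED_ITEMS[name])
        params.update(single_items)
        return params

    print(resolve_settings(["jelly_red"], {"roughness_coeff": 0.5}))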
The head of the virtual character is provided with a three-dimensional model, and a plurality of lip pixel points exist on the lips of the virtual character. The map materials for the lip makeup special effect include, but are not limited to: the diffuse reflection map, the normal map, and the paillette map. The parameters for generating the lip makeup special effect include, but are not limited to: diffuse reflection dyeing, metalness, roughness coefficient, metalness compensation color, basic highlight color, main-light-source-direction dyeing, sight-direction dyeing and the color shift parameter.
In one example, the diffuse reflection map, the normal map and the paillette map are preset for the lips of the virtual character and are not customized by the user, while the other generation parameters can be customized by the user. In another example, multiple diffuse reflection maps, normal maps and paillette maps can be provided, and the user selects one of each.
Step 203, displaying the lip makeup special effect of the virtual character according to the generation parameters of the lip makeup special effect, wherein the lip makeup special effect comprises at least one of a jelly texture special effect and a paillette particle special effect.
The application program generates the illumination input parameters of each lip pixel point according to the generation parameters of the lip makeup special effect. The illumination input parameters include: the diffuse reflection color, the (merged) highlight color, the (merged) normal vector, and the roughness of the lip pixel point. The illumination input parameters of each lip pixel point are input into the illumination model, each lip pixel point is rendered, and the lip makeup special effect of the virtual character is thereby displayed.
The jelly texture special effect means that the transparency of the virtual character's lip makeup is between completely transparent and opaque, giving it a jelly texture with a certain thickness.
The paillette particle special effect means that the lip makeup of the virtual character has a plurality of paillette particles, and each paillette particle reflects a different color.
In summary, in the method provided by this embodiment, the custom lip makeup setting corresponding to the virtual character is obtained; the generation parameters of the lip makeup special effect are determined according to the custom lip makeup setting; and the lip makeup special effect of the virtual character is displayed according to those generation parameters, the lip makeup special effect comprising at least one of a jelly texture special effect and a paillette particle special effect. This realizes a realistic simulation of at least one of the jelly-lip effect and the paillette effect of lip gloss. Because the scheme does not affect the rendering pipeline, it can run on a mobile terminal platform (such as a mobile phone or a tablet computer) at a relatively low rendering cost. In addition, the parameterized implementation reduces the number of maps required for customization; the effect of 'a thousand people, a thousand lips' can be achieved using only the generation parameters of the lip makeup special effect.
Jelly texture special effect of lip makeup
Input map resources: a diffuse reflection map and a normal map;
Input generation parameters: basic highlight color, main-light-source-direction dyeing, sight-direction dyeing, diffuse reflection dyeing, metalness, roughness and metalness compensation color (optional).
As can be seen from fig. 3, the illumination input parameters of a lip pixel point include at least four parameters: the diffuse reflection color, highlight color, normal vector and roughness of the lip pixel point. The jelly texture special effect is mainly related to the highlight color. Based on the embodiment shown in fig. 2, fig. 4 is a flowchart illustrating a lip makeup special effect display method according to another exemplary embodiment of the present application. The method may be performed by the terminal shown in fig. 1. The method comprises the following steps:
Step 401, obtaining the basic highlight color, main-light-source-direction dyeing and sight-direction dyeing of a lip pixel point of the virtual character;
the parameters for generating the lip makeup special effect at least comprise: basic highlight color, main light source direction dyeing and sight direction dyeing of lip pixel points. The three generation parameters can all be customized by a user.
Step 402, mixing three parameters of basic highlight color, main light source direction dyeing and sight line direction dyeing according to different directions to obtain mixed highlight color;
in one example, the step 402 may include the following sub-steps, as shown in FIG. 5:
Step 4021, performing a first linear interpolation on the basic highlight color S and the main-light-source-direction dyeing F according to the main light source direction to obtain the first mixed highlight color;
The main light source direction is the direction in which the main light source in the virtual environment where the virtual character is located illuminates the lip pixel point. The first linear interpolation uses a gray value calculated by a Lambert illumination formula as the weight.
Illustratively, referring to fig. 6, the first mixed highlight color FCcolor is calculated as follows:
FCcolor = Lerp1(F, S, Nol);
Nol = NdotL * NdotL;
NdotL = dot(worldNormal, lightDir).
wherein F is the main-light-source-direction dyeing, S is the basic highlight color, worldNormal is the world normal, and lightDir is the main light source direction. Nol is the square of NdotL, dot is the dot product operation, and Lerp1 is the first linear interpolation.
Step 4022, performing a second linear interpolation on the first mixed highlight color FCcolor and the sight-direction dyeing V according to the sight direction to obtain the second mixed highlight color;
The sight direction is the direction from the camera model in the virtual environment toward the lip pixel point. The second linear interpolation uses a gray value calculated by a Fresnel illumination formula as the weight.
color = Lerp2(FCcolor, V, Nov);
Nov = NdotV * NdotV;
NdotV = dot(worldNormal, viewDir).
wherein V is the sight-direction dyeing, worldNormal is the world normal, and viewDir is the sight direction. Nov is the square of NdotV, dot is the dot product operation, and Lerp2 is the second linear interpolation.
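
Putting the two interpolations together, the highlight mixing can be sketched as follows (Python with numpy for the vector math; the variable names follow the formulas above, while the structure and the example values are assumptions):

    import numpy as np

    def lerp(a, b, t):
        return a + (b - a) * t

    def jelly_highlight(S, F, V, world_normal, light_dir, view_dir):
        # S: basic highlight color, F: main-light-source-direction dyeing,
        # V: sight-direction dyeing; all directions are unit vectors.
        NdotL = float(np.dot(world_normal, light_dir))
        fc_color = lerp(F, S, NdotL * NdotL)     # first interpolation, Lambert weight
        NdotV = float(np.dot(world_normal, view_dir))
        return lerp(fc_color, V, NdotV * NdotV)  # second interpolation, Fresnel-style weight

    # Example: light from 45 degrees above, camera slightly off-axis.
    print(jelly_highlight(S=np.array([1.0, 0.5, 0.5]), F=np.array([1.0, 0.8, 0.6]),
                          V=np.array([0.9, 0.5, 0.7]),
                          world_normal=np.array([0.0, 0.0, 1.0]),
                          light_dir=np.array([0.0, 0.7071, 0.7071]),
                          view_dir=np.array([0.0, 0.4472, 0.8944])))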
Step 403, inputting the mixed highlight color into the illumination model for rendering, and displaying the lip makeup special effect of the virtual character, wherein the lip makeup special effect has a jelly texture.
In one example, the second mixed highlight color and the other input parameters are input into the illumination model for rendering, displaying the lip special effect of the virtual character.
In one example, to mask indirect light (indirect diffuse reflection + indirect ambient highlights), an AO (ambient occlusion) parameter may also be introduced, filtering the indirect ambient highlights. The AO parameter may be stored in the B channel of the normal map.
Illustratively, the other input parameters include at least one of: the diffuse reflection color, the first normal vector, and the roughness.
In summary, in the method provided by this embodiment, a first linear interpolation of the basic highlight color and the main-light-source-direction dyeing is performed according to the main light source direction to obtain the first mixed highlight color; a second linear interpolation of the first mixed highlight color and the sight-direction dyeing is then performed according to the sight direction to obtain the second mixed highlight color. Through highlight color mixing in two directions, the method realistically simulates how the color and transparency of a semi-transparent lip makeup change with the observation angle, achieving a fairly real jelly-lip effect. And although this two-direction highlight mixing has no strict physical interpretation, it is an implementation with very little computational overhead, so the rendering cost can be effectively reduced.
In an alternative embodiment based on fig. 4, the calculation of the "diffuse reflection color" in the other input parameters shown in fig. 6 comprises the following steps, as shown in fig. 7:
Step 601, acquiring a first color value corresponding to a lip pixel point of the virtual character in the diffuse reflection map;
There are four channels in the diffuse reflection map: the R (red), G (green), B (blue) and A (alpha/transparency) channels. The RGB channels store the first color value of each lip pixel point.
Step 602, mixing the first color value with a diffuse reflection dyeing value to obtain a second color value;
The diffuse reflection dyeing value is user-defined; the first color value and the diffuse reflection dyeing value are mixed to obtain the second color value expected by the user. The second color value is the specific base color of the lip makeup.
Step 603, calculating the second color value according to the refraction characteristic corresponding to the metalness to obtain the diffuse reflection color of the lip pixel point.
The metalness is a parameter for influencing the ratio of specular to diffuse reflection. Metalness strongly affects how different lipstick textures come across: the metalness of a common organic material is 0, the diffuse proportion is large, the highlight has a reflectivity of 0.02-0.04, and the highlight color is gray; when the metalness is 1, the diffuse proportion is 0, the highlight reflectivity is greatly increased, and the highlight can take on various colors. Metalness is a basic parameter of physically based rendering and is controlled numerically, so it offers a larger adjustment space than the diffuse reflection map alone.
Alternatively, the lips may appear dull because the diffuse color approaches black when the metalness is too high. This embodiment also lets artists and users decide whether to lighten the lip color through the metal degree compensation color. When a metal degree compensation color exists, the metalness is compensated according to the compensation value to obtain the compensated metalness, and the second color value is calculated according to the refraction characteristic corresponding to the compensated metalness to obtain the diffuse reflection color of the lip pixel point.
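
A sketch of this diffuse-color computation, assuming the dyeing is a per-channel multiply and reading the "refraction characteristic" as the standard physically based rule that the diffuse term of a metal tends toward black (both assumptions, since the text does not give the exact formula):

    import numpy as np

    def diffuse_color(albedo, diffuse_tint, metalness, metal_compensation=0.0):
        base = np.asarray(albedo) * np.asarray(diffuse_tint)   # step 602: apply dyeing
        m = np.clip(metalness - metal_compensation, 0.0, 1.0)  # compensated metal degree
        return base * (1.0 - m)                                # step 603: metal diffuse goes dark

    # Example: a red diffuse map dyed toward blue at moderate metalness.
    print(diffuse_color([0.8, 0.1, 0.1], [0.3, 0.3, 1.0],
                        metalness=0.5, metal_compensation=0.2))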
In summary, in the method provided in this embodiment, by providing the customized diffuse reflection dyeing value (and the metal degree compensation color), the user can customize the basic color of the lip makeup special effect, so as to implement red lip makeup, green lip makeup, purple lip makeup, blue lip makeup, black lip makeup, and the like.
In an alternative embodiment based on fig. 4, there are four channels in the normal map: the R (red), G (green), B (blue) and A (alpha/transparency) channels. The R channel stores the first normal component X, the G channel stores the second normal component Y, and the B channel stores the third normal component Z; the first normal vector of each lip pixel point can be read directly from the RGB channels of the normal map.
In an alternative embodiment based on fig. 4, the normal map only needs to store the first and second normal components, i.e. the R channel stores the first normal component X and the G channel stores the second normal component Y. The first and second normal components stored in the normal map are read for the lip pixel point; the third normal component is calculated from them according to the Pythagorean theorem; and the three components together are determined as the first normal vector of the lip pixel point.
Referring to fig. 6 in combination, the third normal component Z (corresponding to the original B channel) is calculated as follows:
B=sqrt(1-dot(RG,RG))。
wherein sqrt represents the square root computation and dot represents the dot product operation.
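
In code, the reconstruction is a direct application of the Pythagorean identity for a unit normal, x^2 + y^2 + z^2 = 1. A minimal sketch, assuming the channel values have already been remapped from [0, 1] to [-1, 1]:

    import numpy as np

    def unpack_normal(r, g):
        rg = np.array([r, g])
        z = np.sqrt(max(0.0, 1.0 - float(np.dot(rg, rg))))  # B = sqrt(1 - dot(RG, RG))
        return np.array([r, g, z])

    print(unpack_normal(0.3, 0.4))  # -> [0.3, 0.4, ~0.866]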
In summary, in the method provided in this embodiment, only the first normal component and the second normal component are stored in the normal map, the channel B in the normal map can be used to store other data, and only four channels of one normal map are needed to store data that needs to be stored in two maps (the first normal component, the second normal component, the third normal component, the roughness, and the AO) in the related art, thereby effectively saving data amount.
In an alternative embodiment based on fig. 4, the roughness of each lip pixel point is calculated as follows:
In this embodiment, the initial roughness is stored in the A (alpha) channel of the normal map. When the user further customizes the roughness coefficient, the terminal reads the initial roughness of the lip pixel point stored in the normal map, and multiplies the initial roughness by the custom roughness coefficient to obtain the roughness.
Paillette particle special effect of lip makeup
Input map resources: a diffuse reflection map, a normal map and a paillette map (newly added);
Input generation parameters: basic highlight color, diffuse reflection dyeing, metalness, roughness, and metalness compensation color (optional).
As can be seen from fig. 3, the illumination input parameters of a lip pixel point include at least four parameters: the diffuse reflection color, highlight color, normal vector and roughness of each lip pixel point. The paillette particle effect is mainly related to the highlight color and the normal vector. Based on the embodiment shown in fig. 2, fig. 8 is a flowchart illustrating a lip makeup special effect display method according to another exemplary embodiment of the present application. The method may be performed by the terminal shown in fig. 1. The method comprises the following steps:
Step 801, acquiring the first highlight color of a lip pixel point of the virtual character;
the first highlight color may be user-defined.
Step 802, acquiring a random color value when the lip pixel point belongs to a paillette area in a paillette map, wherein the paillette map is a map used for indicating the paillette areas and non-paillette areas of the lips of the virtual character;
the highlight map is map material indicating highlight areas and non-highlight areas of the lips of the virtual character. Fig. 9 shows a schematic view of a spangle map. The paillette map has a plurality of paillette areas in the foreground and non-paillette areas in the background.
In one example, there are four channels in the highlight map: r (red) channel, G (green) channel, B (blue) channel, a (transparency channel). The R channel stores the first paillette normal component X, G channel stores the second paillette normal component Y, B channel stores the random gray value, and the a channel stores the mask value. The random gray value is a channel value used to indicate a random color value.
In another example, there are four channels in the highlight map: r (red) channel, G (green) channel, B (blue) channel, a (transparency channel). The R channel stores the first paillette normal component X, G channel stores the second paillette normal component Y, B channel stores the third paillette normal component Z, and the a channel stores the mask value. The random gray value of each pixel point is stored in other positions.
Step 803, mixing the first highlight color and the random color value of the lip pixel point belonging to the paillette area to obtain a mixed second highlight color;
Step 804, inputting the first highlight color of the lip pixel points belonging to the non-paillette area and the second highlight color of the lip pixel points belonging to the paillette area into an illumination model for rendering, and displaying the lip special effect of the virtual character, wherein the lip special effect has a paillette particle texture.
In summary, in the method provided by this embodiment, the first highlight color of a lip pixel point of the virtual character is acquired; a random color value is acquired when the lip pixel point belongs to a paillette area in the paillette map; the first highlight color and the random color value of the lip pixel points belonging to the paillette area are mixed to obtain the mixed second highlight color; and the first highlight color of the lip pixel points belonging to the non-paillette area and the second highlight color of the lip pixel points belonging to the paillette area are input into the illumination model for rendering, displaying the lip special effect of the virtual character with a paillette particle texture. This realizes a realistic simulation of the paillette effect of lip gloss, in particular of the paillette particles flashing in many colors in a paillette lip gloss.
FIG. 10 is a flowchart illustrating a method for displaying a lip makeup special effect according to another exemplary embodiment of the present application. The method may be performed by the terminal shown in fig. 1. With respect to fig. 8, step 802 may alternatively be implemented as steps 8021 and 8022, step 803 as steps 8031 and 8032, and step 804 as step 8041. The method comprises:
Step 801, acquiring the first highlight color of a lip pixel point of the virtual character;
When the lip special effect does not include the jelly texture, the basic highlight color of the lip pixel point of the virtual character is determined as the first highlight color. The basic highlight color may be user-defined.
When the lip effect includes the jelly texture, the second mixed highlight color calculated by the steps 4021 and 4022 shown in fig. 5 is determined as the first highlight color.
Step 8021, reading the random gray value of the lip pixel point in the paillette map;
Referring to fig. 11, there are four channels in the paillette map: the R (red), G (green), B (blue) and A (alpha/transparency) channels. The R channel stores the first paillette normal component X, the G channel stores the second paillette normal component Y, the B channel stores the random gray value, and the A channel stores the mask value.
Step 8022, converting the random gray value from a first color gamut to a second color gamut to obtain the random color value of the lip pixel point.
The application program acquires the random gray value from the paillette map and converts it from the first color gamut to the second color gamut to obtain the random color value. In this embodiment, the first color gamut may be the HSV (hue, saturation, value) color gamut, and the second color gamut is the RGB color gamut.
Optionally, when the lip pixel point belongs to a non-paillette area, the mask value is 0; when it belongs to a paillette area, the mask value is 1. That is, when the mask value of a lip pixel point is 0, the random color value is not displayed and the basic highlight color is displayed; when the mask value is 1, the mixed color of the basic highlight color and the random color value is displayed.
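
Under the HSV reading above, the conversion can be sketched as treating the stored gray value as a hue and expanding it to RGB, gated by the mask value; the saturation and value constants are assumptions:

    import colorsys

    def random_paillette_color(gray, mask, saturation=1.0, value=1.0):
        # mask == 0: non-paillette pixel, the basic highlight color is used unmixed.
        if mask == 0:
            return None
        return colorsys.hsv_to_rgb(gray, saturation, value)  # hue -> RGB

    print(random_paillette_color(0.66, mask=1))  # a bluish sparkle color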
Step 8031, obtaining color shift parameters;
the color shift parameter is a parameter for influencing the color of the paillette particle when reflecting light. The color shift parameters may also be user-defined.
Step 8032, mixing the random color value with the first highlight color according to the color shift parameter to obtain a mixed second highlight color;
the application program obtains the basic highlight color customized or preset by the user. And for each lip pixel point, mixing the random color value with the basic highlight color to obtain a mixed highlight color.
Step 8041, inputting the first highlight color of the lip pixel point belonging to the non-paillette area, the second highlight color of the lip pixel point belonging to the paillette area and other input parameters into the illumination model for rendering, and displaying the lip special effect of the virtual character.
Illustratively, the other input parameters include: the diffuse reflection color, the second normal vector, and the roughness.
In summary, in the method provided in this embodiment, the random color value indicated by the paillette map is mixed with the highlight color, so that lip pixel points in the paillette area exhibit a colored-reflection characteristic; the visual effect of many paillette particles in lip makeup can thus be realistically simulated, achieving a realistic paillette lip makeup effect.
By providing a user-defined color shift parameter, the method of this embodiment enables the user to customize the flashing color of the paillettes, offering a higher degree of customization.
In an alternative embodiment based on fig. 10, the diffuse reflection color of each lip pixel point is calculated by the steps shown in fig. 7, which are not described again here.
In an alternative embodiment based on fig. 10, the second normal vector of each lip pixel point is calculated as follows, as shown in fig. 12:
Step 1201, reading the first normal vector stored for the lip pixel point in the normal map;
Illustratively, the normal map only needs to store the first normal component and the second normal component, i.e., the R channel stores the first normal component X and the G channel stores the second normal component Y. The first normal component and the second normal component stored for the lip pixel point are read from the normal map; a third normal component is calculated from the first and second normal components according to the Pythagorean theorem; and the first, second, and third normal components are determined as the first normal vector of the lip pixel point.
The calculation formula of the third normal component Z (corresponding to the original B channel) is as follows:
B=sqrt(1-dot(RG,RG))。
wherein sqrt represents the square root computation and dot represents the dot product operation.
Step 1202, reading the paillette normal vector stored for the lip pixel point in the paillette map;
For example, the paillette map only needs to store the first paillette normal component and the second paillette normal component, i.e., the R channel stores the first paillette normal component X and the G channel stores the second paillette normal component Y. The first paillette normal component and the second paillette normal component stored for the lip pixel point are read from the paillette map; a third paillette normal component is calculated from them according to the Pythagorean theorem; and the first, second, and third paillette normal components are determined as the paillette normal vector of the lip pixel point.
The formula for the third paillette normal component Z (corresponding to the original B channel) is as follows:
B=sqrt(1-dot(RG,RG))。
wherein sqrt represents the square root computation and dot represents the dot product operation.
Step 1203, merging the first normal vector and the paillette normal vector to obtain a second normal vector.
Here, merging means superposing the XY (RG) channels of the two normals and multiplying their Z (B) channels.
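The reconstruction and merging steps might be sketched as follows; remapping the stored 0-1 channel values to the -1 to 1 normal range and renormalizing the merged result are assumptions, since the embodiment only gives the Pythagorean formula and the "superpose XY, multiply Z" rule:

    import math

    def unpack_normal(r, g):
        # Remap stored 0-1 channel values to -1..1 (assumed convention), then
        # recover Z via the Pythagorean theorem: B = sqrt(1 - dot(RG, RG)).
        x, y = r * 2.0 - 1.0, g * 2.0 - 1.0
        z = math.sqrt(max(0.0, 1.0 - (x * x + y * y)))
        return (x, y, z)

    def merge_normals(base, paillette):
        # Superpose the XY channels, multiply the Z channels, then renormalize.
        x = base[0] + paillette[0]
        y = base[1] + paillette[1]
        z = base[2] * paillette[2]
        length = math.sqrt(x * x + y * y + z * z)
        return (x / length, y / length, z / length)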
In summary, in the method provided in this embodiment, only the first normal component and the second normal component are stored in the normal map, so the B channel of the normal map can be used to store other data; a single normal map can thus hold the data that required two maps in the related art (the first, second, and third normal components, the roughness, and AO), effectively saving data volume.
Likewise, only the first paillette normal component and the second paillette normal component are stored in the paillette map, so the B channel of the paillette map can be used to store other data, again effectively saving data volume.
In an alternative embodiment based on fig. 10, the roughness of each lip pixel point is calculated as follows:
In this embodiment, the initial roughness is stored in the A (Alpha) channel of the normal map. When the user further defines a roughness coefficient, the terminal reads the initial roughness of the lip pixel point stored in the normal map and multiplies it by the user-defined roughness coefficient to obtain the roughness.
In an alternative embodiment based on fig. 10, the paillette particle size in the paillette map is adjustable; the corresponding parameter is the tiling parameter. The method further comprises: acquiring the tiling parameter; and applying the paillette map material to the model surface of the lip of the virtual character according to the tiling parameter to obtain the paillette map of the lip of the virtual character. The paillette map material may be the material shown in fig. 9; the model surface of the lip of the virtual character may be the UV surface of the three-dimensional model, and the paillette map material can be tiled multiple times over different positions of the UV surface according to the tiling parameter, thereby enlarging or shrinking the paillette particles in the final paillette map.
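A sketch of the tiling step under the usual wrap-around UV convention (an assumption; the tile_uv helper is purely illustrative):

    def tile_uv(u, v, tiling):
        # Repeat the paillette material 'tiling' times across the UV surface;
        # a larger tiling parameter yields smaller, denser paillette particles.
        return ((u * tiling) % 1.0, (v * tiling) % 1.0)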
The combined "jelly and paillette" special effect for lip makeup:
In an exemplary embodiment, with reference to fig. 13, the illumination model is a microfacet reflection (GGX) illumination model. For each pixel point to be rendered, the inputs to the GGX illumination model are: the diffuse reflection color, the combined highlight color, the roughness, and the combined normal vector.
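For orientation only, the normal distribution term at the heart of a GGX illumination model can be sketched as below; the roughness-squared remapping is a common PBR convention and an assumption here, not a detail given by this embodiment:

    import math

    def d_ggx(n_dot_h, roughness):
        # GGX (Trowbridge-Reitz) normal distribution: the density of microfacets
        # aligned with the half vector h, which drives the highlight shape.
        alpha = roughness * roughness          # common remapping (assumption)
        a2 = alpha * alpha
        denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
        return a2 / (math.pi * denom * denom)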
Calculation of the diffuse reflection color:
The diffuse reflection map stores the RGBA of each pixel point, where R is the red component, G is the green component, B is the blue component, and A is a mask value. A first color value of the pixel point is obtained from the diffuse reflection map, and the user-defined diffuse reflection dyeing value is superposed on the first color value to obtain a second color value. When no user-defined metal degree compensation value exists, the diffuse reflection color is calculated from the second color value according to the refraction characteristic corresponding to the metal degree; when a user-defined metal degree compensation value exists, the compensated metal degree is first calculated from the compensation value and the metal degree, and the diffuse reflection color is then calculated according to the refraction characteristic corresponding to the compensated metal degree. The diffuse reflection color of the pixel point is input into the GGX illumination model.
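As a hedged sketch of the "refraction characteristic corresponding to the metal degree": in common physically based shading conventions, a metallic surface contributes almost no diffuse reflection, so the diffuse term can be attenuated by the metal degree; the exact mapping used by this embodiment is not specified:

    def diffuse_color(second_color, metal_degree):
        # Metals contribute almost no diffuse reflection, so the second color
        # value is attenuated as the metal degree rises (assumed PBR convention).
        return [c * (1.0 - metal_degree) for c in second_color]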
Calculation of the highlight color:
Case one: the pixel point is not a paillette pixel point.
1. A first linear interpolation is performed on the basic highlight color S and the main-light-source-direction dyeing F according to the main light source direction, and the first mixed highlight color FCcolor is calculated.
Optionally, the first linear interpolation uses a gray value calculated by the Lambert lighting formula as the weight. The calculation formulas are as follows:
FCcolor=Lerp1(F,S,Nol);
Nol=NdotL*NdotL;
NdotL=dot(worldNormal,lightDir)。
where worldNormal is the world-space normal and lightDir is the main light source direction; Nol is the square of NdotL; dot is the dot product operation; and Lerp1 is the first linear interpolation.
2. A second linear interpolation is performed on the first mixed highlight color FCcolor and the sight-line-direction dyeing V according to the sight line direction, and the second mixed highlight color is calculated.
Optionally, the second linear interpolation uses a gray value calculated by the Fresnel illumination formula as the weight. The calculation formulas are as follows:
color=Lerp2(FCcolor,V,Nov);
Nov=NdotV*NdotV;
NdotV=dot(worldNormal,viewDir)。
where worldNormal is the world-space normal and viewDir is the sight line direction; Nov is the square of NdotV; dot is the dot product operation; and Lerp2 is the second linear interpolation.
When the pixel point is not a paillette pixel point, the second mixed highlight color is determined as the combined highlight color.
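Putting the two interpolation formulas together, the non-paillette highlight path can be sketched as follows (colors and vectors as 3-element sequences; the helper names are illustrative):

    def dot3(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def lerp(a, b, t):
        # Linear interpolation: t = 0 returns a, t = 1 returns b.
        return [x + (y - x) * t for x, y in zip(a, b)]

    def combined_highlight(S, F, V, world_normal, light_dir, view_dir):
        nol = dot3(world_normal, light_dir) ** 2   # Lambert-style weight
        fccolor = lerp(F, S, nol)                  # first mixed highlight color
        nov = dot3(world_normal, view_dir) ** 2    # Fresnel-style weight
        return lerp(fccolor, V, nov)               # second mixed highlight color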
Case two: the pixel point is a paillette pixel point.
The B channel of the paillette map stores the random gray value of each paillette pixel point, with a value range of 0-1. The random gray value of the pixel point is converted into a random color value in an HSV-to-RGB manner, and the random color value is then mixed with the second mixed highlight color according to the color shift parameter to obtain the combined highlight color.
The calculation process of the normal vector is as follows:
Case one: no paillette special effect exists.
If the RGB channels of the normal map store the three components of the normal vector respectively, the normal vector indicated by the normal map is determined as the normal vector of the pixel point.
If the RG channels of the normal map store two components of the map normal vector respectively, the third component B is calculated from the two components RG according to the Pythagorean theorem: B = sqrt(1 - dot(RG, RG)), where sqrt represents the square root operation and dot represents the dot product. The three RGB components thus obtained are determined as the normal vector of the pixel point.
Case two: the paillette special effect exists.
The RG channels of the paillette map store two components of the paillette normal vector respectively, and the third component B is calculated from the two components RG according to the Pythagorean theorem: B = sqrt(1 - dot(RG, RG)), where sqrt represents the square root operation and dot represents the dot product operation. The three RGB components thus obtained are merged with the map normal vector to obtain the normal vector of the pixel point. Merging refers to superposing the XY (RG) channels of the normals and multiplying their Z (B) channels, which makes the normal deflection look natural. In addition, to stay within the intended random range, the gray value of the normal offset in the normal map in this embodiment is limited to 0.25-0.75 (based on the value range 0-1; the range 0-255 can be mapped to 0-1).
Roughness:
In this embodiment, the initial roughness is stored in the A (Alpha) channel of the normal map. When the user further defines a roughness coefficient, the terminal reads the initial roughness of the lip pixel point stored in the normal map and multiplies it by the user-defined roughness coefficient to obtain the roughness.
On the basis of the above embodiments, the process by which the user customizes the lip makeup special effect is as follows, as shown in fig. 14:
Step 1401, displaying a lip makeup setting interface corresponding to the virtual character, where the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
The generation parameters of the lip makeup special effect include, but are not limited to: diffuse reflection dyeing, metal degree, roughness coefficient, metal degree compensation color, basic highlight color, main-light-source-direction dyeing, sight-line-direction dyeing, tiling coefficient, and color shift parameter.
The lip makeup setting interface displays an integrated setting item corresponding to at least two generation parameters or a single setting item corresponding to a single generation parameter.
Referring schematically to fig. 15, each fashion color in fig. 15 corresponds to a set of generation parameters of the lip makeup special effect. The user only needs to select a fashion color to select a lip makeup style, which determines a whole set of generation parameters of the lip makeup special effect.
Referring schematically to fig. 16, "full color gamut" in fig. 16 corresponds to "diffuse reflection dyeing", the "gloss" setting item corresponds to "roughness", "spangle intensity" corresponds to "metal degree", and "spangle particle size" corresponds to "tiling coefficient". The user may customize each single generation parameter.
Step 1402, in response to a setting operation, customizing the generation parameters of the lip makeup special effect, the generation parameters including: at least one of the basic highlight color, the main-light-source-direction dyeing, and the sight-line-direction dyeing of the lip pixel points of the virtual character;
The basic highlight color, the main-light-source-direction dyeing, and the sight-line-direction dyeing of the lip pixel points of the virtual character are the generation parameters related to the jelly texture.
Other generation parameters (and the map material) may also be customized by the user.
Step 1403, displaying a preview picture of the lip makeup special effect of the virtual character, where the lip makeup special effect has a jelly texture.
The terminal renders the lip makeup special effect according to the manner shown in the above embodiment, and displays a preview picture of the lip makeup special effect on a lip makeup setting interface or another user interface.
On the basis of the above embodiments, the process by which the user customizes the lip makeup special effect is as follows, as shown in fig. 17:
Step 1701, displaying a lip makeup setting interface corresponding to the virtual character, where the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
The generation parameters of the lip makeup special effect include, but are not limited to: diffuse reflection dyeing, metal degree, roughness coefficient, metal degree compensation color, basic highlight color, main-light-source-direction dyeing, sight-line-direction dyeing, tiling coefficient, and color shift parameter.
The lip makeup setting interface displays an integrated setting item corresponding to at least two generation parameters or a single setting item corresponding to a single generation parameter.
Referring schematically to fig. 15, each fashion color in fig. 15 corresponds to a set of generation parameters of the lip makeup special effect. The user only needs to select a fashion color to select a lip makeup style, which determines a whole set of generation parameters of the lip makeup special effect.
Referring schematically to fig. 16, "full color gamut" in fig. 16 corresponds to "diffuse reflection dyeing", the "gloss" setting item corresponds to "roughness", "spangle intensity" corresponds to "metal degree", and "spangle particle size" corresponds to "tiling coefficient". The user may customize each single generation parameter.
Step 1702, in response to the setting operation, customizing the generation parameters of the lip makeup special effect, where the generation parameters include: at least one of the basic highlight color of the lip pixel points of the virtual character and the map parameters of the paillette map;
The basic highlight color of the lip pixel points of the virtual character and the map parameters of the paillette map are the generation parameters related to the paillette texture. The map parameters of the paillette map include: whether the paillette map is enabled, the tiling parameter, and the color shift parameter.
Other generation parameters (and the map material) may also be customized by the user.
Step 1703, displaying a preview picture of the lip makeup special effect of the virtual character, where the lip makeup special effect has a paillette particle texture.
The embodiments of the present application make it possible for different virtual characters to have different lip makeup special effects, as shown in fig. 18, and for the same virtual character to have different lip makeup special effects, as shown in fig. 19. Moreover, the lip makeup effect rendered by the embodiments of the present application closely resembles real-world lip makeup in photographs; the comparison is shown in fig. 20.
Fig. 21 is a block diagram of a lip makeup effect display device according to an exemplary embodiment of the present application.
The device includes:
an obtaining module 2120, configured to obtain the generation parameters of the lip makeup special effect of the virtual character, where the generation parameters include: the basic highlight color, the main light source direction dyeing, and the sight line direction dyeing of the lip pixel points of the virtual character;
a mixing module 2140, configured to mix the basic highlight color, the main light source direction dyeing, and the sight line direction dyeing according to their respective directions to obtain a mixed highlight color;
a display module 2160, configured to input the mixed highlight color into a lighting model for rendering and display the lip makeup special effect of the virtual character, where the lip makeup special effect has a jelly texture.
In an optional embodiment, the mixing module 2140 is configured to perform a first linear interpolation on the basic highlight color and the primary light source direction dyeing according to a primary light source direction, and calculate a first mixed highlight color; performing second linear interpolation on the first mixed highlight color and the sight line direction dyeing according to the sight line direction, and calculating to obtain a second mixed highlight color;
the main light source direction is the direction in which the main light source in the virtual environment where the virtual character is located illuminates the lip pixel point, and the sight line direction is the direction in which a camera model in the virtual environment faces the lip pixel point.
In an optional embodiment, the first linear interpolation uses a gray value calculated by a lambert illumination formula as a weight; and the second linear interpolation adopts a gray value calculated by a Fresnel illumination formula as weight.
In an optional embodiment, the display module 2160 is configured to input the second mixed highlight color and other input parameters into the lighting model for rendering, and display the lip effect of the virtual character;
wherein the other input parameters further include: at least one of diffuse reflectance color, first normal vector, roughness.
In an optional embodiment, the apparatus further comprises:
the diffuse reflection calculation module 2182 is configured to obtain a first color value corresponding to the lip pixel point of the virtual character in a diffuse reflection map; mixing the first color value with the diffuse reflection dyeing value to obtain a second color value; and calculating the second color value according to the refraction characteristic corresponding to the metal degree to obtain the diffuse reflection color of the lip pixel point.
In an optional embodiment, the obtaining module 2120 is further configured to obtain a metal degree compensation value;
the diffuse reflection calculation module 2182 is configured to compensate the metal degree according to the metal degree compensation value to obtain a compensated metal degree; and calculating the second color value according to the refraction characteristic corresponding to the compensated metal degree to obtain the diffuse reflection color of the lip pixel point.
In an optional embodiment, the apparatus further comprises:
a normal calculation module 2184, configured to read the first normal component and the second normal component stored for the lip pixel point of the virtual character in the normal map; calculate a third normal component from the first normal component and the second normal component according to the Pythagorean theorem; and determine the first normal component, the second normal component, and the third normal component as the first normal vector.
In an optional embodiment, the apparatus further comprises:
a roughness calculating module 2186, configured to read an initial roughness stored in the normal map by the lip pixel point; and multiplying the initial roughness by the self-defined roughness coefficient to obtain the roughness.
In another exemplary embodiment, the present application further provides a lip makeup effect display device, including: a display module and an interaction module;
the display module is used for displaying a lip makeup setting interface corresponding to the virtual character, and the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
the interaction module is used for responding to setting operation and customizing the generation parameters of the lip makeup special effect, and the generation parameters comprise: at least one of a basic highlight color, a main light source direction coloring and a sight line direction coloring of lip pixel points of the virtual character;
the display module is used for displaying a preview picture of the lip makeup special effect of the virtual character, and the lip makeup special effect has jelly texture. Wherein the lip makeup special effect can be calculated by each module shown in fig. 21.
FIG. 22 illustrates a block diagram of a lip makeup effect display device according to an exemplary embodiment of the present application. The device includes:
an obtaining module 2220, configured to obtain a first highlight color of a lip pixel of a virtual character;
the obtaining module 2220 is further configured to obtain a random color value when the lip pixel point belongs to a paillette area in a paillette map, where the paillette map is used for indicating the paillette area and the non-paillette area of the lip of the virtual character;
a mixing module 2240, configured to mix the first highlight color and the random color value of the lip pixel belonging to the paillette area, to obtain a mixed second highlight color;
the display module 2260 is configured to input the first highlight color of the lip pixel point belonging to the non-paillette area and the second highlight color of the lip pixel point belonging to the paillette area into an illumination model for rendering, so as to display the lip special effect of the virtual character, where the lip special effect has paillette grain texture.
In an optional embodiment, the obtaining module 2220 is further configured to obtain a color offset parameter;
the mixing module 2240 is further configured to mix the random color value with the first highlight color according to the color offset parameter, so as to obtain a mixed second highlight color.
In an optional embodiment, the obtaining module 2220 is further configured to read a random channel value of the lip pixel point in the paillette map; and converting the random channel value from a first color gamut to a second color gamut to obtain a random color value of the lip pixel point.
In an optional embodiment, the obtaining module 2220 is further configured to determine a base highlight color of the lip pixel of the virtual character as the first highlight color.
In an optional embodiment, the obtaining module 2220 is further configured to obtain a basic highlight color, a primary light source direction coloring and a sight line direction coloring of the lip pixel point of the virtual character; performing first linear interpolation on the basic highlight color and the dyeing in the main light source direction according to the main light source direction, and calculating to obtain a first mixed highlight color; performing second linear interpolation on the first mixed highlight color and the sight line direction dyeing according to the sight line direction, and calculating to obtain a second mixed highlight color; determining the second mixed highlight color as the first highlight color;
the main light source direction is the direction in which the main light source in the virtual environment where the virtual character is located illuminates the lip pixel point, and the sight line direction is the direction in which a camera model in the virtual environment faces the lip pixel point.
In an optional embodiment, the first linear interpolation uses a gray value calculated by a lambert illumination formula as a weight; and the second linear interpolation adopts a gray value calculated by a Fresnel illumination formula as weight.
In an optional embodiment, the display module 2260 is configured to input the first highlight color of the lip pixel points belonging to the non-paillette area, the second highlight color of the lip pixel points belonging to the paillette area, and the other input parameters into the illumination model for rendering, so as to display the lip special effect of the virtual character;
wherein the other input parameters further include: at least one of diffuse reflectance color, second normal vector, roughness.
In an optional embodiment, the apparatus further comprises:
the diffuse reflection calculation module 2282 is configured to obtain a first color value, in the diffuse reflection map, of the lip pixel point of the virtual character; mixing the first color value with the diffuse reflection dyeing value to obtain a second color value; and calculating the second color value according to the refraction characteristic corresponding to the metal degree to obtain the diffuse reflection color of the lip pixel point.
In an optional embodiment, the obtaining module 2220 is further configured to obtain a metal degree compensation value;
the diffuse reflection calculation module 2282 is configured to compensate the metal degree according to the metal degree compensation value, so as to obtain a compensated metal degree; and calculating the second color value according to the refraction characteristic corresponding to the compensated metal degree to obtain the diffuse reflection color of the lip pixel point.
In an optional embodiment, the apparatus further comprises:
a normal calculation module 2284, configured to read the first normal vector stored for the lip pixel point in the normal map; read the paillette normal vector stored for the lip pixel point in the paillette map; and merge the first normal vector and the paillette normal vector to obtain the second normal vector.
In an optional embodiment, the apparatus further comprises:
a normal calculation module 2284, configured to read the first normal component and the second normal component stored for the lip pixel point in the normal map; calculate a third normal component from the first normal component and the second normal component according to the Pythagorean theorem; and determine the first normal component, the second normal component, and the third normal component as the first normal vector of the lip pixel point;
in an optional embodiment, the apparatus further comprises:
a normal calculation module 2284, configured to read a first paillette normal component and a second paillette normal component stored in the paillette map by the lip pixel point; calculating according to the first paillette normal component and the second paillette normal component and the Pythagorean theorem to obtain a third paillette normal component; and determining the first paillette normal component, the second paillette normal component and the third paillette normal component as paillette normal vectors of the lip pixel points.
In an optional embodiment, the apparatus further comprises:
the roughness calculating module 2286 is configured to read an initial roughness stored in the normal map by the lip pixel; and multiplying the initial roughness by the self-defined roughness coefficient to obtain the roughness.
In an optional embodiment, the apparatus further comprises:
the obtaining module 2220 is further configured to obtain a tiling parameter;
a tiling module 2288, configured to tile the paillette map material onto the model surface of the lip of the virtual character according to the tiling parameter, so as to obtain the paillette map of the lip of the virtual character.
In another embodiment, the present application provides a lip makeup effect display device comprising: a display module and an interaction module;
the display module is used for displaying a lip makeup setting interface corresponding to the virtual character, and the lip makeup setting interface is used for customizing the generation parameters of the lip makeup special effect;
the interaction module is used for responding to a setting operation and customizing the generation parameters of the lip makeup special effect, the generation parameters including: at least one of the basic highlight color of the lip pixel points of the virtual character and the map parameters of the paillette map;
the display module is used for displaying a preview picture of the lip makeup special effect of the virtual character, and the lip makeup special effect has paillette grain texture. Wherein the lip makeup effect can be calculated by the modules shown in fig. 22.
Fig. 23 shows a block diagram of a computer device 2300 according to an exemplary embodiment of the present application. The computer device 2300 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player. The computer device 2300 may also be referred to by other names such as user equipment or portable terminal.
Generally, computer device 2300 includes: a processor 2301 and a memory 2302.
The processor 2301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 2301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 2301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 2301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 2301 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 2302 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 2302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 2302 is used to store at least one instruction for execution by processor 2301 to implement the lip makeup special effects display methods provided herein.
In some embodiments, computer device 2300 may also optionally include: a peripheral interface 2303 and at least one peripheral. Specifically, the peripheral device includes: at least one of a radio frequency circuit 2304, a touch display 2305, a camera 2306, an audio circuit 2307, a positioning component 2308, and a power supply 2309.
The peripheral interface 2303 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 2301 and the memory 2302. In some embodiments, the processor 2301, memory 2302, and peripheral interface 2303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2301, the memory 2302, and the peripheral device interface 2303 can be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 2304 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 2304 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 2304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 2304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 2304 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 2304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 2305 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. The touch display 2305 can also capture touch signals on or over its surface, which may be input to the processor 2301 as control signals for processing. The touch display 2305 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one touch display 2305, providing the front panel of the computer device 2300; in other embodiments, there may be at least two touch displays 2305, each disposed on a different surface of the computer device 2300 or in a folded design; in still other embodiments, the touch display 2305 may be a flexible display disposed on a curved or folded surface of the computer device 2300. The touch display 2305 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The touch display 2305 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 2306 is used to capture images or video. Optionally, camera assembly 2306 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 2306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 2307 is used to provide an audio interface between a user and the computer device 2300. The audio circuit 2307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 2301 for processing or inputting the electric signals into the radio frequency circuit 2304 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location on computer device 2300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 2301 or the radio frequency circuit 2304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 2307 may also include a headphone jack.
The location component 2308 is used to locate the current geographic location of the computer device 2300 to implement navigation or LBS (Location Based Service). The positioning component 2308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 2309 is used to supply power to various components in the computer device 2300. The power source 2309 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 2309 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, computer device 2300 also includes one or more sensors 2310. The one or more sensors 2310 include, but are not limited to: acceleration sensor 2311, gyro sensor 2312, pressure sensor 2313, fingerprint sensor 2314, optical sensor 2315, and proximity sensor 2316.
The acceleration sensor 2311 detects the magnitude of acceleration on the three coordinate axes of the coordinate system established with the computer device 2300. For example, the acceleration sensor 2311 may detect the components of the gravitational acceleration on the three coordinate axes. The processor 2301 may control the touch display 2305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 2311. The acceleration sensor 2311 may also be used for acquiring motion data of a game or a user.
The gyro sensor 2312 can detect the body direction and the rotation angle of the computer device 2300, and the gyro sensor 2312 can collect 3D actions of the user on the computer device 2300 together with the acceleration sensor 2311. The processor 2301 may implement the following functions according to the data collected by the gyro sensor 2312: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 2313 can be disposed on the side bezel of computer device 2300 and/or on the lower layers of touch display screen 2305. When the pressure sensor 2313 is provided on the side frame of the computer device 2300, a user's grip signal on the computer device 2300 can be detected, and left-right hand recognition or shortcut operation can be performed based on the grip signal. When the pressure sensor 2313 is arranged at the lower layer of the touch display screen 2305, the operability control on the UI interface can be controlled according to the pressure operation of the user on the touch display screen 2305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 2314 is used for collecting a fingerprint of a user to identify the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 2301 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 2314 may be provided on the front, back or side of the computer device 2300. When a physical key or vendor Logo is provided on the computer device 2300, the fingerprint sensor 2314 may be integrated with the physical key or vendor Logo.
The optical sensor 2315 is used to collect ambient light intensity. In one embodiment, the processor 2301 may control the display brightness of the touch display screen 2305 based on the ambient light intensity collected by the optical sensor 2315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 2305 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 2305 is turned down. In another embodiment, the processor 2301 may also dynamically adjust the shooting parameters of the camera assembly 2306 according to the intensity of ambient light collected by the optical sensor 2315.
A proximity sensor 2316, also known as a distance sensor, is typically provided on the front side of the computer device 2300. The proximity sensor 2316 is used to capture the distance between the user and the front of the computer device 2300. In one embodiment, when the proximity sensor 2316 detects that the distance between the user and the front surface of the computer device 2300 gradually decreases, the processor 2301 controls the touch display screen 2305 to switch from the screen-on state to the screen-off state; when the proximity sensor 2316 detects that the distance gradually increases, the processor 2301 controls the touch display screen 2305 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the architecture shown in FIG. 23 is not intended to be limiting of the computer device 2300, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
The present application further provides a computer device, comprising: a processor and a memory, where at least one instruction, at least one program, a code set, or an instruction set is stored in the memory and is loaded and executed by the processor to implement the lip makeup special effect display method provided by the above method embodiments.
The present application further provides a computer-readable storage medium having at least one instruction, at least one program, code set, or instruction set stored therein, which is loaded and executed by a processor to implement the method for displaying lip cosmetic effects provided by the above-mentioned method embodiments.
Optionally, the present application also provides a computer program product containing instructions, which when run on a computer device, causes the computer device to execute the lip makeup special effect display method provided by the above method embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (9)

1. A method for displaying a lip makeup special effect, the method comprising:
obtaining basic highlight color, main light source direction dyeing and sight line direction dyeing of lip pixel points of the virtual character;
performing first linear interpolation on the basic highlight color and the dyeing in the main light source direction according to the main light source direction, and calculating to obtain a first mixed highlight color;
performing second linear interpolation on the first mixed highlight color and the sight line direction dyeing according to the sight line direction, and calculating to obtain a second mixed highlight color;
inputting the second mixed highlight color into a lighting model for rendering, and displaying the lip making-up special effect of the virtual character, wherein the lip making-up special effect has jelly texture;
wherein the main light source direction is the direction in which the main light source in the virtual environment where the virtual character is located illuminates the lip pixel point; the sight line direction is the direction in which a camera model in the virtual environment faces the lip pixel point; the main light source direction dyeing is a dyeing based on the main light source direction; and the sight line direction dyeing is a dyeing based on the sight line direction.
2. The method according to claim 1, wherein the inputting the second mixed highlight color into a lighting model for rendering to display the lip makeup special effect with a jelly texture comprises:
inputting the second mixed highlight color and other input parameters into the illumination model for rendering, and displaying the lip special effect of the virtual character;
wherein the other input parameters further include: at least one of diffuse reflectance color, first normal vector, roughness.
3. The method of claim 2, wherein the other input parameters further include the diffuse reflectance color, the method further comprising:
acquiring a first color value corresponding to the lip pixel point of the virtual character in a diffuse reflection map;
mixing the first color value with the diffuse reflection dyeing value to obtain a second color value;
and calculating the second color value according to the refraction characteristic corresponding to the metal degree to obtain the diffuse reflection color of the lip pixel point.
4. The method of claim 3, further comprising:
acquiring a metal degree compensation value;
calculating the second color value according to the refraction characteristic corresponding to the metal degree to obtain the diffuse reflection color of the lip pixel point, and the method comprises the following steps:
compensating the metal degree according to the metal degree compensation value to obtain compensated metal degree;
and calculating the second color value according to the refraction characteristic corresponding to the compensated metal degree to obtain the diffuse reflection color of the lip pixel point.
5. The method of claim 2, wherein the other input parameters further include the first normal vector, the method further comprising:
reading a first normal component and a second normal component stored in a normal map by lip pixel points of the virtual character;
calculating according to the first normal component and the second normal component and the pythagorean theorem to obtain a third normal component;
determining the first normal component, the second normal component, and the third normal component as the first normal vector.
6. The method of claim 2, wherein the other input parameters further comprise: roughness, the method further comprising:
reading the initial roughness of the lip pixel points stored in the normal map;
and multiplying the initial roughness by the self-defined roughness coefficient to obtain the roughness.
7. A lip makeup special effect display device, the device comprising:
an obtaining module, configured to obtain a generation parameter of the lip makeup special effect of the virtual character, where the generation parameter of the lip makeup special effect includes: basic highlight color, main light source direction dyeing and sight line direction dyeing of lip pixel points of the virtual character;
a mixing module, configured to perform first linear interpolation on the basic highlight color and the main light source direction dyeing according to the main light source direction, and calculate a first mixed highlight color; and perform second linear interpolation on the first mixed highlight color and the sight line direction dyeing according to the sight line direction, and calculate a second mixed highlight color;
the display module is used for inputting the second mixed highlight color to a lighting model for rendering, and displaying the lip making-up special effect of the virtual character, wherein the lip making-up special effect has jelly texture;
wherein the main light source direction is the direction in which the main light source in the virtual environment where the virtual character is located illuminates the lip pixel point; the sight line direction is the direction in which a camera model in the virtual environment faces the lip pixel point; the main light source direction dyeing is a dyeing based on the main light source direction; and the sight line direction dyeing is a dyeing based on the sight line direction.
8. A computer device, wherein the computer device comprises a memory and a processor; the memory stores at least one program that is loaded and executed by the processor to implement the lip makeup special effect display method according to any one of claims 1 to 6.
9. A computer-readable storage medium, wherein at least one program is stored in the computer-readable storage medium, and the at least one program is loaded and executed by a processor to implement the method for displaying lip cosmetic effects according to any one of claims 1 to 6.
CN201911183742.1A 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium Active CN110992248B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911183742.1A CN110992248B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium
CN202011098254.3A CN112037123B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911183742.1A CN110992248B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202011098254.3A Division CN112037123B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110992248A CN110992248A (en) 2020-04-10
CN110992248B true CN110992248B (en) 2021-03-19

Family

ID=70087394

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201911183742.1A Active CN110992248B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium
CN202011098254.3A Active CN112037123B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202011098254.3A Active CN112037123B (en) 2019-11-27 2019-11-27 Lip makeup special effect display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (2) CN110992248B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111530085B (en) * 2020-05-06 2023-04-07 网易(杭州)网络有限公司 Game role dyeing method, device, equipment and storage medium
CN111861632B (en) * 2020-06-05 2023-06-30 北京旷视科技有限公司 Virtual makeup testing method and device, electronic equipment and readable storage medium
CN112402958B (en) * 2020-10-27 2022-05-13 腾讯科技(深圳)有限公司 Image processing method, device and storage medium
CN112419465A (en) * 2020-12-09 2021-02-26 网易(杭州)网络有限公司 Rendering method and device of virtual model
CN113096224A (en) * 2021-04-01 2021-07-09 游艺星际(北京)科技有限公司 Three-dimensional virtual image generation method and device
CN113470160B (en) * 2021-05-25 2023-08-08 北京达佳互联信息技术有限公司 Image processing method, device, electronic equipment and storage medium
CN113379885B (en) * 2021-06-22 2023-08-22 网易(杭州)网络有限公司 Virtual hair processing method and device, readable storage medium and electronic equipment
CN113361125B (en) * 2021-06-24 2022-04-29 武汉理工大学 Lip makeup simulation method and system based on double-color reflection model

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108564646A (en) * 2018-03-28 2018-09-21 腾讯科技(深圳)有限公司 Rendering intent and device, storage medium, the electronic device of object

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030156752A1 (en) * 2002-02-12 2003-08-21 Turpin Kenneth A. Color imaging and format system and methods of making and using same
JP4849761B2 (en) * 2002-09-02 2012-01-11 株式会社 資生堂 Makeup method based on texture and texture image map
FR2959853B1 (en) * 2010-05-04 2014-04-11 Vesalis IMAGE PROCESSING METHOD FOR APPLICATION OF COLOR
KR101398188B1 (en) * 2012-03-13 2014-05-30 주식회사 네오위즈블레스스튜디오 Method for providing on-line game supporting character make up and system there of
GB2518589B (en) * 2013-07-30 2019-12-11 Holition Ltd Image processing
TWI573093B (en) * 2016-06-14 2017-03-01 Asustek Comp Inc Method of establishing virtual makeup data, electronic device having method of establishing virtual makeup data and non-transitory computer readable storage medium thereof
CN107229905B (en) * 2017-05-05 2020-08-11 广州视源电子科技股份有限公司 Method and device for rendering color of lips and electronic equipment
CN109157831A (en) * 2018-08-06 2019-01-08 光锐恒宇(北京)科技有限公司 Implementation method, device, intelligent terminal and the computer readable storage medium of game
CN109949216B (en) * 2019-04-19 2022-12-02 中共中央办公厅电子科技学院(北京电子科技学院) Complex makeup transfer method based on facial analysis and illumination transfer


Also Published As

Publication number Publication date
CN110992248A (en) 2020-04-10
CN112037123B (en) 2023-08-08
CN112037123A (en) 2020-12-04

Similar Documents

Publication Publication Date Title
CN110992248B (en) Lip makeup special effect display method, device, equipment and storage medium
CN109978989B (en) Three-dimensional face model generation method, three-dimensional face model generation device, computer equipment and storage medium
CN112870707B (en) Virtual object display method in virtual scene, computer device and storage medium
CN110692089B (en) Method and system for generating shadows of inserted content
CN110064200B (en) Object construction method and device based on virtual environment and readable storage medium
CN112287852B (en) Face image processing method, face image display method, face image processing device and face image display equipment
CN109947338B (en) Image switching display method and device, electronic equipment and storage medium
CN112884874B (en) Method, device, equipment and medium for applying applique on virtual model
CN112907716B (en) Cloud rendering method, device, equipment and storage medium in virtual environment
CN112337105B (en) Virtual image generation method, device, terminal and storage medium
WO2022052620A1 (en) Image generation method and electronic device
EP3971838A1 (en) Personalized face display method and apparatus for three-dimensional character, and device and storage medium
CN112691372B (en) Virtual item display method, device, equipment and readable storage medium
CN112581571B (en) Control method and device for virtual image model, electronic equipment and storage medium
CN111696190B (en) Lighting effects from illuminated inserted content
CN111325822B (en) Method, device and equipment for displaying hot spot diagram and readable storage medium
CN112884873B (en) Method, device, equipment and medium for rendering virtual object in virtual environment
CN112950753B (en) Virtual plant display method, device, equipment and storage medium
CN113194329B (en) Live interaction method, device, terminal and storage medium
CN112634155B (en) Image processing method, device, electronic equipment and storage medium
CN114028808A (en) Virtual pet appearance editing method and device, terminal and storage medium
CN113209610B (en) Virtual scene picture display method and device, computer equipment and storage medium
CN114155336A (en) Virtual object display method and device, electronic equipment and storage medium
CN111445439A (en) Image analysis method, image analysis device, electronic device, and medium
CN116524063B (en) Illumination color calculation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022544

Country of ref document: HK

GR01 Patent grant