CN116492687A - Virtual character image processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116492687A
CN116492687A
Authority
CN
China
Prior art keywords
color
control
color adjustment
avatar
dyeing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310473647.5A
Other languages
Chinese (zh)
Inventor
刘电
谢鑫
彭皓珂
王超范
殷亚婷
化超煜
杨峰
屈禹呈
胡啸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202310473647.5A priority Critical patent/CN116492687A/en
Publication of CN116492687A publication Critical patent/CN116492687A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application is a divisional application of application No. 202011628842.3. The application discloses a method, an apparatus, a device, and a storage medium for processing a virtual character image, and belongs to the technical field of human-computer interaction. The method comprises the following steps: displaying the virtual character image on a user interface; displaying a color adjustment control of a first independent dyeing region; in response to an adjustment operation on a color parameter value of the color adjustment control, displaying the virtual character image after color adjustment of the first independent dyeing region; and, in response to an application operation on the color-adjusted virtual character image, applying the color-adjusted virtual character image when a virtual task is executed. By dividing the appearance components of the virtual character image into independent dyeing regions and then customizing the color parameter values of each independent dyeing region, a unique appearance is customized for the three-dimensional virtual character.

Description

Virtual character image processing method, device, equipment and storage medium
The present application is a divisional application of application No. 202011628842.3, filed December 31, 2020, and entitled "Method, apparatus, device, and storage medium for processing a virtual character image".
Technical Field
The present invention relates to the field of human-computer interaction, and in particular to a method, an apparatus, a device, and a storage medium for processing an avatar image.
Background
Many online games provide an image-changing function for three-dimensional virtual characters, such as customizing holiday-themed skins for festivals to change the image of a three-dimensional virtual character.
For setting the image of a three-dimensional virtual character, an online game provides various visual image options, on the basis of which the player can select, on the terminal, the skin color, hairstyle, coat, lower garment, shoes, headwear, pendants, and the like of the three-dimensional virtual character.
However, the colors of these appearance elements, such as skin color, hairstyle, coat, lower garment, shoes, headwear, and pendants, are preset: they cannot be changed and are not unique.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a device, and a storage medium for processing an avatar image, in which a unique appearance is customized for a three-dimensional avatar by dividing the appearance components of the avatar image into independent dyeing areas and then customizing color parameter values for each independent dyeing area. The technical solution is as follows:
According to one aspect of the present application, there is provided a method for processing an avatar, the method comprising:
displaying the virtual character image on a user interface, wherein the user interface comprises a color adjustment control of a first independent dyeing area, the first independent dyeing area is one independent dyeing area in at least one independent dyeing area on an appearance part of the virtual character image, and the color adjustment control is used for adjusting color parameter values on the first independent dyeing area;
in response to an adjustment operation on a color parameter value of the color adjustment control, displaying the virtual character image after color adjustment of the first independent dyeing area; and
in response to an application operation on the color-adjusted virtual character image, applying the color-adjusted virtual character image when a virtual task is executed.
According to another aspect of the present application, there is provided a processing apparatus for avatar image, the apparatus comprising:
the display module is used for displaying the virtual character image on the user interface, the user interface comprises a color adjustment control of a first independent dyeing area, the first independent dyeing area is one independent dyeing area in at least one independent dyeing area on an appearance part of the virtual character image, and the color adjustment control is used for adjusting color parameter values on the first independent dyeing area;
The display module is further configured to display, in response to an adjustment operation on a color parameter value of the color adjustment control, the virtual character image after color adjustment of the first independent dyeing area; and
the application module is configured to apply, in response to an application operation on the color-adjusted virtual character image, the color-adjusted virtual character image when a virtual task is executed.
According to another aspect of the present application, there is provided a terminal, including: a processor and a memory storing a computer program loaded and executed by the processor to implement the method of processing an avatar as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium having stored therein a computer program loaded and executed by a processor to implement the method of processing an avatar image as described above.
According to another aspect of the present application, a computer program product is provided, the computer program product comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs the avatar image processing method as described above.
The beneficial effects of the technical solutions provided by the embodiments of the present application include at least the following:
The appearance components of the virtual character image are divided into independent dyeing areas, at least one independent dyeing area is obtained, and color parameter values are then customized for each independent dyeing area separately. By customizing the color parameter values, a component appearance that meets the user's requirements can be customized, and in turn a unique virtual character image of the three-dimensional virtual character that meets the user's requirements. This improves the user's experience of adjusting the appearance of the three-dimensional virtual character and enriches the images of three-dimensional virtual characters: each three-dimensional virtual character can have a unique image, is more easily distinguished from other three-dimensional virtual characters, and, when multiple three-dimensional virtual characters appear in the same virtual picture, the identity of each can be determined more intuitively. A unique virtual character image also has collection value by design.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 illustrates a schematic diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for processing an avatar image according to an exemplary embodiment of the present application;
FIG. 3 illustrates a process interface diagram for avatar presentation provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
fig. 5 is a flowchart illustrating a method for processing an avatar image according to another exemplary embodiment of the present application;
FIG. 6 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
fig. 7 is a flowchart illustrating a method for processing an avatar image according to another exemplary embodiment of the present application;
FIG. 8 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
fig. 9 is a flowchart illustrating a method for processing an avatar image according to another exemplary embodiment of the present application;
FIG. 10 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
FIG. 11 illustrates a combined schematic of a color demarcation scheme and a preset dyeing scheme provided in an exemplary embodiment of the present application;
fig. 12 is a flowchart illustrating a method for processing an avatar image according to another exemplary embodiment of the present application;
FIG. 13 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
FIG. 14 illustrates a process interface diagram for avatar presentation provided in another exemplary embodiment of the present application;
FIG. 15 illustrates a flowchart of a shading calculation method provided by an exemplary embodiment of the present application;
FIG. 16 is a flow chart illustrating a method of defining a color demarcation scheme and a preset staining scheme provided by one exemplary embodiment of the present application;
FIG. 17 illustrates a block diagram of a processing device for avatar presentation provided by an exemplary embodiment of the present application;
fig. 18 shows a schematic structural diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, several terms referred to in this application are described:
Virtual environment: the virtual environment displayed (or provided) by an application when it runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-imaginary environment, or a purely imaginary environment, also referred to as a virtual world. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment. The virtual environments involved in the embodiments of the present application include the virtual environment outside of competitions and battles as well as the virtual environment during competitions and battles.
Virtual character: a movable object in the virtual environment. The movable object may be a virtual person, a virtual animal, a cartoon character, or the like, such as a person or an animal displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional model created based on skeletal animation technology. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a portion of the space in the three-dimensional virtual environment. The three-dimensional virtual character in the embodiments of the present application may be a virtual character with appearance effects, for example, a virtual character wearing a dress or a virtual character wearing sportswear.
Virtual character image: also referred to as the appearance of the avatar, the same avatar may achieve a variety of different appearance effects through different modeling and coloring.
Appearance component: according to the design of the characters, one virtual character image can be divided into a plurality of different appearance parts, and players can freely match the appearance parts to enrich the appearance effect of the virtual characters, and meanwhile, the user-defined experience of the virtual character image can be brought to the players. The definition of the different appearance components varies, and the appearance components may be divided into hair accessories, coats and undergarments 3 parts, for example.
Complete staining area: an entire tile pixel physically rendered on an appearance element.
Cloth shade: by extracting the brightness and saturation of the native color map pixels, the values of the extracted brightness and saturation are weighted and mixed to obtain a weighted mixed value, which is called an original mask value, and then the original mask value is linearly mapped onto the dyed area of the appearance part, which is called a cloth mask. By different cloth shade values, 1, 2, … and N independent dyeing areas can be independently divided in one complete dyeing area. For example, one full dyeing region may be divided into a color light region and a color dark region, followed by dyeing, respectively.
Independent dyeing area: a dyeing area on the appearance component divided based on the original mask values. Illustratively, pixels whose original mask value is greater than or equal to a mask threshold are partitioned into the color-bright area, and pixels whose original mask value is less than the mask threshold are partitioned into the color-dark area.
The independent dyeing areas also include a special independent dyeing area, the metal area: a metallic threshold allows the metal area within a dyed area to be identified for independent dyeing.
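The mask construction described above can be sketched as follows. This is a minimal Python illustration: the linear weighting of brightness and saturation, the weight values, and the mask threshold of 0.5 are assumptions for demonstration, not values taken from the patent.

```python
def original_mask_value(brightness, saturation, w_brightness=0.7, w_saturation=0.3):
    """Weighted mix of a pixel's brightness and saturation (weights are illustrative)."""
    return w_brightness * brightness + w_saturation * saturation

def partition_pixels(pixels, mask_threshold=0.5):
    """Split the pixels of one complete dyeing area into two independent dyeing areas:
    a color-bright area (mask value >= threshold) and a color-dark area (< threshold)."""
    bright, dark = [], []
    for p in pixels:
        m = original_mask_value(p["brightness"], p["saturation"])
        (bright if m >= mask_threshold else dark).append(p)
    return bright, dark
```

Segmenting the mask-value range into more than two intervals would, in the same way, yield N independent dyeing areas rather than two.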
Color demarcation scheme: the scheme is that an appearance part is divided into D areas to be provided with a shade, wherein D is a positive integer; for example, an avatar character may be divided into three areas of a head, a body, and a foot, and a head mask, a body mask, and a foot mask are provided to dye the three areas, respectively. For N independent colored areas of an appearance component, selecting different masks will create different color demarcations. The color demarcation itself may be a style parameter of the stain for the player to combine and select, and illustratively, a head mask 1, a head mask 2, a body mask 1, and a foot mask are provided for the player to select and combine, and the player may select one of the two head masks, one of the two body masks, and then combine with the foot mask to form a color demarcation scheme, e.g., select the head mask 1, the body mask 1, and the foot mask to form a color demarcation scheme; in addition, the color demarcation may also have a different smooth transition effect.
Degree of metal: parameters for influencing the diffuse reflection. The degree of metal can strongly influence the diffuse reflection of the surface of the appearance part, and the degree of metal of a general organism is 0, so that the diffuse reflection intensity is higher; and when the degree of metalization is 1, the diffuse reflection intensity may be 0. The degree of metallization is controlled by a numerical value based on one basic parameter of physical rendering. Diffuse reflection refers to a physical phenomenon used in a virtual environment to represent reflection and surface color of an object surface.
Roughness: parameters for influencing the roughness of the surface of the object.
Preset dyeing scheme: a color-matching scheme for appearance components, comprising the color parameter values of one or more appearance components and combinations thereof, including hue, blending mode (i.e., transparency, also called concentration), and the color demarcation scheme.
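As a data structure, a preset dyeing scheme of the kind defined above might look as follows; the class names, field names, and example values are hypothetical, chosen only to mirror the definition, and do not come from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RegionColor:
    hue: float            # hue in degrees, 0..360
    transparency: float   # blending concentration, 0..1

@dataclass
class PresetDyeingScheme:
    demarcation: str                  # which mask combination forms the color demarcation
    region_colors: dict = field(default_factory=dict)  # region name -> RegionColor

# One hypothetical preset: a demarcation built from head mask 1, body mask 1,
# and the foot mask, with a color assigned to the color-bright area.
scheme = PresetDyeingScheme(
    demarcation="head_mask_1+body_mask_1+foot_mask",
    region_colors={"bright": RegionColor(hue=15.0, transparency=0.8)},
)
```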
FIG. 1 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network. Server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Illustratively, the server 140 includes a processor 144 and a memory 142, the memory 142 in turn including a receive module 1421, a control module 1422, and a transmit module 1423. The server 140 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 140 takes on primary computing work, and the first terminal 120 and the second terminal 160 take on secondary computing work; alternatively, the server 140 performs a secondary computing job, and the first terminal 120 and the second terminal 160 perform a primary computing job; alternatively, the server 140, the first terminal 120 and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Alternatively, the first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating-system platforms. The first terminal 120 may refer broadly to one of a plurality of terminals, and the second terminal 160 may refer broadly to one of a plurality of terminals; the present embodiment is illustrated with only the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different and include at least one of a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer. The following embodiments are illustrated with the terminal being a smart phone.
Those skilled in the art will recognize that the number of terminals may be greater or lesser. Such as the above-mentioned terminals may be only one, or the above-mentioned terminals may be several tens or hundreds, or more. The number of terminals and the device type are not limited in the embodiment of the present application.
Fig. 2 is a flowchart illustrating a method for processing a virtual character image according to an exemplary embodiment of the present application. Illustratively, the method is applied to a terminal of the computer system described above and includes:
step 201, displaying the avatar image on a user interface, wherein the user interface comprises a color adjustment control of a first independent dyeing area.
The application program provides a user interface for the customized setting of the virtual character image. The virtual character image is displayed on the user interface, which also includes a color adjustment control of a first independent dyeing area. Each of the at least one independent dyeing area corresponds to its own color adjustment control; the first independent dyeing area is one of the at least one independent dyeing area on an appearance component of the virtual character image, and the color adjustment control is used to adjust the color parameter values of the first independent dyeing area.
Optionally, the color adjustment control comprises a first type color adjustment control and a second type color adjustment control. The first type of color adjustment control is a color wheel control or a combination control of a color wheel control and a slide bar control; the second type of color adjustment control is a slide bar control, or a combination of slide bar controls.
The color disc control is a color-adjusting control displayed in the form of a color-mixing disc; illustratively, the color wheel control consists of a color wheel and a slider, and the color-related parameters are adjusted by drag operation of the slider on the color wheel. The color disc control is provided with the colors corresponding to the color parameter values, so that the user can intuitively know the color types by adopting the color disc control to adjust the colors, and the user can find the required colors more quickly.
The slide bar control is a color-adjusting control displayed in a slide bar mode; illustratively, the slide bar control consists of a slide bar and a slide block, and the color-related parameters are adjusted by dragging the slide block on the slide bar. A slide bar control is focused on the adjustment of a color parameter value, so that a user can more easily realize the adjustment control of each color parameter value.
Optionally, the terminal displays, on the user interface, the first type of color adjustment control of the first independent dyeing area together with a selection control for the second type of color adjustment control; in response to a second selection operation on the selection control of the second type of color adjustment control, the terminal switches from the first type of color adjustment control to the second type and displays the second type of color adjustment control on the user interface.
That is, the terminal simultaneously displays the first type of color adjustment control and the selection control of the second type of color adjustment control on the user interface. After the second type of color adjustment control is displayed, the terminal, in response to a second selection operation on the selection control of the first type of color adjustment control, switches back from the second type of color adjustment control to the first type and displays the first type of color adjustment control on the user interface. The second selection operation is the operation of selecting the type of color adjustment control to be used.
Illustratively, as shown in fig. 3, a selection control 14 of a first type color adjustment control 12 and a selection control 15 of a second type color adjustment control 13 are displayed on the user interface 11, wherein the first type color adjustment control 12 is a combination control of a color wheel control for adjusting brightness and saturation and two slide bar controls, one slide bar control for adjusting transparency and the other slide bar control for adjusting hue; the second type of color adjustment control 13 is a combination control of four slide bar controls for adjusting hue, brightness, saturation and transparency, respectively. The terminal displays a first type of color adjustment control 12 on the user interface 11, the user interface 11 further comprising a selection control 14 and a selection control 15, the terminal switching the display of a second type of color adjustment control 13 on the user interface 11 in response to a selection operation on the selection control 15. The design of the selection control 14 and the selection control 15 on the user interface can realize the rapid switching between the first type and the second type of color adjustment control, and the man-machine interaction efficiency when the color adjustment control is switched is improved.
step 202, in response to an adjustment operation on a color parameter value of the color adjustment control, displaying the virtual character image after color adjustment of the first independent dyeing area.
The terminal displays, in response to an adjustment operation on a color parameter value of the color adjustment control, the virtual character image after color adjustment of the first independent dyeing area. Illustratively, as shown in fig. 4, an avatar 16 is displayed on the user interface 11; the first type of color adjustment control 12 on the terminal is triggered to adjust the color parameter values of the first independent dyeing area on the avatar 16, the avatar 16 is updated to an avatar 17, and the avatar 17 with the adjusted color parameter values is displayed.
Optionally, the color parameter values include at least one parameter value of hue, brightness, saturation, and transparency.
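Taking these four parameter values together, one possible mapping from (hue, brightness, saturation, transparency) to an RGBA color is sketched below. Treating hue as degrees, brightness as HSV "value", and transparency as the complement of alpha are assumptions for illustration; the patent does not fix a particular encoding.

```python
import colorsys

def color_params_to_rgba(hue, brightness, saturation, transparency):
    """Map the four color parameter values to an RGBA tuple.
    hue is in degrees [0, 360); brightness, saturation, transparency are in [0, 1]."""
    r, g, b = colorsys.hsv_to_rgb(hue / 360.0, saturation, brightness)
    return (r, g, b, 1.0 - transparency)  # fully transparent when transparency = 1.0
```

For example, a hue of 0 at full brightness and saturation with no transparency yields opaque pure red.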
Optionally, the adjustment operation is a touch operation. The terminal displays, in response to a touch start event on the color adjustment control, the color parameter type and the color parameter value corresponding to that type on the user interface; updates the color parameter value displayed on the user interface in response to a touch move event on the color adjustment control; and, in response to a touch end event on the color adjustment control, displays the color-adjusted virtual character image and stops displaying the color parameter type and its color parameter value.
Illustratively, as shown in fig. 4, the terminal displays, in response to a touch start event on the color adjustment control 12, the color parameter type "hue" 19 and its color parameter value 18, which is 15, on the user interface 11; in response to a touch move event on the color adjustment control 12, it updates the color parameter value on the user interface 11 from 15 to 20; and in response to a touch end event on the color adjustment control 12, it displays the color-adjusted avatar 17 and stops displaying the "hue" 19 and the color parameter value 18.
Here, a touch start event is triggered when a finger first touches the screen, and is also triggered when another finger touches the screen while a finger is already on it; a touch move event is triggered continuously while a finger slides on the screen; and a touch end event is triggered when a finger leaves the screen.
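The touch lifecycle above (show the parameter readout on touch start, update the value on touch move, commit and hide the readout on touch end) can be modeled as a small state holder; the class and method names are illustrative, not part of the patent.

```python
class ColorControlTouchHandler:
    """Tracks the touch lifecycle on a color adjustment control."""

    def __init__(self, initial_value):
        self.value = initial_value
        self.readout_visible = False  # whether type/value readout is on screen

    def on_touch_start(self):
        # Touch start: show the color parameter type and current value.
        self.readout_visible = True

    def on_touch_move(self, new_value):
        # Touch move: continuously update the displayed color parameter value.
        if self.readout_visible:
            self.value = new_value

    def on_touch_end(self):
        # Touch end: hide the readout and commit the adjusted value.
        self.readout_visible = False
        return self.value
```

In the fig. 4 example, the handler would start at 15, be updated to 20 during the drag, and commit 20 when the finger leaves the screen.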
step 203, in response to an application operation on the color-adjusted avatar, applying the color-adjusted avatar when a virtual task is executed.
The terminal applies, in response to an application operation on the color-adjusted avatar, the color-adjusted avatar when a virtual task is executed, where a virtual task is a task the virtual character needs to perform in the virtual environment. For example, virtual tasks may include virtual battles, medicine pick-up tasks, Non-Player Character (NPC) dialogues, and the like; that is, the virtual character fights other virtual characters or NPCs in the virtual environment, picks up medicine in the virtual environment, or converses with NPCs in the virtual environment.
When the color-adjusted avatar is applied during execution of a virtual task, the independently customized image is more easily distinguished from other avatars; that is, besides having an attractive appearance, the adjusted avatar can be located more quickly among multiple avatars in the virtual environment.
In summary, in the method for processing a virtual character image provided by this embodiment, the appearance components of the virtual character image are divided into independent dyeing areas, at least one independent dyeing area is obtained, and the color parameter values of each independent dyeing area are then customized separately. Customizing the color parameter values allows fine adjustment of a component's appearance, so that a component appearance meeting the user's requirements, and in turn a unique virtual character image of the three-dimensional virtual character meeting the user's requirements, can be customized. This improves the user's experience of adjusting the appearance of the three-dimensional virtual character and enriches the images of three-dimensional virtual characters: each three-dimensional virtual character can have a unique image, is more easily distinguished from other three-dimensional virtual characters, and, when multiple three-dimensional virtual characters appear in the same virtual picture, the identity of each can be determined more intuitively. A unique virtual character image also has collection value by design.
At least two independent dyeing areas are displayed on the user interface, and the independent dyeing area to be customized can be selected on the terminal so that the color parameter values of that area can be adjusted. Illustratively, before the color adjustment control of the first independent dyeing area is displayed, the following steps are performed:
step 301, displaying an avatar on a user interface, the user interface including zone selection controls for at least two independently colored zones.
The avatar is displayed on the user interface together with an area selection control for each of at least two independent dyeing areas, the at least two independent dyeing areas being obtained by demarcating the mask of the same appearance component based on its original mask values. Illustratively, the mask is a cloth mask, and the independent dyeing areas are obtained by dividing the pixels corresponding to the appearance component based on the original mask values; for example, the original mask values may be segmented, and the pixel area corresponding to the original mask values on each segment taken as one independent dyeing area. For example, a mask value threshold is set: the pixel area whose original mask values are greater than the threshold is taken as one independent dyeing area, and the pixel area whose original mask values are smaller than the threshold is taken as another independent dyeing area.
Optionally, the at least two independently colored regions comprise at least two of a colored light region, a colored dark region, and a metallic region on the appearance component.
Illustratively, the original mask values of one appearance component are obtained: the area whose original mask value is 1 is determined as the color-bright area of the component's mask, and the area whose original mask value is 0 is determined as the color-dark area of the component's mask.
The metalness of the component's mask is obtained from the color channel that carries the metalness of the pixels: the pixel area whose metalness is greater than or equal to a metal threshold is determined as the metal area, and the pixel area whose metalness is below the threshold is determined as the non-metal area. For example, the non-metal area may be further divided into a bright area and a dark area, and the metal area may likewise contain a bright area and a dark area.
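As an illustrative sketch of the region split described above (function and threshold names are assumptions, not taken from the text), each pixel can be assigned to an independent dyeing region by thresholding its mask value and metalness:

```python
# Illustrative sketch: assign each pixel of an appearance component to an
# independent dyeing region. Threshold values and names are assumptions.
def classify_pixel(mask_value, metalness,
                   mask_threshold=0.5, metal_threshold=0.5):
    """Metal pixels are split off first; the rest are bright or dark."""
    if metalness >= metal_threshold:
        return "metal"
    return "bright" if mask_value >= mask_threshold else "dark"

# Mask value 1 -> bright region, 0 -> dark region, high metalness -> metal.
regions = [classify_pixel(m, k) for m, k in [(1.0, 0.0), (0.0, 0.0), (0.3, 0.9)]]
```

Under this sketch, the three sample pixels fall into the bright, dark, and metal regions respectively.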
Step 302, in response to a first selection operation on the area selection control of the first independent dyeing area, displaying a color adjustment control of the first independent dyeing area on the user interface.
The region selection control is a control for selecting an independent dyeing region to be dyed, and the first selection operation is an operation for selecting an independent dyeing region to be dyed. Illustratively, as shown in FIG. 6, an avatar character 17 is displayed on the user interface 11, with zone selection controls 20 for three independently colored zones displayed; the terminal still displays the avatar 17 on the user interface 11 in response to the first selection operation on the area selection control 20 marked with "cloth highlight", and also displays the color adjustment control 12 of the independent colored area of "cloth highlight".
For example, as shown in fig. 6, while the color adjustment control 12 of the "cloth bright portion" is displayed, the area selection controls of the three independent dyeing areas "cloth bright portion", "cloth dark portion", and "metal" remain displayed; a selection operation on any of the three area selection controls switches among the three independent dyeing areas, and correspondingly switches the displayed color adjustment control 12 to that of the selected area.
For example, the terminal may also directly display the lower diagram in fig. 6, and display the area selection controls of the three independent dyeing areas of "cloth bright portion", "cloth dark portion", and "metal" on the user interface 11, and simultaneously display the color adjustment control 12 of the independent dyeing area of "cloth bright portion"; the terminal can switch among the three independent dyeing areas through the selection operation on the three area selection controls.
In summary, according to the avatar processing method provided by this embodiment, the color parameter values of at least two independent dyeing areas of one appearance component can be adjusted, and color-mixed display is then performed based on the two groups of color parameter values, so that the at least two colors can be blended. This enriches the variety of color customization of the appearance component and ensures the uniqueness of the customized avatar; moreover, the appearance obtained by blending the colors of the appearance component is not fully predictable in advance, which can increase the user's interest in adjusting the avatar.
In some embodiments, a color demarcation scheme comprises a mask combination and may be preset in the terminal. Illustratively, as shown in fig. 7, the user interface further includes a first selection control for each of at least two color demarcation schemes, and after the color adjustment control of the first independent dyeing area is displayed, the following steps are performed:
step 401, displaying a first selection control of at least two color demarcation schemes on a user interface.
In step 402, in response to a third selection operation on the first selection control of the first color demarcation scheme, a color adjustment control for the first independent color region under the first color demarcation scheme is displayed.
Each color demarcation scheme corresponds to a first selection control, the first selection control is a control for selecting the color demarcation scheme, and the third selection operation is an operation for selecting the color demarcation scheme. At least two color demarcation schemes are preset in the terminal, and each color demarcation scheme in the at least two color demarcation schemes corresponds to a first selection control. Illustratively, the different color demarcation schemes differ in color demarcation manner, e.g., the first color demarcation scheme is a combination of "head mask 1", "body mask 1" and "foot mask 1", and the second color demarcation scheme is a combination of "head mask 2", "body mask 2" and "foot mask 1".
As shown in fig. 8, the color adjustment control 12 of the "cloth bright" independent dyeing area under color demarcation scheme 1 is displayed on the user interface 11. Triggering a selection operation on the selection control 21 of the appearance component "whole body" displays the first selection controls 22 of color demarcation scheme 1 and color demarcation scheme 2 on the user interface 11; in the middle diagram of fig. 8, it can be seen that the current scheme is color demarcation scheme 1. The terminal switches from color demarcation scheme 1 to color demarcation scheme 2 in response to a third selection operation on the first selection control 22 of color demarcation scheme 2, as shown in the lower diagram of fig. 8.
In summary, the method for processing the avatar image according to the present embodiment provides a plurality of color demarcation schemes, so that the user can select a satisfactory mask combination mode, further adjust the color parameter values of the mask, and customize the avatar image with a unique color.
In some embodiments, before the color adjustment control of the first independent dyeing area is displayed, second selection controls for at least two preset dyeing schemes may be displayed on the user interface. Illustratively, as shown in fig. 9, a mask color may be set before the color adjustment control of the first independent dyeing area is displayed, which includes the following steps:
step 501, displaying a second selection control of at least two preset staining protocols on a user interface.
And displaying a second selection control of each of at least two preset dyeing schemes on a user interface of the terminal, wherein the preset dyeing schemes are preset color parameter value combinations, and the preset dyeing schemes are used for setting color parameter values of the appearance component, for example, preset hue, brightness, saturation and transparency are carried in the preset dyeing schemes. The user can determine a favorite color system through a preset dyeing scheme, and then fine adjustment is carried out on the color of the appearance part through a color adjustment control.
Step 502, in response to a fourth selection operation on the second selection control of the first preset dyeing scheme, displaying an avatar image dyed on the appearance component by the first preset dyeing scheme, wherein the user interface comprises the first selection control of at least two color demarcation schemes.
The second selection control is a control for selecting a preset dyeing scheme, and the fourth selection operation is an operation for selecting the preset dyeing scheme. Illustratively, as shown in FIG. 10, a second selection control 23 of four preset staining protocols, "preset 1", "preset 2", "preset 3", and "preset 4" is displayed on the user interface 11; the terminal displays the avatar 17 dyed to the exterior part using the preset dyeing scheme corresponding to "preset 1" in response to the fourth selection operation on the second selection control 23 of "preset 1".
While the preset dyeing scheme may be set on the user interface, a color demarcation scheme may also be set, illustratively, step 503 is performed.
In step 503, in response to a third selection operation on the first selection control of the first color demarcation scheme, at least one independent dyeing area under the first color demarcation scheme is determined, and an entry control for adjusting the color parameter value is further included on the user interface.
Illustratively, as shown in fig. 10, the user interface 11 further includes a first selection control 25 of the color demarcation scheme 1 and the color demarcation scheme 2, and the terminal determines at least one independent dyeing region under the first color demarcation scheme in response to a third selection operation on the first selection control 25 of the color demarcation scheme 1.
In step 504, a color adjustment control for a first independent color zone of the at least one independent color zone is displayed in response to an incoming color adjustment operation on the incoming control.
The color adjustment control is configured to adjust a color parameter value on a first parameter value range, where the first parameter value range is a parameter value range corresponding to a color parameter value indicated by a first preset dyeing scheme, and the first parameter value range includes a color parameter value indicated by the first preset dyeing scheme.
Illustratively, as shown in FIG. 10, an entry control 26 for color parameter value adjustment is also included on the user interface 11; the terminal displays the color adjustment control 12 of a first one of the at least one independently colored areas in response to an incoming color adjustment operation on the entry control 26.
As illustrated in fig. 11, the preset dyeing schemes and the color demarcation schemes may be combined into a plurality of dyeing schemes. "Preset 1" corresponds to color 1, "preset 2" to color 2, and "preset 3" to color 3; "mask 1" is the combination of "head mask 1", "body mask 1", and "foot mask 1", while "mask 2" is the combination of "head mask 2", "body mask 2", and "foot mask 1". A preset dyeing scheme and a color demarcation scheme may thus be combined into, for example, the combination of "head mask 1 color 1", "body mask 1 color 1", and "foot mask 1 color 1"; or the combination of "head mask 2 color 2", "body mask 2 color 2", and "foot mask 1 color 2"; or the combination of "head mask 1 color 3", "body mask 1 color 3", and "foot mask 1 color 3"; and so on.
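The combination of mask configurations and color presets in this example can be enumerated as a Cartesian product; the dictionary layout below is an assumption made for illustration, not a structure specified by the text:

```python
# Illustrative sketch: every (color demarcation scheme, preset dyeing scheme)
# pair yields one combined dyeing scheme, as in the fig. 11 example.
from itertools import product

masks = {
    "mask 1": ["head mask 1", "body mask 1", "foot mask 1"],
    "mask 2": ["head mask 2", "body mask 2", "foot mask 1"],
}
colors = {"preset 1": "color 1", "preset 2": "color 2", "preset 3": "color 3"}

schemes = [
    [f"{part} {color}" for part in parts]
    for (_, parts), (_, color) in product(masks.items(), colors.items())
]
# 2 mask combinations x 3 presets -> 6 combined dyeing schemes
```

With two demarcation schemes and three presets this yields six combined schemes, matching the multiplication of options described above.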
In summary, the processing method for the avatar image provided in the embodiment provides multiple color demarcation schemes and multiple preset dyeing schemes, and through the combination of the two schemes, more customizing schemes for the avatar image can be provided, so that the customizing schemes for the avatar image are richer.
In some embodiments, the user interface further includes a cache control and a load control, which can be used together to switch between two versions of the avatar. Steps 601 to 603 are added after step 202, as shown in fig. 12, as follows:
and step 601, caching the color-adjusted avatar image in response to the caching operation on the caching control.
After the terminal adjusts the color parameter value of the avatar image for one round, before the color-adjusted avatar image is applied, the color-adjusted avatar image is cached through the caching operation on the caching control.
In step 602, in response to a readjustment operation of the color parameter values on the color adjustment control, displaying the readjusted avatar image of the first independent color zone.
After the terminal caches the color-adjusted avatar, the color parameter values of the avatar are adjusted again through the color adjustment control, yielding the re-color-adjusted avatar. It should be noted that each round of adjustment of the color parameter values includes at least one adjustment operation.
Step 603, in response to a load operation on the load control, switching from the re-color-adjusted avatar to the cached color-adjusted avatar.
If, after comparing the color-adjusted avatar with the re-color-adjusted avatar, the user wants to apply the former, the color-adjusted avatar is reloaded through a load operation on the load control and redisplayed on the user interface.
Illustratively, as shown in fig. 13, a first avatar 29 after one round of color adjustment is displayed on the terminal; the terminal caches the first avatar 29 in response to a caching operation on the cache control 27; then a second round of color adjustment is performed through the color adjustment control 12, and a second avatar 30 after that round is displayed; the terminal switches from the second avatar 30 to the first avatar 29 in response to a load operation on the load control 28.
Illustratively, the user interface further includes a reset control thereon; the terminal may switch from the color-adjusted avatar to the original avatar in response to a reset operation on the reset control after displaying the color-adjusted avatar of the first independent color region in response to an adjustment operation of the color parameter value on the color adjustment control. Illustratively, as shown in fig. 14, the avatar 17 before color adjustment is displayed on the user interface 11 of the terminal, and the avatar 29 is displayed on the user interface 11 after color adjustment; the terminal redisplays the avatar 17 in response to a reset operation on the reset control 31.
Illustratively, the user interface further includes an undo control and a redo control. The terminal displays the color-adjusted avatar of the first independent dyeing area in response to an adjustment operation of the color parameter value on the color adjustment control; switches from the color-adjusted avatar back to the previously adjusted avatar in response to an undo operation on the undo control; and then switches back to the re-color-adjusted avatar in response to a redo operation on the redo control. Here, the undo operation undoes one step of operation, and the redo operation restores one step of operation.
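The cache/load and undo/redo behaviour described in this section can be modelled as snapshots of the avatar's color parameters plus two history stacks. The class and method names below are illustrative assumptions, not part of the patent:

```python
# Minimal sketch (assumed structure) of the cache, load, undo, and redo
# controls, modelled as snapshots of color-parameter state.
class DyeingHistory:
    def __init__(self, initial):
        self.current = initial
        self.cached = None                 # snapshot saved via the cache control
        self.undo_stack, self.redo_stack = [], []

    def adjust(self, new_params):          # one adjustment on a color control
        self.undo_stack.append(self.current)
        self.redo_stack.clear()            # a new branch invalidates redo
        self.current = new_params

    def cache(self):                       # cache control
        self.cached = self.current

    def load(self):                        # load control: back to cached version
        if self.cached is not None:
            self.current = self.cached

    def undo(self):                        # undo control: back one step
        if self.undo_stack:
            self.redo_stack.append(self.current)
            self.current = self.undo_stack.pop()

    def redo(self):                        # redo control: forward one step
        if self.redo_stack:
            self.undo_stack.append(self.current)
            self.current = self.redo_stack.pop()
```

For example, adjusting to "v1", caching, adjusting again to "v2", and then pressing load restores "v1", matching steps 601 to 603.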
In summary, according to the avatar processing method provided by this embodiment, two sets of avatar customization schemes can be kept through the cache control and the load control; after comparing the two sets, the user selects and applies the satisfactory one. This improves the user's avatar customization experience, avoids having to readjust the color parameter values when the user wants to restore the previous set, and improves human-computer interaction efficiency.
FIG. 15 is a flowchart of a coloring calculation method according to an exemplary embodiment of the present application; the method may be applied to a terminal or a server of the computer system. The terminal obtains a first pixel value of the intrinsic color map, which includes four color channels R, G, B, and A, where R is the Red channel, G is the Green channel, B is the Blue channel, and A is the transparency channel; the terminal obtains the first pixel value diffuse(R, G, B) of the intrinsic color map based on the RGB color channels.
The terminal calculates brightness V and saturation S based on the first pixel value, where V and S take values in the range 0 to 1, inclusive. For example, the terminal may invoke an HSV (Hue, Saturation, Value) model to calculate brightness and saturation, where Hue represents hue, Saturation represents saturation, and Value represents brightness; alternatively, the terminal may invoke an HSL (Hue, Saturation, Lightness) model, where Lightness also denotes the luminance of a color.
The hue is a basic attribute of color, that is, meaning of color, such as red, yellow, blue, etc.; the saturation represents the degree to which the color is near the spectral color; brightness indicates the degree of brightness of the color; brightness is the relative darkness of a color, and is typically measured using a percentage of 0 to 100%.
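As one common way to obtain the V and S used above (the text names the HSV model but does not spell out the conversion), the HSV value and saturation of an RGB pixel can be computed as:

```python
# Standard HSV value/saturation from an RGB pixel, each channel in [0, 1].
def value_saturation(r, g, b):
    v = max(r, g, b)                       # value: the brightest channel
    s = 0.0 if v == 0 else (v - min(r, g, b)) / v   # chroma relative to value
    return v, s

# Pure red has full value and full saturation; a mid gray has zero saturation.
```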
The terminal calculates the original intrinsic color mask sourceMask based on the custom mode, brightness and saturation, as follows:
sourceMask=lerp(V,S,mode);---(1)
wherein the sourceMask value lies in the interval 0 to 1; the terminal then calculates the intrinsic color mask colorMask based on the custom tone-scale high point clampHigh, the custom tone-scale low point clampLow, and sourceMask, with the following formula:
colorMask=saturate[(sourceMask-clampLow)/(clampHigh-clampLow)];---(2)
wherein the colorMask value is in the interval of 0 to 1.
The terminal calculates the tone-scale high-point pre-coverage colorHighPreCover(R, G, B) based on diffuse(R, G, B) and the custom tone-scale high-point color colorHigh(R, G, B, A), as follows:
colorHighPreCover(R,G,B)=lerp[diffuse(R,G,B),colorHigh(R,G,B),colorHigh(A)];---(3)
Based on diffuse(R, G, B) and the custom tone-scale low-point color colorLow(R, G, B, A), the tone-scale low-point pre-coverage colorLowPreCover(R, G, B) is calculated as follows:
colorLowPreCover(R,G,B)=lerp[diffuse(R,G,B),colorLow(R,G,B),colorLow(A)];---(4)
The terminal calculates the intrinsic color mixture colorBlended(R, G, B) based on colorMask, colorHighPreCover, and colorLowPreCover, as follows:
colorBlended(R,G,B)=lerp[colorLowPreCover,colorHighPreCover,colorMask];---(5)
The terminal acquires the metalness metal of the physical map, which also comprises four color channels R, G, B, and A, channel A being the metalness channel; the value of metal lies in the interval 0 to 1. The terminal determines the metal mask metalMask based on metal and the custom metal recognition threshold metalThreshold, with the following formula:
metalMask=step[metalThreshold,metal];---(6)
wherein, the value of the metalMask is in the interval of 0 to 1.
The terminal calculates the final dyeing result colorResult(R, G, B) based on colorBlended, metalMask, and the custom metal dyeing color colorMetal(R, G, B, A), as follows:
colorResult(R,G,B)=lerp[colorBlended,colorMetal,metalMask×colorMetal(A)]。---(7)
Three helper functions used in the coloring calculation are illustrated:
lerp(f, h, j) = f × (1 − j) + h × j: interpolates between f and h with j as the weight; used for mixing colors.
saturate(z) = max[min(z, 1), 0]: returns z if z lies in the interval [0, 1], returns 1 if z > 1, and returns 0 if z < 0; used for the numerical mapping of the mask, limiting it to the interval [0, 1].
step(g, k): returns 1 if k ≥ g, and returns 0 if k < g; the formula is delimited by g, determining whether k reaches g, and is used for determining the metal interval.
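Formulas (1) to (7) together with the three helper functions can be sketched in Python as follows. Parameter names are transliterated from the text; the sketch assumes scalar per-pixel inputs and is an illustration of the described pipeline, not the patent's actual shader code:

```python
def lerp(f, h, j):                  # interpolate f and h with weight j
    return f * (1 - j) + h * j

def saturate(z):                    # clamp to [0, 1]
    return max(min(z, 1.0), 0.0)

def step(g, k):                     # 1 if k >= g, else 0
    return 1.0 if k >= g else 0.0

def shade(diffuse, v, s, metal, mode, clamp_low, clamp_high,
          color_high, color_low, color_metal, metal_threshold):
    """One appearance component's dyeing result per formulas (1)-(7).
    diffuse is (R, G, B); color_high/color_low/color_metal are (R, G, B, A)."""
    source_mask = lerp(v, s, mode)                                          # (1)
    color_mask = saturate((source_mask - clamp_low)
                          / (clamp_high - clamp_low))                       # (2)
    high_pre = [lerp(d, c, color_high[3])
                for d, c in zip(diffuse, color_high)]                       # (3)
    low_pre = [lerp(d, c, color_low[3])
               for d, c in zip(diffuse, color_low)]                         # (4)
    blended = [lerp(lo, hi, color_mask)
               for lo, hi in zip(low_pre, high_pre)]                        # (5)
    metal_mask = step(metal_threshold, metal)                               # (6)
    return [lerp(b, m, metal_mask * color_metal[3])
            for b, m in zip(blended, color_metal)]                          # (7)
```

For a fully bright non-metal pixel with opaque tone-scale colors, the result collapses to the high-point color; raising the metalness past the threshold covers it with the metal dyeing color instead.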
Four parameters in the coloring calculation relate to the mask, respectively:
Custom mode: a floating point number in the range 0 to 1, used as the weight for interpolating between the mask obtained from saturation and the mask obtained from brightness, i.e., for deciding whether the mask is based chiefly on saturation or on brightness. This adapts to the intrinsic color distributions of different kinds of appearance components. For example, a garment whose colors differ but whose saturations are similar may use brightness as the mask, because at the same saturation different colors differ in brightness; a garment whose colors differ but whose brightnesses are similar may use saturation as the mask, because at similar brightness the saturation varies.
Custom tone scale high point: a floating point number in the range 0 to 1, used to define the gray value that will be mapped to 1. Since the gray range extracted from an ordinary intrinsic color map mostly does not span the full interval from 0 to 1, remapping it onto that interval ensures the span between the two colors of the subsequent dyeing; the value can be decided freely by the artist or the player.
Custom tone scale low point: a floating point number in the range 0 to 1 which, like the high point, defines the gray value that will be mapped to 0.
Custom metal recognition threshold: floating point number with value range of 0 to 1, and metal degree larger than or equal to the self-defined metal identification threshold value can be identified as metal by the shader to participate in metal dyeing; the degree of metal less than the custom metal recognition threshold is recognized by the shader as non-metal and does not participate in metal staining.
The parameters related to color in the coloring calculation process are three, namely:
custom color tone scale high points: a four-dimensional array, generally expressed as (R, G, B, A), wherein R, G, B, A respectively represent a floating point number with a value range of 0 to 1, and the R, G, B three-dimensional vector combination is used for designating the color dyed to the high-point region of the tone scale; a represents the opacity of the coloration, 0 represents no coloration, 1 represents the color of the corresponding region covered.
Custom color tone scale low point: a four-dimensional array similar to the above, also generally expressed as (R, G, B, A), wherein R, G, B, and A each represent a floating point number with a value range of 0 to 1; the R, G, B three-dimensional vector combination is used for designating the color dyed to the tone-scale low-point region, and A represents the opacity of the coloration, 0 representing no coloration and 1 representing full coverage of the corresponding region.
Custom metal staining: similar four-dimensional arrays as above are also generally denoted as (R, G, B, A), where R, G, B, A each represent a floating point number ranging from 0 to 1, and the R, G, B three-dimensional vectors are used to designate colors to be dyed to the metal areas identified. A represents the opacity of the coloration, 0 represents no coloration, 1 represents the color of the corresponding region covered.
It should be noted that the four mask-related parameters are included in the color demarcation scheme; when performing the coloring calculation, the terminal obtains them from the set color demarcation scheme. The three color-related parameters may be included in a preset dyeing scheme, in which case the terminal obtains them from the preset dyeing scheme; alternatively, they may be the three parameters customized by the user on the three independent dyeing areas of the appearance component, in which case the terminal obtains the user-customized parameters. The above calculation colors one appearance component: each time a color parameter value is modified while adjusting an independent dyeing area on the terminal's user interface, the shader performs one coloring calculation, and the calculated result is displayed on the avatar on the user interface.
Fig. 16 describes the parameter setting process for the color demarcation scheme and the preset dyeing scheme. The worker opens the dyeing editor, which automatically reads the suit identifier of the loaded virtual character and the material ball corresponding to that identifier. The worker enters a mask configuration name, modifies the four mask-related floating point parameters, and stores the name and parameters into a mask configuration table, thereby obtaining a color demarcation scheme; the worker also enters a color configuration name, modifies the three color-related four-dimensional parameters, and stores them into a color configuration table, thereby obtaining a preset dyeing scheme. The dyeing editor refreshes the suit dyeing of the virtual character based on the four mask-related floating point parameters and the three color-related four-dimensional parameters, providing a real-time preview.
In summary, by dividing the dyeing area of the appearance component, the present solution changes traditional whole-body dyeing into zonal dyeing, achieving the following effects. First, three areas (bright, dark, and metal) can be provided for dyeing. Second, the transparency-based dyeing logic adds the ability to transition back toward the original color; compared with the traditional method of controlling dyeing through 1 color and 3 parameters (hue, brightness, and saturation), this solution controls dyeing through 3 colors and 12 parameters (3 colors with 4 parameters each: hue, brightness, saturation, and transparency), which greatly increases the player's freedom in dyeing the appearance components of a virtual character, enriches game content and personalized expression, and gives the player more room for creativity. Third, in terms of performance, existing resources are heavily reused: no extra texture-sampling overhead is needed, and the method is achieved purely through mathematical computation in the shader. Although the solution adds mathematical computation, simplifying the dyeing algorithm and omitting the step of converting back to RGB after dyeing in HSV space leaves it at no performance disadvantage compared with other existing solutions. This dyeing implementation can also define dye consumption according to the scope of the modification.
Fig. 17 is a block diagram illustrating a processing apparatus for avatar image, which may be implemented as part or all of a terminal through software, hardware, or a combination of both, according to an exemplary embodiment of the present application.
The device comprises:
a display module 701, configured to display the avatar on a user interface, where the user interface includes a color adjustment control for a first independent dyeing area, the first independent dyeing area being one of the at least one independent dyeing areas on an appearance component of the avatar, and the color adjustment control is configured to adjust a color parameter value on the first independent dyeing area;
a display module 701, configured to display the color-adjusted avatar image of the first independent color area in response to an adjustment operation of the color parameter value on the color adjustment control;
an application module 702 for applying the color-adjusted avatar in executing the virtual task in response to an application operation to the color-adjusted avatar.
In some embodiments, the display module 701 is configured to:
responding to a touch control starting event on the color adjustment control, and displaying a color parameter type and a color parameter value corresponding to the color parameter type on a user interface;
Updating a color parameter value displayed on the user interface in response to a touch movement event on the color adjustment control;
and responding to a touch end event on the color adjustment control, and displaying the virtual character image after color adjustment.
In some embodiments, the display module 701 is configured to cancel displaying the color parameter type and the color parameter value corresponding to the color parameter type in response to a touch end event on the color adjustment control;
wherein the color parameter values include at least one parameter value of hue, brightness, saturation, and transparency.
In some embodiments, the user interface includes region selection controls for at least two independent dyeing areas;
a display module 701, configured to display a color adjustment control of the first independent color region on the user interface in response to a first selection operation on the region selection control of the first independent color region;
wherein the at least two independently colored regions comprise at least two of a color light region, a color dark region, and a metal region on the appearance component.
In some embodiments, the color adjustment controls include a first type of color adjustment control and a second type of color adjustment control; a display module 701, configured to:
In response to a first selection operation on the region selection control of the first independent color region, displaying a first type of color adjustment control of the first independent color region and a selection control of a second type of color adjustment control on the user interface;
switching from the first type of color adjustment control to the second type of color adjustment control in response to a second selection operation on a selection control of the second type of color adjustment control; the first type of color adjustment control comprises one of a color disc control and a slide bar control, and the second type of color adjustment control comprises the other of the color disc control and the slide bar control.
In some embodiments, the user interface includes a first selection control of at least two color demarcation schemes, the color demarcation schemes referring to schemes that set the appearance component demarcation areas to a mask;
the display module 701 is configured to display a color adjustment control of the first independent dyeing area under the first color demarcation scheme in response to a third selection operation on the first selection control of the first color demarcation scheme.
In some embodiments, the user interface includes a second selection control of at least two pre-set staining schemes, the pre-set staining schemes being staining schemes of pre-designed appearance parts; a display module 701, configured to:
in response to a fourth selection operation on a second selection control of a first preset dyeing scheme, displaying the avatar with the appearance component dyed according to the first preset dyeing scheme, the user interface further including an entry control for adjusting the color parameter values;
and in response to an entering dyeing adjustment operation on the entering control, displaying a color adjustment control, wherein the color adjustment control is used for adjusting color parameter values on a first parameter value range, and the first parameter value range is a parameter value range corresponding to the color parameter values indicated by the first preset dyeing scheme.
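The "first parameter value range" described above can be sketched as a clamp: the preset dyeing scheme fixes a designer-chosen range for each color parameter, and user adjustments are confined to that range. The range values and names below are hypothetical.

```python
# Hypothetical per-parameter ranges indicated by a preset dyeing scheme.
PRESET_RANGES = {"hue": (0.55, 0.70), "saturation": (0.2, 0.9)}

def clamp_to_scheme(param, value, ranges=PRESET_RANGES):
    """Confine a user-adjusted color parameter value to the scheme's range."""
    lo, hi = ranges[param]
    return min(hi, max(lo, value))
```

This keeps user customization within the palette the preset scheme was designed around, rather than allowing arbitrary colors.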
In some embodiments, the user interface further includes a first selection control of at least two color demarcation schemes; a display module 701, configured to:
determining at least one independent dyeing area under the first color demarcation scheme in response to a third selection operation on the first selection control of the first color demarcation scheme;
and displaying a color adjustment control of a first independent dyeing area in the at least one independent dyeing area in response to an entering dyeing adjustment operation on the entering control.
In some embodiments, a cache control and a load control are included on the user interface; the apparatus further comprises a cache module 703;
A caching module 703, configured to cache the color-adjusted avatar image in response to a caching operation on the cache control;
a display module 701, configured to display the re-color-adjusted avatar of the first independently dyed region in response to a readjustment operation of the color parameter value on the color adjustment control;
the display module 701 is configured to switch from the re-color-adjusted avatar back to the cached color-adjusted avatar in response to a load operation on the load control.
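The cache and load controls described above amount to snapshotting the adjusted avatar state and restoring it later. A minimal sketch, with assumed class and method names:

```python
import copy

class DyeSession:
    """Holds the avatar's current color state plus one cached snapshot."""

    def __init__(self, state):
        self.state = state
        self._cached = None

    def cache(self):                  # cache control: snapshot current state
        self._cached = copy.deepcopy(self.state)

    def adjust(self, key, value):     # readjust a color parameter value
        self.state[key] = value

    def load(self):                   # load control: restore the snapshot
        if self._cached is not None:
            self.state = copy.deepcopy(self._cached)
```

Deep-copying on both cache and load keeps the snapshot isolated from later in-place edits.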
In some embodiments, a reset control is included on the user interface;
the display module 701 is configured to switch from the color-adjusted avatar to the original avatar in response to a reset operation on the reset control.
In some embodiments, a revocation control is included on the user interface; a display module 701, configured to:
in response to a readjustment operation of the color parameter value on the color adjustment control, displaying the re-color-adjusted avatar of the first independently dyed region;
and in response to a cancel operation on the cancel control, switching from the re-color-adjusted avatar back to the color-adjusted avatar.
In some embodiments, a resume control is included on the user interface;
And a display module 701, configured to switch from the color-adjusted avatar back to the re-color-adjusted avatar in response to a restoration operation on the restoration control.
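Taken together, the revocation (undo) and restoration (redo) controls described in this and the preceding embodiments can be sketched with the classic two-stack pattern; all names here are illustrative.

```python
class UndoRedo:
    """Undo/redo for color adjustments via two stacks of avatar states."""

    def __init__(self, state):
        self.state = state
        self._undo, self._redo = [], []

    def adjust(self, new_state):      # a new adjustment invalidates redo
        self._undo.append(self.state)
        self._redo.clear()
        self.state = new_state

    def undo(self):                   # revocation control
        if self._undo:
            self._redo.append(self.state)
            self.state = self._undo.pop()

    def redo(self):                   # restoration control
        if self._redo:
            self._undo.append(self.state)
            self.state = self._redo.pop()
```

Clearing the redo stack on every fresh adjustment matches the usual editor behavior: once the user diverges from the undone history, the old redo branch is discarded.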
In summary, the apparatus for processing an avatar provided in this embodiment divides the appearance component of the avatar into at least one independently dyed region and then lets the user customize the color parameter values of each region separately. Customizing the color parameter values enables fine adjustment of the component's appearance, so that a component appearance, and in turn a unique avatar for the three-dimensional virtual character, can be tailored to the user's requirements. This improves the user's experience of adjusting the appearance of the three-dimensional virtual character and enriches its image: each three-dimensional virtual character can have a unique look, is more easily distinguished from other three-dimensional virtual characters, and its identity can be determined more intuitively when several three-dimensional virtual characters appear in the same virtual picture. A unique avatar also has collection value by design.
Fig. 18 shows a schematic structural diagram of a computer device according to an exemplary embodiment of the present application. The computer device may be a device that performs the method of processing an avatar as provided herein, and may be a terminal. Specifically:
The computer device 800 includes a central processing unit (CPU, Central Processing Unit) 801, a system memory 804 including a random access memory (RAM, Random Access Memory) 802 and a read-only memory (ROM, Read Only Memory) 803, and a system bus 805 connecting the system memory 804 and the central processing unit 801. The computer device 800 also includes a basic input/output system (I/O system, Input Output System) 806, which helps to transfer information between the various devices within the computer, and a mass storage device 807 for storing an operating system 813, application programs 814, and other program modules 815.
The basic input/output system 806 includes a display 808 for displaying information and an input device 809, such as a mouse or keyboard, through which a user inputs information. Both the display 808 and the input device 809 are connected to the central processing unit 801 via an input/output controller 810 connected to the system bus 805. The basic input/output system 806 may also include the input/output controller 810 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 810 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 807 is connected to the central processing unit 801 through a mass storage controller (not shown) connected to the system bus 805. The mass storage device 807 and its associated computer-readable media provide non-volatile storage for the computer device 800. That is, the mass storage device 807 may include a computer-readable medium (not shown) such as a hard disk or a compact disc read-only memory (CD-ROM, Compact Disc Read Only Memory) drive.
Computer-readable media may include computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, erasable programmable read-only memory (EPROM, Erasable Programmable Read Only Memory), electrically erasable programmable read-only memory (EEPROM, Electrically Erasable Programmable Read Only Memory), flash memory or other solid-state memory technology, CD-ROM, digital versatile discs (DVD, Digital Versatile Disc) or solid state drives (SSD, Solid State Drives), other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The random access memory may include resistive random access memory (ReRAM, Resistance Random Access Memory) and dynamic random access memory (DRAM, Dynamic Random Access Memory). Of course, those skilled in the art will recognize that computer storage media are not limited to those described above. The system memory 804 and the mass storage device 807 described above may be collectively referred to as memory.
According to various embodiments of the present application, the computer device 800 may also operate by being connected to a remote computer on a network, such as the Internet. That is, the computer device 800 may be connected to a network 812 through a network interface unit 811 connected to the system bus 805, or the network interface unit 811 may be used to connect to other types of networks or remote computer systems (not shown).
The memory also includes one or more programs; the one or more programs are stored in the memory and configured to be executed by the CPU.
In an alternative embodiment, a computer apparatus is provided that includes a processor and a memory having at least one instruction, at least one program, code set, or instruction set stored therein, the at least one instruction, at least one program, code set, or instruction set being loaded and executed by the processor to implement a method of avatar image processing as described above.
Alternatively, the computer-readable storage medium may include: a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), a solid state drive (SSD, Solid State Drives), an optical disc, or the like. The random access memory may include resistive random access memory (ReRAM, Resistance Random Access Memory) and dynamic random access memory (DRAM, Dynamic Random Access Memory). The foregoing embodiment numbers of the present application are merely for description and do not imply any ranking of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The application also provides a computer readable storage medium, in which at least one instruction, at least one program, a code set, or an instruction set is stored, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement a method for processing an avatar image provided by each method embodiment described above.
The present application also provides a computer program product comprising computer instructions stored on a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs the avatar image processing method as described above.
It should be understood that references herein to "a plurality" mean two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, A and B both exist, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The foregoing is merely a description of preferred embodiments of the present application and is not intended to limit it; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (15)

1. A method for processing an avatar image, the method comprising:
displaying an avatar on a user interface, wherein the user interface includes a second selection control for each of at least two preset dyeing schemes, a preset dyeing scheme being a pre-designed dyeing and matching scheme for an appearance component of the avatar, the preset dyeing scheme being used to set color parameter values of the appearance component;
responding to a fourth selection operation on a second selection control of a first preset dyeing scheme, and displaying an avatar image dyed on the appearance part by adopting the first preset dyeing scheme;
Displaying a color adjustment control of a first independent dyeing area, wherein the first independent dyeing area is one independent dyeing area in at least one independent dyeing area on the appearance component, and the color adjustment control is used for adjusting color parameter values on the first independent dyeing area on a first parameter value range, and the first parameter value range is a parameter value range corresponding to the color parameter values indicated by the first preset dyeing scheme;
responding to the adjustment operation of the color parameter value on the color adjustment control, and displaying the color-adjusted virtual character image of the first independent dyeing area;
and responding to the application operation of the color-adjusted avatar, and applying the color-adjusted avatar when executing the virtual task.
2. The method of claim 1, wherein the user interface includes an entry control thereon for adjusting color parameter values;
the color adjustment control for displaying a first independently colored region includes:
and responding to the entering dyeing adjustment operation on the entering control, and displaying the color adjustment control of the first independent dyeing area.
3. The method of claim 2, wherein the user interface further includes a first selection control of at least two color demarcation schemes, a color demarcation scheme being a scheme that sets the divided regions of the appearance component as masks;
the color adjustment control for displaying the first independently colored region in response to an incoming color adjustment operation on the incoming control, comprising:
determining the at least one independent dyeing area under the first color demarcation scheme in response to a third selection operation on a first selection control of the first color demarcation scheme;
and displaying the color adjustment control of the first independent dyeing area in the at least one independent dyeing area in response to the entering dyeing adjustment operation on the entering control.
4. The method of claim 1, wherein the user interface includes a first selection control of at least two color demarcation schemes, a color demarcation scheme being a scheme that sets the divided regions of the appearance component as masks;
the color adjustment control for displaying a first independently colored region includes:
and responding to a third selection operation on a first selection control of a first color demarcation scheme, and displaying the color adjustment control of the first independent dyeing area under the first color demarcation scheme.
5. The method of claim 1, wherein the user interface includes a zone selection control of at least two independently colored zones;
the color adjustment control for displaying a first independently colored region includes:
responsive to a first selection operation on a region selection control of the first independently colored region, displaying the color adjustment control of the first independently colored region on the user interface;
wherein the at least two independently colored regions comprise at least two of a color light region, a color dark region, and a metal region on the appearance component.
6. The method of claim 5, wherein the color adjustment controls comprise a first type of color adjustment control and a second type of color adjustment control;
the displaying the color adjustment control of the first independent color zone on the user interface in response to a first selection operation on a zone selection control of the first independent color zone comprises:
displaying, on the user interface, the first type of color adjustment control for the first independent colored region and a selection control for the second type of color adjustment control in response to the first selection operation on a region selection control for the first independent colored region;
Switching from the first type of color adjustment control to the second type of color adjustment control in response to a second selection operation on a selection control of the second type of color adjustment control; the first type color adjustment control comprises one of a color disc control and a slide bar control, and the second type color adjustment control comprises the other of the color disc control and the slide bar control.
7. The method of any one of claims 1 to 6, wherein displaying the color-adjusted avatar of the first independently colored region in response to the adjustment of the color parameter value on the color adjustment control comprises:
responding to a touch control starting event on the color adjustment control, and displaying a color parameter type and the color parameter value corresponding to the color parameter type on the user interface;
updating the color parameter values displayed on the user interface in response to a touch movement event on the color adjustment control;
responding to a touch end event on the color adjustment control, and displaying the virtual character image after color adjustment;
wherein the color parameter type includes at least one of hue, brightness, saturation, and transparency, and the color parameter value includes at least one of hue, brightness, saturation, and transparency.
8. The method of claim 7, wherein the method further comprises:
and in response to the touch end event on the color adjustment control, canceling to display the color parameter type and the color parameter value corresponding to the color parameter type.
9. The method of any one of claims 1 to 8, wherein the user interface includes a cache control and a load control thereon;
wherein after the displaying of the color-adjusted avatar of the first independently dyed region in response to the adjustment operation of the color parameter value on the color adjustment control, the method further comprises:
responding to the caching operation on the caching control, and caching the virtual character image after the color adjustment;
responding to the readjustment operation of the color parameter values on the color adjustment control, and displaying the virtual character image of the first independent dyeing area after the readjustment of the color;
and in response to a load operation on the load control, switching from the re-color-adjusted avatar back to the cached color-adjusted avatar.
10. The method of any one of claims 1 to 8, wherein the user interface includes a reset control thereon;
wherein after the displaying of the color-adjusted avatar of the first independently dyed region in response to the adjustment operation of the color parameter value on the color adjustment control, the method further comprises:
and responding to the reset operation on the reset control, and switching from the avatar after color adjustment to the original avatar.
11. The method of any one of claims 1 to 8, wherein the user interface includes a revocation control thereon;
wherein after the displaying of the color-adjusted avatar of the first independently dyed region in response to the adjustment operation of the color parameter value on the color adjustment control, the method further comprises:
responding to the readjustment operation of the color parameter values on the color adjustment control, and displaying the virtual character image of the first independent dyeing area after the readjustment of the color;
and in response to a cancel operation on the cancel control, switching from the re-color-adjusted avatar back to the color-adjusted avatar.
12. The method of claim 11, wherein the user interface includes a resume control thereon;
wherein after the switching from the re-color-adjusted avatar back to the color-adjusted avatar in response to the cancel operation on the cancel control, the method further comprises:
and in response to a restoration operation on the restoration control, switching from the color-adjusted avatar back to the re-color-adjusted avatar.
13. A device for processing an avatar image, the device comprising:
the display module is used for displaying the virtual character image on a user interface, the user interface comprises a second selection control of at least two preset dyeing schemes, the preset dyeing schemes are dyeing collocation schemes of appearance components of the virtual character image, which are designed in advance, and the preset dyeing schemes are used for setting color parameter values of the appearance components;
the display module is used for responding to a fourth selection operation on a second selection control of a first preset dyeing scheme and displaying an avatar image dyed by the appearance part by adopting the first preset dyeing scheme;
the display module is used for displaying a color adjustment control of a first independent dyeing area, wherein the first independent dyeing area is one independent dyeing area in at least one independent dyeing area on the appearance component, the color adjustment control is used for adjusting color parameter values on the first independent dyeing area on a first parameter value range, and the first parameter value range is a parameter value range corresponding to the color parameter values indicated by the first preset dyeing scheme;
The display module is used for responding to the adjustment operation of the color parameter value on the color adjustment control and displaying the color-adjusted virtual character image of the first independent dyeing area;
and the application module is used for responding to the application operation of the color-adjusted virtual character image and applying the color-adjusted virtual character image when executing the virtual task.
14. A terminal, the terminal comprising: a processor and a memory storing a computer program loaded and executed by the processor to implement the method of processing an avatar image as claimed in any one of claims 1 to 12.
15. A computer-readable storage medium, in which a computer program is stored, the computer program being loaded and executed by a processor to implement the method of processing an avatar image as claimed in any one of claims 1 to 12.
CN202310473647.5A 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium Pending CN116492687A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310473647.5A CN116492687A (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011628842.3A CN112657195B (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium
CN202310473647.5A CN116492687A (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202011628842.3A Division CN112657195B (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116492687A true CN116492687A (en) 2023-07-28

Family

ID=75412713

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011628842.3A Active CN112657195B (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium
CN202310473647.5A Pending CN116492687A (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202011628842.3A Active CN112657195B (en) 2020-12-31 2020-12-31 Virtual character image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (2) CN112657195B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797548B (en) * 2021-09-18 2024-02-27 珠海金山数字网络科技有限公司 Object processing method and device
CN113797533A (en) * 2021-09-24 2021-12-17 网易(杭州)网络有限公司 Color selection method and device in game and electronic terminal
CN114504824A (en) * 2022-02-05 2022-05-17 腾讯科技(深圳)有限公司 Object control method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN112657195A (en) 2021-04-16
CN112657195B (en) 2023-04-25

Similar Documents

Publication Publication Date Title
CN112657195B (en) Virtual character image processing method, device, equipment and storage medium
KR102296906B1 (en) Virtual character generation from image or video data
CN110198437B (en) Image processing method and device, storage medium and electronic device
CN111282277B (en) Special effect processing method, device and equipment and storage medium
US20090251484A1 (en) Avatar for a portable device
CN110333924B (en) Image gradual change adjustment method, device, equipment and storage medium
CN105204709A (en) Theme switching method and device
JPH08215432A (en) Three-dimensional game device and image synthesizing method
CN109087369A (en) Virtual objects display methods, device, electronic device and storage medium
CN105892839A (en) Screenshot processing method and device based on instant communication tool
KR101398188B1 (en) Method for providing on-line game supporting character make up and system there of
US20220375151A1 (en) Method and apparatus for displaying virtual character, device, and storage medium
US20080303830A1 (en) Automatic feature mapping in inheritance based avatar generation
WO2022159494A2 (en) Three-dimensional avatar generation and manipulation
CN111935489A (en) Network live broadcast method, information display method and device, live broadcast server and terminal equipment
CN113766168A (en) Interactive processing method, device, terminal and medium
CN111729314A (en) Virtual character face pinching processing method and device and readable storage medium
CN108737878A (en) The method and system of user interface color is changed for being presented in conjunction with video
CN113476849B (en) Information processing method, device, equipment and storage medium in game
KR100993801B1 (en) Avatar presenting apparatus and method thereof and computer readable medium processing the method
KR20090058760A (en) Avatar presenting method and computer readable medium processing the method
KR20220012786A (en) Apparatus and method for developing style analysis model based on data augmentation
WO2023093428A1 (en) Interaction method and apparatus, computer device, and storage medium
JP2010029397A (en) Program, information storage medium and image generation system
US20230101386A1 (en) Program, information processing method, server, and server information processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40088865

Country of ref document: HK