CN112915544B - Mapping method, mapping device, storage medium, and electronic apparatus - Google Patents


Info

Publication number
CN112915544B
Authority
CN
China
Prior art keywords
pixel block
pixel
gray
map
mapping
Prior art date
Legal status
Active
Application number
CN202110390583.3A
Other languages
Chinese (zh)
Other versions
CN112915544A (en)
Inventor
宁光普 (Ning Guangpu)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110390583.3A
Publication of CN112915544A
Application granted
Publication of CN112915544B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • G06T 11/60: Editing figures and text; Combining figures or text


Abstract

The present disclosure provides a mapping method, a mapping apparatus, a computer-readable storage medium, and an electronic device, and relates to the technical field of image processing. The mapping method comprises the following steps: acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively; performing superposition processing on the product-processed first pixel block and second pixel block to obtain a candidate map; and determining a partial map corresponding to a gray extraction parameter from the gray map corresponding to the candidate map, and generating a target map according to the partial map. The present disclosure makes the production of halation-dyed color maps more intelligent and automatic and improves the uniformity of map color effects.

Description

Mapping method, mapping device, storage medium, and electronic apparatus
Technical Field
The present disclosure relates to the field of image processing technology, and in particular, to a mapping method, a mapping apparatus, a computer-readable storage medium, and an electronic device.
Background
Currently, when developers create stylized game maps, they often rely on the aesthetic judgment of an artist, or draw the maps using drawing software such as Photoshop.
A stylized image is drawn by dividing its texture into levels from coarse to fine, simulating the hierarchy a painter forms mentally: a large brush and large, harmonized color blocks are used to lay down the rough outline, and a small brush is then used for progressively finer strokes, so that drawing from coarse to fine proceeds layer by layer. However, with the conventional stylized-image production method, a developer cannot produce stylized game maps in volume, and the colors and stroke effects of the stylized game maps within a game are not uniform.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure aims to provide a mapping method, a mapping apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to a certain extent, the limitations and defects of the related art: the production of halation-dyed color maps cannot be made intelligent and automatic, and the color effects of the maps are not uniform.
According to a first aspect of the present disclosure, there is provided a mapping method, comprising: acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively; performing superposition processing on the product-processed first pixel block and second pixel block to obtain a candidate map; and determining a partial map corresponding to a gray extraction parameter from the gray map corresponding to the candidate map, and generating a target map according to the partial map.
According to a second aspect of the present disclosure, there is provided a mapping apparatus comprising: a pixel block acquisition module, configured to acquire a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; a pixel block processing module, configured to perform a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively; a superposition processing module, configured to perform superposition processing on the product-processed first pixel block and second pixel block to obtain a candidate map; and a map generation module, configured to determine a partial map corresponding to a gray extraction parameter from the gray map corresponding to the candidate map, and generate a target map according to the partial map.
Optionally, the pixel block processing module may be configured to perform: determining a separation distance corresponding to the first pixel block; adjusting the distance between adjacent pixels in the first pixel block based on the separation distance to obtain an adjusted first pixel block; and performing a product processing process on the adjusted first pixel block and the second pixel block with the corresponding color disturbance parameters respectively.
Optionally, the color disturbance parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and the pixel block processing module may be configured to perform: performing shift processing on each pixel in the first pixel block according to the color diffusion parameter; performing product processing on the shifted first pixel block, the color variation parameter, and the color key parameter; and multiplying the second pixel block by the color diffusion parameter.
Optionally, the superposition processing module may be configured to perform: performing superposition processing on the product-processed first pixel block and second pixel block to obtain an intermediate pixel block; and performing preset pixel doubling processing on the superposed intermediate pixel block to obtain a candidate map.
Optionally, the map generation module may be configured to perform: converting the candidate map into a gray map; determining, based on the gray extraction parameter, the pixel gray values corresponding to the color channels in the map to be generated as target pixel gray values; and determining a partial map corresponding to the gray extraction parameter according to the gray value of each pixel in the gray map and the target pixel gray values, and generating the target map according to the partial map.
Optionally, the map generation module may be further configured to perform: determining the pixel gray values of the color channels in the gray map as comparison pixel gray values; respectively calculating, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value; and determining, as the target map, the partial map corresponding to the comparison pixel gray values whose gray differences fall within a preset difference range.
Optionally, the mapping apparatus may further include a map processing module, which may be configured to perform: performing a non-uniform blurring process on the gray map to obtain a noise-removed gray map.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the mapping method as described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the mapping method as described above.
Exemplary embodiments of the present disclosure have the following advantageous effects:
In some embodiments of the present disclosure, a first pixel block and a second pixel block are first acquired; the first pixel block and the second pixel block are then each multiplied by the corresponding color disturbance parameters; superposition processing is then performed on the product-processed first pixel block and second pixel block to obtain a candidate map; and a partial map corresponding to the gray extraction parameter is then determined from the gray map corresponding to the candidate map, and a target map is generated according to the partial map. On the one hand, the present disclosure performs a product processing process and a superposition processing process on the first pixel block and the second pixel block to determine a candidate map, determines a partial map corresponding to a gray extraction parameter from the gray map corresponding to the candidate map, and generates a target map. This avoids relying on art staff to make the map, makes the production of halation-dyed color maps more intelligent and automatic, reduces map production cost, and increases map production output. On the other hand, the present disclosure determines the partial map corresponding to the gray extraction parameter from the gray map corresponding to the candidate map and generates the target map, which avoids the differing color effects of maps produced by staff with different aesthetics and improves the uniformity of map color effects.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 schematically illustrates a flow chart of a mapping method according to an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a node schematic of the node editing function used to set a pixel block;
FIG. 3 schematically illustrates a node schematic of a first pixel block generating random colors;
FIG. 4 schematically illustrates a node diagram for generating blocks of pixels of different colors under UV coordinates;
FIG. 5 schematically illustrates a node schematic of a second pixel block generating random colors;
FIG. 6 schematically illustrates a schematic diagram of a candidate map according to an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a node schematic diagram for generating an alpha map according to an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a schematic view of an enlarged alpha map according to an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a schematic view of a texture ball corresponding to an alpha map according to an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a block diagram of a mapping apparatus according to an exemplary embodiment of the present disclosure;
fig. 11 schematically illustrates a block diagram of an electronic device in an exemplary embodiment according to the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. However, those skilled in the art will recognize that the aspects of the present disclosure may be practiced with one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
In this disclosure, the terms "comprising," "including," and "containing" are used in an open-ended sense, meaning that additional elements/components/etc. may be present in addition to those listed. In addition, the terms "first" and "second" are used herein for distinguishing purposes only and should not be taken as a limitation of the present disclosure.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
In the process of making a game scene, a stylized map drawn by an artist is often required. The map may be drawn manually by an artist or drawn using Photoshop software.
Drawing a stylized image specifically refers to simulating the drawing process of a stylized image with a computer and generating a corresponding stylized image from an original input image. The existing image stylization methods are mainly layering methods based on layer-by-layer drawing. Their main idea is to simulate the coarse-to-fine hierarchy of image textures that a painter forms mentally: a large brush and large, blended color blocks are used to draw the rough outline, and a small brush is then used for progressively finer strokes, a layer-by-layer drawing process from coarse to fine.
However, because the aesthetic levels of different art staff differ, the colors and brush-stroke effects of the stylized drawings they produce are uneven, and uniform results cannot be achieved.
In response to this problem, the present disclosure proposes a mapping method.
It should be noted that, in the exemplary embodiment of the present disclosure, the mapping method described below may be implemented by a terminal device (e.g., a mobile phone, a tablet, a personal computer, etc.), that is, each step of the mapping method may be performed by the terminal device, in which case the mapping apparatus may be configured in the terminal device.
The steps of the mapping method in the present exemplary embodiment will be described in more detail with reference to the accompanying drawings and examples.
Fig. 1 schematically shows a flow chart of a mapping method of an exemplary embodiment of the present disclosure. In the following illustration, an illustration is made with a terminal device as an execution subject. Referring to fig. 1, the mapping method specifically may include the following steps:
s102, acquiring a first pixel block and a second pixel block.
In an exemplary embodiment of the present disclosure, the first pixel block and the second pixel block may be obtained in computer software by setting a pixel size, and the two blocks have the same pixel size. The present disclosure may use the Pixel Processor node editing function in the texturing software Substance Designer to obtain pixel blocks. For example, the terminal device may set a first pixel block and a second pixel block of 8×8 pixels each. As shown in fig. 2, the present disclosure sets 8×8-pixel blocks in the Substance Designer Pixel Processor node editing function.
The present disclosure may display the first pixel block and the second pixel block in UV coordinates and determine the position of each pixel within the blocks.
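For illustration, the acquisition step can be sketched in a few lines of Python, with NumPy standing in for the Pixel Processor node; the 8×8 size follows the example above, while the seed and the helper name acquire_pixel_block are assumptions made for this sketch.

```python
# Hypothetical stand-in for acquiring two same-sized pixel blocks (step S102);
# NumPy replaces the Substance Designer Pixel Processor here.
import numpy as np

rng = np.random.default_rng(42)  # seed chosen arbitrarily for reproducibility

def acquire_pixel_block(size: int = 8, channels: int = 3) -> np.ndarray:
    """Return a size x size block of random RGB values in [0, 1)."""
    return rng.random((size, size, channels))

first_block = acquire_pixel_block()
second_block = acquire_pixel_block()
assert first_block.shape == second_block.shape  # the two blocks must be the same size
```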
S104, performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively.
The color disturbance parameters may include a color diffusion parameter, a color variation parameter, and a color key parameter, each of which may take values in [0, 1]. The color diffusion parameter controls the extent to which colors diffuse and influence one another; the color variation parameter controls the degree of color change; and the color key parameter controls the tendency of the base color.
In the present disclosure, the color diffusion parameter, the color variation parameter, and the color key parameter are configurable and can be adjusted to the drawing requirements of the actual map, which avoids the differing color effects of maps produced by staff with different aesthetics and improves the uniformity of map color effects. Meanwhile, because the target map is obtained from a first pixel block and a second pixel block that are the same in size but differ in initial data, the production of halation-dyed color maps becomes more intelligent and automatic, map production cost is reduced, and map production output is increased.
As shown in fig. 3, in Substance Designer, the color diffusion parameter may be configured with the color diffusion node 302 (Diffusion), which limits the diffusion range of random values in a pixel block; the color variation parameter may be configured with the color variation node 304 (Color change); and the color key parameter may be configured with the color key node 306 (Basecolor).
The color key parameter may include the color parameters of the three color channels red (R), green (G), and blue (B) in the pixel block.
The present disclosure works with the three color channels red (R), green (G), and blue (B); each RGB channel stores a gray level in [0, 1], and superposing the channels produces the various colors. The color perturbation interval of each pixel block in the present disclosure may be a random value in [-1, 1]. The color diffusion parameter may be generated by the pseudo-random function corresponding to the pseudo-random function node 308 (GlobalRandom), which generates a floating-point random value between 0 and a preset input value.
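The parameter ranges described above can be summarized in a short sketch. The dataclass fields and the seeded helper that mimics the GlobalRandom node are illustrative assumptions, not the patent's exact implementation.

```python
# Illustrative container for the three color disturbance parameters, each in [0, 1],
# plus a helper mimicking GlobalRandom (a floating value between 0 and the input).
from dataclasses import dataclass
import numpy as np

@dataclass
class ColorDisturbance:
    diffusion: float = 0.5        # extent of color diffusion and influence
    variation: float = 0.5        # degree of color change
    key: tuple = (0.8, 0.6, 0.4)  # base-color tendency per RGB channel

def global_random(upper: float, seed: int = 0) -> float:
    """Return a floating random value in [0, upper), like the GlobalRandom node."""
    return float(np.random.default_rng(seed).random() * upper)

params = ColorDisturbance()
scale = global_random(params.diffusion)  # random value in [0, diffusion)
```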
According to an exemplary embodiment of the present disclosure, the terminal device may first determine a separation distance corresponding to the first pixel block; then adjust the distance between adjacent pixels in the first pixel block based on the separation distance to obtain an adjusted first pixel block; and then perform the product processing process on the adjusted first pixel block and the second pixel block with the corresponding color disturbance parameters respectively.
After obtaining the first pixel block, the terminal device can determine its size. The separation distance is used to separate the pixels in the first pixel block. For example, if the size of the first pixel block is 8×8 pixels, the terminal device may choose a separation distance greater than 8, such as 16.
As shown in fig. 4, in the Substance Designer interface, the horizontal pixel position of the first pixel block is multiplied by the separation distance 16, and the vertical pixel coordinate is likewise multiplied by 16, in the position node 401 ($pos) of the first pixel block, so that adjacent pixels in the first pixel block are separated.
In UV coordinates, the present disclosure may separate adjacent pixels of the first pixel block along each UV coordinate axis. Any distance that separates adjacent pixels in the first pixel block falls within the scope of the present disclosure, and the distances between adjacent pixels may be set to the same value or to different values. Separating adjacent pixels in the first pixel block avoids identical colors being adjacent, so the first pixel block contains pixels of different colors. As shown in fig. 4, X represents the U axis and Y represents the V axis of the UV coordinates.
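A minimal sketch of the spacing step follows, assuming integer pixel positions in UV space; the separation distance of 16 mirrors the example above.

```python
# Separate adjacent pixels of an 8x8 block by multiplying each pixel's
# UV position by a separation distance, as done with the $pos node and 16 above.
import numpy as np

def space_pixel_positions(size: int = 8, separation: int = 16):
    u, v = np.meshgrid(np.arange(size), np.arange(size), indexing="xy")
    return u * separation, v * separation  # scaled positions along the U and V axes

u_pos, v_pos = space_pixel_positions()
assert u_pos[0, 1] - u_pos[0, 0] == 16  # neighbours are now 16 units apart
```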
According to another embodiment of the present disclosure, the terminal device may first perform shift processing on each pixel in the first pixel block according to the color diffusion parameter; then multiply the shifted first pixel block by the color variation parameter and the color key parameter; and then multiply the second pixel block by the color diffusion parameter.
The terminal device may randomly shift each pixel in the first pixel block based on the color diffusion parameter and may reverse-adjust the pixel colors.
As shown in fig. 3, in the Substance Designer interface, the terminal device first adjusts the distance between adjacent pixels in the first pixel block and multiplies the result by the random value generation node 314; then randomly shifts each pixel in the first pixel block according to the color diffusion parameter corresponding to the color diffusion node 302; next, inverts the colors of the randomly shifted first pixel block via the inverse color node 310 (One Minus); and then multiplies the first pixel block by the color variation parameter corresponding to the color variation node 304 and the color key parameter corresponding to the color key node 306, obtaining a first pixel block varying in the [-1, 1] interval.
In randomly shifting each pixel in the first pixel block according to the color diffusion parameter corresponding to the color diffusion node 302, the terminal device may further:
first multiply the color diffusion parameter corresponding to the color diffusion node 302 by the pseudo-random function corresponding to the pseudo-random function node 308 to obtain the value range of the color diffusion parameter.
The Substance Designer interface includes a decimal node 312 (frac), which returns the fractional part of a random value.
After determining the size of the first pixel block, the present disclosure multiplies, for each pixel in the first pixel block, the RGB color key parameters by the color diffusion parameter and the pseudo-random function, obtaining the three RGB color channel parameters of each pixel after the product processing. The three RGB channel parameters may also be converted, via the RGB_HSL mode conversion node, into parameters expressed as hue, saturation, and lightness (Hue Saturation Lightness, HSL); the present disclosure may likewise convert the HSL color channel parameters back into red, green, and blue parameters via the HSL_RGB mode conversion node.
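The first pixel block's chain of operations might be sketched as follows. Reading the per-pixel "shift" as a random value offset wrapped by frac is an interpretation, and the function names are invented for this sketch; the optional HSL round-trip uses Python's colorsys module, which orders its result as (H, L, S).

```python
# Hypothetical sketch of processing the first pixel block: random shift scaled
# by the diffusion parameter, One Minus inversion, then the product with the
# variation and key parameters; frac keeps the fractional part, like the frac node.
import colorsys
import numpy as np

def frac(x: np.ndarray) -> np.ndarray:
    return x - np.floor(x)

def process_first_block(block, diffusion, variation, key, rng):
    shift = (rng.random(block.shape) * 2.0 - 1.0) * diffusion  # shift in [-diffusion, diffusion]
    shifted = frac(block + shift)                              # wrap back into [0, 1)
    inverted = 1.0 - shifted                                   # One Minus: reverse the colors
    return inverted * variation * np.asarray(key)              # apply variation and key params

rng = np.random.default_rng(0)
out = process_first_block(rng.random((8, 8, 3)), 0.5, 0.5, (0.8, 0.6, 0.4), rng)
h, l, s = colorsys.rgb_to_hls(*out[0, 0])  # optional per-pixel RGB -> HSL conversion
```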
As shown in fig. 5, in the Substance Designer interface, the terminal device may first generate a random value in [0, 1] along the UV coordinate directions and divide it by the 8×8 pixel size corresponding to the pixel block size node 502 ($size) to obtain the second pixel block; the second pixel block is then multiplied by the color diffusion parameter corresponding to the color diffusion node 302.
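The second pixel block's path is shorter. The sketch below reads "dividing by the 8×8 pixel size" as dividing the random field by the block's side length, which is an interpretation of the node graph rather than its exact behavior.

```python
# Hedged sketch of generating the second pixel block: a per-pixel random field
# in [0, 1) divided by the block size ($size), then scaled by the diffusion parameter.
import numpy as np

def process_second_block(size: int, diffusion: float, rng, channels: int = 3):
    random_field = rng.random((size, size, channels))  # random values in [0, 1)
    return (random_field / size) * diffusion           # divide by $size, multiply by diffusion

second = process_second_block(8, 0.5, np.random.default_rng(1))
assert second.shape == (8, 8, 3)
```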
S106, performing superposition processing on the first pixel block and the second pixel block after the multiplication processing to obtain a candidate map.
After obtaining the product-processed first pixel block and second pixel block, the terminal device may first perform superposition processing on them to obtain an intermediate pixel block, and then perform preset pixel doubling processing on the superposed intermediate pixel block to obtain a candidate map.
Specifically, the present disclosure may perform superposition processing on each pixel in the second pixel block after the product processing and a pixel in a corresponding position in the first pixel block, to obtain an intermediate pixel block. That is, the color corresponding to each pixel in the intermediate pixel block is obtained by superimposing the color corresponding to each pixel in the second pixel block and the color corresponding to the pixel at the corresponding position in the first pixel block.
In an exemplary embodiment of the present disclosure, the size of the intermediate pixel block may be the same as the size of the first pixel block and the size of the second pixel block.
In an exemplary embodiment of the present disclosure, the present disclosure determines a candidate map by performing a product process and a superposition process on the first pixel block and the second pixel block, determines a partial map corresponding to the gray extraction parameter from the gray map corresponding to the candidate map, and generates a target map according to the partial map. This avoids relying on art staff to make the map and makes the production of halation-dyed color maps more intelligent and automatic.
As shown in fig. 5, the terminal device first superposes the product-processed second pixel block and first pixel block at the pixel positions corresponding to the position node 401, obtaining an intermediate pixel block of size 8×8. The terminal device then performs the preset pixel doubling processing on the superposed intermediate pixel block to obtain a candidate map of 1024×1024 pixels, as shown in fig. 6.
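Under the sizes quoted above, the superposition and doubling steps might look like the following sketch; nearest-neighbour repetition via np.kron stands in for the preset pixel doubling, and clipping the sum to [0, 1] is an added assumption.

```python
# Sketch of step S106: per-pixel superposition of the two processed blocks,
# then doubling each pixel until the 8x8 intermediate block becomes 1024x1024.
import numpy as np

rng = np.random.default_rng(0)
first = rng.random((8, 8, 3))   # stands in for the product-processed first block
second = rng.random((8, 8, 3))  # stands in for the product-processed second block

def make_candidate_map(a: np.ndarray, b: np.ndarray, target: int = 1024) -> np.ndarray:
    intermediate = np.clip(a + b, 0.0, 1.0)   # superpose pixels at matching positions
    factor = target // intermediate.shape[0]  # 1024 // 8 == 128 (seven doublings)
    return np.kron(intermediate, np.ones((factor, factor, 1)))  # repeat each pixel

candidate = make_candidate_map(first, second)
assert candidate.shape == (1024, 1024, 3)
```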
S108, determining a partial map corresponding to the gray extraction parameters from the gray maps corresponding to the candidate maps, and generating a target map according to the partial map.
In an exemplary embodiment of the present disclosure, the gray map may be obtained by the terminal device converting the candidate map through a gray conversion node. As shown in fig. 7, the terminal device converts the candidate map of 1024×1024 pixels into a gray map through the gray conversion node 704 (Grayscale Conversion).
The gray extraction parameter may be the gray extraction parameter corresponding to the color of the map to be generated, and may include the pixel gray value corresponding to each color channel in the map to be generated. The present disclosure may determine the gray extraction parameter according to the needs of the actual scene.
According to an exemplary embodiment of the present disclosure, the terminal device may first convert the candidate map into a grayscale map; then, based on the gray scale extraction parameters, determining pixel gray scale values corresponding to all color channels in the to-be-generated map, and taking the pixel gray scale values as target pixel gray scale values; and then, according to the gray value of each pixel in the gray map and the gray value of the target pixel, determining a partial map corresponding to the gray extraction parameter, and generating the target map according to the partial map.
The target map may contain only the color pixel blocks corresponding to the target pixel gray values, and may be the same size as the candidate map. The target map may be a color map determined according to the actual color requirement, or a map obtained by doubling the pixel blocks corresponding to the partial map.
According to another embodiment of the present disclosure, the terminal device may first determine a pixel gray value of each color channel in the gray map as a comparison pixel gray value; then, respectively calculating gray difference values of the gray values of the comparison pixels and the gray values of the target pixels in the color channels; and then, determining a part of the mapping corresponding to the gray value of the comparison pixel with the gray difference value in the difference range as a target mapping.
After obtaining the gray map, the terminal device may further perform non-uniform blurring on it to obtain a denoised gray map. The present disclosure performs non-uniform blurring on the gray map so that the transitions between similar colors in the target map are smoothed.
The comparison pixel gray value refers to the pixel gray value of each color channel in the gray map and is used for comparison with the target gray value; the comparison pixel gray values may include the target gray value. The terminal device may compare the comparison pixel gray values with the target pixel gray value, determine the target gray value among them, and thereby determine the partial map corresponding to the target gray value.
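Step S108 can be sketched as follows under stated assumptions: a luminance-weighted conversion stands in for the Grayscale Conversion node, the non-uniform blur denoising step is omitted for brevity, and the 0.05 tolerance standing in for the preset difference range is illustrative.

```python
# Hedged sketch of step S108: convert the candidate map to gray, then keep only
# the pixels whose gray difference from the target gray value falls within range.
import numpy as np

def to_gray(img: np.ndarray) -> np.ndarray:
    return img @ np.array([0.299, 0.587, 0.114])  # BT.601 luminance weights (an assumption)

def extract_partial_map(candidate, target_gray, tolerance=0.05):
    gray = to_gray(candidate)
    mask = np.abs(gray - target_gray) <= tolerance    # gray difference within the range
    return np.where(mask[..., None], candidate, 0.0)  # keep matching pixels, zero the rest

rng = np.random.default_rng(0)
candidate = rng.random((1024, 1024, 3))               # stands in for the candidate map
partial = extract_partial_map(candidate, target_gray=0.5)
```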
As shown in fig. 7, in the Substance Designer interface, once the terminal device obtains the candidate map, it first inputs the pixel block corresponding to the candidate map's node editing function 702 into the gray conversion node 704 to obtain the gray map; a denoised gray map can then be obtained through the non-uniform blur node 706 (Non-Uniform Blur). Next, the terminal device may determine, based on the gray extraction parameter, the pixel gray value corresponding to each color channel in the map to be generated as the target pixel gray value; it may then calculate, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value, and determine as the target map the partial map corresponding to the comparison pixel gray values whose gray differences fall within the difference range.
The terminal device may also select the partial map corresponding to the target pixel gray value from the gray map through the gray range selection node 708 (Histogram Selection), determine the foreground-layer region shape of the map to be generated through the shape node 710 (Shape) and the gray blur node 712 (Blur HQ Grayscale), and then input the foreground-layer region shape together with the partial map corresponding to the target pixel gray value (as the background layer) into the blend mode node 714 (Blend) to obtain the target map 716 (Alpha map). The enlarged Alpha map is shown in fig. 8.
The terminal device may further obtain, based on the generated Alpha map, a smart material ball composed of the colors corresponding to the Alpha map, as shown in fig. 9.
The present disclosure can determine the partial map corresponding to the gray extraction parameter from the gray map corresponding to the candidate map and generate the target map according to the partial map, which avoids the differing color effects of maps produced by staff with different aesthetics and improves the uniformity of map color effects.
It should be noted that although the steps of the methods in the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
Further, in an exemplary embodiment of the present disclosure, a mapping apparatus is also provided.
Fig. 10 schematically shows a block diagram of a mapping apparatus according to an exemplary embodiment of the present disclosure. Referring to fig. 10, a mapping apparatus 1000 according to an exemplary embodiment of the present disclosure may include: a pixel block acquisition module 1001, a pixel block processing module 1003, an overlay processing module 1005, and a map generation module 1007.
The pixel block acquisition module 1001 is configured to acquire a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; the pixel block processing module 1003 is configured to perform a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively; the superposition processing module 1005 is configured to perform a superposition processing process on the product-processed first pixel block and second pixel block to obtain a candidate map; and the map generation module 1007 is configured to determine a partial map corresponding to the gray extraction parameter from the gray map corresponding to the candidate map, and generate a target map according to the partial map.
According to another embodiment of the present disclosure, the pixel block processing module 1003 may be configured to perform: determining a separation distance corresponding to the first pixel block; adjusting the distance between adjacent pixels in the first pixel block based on the separation distance to obtain an adjusted first pixel block; and performing a product processing process on the adjusted first pixel block and the second pixel block with the corresponding color disturbance parameters respectively.
According to another embodiment of the present disclosure, the color disturbance parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and the pixel block processing module 1003 may be configured to perform: performing shift processing on each pixel in the first pixel block according to the color diffusion parameter; performing product processing on the shifted first pixel block, the color variation parameter, and the color key parameter; and multiplying the second pixel block by the color diffusion parameter.
According to another embodiment of the present disclosure, the superposition processing module 1005 may be configured to perform: performing superposition processing on the product-processed first pixel block and second pixel block to obtain an intermediate pixel block; and performing preset pixel doubling processing on the superposed intermediate pixel block to obtain a candidate map.
According to another embodiment of the present disclosure, the map generation module 1007 may be configured to perform: converting the candidate map into a gray map; determining, based on the gray extraction parameter, the pixel gray values corresponding to the color channels in the map to be generated as target pixel gray values; and determining a partial map corresponding to the gray extraction parameter according to the gray value of each pixel in the gray map and the target pixel gray values, and generating the target map according to the partial map.
According to another embodiment of the present disclosure, the map generation module 1007 may be further configured to perform: determining the pixel gray values of the color channels in the gray map as comparison pixel gray values; respectively calculating, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value; and determining, as the target map, the partial map corresponding to the comparison pixel gray values whose gray differences fall within a preset difference range.
According to another embodiment of the present disclosure, the mapping apparatus 1000 may further include a map processing module, which may be configured to perform: performing a non-uniform blurring process on the gray map to obtain a noise-removed gray map.
The specific details of the modules/units in the above apparatus are already described in the embodiments of the method section, and thus are not repeated.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary method" section of this specification, when the program product is run on the terminal device.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 1100 according to this embodiment of the invention is described below with reference to fig. 11. The electronic device 1100 shown in fig. 11 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 11, the electronic device 1100 is embodied in the form of a general purpose computing device. Components of electronic device 1100 may include, but are not limited to: the at least one processing unit 1110, the at least one memory unit 1120, a bus 1130 connecting the different system components (including the memory unit 1120 and the processing unit 1110), and a display unit 1140.
Wherein the storage unit stores program code that is executable by the processing unit 1110 such that the processing unit 1110 performs steps according to various exemplary embodiments of the present invention described in the above-described "exemplary methods" section of the present specification. For example, the processing unit 1110 may perform steps S102 to S108 shown in fig. 1.
The storage unit 1120 may include a readable medium in the form of a volatile storage unit, such as a Random Access Memory (RAM) 11201 and/or a cache memory 11202, and may further include a Read Only Memory (ROM) 11203.
Storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The bus 1130 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1200 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1100, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1150. Also, the electronic device 1100 can communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, through the network adapter 1160. As shown, the network adapter 1160 communicates with other modules of the electronic device 1100 via the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. A mapping method, comprising:
acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size;
performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively;
performing superposition processing on the product-processed first pixel block and second pixel block to obtain a candidate map;
converting the candidate map into a gray map;
determining, based on a gray extraction parameter, the pixel gray values corresponding to the color channels in the map to be generated as target pixel gray values;
and determining a partial map corresponding to the gray extraction parameter according to the gray value of each pixel in the gray map and the target pixel gray values, and generating a target map according to the partial map.
2. The mapping method according to claim 1, wherein performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively comprises:
determining a separation distance corresponding to the first pixel block;
adjusting the distance between adjacent pixels in the first pixel block based on the separation distance to obtain an adjusted first pixel block;
and performing a product processing process on the adjusted first pixel block and the second pixel block with the corresponding color disturbance parameters respectively.
3. The mapping method according to claim 1, wherein the color disturbance parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and performing a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively comprises:
performing shift processing on each pixel in the first pixel block according to the color diffusion parameter;
performing product processing on the shifted first pixel block, the color variation parameter, and the color key parameter;
and multiplying the second pixel block by the color diffusion parameter.
4. The mapping method according to any one of claims 1 to 3, wherein performing superposition processing on the product-processed first pixel block and second pixel block to obtain a candidate map comprises:
performing superposition processing on the product-processed first pixel block and second pixel block to obtain an intermediate pixel block;
and performing preset pixel doubling processing on the superposed intermediate pixel block to obtain a candidate map.
5. The mapping method according to claim 1, wherein determining a partial map corresponding to the gray extraction parameter according to the gray value of each pixel in the gray map and the target pixel gray values, and generating a target map according to the partial map, comprises:
determining the pixel gray values of the color channels in the gray map as comparison pixel gray values;
respectively calculating, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value;
and determining, as the target map, the partial map corresponding to the comparison pixel gray values whose gray differences fall within a preset difference range.
6. The mapping method according to claim 1, further comprising:
performing a non-uniform blurring process on the gray map to obtain a noise-removed gray map.
7. A mapping apparatus, comprising:
a pixel block acquisition module, configured to acquire a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size;
a pixel block processing module, configured to perform a product processing process on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively;
a superposition processing module, configured to perform a superposition processing process on the product-processed first pixel block and second pixel block to obtain a candidate map;
and a map generation module, configured to convert the candidate map into a gray map; determine, based on a gray extraction parameter, the pixel gray values corresponding to the color channels in the map to be generated as target pixel gray values; and determine a partial map corresponding to the gray extraction parameter according to the gray value of each pixel in the gray map and the target pixel gray values, and generate a target map according to the partial map.
8. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the mapping method according to any one of claims 1 to 6.
9. An electronic device, comprising:
one or more processors;
and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the mapping method according to any one of claims 1 to 6.
CN202110390583.3A 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus Active CN112915544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110390583.3A CN112915544B (en) 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110390583.3A CN112915544B (en) 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus

Publications (2)

Publication Number Publication Date
CN112915544A (en) 2021-06-08
CN112915544B (en) 2024-05-28

Family

ID=76174126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110390583.3A Active CN112915544B (en) 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus

Country Status (1)

Country Link
CN (1) CN112915544B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105719327A (en) * 2016-02-29 2016-06-29 北京中邮云天科技有限公司 Art stylization image processing method
CN109472839A (en) * 2018-10-26 2019-03-15 Oppo广东移动通信有限公司 Image generating method and device, computer equipment and computer storage medium
CN109753892A (en) * 2018-12-18 2019-05-14 广州市百果园信息技术有限公司 Generation method, device, computer storage medium and the terminal of face wrinkle
CN111784814A (en) * 2020-07-16 2020-10-16 网易(杭州)网络有限公司 Virtual character skin adjusting method and device
CN112053419A (en) * 2020-09-21 2020-12-08 长沙景嘉微电子股份有限公司 Image superposition processing method and device, storage medium and electronic device
CN112516595A (en) * 2020-12-15 2021-03-19 完美世界(北京)软件科技发展有限公司 Magma rendering method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10127887B2 (en) * 2015-01-14 2018-11-13 Intel Corporation Acceleration of color conversion


Also Published As

Publication number Publication date
CN112915544A (en) 2021-06-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant