CN112915544A - Map making method, map making device, storage medium, and electronic device - Google Patents


Info

Publication number
CN112915544A
CN112915544A (application CN202110390583.3A)
Authority
CN
China
Prior art keywords
pixel block
gray
pixel
map
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110390583.3A
Other languages
Chinese (zh)
Other versions
CN112915544B (en)
Inventor
宁光普
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110390583.3A priority Critical patent/CN112915544B/en
Publication of CN112915544A publication Critical patent/CN112915544A/en
Application granted granted Critical
Publication of CN112915544B publication Critical patent/CN112915544B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/001: Texturing; Colouring; Generation of texture or colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/60: Editing figures and text; Combining figures or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides a map making method, a map making apparatus, a computer-readable storage medium, and an electronic device, and relates to the field of image processing. The map making method comprises the following steps: acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; multiplying the first pixel block and the second pixel block by their respective color disturbance parameters; superimposing the multiplied first and second pixel blocks to obtain a candidate map; and determining, from the grayscale map corresponding to the candidate map, a partial map corresponding to a grayscale extraction parameter, and generating a target map from the partial map. The method improves the intelligence and automation of halation-color map making and makes the color effects of the maps more uniform.

Description

Map making method, map making device, storage medium, and electronic device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method and an apparatus for creating a map, a computer-readable storage medium, and an electronic device.
Background
At present, when creating stylized game maps, developers usually rely on the aesthetic sense of artists or draw the maps with software such as Photoshop.
Stylized image drawing refers to a coarse-to-fine, layer-by-layer drawing process that simulates the image texture levels a painter builds up in his or her mind: a rough outline is first laid down with a large brush and large harmonious color blocks, and the image is then refined step by step with finer brushes. With existing methods for producing stylized game maps, however, developers cannot efficiently produce large numbers of stylized game maps, and the colors and brush-stroke effects of the stylized maps in a game are not uniform.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a map making method, a map making apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problems caused by the limitations and defects of the related art, namely that halation-color map making cannot be made intelligent and automatic and that the color effects of the maps cannot be unified.
According to a first aspect of the present disclosure, there is provided a method of making a map, comprising: acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; performing a product processing process on the first pixel block and the second pixel block and the corresponding color disturbance parameters respectively; performing a superposition processing process on the first pixel block and the second pixel block after the product processing to obtain a candidate map; and determining a partial map corresponding to the gray extraction parameter from the gray maps corresponding to the candidate maps, and generating a target map according to the partial map.
According to a second aspect of the present disclosure, there is provided a map making apparatus, comprising: a pixel block acquisition module configured to acquire a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size; a pixel block processing module configured to multiply the first pixel block and the second pixel block by their respective color disturbance parameters; a superposition processing module configured to superimpose the multiplied first and second pixel blocks to obtain a candidate map; and a map generating module configured to determine, from the grayscale map corresponding to the candidate map, a partial map corresponding to a grayscale extraction parameter and to generate a target map from the partial map.
Optionally, the pixel block processing module may be configured to: determine the spacing distance corresponding to the first pixel block; adjust the distance between adjacent pixels in the first pixel block based on the spacing distance to obtain an adjusted first pixel block; and multiply the adjusted first pixel block and the second pixel block by their respective color disturbance parameters.
Optionally, the color disturbance parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and the pixel block processing module may be configured to: shift each pixel in the first pixel block according to the color diffusion parameter; multiply the shifted first pixel block by the color variation parameter and the color key parameter; and multiply the second pixel block by the color diffusion parameter.
Optionally, the superposition processing module may be configured to: superimpose the multiplied first and second pixel blocks to obtain an intermediate pixel block; and perform a preset pixel doubling process on the superimposed intermediate pixel block to obtain a candidate map.
Optionally, the map generation module may be configured to: convert the candidate map into a grayscale map; determine, based on the grayscale extraction parameter, the pixel grayscale value corresponding to each color channel in the map to be generated as a target pixel grayscale value; and determine, from the grayscale value of each pixel in the grayscale map and the target pixel grayscale value, the partial map corresponding to the grayscale extraction parameter, and generate the target map from the partial map.
Optionally, the map generation module may be further configured to: determine the pixel grayscale value of each color channel in the grayscale map as a comparison pixel grayscale value; calculate, for each color channel, the grayscale difference between the comparison pixel grayscale value and the target pixel grayscale value; and determine, as the target map, the partial map corresponding to the comparison pixel grayscale values whose grayscale difference falls within a preset difference range.
Optionally, the map making apparatus may further include a map processing module, which may be configured to perform a non-uniform blurring process on the grayscale map to obtain a grayscale map with noise removed.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the map making method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the map making method described above.
Exemplary embodiments of the present disclosure have the following advantageous effects:
In the technical solutions provided in some embodiments of the present disclosure, a first pixel block and a second pixel block are first obtained; the first pixel block and the second pixel block are then multiplied by their respective color disturbance parameters; the multiplied first and second pixel blocks are superimposed to obtain a candidate map; and finally, a partial map corresponding to a grayscale extraction parameter is determined from the grayscale map corresponding to the candidate map, and a target map is generated from the partial map. On the one hand, the method determines the candidate map by multiplying and superimposing the first and second pixel blocks, determines the partial map corresponding to the grayscale extraction parameter from the grayscale map corresponding to the candidate map, and generates the target map. This avoids depending on artists to make the maps, improves the intelligence and automation of halation-color map making, reduces map making cost, and increases map making yield. On the other hand, by determining the partial map corresponding to the grayscale extraction parameter from the grayscale map of the candidate map and generating the target map, the method avoids the divergent color effects of maps made by people with different aesthetics and makes the color effects of the maps more uniform.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 schematically illustrates a flow chart of a map making method according to an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a node diagram of the node editing function for an actual pixel block;
FIG. 3 schematically illustrates a node diagram for generating random colors for the first pixel block;
FIG. 4 schematically illustrates a node diagram for generating pixel blocks of different colors in UV coordinates;
FIG. 5 schematically illustrates a node diagram for generating random colors for the second pixel block;
FIG. 6 schematically illustrates a diagram of candidate maps according to an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a node diagram for generating an alpha map according to an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a schematic view of an enlarged alpha map according to an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a diagram of a texture ball corresponding to an alpha map according to an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a block diagram of a map making apparatus according to an exemplary embodiment of the present disclosure;
FIG. 11 schematically shows a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the present disclosure, the terms "comprises" and "comprising" are used in an open-ended fashion, and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc. In addition, the terms "first" and "second" used in the present disclosure are for the purpose of distinction only and should not be construed as a limitation of the present disclosure.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
In the process of making a game scene, stylized maps drawn by artists are often used. The artists may draw the stylized maps manually or with software such as Photoshop.
Stylized image drawing specifically refers to simulating the drawing process of a stylized image with a computer and generating a corresponding stylized image from an original input image. The conventional image stylization method is mainly a layered, layer-by-layer drawing method. Its main idea is to simulate the coarse-to-fine image texture levels that a painter builds up in his or her mind: a rough outline is usually drawn first with a large brush and large harmonious color blocks, and the image is then refined step by step with finer brushes, in a coarse-to-fine, layer-by-layer process.
However, because different artists have different aesthetic levels, the colors and brush-stroke effects of the drawn stylized maps are uneven, and the effects cannot be unified.
To address this problem, the present disclosure provides a method for making a map.
It should be noted that, in the exemplary embodiments of the present disclosure, the following map making method may be implemented by a terminal device (e.g., a mobile phone, a tablet, a personal computer, etc.); that is, the steps of the map making method may be executed by the terminal device, in which case the map making apparatus may be configured in the terminal device.
Hereinafter, each step of the map making method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
Fig. 1 schematically shows a flowchart of a map making method according to an exemplary embodiment of the present disclosure. In the following description, a terminal device is taken as the execution subject. Referring to fig. 1, the map making method may specifically include the following steps:
s102, acquiring a first pixel block and a second pixel block.
In an exemplary embodiment of the present disclosure, the first pixel block and the second pixel block may be obtained by setting a pixel size in computer software. The first pixel block and the second pixel block have the same pixel size. The present disclosure may use the Pixel Processor node editing function in the texture production software Substance Designer to obtain the pixel blocks. The terminal device may set a first pixel block and a second pixel block of 8 × 8 pixels. As shown in fig. 2, the Pixel Processor node editing function of the present disclosure sets a pixel block of 8 × 8 pixels in Substance Designer.
The present disclosure may display the first and second pixel blocks in UV coordinates, and determine the position of each pixel in the pixel blocks.
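The acquisition step can be illustrated with a minimal Python sketch (this is not the patent's Substance Designer graph; the function name, seeds, and the use of random RGB values are illustrative assumptions):

```python
import random

def make_pixel_block(size=8, seed=None):
    """Generate a size x size block of random RGB pixels, each channel in [0, 1].

    Loosely analogous to sampling a Pixel Processor output; 'seed' exists only
    so this sketch is reproducible.
    """
    rng = random.Random(seed)
    return [[(rng.random(), rng.random(), rng.random())
             for _ in range(size)] for _ in range(size)]

# Two same-size blocks with different initial data, as the method requires.
first_block = make_pixel_block(8, seed=1)
second_block = make_pixel_block(8, seed=2)
```

Indexing the nested lists as `block[v][u]` plays the role of the UV-coordinate lookup described above.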
S104, multiply the first pixel block and the second pixel block by their respective color disturbance parameters.
The color disturbance parameters may include a color diffusion parameter, a color variation parameter, and a color key parameter. The value range of each color disturbance parameter may be set to [0, 1]. The color diffusion parameter may be used to control the diffusion of color and its range of influence. The color variation parameter may be used to control the degree of color variation. The color key parameter may be used to control the tendency of the base color.
The color diffusion parameter, the color variation parameter, and the color key parameter are configurable parameters and can be adjusted according to the drawing requirements of the actual map, which avoids different people producing maps with different color effects and improves the uniformity of the map colors. Meanwhile, the target map is obtained from the first and second pixel blocks, which have the same size but different initial data; this improves the intelligence and automation of halation-color map making, reduces map making cost, and increases map making yield.
As shown in fig. 3, in Substance Designer, the color diffusion parameter may be configured with a color diffusion node 302 (Diffusion), which can limit the diffusion range of the random values in a pixel block; the color variation parameter may be configured with a color variation node 304 (Color Variation); and the color key parameter may be configured with a color key node 306 (Basecolor).
The color key parameters may include the color parameters of the three color channels Red (R), Green (G), and Blue (B) in the pixel block.
The present disclosure operates on the three color channels Red (R), Green (G), and Blue (B), where each RGB channel is based on a 0-1 grayscale and various colors are produced after superposition. The color perturbation interval of each pixel block may be a random value in [-1, 1]. The color diffusion parameter may be generated by the pseudo-random function corresponding to pseudo-random function node 308 (GlobalRandom). The pseudo-random function node 308 may generate a floating-point random value between 0 and a preset input value.
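The mapping from a pseudo-random value to a color perturbation in [-1, 1], scaled by the diffusion parameter, can be sketched as follows (a hedged Python illustration, not the GlobalRandom node itself; the function name is invented for this example):

```python
import random

def perturbation_value(rng, diffusion):
    """Map a pseudo-random float in [0, 1) to a perturbation in [-1, 1],
    then scale it by the color diffusion parameter (itself in [0, 1]),
    so the perturbation stays within [-diffusion, diffusion]."""
    return (2.0 * rng.random() - 1.0) * diffusion

rng = random.Random(42)
samples = [perturbation_value(rng, diffusion=0.5) for _ in range(1000)]
```

With `diffusion=0.5`, every sample lands in [-0.5, 0.5], matching the idea that the diffusion parameter limits the diffusion range of the random values.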
According to an exemplary embodiment of the present disclosure, the terminal device may first determine the spacing distance corresponding to the first pixel block; then adjust the distance between adjacent pixels in the first pixel block based on the spacing distance to obtain an adjusted first pixel block; and then multiply the adjusted first pixel block and the second pixel block by their respective color disturbance parameters.
After the terminal device obtains the first pixel block, it can determine the size of the first pixel block. The spacing distance may be used to separate the pixels in the first pixel block. For example, if the size of the first pixel block is 8 × 8 pixels, the terminal device may determine a spacing distance greater than 8 for it; here the spacing distance may be set to 16.
As shown in fig. 4, in the Substance Designer interface, the horizontal pixel position of the first pixel block in its position node 401 ($pos) is multiplied by the spacing distance 16, and the vertical pixel coordinate is multiplied by twice the spacing distance 16, so that adjacent pixels in the first pixel block are spaced apart.
In UV coordinates, the present disclosure may separate adjacent pixels in the first pixel block along each UV coordinate axis. Any spacing distance that separates adjacent pixels of the first pixel block falls within the scope of the present disclosure. The spacing distances between adjacent pixels in the first pixel block may be set to the same value or to different values. Separating adjacent pixels in the first pixel block avoids adjacent pixels of the same color and yields a first pixel block containing pixels of different colors. As shown in fig. 4, X may represent the U coordinate axis and Y the V coordinate axis in UV coordinates.
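The spacing step above can be sketched in Python (an illustrative reconstruction of the $pos arithmetic, assuming the factors from the example: horizontal positions scaled by the spacing, vertical positions by twice the spacing):

```python
def space_positions(size=8, spacing=16):
    """Scale pixel positions so adjacent pixels of the first block are
    separated: the horizontal (U) coordinate by 'spacing' and the
    vertical (V) coordinate by twice 'spacing'. The exact factors follow
    the worked example in the text; other values also work."""
    return [[(u * spacing, v * 2 * spacing) for u in range(size)]
            for v in range(size)]

positions = space_positions()  # positions[v][u] -> (scaled U, scaled V)
```

Neighbouring pixels end up 16 apart horizontally and 32 apart vertically, so no two adjacent samples fall on the same scaled coordinate.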
According to another embodiment of the present disclosure, the terminal device may first shift each pixel in the first pixel block according to the color diffusion parameter; then multiply the shifted first pixel block by the color variation parameter and the color key parameter; and subsequently multiply the second pixel block by the color diffusion parameter.
The terminal device can randomly shift each pixel in the first pixel block based on the color diffusion parameter, and can invert the color of the pixels.
As shown in fig. 3, in the Substance Designer interface, the terminal device first adjusts the distance between adjacent pixels in the first pixel block and multiplies the result by the random value generating node 314; then, according to the color diffusion parameter corresponding to color diffusion node 302, it randomly shifts each pixel in the first pixel block; next, based on the inverse color node 310 (One Minus), it inverts the color of the randomly shifted first pixel block; and then it multiplies the first pixel block by the color variation parameter corresponding to color variation node 304 and the color key parameter corresponding to color key node 306, obtaining a first pixel block that varies within [-1, 1].
In randomly shifting each pixel in the first pixel block according to the color diffusion parameter corresponding to color diffusion node 302, the terminal device may further perform the following step:
First, a product operation is performed on the color diffusion parameter corresponding to color diffusion node 302 and the pseudo-random function corresponding to pseudo-random function node 308 to obtain the value range of the color diffusion parameter.
The Substance Designer interface includes a fraction node 312 (Frac), which may be used to return the fractional part of a random value.
After the size of the first pixel block is determined, a product operation is performed on the RGB color key parameters, the color diffusion parameter, and the pseudo-random function for each pixel in the first pixel block, yielding the three RGB color channel parameters of each pixel after the product processing. The three RGB color channel parameters may also be converted, by an RGB_HSL mode conversion node (HSL), into parameters expressed in terms of hue, saturation, and lightness. The present disclosure may likewise convert the HSL color channel parameters back into red, green, and blue parameters through an HSL_RGB mode conversion node.
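The first-block processing chain (random shift by the diffusion parameter, one-minus inversion, then multiplication by the variation and key parameters) can be sketched as below. This is a hedged Python approximation of the node graph, not the patent's implementation; the function name, clamping to [0, 1], and the sample parameter values are assumptions:

```python
import random

def perturb_first_block(block, diffusion, variation, key, seed=0):
    """Sketch of the first-block chain: randomly shift each channel by at
    most 'diffusion', clamp to [0, 1], invert (One Minus), then multiply
    by the variation parameter and the per-channel key parameters."""
    rng = random.Random(seed)
    out = []
    for row in block:
        new_row = []
        for r, g, b in row:
            shifted = [min(1.0, max(0.0, c + (2 * rng.random() - 1) * diffusion))
                       for c in (r, g, b)]
            inverted = [1.0 - c for c in shifted]  # the One Minus step
            new_row.append(tuple(c * variation * k
                                 for c, k in zip(inverted, key)))
        out.append(new_row)
    return out

block = [[(0.2, 0.5, 0.8)] * 8 for _ in range(8)]
result = perturb_first_block(block, diffusion=0.3, variation=0.8,
                             key=(1.0, 0.6, 0.4))
```

Because every factor stays in [0, 1] after clamping, the output channels remain in [0, 1] as well.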
As shown in fig. 5, in the Substance Designer interface, the terminal device may first establish a random value between 0 and 1 in the UV coordinate direction and divide it by the 8 × 8 pixels corresponding to pixel block size node ($size) 502 to obtain a second pixel block; the second pixel block is then multiplied by the color diffusion parameter corresponding to color diffusion node 302.
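The second-block path is simpler and can be sketched as follows (again an illustrative Python stand-in for the node graph; the function name and seed are invented, and a single scalar per pixel is used for brevity):

```python
import random

def make_second_block(size=8, diffusion=0.5, seed=3):
    """Sketch of the second-block path: one random value per pixel,
    divided by the block size ($size) and multiplied by the color
    diffusion parameter."""
    rng = random.Random(seed)
    return [[rng.random() / size * diffusion for _ in range(size)]
            for _ in range(size)]

second = make_second_block()
```

With `size=8` and `diffusion=0.5`, each value lies in [0, 0.0625), so the second block contributes only a small perturbation when later superimposed on the first block.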
S106, superimpose the multiplied first pixel block and second pixel block to obtain a candidate map.
After obtaining the multiplied first and second pixel blocks, the terminal device may first superimpose them to obtain an intermediate pixel block; a preset pixel doubling process is then performed on the superimposed intermediate pixel block to obtain a candidate map.
Specifically, the present disclosure may superimpose each pixel in the multiplied second pixel block with the pixel at the corresponding position in the first pixel block to obtain an intermediate pixel block. That is, the color of each pixel in the intermediate pixel block is obtained by superimposing the color of the corresponding pixel in the second pixel block with the color of the pixel at the corresponding position in the first pixel block.
In an exemplary embodiment of the present disclosure, the size of the intermediate pixel block may be the same as the sizes of the first and second pixel blocks.
In an exemplary embodiment, the present disclosure determines the candidate map by multiplying and superimposing the first and second pixel blocks, determines the partial map corresponding to the grayscale extraction parameter from the grayscale map corresponding to the candidate map, and generates the target map from the partial map. This avoids depending on artists to make the maps and improves the intelligence and automation of halation-color map making.
As shown in fig. 5, the terminal device first superimposes the multiplied second pixel block with the pixels at the positions given by position node 401 of the first pixel block to obtain an 8 × 8 intermediate pixel block. The terminal device then performs a preset pixel doubling process on the superimposed intermediate pixel block to obtain a candidate map. The size of the candidate map is 1024 × 1024 pixels, as shown in fig. 6.
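The superposition and doubling steps can be sketched together in Python. This is an illustrative reconstruction (per-pixel addition clamped to [0, 1] and nearest-neighbour repetition are assumptions; the patent only says "superposition" and "pixel doubling"). A factor of 128 would turn an 8 × 8 block into the 1024 × 1024 map of the example; a small factor is used here to keep the sketch cheap:

```python
def overlay_and_upscale(first, second, factor=128):
    """Add each pixel of the second block to the pixel at the same position
    in the first block (clamped to [0, 1]) to get the intermediate block,
    then upscale it by nearest-neighbour repetition."""
    size = len(first)
    middle = [[tuple(min(1.0, a + b) for a, b in zip(first[y][x], second[y][x]))
               for x in range(size)] for y in range(size)]
    return [[middle[y // factor][x // factor]
             for x in range(size * factor)] for y in range(size * factor)]

first = [[(0.1, 0.2, 0.3)] * 8 for _ in range(8)]
second = [[(0.05, 0.05, 0.05)] * 8 for _ in range(8)]
candidate = overlay_and_upscale(first, second, factor=4)  # 8x8 -> 32x32
```

Each output pixel is just the superimposed intermediate pixel repeated, so the candidate map preserves the block structure at a higher resolution.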
S108, determine, from the grayscale map corresponding to the candidate map, the partial map corresponding to the grayscale extraction parameter, and generate the target map from the partial map.
In an exemplary embodiment of the present disclosure, the grayscale map may be obtained by the terminal device by converting the candidate map through a grayscale conversion node. As shown in fig. 7, the terminal device converts the candidate map of 1024 × 1024 pixels into a grayscale map through grayscale conversion node 704 (Grayscale Conversion).
The grayscale extraction parameter may be the parameter corresponding to the color of the map to be generated, and may further include the pixel grayscale value corresponding to each color channel in the map to be generated. The present disclosure may determine the grayscale extraction parameter according to the needs of the actual scene.
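The grayscale conversion can be sketched with the common luminance weighting; note the patent does not specify which formula its Grayscale Conversion node uses, so the (0.299, 0.587, 0.114) weights here are an assumption:

```python
def to_grayscale(rgb_map):
    """Convert an RGB map to a grayscale map using the common Rec. 601
    luminance weights (an assumption; the exact node formula is unstated)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_map]

gray = to_grayscale([[(1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]])
```

White maps to a grayscale value of 1.0 and black to 0.0, as expected of any reasonable conversion.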
According to an exemplary embodiment of the present disclosure, the terminal device may first convert the candidate map into a grayscale map; then determine, based on the grayscale extraction parameter, the pixel grayscale value corresponding to each color channel in the map to be generated as the target pixel grayscale value; and then determine, from the grayscale value of each pixel in the grayscale map and the target pixel grayscale value, the partial map corresponding to the grayscale extraction parameter, and generate the target map from the partial map.
The target map may contain only the color pixel blocks corresponding to the target pixel grayscale values, and its size may be the same as that of the candidate map. The target map may be a color map determined according to the actual color requirements, or a map obtained by doubling the pixel blocks corresponding to the partial map.
According to another embodiment of the present disclosure, the terminal device may first determine the pixel grayscale value of each color channel in the grayscale map as a comparison pixel grayscale value; then calculate, for each color channel, the grayscale difference between the comparison pixel grayscale value and the target pixel grayscale value; and then determine, as the target map, the partial map corresponding to the comparison pixel grayscale values whose grayscale difference falls within the difference range.
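The difference-range selection can be sketched as a simple threshold on the grayscale map. This Python illustration is an assumption about the selection logic (the patent's Histogram Select node presumably behaves similarly, but the function name and the masking-to-zero convention are invented here):

```python
def select_partial_map(gray_map, target, tolerance=0.05):
    """Keep pixels whose grayscale difference from the target value falls
    within 'tolerance' (standing in for the preset difference range);
    all other pixels are zeroed out."""
    return [[g if abs(g - target) <= tolerance else 0.0 for g in row]
            for row in gray_map]

mask = select_partial_map([[0.10, 0.50, 0.52]], target=0.5, tolerance=0.05)
```

Only the pixels near the target grayscale value survive, giving the partial map from which the target map is generated.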
After obtaining the grayscale map, the terminal device may further perform non-uniform blurring on the grayscale map to remove noise. The non-uniform blurring smooths the transitions between similar colors in the target map.
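The disclosure does not define the non-uniform blur. One plausible reading, sketched below purely as an assumption, is a per-pixel blend between the original grayscale map and a blurred copy, with a weight map controlling how strongly each pixel is blurred (a simple 3x3 box blur stands in for the actual filter).

```python
import numpy as np

def box_blur(gray):
    # 3x3 mean filter with edge replication, implemented via padding.
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    out = np.zeros_like(gray)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0

def non_uniform_blur(gray, weight):
    # Blend original and blurred copies per pixel: weight 0 keeps the
    # original, weight 1 applies the full blur.
    return gray * (1.0 - weight) + box_blur(gray) * weight

gray = np.zeros((8, 8))
gray[4, 4] = 1.0                  # an isolated noise pixel
weight = np.ones_like(gray)       # full-strength blur everywhere
smoothed = non_uniform_blur(gray, weight)
```

The isolated noise spike is spread into its neighborhood (its value drops from 1.0 to 1/9), which is the noise-removal effect described above.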
The comparison pixel gray value refers to the pixel gray value of each color channel in the grayscale map and is used for comparison with the target pixel gray value; the comparison pixel gray values may include the target pixel gray value. The terminal device may compare the comparison pixel gray values with the target pixel gray value, determine the target gray value from among the comparison pixel gray values, and further determine the partial map corresponding to the target gray value.
As shown in fig. 7, in the Substance Designer interface, when the terminal device obtains a candidate map, it first inputs the pixel block corresponding to the candidate map in the node editing function 702 into the grayscale conversion node 704 to obtain a grayscale map; a denoised grayscale map can then be obtained through a non-uniform blur node 706 (Non Uniform Blur). Next, the terminal device can determine, based on the grayscale extraction parameter, the pixel gray value corresponding to each color channel in the map to be generated as the target pixel gray value. Then, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value can be calculated, and the partial map corresponding to the comparison pixel gray values whose gray difference falls within the difference range can be determined as the target map.
The terminal device may also determine, from the grayscale map, the partial map corresponding to the target pixel gray value through a gray range selection node 708 (Histogram Select), determine the foreground layer region shape of the map to be generated through a shape node 710 (Shape) and a grayscale blur node 712 (Blur HQ Grayscale), and then input the partial map (serving as the background layer) together with the foreground layer region shape into a blend node 714 (Blend) to obtain the target map 716 (Alpha map). The enlarged Alpha map is shown in fig. 8.
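The gray range selection step can be illustrated with a hedged, Histogram Select-style sketch (the parameter names `position` and `range_` are assumptions, not the actual node's interface): gray values near the selected position map toward white in the output mask, and everything else maps toward black.

```python
import numpy as np

def histogram_select(gray, position, range_):
    # Distance of each gray value from the selected position.
    distance = np.abs(gray - position)
    # Values inside the range survive; the falloff is linear here
    # (an assumption -- the actual node's falloff is not specified).
    return np.clip(1.0 - distance / max(range_, 1e-6), 0.0, 1.0)

gray = np.array([0.2, 0.5, 0.9])
mask = histogram_select(gray, position=0.5, range_=0.1)
```

Only the gray value at the selected position (0.5) produces a white mask value; the mask can then serve as the partial map fed into the blend node.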
Based on the generated Alpha map, the terminal device may further obtain a smart material composed of the colors of the Alpha map, as shown in fig. 9.
According to the method and the device, the partial map corresponding to the grayscale extraction parameter can be determined from the grayscale map corresponding to the candidate map, and the target map can be generated from the partial map. This prevents maps made by different art personnel from having inconsistent color effects, and improves the color consistency of the maps.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in an exemplary embodiment of the present disclosure, a map making apparatus is also provided.
Fig. 10 schematically shows a block diagram of a map making apparatus according to an exemplary embodiment of the present disclosure. Referring to fig. 10, a map making apparatus 1000 according to an exemplary embodiment of the present disclosure may include: a pixel block acquisition module 1001, a pixel block processing module 1003, an overlay processing module 1005, and a map generation module 1007.
The pixel block acquisition module 1001 is configured to acquire a first pixel block and a second pixel block, where the first pixel block and the second pixel block have the same size; the pixel block processing module 1003 is configured to perform a product processing procedure on the first pixel block and the second pixel block with the corresponding color perturbation parameters respectively; the overlay processing module 1005 is configured to perform a superposition processing procedure on the first pixel block and the second pixel block after the product processing to obtain a candidate map; and the map generation module 1007 is configured to determine a partial map corresponding to the grayscale extraction parameter from the grayscale map corresponding to the candidate map, and to generate a target map according to the partial map.
According to another embodiment of the present disclosure, the pixel block processing module 1003 may be configured to: determine a spacing distance corresponding to the first pixel block; adjust the distance between adjacent pixels in the first pixel block based on the spacing distance to obtain the adjusted first pixel block; and perform the product processing procedure on the adjusted first pixel block and the adjusted second pixel block with the corresponding color perturbation parameters respectively.
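One hedged reading of "adjusting the distance between adjacent pixels" (an illustration only; the disclosure does not define the operation) is to re-lay the pixel block on a larger grid so that a given number of empty pixels separates each pair of originally adjacent pixels.

```python
import numpy as np

def space_out(block, spacing):
    # Place each original pixel `spacing + 1` cells apart, filling the
    # gaps with zeros (the fill value is an assumption).
    h, w = block.shape[:2]
    step = spacing + 1
    spaced = np.zeros((h * step - spacing, w * step - spacing) + block.shape[2:],
                      dtype=block.dtype)
    spaced[::step, ::step] = block
    return spaced

block = np.array([[1, 2], [3, 4]])
spaced = space_out(block, spacing=1)
```

A 2 × 2 block with a spacing distance of 1 becomes a 3 × 3 block whose original pixels sit one empty cell apart.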
According to another embodiment of the present disclosure, the color perturbation parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and the pixel block processing module 1003 may be configured to: shift each pixel in the first pixel block according to the color diffusion parameter; perform product processing on the shifted first pixel block with the color variation parameter and the color key parameter; and perform product processing on the second pixel block with the color diffusion parameter.
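A minimal sketch of this perturbation step, under loud assumptions: the exact semantics of the diffusion, variation, and key parameters are not specified in the disclosure, so the shift is modeled with a plain wrap-around roll and the parameters as scalars.

```python
import numpy as np

def perturb(first, second, diffusion_shift, variation, color_key, diffusion):
    # Shift each pixel of the first block per the color diffusion
    # parameter (modeled as a wrap-around shift -- an assumption).
    shifted = np.roll(first, shift=diffusion_shift, axis=(0, 1))
    # Product processing: shifted first block x variation x key.
    first_out = shifted * variation * color_key
    # Product processing: second block x diffusion parameter.
    second_out = second * diffusion
    return first_out, second_out

first = np.ones((2, 2))
second = np.full((2, 2), 2.0)
f, s = perturb(first, second, diffusion_shift=(1, 0),
               variation=0.5, color_key=2.0, diffusion=0.25)
```

With a uniform first block, the shift is invisible and the product chain simply scales each block by its parameters.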
According to another embodiment of the present disclosure, the overlay processing module 1005 may be configured to: perform superposition processing on the first pixel block and the second pixel block after the product processing to obtain an intermediate pixel block; and perform preset pixel doubling processing on the intermediate pixel block after the superposition processing to obtain the candidate map.
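The superposition and doubling steps can be sketched as follows, with two stated assumptions: superposition is modeled as a clipped sum (the actual blend mode is not specified), and "preset pixel doubling" as tiling the intermediate block 2 × 2 to reach the candidate map size.

```python
import numpy as np

def overlay_and_double(first, second):
    # Superposition processing: clipped sum of the two processed blocks
    # (the actual blend mode is an assumption).
    intermediate = np.clip(first + second, 0.0, 1.0)
    # Preset pixel doubling: tile the intermediate block 2x2.
    return np.tile(intermediate, (2, 2))

first = np.full((2, 2), 0.4)
second = np.full((2, 2), 0.8)
candidate = overlay_and_double(first, second)
```

Two 2 × 2 blocks yield a 4 × 4 candidate map; the summed value 1.2 is clipped to 1.0 before tiling.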
According to another embodiment of the present disclosure, the map generation module 1007 may be configured to: convert the candidate map into a grayscale map; determine, based on the grayscale extraction parameter, the pixel gray value corresponding to each color channel in the map to be generated as the target pixel gray value; and determine the partial map corresponding to the grayscale extraction parameter according to the gray value of each pixel in the grayscale map and the target pixel gray value, and generate the target map according to the partial map.
According to another embodiment of the present disclosure, the map generation module 1007 may be further configured to: determine the pixel gray value of each color channel in the grayscale map as a comparison pixel gray value; calculate, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value; and determine, as the target map, the partial map corresponding to the comparison pixel gray values whose gray difference falls within a preset difference range.
According to another embodiment of the present disclosure, the map making apparatus 1000 may further include a map processing module, which may be configured to perform a non-uniform blurring process on the grayscale map to obtain a grayscale map with noise removed.
The details of each module/unit in the above-mentioned apparatus have been described in detail in the embodiments of the method section, and thus are not described again.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present description, when the program product is run on the terminal device.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1100 according to this embodiment of the invention is described below with reference to fig. 11. The electronic device 1100 shown in fig. 11 is only an example and should not bring any limitations to the function and the scope of use of the embodiments of the present invention.
As shown in fig. 11, electronic device 1100 is embodied in the form of a general purpose computing device. The components of the electronic device 1100 may include, but are not limited to: the at least one processing unit 1110, the at least one memory unit 1120, a bus 1130 connecting different system components (including the memory unit 1120 and the processing unit 1110), and a display unit 1140.
Wherein the storage unit stores program code that is executable by the processing unit 1110 to cause the processing unit 1110 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 1110 may perform steps S102 to S108 as shown in fig. 1.
The storage unit 1120 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM)11201 and/or a cache memory unit 11202, and may further include a read only memory unit (ROM) 11203.
Storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1130 may be representative of one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1200 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to communicate with the electronic device 1100, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 1150. Also, the electronic device 1100 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1160. As shown, the network adapter 1160 communicates with the other modules of the electronic device 1100 over the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. A method for making a map is characterized by comprising the following steps:
acquiring a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size;
performing a product processing procedure on the first pixel block and the second pixel block and corresponding color disturbance parameters respectively;
performing a superposition processing process on the first pixel block and the second pixel block after the product processing to obtain a candidate map;
and determining a partial map corresponding to the gray extraction parameter from the gray maps corresponding to the candidate maps, and generating a target map according to the partial map.
2. The map making method according to claim 1, wherein performing the product processing procedure on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively comprises:
determining a spacing distance corresponding to the first pixel block;
adjusting the distance between adjacent pixels in the first pixel block based on the spacing distance to obtain the adjusted first pixel block;
and respectively executing a product processing process on the adjusted first pixel block and the adjusted second pixel block and the corresponding color disturbance parameters.
3. The map making method according to claim 1, wherein the color disturbance parameters include a color diffusion parameter, a color variation parameter, and a color key parameter, and performing the product processing procedure on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively comprises:
performing shift processing on each pixel in the first pixel block according to the color diffusion parameter;
performing product processing on the shifted first pixel block, the color change parameter and the color key parameter;
and performing product processing on the second pixel block and the color diffusion parameter.
4. The map making method according to any one of claims 1 to 3, wherein performing the superposition processing procedure on the first pixel block and the second pixel block after the product processing to obtain the candidate map comprises:
performing superposition processing on the first pixel block and the second pixel block after the product processing to obtain an intermediate pixel block;
and performing preset pixel doubling processing on the intermediate pixel block after the superposition processing to obtain the candidate map.
5. The map making method according to any one of claims 1 to 3, wherein determining a partial map corresponding to a grayscale extraction parameter from the grayscale map corresponding to the candidate map, and generating a target map according to the partial map comprises:
converting the candidate map into a grayscale map;
determining, based on the grayscale extraction parameter, the pixel gray value corresponding to each color channel in the map to be generated as a target pixel gray value;
and determining the partial map corresponding to the grayscale extraction parameter according to the gray value of each pixel in the grayscale map and the target pixel gray value, and generating the target map according to the partial map.
6. The map making method according to claim 5, wherein determining the partial map corresponding to the grayscale extraction parameter according to the gray value of each pixel in the grayscale map and the target pixel gray value, and generating the target map according to the partial map comprises:
determining the pixel gray value of each color channel in the grayscale map as a comparison pixel gray value;
calculating, for each color channel, the gray difference between the comparison pixel gray value and the target pixel gray value;
and determining, as the target map, the partial map corresponding to the comparison pixel gray values whose gray difference falls within a preset difference range.
7. The map making method of claim 5, further comprising:
performing a non-uniform blurring process on the grayscale map to obtain a grayscale map with noise removed.
8. A map making apparatus, characterized by comprising:
a pixel block acquisition module, configured to acquire a first pixel block and a second pixel block, wherein the first pixel block and the second pixel block have the same size;
a pixel block processing module, configured to perform a product processing procedure on the first pixel block and the second pixel block with the corresponding color disturbance parameters respectively;
an overlay processing module, configured to perform a superposition processing procedure on the first pixel block and the second pixel block after the product processing to obtain a candidate map;
and a map generation module, configured to determine a partial map corresponding to the grayscale extraction parameter from the grayscale map corresponding to the candidate map, and to generate a target map according to the partial map.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the map making method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the map making method of any one of claims 1 to 7.
CN202110390583.3A 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus Active CN112915544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110390583.3A CN112915544B (en) 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus

Publications (2)

Publication Number Publication Date
CN112915544A true CN112915544A (en) 2021-06-08
CN112915544B CN112915544B (en) 2024-05-28

Family

ID=76174126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110390583.3A Active CN112915544B (en) 2021-04-12 2021-04-12 Mapping method, mapping device, storage medium, and electronic apparatus

Country Status (1)

Country Link
CN (1) CN112915544B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105719327A (en) * 2016-02-29 2016-06-29 北京中邮云天科技有限公司 Art stylization image processing method
US20160203790A1 (en) * 2015-01-14 2016-07-14 Intel Corporation Acceleration of color conversion
CN109472839A (en) * 2018-10-26 2019-03-15 Oppo广东移动通信有限公司 Image generating method and device, computer equipment and computer storage medium
CN109753892A (en) * 2018-12-18 2019-05-14 广州市百果园信息技术有限公司 Generation method, device, computer storage medium and the terminal of face wrinkle
CN111784814A (en) * 2020-07-16 2020-10-16 网易(杭州)网络有限公司 Virtual character skin adjusting method and device
CN112053419A (en) * 2020-09-21 2020-12-08 长沙景嘉微电子股份有限公司 Image superposition processing method and device, storage medium and electronic device
CN112516595A (en) * 2020-12-15 2021-03-19 完美世界(北京)软件科技发展有限公司 Magma rendering method, device, equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant