CN115294243A - Automatic coloring method and device for line art picture and storage medium - Google Patents

Automatic coloring method and device for line art picture and storage medium Download PDF

Info

Publication number
CN115294243A
CN115294243A CN202210825589.3A CN202210825589A CN115294243A CN 115294243 A CN115294243 A CN 115294243A CN 202210825589 A CN202210825589 A CN 202210825589A CN 115294243 A CN115294243 A CN 115294243A
Authority
CN
China
Prior art keywords
template
coloring
color
value
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210825589.3A
Other languages
Chinese (zh)
Inventor
戴玲娜
朱静洁
高飞
李鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Miaoji Technology Co ltd
Original Assignee
Hangzhou Miaoji Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Miaoji Technology Co ltd filed Critical Hangzhou Miaoji Technology Co ltd
Priority to CN202210825589.3A priority Critical patent/CN115294243A/en
Publication of CN115294243A publication Critical patent/CN115294243A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/40Filling a planar surface by adding surface attributes, e.g. colour or texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/001Texturing; Colouring; Generation of texture or colour

Abstract

The embodiment of the application discloses an automatic coloring method, a device and a storage medium for an artistic line painting, wherein the automatic coloring method for the artistic line painting comprises the following steps: obtaining a line drawing generated according to a photo input by a user or a line drawing directly input by the user, preprocessing the line drawing, converting the preprocessed line drawing into a gray scale image, and performing normalization processing to obtain a target image; judging whether to color in different areas based on the instruction of the user, if so, generating a corresponding analysis mask according to the picture, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step; acquiring a template converted into a first color space, and coloring the template, a target image and/or a target area by a coloring method according to the depth of the ground color of the template and the characteristics of line drawings; and fusing the colored template, the target image and/or the target area of the second color space to obtain a coloring result.

Description

Automatic coloring method and device for artistic line painting and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to an automatic coloring method and device for an art line painting and a storage medium.
Background
The line art painting is an important way for representing the art work, and has extremely high unique aesthetic value. Nowadays, people are increasingly pursuing line drawings, and are not limited to a single photo form, so that the line drawings are very important to be applied to various actual products, personalized wallpaper, head portrait, expression bags and the like.
In the technical aspect, a relatively common fusion method is a calculation process performed by setting a weight relationship between a template layout and a line drawing, such as an addweighted function mentioned in OpenCV in a Python library. At present, a method for fusing color images is mainly used for processing through line masks, but the method is suitable for the clear and obvious edges of a foreground image (lines), otherwise, the obtained result is hard edges and the whole image is not harmonious, and the line images cannot obtain complete mask representing the lines, so the method is not suitable for the line images with gray values. Another method is to use artificial PS, but this requires the user to have certain computer processing skills, and the most obvious problem of manual image trimming is that it is time-consuming and labor-consuming, and the obtained fusion result often loses part of the gray values in the line drawing. In the application of actual productization, most of the line coloring and fusing processes use the line after binarization, and although the method can realize color and texture replacement of line drawings and application to various templates, the huge problem that the gradation information on countless lines is lost by binarization is also greatly reduced, so that the quality of the finally processed picture is greatly reduced.
Based on HSV, a plurality of image processing works can be performed, and the prior art based on fusion of a gray level image and a color image in HSV space, such as remote sensing image fusion processing in HSV space, but the method is simply realized by replacing a V brightness space, and the applicable range is relatively limited; for example, in the color multi-focus image fusion in the HSV space, the contrast, the saturation, the tone change speed and the like of the image can be effectively improved. However, the existing method does not realize the method of line coloring and template fusion by performing linear operation in the HSV space. Generally speaking, there is no complete set of processing method for automatically coloring line drawings and adding templates for users to select, which can solve the above problems.
Disclosure of Invention
An object of the embodiments of the present application is to provide an automatic coloring method and apparatus for an art line drawing, and a storage medium, so as to solve the problem that in the prior art, there is no complete processing method for automatically coloring and adding templates to a line drawing, which can be selected by a user, and is applicable to the fusion of most templates and various line drawings, and can completely retain line information.
In order to achieve the above object, an embodiment of the present application provides an automatic coloring method for an art line painting, including the steps of: obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray scale image, and performing normalization processing to obtain a target image; judging whether to color the regions according to the user instruction, if so, generating a corresponding analysis mask according to the picture, extracting the region to be colored according to the analysis mask, and carrying out normalization processing to obtain a target region, otherwise, skipping the step; acquiring a template converted into a first color space, selecting a coloring method to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back to a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring; and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
Optionally, the solid-color coloring method includes:
when the lines of the target image or the target area are made to appear black on the template, the formula is used:
Figure BDA0003743845490000031
obtaining the value of a brightness channel;
when the lines of the target image or the target area are made to appear white on the template, the formula is used:
Figure BDA0003743845490000032
and
Figure BDA0003743845490000033
obtaining a value of a saturation channel and a value of a brightness channel;
wherein the content of the first and second substances,
Figure BDA0003743845490000034
refers to the value of the luminance channel obtained using the pure color coloring method,
Figure BDA0003743845490000035
denotes the value of the luminance channel of the stencil in HSV space, I gray Refers to a gray value of a line of the target image or the target region,
Figure BDA0003743845490000036
refers to the value of the saturation channel obtained by using the pure-color coloring method,
Figure BDA0003743845490000037
refers to the value of the saturation channel of the template in HSV space.
Optionally, the texture extraction type coloring method includes:
using the formula:
Figure BDA0003743845490000038
and
Figure BDA0003743845490000039
the value of the saturation channel and the value of the luminance channel are obtained,
wherein the content of the first and second substances,
Figure BDA00037438454900000310
refers to the value of the luminance channel obtained using the texture extraction-type coloring method,
Figure BDA00037438454900000311
refers to the value of the saturation channel obtained using the texture extraction-type coloring method,
Figure BDA00037438454900000312
refers to the value of the saturation channel of the template in HSV space, I gray The gray value of the line of the target image or the target area is indicated.
Optionally, the template adaptive coloring method includes:
using the formula:
Figure BDA00037438454900000313
and
Figure BDA00037438454900000314
obtaining the value of the saturation channel, the value of the hue channel and the value of the brightness channel,
wherein, 0<c<1,
Figure BDA00037438454900000315
Refers to the value of the hue channel obtained by using the template adaptive coloring method,
Figure BDA00037438454900000316
refers to the value of the luminance channel obtained by using the template adaptive coloring method,
Figure BDA00037438454900000317
finger using the stencil to adaptThe values of the saturation channel obtained by the color method,
Figure BDA00037438454900000318
refers to the value of the hue channel of the template in HSV space,
Figure BDA0003743845490000041
refers to the value of the saturation channel of the template in HSV space,
Figure BDA0003743845490000042
denotes the value of the brightness channel of the stencil in HSV space, I gray Refers to the line of the target image or the gray value of the target area.
Optionally, after performing the texture-extraction type coloring process and acquiring the template, the target image and/or the target area converted back to the second color space, the method further includes:
the coloring treatment is performed by a fused coloring method, which includes:
using the formula:
Figure BDA0003743845490000043
the result of the coloring is obtained and,
wherein the content of the first and second substances,
Figure BDA0003743845490000044
refers to the coloring result in the RGB space, I color The target image or the target region of the RBG space colored by the texture extraction-type coloring method, and T the module of the RGB space.
Optionally, the method of pre-processing comprises:
and cutting redundant white edges around the line painting. And zooming according to the length-width ratio of the cut line drawing so that the line drawing can adapt to the mask position of the template.
Optionally, the method for generating the corresponding parsing mask according to the photo includes:
and cutting the photo, and inputting the cut photo into a corresponding image segmentation network to obtain the multi-channel analysis mask.
Optionally, the method for fusing the colored template, the target image and/or the target region of the second color space includes:
fusing each target area after being colored in the subareas to obtain a fusion result, and fusing the fusion result with the colored lines of the target image and the template to obtain the coloring result.
In order to realize the above object, the present application further provides an automatic coloring device for the art line painting, including: a memory; and
a processor coupled to the memory, the processor configured to:
obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray scale image and carrying out normalization processing to obtain a target image;
judging whether to color in different areas or not based on an instruction of a user, if so, generating a corresponding analysis mask according to the picture, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step;
acquiring a template converted into a first color space, coloring the template, the target image and/or the target area by a coloring method according to the background color depth of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back into a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure-color coloring, texture extraction coloring and/or template adaptive coloring;
and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
To achieve the above object, the present application also provides a computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a machine, implements the steps of the method as described above.
The embodiment of the application has the following advantages:
1. the embodiment of the application provides an automatic coloring method for an art line drawing, which comprises the following steps: obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray image and carrying out normalization processing to obtain a target image; judging whether to color the subareas or not based on a user instruction, if so, generating a corresponding analysis mask according to the photo, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step; obtaining a template converted into a first color space, selecting a coloring method to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back to a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring; and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
By the method, different linear processing calculations in the HSV space can meet the requirements of users for freely changing line textures or colors, fusing the line textures or colors with the template in various texture modes, enabling the fusion of the line textures and the template to be more natural and retaining more line information. Therefore, the effect display of the online platform can be met, and the application requirements of the user on various personalized products of line drawings can also be met.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and that other embodiments can be derived from the drawings provided by those of ordinary skill in the art without inventive exercise.
Fig. 1 is a flowchart of an automatic coloring method for an art line painting according to an embodiment of the present application;
fig. 2 is a schematic flow chart of the line drawing and template fusion of the automatic coloring method for art line drawing provided by the embodiment of the present application;
fig. 3 is a schematic diagram illustrating an example of a pure-color coloring method for automatically coloring an art line painting, in which a line is white according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating an example of pure-color coloring for rendering black lines in an automatic coloring method for an art line drawing according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating an example of texture extraction type coloring in an automatic coloring method for an art line drawing according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating an example of template adaptive coloring in an automatic coloring method for an art line drawing according to an embodiment of the present application;
fig. 7 is a schematic diagram illustrating an example of fusion coloring in an automatic coloring method for an art line painting according to an embodiment of the present application;
fig. 8 is a coloring indication diagram of a target area of an automatic coloring method for an art line painting according to an embodiment of the present application;
fig. 9 is a block diagram of a module of an automatic coloring device for an art line painting according to an embodiment of the present application.
Detailed Description
The present disclosure is not intended to be limited to the particular embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In addition, the technical features mentioned in the different embodiments of the present application described below may be combined with each other as long as they do not conflict with each other.
An embodiment of the present application provides an automatic coloring method for line art drawings, and referring to fig. 1, fig. 1 is a flowchart of an automatic coloring method for line art drawings provided in an embodiment of the present application, it should be understood that the method may further include additional blocks not shown and/or may omit the shown blocks, and the scope of the present application is not limited in this respect.
In step 101, a line drawing generated according to a photo input by a user or the line drawing directly input by the user is acquired, the line drawing is preprocessed to enable the line drawing to be suitable for a mask position of a template, and the preprocessed line drawing is converted into a gray scale image and is subjected to normalization processing to obtain a target image.
In some embodiments, the method of pre-processing comprises: and cutting redundant white edges around the line painting. And zooming according to the length-width ratio of the cut line drawing so that the line drawing can adapt to the mask position of the template.
Specifically, according to a line drawing generated by inputting a photo or a line drawing directly input by a user, cutting redundant white edges around the line drawing. And scaling according to the length-width ratio of the cut line picture to enable the line picture to adapt to the mask position of the template, and finally converting the line picture into a gray image and carrying out normalization processing. For the template, the template needs to be converted from the RGB space to a first color space, e.g., HSV space.
Data cutting: the user can inputPhotograph I of p On-line generation of line drawing I s Or the user directly uploads the line drawing I s At this time, the boundary width P of the left, right, upper and lower white edges can be obtained through the pixel points drawn by the lines l ,P r ,P u ,P d And to ensure that the line drawing is centered, we process so that P is l =P r =MIN(P l ,P r ) Processing clipping I according to the above parameters p And I s And recording the final preprocessed pictures and line drawings as I P And I S
Scaling the data: in the application, a template is marked as T, and a mask corresponding to the template is marked as T m . First, T needs to be judged m Middle mask area
Figure BDA0003743845490000081
Aspect ratio of (1) if
Figure BDA0003743845490000082
Then will I S 、I P To a height and
Figure BDA0003743845490000083
and (5) the consistency is achieved. Otherwise, the first step is to S 、I P Is scaled to width and
Figure BDA0003743845490000084
and (5) the consistency is achieved.
Conversion to grayscale and normalization: to scale the I S Convert from RGB to grayscale and normalize the values to between 0-1, denoted as
Figure BDA0003743845490000085
Template color space conversion: converting an input template T from RGB to HSV space T hsv . Three channels under HSV represent hue, saturation and brightness respectively and are respectively marked as
Figure BDA0003743845490000086
Wherein the color toneRepresenting the spectral hue of the entire graph; the saturation degree represents the degree of the color approaching the spectral color, the color saturation is higher when the numerical value is larger, and otherwise, the color approaches to white; the brightness indicates the brightness of the color, and a larger value indicates a brighter brightness.
In step 102, whether to color in different regions is judged based on a user instruction, if so, a corresponding analysis mask is generated according to the photo, a region to be colored is extracted according to the analysis mask, normalization processing is performed to obtain a target region, and if not, the step is skipped.
Specifically, the following embodiments take the example of uploading a face photograph by a user as an example. Referring to fig. 2, whether to color in different regions is judged based on a user instruction, if yes, a corresponding face analysis mask is generated according to a face picture uploaded by the user, and then a region to be colored, namely a target region, is extracted by fine-tuning the mask of each part of the face for subsequent coloring.
In some embodiments, the method of generating the corresponding parsing mask from the photo includes: and cutting the photo, and inputting the cut photo into a corresponding image segmentation network to obtain the analysis mask of multiple channels.
Specifically, the processed photograph I p And inputting the data into a corresponding image segmentation network to obtain a corresponding multi-channel analysis mask P. In the present application, a face photograph is taken as an example: and inputting the face picture into the fine-tuned BisenetV2 model to obtain the 16-channel analysis P. In order to make the final fusion result more natural at the border, the skin area as well as the hair area need to be post-processed in advance: skin region P pair with minimum filter of Kernal =5 × 5 skin Carrying out treatment; by etching operation in image processing on hair region P hair Processing is performed with Kernal =3 × 3 and the number of iterations is 5. Finally, the face analysis data is normalized to be between 0 and 1 and recorded as
Figure BDA0003743845490000092
At step 103, a template converted into a first color space is obtained, a coloring method is selected to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, the template, the target image and/or the target area of the first color space after being subjected to coloring processing is converted into a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring.
Specifically, there are 4 coloring modes, which are pure-color coloring, texture extraction coloring, template adaptive coloring and fusion coloring. Selecting a proper coloring mode according to the background color depth of the template, the characteristics of the line painting and the like, coloring the template and the line painting, and in the following embodiment, taking the coloring treatment in the HSV space as an example for explanation, it should be understood that the coloring mode in the application can be applied to the other second color spaces for treatment as well.
In some embodiments, the solid-color coloring method includes:
when the lines of the target image or the target area are made to appear black on the template, the formula is used:
Figure BDA0003743845490000091
obtaining the value of a brightness channel;
when the lines of the target image or the target area are made to appear white on the template, the formula is used:
Figure BDA0003743845490000101
and
Figure BDA0003743845490000102
obtaining a value of a saturation channel and a value of a brightness channel;
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0003743845490000103
refers to the value of the luminance channel obtained using the solid-color coloring method,
Figure BDA0003743845490000104
denotes the value of the brightness channel of the template in HSV space, I gray Refers to a gray value of a line of the target image or the target region,
Figure BDA0003743845490000105
refers to the value of the saturation channel obtained by using the pure-color coloring method,
Figure BDA0003743845490000106
refers to the value of the saturation channel of the template in HSV space.
Specifically, the following embodiments will be described by taking coloring of lines of a target image as an example. Referring to fig. 3 and 4, the upper left corner of fig. 3 is shown as a bright red template; the lower left corner is an input line drawing (local); the right panel shows the processing results: the black part of the input line is colored to be white, and then the light red of the gray line is gradually transited to the color of the large red template. The top left diagram in FIG. 4 is an orange template; the lower left corner is an input line drawing (local); the right graph is the processing result: and after the black part of the input line is colored, the input line is black, then the gray line is light orange, and finally the color of the input line is gradually transited to the color of the orange template. The method is suitable for fusing the template and the line painting, the line color of the line painting is black or white in the template, and the natural color transition effect can be obtained by using the method for processing.
If the lines need to be black on the template, we only need to perform linear calculation in the V space, and the formula is as follows:
Figure RE-GDA0003873083580000107
the method achieves the effect by changing the brightness of the placement position of the lines in the template, namely the brightness is closer to the template when the gray value in the lines is close to the position of 1; on the contrary, the effect of presenting black when the brightness is close to 0; in the process of drawing the numerical value from 0 to 1, IAssuming that the template background is pure red, the effect map transitions from black to light red to pure red.
If the lines need to be white on the template, the S and V spaces need to be processed simultaneously.
Figure BDA0003743845490000108
The meaning of the processing function in V space is that the closer to 1 in the line drawing, the closer to the template the saturation is; the closer the line is to the position of 0, the saturation is close to 255, so that the line appears white in the template; in the process of the values from 0 to 1, assuming that the template is pure red, the final effect graph will transition from white to light red to pure red. Finally, the space under HSV is filled
Figure BDA0003743845490000111
Three channels are combined to obtain the final pure-color type coloring result:
Figure BDA0003743845490000112
in some embodiments, the texture extraction-type coloring method comprises:
using the formula:
Figure BDA0003743845490000113
and
Figure BDA0003743845490000114
the value of the saturation channel and the value of the luminance channel are obtained,
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0003743845490000115
refers to the value of the luminance channel obtained using the texture extraction-type coloring method,
Figure BDA0003743845490000116
using said texture extractionThe saturation channel value obtained by the coloring method,
Figure BDA0003743845490000117
refers to the value of the saturation channel of the template in HSV space, I gray The gray value of the line of the target image or the target area is indicated.
Specifically, referring to fig. 5, in fig. 5, the top left corner is a dark red template, and the bottom left corner is an input line drawing (local); the right graph is the processing result: and coloring the black part of the input line to be dark red, then rendering the gray line to be light red, and finally gradually transitioning to white. The method is suitable for imparting any color or texture to the lines. The specific formula is as follows:
Figure BDA0003743845490000118
wherein
Figure BDA0003743845490000119
Firstly, keeping the hue of the H space unchanged; to enable the line portions to be template textured means
Figure BDA00037438454900001110
The closer to 0 the middle the saturation is. In order to make the brightness of the final result according to
Figure BDA00037438454900001111
The saturation is close to 0 when the value in the line drawing is closer to 1 in the S space processing process, and the saturation is pure white when the saturation is equal to 0. V-space processing is required at this point so that the brightness is up to 255 brightest when the line is close to 1. Finally, the space under HSV space
Figure BDA00037438454900001112
Three channels are combined to obtain the final pure-color type coloring result:
Figure BDA00037438454900001113
in some embodiments, the template adaptive coloring method comprises:
using the formula:
Figure BDA00037438454900001114
and
Figure BDA00037438454900001115
obtaining the value of the saturation channel, the value of the hue channel and the value of the brightness channel,
wherein, 0<c<1,
Figure BDA0003743845490000121
Refers to the value of the hue channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000122
refers to the value of the luminance channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000123
refers to the value of the saturation channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000124
refers to the value of the hue channel of the template in HSV space,
Figure BDA0003743845490000125
refers to the value of the saturation channel of the template in HSV space,
Figure BDA0003743845490000126
denotes the value of the brightness channel of the template in HSV space, I gray Refers to the line of the target image or the gray value of the target area.
Specifically, referring to fig. 6, the template map is shown in the upper left corner of fig. 6: the template picture is divided into an upper color and a lower color which are respectively red and yellow; the lower left corner is the input line drawing (local); the right panel shows the processing results: the upper half of the line strips are black positions, the color of the lines is more red than red, the lower half of the line strips are black positions, and the color of the lines is more yellow than yellow. The method is suitable for the self-adaptive darkening of the line color according to the color of the template when the template is fused with the line painting. If the template has red, blue and green blocks, the user can obtain the result that the lines at the position of the red template have deeper red, the lines at the position of the blue are bluer, and the lines at the position of the green are greener. The concrete formula is as follows:
Figure BDA0003743845490000127
Figure BDA0003743845490000128
Figure BDA0003743845490000129
and when the median of the S space line drawing approaches to 1, keeping the template saturation unchanged, and otherwise, enabling the saturation to be maximum. Correspondingly, when the number of lines is close to 0, c is 0.5, and the brightness is reduced by half; while the luminance in the resulting graph remains unchanged when the line value approaches 1. By the processing mode, the effect of adaptively coloring the lines and the templates during fusion can be achieved. Finally, the space under HSV is filled
Figure BDA00037438454900001210
The three channels are combined to obtain the final pure color type coloring result:
Figure BDA00037438454900001211
in some embodiments, after performing the texture-extracting coloring process and obtaining the template, the target image and/or the target area converted back to the second color space, further comprising:
the coloring treatment is performed by a fused coloring method, which includes:
using the formula:
Figure BDA0003743845490000131
the result of the coloring is obtained and,
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0003743845490000132
refers to the coloring result in the RGB space, I color The target image or the target area of the RBG space colored by the texture extraction-type coloring method is referred to, and T refers to the module of the RGB space.
Specifically, referring to fig. 7 and 8, the upper left drawing in fig. 7 is a watercolor texture template drawing; the left middle picture is a deep red template picture; the lower left figure is an input line drawing (local); the right panel shows the processing results: the black lines are treated to be dark red, the gray lines are treated to be light red, and finally the transition is the template of the watercolor paper texture. In fig. 8, the large red, brown, skin color, lip color, and the like are the results obtained by the texture extraction formula; the color of the lines of the nose, the chin and the like of the face is obtained by the way of self-adaptive coloring of the template, so that the skin color which is darker than the skin color of the cheeks can be obtained. The method is used for making up the defects of the two methods of 'pure color coloring' and 'texture extraction'. As a result of the pure-color coloring, the lines can only be black or white on the template, the texture coloring can only make the lines have textures, and the rest positions are white. The fusion coloring method enables lines as foreground lines to have various textures and colors when the lines are fused with the template, and can also replace templates in various forms as background. The colored line painting can be obtained by the coloring mode of texture extraction and is recorded as
Figure BDA0003743845490000133
Is calculated byThe final line template fusion result can be obtained by the formula:
Figure BDA0003743845490000134
the four methods mentioned above are not only suitable for lines, but also for the mask codes extracted from the' regional extraction
Figure BDA0003743845490000135
The coloring is also applicable, the above embodiment takes the example of drawing the corresponding target image with lines as an example,
Figure BDA0003743845490000136
Figure BDA0003743845490000137
all are relevant parameters of the target image corresponding to the line drawing, and it should be understood that the parameters corresponding to the target area may be used instead. The two are consistent in use, because each layer in the mask is also a gray value between 0 and 1, the only difference is that the mask is regional. Finally, aiming at the four coloring modes mentioned above, the user can select according to specific conditions, and finally the result of the processed HSV space needs to be converted back to the RGB space.
At step 104, the colored template, the target image and/or the target area of the second color space are fused to obtain a coloring result.
In some embodiments, the method of fusing the colored template, the target image and/or the target region of the second color space comprises: fusing each target area after being colored in the subareas to obtain a fusion result, and fusing the fusion result with the colored lines of the target image and the template to obtain the coloring result.
Referring to fig. 2, in particular, if the photo uploaded by the user is a face photo, the step of "extracting by partition" is performedObtaining corresponding face analysis
Figure BDA0003743845490000141
So the fusion operation is the handle
Figure BDA0003743845490000142
And combining the masks of each layer after coloring treatment to obtain the coloring result of the whole area of the face. And finally, the result is fused with the lines to obtain five sense organs, so that the final coloring result of the face portrait is obtained.
By the method, through different linear processing calculation in the HSV space, the method can meet the requirements that a user can freely change the texture or color of the line, the line is fused with the template in various texture forms, the fusion of the texture and the template is more natural, and more line information can be reserved. Therefore, the effect display of the online platform can be met, and the application requirements of the user on various personalized products of line drawings can also be met.
Fig. 9 is a block diagram of an automatic coloring device for an art line painting according to an embodiment of the present application. The device includes:
a memory 201; and a processor 202 connected to the memory 201, the processor 202 configured to: obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray scale image and carrying out normalization processing to obtain a target image; judging whether to color in different areas based on the instruction of a user, if so, generating a corresponding analysis mask according to the picture, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step; acquiring a template converted into a first color space, selecting a coloring method to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back into a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring; and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
In some embodiments, the processor 202 is further configured to: the pure-color coloring method comprises the following steps:
when the lines of the target image or the target area are made to appear black on the template, the formula is used:
Figure BDA0003743845490000151
obtaining the value of a brightness channel;
when the lines of the target image or the target area are made to appear white on the template, the formula is used:
Figure BDA0003743845490000152
and
Figure BDA0003743845490000153
obtaining a value of a saturation channel and a value of a brightness channel;
wherein the content of the first and second substances,
Figure BDA0003743845490000154
refers to the value of the luminance channel obtained using the pure color coloring method,
Figure BDA0003743845490000155
denotes the value of the luminance channel of the stencil in HSV space, I gray Refers to a gray value of a line of the target image or the target region,
Figure BDA0003743845490000156
refers to the value of the saturation channel obtained by using the pure-color coloring method,
Figure BDA0003743845490000157
refers to the value of the saturation channel of the template in HSV space.
In some embodiments, the processor 202 is further configured to: the texture extraction type coloring method comprises the following steps:
using the formula:
Figure BDA0003743845490000158
and
Figure BDA0003743845490000159
the value of the saturation channel and the value of the luminance channel are obtained,
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA00037438454900001510
refers to the value of the luminance channel obtained using the texture extraction-type coloring method,
Figure BDA00037438454900001511
refers to the value of the saturation channel obtained using the texture extraction-type coloring method,
Figure BDA00037438454900001512
refers to the value of the saturation channel of the template in HSV space, I gray The gray value of the line of the target image or the target area is indicated.
In some embodiments, the processor 202 is further configured to: the template self-adaptive coloring method comprises the following steps:
using the formula:
Figure BDA00037438454900001513
and
Figure BDA00037438454900001514
obtaining the value of the saturation channel, the value of the hue channel and the value of the brightness channel,
wherein, 0<c<1,
Figure BDA0003743845490000161
Refers to the value of the hue channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000162
refers to the value of the luminance channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000163
refers to the value of the saturation channel obtained by using the template adaptive coloring method,
Figure BDA0003743845490000164
refers to the value of the hue channel of the template in HSV space,
Figure BDA0003743845490000165
refers to the value of the saturation channel of the template in HSV space,
Figure BDA0003743845490000166
denotes the value of the brightness channel of the template in HSV space, I gray Refers to the line of the target image or the gray value of the target area.
In some embodiments, the processor 202 is further configured to: after performing the texture-extracting coloring process and obtaining the template, the target image and/or the target area converted back to the second color space, further comprising:
the coloring treatment is performed by a fused coloring method, which includes:
using the formula:
Figure BDA0003743845490000167
the result of the coloring is obtained and,
wherein the content of the first and second substances,
Figure BDA0003743845490000168
refers to the coloring result in the RGB space, I color Is extracted by the textureThe target image or the target area of the RBG space after coloring by a color method, T refers to the module of the RGB space.
In some embodiments, the processor 202 is further configured to: the pretreatment method comprises the following steps:
and cutting redundant white edges around the line painting. And zooming according to the length-width ratio of the cut line drawing so that the line drawing can adapt to the mask position of the template.
In some embodiments, the processor 202 is further configured to: the method for generating the corresponding analysis mask according to the photo comprises the following steps:
and cutting the photo, and inputting the cut photo into a corresponding image segmentation network to obtain the multi-channel analysis mask.
In some embodiments, the processor 202 is further configured to: the method for fusing the colored template, the target image and/or the target area of the second color space comprises the following steps:
fusing each target area after being colored in the subareas to obtain a fusion result, and fusing the fusion result with the colored lines of the target image and the template to obtain the coloring result.
For the specific implementation method, reference is made to the foregoing method embodiments, which are not described herein again.
The present application may be methods, apparatus, systems and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for carrying out aspects of the present application.
The computer-readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as a punch card or an in-groove protrusion structure having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present application may be assembler instructions, instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present application are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is noted that, unless expressly stated otherwise, all the features disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features. Where used, it is further preferred, even further and more preferred to proceed briefly with the description of the other embodiment on the basis of the preceding embodiment, the contents of which further, preferably, more further or more preferably the rear band is combined with the preceding embodiment as a complete constituent of the other embodiment. Several further, preferred, still further or more preferred arrangements of the belt after the same embodiment may be combined in any combination to form a further embodiment.
Although the present application has been described in detail with respect to the general description and the specific examples, it will be apparent to those skilled in the art that certain changes and modifications may be made based on the present application. Accordingly, such modifications and improvements are intended to be within the scope of this invention as claimed.

Claims (10)

1. An automatic coloring method for an artistic line drawing is characterized by comprising the following steps:
obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray scale image and carrying out normalization processing to obtain a target image;
judging whether to color in different areas based on a user instruction, if so, generating a corresponding analysis mask according to the picture, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step;
acquiring a template converted into a first color space, selecting a coloring method to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back into a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring;
and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
2. The automatic coloring method for the line art drawing according to claim 1, wherein the pure-color coloring method comprises the following steps:
making the target imageWhen the lines or the target area presents black on the template, a formula is utilized:
Figure FDA0003743845480000011
obtaining the value of a brightness channel;
when the line of the target image or the target area is made to appear white on the template, using the formula:
Figure FDA0003743845480000012
and
Figure FDA0003743845480000013
obtaining a value of a saturation channel and a value of a brightness channel;
wherein the content of the first and second substances,
Figure FDA0003743845480000014
refers to the value of the luminance channel obtained using the pure color coloring method,
Figure FDA0003743845480000015
denotes the value of the luminance channel of the stencil in the HSV space, I gray Refers to the line of the target image or the gray value of the target area,
Figure FDA0003743845480000021
refers to the value of the saturation channel obtained by using the pure-color coloring method,
Figure FDA0003743845480000022
refers to the value of the saturation channel of the template in the HSV space.
3. The automatic coloring method for the line art drawing according to claim 1, wherein the texture extraction type coloring method comprises the following steps:
using the formula:
Figure FDA0003743845480000023
and
Figure FDA0003743845480000024
the value of the saturation channel and the value of the luminance channel are obtained,
wherein the content of the first and second substances,
Figure FDA0003743845480000025
Figure FDA0003743845480000026
refers to the value of the luminance channel obtained using the texture extraction-type coloring method,
Figure FDA0003743845480000027
refers to the value of the saturation channel obtained using the texture extraction coloring method,
Figure FDA0003743845480000028
refers to the value of the saturation channel of the template in the HSV space, I gray Refers to the line of the target image or the gray value of the target area.
4. The automatic coloring method for the line art drawing according to claim 1, wherein the template adaptive coloring method comprises the following steps:
using the formula:
Figure FDA0003743845480000029
and
Figure FDA00037438454800000210
obtaining the value of the saturation channel, the value of the hue channel and the value of the brightness channel,
wherein, 0<c<1,
Figure FDA00037438454800000211
Refers to the color obtained by using the template self-adaptive coloring methodThe value of the channel is adjusted,
Figure FDA00037438454800000212
refers to the value of the luminance channel obtained by using the template adaptive coloring method,
Figure FDA00037438454800000213
refers to the value of the saturation channel obtained by using the template self-adaptive coloring method,
Figure FDA00037438454800000214
refers to the value of the hue channel of the template in the HSV space,
Figure FDA00037438454800000215
refers to the value of the saturation channel of the template in the HSV space,
Figure FDA00037438454800000216
refers to the value of the brightness channel of the stencil in the HSV space, I gray Refers to the line of the target image or the gray value of the target area.
5. The automatic coloring method for line art drawing according to claim 1, further comprising, after performing the texture-extraction-type coloring process and obtaining the template, the target image and/or the target area converted back to the second color space:
the coloring treatment is carried out by a fusion coloring method, and the fusion coloring method comprises the following steps:
using the formula:
Figure FDA0003743845480000031
the result of the coloring is obtained and,
wherein the content of the first and second substances,
Figure FDA0003743845480000032
refers to the coloring result in the RGB space, I color The target image or the target area of the RBG space is colored by the texture extraction-type coloring method, and T refers to the module of the RGB space.
6. The automatic coloring method for line art drawing according to claim 1, wherein the preprocessing method comprises:
and cutting redundant white edges around the line painting. And zooming according to the length-width ratio of the cut line drawing so that the line drawing can adapt to the mask position of the template.
7. The automatic coloring method for line art drawing according to claim 1, wherein the method for generating the corresponding parsing mask according to the photo comprises:
and cutting the photo, and inputting the cut photo into a corresponding image segmentation network to obtain the multi-channel analysis mask.
8. The automatic coloring method for line art drawing according to claim 1, wherein the method for fusing the colored template, the target image and/or the target area of the second color space comprises:
fusing each target area after being colored in the subareas to obtain a fusion result, and fusing the fusion result with the colored lines of the target image and the template to obtain the coloring result.
9. The utility model provides an automatic device of coloring of art lines drawing which characterized in that includes:
a memory; and
a processor coupled to the memory, the processor configured to:
obtaining a line drawing generated according to a photo input by a user or the line drawing directly input by the user, preprocessing the line drawing to enable the line drawing to be suitable for the mask position of a template, converting the preprocessed line drawing into a gray scale image and carrying out normalization processing to obtain a target image;
judging whether to color in different areas based on a user instruction, if so, generating a corresponding analysis mask according to the picture, extracting an area needing to be colored according to the analysis mask, and carrying out normalization processing to obtain a target area, otherwise, skipping the step;
acquiring a template converted into a first color space, selecting a coloring method to color the template, the target image and/or the target area according to the depth of the background color of the template and the characteristics of the line drawing, and converting the template, the target image and/or the target area of the first color space subjected to coloring treatment back into a second color space, wherein the first color space comprises an LAB space, an HSV space or an HSI space, the second color space comprises an RGB space, and the coloring method comprises pure color coloring, texture extraction coloring and/or template self-adaptive coloring;
and fusing the colored template of the second color space, the target image and/or the target area to obtain a coloring result.
10. A computer storage medium on which a computer program is stored, the computer program, when executed by a machine, implementing the steps of a method according to any one of claims 1 to 8.
CN202210825589.3A 2022-07-13 2022-07-13 Automatic coloring method and device for line art picture and storage medium Pending CN115294243A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210825589.3A CN115294243A (en) 2022-07-13 2022-07-13 Automatic coloring method and device for line art picture and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210825589.3A CN115294243A (en) 2022-07-13 2022-07-13 Automatic coloring method and device for line art picture and storage medium

Publications (1)

Publication Number Publication Date
CN115294243A true CN115294243A (en) 2022-11-04

Family

ID=83821957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210825589.3A Pending CN115294243A (en) 2022-07-13 2022-07-13 Automatic coloring method and device for line art picture and storage medium

Country Status (1)

Country Link
CN (1) CN115294243A (en)

Similar Documents

Publication Publication Date Title
US11455516B2 (en) Image lighting methods and apparatuses, electronic devices, and storage media
JP7090113B2 (en) Line drawing generation
AU2003204466B2 (en) Method and system for enhancing portrait images
US7532752B2 (en) Non-photorealistic sketching
US20060153470A1 (en) Method and system for enhancing portrait images that are processed in a batch mode
Kumar et al. A comprehensive survey on non-photorealistic rendering and benchmark developments for image abstraction and stylization
US8406566B1 (en) Methods and apparatus for soft edge masking
US7675652B2 (en) Correcting eye color in a digital image
US11651480B2 (en) Systems and methods for selective enhancement of objects in images
US8023768B2 (en) Universal front end for masks, selections, and paths
JP2000134486A (en) Image processing unit, image processing method and storage medium
US11727543B2 (en) Systems and methods for content-aware enhancement of images
CN107408401A (en) The user&#39;s sliding block for simplifying adjustment for image
US11670031B2 (en) System and method for automatically generating an avatar with pronounced features
US20230281764A1 (en) Systems and methods for selective enhancement of skin features in images
US20130195354A1 (en) Saturation Varying Color Space
CN115294243A (en) Automatic coloring method and device for line art picture and storage medium
US11574388B2 (en) Automatically correcting eye region artifacts in digital images portraying faces
JP2021033686A (en) Image area extraction processing method and image area extraction processing program
Dodgson et al. Contrast Brushes: Interactive Image Enhancement by Direct Manipulation.
Gao et al. PencilArt: a chromatic penciling style generation framework
CN112927321A (en) Intelligent image design method, device, equipment and storage medium based on neural network
US8184925B1 (en) System for converting a photograph into a portrait-style image
Doyle et al. Painted stained glass
AU2015271935A1 (en) Measure of image region visual information

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination