US20220301239A1 - Automatic coloring of line drawing - Google Patents

Automatic coloring of line drawing

Info

Publication number
US20220301239A1
Authority
US
United States
Prior art keywords
line drawing
local style
coloring
local
style
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/834,856
Inventor
Eiichi Matsumoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Preferred Networks Inc
Original Assignee
Preferred Networks Inc
Application filed by Preferred Networks Inc
Priority to US17/834,856
Publication of US20220301239A1
Legal status: Pending

Classifications

    • G06T 11/001: Texturing; Colouring; Generation of texture or colour (2D [Two Dimensional] image generation)
    • G06N 20/00: Machine learning (computing arrangements based on specific computational models)
    • G06T 11/203: Drawing of straight lines or curves (2D image generation; drawing from basic elements, e.g. lines or circles)
    • G06T 7/13: Edge detection (image analysis; segmentation)
    • G06T 2207/20081: Training; Learning (indexing scheme for image analysis or image enhancement; special algorithmic details)

Definitions

  • the present disclosure relates to a line drawing automatic coloring program, a line drawing automatic coloring device, and a line drawing automatic coloring method for automatically coloring a line drawing image.
  • machine learning using a neural network having a multilayer structure called deep learning has been applied in various fields.
  • the machine learning has also been prominently utilized and has achieved a remarkable result in a field of image processing such as image recognition and image generation.
  • a line drawing automatic coloring program is a line drawing automatic coloring program for causing a computer to realize processing for automatically performing coloring on line drawing data, the line drawing automatic coloring program causing the computer to realize the following functions: a line drawing data acquiring function of acquiring line drawing data of a target to be colored; a local style designation receiving function of receiving at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing function of performing coloring processing reflecting the local style designation received by the local style designation receiving function on the line drawing data acquired by the line drawing data acquiring function based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • the local style designation receiving function includes: a reference image acquiring function of acquiring at least one reference image from which a user desires to extract the local style; a local style extraction place designating function of receiving at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting function of performing extraction processing of extracting the local style from the reference image with respect to at least one designated place designated by the local style extraction place designating function; and a local style application designating function of designating a place to which the local style extracted by the local style extracting function is applied with respect to the line drawing data acquired by the line drawing data acquiring function.
  • the local style designation receiving function is configured to receive at least one local style designation selected by a user from a plurality of local styles extracted in advance by extraction processing and stored by a storage means.
  • the local style is extracted based on a learned model for a local style in which it is learned in advance to extract the local style from any place of the reference image.
  • an encoder obtained by preparing plural sets of line drawing data and coloring correct answer image data and executing the following steps for the plural sets of line drawing data and coloring correct answer image data is set to the learned model for a local style, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data, and the following steps including: a step of inputting the coloring correct answer image data as the reference image to the encoder extracting the local style and generating a local style map corresponding to each of all pixels of the input coloring correct answer image data; a step of picking up at least one local style from the local style map extracted by the encoder and inputting the picked up local style together with the line drawing data to a decoder; a step of executing coloring processing reflecting the picked up local style on the line drawing data in the decoder to obtain colored image data; a step of calculating loss of the colored image data with respect to the coloring correct answer image data by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data; and a step of updating parameters of the encoder and the decoder so as to reduce the loss calculated by the loss function.
  • a decoder obtained by preparing plural sets of line drawing data and coloring correct answer image data and executing the following steps for the plural sets of line drawing data and coloring correct answer image data is set to the learned model for coloring, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data, and the following steps including: a step of inputting the coloring correct answer image data as the reference image to an encoder extracting the local style and generating a local style map corresponding to each of all pixels of the input coloring correct answer image data; a step of picking up at least one local style from the local style map extracted by the encoder and inputting the picked up local style together with the line drawing data to the decoder; a step of executing coloring processing reflecting the picked up local style on the line drawing data in the decoder to obtain colored image data; a step of calculating loss of the colored image data with respect to the coloring correct answer image data by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data; and a step of updating parameters of the encoder and the decoder so as to reduce the loss calculated by the loss function.
  • a line drawing automatic coloring device includes: a line drawing data acquiring unit configured to acquire line drawing data of a target to be colored; a local style designation receiving unit configured to receive at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing unit configured to perform coloring processing reflecting the local style designation received by the local style designation receiving unit on the line drawing data acquired by the line drawing data acquiring unit based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • the local style designation receiving unit includes: a reference image acquiring unit configured to acquire at least one reference image from which a user desires to extract the local style; a local style extraction place designating unit configured to receive at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting unit configured to perform extraction processing of extracting the local style from the reference image with respect to at least one designated place designated by the local style extraction place designating unit; and a local style application designating unit configured to designate a place to which the local style extracted by the local style extracting unit is applied with respect to the line drawing data acquired by the line drawing data acquiring unit.
  • a line drawing automatic coloring method for automatically performing coloring on line drawing data includes: a line drawing data acquiring step of acquiring line drawing data of a target to be colored; a local style designation receiving step of receiving at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing step of performing coloring processing reflecting the local style designation received in the local style designation receiving step on the line drawing data acquired in the line drawing data acquiring step based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • the local style designation receiving step includes: a reference image acquiring step of acquiring at least one reference image from which a user desires to extract the local style; a local style extraction place designating step of receiving at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting step of performing extraction processing of extracting the local style from the reference image with respect to at least one designated place designated in the local style extraction place designating step; and a local style application designating step of designating a place to which the local style extracted in the local style extracting step is applied with respect to the line drawing data acquired in the line drawing data acquiring step.
  • FIG. 1 is a block diagram showing a configuration of a line drawing automatic coloring device according to the present disclosure
  • FIG. 2 is a view showing a concept of coloring processing using a local style in the line drawing automatic coloring device according to the present disclosure
  • FIG. 3 is a flowchart showing a flow of learning of a learned model for a local style and a learned model for coloring that are used in the line drawing automatic coloring device according to the present disclosure
  • FIG. 4 is a flowchart showing a flow of coloring processing in the line drawing automatic coloring device according to the present disclosure.
  • FIG. 5 is a block diagram showing a configuration of a line drawing automatic coloring device according to a second embodiment.
  • FIG. 1 is a block diagram showing a configuration of a line drawing automatic coloring device 10 according to the present disclosure.
  • the line drawing automatic coloring device 10 may be a device designed as a dedicated machine, but it is assumed that the line drawing automatic coloring device 10 is a device that can be realized by general computers.
  • the line drawing automatic coloring device 10 includes a central processing unit (CPU), a graphics processing unit (GPU), a memory, and a storage such as a hard disk drive that are normally included in the general computers and are not shown.
  • various types of processing may be executed by a program in order to cause these general computers to function as the line drawing automatic coloring device 10 according to the present embodiment.
  • the line drawing automatic coloring device 10 includes at least a line drawing data acquiring unit 11 (also referred to as line drawing data acquirer), a reference image acquiring unit 12 (also referred to as reference image acquirer), a local style extraction place designating unit 13 (also referred to as local style extraction place designator), a local style extracting unit 14 (also referred to as local style extractor), a local style application designating unit 15 (also referred to as local style application designator), a coloring processing unit 16 (also referred to as coloring processor), and a storing unit 17 (also referred to as storage device).
  • the line drawing data acquiring unit 11 has a function of acquiring line drawing data of a target to be colored.
  • Line drawing that is the target to be colored is not particularly limited, but in the learning process of the learning model described below, it is preferable that the line drawing data prepared as a set together with coloring correct answer image data include line drawing similar, in line thickness, type of touch, and the like, to the line drawing that the user desires to set as the target to be colored.
  • the reference image acquiring unit 12 has a function of acquiring reference image data for extracting a local style that the user desires to apply to the line drawing data of the target to be colored.
  • the local style is a local style related to coloring such as a color, a texture, a gradation, a painting style, a pattern, a highlight, and a shadow.
  • the local style extraction place designating unit 13 has a function of designating a place at which the user desires to extract the local style in the reference image data acquired by the reference image acquiring unit 12 .
  • The local style is generated using information of neighboring pixels present within a predetermined range with respect to one pixel of the reference image data. When the reference image data is composed of width W×height H pixels, a local style is generated for each pixel, and plural types of styles such as a color, a texture, a gradation, a painting style, a pattern, a highlight, and a shadow are extracted for one pixel. Therefore, when the number of types of styles is set to C, W×H×C local styles can be generated from one piece of reference image data.
  • The local style extracting unit 14 has a function of extracting the local style from the reference image data. Extraction processing of the local style is performed based on, for example, a learned model for a local style learned in advance using training data for convolutional neural networks (CNN). The range of neighboring pixels used for extracting the local style for one pixel of the reference image data can be set appropriately, and extraction may be performed with a plurality of patterns of neighboring-pixel ranges.
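  • As an illustration only, a minimal sketch of such an encoder follows, written in Python with PyTorch. The disclosure does not specify the network architecture, layer sizes, or the channel count C, so these values and the name LocalStyleEncoder are assumptions; stacked small convolutions are used so that each output pixel summarizes a neighborhood of the reference image.

        import torch
        import torch.nn as nn

        class LocalStyleEncoder(nn.Module):
            """Maps an RGB reference image (3, H, W) to a per-pixel local style map (C, H, W)."""

            def __init__(self, style_channels: int = 64):  # C = 64 is an assumed value
                super().__init__()
                # Stacked 3x3 convolutions: each output pixel summarizes a small
                # neighborhood of the reference image (the "predetermined range").
                self.net = nn.Sequential(
                    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv2d(64, style_channels, kernel_size=3, padding=1),
                )

            def forward(self, reference: torch.Tensor) -> torch.Tensor:
                return self.net(reference)  # (N, C, H, W): one C-dimensional style vector per pixel

        encoder = LocalStyleEncoder()
        reference = torch.rand(1, 3, 256, 256)    # dummy reference image
        style_map = encoder(reference)            # the W x H x C local styles, stored as (1, C, 256, 256)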
  • the extraction processing in the local style extracting unit 14 may be a method of performing extraction processing only on the place designated by the local style extraction place designating unit 13 or may be a method of performing extraction processing of local styles on all pixels of the reference image data.
  • The local style application designating unit 15 has a function of designating to which area in the line drawing data of the target to be colored the local style extracted from the place designated by the local style extraction place designating unit 13 is applied.
  • the local style for the line drawing data may be designated at one place or may be designated at a plurality of places.
  • all of the C types of local styles extracted from one pixel designated by the local style extraction place designating unit 13 may be applied to the designated place of the line drawing data or a specific local style of the C types of local styles may be selected and be applied to the designated place of the line drawing data.
  • Any local styles, and any characteristics of those local styles, selected by the user can be applied. For example, of the local styles extracted from one pixel designated by the local style extraction place designating unit 13, only the local style related to the texture can be applied without applying the local style related to the color.
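  • The sketch below illustrates one possible way to encode such a designation as data, assuming the W×H×C style map above and that the reference image and the line drawing share the same resolution; the function name designate_local_style and the channel-mask mechanism are hypothetical, since the disclosure does not specify how a designation is represented internally.

        import torch

        def designate_local_style(style_map, src_xy, dst_mask, channel_mask=None):
            """Build a per-pixel style hint tensor for the line drawing.

            style_map:    (C, H, W) local style map extracted from the reference image
            src_xy:       (x, y) pixel designated in the reference image
            dst_mask:     (H, W) boolean mask of the designated area on the line drawing
            channel_mask: optional (C,) boolean mask selecting only some of the C styles
                          (e.g. texture only); None applies all C styles as a bundle
            """
            c, h, w = style_map.shape
            x, y = src_xy
            style_vec = style_map[:, y, x].clone()                        # the C local styles of one pixel
            if channel_mask is not None:
                style_vec = style_vec * channel_mask.to(style_vec.dtype)  # drop unselected styles
            hint = torch.zeros(c, h, w)
            hint[:, dst_mask] = style_vec.unsqueeze(1)                    # broadcast the vector over the area
            return hint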
  • the coloring processing unit 16 has a function of performing coloring processing reflecting the local style designated for the line drawing data.
  • the coloring processing is performed based on, for example, a learned model.
  • An example of the learned model for coloring is one in which it is learned in advance to perform coloring processing reflecting the local style on the line drawing data, using the line drawing data and the designation of the application place of at least one local style for the line drawing data as inputs.
  • Colored image data is obtained by performing the coloring processing by the coloring processing unit 16 .
  • the storing unit 17 has a function of storing data required for various processing performed in the line drawing automatic coloring device 10 including the line drawing data acquiring unit 11 , the reference image acquiring unit 12 , the local style extraction place designating unit 13 , the local style extracting unit 14 , the local style application designating unit 15 , the coloring processing unit 16 and the like, and data obtained as a result of the processing.
  • FIG. 2 is a view showing a concept of coloring processing using a local style in the line drawing automatic coloring device 10 according to the present disclosure.
  • In the line drawing automatic coloring device 10, when automatically coloring the line drawing data, it is possible to extract the local style desired by the user from a reference image and reflect the extracted local style on the line drawing data.
  • Three reference images A, B, and C are shown on the left side of FIG. 2 .
  • When the user desires to use a portion of the sky of a background of the reference image A as a background of the line drawing data, the user designates one place of the background of the reference image A, extracts the local style from the designated point, and designates a place of the sky of the line drawing data as a point on which the user desires to reflect the local style extracted from the reference image A.
  • When the user desires to use an expression of body hair of a cat of the reference image B for coloring of a dog of the line drawing data, the user designates one place of a body hair portion of the reference image B, extracts the local style from the designated point, and designates a body portion of the dog of the line drawing data as a point on which the user desires to reflect the local style extracted from the reference image B.
  • When the user desires to use a texture of a belt of a wristwatch of the reference image C for coloring of a hat of the line drawing data, the user designates one place of a belt portion of the wristwatch of the reference image C, extracts the local style from the designated point, and designates a hat portion of the line drawing data as a point on which the user desires to reflect the local style extracted from the reference image C.
  • When the coloring processing is performed on the designated places, the colored image data shown on the right side of FIG. 2 is obtained by coloring the line drawing data. Hatching is applied only to the places at which the local styles are designated in the colored image data on the right side of FIG. 2, but although not expressed in FIG. 2, the coloring processing may, for example, also be performed automatically on other places at which the local styles are not designated.
  • FIG. 3 is a flowchart showing an example of flow of learning of a learned model for a local style and a learned model for coloring that are used in the line drawing automatic coloring device 10 according to the present disclosure.
  • A single learning method is not required, and various learning processes can be used. For example, the learned model for a local style, which extracts the local style, and the learned model for coloring, which performs the coloring processing, can be learned simultaneously.
  • plural sets of line drawing data and coloring correct answer image data are prepared, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data.
  • a set of line drawing data and coloring correct answer image data can be prepared by extracting only the line drawing data from the coloring correct answer image data by edge extraction or the like.
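  • A minimal sketch of preparing such a pair with off-the-shelf edge extraction (OpenCV's Canny detector) is shown below; the thresholds are placeholder values, and the disclosure does not prescribe any particular edge extractor.

        import cv2  # OpenCV, assumed available

        def make_training_pair(color_path: str):
            """Create a (line drawing, coloring correct answer) pair from one colored image."""
            color = cv2.imread(color_path)                    # coloring correct answer image
            gray = cv2.cvtColor(color, cv2.COLOR_BGR2GRAY)
            edges = cv2.Canny(gray, 50, 150)                  # thresholds are placeholder values
            line_drawing = 255 - edges                        # black lines on a white background
            return line_drawing, color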
  • two convolutional neural networks including an encoder extracting a local style from a reference image and a decoder performing the coloring processing on the line drawing data are prepared as convolutional neural networks performing learning (S 101 ).
  • Of the sets of line drawing data and coloring correct answer image data prepared as described above, the coloring correct answer image data, having W×H pixels, is input to the encoder as a reference image, and the encoder extracts a local style map composed of W×H×C local styles (S102). That is, the local style map corresponding to each of all the pixels of the input coloring correct answer image data is generated.
  • At least one local style of the W×H×C local styles extracted by the encoder is picked up (for example, randomly picked up), and the picked up local style is input together with the line drawing data to the decoder (S103).
  • an input is given so as to apply the local style to a pixel position on the line drawing data at the same position as a pixel position on the coloring correct answer image data at which the local style is picked up.
  • The process of picking up the local style may perform pick-up in both a pattern that picks up all of the C types of local styles corresponding to one pixel as a bundle and a pattern that picks up only some of the C types of local styles corresponding to one pixel.
  • As a pick-up rule, in addition to random pick-up, any process may be used, such as pick-up according to a predetermined rule. Learning including a pattern that does not pick up any local style may also be performed. Considering the convenience of the user, it is preferable that both coloring that applies the local style and coloring that does not apply the local style can be performed.
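  • One possible random pick-up routine is sketched below; it produces a sparse hint tensor and covers the three patterns described above (no pick-up, a full C-channel bundle for a pixel, and only a subset of the C styles for a pixel). The probabilities and the maximum number of picked-up points are assumed hyperparameters.

        import torch

        def random_pickup(style_map, max_points=4, p_none=0.1, p_subset=0.5):
            """Randomly pick up local styles from a (C, H, W) style map as training hints."""
            c, h, w = style_map.shape
            hint = torch.zeros(c, h, w)
            if torch.rand(1).item() < p_none:
                return hint                                   # pattern: no local style picked up
            n_points = int(torch.randint(1, max_points + 1, (1,)).item())
            for _ in range(n_points):
                y = int(torch.randint(0, h, (1,)).item())
                x = int(torch.randint(0, w, (1,)).item())
                vec = style_map[:, y, x].clone()
                if torch.rand(1).item() < p_subset:           # pattern: only some of the C styles
                    keep = torch.rand(c) < 0.5
                    vec = vec * keep.to(vec.dtype)
                hint[:, y, x] = vec                           # hint placed at the same pixel position
            return hint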
  • the decoder executes coloring processing that reflects the picked up local style on the line drawing data (S 104 ).
  • the decoder executes the coloring processing to obtain colored image data.
  • loss of the colored image data with respect to the coloring correct answer image data is calculated by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data (S 105 ).
  • parameters of the encoder and the decoder are updated so as to reduce the loss calculated by the loss function (S 106 ).
  • the updating process of the parameters of the encoder and the decoder may be repeated until the loss is reduced to be less than a threshold value.
  • Steps S101 to S106 of FIG. 3 indicate one cycle as the minimum unit of the learning. A considerable number of cycles can be repeated, and the learning is completed at the point where appropriate extraction of the local style and acquisition of colored image data with appropriate coloring become possible. The parameters and the like of the encoder at the completion of the learning are acquired as the learned model for a local style, the parameters and the like of the decoder at the completion of the learning are acquired as the learned model for coloring, and the acquired parameters are stored in the storing unit 17.
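  • Combining the pieces above, one learning cycle (S101 to S106) might look like the sketch below, which reuses the LocalStyleEncoder and random_pickup from the earlier sketches; the decoder architecture, the L1 loss, and the optimizer settings are assumptions, since the disclosure only states that a loss function is used and that the parameters are updated to reduce the loss.

        import torch
        import torch.nn.functional as F

        encoder = LocalStyleEncoder()                      # from the earlier sketch; C = 64
        decoder = torch.nn.Sequential(                     # stand-in for the real coloring decoder
            torch.nn.Conv2d(1 + 64, 64, 3, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(64, 3, 3, padding=1),
        )
        optimizer = torch.optim.Adam(
            list(encoder.parameters()) + list(decoder.parameters()), lr=1e-4)

        def training_step(line_drawing, color_truth):
            """line_drawing: (N, 1, H, W); color_truth: (N, 3, H, W) coloring correct answer."""
            style_map = encoder(color_truth)                            # S102: style map from the truth image
            hints = torch.stack([random_pickup(m) for m in style_map])  # S103: pick up some local styles
            colored = decoder(torch.cat([line_drawing, hints], dim=1))  # S104: colorize with the hints
            loss = F.l1_loss(colored, color_truth)                      # S105: loss vs. the correct answer
            optimizer.zero_grad()
            loss.backward()                                             # S106: update encoder/decoder parameters
            optimizer.step()
            return loss.item()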
  • FIG. 4 is a flowchart showing a flow of coloring processing in the line drawing automatic coloring device 10 according to the present disclosure.
  • the coloring processing in the line drawing automatic coloring device 10 according to the present embodiment is started by acquiring the line drawing data (step S 201 ).
  • The user selects the line drawing data of the target to be colored, whereby the line drawing data is acquired.
  • the reference image data from which the user desires to extract the local style is acquired (step S 202 ).
  • the place at which the user desires to extract the local style is designated (step S 203 ).
  • the local style of the designated place is extracted (step S 204 ).
  • the extraction of the local style is performed based on, for example, the learned model for a local style learned in advance using the training data.
  • a position on the line drawing data to which the user desires to apply the extracted local style is designated (step S 205 ).
  • the above steps S 201 to S 205 can be executed based on a graphical user interface.
  • After executing steps S201 to S205, an input from the user as to whether or not to extract and apply another local style is received, and it is determined whether or not to extract and apply another local style (step S206). When another local style is to be extracted and applied (S206: Y), steps S201 to S205 are executed once again; this can be repeated. When there is no need to extract another local style (S206: N), the coloring processing proceeds to the next step S207.
  • the coloring processing is executed on the entirety of the line drawing data while reflecting the local style on the designated place using the line drawing data, the local style, and the designation of the application place of the local style as inputs (step S 207 ).
  • the coloring processing is performed based on the learned model for coloring in which it is learned in advance to perform the coloring processing reflecting the local style on the line drawing data.
  • The colored image data obtained by the coloring processing can be presented, for example, via a graphical user interface that shows the user the coloring state by displaying the colored image data, instead of the line drawing data, in the display region of the display screen that had been displaying the line drawing data.
  • An input from the user as to whether or not the colored image data needs to be corrected is received, and it is determined whether or not the colored image data needs to be corrected (step S208). When correction is needed, steps S201 to S205 are executed once again; they can be executed again in a state in which the already extracted local styles and the designation of the application places of the local styles are maintained. When no correction is needed, the coloring processing ends.
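  • The flow of steps S201 to S207 could be exercised programmatically as in the sketch below, which reuses the designate_local_style helper from the earlier sketch and assumes that the reference images have been resized to the resolution of the line drawing; the function name and argument layout are illustrative only.

        import torch

        def color_line_drawing(line_drawing, designations, encoder, decoder, style_channels=64):
            """Inference sketch of steps S201 to S207.

            line_drawing: (1, 1, H, W) tensor of the target line drawing (S201)
            designations: list of (reference, src_xy, dst_mask, channel_mask) tuples,
                          one per local style designation made by the user (S202 to S205)
            """
            _, _, h, w = line_drawing.shape
            hint = torch.zeros(1, style_channels, h, w)
            for reference, src_xy, dst_mask, channel_mask in designations:
                style_map = encoder(reference)[0]             # S204: extract the local styles
                hint[0] += designate_local_style(style_map, src_xy, dst_mask, channel_mask)  # S205
            with torch.no_grad():
                colored = decoder(torch.cat([line_drawing, hint], dim=1))  # S207: coloring processing
            return colored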
  • In the line drawing automatic coloring device 10, with respect to the line drawing data of the target to be colored, the local style desired by the user is extracted from the reference image data, the place to which the extracted local style is applied is designated in the line drawing data, and the coloring processing can be executed. Therefore, automatic coloring processing can be realized that reflects, at the places desired by the user in the line drawing data, local styles related to coloring characteristics such as the texture, the gradation, the painting style, the pattern, the highlight, and the shadow, as well as the color.
  • plural types of local styles simultaneously extracted for the same pixel can be designated as a bundle and only some local styles of plural types of local styles simultaneously extracted for the same pixel can be selected and designated. Therefore, for example, a local style in which the user desires to reflect only the texture without reflecting the color can be designated, such that the user experience is improved.
  • In addition, the place to which the local style desired by the user is applied can be designated with respect to the line drawing data of the target to be colored, and the coloring processing can be executed, realizing automatic coloring processing that reflects the local styles related to the coloring, such as the texture, the gradation, the painting style, the pattern, the highlight, and/or the shadow, as well as the color, at the place desired by the user in the line drawing data.
  • the local style may be extracted from the place designated by the user in the reference image in a state in which the reference image is acquired and be reflected on the line drawing data, or may be selected from the library by the user in a state in which a plurality of local styles are extracted in advance and are stored in the storing unit as the library and be reflected on the line drawing data.
  • plural types of local styles simultaneously extracted for the same pixel can be designated as a bundle or only some local styles of plural types of local styles simultaneously extracted for the same pixel can be selected and designated. Therefore, for example, a local style in which the user desires to reflect only the texture without reflecting the color can be designated, such that the user experience is improved.
  • FIG. 5 is a block diagram showing a configuration of a line drawing automatic coloring device 20 according to a second embodiment.
  • the line drawing automatic coloring device 20 includes at least a line drawing data acquiring unit 11 , a local style designation receiving unit 21 , a coloring processing unit 16 , and a storing unit 22 .
  • Components denoted by the same reference numerals as those of FIG. 1 according to the first embodiment perform the same functions as in FIG. 1 in the present embodiment, and a description thereof is thus omitted.
  • the local style designation receiving unit 21 has a function of receiving at least one local style designation for applying a selected local style to any place of acquired line drawing data.
  • the local style designation according to the present embodiment is performed in a form in which the user selects a desired local style from a local style library in which the plural types of local styles are extracted in advance and stored and designates a place on the line drawing data on which the user desires to reflect the selected local style.
  • the local style designation receiving unit 21 in the present embodiment may have the same function as the function of extracting the local style from the reference image data and reflecting the extracted local style on the line drawing data, which is performed in the reference image acquiring unit 12 , the local style extraction place designating unit 13 , the local style extracting unit 14 , and the local style application designating unit 15 in the first embodiment. That is, the local style designation receiving unit 21 may have both functions so that the local style may be extracted and used from the reference image data or may be selected and used from the local style library.
  • The storing unit 22 stores, similarly to the storing unit 17 in the first embodiment, data required for various processing and data obtained as a result of the processing, and also stores the local style library constituted by the plural types of local styles extracted in advance. It is preferable that the local style library can be classified and sorted depending on desired conditions, such as a type of texture, author, and the like, so as to be easily used by the user.
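  • One simple way to organize such a library, with tag-based classification and lookup, is sketched below; the entry layout and the tag names are assumptions, since the disclosure only states that the library stores pre-extracted local styles and can be classified and sorted by conditions such as texture type or author.

        from dataclasses import dataclass, field
        from typing import Dict, List
        import torch

        @dataclass
        class LibraryEntry:
            """One pre-extracted local style stored in the library (an assumed layout)."""
            style_vec: torch.Tensor                                # the C-dimensional local style vector
            tags: Dict[str, str] = field(default_factory=dict)     # e.g. {"texture": "fur", "author": "A"}

        class LocalStyleLibrary:
            def __init__(self):
                self.entries: List[LibraryEntry] = []

            def add(self, style_vec, **tags):
                self.entries.append(LibraryEntry(style_vec, dict(tags)))

            def find(self, **conditions) -> List[LibraryEntry]:
                """Return entries whose tags match all of the given conditions."""
                return [e for e in self.entries
                        if all(e.tags.get(k) == v for k, v in conditions.items())]

        library = LocalStyleLibrary()
        library.add(torch.rand(64), texture="fur", author="A")
        fur_styles = library.find(texture="fur")   # the user picks one and designates a place on the drawing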
  • The flow of coloring processing in the line drawing automatic coloring device 20 according to the second embodiment is the same as the coloring processing in FIG. 4, except that, instead of the processing of extracting and applying the local style from the reference image data in steps S202 to S205 of FIG. 4 in the line drawing automatic coloring device 10 according to the first embodiment, processing is executed in which the user selects the desired local style with reference to the local style library stored in the storing unit 22 and designates a position on the line drawing data on which the user desires to reflect the selected local style.
  • The place to which the local style desired by the user is applied is designated with respect to the line drawing data of the target to be colored with reference to the local style library, and the coloring processing can be executed. Therefore, it is possible to realize automatic coloring processing reflecting local styles related to coloring such as a texture, a gradation, a painting style, a pattern, a highlight, and a shadow, as well as a color, at the place desired by the user in the line drawing data. Since the local style library is created in advance and the user can select the local style from the local style library, a frequently used local style can be used for the coloring processing without being extracted from the reference image data each time, such that the convenience of the user is improved.
  • In the first and second embodiments, a case has been described in which, when the designation of the local styles for the line drawing data is performed plural times, the coloring processing is executed after the designation of all the local styles ends; however, the present disclosure is not limited thereto.
  • The coloring processing may also be executed each time the designation of the application place of a local style from the reference image data, or the designation of the application place of a local style from the local style library, is performed on the line drawing data.
  • By executing the coloring processing each time and displaying the colored image data in the display region of the display screen each time, the user can designate the application place of the next local style while confirming the state of the colored image, which changes each time an application place is designated, such that the user experience is improved.
  • the coloring processing that applies the local style has been described in the first and second embodiments, but there are many local styles having directionality or regularity such as a gradation or a pattern.
  • a function of performing conversion processing such as converting angles of the local styles, converting hues of the local styles, or changing orientations of gradations of the local styles may be added. As a result, the user experience is further improved.
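  • As a small illustration of such a conversion, the sketch below rotates the hue of a plain RGB color using the standard library; how hue or gradient orientation is represented inside a learned local style vector is not specified in the disclosure, so this operates on an ordinary color value only.

        import colorsys

        def shift_hue(rgb, degrees):
            """Rotate the hue of an (r, g, b) color with components in [0, 1]."""
            h, s, v = colorsys.rgb_to_hsv(*rgb)
            return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

        print(shift_hue((0.8, 0.2, 0.2), 120))   # a red tone rotated toward green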
  • the place to which the extracted local style is applied is determined by designating any place from the line drawing data displayed on the display screen by the user, who provides the instruction via a user interface.
  • a function of informing the user of the place on the line drawing data related to the place at which the local style is extracted may be provided.
  • Processing using an existing image recognition technique is performed on each of the reference image data and the line drawing data to determine which places on the line drawing data are highly related to the image feature at the extraction place of the local style. For example, when the user selects an “eye” portion of a person on the reference image data by clicking the mouse, a place corresponding to the feature of an “eye” is extracted from the line drawing data using an existing image recognition technique and is presented in a form in which it can be recognized by the user; for example, a method of informing the user of the correspondence by blinking the “eye” portion on the line drawing data displayed on the display screen or temporarily changing the color of the “eye” portion is conceivable.
  • An existing image recognition technique such as pattern matching or object detection can be used for this purpose.
  • candidate places of a local style that are to be applied to the selected place may be extracted by performing processing using an image recognition technique.
  • the candidate places are not only extracted by performing the processing using the image recognition technique, but coloring processing reflecting the local style may also be automatically executed by automatically performing selection among the extracted candidate places.
  • the extracted local style may be simultaneously applied to these targets.
  • Candidate places of the target to be colored that have the same feature as the place designated for extracting the local style are extracted by the existing image recognition technique, and the extracted local style is applied to the plurality of extracted candidate places.
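  • As one concrete example of the pattern-matching option mentioned above, the sketch below finds candidate places on the line drawing that resemble a designated patch using OpenCV template matching; the similarity threshold is an assumed value, and any other recognition technique could be substituted.

        import cv2
        import numpy as np

        def find_candidate_places(line_drawing_gray, patch, threshold=0.7):
            """Return candidate (x, y, w, h) regions on the line drawing that resemble the patch."""
            result = cv2.matchTemplate(line_drawing_gray, patch, cv2.TM_CCOEFF_NORMED)
            ys, xs = np.where(result >= threshold)
            h, w = patch.shape[:2]
            return [(int(x), int(y), w, h) for x, y in zip(xs, ys)]   # top-left corners plus patch size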
  • the application candidate places of the local style in the image data are extracted from the feature of the selected place in the reference image data using the image recognition technique, or the extraction place candidates of the local style in the reference image data are extracted from the feature of the selected place in the image data using the image recognition technique, such that there is an effect that the convenience of the user is improved.
  • the coloring processing according to the present disclosure may be applied to, for example, image data having the same property as that of the line drawing data such as black-and-white comics, a gray scale image, a pencil sketch, a line drawing in which a shadow, a halftone or the like is partially written, and an undercoated line drawing, in addition to the line drawing data, as long as the image data can be prepared in a pair with the coloring correct answer image data and learning can be performed based on the image data.
  • In the above description, the line drawing data that forms a set with the coloring correct answer image data is extracted by performing edge extraction processing on the coloring correct answer image data, but it is also possible to automatically create a pair of images before and after the coloring from a data set including only the coloring correct answer image data by using a standard image processing technique, such as grayscale processing, processing converting a brightness into a halftone, or processing reducing the number of colors, instead of the edge extraction processing.
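  • A minimal sketch of creating such a pair without edge extraction, using grayscale conversion and color reduction from the Pillow library, is shown below; the number of colors is an assumed parameter.

        from PIL import Image

        def make_pair_without_edges(color_path: str, colors: int = 16):
            """Create before/after coloring images without edge extraction."""
            color = Image.open(color_path).convert("RGB")             # coloring correct answer image
            gray = color.convert("L")                                 # grayscale processing
            reduced = color.quantize(colors=colors).convert("RGB")    # processing reducing the number of colors
            return gray, reduced, color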


Abstract

A line drawing automatic coloring method according to the present disclosure includes: acquiring line drawing data of a target to be colored; receiving at least one local style designation for applying a selected local style to at least one place of the acquired line drawing data; and performing coloring processing reflecting the local style designation on the line drawing data based on a learned model for coloring in which it is learned in advance using the line drawing data and the local style designation as inputs.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of and priority to Japanese application number JP2017-108427, filed Sep. 20, 2017, the disclosure of which is incorporated in its entirety by reference herein.
  • BACKGROUND Technical Field
  • The present disclosure relates to a line drawing automatic coloring program, a line drawing automatic coloring device, and a line drawing automatic coloring method for automatically coloring a line drawing image.
  • Related Art
  • In recent years, machine learning using a neural network having a multilayer structure, called deep learning, has been applied in various fields. Machine learning has also been prominently utilized, and has achieved remarkable results, in fields of image processing such as image recognition and image generation.
  • SUMMARY
  • A line drawing automatic coloring program according to the present disclosure is a line drawing automatic coloring program for causing a computer to realize processing for automatically performing coloring on line drawing data, the line drawing automatic coloring program causing the computer to realize the following functions: a line drawing data acquiring function of acquiring line drawing data of a target to be colored; a local style designation receiving function of receiving at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing function of performing coloring processing reflecting the local style designation received by the local style designation receiving function on the line drawing data acquired by the line drawing data acquiring function based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • In addition, in the line drawing automatic coloring program according to the present disclosure, the local style designation receiving function includes: a reference image acquiring function of acquiring at least one reference image from which a user desires to extract the local style; a local style extraction place designating function of receiving at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting function of performing extraction processing of extracting the local style from the reference image with respect to at least one designated place designated by the local style extraction place designating function; and a local style application designating function of designating a place to which the local style extracted by the local style extracting function is applied with respect to the line drawing data acquired by the line drawing data acquiring function.
  • In addition, in the line drawing automatic coloring program according to the present disclosure, the local style designation receiving function is configured to receive at least one local style designation selected by a user from a plurality of local styles extracted in advance by extraction processing and stored by a storage means.
  • In addition, in the line drawing automatic coloring program according to the present disclosure, in the extraction processing for the local style, the local style is extracted based on a learned model for a local style in which it is learned in advance to extract the local style from any place of the reference image.
  • In addition, in the line drawing automatic coloring program according to the present disclosure, an encoder obtained by preparing plural sets of line drawing data and coloring correct answer image data and executing the following steps for the plural sets of line drawing data and coloring correct answer image data is set to the learned model for a local style, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data, and the following steps including: a step of inputting the coloring correct answer image data as the reference image to the encoder extracting the local style and generating a local style map corresponding to each of all pixels of the input coloring correct answer image data; a step of picking up at least one local style from the local style map extracted by the encoder and inputting the picked up local style together with the line drawing data to a decoder; a step of executing coloring processing reflecting the picked up local style on the line drawing data in the decoder to obtain colored image data; a step of calculating loss of the colored image data with respect to the coloring correct answer image data by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data; and a step of updating parameters of the encoder and the decoder so as to reduce the loss calculated by the loss function.
  • In addition, in the line drawing automatic coloring program according to the present disclosure, a decoder obtained by preparing plural sets of line drawing data and coloring correct answer image data and executing the following steps for the plural sets of line drawing data and coloring correct answer image data is set to the learned model for coloring, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data, and the following steps including: a step of inputting the coloring correct answer image data as the reference image to an encoder extracting the local style and generating a local style map corresponding to each of all pixels of the input coloring correct answer image data; a step of picking up at least one local style from the local style map extracted by the encoder and inputting the picked up local style together with the line drawing data to the decoder; a step of executing coloring processing reflecting the picked up local style on the line drawing data in the decoder to obtain colored image data; a step of calculating loss of the colored image data with respect to the coloring correct answer image data by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data; and a step of updating parameters of the encoder and the decoder so as to reduce the loss calculated by the loss function.
  • A line drawing automatic coloring device according to the present disclosure includes: a line drawing data acquiring unit configured to acquire line drawing data of a target to be colored; a local style designation receiving unit configured to receive at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing unit configured to perform coloring processing reflecting the local style designation received by the local style designation receiving unit on the line drawing data acquired by the line drawing data acquiring unit based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • In addition, in the line drawing automatic coloring device according to the present disclosure, the local style designation receiving unit includes: a reference image acquiring unit configured to acquire at least one reference image from which a user desires to extract the local style; a local style extraction place designating unit configured to receive at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting unit configured to perform extraction processing of extracting the local style from the reference image with respect to at least one designated place designated by the local style extraction place designating unit; and a local style application designating unit configured to designate a place to which the local style extracted by the local style extracting unit is applied with respect to the line drawing data acquired by the line drawing data acquiring unit.
  • According to the present disclosure, a line drawing automatic coloring method for automatically performing coloring on line drawing data includes: a line drawing data acquiring step of acquiring line drawing data of a target to be colored; a local style designation receiving step of receiving at least one local style designation for applying a selected local style to any place of the acquired line drawing data; and a coloring processing step of performing coloring processing reflecting the local style designation received in the local style designation receiving step on the line drawing data acquired in the line drawing data acquiring step based on a learned model for coloring in which it is learned in advance to perform coloring processing while reflecting the local style on the line drawing data using the line drawing data and the local style designation as inputs.
  • In addition, in the line drawing automatic coloring method according to the present disclosure, the local style designation receiving step includes: a reference image acquiring step of acquiring at least one reference image from which a user desires to extract the local style; a local style extraction place designating step of receiving at least one designation of a place from which the user desires to extract the local style in the acquired reference image; a local style extracting step of performing extraction processing of extracting the local style from the reference image with respect to at least one designated place designated in the local style extraction place designating step; and a local style application designating step of designating a place to which the local style extracted in the local style extracting step is applied with respect to the line drawing data acquired in the line drawing data acquiring step.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram showing a configuration of a line drawing automatic coloring device according to the present disclosure;
  • FIG. 2 is a view showing a concept of coloring processing using a local style in the line drawing automatic coloring device according to the present disclosure;
  • FIG. 3 is a flowchart showing a flow of learning of a learned model for a local style and a learned model for coloring that are used in the line drawing automatic coloring device according to the present disclosure;
  • FIG. 4 is a flowchart showing a flow of coloring processing in the line drawing automatic coloring device according to the present disclosure; and
  • FIG. 5 is a block diagram showing a configuration of a line drawing automatic coloring device according to a second embodiment.
  • DETAILED DESCRIPTION First Embodiment
  • Hereinafter, an example of a line drawing automatic coloring device according to a first embodiment is described with reference to the drawings. FIG. 1 is a block diagram showing a configuration of a line drawing automatic coloring device 10 according to the present disclosure. It should be noted that the line drawing automatic coloring device 10 may be a device designed as a dedicated machine, but it is assumed that the line drawing automatic coloring device 10 is a device that can be realized by general computers. In this case, it may be assumed that the line drawing automatic coloring device 10 includes a central processing unit (CPU), a graphics processing unit (GPU), a memory, and a storage such as a hard disk drive that are normally included in the general computers and are not shown. In addition, various types of processing may be executed by a program in order to cause these general computers to function as the line drawing automatic coloring device 10 according to the present embodiment.
  • As shown in FIG. 1, the line drawing automatic coloring device 10 includes at least a line drawing data acquiring unit 11 (also referred to as line drawing data acquirer), a reference image acquiring unit 12 (also referred to as reference image acquirer), a local style extraction place designating unit 13 (also referred to as local style extraction place designator), a local style extracting unit 14 (also referred to as local style extractor), a local style application designating unit 15 (also referred to as local style application designator), a coloring processing unit 16 (also referred to as coloring processor), and a storing unit 17 (also referred to as storage device).
  • The line drawing data acquiring unit 11 has a function of acquiring line drawing data of a target to be colored. In the present disclosure, the line drawing that is the target to be colored is not particularly limited, but in the learning process of the learning model described below, it is preferable that the line drawing data prepared as a set together with the coloring correct answer image data include line drawing similar, in line thickness, type of touch, and the like, to the line drawing that the user desires to set as the target to be colored.
  • The reference image acquiring unit 12 has a function of acquiring reference image data for extracting a local style that the user desires to apply to the line drawing data of the target to be colored. Here, the local style is a local style related to coloring such as a color, a texture, a gradation, a painting style, a pattern, a highlight, and a shadow.
  • The local style extraction place designating unit 13 has a function of designating a place at which the user desires to extract the local style in the reference image data acquired by the reference image acquiring unit 12. The local style is generated using information of neighboring pixels present within a predetermined range with respect to one pixel of the reference image data, and when the reference image data is composed of the number of pixels of width W×height H, a local style is generated for each pixel, and plural types of styles such as a color, a texture, a gradation, a painting style, a pattern, a highlight, and a shadow are extracted for one pixel. Therefore, when the number of types of styles is set to C, W×H×C local styles can be generated from one reference image data.
  • The local style extracting unit 14 has a function of extracting the local style from the reference image data. Extraction processing of the local style is performed based on, for example, a learned model for a local style learned in advance using training data for convolutional neural networks (CNN). The range of neighboring pixels used for extracting the local style for one pixel of the reference image data can be set appropriately, and extraction may be performed with a plurality of patterns of neighboring-pixel ranges. The extraction processing in the local style extracting unit 14 may perform extraction only on the place designated by the local style extraction place designating unit 13, or may perform extraction of local styles on all pixels of the reference image data.
  • The local style application designating unit 15 has a function of designating to which area in the line drawing data of the target to be colored the local style extracted from the place designated by the local style extraction place designating unit 13 is applied. The local style for the line drawing data may be designated at one place or at a plurality of places. In addition, all of the C types of local styles extracted from one pixel designated by the local style extraction place designating unit 13 may be applied to the designated place of the line drawing data, or a specific local style of the C types may be selected and applied to the designated place. Any local styles, and any characteristics of those local styles, selected by the user can be applied. For example, of the local styles extracted from one pixel designated by the local style extraction place designating unit 13, only the local style related to the texture can be applied without applying the local style related to the color.
  • The coloring processing unit 16 has a function of performing coloring processing reflecting the local style designated for the line drawing data. The coloring processing is performed based on, for example, a learned model for coloring in which it is learned in advance to perform coloring processing reflecting the local style on the line drawing data, using the line drawing data and the designation of the application place of at least one local style for the line drawing data as inputs. Colored image data is obtained by performing the coloring processing in the coloring processing unit 16.
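  • The following is a minimal sketch of such a learned model for coloring as a decoder network that receives the line drawing data and the style conditioning map sketched above and outputs colored image data. The plain convolutional stack shown here is an assumption for brevity; the disclosure does not fix the architecture, and a U-Net or other encoder-decoder could equally be used.

```python
# A minimal sketch of the "learned model for coloring": a decoder taking the
# line drawing and the designated local styles as inputs and producing a
# colored image. Architecture and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ColoringDecoder(nn.Module):
    def __init__(self, line_ch=1, style_ch=64, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(line_ch + style_ch + 1, 64, 3, padding=1),  # +1 for the mask
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, padding=1),
            nn.Sigmoid(),                                          # colors in [0, 1]
        )

    def forward(self, line_drawing, style_condition):
        # line_drawing: (N, 1, H, W); style_condition: (N, C + 1, H, W)
        x = torch.cat([line_drawing, style_condition], dim=1)
        return self.net(x)
```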
  • The storing unit 17 has a function of storing data required for various processing performed in the line drawing automatic coloring device 10 including the line drawing data acquiring unit 11, the reference image acquiring unit 12, the local style extraction place designating unit 13, the local style extracting unit 14, the local style application designating unit 15, the coloring processing unit 16 and the like, and data obtained as a result of the processing.
  • FIG. 2 is a view showing a concept of coloring processing using a local style in the line drawing automatic coloring device 10 according to the present disclosure. In the line drawing automatic coloring device 10 according to the present disclosure, when automatically coloring the line drawing data, it is possible to extract the local style desired by the user from a reference image and reflect the extracted local style on the line drawing data. Three reference images A, B, and C are shown on the left side of FIG. 2. When the user desires to use a portion of the sky of the background of the reference image A as a background of the line drawing data, the user designates one place of the background of the reference image A, extracts the local style from the designated point, and designates a place of the sky of the line drawing data as the point on which the user desires to reflect the local style extracted from the reference image A. In addition, when the user desires to use an expression of the body hair of a cat of the reference image B for coloring of a dog of the line drawing data, the user designates one place of a body hair portion of the reference image B, extracts the local style from the designated point, and designates a body portion of the dog of the line drawing data as the point on which the user desires to reflect the local style extracted from the reference image B. When the user desires to use a texture of a belt of a wristwatch of the reference image C for coloring of a hat of the line drawing data, the user designates one place of the belt portion of the wristwatch of the reference image C, extracts the local style from the designated point, and designates a hat portion of the line drawing data as the point on which the user desires to reflect the local style extracted from the reference image C. As described above, the local styles are extracted from each of the reference images A, B, and C, the places on the line drawing data on which the user desires to reflect the local styles are designated, and the coloring processing is performed on those places, whereby the colored image data shown on the right side of FIG. 2 is obtained by coloring the line drawing data. Hatching is applied only to the places at which the local styles are designated in the colored image data on the right side of FIG. 2; although not expressed in FIG. 2, the coloring processing may also be automatically performed on other places at which no local style is designated.
  • FIG. 3 is a flowchart showing an example of the flow of learning of the learned model for a local style and the learned model for coloring that are used in the line drawing automatic coloring device 10 according to the present disclosure. The learning method is not limited to one, and various learning processes can be used. For example, learning of the learned model for a local style for extracting the local style and of the learned model for coloring for performing the coloring processing can be carried out simultaneously.
  • For learning, plural sets of line drawing data and coloring correct answer image data are prepared, each set having line drawing data and coloring correct answer image data representing a correct answer coloring state for the line drawing data. A set of line drawing data and coloring correct answer image data can be prepared by extracting only the line drawing data from the coloring correct answer image data by edge extraction or the like.
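  • A minimal sketch of preparing such a set follows. OpenCV's Canny detector is used here purely as an example of edge extraction; the disclosure does not specify which edge extraction technique is used, and the thresholds are illustrative.

```python
# A minimal sketch of preparing a (line drawing, coloring correct answer) pair
# from a colored image by edge extraction; detector choice and thresholds are
# illustrative assumptions.
import cv2

def make_training_pair(color_image_path):
    color = cv2.imread(color_image_path)                    # coloring correct answer image
    gray = cv2.cvtColor(color, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, threshold1=50, threshold2=150)  # white lines on black
    line_drawing = 255 - edges                               # black lines on white paper
    return line_drawing, color
```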
  • In starting the learning processing, first, two convolutional neural networks, namely an encoder that extracts a local style from a reference image and a decoder that performs the coloring processing on the line drawing data, are prepared as the convolutional neural networks to be learned (S101).
  • Of the line drawing data and coloring correct answer image data prepared as sets, the coloring correct answer image data composed of W×H pixels is input to the encoder as a reference image, and the encoder extracts a local style map composed of W×H×C local styles (S102). That is, a local style map corresponding to each of the pixels of the input coloring correct answer image data is generated.
  • At least one local style of the W×H×C local styles extracted by the encoder is picked up (for example, randomly picked up), and the picked-up local style is input together with the line drawing data to the decoder (S103). In this case, with respect to the designation of the application place of the local style for the line drawing data, an input is given so as to apply the local style to the pixel position on the line drawing data at the same position as the pixel position on the coloring correct answer image data at which the local style was picked up. In addition, the pick-up of the local style is performed in both a pattern that picks up all of the C types of local styles corresponding to one pixel as a bundle and a pattern that picks up only some of the C types of local styles corresponding to one pixel. With respect to the pick-up rule, in addition to random pick-up, any process such as a process of performing pick-up according to a predetermined rule may be used. Learning including a pattern that does not pick up any local style may also be performed. When considering convenience of the user, it is preferable that both coloring that applies the local style and coloring that does not apply the local style can be performed.
  • The decoder executes coloring processing that reflects the picked up local style on the line drawing data (S104). The decoder executes the coloring processing to obtain colored image data.
  • Then, loss of the colored image data with respect to the coloring correct answer image data is calculated by a loss function, using the colored image data obtained by the decoder and the coloring correct answer image data (S105). Finally, parameters of the encoder and the decoder are updated so as to reduce the loss calculated by the loss function (S106). The updating process of the parameters of the encoder and the decoder may be repeated until the loss is reduced to be less than a threshold value.
  • Steps S101 to S106 of FIG. 3 constitute one cycle as the minimum unit of learning. Learning can be repeated for a considerable number of cycles, and the learning is completed at the stage where appropriate extraction of the local style and acquisition of colored image data in which appropriate coloring is performed become possible. The parameters and the like of the encoder at the completion of the learning are acquired as the learned model for a local style, the parameters and the like of the decoder at the completion of the learning are acquired as the learned model for coloring, and the acquired parameters are stored in the storing unit 17.
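  • Putting steps S101 to S106 together, one learning cycle can be sketched as follows, reusing the encoder, decoder, and conditioning helper sketched above. The random pick-up of local style positions, the occasional restriction to a subset of the C style types, the L1 loss, and the optimizer settings are all assumptions for illustration; the disclosure leaves these choices open.

```python
# A minimal sketch of one learning cycle (S101 to S106), assuming the
# LocalStyleEncoder, ColoringDecoder and build_style_condition sketched above.
# Local styles are picked up at random positions of the coloring correct answer
# image and applied to the same positions on the line drawing.
import random
import torch
import torch.nn.functional as F

encoder = LocalStyleEncoder()   # S101: encoder extracting local styles
decoder = ColoringDecoder()     # S101: decoder performing the coloring
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-4)

def training_cycle(line_drawing, correct_answer, num_picks=4):
    # S102: extract the W x H x C local style map from the correct answer image.
    style_map = encoder(correct_answer.unsqueeze(0))[0]           # (C, H, W)
    num_ch, h, w = style_map.shape

    # S103: pick up local styles (sometimes the full bundle, sometimes a subset,
    # sometimes none at all) and designate the same positions on the line drawing.
    designations = []
    for _ in range(random.randint(0, num_picks)):
        y, x = random.randrange(h), random.randrange(w)
        channels = None if random.random() < 0.5 else random.sample(range(num_ch), 8)
        designations.append(((y, x), (y, x), channels))
    cond = build_style_condition(style_map, designations, h, w, num_ch)

    # S104: coloring processing by the decoder.
    colored = decoder(line_drawing.unsqueeze(0), cond.unsqueeze(0))

    # S105-S106: loss against the correct answer image and parameter update.
    loss = F.l1_loss(colored, correct_answer.unsqueeze(0))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```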
  • FIG. 4 is a flowchart showing a flow of coloring processing in the line drawing automatic coloring device 10 according to the present disclosure. The coloring processing in the line drawing automatic coloring device 10 according to the present embodiment is started by acquiring the line drawing data (step S201). For example, the acquisition of the line drawing data is performed by the user selecting the line drawing data of the target to be colored.
  • Then, the reference image data from which the user desires to extract the local style is acquired (step S202). In the acquired reference image data, the place at which the user desires to extract the local style is designated (step S203). Then, the local style of the designated place is extracted (step S204). The extraction of the local style is performed based on, for example, the learned model for a local style learned in advance using the training data. Next, a position on the line drawing data to which the user desires to apply the extracted local style is designated (step S205). The above steps S201 to S205 can be executed via a graphical user interface. For example, it is conceivable to display the acquired line drawing data and the acquired reference image data in respective display regions provided on a display screen, designate the extraction place of the local style by selecting a point on the displayed reference image with, for example, a mouse pointer, and designate the application place of the local style with the mouse pointer on the displayed line drawing.
  • After executing steps S201 to S205, an input from the user as to whether or not to extract and apply another local style is received, and it is determined whether or not to extract and apply another local style (step S206). When it is desired to extract and apply another local style (S206—Y), steps S201 to S205 are executed once again. These steps can be repeated. When there is no need to extract another local style (S206—N), the coloring processing proceeds to the next step S207.
  • The coloring processing is executed on the entirety of the line drawing data while reflecting the local style on the designated place, using the line drawing data, the local style, and the designation of the application place of the local style as inputs (step S207). The coloring processing is performed based on the learned model for coloring in which it is learned in advance to perform the coloring processing reflecting the local style on the line drawing data. The colored image data obtained by the coloring processing can be presented to the user via, for example, a graphical user interface that shows the coloring state by displaying the colored image data instead of the line drawing data in the display region displaying the line drawing data on the display screen. An input from the user as to whether or not the colored image data needs to be corrected is received, and it is determined whether or not the colored image data needs to be corrected (step S208). When the user desires to modify the colored image data (S208—Y), steps S201 to S205 are executed once again. In this case, steps S201 to S205 can be executed again in a state in which the extracted local style and the designation of the application place of the local style are maintained. When the colored image data does not need to be modified (S208—N), the coloring processing ends.
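  • The flow of steps S201 to S207 can be sketched as follows, again reusing the encoder and decoder sketched above. The list of user designations stands in for the graphical user interface described above, and the data layout is an assumption for illustration.

```python
# A minimal sketch of the coloring flow of FIG. 4 (S201 to S207), assuming the
# encoder and decoder sketched above. Each designation names a reference image,
# a point on it, a point on the line drawing, and (optionally) which of the C
# style types to apply.
import torch

def color_line_drawing(line_drawing, reference_images, user_designations, style_ch=64):
    """user_designations: list of (reference_index, (ref_y, ref_x), (tgt_y, tgt_x), channels)."""
    _, h, w = line_drawing.shape
    cond = torch.zeros(style_ch, h, w)
    mask = torch.zeros(1, h, w)
    with torch.no_grad():
        for ref_idx, (ry, rx), (ty, tx), channels in user_designations:      # S202-S206
            style_map = encoder(reference_images[ref_idx].unsqueeze(0))[0]   # S204
            picked = style_map[:, ry, rx]                                    # S203
            if channels is None:
                cond[:, ty, tx] = picked                                     # S205: full bundle
            else:
                cond[channels, ty, tx] = picked[channels]                    # S205: subset only
            mask[0, ty, tx] = 1.0
        condition = torch.cat([cond, mask], dim=0)
        colored = decoder(line_drawing.unsqueeze(0), condition.unsqueeze(0)) # S207
    return colored[0]
```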
  • As described above, according to the line drawing automatic coloring device 10 according to the present embodiment, with respect to the line drawing data of the target to be colored, the local style desired by the user is extracted from the reference image data, the place to which the extracted local style is applied is designated in the line drawing data, and the coloring processing can be executed. Therefore, it is possible to realize automatic coloring processing reflecting the local styles related to coloring characteristics such as the texture, the gradation, the painting style, the pattern, the highlight, and the shadow as well as the color, at the place desired by the user in the line drawing data. In addition, when designating the local style, plural types of local styles simultaneously extracted for the same pixel can be designated as a bundle, or only some of the plural types of local styles simultaneously extracted for the same pixel can be selected and designated. Therefore, for example, a local style in which the user desires to reflect only the texture without reflecting the color can be designated, so that the user experience is improved.
  • In some embodiments, the place to which the local style desired by the user is applied is designated with respect to the line drawing data of the target to be colored, and the coloring processing can be executed. In other words, it is possible to realize the automatic coloring processing reflecting the local styles related to coloring such as the texture, the gradation, the painting style, the pattern, the highlight, and/or the shadow as well as the color at the place desired by the user in the line drawing data. The local style may be extracted from the place designated by the user in the reference image in a state in which the reference image is acquired and be reflected on the line drawing data, or may be selected by the user from a library in which a plurality of local styles are extracted in advance and stored in the storing unit and be reflected on the line drawing data. In addition, plural types of local styles simultaneously extracted for the same pixel can be designated as a bundle, or only some of them can be selected and designated. Therefore, for example, a local style in which the user desires to reflect only the texture without reflecting the color can be designated, so that the user experience is improved.
  • Second Embodiment
  • A case in which the user selects the reference image data from which the user desires to extract the local style and extracts the local style from the reference image data has been described in the first embodiment, but the present disclosure is not limited thereto. In the second embodiment, an embodiment is described in which plural types of local styles are extracted in advance by extraction processing and stored in a storage means, and the user selects and uses a desired local style from among the plural types of local styles.
  • FIG. 5 is a block diagram showing a configuration of a line drawing automatic coloring device 20 according to the second embodiment. As shown in FIG. 5, the line drawing automatic coloring device 20 includes at least a line drawing data acquiring unit 11, a local style designation receiving unit 21, a coloring processing unit 16, and a storing unit 22. It is to be noted that components denoted by the same reference numerals as those of FIG. 1 according to the first embodiment perform the same functions as those of FIG. 1 in the present embodiment, and a description thereof is thus omitted.
  • The local style designation receiving unit 21 has a function of receiving at least one local style designation for applying a selected local style to any place of acquired line drawing data. The local style designation according to the present embodiment is performed in a form in which the user selects a desired local style from a local style library in which the plural types of local styles are extracted in advance and stored and designates a place on the line drawing data on which the user desires to reflect the selected local style.
  • It is noted that the local style designation receiving unit 21 in the present embodiment may have the same function as the function of extracting the local style from the reference image data and reflecting the extracted local style on the line drawing data, which is performed in the reference image acquiring unit 12, the local style extraction place designating unit 13, the local style extracting unit 14, and the local style application designating unit 15 in the first embodiment. That is, the local style designation receiving unit 21 may have both functions so that the local style may be extracted and used from the reference image data or may be selected and used from the local style library.
  • Like the storing unit 17 in the first embodiment, the storing unit 22 stores data required for the various processing and data obtained as a result of the processing, and also stores the local style library constituted by the plural types of local styles extracted in advance. It is preferable that the local style library can be classified and sorted depending on desired conditions such as a type of texture, an author, and the like so as to be easily used by the user.
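  • One possible realization of such a local style library is a simple collection of entries that pair the extracted style values with metadata used for classification and sorting. The field names below are illustrative assumptions, not taken from the disclosure.

```python
# A minimal sketch of a local style library kept in the storing unit: each
# entry holds the extracted style values together with metadata so that the
# library can be classified and sorted by texture type, author and the like.
from dataclasses import dataclass, field

@dataclass
class LocalStyleEntry:
    name: str                         # e.g. "sky gradation", "cat fur"
    style: list                       # the C extracted style values for one pixel
    texture_type: str = ""            # classification key for sorting
    author: str = ""                  # classification key for sorting
    tags: list = field(default_factory=list)

def sort_library(library, key="texture_type"):
    # Sort entries by a desired condition so the user can find a style easily.
    return sorted(library, key=lambda entry: getattr(entry, key))
```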
  • A flow of coloring processing in the line drawing automatic coloring device 20 according to the second embodiment is the same as the coloring processing in FIG. 4, except for the following point. Instead of the processing of extracting and applying the local style from the reference image data in steps S202 to S205 in FIG. 4 of the coloring processing of the line drawing automatic coloring device 10 according to the first embodiment, processing is executed in which the user selects the desired local style with reference to the local style library stored in the storing unit 22 and designates a position on the line drawing data on which the user desires to reflect the selected local style.
  • As described above, according to the line drawing automatic coloring device 20 according to the second embodiment, the place to which the local style desired by the user is applied is designated with respect to the line drawing data of the target to be colored with reference to the local style library, and the coloring processing can be executed. Therefore, it is possible to realize automatic coloring processing reflecting local styles related to coloring such as a texture, a gradation, a painting style, a pattern, a highlight, and a shadow as well as a color at the place desired by the user in the line drawing data. Since the local style library is created in advance and the user can select the local style from the local style library, a frequently used local style can be used for the coloring processing without being extracted from the reference image data each time, so that the convenience of the user is improved.
  • A case in which the coloring processing is executed after the designation of all the local styles ends when the designation of the local styles for the line drawing data is performed plural times has been described in the first and second embodiments, but the present disclosure is not limited thereto. For example, the coloring processing may be executed each time the designation of the application place of the local style from the reference image data or from the local style library is performed on the line drawing data. By executing the coloring processing each time and displaying the colored image data in the display region of the display screen each time, the user can designate the application place of the next local style while confirming the state of the colored image, which changes each time an application place is designated, so that the user experience is improved.
  • The coloring processing that applies the local style has been described in the first and second embodiments, but there are many local styles having directionality or regularity such as a gradation or a pattern. When applying these local styles to the designated place of the line drawing data, a function of performing conversion processing such as converting angles of the local styles, converting hues of the local styles, or changing orientations of gradations of the local styles may be added. As a result, the user experience is further improved.
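  • As one possible realization of such conversion processing, a reference patch could be rotated and hue-shifted before its local style is extracted, as sketched below. Where the conversion is applied (to the reference patch, to the extracted style, or inside the coloring model) is not fixed by the disclosure; this sketch and its parameters are illustrative only.

```python
# A minimal sketch of conversion processing applied to a small reference patch:
# rotation changes the direction of a gradation or pattern, and a hue shift
# changes the color tone while keeping the texture. Values are illustrative.
import cv2
import numpy as np

def convert_patch(patch_bgr, angle_deg=0.0, hue_shift=0):
    # Rotate the patch to change the orientation of a gradation or pattern.
    h, w = patch_bgr.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    rotated = cv2.warpAffine(patch_bgr, rot, (w, h), borderMode=cv2.BORDER_REFLECT)
    # Shift the hue channel to convert the color tone.
    hsv = cv2.cvtColor(rotated, cv2.COLOR_BGR2HSV)
    hsv[..., 0] = (hsv[..., 0].astype(np.int32) + hue_shift) % 180  # OpenCV hue range is 0-179
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
```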
  • In the first and second embodiments, after the extraction place of the local style is designated, the place to which the extracted local style is applied is determined by designating any place from the line drawing data displayed on the display screen by the user, who provides the instruction via a user interface. In this case, a function of informing the user of the place on the line drawing data related to the place at which the local style is extracted may be provided.
  • Processing using an existing image recognition technique, such as pattern matching or object detection, is performed on each of the reference image data and the line drawing data to find which place on the line drawing data has a strong relationship with the feature of the image at the extraction place of the local style. For example, when the user selects an "eye" portion of a person on the reference image data by clicking the mouse, a place corresponding to a feature of an "eye" is extracted from the line drawing data using an existing image recognition technique and is presented in a form in which it can be recognized by the user. For example, a method of informing the user of the correspondence by blinking the "eye" portion on the line drawing data displayed on the display screen or temporarily changing a color of the "eye" portion is conceivable.
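  • A minimal sketch of such correspondence extraction follows, using plain template matching as one example of an existing image recognition technique; object detection or any other matcher could be substituted, and the patch size is an illustrative assumption.

```python
# A minimal sketch of finding the place on the line drawing that corresponds to
# the place selected on the reference image, using template matching as one
# example of an existing image recognition technique.
import cv2

def find_corresponding_place(reference_gray, line_drawing_gray, click_xy, patch_size=32):
    x, y = click_xy
    half = patch_size // 2
    template = reference_gray[y - half:y + half, x - half:x + half]  # patch around the selected place
    result = cv2.matchTemplate(line_drawing_gray, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    # Top-left corner of the best match; this place can be blinked or recolored
    # on the display screen to inform the user of the correspondence.
    return max_loc
```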
  • Likewise, when a specific place on the line drawing data previously displayed on the display screen is selected by a click of the mouse or the like, candidate extraction places, in the reference image data, of a local style to be applied to the selected place may be extracted by performing processing using an image recognition technique.
  • In addition, not only may the candidate places be extracted by the processing using the image recognition technique, but coloring processing reflecting the local style may also be executed automatically by automatically performing selection from among the extracted candidate places.
  • Further, when a plurality of targets on which the user desires to perform the same coloring processing appear, for example, when the same character repeatedly appears in a plurality of frames of a comic or when the same character appears in a plurality of pieces of line drawing data of a target to be colored, the extracted local style may be applied to these targets simultaneously. Also in this case, candidate places of the target to be colored that have the same feature as that of the place designated for extracting the local style are extracted by the existing image recognition technique, and the extracted local style is applied to the plurality of extracted candidate places. By performing the processing as described above, it is possible to efficiently perform animation creation, coloring processing of comics, and the like.
  • As described above, the application candidate places of the local style in the image data are extracted from the feature of the selected place in the reference image data using the image recognition technique, or the extraction place candidates of the local style in the reference image data are extracted from the feature of the selected place in the image data using the image recognition technique, which has the effect of improving the convenience of the user.
  • A configuration in which the coloring processing applying the local style extracted from the reference image data is performed on the line drawing data has been described in the first and second embodiments, but the present disclosure is not limited thereto. The coloring processing according to the present disclosure may be applied to, for example, image data having the same property as that of the line drawing data such as black-and-white comics, a gray scale image, a pencil sketch, a line drawing in which a shadow, a halftone or the like is partially written, and an undercoated line drawing, in addition to the line drawing data, as long as the image data can be prepared in a pair with the coloring correct answer image data and learning can be performed based on the image data.
  • In the case of the line drawing data, the line drawing data forming a set is extracted by performing edge extraction processing on the coloring correct answer image data. However, it is also possible to automatically create a before-and-after coloring pair from a data set including only the coloring correct answer image data by using a standard image processing technique, such as grayscale processing, processing converting brightness into a halftone, or processing reducing the number of colors, instead of the edge extraction processing.
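  • A minimal sketch of creating such a before-and-after pair without edge extraction follows; the choice of Pillow, the grayscale conversion, and the number of colors used for color reduction are illustrative assumptions.

```python
# A minimal sketch of creating a before/after pair from a data set that only
# contains coloring correct answer images, using grayscale conversion or color
# reduction instead of edge extraction.
from PIL import Image

def make_pair_without_edges(color_image_path, mode="grayscale", num_colors=8):
    color = Image.open(color_image_path).convert("RGB")             # coloring correct answer image
    if mode == "grayscale":
        before = color.convert("L")                                  # gray scale input
    else:
        before = color.quantize(colors=num_colors).convert("RGB")    # reduced-color input
    return before, color
```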

Claims (2)

1-16. (canceled)
17. A method of automatic coloring, comprising:
acquiring data of a target to be colored;
receiving at least one local style designation for applying a selected local style to at least a first part of the target to be colored, wherein the selected local style is not applied to another part of the target to be colored; and
performing coloring processing reflecting the local style designation on the first part of the target to be colored, by inputting (i) the acquired data of the target to be colored, (ii) extracted information indicating the selected local style applied to the first part of the target, the extracted information being obtained from a predetermined process for extracting a local style based on a user-designated part of a reference image having the selected local style, and (iii) information indicating a position of the first part to which the selected local style is applied, into a neural network system for coloring the target to be colored,
wherein the selected local style is related to a style different from a color, and
wherein the neural network system is obtained based on at least a training process using image data to be colored and answer image data of the image data to be colored.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017180427A JP7242165B2 (en) 2017-09-20 2017-09-20 Program, Information Processing Apparatus, and Method
JP2017-180427 2017-09-20
US16/135,627 US11386587B2 (en) 2017-09-20 2018-09-19 Automatic coloring of line drawing
US17/834,856 US20220301239A1 (en) 2017-09-20 2022-06-07 Automatic coloring of line drawing

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/135,627 Continuation US11386587B2 (en) 2017-09-20 2018-09-19 Automatic coloring of line drawing

Publications (1)

Publication Number Publication Date
US20220301239A1 true US20220301239A1 (en) 2022-09-22

Family

ID=65720448

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/135,627 Active US11386587B2 (en) 2017-09-20 2018-09-19 Automatic coloring of line drawing
US17/834,856 Pending US20220301239A1 (en) 2017-09-20 2022-06-07 Automatic coloring of line drawing

Country Status (2)

Country Link
US (2) US11386587B2 (en)
JP (1) JP7242165B2 (en)

Also Published As

Publication number Publication date
JP2019057066A (en) 2019-04-11
US20190087982A1 (en) 2019-03-21
JP7242165B2 (en) 2023-03-20
US11386587B2 (en) 2022-07-12
