KR101752701B1 - Method for recreating makeup on image - Google Patents
Method for recreating makeup on image
- Publication number
- KR101752701B1
- Authority
- KR
- South Korea
- Prior art keywords
- cosmetic
- image
- pattern
- makeup
- generating
- Prior art date
Abstract
A method of reproducing an image with the same effect as actual makeup is disclosed. The method for reproducing a cosmetic effect on an image includes the steps of: generating a cosmetic pattern based on a difference value between a pre-makeup image and a post-makeup image; generating a makeup tool pattern by referring to characteristics of the makeup tool; generating an application pattern by resizing and synthesizing the cosmetic pattern and the makeup tool pattern; and reproducing the cosmetic effect by applying the application pattern to the original image as an alpha channel. The skin surface can therefore be reproduced realistically, as if actual cosmetics had been applied to the skin.
Description
BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a cosmetic effect reproducing method, and more particularly, to a method of reproducing an image with the same effect as an actual makeup.
In recent years, researchers in computer graphics and multimedia have been developing a variety of applications related to cosmetic makeup, shaping, and orthodontics.
In addition, color simulation apparatuses and methods are being used to demonstrate cosmetics on screen before the user purchases and uses a cosmetic product.
For example, to make a photograph taken with a camera look like a hand-painted oil painting, techniques have been developed that capture the various brush strokes of a professional painter as images and generate an oil-painting image by applying those brush-stroke images to the photograph.
Cosmetic simulation software provided by cosmetics companies typically performs the simulation at the level of extracting facial feature points, dividing the face into regions such as the eyes, nose, and mouth, and applying colors to those regions.
However, techniques for performing a cosmetic simulation on an image input from a camera mostly apply cosmetic colors without considering the properties of the cosmetic material and the makeup tools used.
Therefore, there is a limit to how realistically such techniques can express makeup performed with a specific makeup tool on the image.
SUMMARY OF THE INVENTION It is an object of the present invention to provide a method of reproducing a cosmetic effect on an image based on a makeup pattern.
According to an aspect of the present invention, there is provided a method of reproducing a cosmetic effect on an image, the method comprising: generating a cosmetic pattern based on a difference value between a pre-makeup image and a post-makeup image; generating a makeup tool pattern by referring to characteristics of the makeup tool; generating an application pattern by resizing and synthesizing the cosmetic pattern and the makeup tool pattern; and reproducing the cosmetic effect by applying the application pattern to the original image as an alpha channel.
Here, the method for reproducing the cosmetic effect on the image may further include generating a bidirectional reflectance distribution function for expression of the cosmetic material.
Here, the step of generating the bidirectional reflectance distribution function may generate the bidirectional reflectance distribution function using the photographed image while adjusting the incident angle of the light by changing the position and angle of the illumination.
Here, the step of reproducing the cosmetic effect may reproduce the cosmetic effect by performing image rendering according to the bidirectional reflectance distribution function.
Here, the step of generating the cosmetic pattern may comprise: acquiring a pre-makeup image under a preset illumination; acquiring a post-makeup image under the same illumination; converting the pre-makeup image and the post-makeup image into monochrome images; and calculating a difference value from the converted monochrome images.
Here, the step of creating the makeup tool pattern may generate a makeup tool pattern by referring to the characteristics of the makeup tool including the soft brush and the hard brush.
Here, the step of generating the application pattern may resize and normalize the cosmetic pattern and the makeup tool pattern to the same size to generate an application pattern.
Here, the step of reproducing the cosmetic effect may reproduce the cosmetic effect by alpha-blending the original image and the applied pattern.
When the method of reproducing the cosmetic effect according to an embodiment of the present invention is used, a cosmetic effect like that of applying actual cosmetics to the skin can be reproduced in an image, based on the cosmetic pattern determined by the cosmetic product and the makeup tool.
FIG. 1 is a conceptual diagram illustrating a method of acquiring a cosmetic pattern according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating a method of calculating a bidirectional reflectance distribution function according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a soft brush pattern according to an embodiment of the present invention.
FIG. 4 is an exemplary view of a hard brush pattern according to an embodiment of the present invention.
FIG. 5 is a conceptual diagram illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.
The terms first, second, A, B, etc. may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items, or any single item of the plurality of related listed items.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present.
The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their contextual meanings in the relevant art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a conceptual diagram illustrating a method of acquiring a cosmetic pattern according to an embodiment of the present invention.
Referring to Figure 1, a method of obtaining a cosmetic pattern is described.
First, the settings of the camera 10 and the illumination 20 used for photographing are configured, and the same settings are maintained for both the pre-makeup and post-makeup photographs.
In addition, the image photographed for acquiring the cosmetic pattern is illuminated through a diffusion plate 21 provided on the illumination 20, so that a specular area does not appear in the image.
After the setting of the camera 10 and the illumination 20 is completed, the skin 30 is photographed before and after makeup.
For example, pre-makeup and post-makeup images of actual skin or artificial skin are photographed in a fixed environment in which the positions of the camera 10, the illumination 20, and the skin 30 do not change.
The photographed pre-makeup image and post-makeup image are converted into black-and-white images, and the difference between the converted images is calculated. The difference value can be expressed as a binary number, classifying pixels with a difference value of 0 and pixels with a difference value of 1. That is, a difference value of 0 indicates a pixel with no difference between the pre-makeup image and the post-makeup image, and a difference value of 1 indicates a pixel where a difference exists. Therefore, the pixels having a difference value of 1 can be used as the cosmetic pattern.
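The difference-based extraction described above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming 8-bit RGB inputs and a small noise threshold; the function name and the `threshold` parameter are assumptions, since the patent itself specifies only a binary 0/1 difference:

```python
import numpy as np

def extract_cosmetic_pattern(pre_img, post_img, threshold=10):
    """Binarize the difference between pre- and post-makeup images.

    pre_img, post_img: HxWx3 uint8 RGB arrays taken under the same
    fixed, diffuse illumination. Returns an HxW uint8 mask where 1
    marks pixels changed by the makeup (the cosmetic pattern).
    """
    # Convert both images to grayscale (simple luminance average,
    # computed in float to avoid uint8 wrap-around).
    pre_gray = pre_img.astype(float).mean(axis=2)
    post_gray = post_img.astype(float).mean(axis=2)
    # Pixels whose grayscale difference exceeds the threshold are
    # marked 1 (makeup present); all other pixels are 0.
    diff = np.abs(post_gray - pre_gray)
    return (diff > threshold).astype(np.uint8)
```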
FIG. 2 is a conceptual diagram illustrating a method of calculating a bidirectional reflectance distribution function according to an embodiment of the present invention.
Referring to FIG. 2, in order to reproduce the post-makeup image more realistically, it is necessary to express the cosmetic material.
That is, a bidirectional reflectance distribution function (BRDF) can be utilized to represent the material of the cosmetic.
Figure 2 shows a method of calculating a bi-directional reflectivity distribution function.
The bidirectional reflectance distribution function models the interaction between light and an object, and can be calculated from images obtained at various incidence angles (photographing angles) by changing the incidence angle of the illumination 20.
The bidirectional reflectance distribution function according to an embodiment of the present invention can be calculated from images photographed while adjusting the incidence angle of the light by changing the position and angle of the illumination 20.
Therefore, according to the embodiment of the present invention, in the rendering of the image, the post-makeup image can be expressed more realistically by using the bi-directional reflectance distribution function.
FIG. 3 is an exemplary view of a pattern of a soft brush according to an embodiment of the present invention, and FIG. 4 is an exemplary view of a pattern of a hard brush according to an embodiment of the present invention.
Referring to FIGS. 3 and 4, different cosmetic effects may be implemented depending on the characteristics of the makeup tool.
The brushes used for makeup can be divided into a soft brush, which has thin, soft bristles, and a hard brush, which has thick, stiff bristles.
With the soft brush, makeup is applied strongly at the center and fades gradually outward. The hard brush also softens gradually from the center to the outside. However, the soft brush produces a large difference in makeup intensity between the center and the edge, while the hard brush produces a small difference.
Fig. 3 shows the makeup intensity and makeup pattern by the soft brush.
The strength of makeup by the soft brush can be expressed as shown in Fig. 3 (a). Here, the x-axis represents the size of the brush, and the y-axis represents the intensity of the make-up. That is, in the case of make-up with a soft brush, the area corresponding to the center of the brush has a large strength of makeup, and the area corresponding to the outside of the brush has a small makeup strength.
In addition, the makeup pattern by the soft brush can be expressed as shown in Fig. 3 (b). That is, the center area, which is a white part, means an area where makeup is darkened, and the outer area, which is a black part, may mean an area where makeup is softened.
Fig. 4 shows the makeup intensity and makeup pattern by the hard brush.
The intensity of makeup by the hard brush can be expressed as shown in Fig. 4 (a). Here, the x-axis represents the size of the brush, and the y-axis represents the intensity of the make-up. That is, when the make-up is performed with the hard brush, the area corresponding to the center of the brush has a large intensity of make-up, and the area corresponding to the outside of the brush has a small make-up strength, but the difference is small as compared with the soft brush.
In addition, the makeup pattern by the hard brush can be expressed as shown in Fig. 4 (b). That is, the center area, which is a white part, means an area where makeup is darkened, and the outer area, which is a black part, may mean an area where makeup is softened.
Thus, soft brushes and hard brushes can express makeup of different patterns. That is, makeup of a different pattern may be expressed depending on which cosmetic tool is used by the user.
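The radial intensity profiles shown in FIGS. 3 and 4 can be imitated with a simple falloff function. The Gaussian profile below is an illustrative assumption (the patent derives its brush patterns from measured data, not from this formula); a steep falloff mimics the soft brush's large center-to-edge contrast and a gentle falloff mimics the hard brush:

```python
import numpy as np

def brush_pattern(size=101, falloff=2.0):
    """Generate a radial brush intensity pattern with values in [0, 1].

    Center pixels receive the highest makeup intensity, fading
    outward. A large `falloff` imitates the soft brush (large
    center-to-edge contrast); a small `falloff` imitates the hard
    brush (flatter profile).
    """
    ax = np.linspace(-1.0, 1.0, size)
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2          # squared distance from center
    return np.exp(-falloff * r2)    # Gaussian falloff

soft = brush_pattern(falloff=6.0)   # steep falloff: soft brush
hard = brush_pattern(falloff=1.0)   # gentle falloff: hard brush
```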
FIG. 5 is a conceptual diagram illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.
Referring to FIG. 5, the cosmetic effect is reproduced on an image using a cosmetic color image 100, an application pattern 200, an original image 300, and a result image 400.
The application pattern 200 is generated by synthesizing the cosmetic pattern and the makeup tool pattern as described above.
According to the embodiment of the present invention, the application pattern 200 is used as an alpha channel, and the cosmetic color image 100 is alpha-blended with the original image 300 to generate the result image 400.
An alpha channel is a supplementary channel that carries editing information in addition to the image data decomposed into the three primary colors; image processing can be performed effectively by using an alpha channel.
Alpha blending is a technique for expressing transparency by adding an alpha value to image data; the result image 400 can be generated by alpha-blending the original image 300 and the cosmetic color image 100 using the application pattern 200 as the alpha value.
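The per-pixel alpha blend, with the application pattern serving as the alpha channel, can be sketched as follows. The function and parameter names are illustrative assumptions:

```python
import numpy as np

def apply_makeup(original, color_img, alpha):
    """Alpha-blend a cosmetic color image onto the original image.

    original, color_img: HxWx3 float arrays with values in [0, 1].
    alpha: HxW application pattern used as the alpha channel
    (0 = no makeup at that pixel).
    Per-pixel blend: out = alpha * color + (1 - alpha) * original.
    """
    a = alpha[..., None]            # add a channel axis to broadcast over RGB
    return a * color_img + (1.0 - a) * original
```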
FIG. 6 is a flowchart illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.
Referring to FIG. 6, a method for reproducing a cosmetic effect on an image according to an embodiment of the present invention includes generating a cosmetic pattern (S610), generating a makeup tool pattern (S620), generating an application pattern (S630), and reproducing the cosmetic effect (S640).
The cosmetic pattern can be generated based on the difference value between the pre-cosmetic image and the post-cosmetic image (S610).
The step of generating the cosmetic pattern (S610) may include acquiring a pre-makeup image under a preset illumination, acquiring a post-makeup image under the same illumination, converting the pre-makeup image and the post-makeup image into monochrome images, and calculating a difference value from the converted images.
More specifically, pre-makeup and post-makeup images of actual skin or artificial skin are photographed in a fixed environment in which the settings of the camera and the illumination do not change.
The photographed pre-makeup image and post-makeup image are converted into black and white images, and the difference between the converted pre-makeup image and the post-makeup image is calculated.
The difference value can be calculated as a binary number, and a pixel having a difference value of 0 and a pixel having a difference value of 1 can be classified. Therefore, a pixel having a difference value of 1 can be used as a cosmetic pattern.
The makeup tool pattern can be generated by referring to the characteristics of the makeup tool (S620). For example, the step of generating the makeup tool pattern (S620) may generate the pattern by referring to the characteristics of makeup tools including soft brushes and hard brushes.
For example, with a soft brush, makeup is applied strongly at the center and fades gradually outward; the hard brush also softens gradually from the center to the outside. The soft brush produces a large difference in makeup intensity between the center and the edge, while the hard brush produces a small difference.
Thus, soft brushes and hard brushes can express makeup of different patterns. That is, makeup of a different pattern may be expressed depending on which cosmetic tool is used by the user.
That is, step S620 of creating a makeup tool pattern may generate a makeup tool pattern that includes information about makeup tools that perform makeup of different patterns, such as soft brushes and hard brushes.
The application pattern can be generated by resizing and synthesizing the cosmetic pattern and the makeup tool pattern (S630).
For example, the cosmetic pattern and the makeup tool pattern can be resized to the same width. In addition, each pattern can be normalized to the range of 0 to 0.5, and the application pattern can be generated by synthesizing the two normalized patterns.
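A minimal sketch of step S630 follows, assuming nearest-neighbour resizing and additive synthesis; the patent specifies only that the patterns are resized, normalized to 0-0.5, and synthesized, so those two choices (and all names) are assumptions:

```python
import numpy as np

def make_application_pattern(cosmetic, tool, size=(64, 64)):
    """Synthesize an application pattern from a cosmetic pattern and
    a makeup-tool pattern.

    Each 2-D input is resized to `size`, normalized to [0, 0.5], and
    the two are summed, so the result stays within [0, 1] and can be
    used directly as an alpha channel.
    """
    def resize_nn(img, size):
        # Nearest-neighbour resize via integer index sampling.
        h, w = img.shape
        rows = (np.arange(size[0]) * h / size[0]).astype(int)
        cols = (np.arange(size[1]) * w / size[1]).astype(int)
        return img[np.ix_(rows, cols)]

    def normalize_half(img):
        # Map the pattern's value range onto [0, 0.5].
        rng = img.max() - img.min()
        if rng == 0:
            return np.zeros_like(img, dtype=float)
        return 0.5 * (img - img.min()) / rng

    c = normalize_half(resize_nn(np.asarray(cosmetic, dtype=float), size))
    t = normalize_half(resize_nn(np.asarray(tool, dtype=float), size))
    return c + t    # application pattern in [0, 1]
```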
The cosmetic effect can be reproduced by applying the application pattern to the original image, using the application pattern as an alpha channel (S640).
For example, in the step of reproducing the cosmetic effect (S640), the cosmetic effect can be reproduced by alpha-blending the original image and the application pattern.
In addition, the method for reproducing the cosmetic effect on the image according to the embodiment of the present invention may further include generating a bidirectional reflectance distribution function for expressing the cosmetic material. The step of generating the bidirectional reflectance distribution function may generate the function from images photographed while changing the position and angle of the illumination to adjust the incidence angle of the light.
For example, the bidirectional reflectance distribution function models the interaction between light and an object, and can be calculated from images obtained at various incidence angles (photographing angles) by changing the incidence angle of the illumination.
Accordingly, the step of reproducing the cosmetic effect (S640) can reproduce the cosmetic effect by performing image rendering according to the bidirectional reflectance distribution function.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the invention as defined by the following claims.
10: Camera 20: Lighting
21: diffusion plate 30: skin
100: Cosmetic color image 200: Application pattern
300: Original image 400: Result image
Claims (9)
A method of reproducing a cosmetic effect on an image, the method comprising:
Generating a cosmetic pattern based on a difference value between a pre-cosmetic image and a post-cosmetic image;
Creating a makeup tool pattern with reference to a characteristic of the makeup tool;
Creating an application pattern synthesized by resizing the cosmetic pattern and the makeup tool pattern; And
And applying the application pattern to the original image using the application pattern as an alpha channel to reproduce the cosmetic effect,
Each of the pre-cosmetic image and the post-cosmetic image is obtained by illuminating and photographing under a condition that a specular area does not appear,
Wherein the step of creating the makeup tool pattern comprises:
Generating a pattern of makeup by a soft brush in which the central area of the brush has a larger makeup intensity than the outer area; And
And generating a makeup pattern by a hard brush in which the difference in make-up strength between the central area and the outer area of the brush is smaller than the make-up pattern by the soft brush.
Prior to reproducing the cosmetic effect,
Further comprising generating a bi-directional reflectance distribution function for expression of the cosmetic material.
Wherein the generating the bidirectional reflectance distribution function comprises:
Wherein the bi-directional reflectance distribution function is generated by using the photographed image while adjusting the incidence angle of the light by changing the position and angle of the illumination.
The step of reproducing the cosmetic effect comprises:
Wherein the image rendering according to the bidirectional reflectance distribution function is performed to reproduce the cosmetic effect.
Wherein the step of generating the cosmetic pattern comprises:
Acquiring a pre-cosmetic image in a preset illumination;
Obtaining a post-makeup image in the same illumination as the preset illumination;
Converting the pre-cosmetic image and the post-makeup image into a monochrome image; And
And calculating a difference value from the pre-cosmetic image and the post-cosmetic image which are converted into the monochrome image.
Wherein the generating the application pattern comprises:
Wherein the cosmetic pattern and the makeup tool pattern are resized to the same size and normalized to generate the application pattern.
The step of reproducing the cosmetic effect comprises:
Wherein the cosmetic effect is reproduced by alpha blending the original image and the application pattern.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120146866A KR101752701B1 (en) | 2012-12-14 | 2012-12-14 | Method for recreating makeup on image |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20140077752A KR20140077752A (en) | 2014-06-24 |
KR101752701B1 true KR101752701B1 (en) | 2017-06-30 |
Family
ID=51129602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120146866A KR101752701B1 (en) | 2012-12-14 | 2012-12-14 | Method for recreating makeup on image |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101752701B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10762665B2 (en) | 2018-05-23 | 2020-09-01 | Perfect Corp. | Systems and methods for performing virtual application of makeup effects based on a source image |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001238727A (en) * | 2000-02-29 | 2001-09-04 | Kao Corp | Makeup advice system |
JP2005216131A (en) * | 2004-01-30 | 2005-08-11 | Digital Fashion Ltd | Makeup simulation apparatus, method and program |
JP2009077086A (en) * | 2007-09-19 | 2009-04-09 | Fuji Xerox Co Ltd | Image processor and program |
2012-12-14: Application KR1020120146866A filed in KR; patent KR101752701B1 active (IP Right Grant).
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |