KR101752701B1 - Method for recreating makeup on image - Google Patents



Publication number
KR101752701B1
Authority
KR
South Korea
Prior art keywords
cosmetic
image
pattern
makeup
generating
Prior art date
Application number
KR1020120146866A
Other languages
Korean (ko)
Other versions
KR20140077752A (en)
Inventor
이송우
김진서
이지형
김재우
유주연
장인수
최윤석
권순영
Original Assignee
한국전자통신연구원
Priority date
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to KR1020120146866A
Publication of KR20140077752A
Application granted
Publication of KR101752701B1


Abstract

A method of reproducing on an image the same effect as actual makeup is disclosed. The method for reproducing a cosmetic effect on an image includes the steps of: generating a cosmetic pattern based on a difference value between a pre-makeup image and a post-makeup image; generating a makeup tool pattern by referring to the characteristics of the makeup tool; generating an application pattern by resizing and synthesizing the cosmetic pattern and the makeup tool pattern; and reproducing the cosmetic effect by applying the application pattern to the original image using the application pattern as an alpha channel. Therefore, the appearance of the skin surface when actual cosmetics are applied to the skin can be reproduced realistically.

Description

METHOD FOR RECREATING MAKEUP ON IMAGE

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a cosmetic effect reproducing method, and more particularly, to a method of reproducing on an image the same effect as actual makeup.

In recent years, researchers in computer graphics and multimedia have been developing a variety of applications related to cosmetic makeup, plastic surgery, and orthodontics.

In addition, color simulation apparatuses or methods are being utilized for demonstrating the cosmetic on the screen before the user purchases and uses the cosmetic product.

For example, techniques have been developed that make a photograph taken with a camera look like a hand-drawn oil painting: various brush touches of a professional painter are captured as images, and an oil-painting image is generated by applying the brush-touch images to the photograph.

In the case of cosmetic simulation software provided by cosmetics companies, cosmetic effect simulations are performed at the level of extracting facial feature points to divide the face into regions such as the eyes, nose, and mouth.

However, techniques for performing a cosmetic simulation on an image input from a camera are mostly performed by simply applying cosmetic colors without considering the properties of the cosmetic material and the makeup tools used.

Therefore, there is a limit in expressing the case where the user performs makeup using a specific makeup tool realistically on the image.

SUMMARY OF THE INVENTION It is an object of the present invention to provide a method of reproducing a cosmetic effect on an image based on a makeup pattern.

According to an aspect of the present invention, there is provided a method of reproducing a cosmetic effect on an image, the method comprising: generating a cosmetic pattern based on a difference value between a pre-makeup image and a post-makeup image; generating a makeup tool pattern by referring to the characteristics of the makeup tool; generating an application pattern by resizing and synthesizing the cosmetic pattern and the makeup tool pattern; and reproducing the cosmetic effect by applying the application pattern to the original image using the application pattern as an alpha channel.

Here, the method for reproducing the cosmetic effect on the image may further include generating a bidirectional reflectance distribution function for expression of the cosmetic material.

Here, the step of generating the bidirectional reflectance distribution function may generate the bidirectional reflectance distribution function using the photographed image while adjusting the incident angle of the light by changing the position and angle of the illumination.

Here, the step of reproducing the cosmetic effect may reproduce the cosmetic effect by performing image rendering according to the bidirectional reflectance distribution function.

Here, the step of generating the cosmetic pattern may include the steps of: acquiring a pre-makeup image under a preset illumination; acquiring a post-makeup image under the same illumination as the preset illumination; converting the pre-makeup image and the post-makeup image into monochrome images; and calculating a difference value from the pre-makeup image and the post-makeup image converted into monochrome images.

Here, the step of creating the makeup tool pattern may generate a makeup tool pattern by referring to the characteristics of the makeup tool including the soft brush and the hard brush.

Here, the step of generating the application pattern may resize and normalize the cosmetic pattern and the makeup tool pattern to the same size to generate an application pattern.

Here, the step of reproducing the cosmetic effect may reproduce the cosmetic effect by alpha-blending the original image and the applied pattern.

When the method of reproducing a cosmetic effect according to the embodiment of the present invention described above is used, a cosmetic effect like that of applying actual cosmetics to the skin can be reproduced in an image, based on the cosmetic pattern according to the cosmetic product and the makeup tool.

FIG. 1 is a conceptual diagram illustrating a method of acquiring a cosmetic pattern according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating a method of calculating a bidirectional reflectance distribution function according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a pattern of a soft brush according to an embodiment of the present invention.
FIG. 4 is an exemplary view of a pattern of a hard brush according to an embodiment of the present invention.
FIG. 5 is a conceptual diagram illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.

The terms first, second, A, B, etc. may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may also be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items, or any one of a plurality of related listed items.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, the terms "comprises" or "having" are used to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their contextual meaning in the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a conceptual diagram illustrating a method of acquiring a cosmetic pattern according to an embodiment of the present invention.

Referring to FIG. 1, a method of acquiring a cosmetic pattern is described.

First, the settings of the illumination 20 and the camera 10 are required to acquire an image. The illumination 20 may be disposed on both sides of the skin 30 to be imaged, and a diffusion plate 21 may be provided in front of the illumination 20 to disperse the light. In addition, it is preferable that external light other than the installed illumination 20 be blocked. Here, the skin 30 to be photographed may be artificial or actual skin.

In addition, when photographing the image for acquiring the cosmetic pattern, the illumination 20 is arranged so that a specular region, in which the light of the illumination 20 is strongly reflected, does not appear.

The camera 10 for photographing the skin 30 can be placed at a position perpendicular to the skin 30 to photograph the skin 30.

After completion of setting of the illumination 20 and the camera 10 for acquiring an image, a pre-makeup image and a post-makeup image are photographed.

The camera 10 takes a pre-makeup image and a post-makeup image in a direction perpendicular to the skin 30. In addition, when the camera 10 photographs the pre-cosmetic image and the post-makeup image, the illumination 20 remains the same.

For example, pre-makeup images of actual skin or artificial skin are photographed in a predetermined illumination 20. After taking pre-makeup images, cosmetics are applied to the actual skin or artificial skin using a makeup brush or the like, and a post-makeup image is taken.

The photographed pre-makeup image and post-makeup image are converted into black-and-white images, and the difference between the converted pre-makeup image and post-makeup image is calculated. The difference value can be expressed as a binary number, classifying pixels with a difference value of 0 and pixels with a difference value of 1. That is, a difference value of 0 means a pixel with no difference between the pre-makeup image and the post-makeup image, and a difference value of 1 means a pixel where the two images differ. Therefore, the pixels having a difference value of 1 can be used as the cosmetic pattern.
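The difference-based pattern extraction described above can be sketched as follows. The function name, grayscale weights, and the noise threshold are assumptions for illustration; the patent only states that pixels whose difference is nonzero form the cosmetic pattern.

```python
import numpy as np

def cosmetic_pattern(pre_rgb, post_rgb, threshold=10):
    """Binary cosmetic pattern from pre/post-makeup images.

    pre_rgb, post_rgb: uint8 arrays of shape (H, W, 3), photographed
    under identical illumination. The threshold is a hypothetical
    noise margin added for robustness on real photographs.
    """
    # Convert to black-and-white (grayscale) images as described.
    weights = np.array([0.299, 0.587, 0.114])
    pre_gray = pre_rgb.astype(np.float64) @ weights
    post_gray = post_rgb.astype(np.float64) @ weights
    # Difference value: 1 where makeup changed the pixel, 0 elsewhere.
    diff = np.abs(post_gray - pre_gray)
    return (diff > threshold).astype(np.uint8)
```

The pixels whose value is 1 in the returned array correspond to the cosmetic pattern region.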

FIG. 2 is a conceptual diagram illustrating a method of calculating a bidirectional reflectance distribution function according to an embodiment of the present invention.

Referring to FIG. 2, in order to reproduce the post-makeup image more realistically, it is necessary to express the cosmetic material.

That is, a bidirectional reflectance distribution function (BRDF) can be utilized to represent the material of the skin 30 after makeup in a realistic manner.

Figure 2 shows a method of calculating a bi-directional reflectivity distribution function.

The bidirectional reflectance distribution function can realize interaction between light and an object and can be calculated from images obtained at various incident angles (photographing angles) through a change in the incident angle of the illumination 20 or a change in the photographing angle of the photographing apparatus.

The bidirectional reflectance distribution function according to an embodiment of the present invention can be calculated from images photographed while adjusting the incidence angle of the light by changing the position and angle of the illumination 20. When the cosmetic pattern is generated, the diffusion plate 21 is provided in front of the illumination 20; however, in calculating the bidirectional reflectance distribution function, the diffusion plate 21 may be omitted.

Therefore, according to the embodiment of the present invention, in the rendering of the image, the post-makeup image can be expressed more realistically by using the bi-directional reflectance distribution function.
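The patent does not specify how the measured images are turned into a BRDF. As an illustrative assumption only, the sketch below fits the simplest possible reflectance model, a Lambertian (diffuse) term, to pixel intensities captured at several light incidence angles; a real implementation would fit a richer model covering specular behavior as well.

```python
import numpy as np

def fit_lambertian_albedo(intensities, incident_angles_deg):
    """Least-squares fit of a diffuse (Lambertian) reflectance term
    from intensities measured at several light incidence angles.

    A full BRDF is a 4D function of incoming and outgoing directions;
    this sketch fits only its diffuse component, I = albedo * cos(theta),
    as one simple model the measured samples could be fit to.
    """
    theta = np.radians(np.asarray(incident_angles_deg, dtype=float))
    cos_t = np.cos(theta)
    i = np.asarray(intensities, dtype=float)
    # Closed-form least squares for a single coefficient.
    return float((i @ cos_t) / (cos_t @ cos_t))
```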

FIG. 3 is an exemplary view of a pattern of a soft brush according to an embodiment of the present invention, and FIG. 4 is an exemplary view of a pattern of a hard brush according to an embodiment of the present invention.

Referring to FIGS. 3 and 4, different cosmetic effects may be produced depending on the characteristics of the makeup tool.

The brush used for makeup can be divided into a soft brush having thin, soft bristles and a hard brush having thick bristles.

With a soft brush, makeup is applied strongly at the center and fades gradually outward. With a hard brush, the intensity also decreases gradually from the center outward, but more uniformly. As a result, the soft brush shows a large difference in makeup intensity between the center and the outer area, while the hard brush shows a small difference.

Fig. 3 shows the makeup intensity and makeup pattern by the soft brush.

The strength of makeup by the soft brush can be expressed as shown in Fig. 3 (a). Here, the x-axis represents the size of the brush, and the y-axis represents the intensity of the make-up. That is, in the case of make-up with a soft brush, the area corresponding to the center of the brush has a large strength of makeup, and the area corresponding to the outside of the brush has a small makeup strength.

In addition, the makeup pattern by the soft brush can be expressed as shown in Fig. 3 (b). That is, the center area, which is a white part, means an area where makeup is darkened, and the outer area, which is a black part, may mean an area where makeup is softened.

Fig. 4 shows the makeup intensity and makeup pattern by the hard brush.

The intensity of makeup by the hard brush can be expressed as shown in Fig. 4 (a). Here, the x-axis represents the size of the brush, and the y-axis represents the intensity of the make-up. That is, when the make-up is performed with the hard brush, the area corresponding to the center of the brush has a large intensity of make-up, and the area corresponding to the outside of the brush has a small make-up strength, but the difference is small as compared with the soft brush.

In addition, the makeup pattern by the hard brush can be expressed as shown in Fig. 4 (b). That is, the center area, which is a white part, means an area where makeup is darkened, and the outer area, which is a black part, may mean an area where makeup is softened.

Thus, soft brushes and hard brushes can express makeup of different patterns. That is, makeup of a different pattern may be expressed depending on which cosmetic tool is used by the user.
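The two brush profiles can be sketched as radial intensity patterns. The Gaussian falloff and the width values are assumptions for illustration; FIGS. 3 and 4 only describe the qualitative shapes (large center-to-edge difference for the soft brush, small difference for the hard brush).

```python
import numpy as np

def brush_pattern(size=101, kind="soft"):
    """Radial brush intensity pattern.

    'soft' uses a narrow Gaussian falloff (large center-to-edge
    intensity difference); 'hard' uses a wide, flatter falloff
    (small difference). Both profiles are hypothetical.
    """
    c = size // 2
    y, x = np.mgrid[0:size, 0:size]
    r = np.hypot(x - c, y - c) / c           # normalized radius
    sigma = 0.35 if kind == "soft" else 0.8  # hypothetical widths
    pattern = np.exp(-(r ** 2) / (2 * sigma ** 2))
    pattern[r > 1.0] = 0.0                   # outside the brush radius
    return pattern
```

In the resulting arrays, white (high) values mark the darkened center area and black (low) values the softened outer area, matching the patterns in FIGS. 3(b) and 4(b).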

FIG. 5 is a conceptual diagram illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.

Referring to FIG. 5, a cosmetic color image 100, an application pattern 200, and an original image 300 are required to reproduce a cosmetic effect on an image according to an embodiment of the present invention.

The cosmetic color image 100 refers to an image including the color of a cosmetic product, and the application pattern 200 refers to an image obtained by resizing and synthesizing a cosmetic pattern and a makeup tool pattern. In addition, the original image 300 may mean an image of the skin 30 taken before makeup.

According to the embodiment of the present invention, the cosmetic color image 100 and the application pattern 200 may be applied to the original image 300 to acquire the resultant image 400 in which the cosmetic effect is reproduced.

Particularly, according to the embodiment of the present invention, the application pattern 200 can be applied to the original image 300 through alpha blending using the application pattern 200 as an alpha channel.

An alpha channel may mean a supplementary channel that carries editing information in addition to the data decomposed into the three primary colors in image processing, and image processing can be performed effectively using an alpha channel.

In addition, alpha blending is a technique of expressing a transparent image by adding alpha representing transparency to image data, and the application pattern 200 can be applied to the original image 300 using an alpha blending technique.
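The alpha blending described above can be sketched as follows; the function name and the assumption that all images are floats in [0, 1] are illustrative, but the blend itself is the standard formula out = alpha * color + (1 - alpha) * original.

```python
import numpy as np

def alpha_blend(original, color, alpha):
    """Apply a cosmetic color to the original image, using the
    application pattern as the alpha channel.

    original, color: float arrays in [0, 1], shape (H, W, 3).
    alpha: application pattern in [0, 1], shape (H, W).
    """
    a = alpha[..., np.newaxis]               # broadcast over channels
    return a * color + (1.0 - a) * original  # standard alpha blending
```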

FIG. 6 is a flowchart illustrating a method of reproducing a cosmetic effect on an image according to an embodiment of the present invention.

Referring to FIG. 6, a method for reproducing a cosmetic effect on an image according to an embodiment of the present invention includes generating a cosmetic pattern (S610), generating a makeup tool pattern (S620), generating an application pattern (S630), and reproducing the cosmetic effect (S640).

The cosmetic pattern can be generated based on the difference value between the pre-cosmetic image and the post-cosmetic image (S610).

The step of generating a cosmetic pattern (S610) includes the steps of acquiring a pre-makeup image under a preset illumination 20, acquiring a post-makeup image under the same illumination 20 as the preset illumination 20, converting the pre-makeup image and the post-makeup image into monochrome images, and calculating a difference value from the pre-makeup image and the post-makeup image converted into monochrome images.

More specifically, a pre-makeup image of actual skin or artificial skin is photographed under a preset illumination 20. After taking the pre-makeup image, cosmetics are applied to the skin using a makeup brush or the like, and a post-makeup image is taken.

The photographed pre-makeup image and post-makeup image are converted into black and white images, and the difference between the converted pre-makeup image and the post-makeup image is calculated.

The difference value can be calculated as a binary number, and a pixel having a difference value of 0 and a pixel having a difference value of 1 can be classified. Therefore, a pixel having a difference value of 1 can be used as a cosmetic pattern.

The makeup tool pattern can be generated by referring to the characteristics of the makeup tool (S620). For example, creating a makeup tool pattern (S620) may create a makeup tool pattern with reference to the characteristics of the makeup tool, including soft brushes and hard brushes.

For example, with a soft brush, makeup is applied strongly at the center and fades gradually outward; with a hard brush, the intensity also decreases from the center outward, but more uniformly. The soft brush thus has a large difference in makeup intensity between the center and the outer area, while the hard brush has a small difference.

Thus, soft brushes and hard brushes can express makeup of different patterns. That is, makeup of a different pattern may be expressed depending on which cosmetic tool is used by the user.

That is, step S620 of creating a makeup tool pattern may generate a makeup tool pattern that includes information about makeup tools that perform makeup of different patterns, such as soft brushes and hard brushes.

The application pattern 200, synthesized by resizing the cosmetic pattern and the makeup tool pattern, may be generated (S630). The step of generating the application pattern 200 (S630) may resize the cosmetic pattern and the makeup tool pattern to the same size and normalize them to generate the application pattern 200.

For example, the cosmetic pattern and the makeup tool pattern can be resized to the same width size. In addition, the range of the cosmetic pattern and the makeup tool pattern can be normalized to have a value of 0 to 0.5, and the application pattern 200 having a value of 0 to 1.0 can be generated by adding two normalized patterns.
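The composition described above, resizing both patterns to the same size, normalizing each to [0, 0.5], and summing them into a [0, 1] application pattern, can be sketched as follows. Nearest-neighbor resizing and the helper names are assumptions made to keep the example dependency-free; any interpolation method could be substituted.

```python
import numpy as np

def application_pattern(cosmetic_pat, tool_pat, size=(64, 64)):
    """Resize the cosmetic pattern and the makeup tool pattern to the
    same size, normalize each to [0, 0.5], and sum them into an
    application pattern in [0, 1]."""
    def resize(img, hw):
        # Nearest-neighbor resize (illustrative choice).
        h, w = img.shape
        rows = np.arange(hw[0]) * h // hw[0]
        cols = np.arange(hw[1]) * w // hw[1]
        return img[np.ix_(rows, cols)]

    def normalize_half(img):
        # Map the pattern's value range onto [0, 0.5].
        img = img.astype(np.float64)
        rng = img.max() - img.min()
        if rng == 0:
            return np.zeros_like(img)
        return 0.5 * (img - img.min()) / rng

    return normalize_half(resize(cosmetic_pat, size)) + \
           normalize_half(resize(tool_pat, size))
```

The result can then be used directly as the alpha channel in step S640.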

The cosmetic effect can be reproduced by applying the application pattern 200 to the original image 300 using the application pattern 200 as an alpha channel in operation S640. That is, the cosmetic effect can be reproduced by alpha blending the original image 300 and the application pattern 200 (S640).

For example, in the step of reproducing the cosmetic effect (S640), the cosmetic color image 100 and the application pattern 200 may be applied to the original image 300 to obtain the resultant image 400 in which the cosmetic effect is reproduced. In particular, according to the embodiment of the present invention, the application pattern 200 can be applied to the original image 300 through alpha blending using the application pattern 200 as an alpha channel.

In addition, the method for reproducing the cosmetic effect on the image according to the embodiment of the present invention may further include generating the bidirectional reflectance distribution function for the expression of the cosmetic material. The step of generating the bidirectional reflectance distribution function may generate the bidirectional reflectance distribution function using the photographed image while changing the position and angle of the illumination 20 to adjust the incidence angle of the light.

For example, the bidirectional reflectance distribution function can model the interaction between light and an object, and can be calculated from images obtained at various incident angles (photographing angles) through changes in the incident angle of the illumination 20 or changes in the photographing angle of the photographing apparatus.

Accordingly, the step of reproducing the cosmetic effect (S640) can reproduce the cosmetic effect by performing image rendering according to the bidirectional reflectance distribution function.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the present invention as defined by the following claims.

10: Camera 20: Lighting
21: diffusion plate 30: skin
100: Cosmetic color image 200: Application pattern
300: Original image 400: Result image

Claims (9)

Creating a cosmetic pattern based on a difference value between a pre-cosmetic image and a post-cosmetic image;
Creating a makeup tool pattern with reference to a characteristic of the makeup tool;
Creating an application pattern synthesized by resizing the cosmetic pattern and the makeup tool pattern; And
And applying the application pattern to the original image using the application pattern as an alpha channel to reproduce the cosmetic effect,
Each of the pre-cosmetic image and the post-cosmetic image is obtained by illuminating and photographing under a condition that a specular area does not appear,
Wherein the step of creating the makeup tool pattern comprises:
Generating a pattern of makeup by a soft brush in which the central area of the brush has a larger makeup intensity than the outer area; And
And generating a makeup pattern by a hard brush in which the difference in make-up strength between the central area and the outer area of the brush is smaller than the make-up pattern by the soft brush.
The method according to claim 1,
Prior to reproducing the cosmetic effect,
Further comprising generating a bi-directional reflectance distribution function for expression of the cosmetic material.
The method of claim 2,
Wherein the generating the bidirectional reflectance distribution function comprises:
Wherein the bi-directional reflectance distribution function is generated by using the photographed image while adjusting the incidence angle of the light by changing the position and angle of the illumination.
The method of claim 3,
The step of reproducing the cosmetic effect comprises:
Wherein the image rendering according to the bidirectional reflectance distribution function is performed to reproduce the cosmetic effect.
The method according to claim 1,
Wherein the step of generating the cosmetic pattern comprises:
Acquiring a pre-cosmetic image in a preset illumination;
Obtaining a post-makeup image in the same illumination as the preset illumination;
Converting the pre-cosmetic image and the post-makeup image into a monochrome image; And
And calculating a difference value from the pre-cosmetic image and the post-cosmetic image which are converted into the monochrome image.
delete
The method according to claim 1,
Wherein the generating the application pattern comprises:
Wherein the cosmetic pattern and the makeup tool pattern are resized to the same size and normalized to generate the application pattern.
The method according to claim 1,
The step of reproducing the cosmetic effect comprises:
Wherein the cosmetic effect is reproduced by alpha blending the original image and the application pattern.
delete
KR1020120146866A 2012-12-14 2012-12-14 Method for recreating makeup on image KR101752701B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120146866A KR101752701B1 (en) 2012-12-14 2012-12-14 Method for recreating makeup on image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120146866A KR101752701B1 (en) 2012-12-14 2012-12-14 Method for recreating makeup on image

Publications (2)

Publication Number Publication Date
KR20140077752A KR20140077752A (en) 2014-06-24
KR101752701B1 true KR101752701B1 (en) 2017-06-30

Family

ID=51129602

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120146866A KR101752701B1 (en) 2012-12-14 2012-12-14 Method for recreating makeup on image

Country Status (1)

Country Link
KR (1) KR101752701B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10762665B2 (en) 2018-05-23 2020-09-01 Perfect Corp. Systems and methods for performing virtual application of makeup effects based on a source image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001238727A (en) * 2000-02-29 2001-09-04 Kao Corp Makeup advice system
JP2005216131A (en) * 2004-01-30 2005-08-11 Digital Fashion Ltd Makeup simulation apparatus, method and program
JP2009077086A (en) * 2007-09-19 2009-04-09 Fuji Xerox Co Ltd Image processor and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001238727A (en) * 2000-02-29 2001-09-04 Kao Corp Makeup advice system
JP2005216131A (en) * 2004-01-30 2005-08-11 Digital Fashion Ltd Makeup simulation apparatus, method and program
JP2009077086A (en) * 2007-09-19 2009-04-09 Fuji Xerox Co Ltd Image processor and program

Also Published As

Publication number Publication date
KR20140077752A (en) 2014-06-24

Similar Documents

Publication Publication Date Title
CN106373187B (en) Two dimensional image based on AR is converted to the implementation method of three-dimensional scenic
JP6396890B2 (en) Image processing apparatus, image processing method, and program capable of virtually reproducing state where makeup coating material is applied
Flagg et al. Projector-guided painting
Furferi et al. From 2D to 2.5D, i.e. from painting to tactile model
Karimov et al. Advanced tone rendition technique for a painting robot
Ganovelli et al. Introduction to computer graphics: A practical learning approach
Gerl et al. Interactive example-based hatching
US10643491B2 (en) Process, system and method for step-by-step painting of an image on a transparent surface
Lopez-Moreno et al. Non-photorealistic, depth-based image editing
KR101752701B1 (en) Method for recreating makeup on image
JP2023126774A (en) Screen tone look generator
Seo et al. Interactive painterly rendering with artistic error correction
Blatner et al. TangiPaint: a tangible digital painting system
Ligon Digital art revolution
Stephenson Essential RenderMan®
Madračević et al. 3D modeling from 2D images
Mesquita et al. Synthesis and Validation of Virtual Woodcuts Generated with Reaction-Diffusion
KR20080041978A (en) Painterly rendering method based human painting process and exhibition system thereof
Jo KeyShot 3D Rendering
JP6708862B1 (en) Image processing method, program, and image processing apparatus
Cho Vray for Sketchup
Liu Chinese Ink-and-Brush Painting with Film Lighting Aesthetics in 3D Computer Graphics
Huang et al. Stereoscopic oil paintings from RGBD images
Jackson Digital Painting Techniques: Using Corel Painter 2016
Ferwerda et al. Tangible images: Bridging the real and virtual worlds

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant