WO2014167831A1 - Image processing apparatus, image processing method, and program capable of virtually reproducing the state in which a makeup coating material has been applied - Google Patents
- Publication number
- WO2014167831A1 (PCT/JP2014/001991)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coating material
- pixel
- image
- color
- makeup
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
- G09G5/026—Control of mixing and/or overlay of colours in general
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
- H04N1/622—Retouching, i.e. modification of isolated colours only or in isolated picture areas only with simulation on a subsidiary picture reproducer
Definitions
- The present invention belongs to the technical field of image processing.
- Image processing here means extracting a part image corresponding to a desired part from image data acquired by a camera or the like, rewriting the individual pixel bit values constituting that part image, and compositing the rewritten part image back into the original image.
- A makeup simulator, which generates an image showing how a subject would look after cosmetics or paint are applied, is one well-known application of this technology.
- Image processing for a makeup simulator includes methods that replace the original color of the target part with the target application color, methods that composite the target application color onto the target part, and the like.
- An object of the present invention is to provide an image processing apparatus that can more realistically reproduce the state in which a makeup coating material having various optical characteristics is applied and illuminated by ambient light.
- The above problem can be solved by an image processing apparatus comprising: designation means for designating, in an object shown in an original image, a portion to which a makeup coating material should be applied; generating means for generating a coating-material application layer composed of a plurality of pixels whose color range has the sample color of the makeup coating material as its representative color; and synthesizing means for synthesizing the coating-material application layer with the original image.
- The color range of the pixels constituting the coating-material application layer is obtained by expanding the color range of the pixels constituting the designated portion of the original image according to the ratio between the representative value of those pixels and the pixel value of the sample color of the makeup coating material. Each pixel value of the coating-material application layer is obtained by mapping the positionally corresponding pixel value of the original image into this expanded color range.
- Because the color range of the coating-material application layer is obtained by expanding the color range of the original image, the layer expresses the target portion with richer shades.
- Because each pixel bit value of the coating-material application layer is obtained by mapping the positionally corresponding pixel bit value of the original image into the expanded color range, the curved contours of the portion are emphasized more, and the contrast produced by ambient-light illumination can be reproduced.
- The portion may be a lip.
- In that case, the apparent volume of the lips is enhanced by the applied lipstick, making the effect of the coating-material application layer more appealing to the user.
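The range expansion and pixel mapping described above can be sketched for a single channel as follows. This is a minimal sketch assuming a linear reading of the claims; the function name, the list-based pixel representation, and the exact mapping formula are illustrative assumptions, not taken from the patent.

```python
def expand_and_map(lip_values, sample_color):
    """lip_values: channel values of the pixels of the designated portion.
    sample_color: channel value of the coating material's sample color.
    Returns the channel values of the coating-material application layer."""
    lo, hi = min(lip_values), max(lip_values)
    avg = sum(lip_values) / len(lip_values)       # representative value of the portion
    ratio = sample_color / avg                    # expansion factor per the claims
    new_lo, new_hi = lo * ratio, hi * ratio       # expanded color range
    # map each positionally corresponding pixel into the expanded range
    return [new_lo + (v - lo) / (hi - lo) * (new_hi - new_lo) for v in lip_values]
```

Under this reading, the average of the mapped values equals the sample color, so the sample color becomes the representative color of the layer, as the claims require.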
- FIG. 1B shows the image processing in the case where the difference added to the channel image for each pixel component has a range.
- FIG. 2A shows the external appearance of a tablet terminal.
- FIG. 2B shows the initial screen of the makeup simulator. FIG. 3 shows the situation where the user is about to take a self-portrait using the camera 102 of the tablet terminal 100.
- FIG. 4A shows the user's self-portrait before makeup.
- FIG. 4B shows the user's self-portrait after makeup.
- FIG. 5A shows the hardware resources of the tablet terminal used by the image processing apparatus.
- FIG. 5B shows the recorded contents of the storage 6.
- FIG. 5C shows the user 001 directory.
- FIG. 5D shows the structure of the item 001 directory. FIG. 6 adds the data flow between components to the hardware and software components constituting the image processing apparatus.
- FIG. 7A shows the configuration and bit assignment of the face image face (RGBs).
- FIG. 7B shows the configuration and bit assignment of the lip image (HCLs).
- FIG. 7C shows the configuration and bit assignment of the lipstick application layer (HCLs). FIG. 8 concretely shows the course of the image processing performed by the image processing apparatus.
- FIG. 9A shows the distribution of the L component of the lip part, the target part, in the original image.
- FIG. 9B shows the distribution of the L component of the lip part in the lipstick application layer.
- FIG. 13B is a flowchart showing the processing procedure for calculating the color range of the lips.
- FIG. 13C is a flowchart showing the procedure for generating the lipstick application layer.
- FIG. 14A is a flowchart detailing the color range calculation.
- FIG. 14B is a flowchart showing the detailed processing procedure of the range map calculation.
- FIG. 15A is a flowchart showing the color range generation.
- FIG. 15B is a flowchart showing the processing procedure for generating the L channel of the lipstick application layer, followed by a flowchart of the synthesis processing.
- FIG. 18A is a flowchart detailing the color range calculation for the H channel.
- FIG. 18B is a flowchart showing the detailed processing procedure of the color range map calculation.
- FIG. 19A is a flowchart showing the processing procedure of the color range generation.
- FIG. 19B is a flowchart showing the processing procedure for generating the H channel of the lipstick application layer, followed by a flowchart of the synthesis processing.
- FIG. 21A is a flowchart detailing the color range calculation for the C channel.
- FIG. 21B is a flowchart showing the detailed processing procedure of the color range map calculation S42.
- FIG. 22A is a flowchart showing the color range generation.
- FIG. 22B is a flowchart showing the processing procedure for generating the C channel of the lipstick application layer, followed by a flowchart of the synthesis processing.
- FIG. 24A shows the directory structure of the storage 6 in the third embodiment.
- FIG. 24B shows an example of a range map sample stored in the sample file.
- FIG. 24C shows lip highlights and shadows realized by the sample file, followed by a flowchart showing the generation procedure.
- FIG. 26A shows the outline of the lips formed by anchor points and interpolation lines.
- FIG. 26B shows the lip shadow generated by lowering the weight coefficient of the pixels lying along an interpolation line.
- FIG. 26C shows the process of changing the curvature of an interpolation line.
- FIG. 29A shows the feature point groups detected in the frame images Fx and Fx+m.
- FIG. 29B shows a transformation matrix that defines the transformation of feature points between the frame images Fx and Fx+m.
- FIG. 30A shows a plurality of feature points existing in the frame images Fx and Fx+m.
- FIG. 30B shows an example of the transformation matrix.
- FIG. 30C shows a hand-painted image being drawn on a still image. Also shown is the background displayed, when the mode is switched from confirmation mode to operation mode, until the still image serving as the base of the confirmation mode is determined, together with a flowchart showing the processing procedure of makeup for moving images.
- With lipsticks containing diffraction pigments, the highlights illuminated by ambient light appear to shimmer, the applied area spreads out gently, and brightness falls off toward recessed areas, attracting viewers with contrast.
- The lipstick color conversion system described in Patent Document 1 generates a post-application part image in which lipstick is applied to the lips shown in an image. Specifically, the lip region is extracted from the pixels constituting the original face image, and a repainting deviation, computed as a difference, is added to the pixel values of the pixels constituting the lip region, yielding an image in which lipstick is applied to the lips.
- Before processing, the face image is converted from the RGB color space to the HSV color space (RGB→HSV conversion), and after pixel processing it is converted back from the HSV color space to the RGB color space (HSV→RGB conversion).
- The repainting deviation is obtained by calculating the average of the lip colors, taking the calculated average as the representative color of the lips, and computing the difference between the representative color of the lips and the color of the target lipstick.
- For highlights and shadows in the original lip image, the addition of the repainting deviation to the saturation (S) and luminance (V) channels is suppressed; that is, the addition of the difference between the lipstick and the lips is stopped there.
- By adding the deviation only to the hue (H) channel in those areas, the original contrast can be better preserved.
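The prior-art flow described above (a flat per-channel deviation, with the S and V additions suppressed on highlights and shadows) can be sketched as follows. This is an illustrative reconstruction: the function name, the tuple-based layout, and the value thresholds used to detect highlights and shadows are assumptions, not taken from Patent Document 1.

```python
def flat_repaint(lips_hsv, lipstick_hsv, v_low=30, v_high=225):
    """lips_hsv: list of (H, S, V) tuples for the lip region.
    lipstick_hsv: target lipstick color as (H, S, V).
    v_low / v_high: assumed V thresholds marking shadows / highlights."""
    n = len(lips_hsv)
    avg = [sum(p[i] for p in lips_hsv) / n for i in range(3)]  # representative lip color
    delta = [lipstick_hsv[i] - avg[i] for i in range(3)]       # flat repainting deviation
    out = []
    for h, s, v in lips_hsv:
        highlight_or_shadow = v <= v_low or v >= v_high
        out.append((
            h + delta[0],                                # H: deviation always added
            s if highlight_or_shadow else s + delta[1],  # S: suppressed on extremes
            v if highlight_or_shadow else v + delta[2],  # V: suppressed on extremes
        ))
    return out
```

Because the same delta is added everywhere, the spread of values in each channel is unchanged, which is exactly the contrast limitation the present invention addresses.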
- FIG. 1A shows the image processing in the case where the difference added to the channel image of each HSV pixel component is a flat difference.
- The mathematical notation in FIG. 1A follows these rules.
- A large square in FIG. 1 is an operand, indicating an image or layer that is the target of an arithmetic operation.
- A small square is a fixed value, indicating the pixel bit value of a color used in the calculation.
- A circle is an operator, indicating the type of operation executed on the pixel bit values of the pixels constituting an image or layer.
- A "layer" is an image in which some or all of the pixels are fully transparent or translucent, used for compositing with another image.
- The configuration in FIG. 1A defines a calculation consisting of (1) the difference between the average value of the lip region and the lipstick color, and (2) the addition of that difference to each of the H, S, and V channels constituting the lip-region image.
- Lip1 indicates the part of the face image in which the lip region is designated.
- The contour line edg1 in the face image consists of circular anchor marks and interpolation lines connecting the anchors, and surrounds only the lips in the face image.
- The H channel in the figure refers to an image obtained by extracting the H component from the HSV pixel components constituting the lips.
- Likewise, the S channel and V channel are images obtained by extracting the S component and V component from the HSV pixel components constituting the lips.
- The difference calculation sub1 performs, for each of the H, S, and V channels, a subtraction between the average value of the lip region and the lipstick color.
- The addition operations pls1, pls2, and pls3 add ΔH, ΔS, and ΔV, the per-channel differences obtained by the subtraction, to the lip H, S, and V channels of the face image, channel by channel.
- Because the difference is flat over the entire image, the same value is added to the pixel forming the maximum pixel bit value and to the pixel forming the minimum pixel bit value.
- As a result, the variation range of the color remains equal to the color range of the original image. This makes it impossible to produce the heightened contrast of applied lipstick, which is insufficient for a makeup simulator that virtually applies a makeup coating material.
- FIG. 1B shows the image processing in the case where the difference added to the channel image of each pixel component has a range.
- The formula in FIG. 1B defines a calculation comprising (1) the multiplication of the lipstick color by the color range of the lip part, and (2) the α blending of the resulting difference with the hue (H), saturation (C), and luminance (L) channels.
- The L channel in the figure refers to an image obtained by extracting the L component from the HCL pixel components constituting the lips.
- Likewise, the H channel and C channel are images obtained by extracting the H component and C component from the HCL pixel components constituting the lips.
- Calculating a difference from the lipstick for each of the channels contained in the image and adding it channel by channel is common to the configuration of FIG. 1A. The differences are as follows.
- First, the difference calculation differs.
- In the apparatus of FIG. 1A, the difference added to each of the H, S, and V channels is obtained by subtraction between the lip color and the lipstick color.
- In FIG. 1B, the difference added to each of the H, C, and L channels is obtained by multiplying a layer with a range, derived from the lip region and the average lip value, by the lipstick color.
- The multiplication operation mul1 in the figure multiplies the layer with the range by the lipstick color.
- The layer with the range of ΔL represents the face and lips in monotone.
- Here the face image is rendered in a manner like a negative image, in which the entire face is gray and the lips are white.
- The difference layer may also be "flat"; in FIG. 1B, the H channel layer is assumed to be a flat layer.
- Second, the nature of the difference differs. Because the difference of the apparatus in FIG. 1A is calculated by subtraction, it is flat and constant over the entire image plane, whereas in FIG. 1B it is a layer with a range obtained from the lip region and the average lip value, so the value of each pixel differs with its position in the plane and has a certain numerical range.
- Third, the layer composition differs.
- In FIG. 1A, composition adds the flat layer values to each of the H, S, and V channels constituting the lip image.
- In FIG. 1B, composition is performed by an operation including α blending of the C and L channels of the face image with the ranged layers for the C and L channels.
- The α blends bd1, bd2, and bd3 in the figure are one example of this composition operation: the C and L channels of the face image are multiplied by the α value, the ranged layers for the C and L channels are multiplied by the inverted value (1−α), and the results are added. Because the layers for each HCL pixel component being α-blended have a color range, the image after makeup has a vivid lip texture.
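The mul1 and bd1-bd3 operations just described can be sketched per channel as follows. The concrete range-map values and the α value in the test are illustrative assumptions; only the operation structure (multiply, then α blend) comes from the figure description.

```python
def make_layer(range_map, lipstick_value):
    """mul1: multiply the ranged monotone layer by the lipstick color,
    pixel by pixel, to obtain the ranged difference layer."""
    return [r * lipstick_value for r in range_map]

def alpha_blend(face_channel, layer, alpha):
    """bd1-bd3: out = alpha * face + (1 - alpha) * layer, per pixel."""
    return [alpha * f + (1 - alpha) * l for f, l in zip(face_channel, layer)]
```

Because `layer` varies with pixel position, the blended result keeps a spread of values, unlike the flat addition of FIG. 1A.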
- This embodiment discloses a makeup simulator that performs image processing on the L channel image, composed of the L components among the H, C, and L components constituting each pixel in the HCL space.
- The makeup simulator embodies the configuration requirements of the image processing apparatus using the hardware resources of a tablet terminal.
- FIG. 2A shows the appearance of the tablet terminal.
- The tablet terminal has a touch panel display 101 on the surface facing the user, and a self-portrait camera 102 at the upper end of the touch panel display.
- A self-portrait is a photograph of the user operating the tablet terminal, so the user's own image is captured by the tablet terminal.
- In the makeup simulator, such a self-portrait is displayed on the touch panel display and operations on it are accepted, allowing the user to perform virtual makeup.
- The self-portrait to be made up includes still image data shot in the camera's still image mode and the plural frame image data constituting a moving image shot in its moving image mode.
- Since the self-portrait before and after makeup is displayed on the touch panel 5 in front of the user, the tablet terminal becomes a digital mirror with a makeup function, allowing the user to check his or her own face, as if reflected in a mirror, after the makeup simulation.
- The "digital mirror" is a function that realizes makeup matched to the environment in which the user and the tablet terminal are placed, by performing image processing on the human image taken by the camera. For example, when the tablet terminal is placed outdoors, the self-image is processed according to the ambient light, so that makeup matched to the outdoor ambient light is applied to it. In this way, the user can visually check how the makeup would look if the tablet terminal and the user were in sunset light, or under street lights at a street corner at night.
- FIG. 2B shows the initial screen displayed when the touch panel 5 operates as a makeup simulator. It includes a button bn1 for accepting selection of the part to be made up, and a button bn2 for starting photographing of the user with the camera 102.
- FIG. 3 shows a situation where the user is taking a self-portrait using the camera 102 in the tablet terminal 100.
- Room lighting 200 and spot lighting 201 exist in the room where the tablet terminal 100 and the user are, and the illumination by the room lighting 200 and the spot lighting 201 becomes the ambient light.
- The appearance of the user illuminated by this ambient light becomes the target of the makeup simulator.
- makeup items such as lipstick and blusher and sample colors are displayed on the touch panel along with the self-portrait and recommended makeup information.
- a cosmetic item can be selected by touching or dragging the self-portrait displayed on the touch panel.
- A cosmetic item is a cosmetic product that can be used virtually in the makeup simulator; it is specified by the RGB value of a single sample pixel, the HCL value obtained by converting this RGB value, and a wavelength parameter.
- The wavelength parameter specifies the physical and optical characteristics of the makeup coating material (cosmetics) as a function of wavelength.
- When the wavelength parameter is used, the image processing apparatus performs a predetermined calculation on the RGB or HCL value of the original lip pixel together with the wavelength parameter to obtain the RGB or HCL value of the pixel with lipstick applied, and subsequent processing is performed on the value thus obtained.
- Cosmetic items such as lipstick are modeled as a single color sample.
- The representative color data file corresponding to a cosmetic item stores the RGB value of this single modeled color.
- Together with the HCL value, this defines which cosmetics the item can apply for makeup.
- That is, the RGB value and the HCL value of the single sample pixel define how the user is made up by the cosmetics corresponding to the cosmetic item.
- The value of the L component of the sample color pixel is the representative value of the color range to be taken by the lipstick application layer.
- From the cosmetic item, a lipstick application layer is generated by forming the lips with a plurality of pixels having a certain numerical range, with the L component value of its sample color pixel as the representative value.
- FIG. 4 (a) shows the user's self-portrait before makeup
- FIG. 4 (b) shows the user's self-portrait after makeup
- the GUIs in FIGS. 4A and 4B have a common screen configuration.
- The common screen configuration has a window for self-portrait display at the top, a tool for selecting cosmetic items at the lower right, and a tool box for selecting coloring at the lower left.
- a plurality of lipsticks are listed as a plurality of makeup items, and the user can select a makeup item that the user wants to make up by touching any product.
- the tablet terminal displays a list of color samples for coloring.
- The application color for the lip region is the representative color of the color distribution in that region; accordingly, the lipstick color is treated as the representative color of the lip region after lipstick application.
- FIG. 4B shows the made-up face image after the lipstick is selected, the repaint color is chosen, and the makeup is executed.
- The self-portrait after makeup is obtained by designating the region of the lip part in the face image of the original self-portrait, repainting the lip part with the selected lipstick item, and then compositing the result with the original face image.
- When the lipstick is selected, the image processing apparatus extracts feature points of the lip part from the face image serving as the self-portrait and specifies the region by a contour line.
- Using the selected cosmetic item, it then generates a lipstick application layer representing the lip part with lipstick applied and composites it with the human image.
- the “face image” refers to a self-portrait in which facial feature points characterizing facial parts are detected at a certain level or higher.
- a self-portrait in which a facial feature point characterizing a human hand part is detected above a certain level is called a “hand image”.
- The image processing apparatus performs makeup processing in accordance with the user's selection of a cosmetic item and of the part to which it is applied, and displays the result on the screen of the touch panel 5. Since the makeup result can be reflected in real time, the user can easily check the simulation result, reducing the trouble of physically trying cosmetics.
- FIG. 5A shows the hardware resources of the tablet terminal used by the image processing apparatus.
- the hardware configuration of the tablet terminal is configured to process a captured image acquired through a camera.
- The hardware of the tablet terminal includes: an image sensor circuit 1 that photoelectrically converts the subject image formed by the imaging optical system of the camera 102 and outputs a video signal; an imaging control circuit 2 that performs exposure control using the focus lens, zoom lens, and shutter/aperture of the imaging optical system; an input pixel memory 3 that stores the self-portrait data to be processed; a plane memory 4 that stores the image resulting from makeup processing of the self-portrait data in the input pixel memory 3; a touch panel 5, composed of an LCD, PDP, or organic EL elements, that displays the original self-portrait or the self-image after makeup; and a storage 6 comprising nonvolatile memory and an embedded disk.
- the GUI on tablet devices is composed of various widgets such as windows, scroll bars, radio buttons, text boxes, and pull-down menus.
- The event control circuit 9 outputs events related to the touch coordinates to change the state of each widget.
- Thereby an interactive operation environment using the GUI is provided to the user.
- FIG. 5B shows the recorded contents of the storage 6.
- the storage 6 includes a subject directory and a makeup condition directory under the root directory.
- The makeup condition directory is a directory for defining makeup conditions, and contains a plurality of item directories (item 001 directory, item 002 directory, item 003 directory) corresponding to the plurality of makeup items to be used for makeup. By designating one of the item directories under the makeup condition directory, the makeup is conditioned.
- FIG. 5C shows the configuration of the user 001 directory and the item 001 directory.
- the user 001 directory is a directory corresponding to the user 001 among a plurality of directories for each user existing in the user directory.
- Under it are stored an original image file storing the original self-portrait data obtained by shooting with the camera 102, and a makeup image file storing the makeup image data obtained by applying makeup to the self-image with the image processing apparatus.
- The item 001 directory is the directory corresponding to product item 001 among the per-item directories in the makeup condition directory. It stores a product image file containing product images and CM images of the cosmetic item such as a lipstick, a representative color data file indicating the RGB value of the color applied by the product item, and a manual file describing the handling of the product item.
- FIG. 6 is a diagram in which the data flow between the components is added to the hardware or software components constituting the image processing apparatus.
- The image processing apparatus comprises a lip region extraction unit 11, color space conversion units 12a and 12b, a temporary memory 13, a lip color range calculation unit 14, a lipstick application image layer generation unit 15, a synthesis processing unit 16, and a color space inverse conversion unit 17.
- The input of the image processing apparatus is a face image in the RGB color space and the RGB value of the target application color, and the output is an RGB image in which the target application color has been applied to the lip region of the face image.
- the image processing target for makeup includes both lips with no makeup and lips with lipstick already applied.
- In the latter case, the optical characteristics of the already-applied lipstick are also reflected in the made-up face image, so realistic makeup can be realized in this case as well.
- The lip region extraction unit 11 detects facial feature points in the input self-portrait by face detection and face part detection, extracts the feature points defining the lip part, generates a contour line connecting the feature points, and outputs a face image lip (RGBs) in which the lip region is designated by the contour line.
- As the extraction method, various techniques such as color segmentation, edge detection, and part detection using a face model can be employed.
- Here, lip region detection by face part detection is used as an example.
- Mask data that masks the region surrounded by the lip feature points as the effective region is also created.
- the color space conversion unit 12 converts the color space (RGB, YUV, etc.) of the face image lip (RGBs) in which the region of the lip is designated into a processing intermediate color space, and stores the conversion result in the temporary memory 13.
- An HCL color space expressed by hue (H), saturation (C), and luminance (L) can be cited as the processing intermediate color space.
- The lip color range calculation unit 14 calculates, from the region-designated face image lip (HCLs), the minimum, maximum, and average values representing the fluctuation range of the L component, a specific pixel component, and specifies the color range of the L component from these values. It then outputs a range map that associates, with each pixel position, the calculated average value and the ratio of each pixel in the L channel.
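The range map calculation described above can be sketched as follows. The exact definition of the per-pixel ratio is not spelled out in this text, so the formula below (offset from the average, normalized by the range) is one plausible reading, and the function name is illustrative.

```python
def lip_color_range(l_channel):
    """l_channel: L component values of the region-designated lip pixels.
    Returns (min, max, average, range_map) for the L channel."""
    lo, hi = min(l_channel), max(l_channel)
    avg = sum(l_channel) / len(l_channel)
    # per-pixel ratio: assumed here to be the offset from the average,
    # normalized by the fluctuation range of the L component
    range_map = [(v - avg) / (hi - lo) for v in l_channel]
    return lo, hi, avg, range_map
```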
- The lipstick application layer generation unit 15 generates the lipstick application layer as a layer to be synthesized onto the lips, the region designated in the original self-portrait.
- The lipstick application layer generation unit 15 calculates appropriate average, minimum, and maximum values for the lipstick-coated lips from the representative color of the lipstick, which is the target application color converted into the HCL space, and from the minimum, maximum, and average lip values output by the lip color range calculation unit 14.
- A lipstick application layer is then obtained from these maximum, minimum, and average values of the lipstick-applied lips together with the range map of the lips.
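The layer generation step described above can be sketched as follows. Deriving the new minimum and maximum by scaling with the ratio of the lipstick's L value to the lip average is an assumption consistent with the expansion described in the claims; the text here only says that "appropriate" values are calculated.

```python
def generate_layer(lipstick_l, lo, hi, avg, range_map):
    """lipstick_l: representative L value of the lipstick (HCL sample color).
    lo, hi, avg: min/max/average of the original lip L channel.
    range_map: per-pixel ratios from the lip color range calculation."""
    ratio = lipstick_l / avg                        # assumed expansion factor
    new_lo, new_hi = lo * ratio, hi * ratio         # "appropriate" min and max
    new_avg = lipstick_l                            # lipstick becomes representative
    # rebuild each layer pixel from its range-map share of the new range
    return [new_avg + r * (new_hi - new_lo) for r in range_map]
```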
- The synthesis processing unit 16 synthesizes the L channel of the lipstick application layer generated by the lipstick application layer generation unit 15 with the L channel of the region-designated face image lip (HCLs) stored in the temporary memory, and outputs a face image made_face (HCLs) with lipstick applied. Examples of the synthesis processing include α blending, multiplication, addition, and soft light. To realize more accurate application, a mask that permits synthesis only within the lip region is used to limit the pixels to be combined.
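The mask-limited synthesis described above can be sketched with α blending, one of the listed synthesis operations. The α value and the boolean mask representation are illustrative assumptions.

```python
def synthesize(face_l, layer_l, mask, alpha=0.4):
    """face_l: L channel of the face image; layer_l: L channel of the
    lipstick application layer; mask: truthy where the lip region is.
    Pixels outside the mask pass through unchanged."""
    return [
        alpha * f + (1 - alpha) * l if m else f
        for f, l, m in zip(face_l, layer_l, mask)
    ]
```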
- the color space reverse conversion unit 17 converts the face image made_face (HCLs) after lipstick application from the processing intermediate color space to a color space that the device can display (RGB, YUV, etc.), and outputs the face image made_face (RGBs).
- FIG. 7A shows a face image face (RGBs).
- the face image face (RGBs) is, for example, an image with SD image quality (640 × 480), Full HD image quality (1920 × 1080), or 4K image quality (7680 × 4320); in the case of Full HD image quality, it consists of 1920 horizontal × 1080 vertical pixels of data.
- asg1 indicates a bit assignment for one pixel bit value.
- the bit length of one pixel of data is 32 bits, composed of 8 gradation bits for the red (R) component, 8 gradation bits for the green (G) component, 8 gradation bits for the blue (B) component, and 8 gradation bits for the transparency (α) component.
- ext1 indicates extraction of a lip image, a rectangular pixel group including the lips whose region has been designated. Such a rectangular pixel group consists of N vertical × M horizontal pixels, where N and M are the vertical and horizontal pixel counts of the smallest rectangle containing the lips and are variable.
- Asg2 in the figure indicates the pixel bit value of one pixel constituting the lip image (RGBs).
- Such pixel bit values are likewise composed of 8 gradation bits for the red (R) component, 8 gradation bits for the green (G) component, 8 gradation bits for the blue (B) component, and 8 gradation bits for the transparency (α) component.
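The 32-bit pixel layout described above can be sketched as four 8-bit fields packed into one integer. The byte order (R in the high byte) is an assumption for illustration, since the patent does not specify one.

```python
def pack_rgba(r, g, b, a):
    """Pack four 8-bit components into one 32-bit pixel value."""
    return (r << 24) | (g << 16) | (b << 8) | a

def unpack_rgba(px):
    """Recover the four 8-bit components from a 32-bit pixel value."""
    return ((px >> 24) & 0xFF, (px >> 16) & 0xFF, (px >> 8) & 0xFF, px & 0xFF)
```

The same packing applies to the HCL-format pixels described next, with H, C, and L in place of R, G, and B.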
- FIG. 7B shows lip images (HCLs) obtained by converting the pixel bit value of the lip image (RGBs) of FIG. 7A from RGB format to HCL format.
- An arrow asg3 indicates a bit assignment in one pixel data of the lip image (HCLs).
- the bit length of one pixel of data is 32 bits, composed of 8 gradation bits for the hue (H) component, 8 gradation bits for the saturation (C) component, 8 gradation bits for the luminance (L) component, and 8 bits for the transparency (α) component.
- FIG. 7C shows lipstick application layers (HCLs) superimposed on the lip images (HCLs).
- An arrow asg4 indicates a bit assignment in one pixel data of the lipstick application layer (HCLs).
- the bit length of one pixel of data is 32 bits, composed of 8 gradation bits for the hue (H) component, 8 gradation bits for the saturation (C) component, 8 gradation bits for the luminance (L) component, and 8 bits for the transparency (α) component.
- FIG. 8 is a diagram specifically showing a process of image processing by the image processing apparatus.
- Gn1 in the figure indicates generation of mask data for the face image face (HCLs). In this mask data, the lip portion is white and the other portions are black.
- Us1 in the figure indicates the use of the lip portion of the face image when generating the lipstick application layer.
- Us2 in the figure indicates the use of the lipstick in generating the lipstick application layer.
- An arrow oby1 indicates the composition order for the lipstick application layer, the mask data, and the lips.
- An arrow us3 indicates the use of the mask data in the composition, and us4 indicates the use of the lipstick application layer in the composition.
- aw3 indicates the completion of the final makeup image after such synthesis.
- so that the generated lipstick image gives the human eye an impression similar to the color of the original image, the relationship between the representative lipstick color aveStick and the lipstick color range rangeL' follows, as closely as possible, the relationship between the representative color aveLip and the lip color range rangeL in the original lips.
- that is, the ratio a / b between the distance a from the lip representative color aveLip to the maximum value Lmax and the distance b from aveLip to the minimum value Lmin is maintained between the lipstick representative color aveStick and its maximum value L'max and minimum value L'min.
- a color range adjustment coefficient ⁇ is provided for contrast adjustment and saturation adjustment of the repainted color.
- the color range adjustment coefficient ⁇ may be given from outside the apparatus.
- Expression 2 is a calculation formula of the color range rangeL of the original color. The difference between the minimum value Lmin and the maximum value Lmax of the original color is obtained as the color range rangeL.
- pixels constituting the image are scanned and processed in units of lines.
- the i-th pixel to be processed during this line-by-line scan is defined as pixel[i], and the H component, C component, and L component of pixel[i] are denoted H[i], C[i], and L[i].
- as shown in (Equation 5), the i-th pixel value L'[i] of the lipstick application image is calculated from the range map value rmap[i] of the current pixel, the lipstick color range rangeL', and the minimum color value L'min of the lipstick application image.
- each pixel value of the lips can be calculated by the equation (Equation 6).
- FIGS. 9A and 9B show the distribution of the L component of the lip part, which is the target part, and the distribution of the L component of the lip part, which is the target part of the lipstick application layer.
- the X-axis and Y-axis in FIG. 9A are the horizontal and vertical directions of the image, and the height direction indicates the size of the L component of each pixel.
- Distribution curves cv1 and cv2 in the figure are curves obtained by mapping the L component, which is the z coordinate of the object, onto the YZ plane and the XZ plane. The minimum value of this curve is Lmin, and the maximum value is Lmax.
- Lmin and Lmax define the distribution range of the L component of the lip before repainting.
- (xi, yi) in the figure means the xy coordinates of the i-th pixel among the pixels constituting the lip part.
- the L component of the Z axis of xi and yi is L [i].
- the three-dimensional shape shp1 of the lip part in the figure is formed by plotting the value of the L component of the pixel of the lip part on the Z axis. When the average value of the L component pixels is aveLip, a straight line indicating aveLip crosses the distribution curve at the position of the intermediate portion in the height direction.
- the X-axis and Y-axis are the horizontal and vertical directions of the image, and the height direction indicates the size of the L component of each pixel.
- Distribution curves cv3 and cv4 in the figure are curves obtained by mapping the L component, which is the z coordinate of the object, to the YZ plane and the XZ plane. The minimum value of this curve is L'min, and the maximum value is L'max. L'min and L'max define the numerical range of the luminance component of the lip after repainting. Further, (xi, yi) in the figure means the xy coordinates of the i-th pixel among the pixels constituting the lip part.
- the L component that is the value of the Z axis of xi and yi is L ′ (i).
- the three-dimensional shape shp2 of the lip part in the figure is formed by plotting the L component value of the pixel of the lip part on the Z axis.
- the straight line indicating aveStick crosses the distribution curve at any part in the height direction.
- FIGS. 10A and 10B are diagrams schematically illustrating the concept of color range expansion by an image processing apparatus.
- Cont1 in FIG. 10A is the contrast of the lower lip in the user's original human image.
- cont2 is the contrast of the lower lip part in the face image after makeup, and cont1 has a color range of rangeL from Lmin to Lmax as indicated by the leader line pu1.
- cont2 has a color range of rangeL' from L'min to L'max as indicated by the leader line pu2. Since the color range of the same lower-lip part is greatly expanded, it can be seen that the lips after lipstick application are expressed by L components spanning a wide numerical range.
- the color range rangeL of the original lip region before enlargement and the color range rangeL ′ after enlargement exist on the Z axis.
- the aveStick in the lipstick application layer is located higher than the aveLip in the lip part.
- FIG. 10B is a diagram in which the meanings of Equations 1 to 4 are annotated on the color range of the original face image and the color range of the made-up face image shown in FIG. 10A.
- ⁇ in Equation 1 represents the ratio of the color range
- Equation 2 represents the length of the color range in the original face image
- Equations 3-1 and 3-2 represent the distance from Lmax to Lave in the color range
- Equations 4-1 to 4-3 indicate the upper and lower limits of the color range and the length of the color range.
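Gathering the quantities named above, the formulas referenced as Equations 1 through 4-3 can be reconstructed from the color range generation procedure of FIG. 15A roughly as follows. This is a reconstruction from the surrounding text, not the patent's verbatim notation:

```latex
\begin{align*}
\beta &= \frac{aveStick}{aveLip} && \text{(Equation 1)} \\
rangeL &= L_{max} - L_{min} && \text{(Equation 2)} \\
k_a &= \frac{L_{max} - aveLip}{rangeL} && \text{(Equation 3-1)} \\
rangeL' &= rangeL \times \beta && \text{(Equation 4-1)} \\
L'_{max} &= aveStick + rangeL' \times k_a && \text{(Equation 4-2)} \\
L'_{min} &= L'_{max} - rangeL' && \text{(Equation 4-3)}
\end{align*}
```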
- the difference between L′ max and aveStick is ⁇ times the difference between Lmax and aveLip.
- rangeL' is obtained by multiplying rangeL by β. Since β is the ratio between aveStick and aveLip, it can be seen that the difference between L'max and aveStick in rangeL' is the difference between Lmax and aveLip in rangeL, increased by that ratio. Furthermore, since L'min is obtained by subtracting rangeL' from L'max, it too corresponds to the ratio between aveStick and aveLip.
- mapping of the L component of the pixel from the face image to the lipstick application layer will be described.
- L[i] of the pixel having Lmax, the maximum value of rangeL, is mapped to L'max.
- L[i] of the pixel having Lmin, the minimum value of rangeL, is mapped to L'min.
- for the remaining pixels, mapping is performed so that the ratio each pixel occupies within rangeL in the original image is also maintained within rangeL'.
- 11 (a) and 11 (b) show a process of mapping the i-th pixel in the original lip region from the color range of the original lip region to the color range of the lip region after lipstick application.
- the guideline for determining pixel values, common to FIGS. 11A and 11B, is as follows: L'[i] is determined so that the right triangle having the straight line through L'[i] and L'max as one side is similar to the right triangle having the straight line through L[i] and Lmax as one side.
- FIG. 11A shows the relationship between L [i] and L ′ [i].
- the straight line 1 is a straight line that passes through Lmax and L [i].
- the straight line 2 is a straight line that passes through L′ max and L ′ [i].
- since β is the slope of line1, L'[i] exists at the position aveStick + (L[i] − aveLip) × β.
- accordingly, the slope of the straight line line2 passing through aveStick and L'[i] becomes the same as the slope of the straight line line1 passing through aveLip and L[i]. This is because aveStick is obtained by multiplying aveLip by β, and the difference between L'[i] and aveStick is equal to the difference between L[i] and aveLip multiplied by β.
- L[i] and L'[i] thus keep corresponding differences from aveLip and aveStick, respectively.
- when L[i] is mapped to the color range of the lipstick application layer, the relative position of L'[i] within the lipstick color range is therefore no different from the relative position of L[i] within the color range of the lip region.
- L ′ [i] is determined for each pixel constituting the lipstick application layer.
- the lipstick application layer has different Z coordinate values according to the XY coordinates, and is defined as a “layer with range” in which the Z-axis direction has a range.
- FIG. 11 (b) shows the meaning of rmap (i).
- rmap[i] represents the ratio between the difference L[i] − Lmin and the difference Lmax − Lmin.
- the near side shows the meaning of Equation 6.
- L'[i], which is the L component of the i-th pixel, is calculated by multiplying rangeL' by rmap[i], which indicates the above ratio, and adding L'min.
- since L'[i] is obtained by multiplying rmap[i], the ratio of L[i] within the color range of the original lip region, by rangeL' and adding the offset L'min, the amount of change of L'[i] is equal to the amount of change of L[i] with respect to Xi or Yi.
- the slope of the straight line passing through Lmax and L[i] is equal to the slope of the straight line passing through L'max and L'[i]. Since this relationship holds for all pixels constituting the lip region after repainting, the subtle skin texture that appears in each pixel of the lips is expressed by abundant L components in the self-image after makeup.
- L'max of the lipstick application layer becomes Lmax + ΔL, and L'min becomes Lmin + ΔL.
- that is, L'max and L'min of the color range of the lipstick application layer are each increased by ΔL compared with Lmax and Lmin of the original lips. Since the minimum and maximum values of the lip color range are raised uniformly, a sense of volume is lost compared with the original image. In addition, since ΔL is added to Lmin, the shadowed portions of the lips become brighter by ΔL.
- the processing contents of the constituent elements of the image processing apparatus described so far can be generalized as processing procedures for hardware resources according to various external events and internal parameters of the apparatus. Such generalized processing procedures are shown in the flowcharts described below.
- FIG. 12 is a flowchart of feature point extraction of the lip part in the first embodiment.
- the flowchart of FIG. 12 corresponds to the highest level process, that is, the main routine, and the flowcharts of FIGS. 13 to 16 exist as flowcharts at a lower level of this flowchart.
- face detection is performed on the input image.
- Step S2 is a determination of whether a face has been found. If a face is found, more detailed face part detection is performed in step S3, and feature points for each face part are output.
- step S4 in order to roughly specify the processing target region, the lip region is designated by generating a contour line that defines the contour shape of the lip.
- mask data is created. This mask data is for masking non-lip portions, for example, skin and tooth regions with zero.
- if no face is found, the process skips steps S3 to S5 and proceeds to step S6.
- in step S6, a lipstick color range is generated from the lip color range in the original self-portrait and the input representative lipstick color.
- next, a lipstick application layer is generated from the generated lipstick color range and the lip color range map.
- in step S8, the lipstick application layer is combined with the face image.
- FIG. 13A is a flowchart showing details of the above-mentioned lip effective mask creation processing.
- the processing procedure shown in this flowchart is made into a subroutine; at the time of the subroutine call, after accepting an argument specifying the face image, the processing shown in this flowchart is executed, and mask data is returned as the return value.
- a variable j in FIG. 13A is a variable that indicates a target to be processed among a plurality of information elements existing in the data structure. Accordingly, in the subsequent flowcharts, a pixel mask to be processed in the process of the j-th loop is denoted as mask [j].
- in step S10, one pixel[j] to be processed is selected, and then step S11 is executed.
- in step S11, it is checked whether the pixel to be processed belongs to the inside or the outside of the lip region.
- in step S12, it is determined whether or not the pixel to be processed is within the region. If it is within the region (Yes in S12), the mask value mask[j] is set to 1.0 in step S13; if it is outside the region (No in S12), mask[j] is set to 0.0 in step S14.
- in step S15, it is determined whether there is an unprocessed pixel in the region; if there is, the process returns to step S10.
- finally, mask blurring processing is executed (S16), and the processing is terminated.
- the purpose of this effective mask creation process is to distinguish the lips from non-lip parts (teeth, skin, background, etc.), limit the processing range of the synthesis processing unit 16, and perform synthesis only in the necessary region. Determining the lip region finely also has the advantage that the lip representative color aveLip can be calculated more accurately.
- the region surrounded by the feature points detected by the above face part detection is a lip region.
- the mask value mask [j] of the current pixel is 1.0 if it is within the lip region, and conversely mask [j] is 0.0 if it is outside the region.
- mask blurring processing is performed as post-processing.
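A minimal sketch of the mask creation just described, assuming a hypothetical `inside_lip` predicate for the contour test and using a simple 3-tap box blur as a stand-in for the unspecified blurring of step S16:

```python
def make_lip_mask(pixels, inside_lip):
    """Steps S10-S15: 1.0 for pixels inside the lip region, 0.0 otherwise."""
    return [1.0 if inside_lip(p) else 0.0 for p in pixels]

def blur_mask(mask):
    """Step S16 stand-in: soften the mask edge with a 3-tap box blur."""
    n = len(mask)
    return [
        (mask[max(i - 1, 0)] + mask[i] + mask[min(i + 1, n - 1)]) / 3.0
        for i in range(n)
    ]
```

Blurring turns the hard 0/1 boundary into intermediate values, which later makes the α-blended transition between lipstick and skin gradual rather than abrupt.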
- FIG. 13B is a high-level flowchart of the lip color range calculation procedure.
- the color range is obtained by calculating the minimum value, maximum value, and average value of the L channel for the input lip image.
- step S22 a color range map representing the L channel distribution is calculated.
- FIG. 13C is a flowchart showing a procedure for generating the lipstick application layer.
- step S23 a lip color range to which lipstick is applied is generated, and in step S24, a lipstick application layer is generated from the lip color range and range map.
- FIG. 14A is a flowchart embodying color range calculation.
- a variable i in FIGS. 14A and 14B is a variable that indicates a target to be processed among a plurality of information elements existing in the data structure. Accordingly, in the following flowcharts, the L component of the pixel to be processed in the i-th cycle of the loop is expressed as L [i] and the pixel mask is expressed as mask [i].
- in step S31, the next pixel to be processed is set to L[i].
- in step S32, it is determined whether the mask value mask[i] of pixel[i] is larger than the threshold insideTH, thereby checking whether the current pixel is a pixel within the lip region or a pixel outside it (skin, teeth, etc.).
- min(,) is a function that returns the smaller of its two arguments.
- Lmin = min(L[i], Lmin); that is, the smaller of L[i] and Lmin becomes the new Lmin.
- max(,) is a function that returns the larger of its two arguments.
- Lmax = max(L[i], Lmax); that is, the larger of L[i] and Lmax becomes the new Lmax.
- step S35 sumL is updated by adding L [i] to sumL.
- in step S36, it is determined whether all pixels in the loop have been processed. If an unprocessed pixel remains, the process returns to step S31. If none remains, step S36 becomes No, aveLip is obtained by calculating sumL / count in step S37, and Lmax, Lmin, and aveLip are returned.
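The scan of FIG. 14A can be sketched as follows. The concrete value for the insideTH threshold is an assumption; the patent does not give one.

```python
INSIDE_TH = 0.5  # assumed value for the insideTH threshold

def lip_color_range(l_values, mask):
    """Minimum, maximum, and average of the L channel over masked lip pixels."""
    lmin, lmax = float("inf"), float("-inf")
    sum_l, count = 0.0, 0
    for l, m in zip(l_values, mask):
        if m > INSIDE_TH:          # step S32: is this a lip pixel?
            lmin = min(l, lmin)    # running minimum Lmin
            lmax = max(l, lmax)    # running maximum Lmax
            sum_l += l             # step S35: accumulate sumL
            count += 1
    return lmin, lmax, sum_l / count   # step S37: aveLip = sumL / count
```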
- FIG. 14B is a flowchart showing the detailed processing procedure of the color range map calculation.
- in step S41, the next pixel to be processed is set to L[i]. In step S42, the ratio between the difference L[i] − Lmin, between the current pixel value and the minimum value, and the difference Lmax − Lmin, between the maximum value and the minimum value, is calculated as the range map value rmap[i] for the pixel L[i].
- step S43 it is determined whether or not the above processing has been performed for all the pixels existing in the region. If the above processing has been performed for all the pixels, rmap [i] is returned. If unprocessed pixels remain, the process proceeds to step S41, and a range map is calculated for the unprocessed pixels.
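Under the same notation, the range map calculation of FIG. 14B amounts to the following sketch:

```python
def range_map(l_values, lmin, lmax):
    """rmap[i] = (L[i] - Lmin) / (Lmax - Lmin): each pixel's relative
    position within the lip color range, from 0.0 to 1.0."""
    return [(l - lmin) / (lmax - lmin) for l in l_values]
```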
- FIG. 15A is a flowchart showing a color range generation procedure in the luminance component of the lipstick application layer.
- in step S51, the ratio β is calculated as aveStick / aveLip, and in step S52, rangeL is obtained by subtracting Lmin from Lmax.
- a value obtained by multiplying rangeL by β is set as rangeL'.
- a coefficient ka is obtained by calculating (Lmax − aveLip) / rangeL.
- in step S55, L'max is obtained by adding to aveStick the value obtained by multiplying rangeL' by the coefficient ka.
- a value obtained by subtracting rangeL' from L'max is set as L'min. When L'max and L'min have been obtained by the above processing, they are returned.
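The color range generation of FIG. 15A transcribes almost directly into code; variable names follow the text, and the reconstruction is inferred from the flowchart description rather than taken from the patent's source:

```python
def lipstick_color_range(lmin, lmax, ave_lip, ave_stick):
    """Derive the lipstick layer's color range from the lip color range."""
    beta = ave_stick / ave_lip            # step S51: color range ratio
    range_l = lmax - lmin                 # step S52: original range
    range_l2 = range_l * beta             # rangeL' = rangeL * beta
    ka = (lmax - ave_lip) / range_l       # coefficient ka
    l2max = ave_stick + range_l2 * ka     # step S55: L'max
    l2min = l2max - range_l2              # L'min
    return l2min, l2max, range_l2
```

For example, lips with L in [20, 120] (aveLip 70) and a lipstick with aveStick 140 give β = 2, so the 100-wide range expands to 200, illustrating the contrast expansion described above.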
- FIG. 15B is a flowchart showing a processing procedure for generating a luminance component of the lipstick application layer.
- a variable i in FIG. 15B is a variable that indicates a pixel to be processed among a plurality of pixels existing in the face image. Accordingly, in the following flowcharts, the L component of the pixel to be processed in the i-th cycle of the loop is expressed as L [i] and the pixel mask is expressed as mask [i].
- This flowchart includes a loop lp1.
- the loop lp1 defines a loop structure that repeats Steps S61 to S64 for all variables of the lip region.
- in step S61, L[i], the value of the L component of the i-th pixel, is acquired, and in step S62, it is determined whether or not the range map is used. If the range map is used, L'[i] is obtained by calculating L'min + rmap[i] × rangeL' in step S63.
- otherwise, L'[i] is obtained by calculating aveStick + (L[i] − aveLip) × β in step S64.
- FIG. 16 is a flowchart showing the processing procedure of the composition processing in the first embodiment.
- a variable i in FIG. 16 is a variable that indicates a target to be processed among the information elements existing in the data structure. In the following, the L component of the i-th pixel of the lip region to be processed is L[i], the L component of the corresponding pixel of the lipstick application layer is L'[i], and the pixel mask is mask[i].
- Step S71 is a determination as to whether or not the processing of all the pixels constituting the area has been completed. If completed, the process of this flowchart is terminated and the process returns.
- in step S72, the next pixel constituting the lip region is set to L[i], and in step S73, the next pixel constituting the lipstick application layer is set to L'[i].
- in step S74, gousei(L[i], L'[i]) is calculated as the result of combining each pixel L[i] of the lip region with the corresponding lipstick pixel value L'[i].
- next, mask processing is performed on the calculation result gousei(L[i], L'[i]).
- gousei(L[i], L'[i]) × mask[i] + L[i] × (1 − mask[i]) is the mathematical expression of this mask processing.
- the blending process gousei can use alpha blending, multiplication, addition, highlighting, and the like.
- in the case of α blending, the mask processing is realized by using the effective mask data mask[j] of the current pixel as the α value, applying the α blend to the result of gousei(L[i], L'[i]) and the pixel value L[i], and outputting the result as the final result outL[i].
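The mask-limited composition formula above can be sketched with α blending standing in for gousei; the fixed-alpha choice is an illustrative assumption, since the text leaves the blend mode configurable.

```python
def gousei(l, l2, alpha=1.0):
    """One blend choice: alpha-blend the layer value l2 over the lip value l."""
    return l2 * alpha + l * (1.0 - alpha)

def compose(face_l, layer_l, mask):
    """outL[i] = gousei(L[i], L'[i]) * mask[i] + L[i] * (1 - mask[i])."""
    return [
        gousei(l, l2) * m + l * (1.0 - m)
        for l, l2, m in zip(face_l, layer_l, mask)
    ]
```

A mask value of 1.0 takes the blended lipstick value; 0.0 leaves the original pixel untouched, so skin and teeth pass through unchanged.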
- Lmin, Lmax, aveLip, L'min, L'max, and aveStick calculated in the above process are stored in a metadata file, which is added and recorded in correspondence with the makeup image file in the directory for each user. This leaves the makeup conditions as a log.
- as described above, the color range of the made-up face image is obtained by expanding the color range of the original lips according to the ratio between the representative color of the face image and the application color of the lipstick.
- in the first embodiment, the L channel was the target of color range conversion and pixel bit value mapping. The second embodiment performs color range conversion and pixel bit value mapping on the H channel and C channel as well.
- the configuration of the image processing apparatus for this improvement is as shown in FIG. 17, in which the same components as those of the earlier configuration are given the same reference numerals.
- the output from the lip color range calculator 14 is different.
- the difference in output is that the lip color range calculation unit 14 of the first embodiment outputs Lmax, Lmin, and aveLip, whereas in the second embodiment it outputs, in addition to Lmax, Lmin, and aveLip, Hmax, Hmin, and Have for the H channel and Cmax, Cmin, and Cave for the C channel.
- the channel of the lipstick application layer generated by the lipstick application layer generation unit 15 is different.
- the difference in the channels of the lipstick application layer is that the lipstick application layer generation unit 15 of the first embodiment receives Lmax, Lmin, and aveLip and generates L'max, L'min, and aveStick.
- the lipstick application layer generation unit 15 of the second embodiment additionally accepts Hmax, Hmin, and Have for the H channel and Cmax, Cmin, and Cave for the C channel, in addition to generating L'max and L'min.
- it then also generates H'max, H'min, and H'ave for the H channel and C'max, C'min, and C'ave for the C channel.
- the configuration requirements for synthesizing with face images are different.
- the difference in the configuration for performing the synthesis is that the first embodiment has only the synthesis processing unit 16, which combines the L channel of the lipstick application layer with the L channel of the face image.
- in the second embodiment, in addition to a first synthesis processing unit 61 that combines the L channels of the lipstick application layer and the face image, synthesis processing units that combine the H channels and the C channels of the lipstick application layer and the face image are provided.
- the difference in processing procedure is that, in the first embodiment, color range calculation, color range map calculation, color range generation, lipstick application layer generation, and synthesis of the lipstick application layer and the face image were executed for the L channel.
- in the second embodiment, color range calculation, color range map calculation, color range generation, lipstick application layer generation, and synthesis of the lipstick application layer and the face image are performed not only for the L channel but also for the H channel and the C channel.
- FIGS. 21 to 23 are flowcharts showing the color range calculation procedure, color range map calculation procedure, color range generation procedure, lipstick application layer generation procedure, and lipstick application layer and face image synthesis procedure for the C channel.
- the difference between these flowcharts and those of the first embodiment lies only in the processing target: whereas the first embodiment performed color range calculation, color range map calculation, color range generation, lipstick application layer generation, and lipstick application layer and face image synthesis on the L channel, here they are performed on the H channel and the C channel.
- for some colors, such as yellow, the range over which the H component keeps its color identity is narrow; if such values are added to the H component, the color identity may be lost.
- therefore, how wide a color range identity each selectable cosmetic item has is stored in advance. When a cosmetic item is selected by the user, it is determined whether the identity of the sample color of the selected item is wide or narrow; if the cosmetic item has a wide color range, Hmin, Hmax, and Have of the H component are calculated as described above, and H'min, H'max, and H'ave are calculated as well.
- the synthesis processing is performed for each of the H channel, the C channel, and the L channel, which are channel images of a plurality of types of pixel components in the original face image. You can repaint it with a rich and full of feeling.
- this embodiment relates to an improvement for adding highlights and shadows that the user likes to the makeup face image.
- the image processing apparatus according to the third embodiment has the following differences.
- the range map acquisition method is different: whereas the first embodiment creates the range map from the image to be processed, the present embodiment acquires it by accessing a recording medium and reading a sample file that stores a range map corresponding to a makeup item.
- the range map values are also different: whereas each pixel value of the range map in the first embodiment is the difference (L[i] − Lmin) of that pixel divided by the color range (Lmax − Lmin), each pixel value in the range map sample of the present embodiment is given as a weight value from 0.0 to 1.0.
- the method of calculating the pixel bit values of the makeup face image from the range is different: in the first embodiment, the pixel bit value L[i] of the i-th pixel was calculated by multiplying rmap[i] by the color range rangeL' of the makeup face image and adding L'min.
- in this embodiment, the range map sample is scaled according to the size of the user's lips, and the value is calculated by multiplying the i-th weight value in the scaled range map sample by the color range rangeL' of the makeup face image and adding L'min.
- a sample file is recorded on a recording medium accessible by the image processing apparatus in association with each of a plurality of cosmetic items. These sample files are samples of the planar distribution showing how pixel values are distributed in a plane, and an appropriate sample file is read according to the selected cosmetic item.
- FIG. 24A shows the directory structure of the storage 6 in the third embodiment.
- the product image file, the representative color data file, and the manual file are stored in the directory corresponding to each item.
- a sample file that stores a sample of the range map is stored as a constituent requirement of the product item.
- the range map sample stored in the sample file has a customized luminance distribution according to the optical characteristics of the corresponding cosmetic item.
- FIG. 24B shows an example of a range map sample stored in the sample file.
- in the sample, the background portion consists of pixels with a weight of 0, and the foreground portion consists of pixels with non-zero weight coefficients.
- the pixels with non-zero weights form a lip shape; both the left and right mouth corners and the boundary between the upper and lower lips have a low value of 0.1.
- the lower lip, the bulging part of the upper lip, and the parts illuminated by lighting have a high value of 0.9. This expresses the highlight of being illuminated by indoor lighting.
- FIG. 24C shows lip highlights and shadows realized by the sample file. Highlights high1 and high2 in this figure are given by the value 0.9 set in the sample file of FIG. 24B, and shadows shd1 and shd2 are given by the value 0.1 set in the same sample file.
- Some lipsticks contain compositions such as reflective pigments, diffusion pigments, and multilayer interference pigments in addition to colorants.
- the colorant is sensitive to external stimuli.
- the reflective pigment performs metal reflection.
- the diffractive pigment is included in order to produce a light diffraction effect or a nacreous effect.
- the diffusion pigment produces color by an absorption phenomenon.
- multilayer interference pigments produce a color intensity greater than that of diffusion pigments owing to multilayer deposition and their reflection index.
- a plurality of sample files having different highlight amounts and positions, shadow shapes and positions are prepared, and each of these is set as a component requirement of the cosmetic item.
- FIG. 25 is a flowchart showing a procedure for generating a makeup face image using a sample file.
- the range map sample is read from the sample file corresponding to the selected makeup item (step S91), and the ratio between the vertical pixel number of the range map sample and the vertical pixel number of the lips is calculated (step S92).
- a ratio between the number of horizontal pixels of the range map sample and the number of horizontal pixels of the lips is calculated (step S93).
- the number of vertical pixels and the number of horizontal pixels of the range map sample are enlarged according to the calculated ratio (step S94), and the process proceeds to a loop of step S95 to step S99.
- This loop initializes the variable i to 1 (step S95), obtains the i-th weight value value[i] in the range map sample (step S96), multiplies value[i] by rangeL′ and adds L′min (step S97), and determines whether i has reached the maximum number N (step S98). If the maximum number has not been reached, the variable i is incremented (step S99) and the process returns to step S96.
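- The enlargement in step S94 and the loop of steps S95 to S99 amount to an elementwise affine mapping of the scaled sample weights. A minimal Python/NumPy sketch, assuming nearest-neighbor enlargement; the function and variable names (including the arguments standing in for rangeL′ and L′min) are illustrative, not from the source:

```python
import numpy as np

def apply_range_map_sample(sample, lip_h, lip_w, range_l, l_min):
    """Scale a range-map sample to the lip size (steps S92-S94), then map
    each weight value[i] to value[i] * rangeL' + L'min (steps S95-S99)."""
    sh, sw = sample.shape
    # Nearest-neighbor enlargement to the lip's pixel dimensions.
    ys = (np.arange(lip_h) * sh // lip_h).clip(0, sh - 1)
    xs = (np.arange(lip_w) * sw // lip_w).clip(0, sw - 1)
    scaled = sample[np.ix_(ys, xs)]
    # Multiply each weight by rangeL' and add L'min.
    return scaled * range_l + l_min

# A 2x2 sample holding a highlight weight (0.9) and a shadow weight (0.1).
sample = np.array([[0.1, 0.9], [0.9, 0.1]])
lum = apply_range_map_sample(sample, lip_h=4, lip_w=4, range_l=50.0, l_min=20.0)
```

With these inputs the 0.9 weights map to a luminance near 65.0 and the 0.1 weights near 25.0, reproducing the highlight/shadow contrast described above.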
- the pattern of the planar distribution indicating how the pixel values are distributed in a plane is sampled and recorded on the recording medium in the form of a sample file.
- the user of the image processing apparatus can enjoy various makeup variations by reading the sample file corresponding to the desired makeup mode.
- the weight value given to each pixel is changed according to the contour shape of the lips.
- the way of giving weighting factors is different.
- the difference in the weighting coefficients is that the weighting coefficient to be multiplied by aveLip is derived from the relative value of each pixel's bit value with respect to aveLip.
- a low weight is given to those existing in the mouth corner portion.
- the corner of the mouth is the joint between the upper lip and the lower lip, or both sides of the lip.
- the lip outline is formed by anchor points and interpolation lines.
- FIG. 26A shows the outline of the lip part formed by anchor points and interpolation lines.
- the left side of the mouth corner is specified, with reference to the leftmost anchor point Xmin having the smallest X coordinate, by an interpolation line left1 extending from the anchor point to the upper right and an interpolation line left2 extending from the anchor point to the lower right.
- the right side of the mouth corner is specified, with reference to the rightmost anchor point Xmax having the largest X coordinate, by an interpolation line right1 extending from the anchor point to the upper left and an interpolation line right2 extending from the anchor point to the lower left.
- the boundary between the upper and lower lips is specified by a series of interpolation lines bord1, bord2, bord3, bord4 running from the left anchor Xmin to the right anchor Xmax.
- FIG. 26B shows the shadow of the lips generated by lowering the weight coefficient of the pixels existing along the interpolation line.
- the shape of the shadow along the base can be changed according to the shape of the lips.
- FIG. 26 (c) shows a process of changing the curvature of the interpolation line.
- the bar in (c) is an operation bar pulled out from the anchor, and can be rotated counterclockwise around the anchor.
- the left side of (c) shows the state before rotation, and the right side the state after rotation.
- FIG. 27 is a flowchart showing a processing procedure for emphasis processing of the base portion.
- the anchor point Xmax having the largest X coordinate is identified (step S101), and the interpolation line extending from the anchor point Xmax to the upper left and the interpolation line extending to the lower left are set as the mouth-corner part on the right side (step S102).
- the anchor point Xmin having the smallest X coordinate is identified (step S103), and the interpolation line extending from the anchor point Xmin to the upper right and the interpolation line extending to the lower right are set as the mouth-corner part on the left side (step S104).
- In step S105, the set of interpolation lines from Xmin to Xmax is specified, and the specified set of interpolation lines is set as the boundary line between the upper and lower lips (step S106). Pixels existing around the right mouth-corner part, the left mouth-corner part, and the boundary between the upper and lower lips are given relatively low weights (step S107).
- the contour shape of the lips can be emphasized, and the appearance of the makeup can be improved.
- the present embodiment relates to an improvement for realizing a makeup operation for moving images.
- the target of pixel value conversion is different between the present embodiment and the first embodiment: the pixel value conversion of the first embodiment targets one still image, whereas the pixel value conversion of the present embodiment targets each of the plurality of frame images constituting a moving image.
- the makeup method is also different: the makeup of the first embodiment changes the color of the lip region designated through feature points, using a lipstick item selected by the user, whereas the makeup of the present embodiment changes the color of an arbitrary part of the face by hand-painting with a cosmetic item.
- the target region of pixel value conversion is different as well: the pixel value conversion of the first embodiment targets the lip region designated through feature points, whereas the pixel value conversion of the present embodiment targets the range in which the hand-painting operation is accepted.
- In the present embodiment, since a plurality of frame images constituting a moving image are to be processed, there are two modes: an operation mode in which one of the plurality of frame images is fixed as a still image and a makeup operation is provided, and a confirmation mode in which the frame images after makeup are reproduced as a moving image for confirmation.
- FIG. 28 shows three phases according to mode switching.
- the phase ph1 on the left side is a phase for selecting a frame image as a base of the operation while viewing individual frame images of the input moving image.
- the mode is the operation mode, and the individual frame images constituting the moving image are sequentially displayed on the touch panel. By touching the screen, one frame image is captured, and the still image that is the target of the makeup operation is determined.
- the middle phase ph2 is a phase for accepting a makeup operation based on a still image obtained by capturing a frame image.
- the first level of ph2 indicates that the mode is set to the operation mode.
- the second level shows a situation where a makeup operation is being performed.
- the frame image Fx in the second row is a frame image used for still image display as an operation mode target. In this situation, a frame image Fx that is a self-portrait captured as a still image is displayed on the display, and the surface of the display is traced with a finger. By this finger tracing operation, a hand-painted image of eyebrows, a hand-painted image of blusher, and a hand-painted image of lipstick are combined into a still image.
- the rightmost phase ph3 shows the situation where the mode has been switched to the confirmation mode again after the hand-painting operation in the operation mode.
- the frame image Fx + m means any one frame image among the moving images displayed in the confirmation mode.
- a hand-painted image of eyebrows, a hand-painted image of blusher, and a hand-painted image of lipstick are mapped to portions corresponding to the eyebrows, cheeks, and lips of the subsequent frames.
- Lmin, Lmax, and aveLip are calculated for the hand-painted part, and aveStick is calculated for the representative color of the cosmetic item selected for hand-painting. Then, the ratio β is obtained from aveLip and aveStick, and the pixel bit values of the hand-painted part are mapped to the color range of the hand-painted cosmetic item.
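- The text does not spell out how the ratio β is derived from aveLip and aveStick; one plausible reading, consistent with the multiply-add described later (multiply the deviation of L[i] from aveLip by β, then add aveStick), is sketched below. The definition β = aveStick / aveLip and all names here are illustrative assumptions:

```python
import numpy as np

def map_to_item_range(l_pixels, ave_stick):
    """Map the L components of a hand-painted region into the color range of
    the selected cosmetic item: L'[i] = (L[i] - aveLip) * beta + aveStick."""
    l_pixels = np.asarray(l_pixels, dtype=float)
    ave_lip = l_pixels.mean()        # average luminance of the painted region
    beta = ave_stick / ave_lip       # assumed definition of the ratio
    return (l_pixels - ave_lip) * beta + ave_stick

mapped = map_to_item_range([40.0, 50.0, 60.0], ave_stick=75.0)
```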
- the user can visually confirm how he or she appears in motion with the makeup applied.
- the above hand-painted image mapping is performed by extracting feature points that define the contour of each part from each frame, and obtaining a transformation matrix for converting the group of feature points existing in one frame image into the feature points of another frame image.
- a feature point group composed of a plurality of feature points is detected from the frame image Fx and the frame image Fx + m. These feature points define the contour shape of the part of the object.
- the image processing apparatus associates feature points by searching for corresponding points between the frame image Fx set as a still image and the subsequent frame image Fx + m.
- the corresponding point search between frame images is performed by calculating, for each pixel, a correlation value based on luminance values or the like, and detecting the pixel having the highest correlation value. If a makeup operation by hand-painting is performed on any part represented in the still image, the hand-painted image related to the operation is mapped to the part in the subsequent frame image that has been associated, by the corresponding point search, with the part related to the operation.
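- The correlation-based corresponding-point search can be sketched as template matching over a small search window. Normalized cross-correlation of luminance is used here as one concrete correlation measure; since the text only says "a correlation value based on a luminance value or the like", the measure, window sizes, and all names below are assumptions:

```python
import numpy as np

def find_corresponding_point(src, dst, pt, patch=3, search=5):
    """Find the pixel in dst whose surrounding patch best matches the patch
    around pt in src, using normalized cross-correlation of luminance
    (a sketch of the corresponding-point search between Fx and Fx+m)."""
    y, x = pt
    tpl = src[y - patch:y + patch + 1, x - patch:x + patch + 1].astype(float)
    tpl = tpl - tpl.mean()
    best, best_pt = -np.inf, pt
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            cand = dst[yy - patch:yy + patch + 1,
                       xx - patch:xx + patch + 1].astype(float)
            if cand.shape != tpl.shape:
                continue  # candidate window falls outside the frame
            cand = cand - cand.mean()
            denom = np.sqrt((tpl ** 2).sum() * (cand ** 2).sum())
            corr = (tpl * cand).sum() / denom if denom else -np.inf
            if corr > best:
                best, best_pt = corr, (yy, xx)
    return best_pt

# Fx+m simulated as Fx shifted down 1 pixel and right 2 pixels.
rng = np.random.default_rng(0)
src = rng.random((20, 20))
dst = np.roll(src, shift=(1, 2), axis=(0, 1))
match = find_corresponding_point(src, dst, (10, 10))
```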
- FIG. 29A shows a feature point group detected in the frame image Fx and the frame image Fx + m.
- the feature point groups gp1, gp2, gp3, and gp4 in FIG. 29A surround the representative parts (eyebrows, lips, cheeks) of the face image in the frame image Fx, and define the contour shape of the parts.
- a feature point group gp11, gp12, gp13, gp14 in the figure surrounds a representative part of the face image in the frame image Fx + m and defines the contour shape of the part.
- Arrows sr1, sr2, sr3, and sr4 schematically show a process of searching for corresponding points performed by feature points in the frame image Fx and feature points in the frame image Fx + m. This correspondence point search defines the correspondence between the feature point group that defines the eyebrow in the frame image Fx and the feature point group that defines the eyebrow in the frame image Fx + m.
- FIG. 29B shows a transformation matrix that defines transformation of feature points between the frame image Fx and the frame image Fx + m.
- H1, H2, H3, and H4 indicate transformation matrices that define feature point transformation between corresponding parts.
- the hand-painted image obtained by the hand-painting operation on the frame image Fx is mapped to the frame image Fx + m.
- the hand-painted image is deformed and displayed according to the appearance of the feature points indicating the frame image Fx + m.
- FIG. 30A shows a plurality of feature points existing in the frame image Fx and the frame image Fx + m.
- FIG. 30B shows an example of the transformation matrix.
- the transformation matrix H in FIG. 30(b) is a matrix that converts the feature points i1, i2, i3, i4 ... i8 in the frame image Fx into the feature points j1, j2, j3, j4 ... j8, and is composed of 8 × 8 matrix components.
- FIG. 30 (c) shows a situation where a hand-painted image is drawn on a still image.
- a trajectory trk1 in the figure is a trajectory obtained by tracing the face drawn on the touch panel with a finger, and this trajectory is specified as a hand-painted image.
- the middle arrow cv1 schematically shows the conversion of the hand-painted image using the conversion matrix H shown in FIG.
- a hand-painted image trk2 in the figure shows a hand-painted image synthesized with the face image of the frame Fx + m through the conversion using the conversion matrix.
- Through the above process, the hand-painted image obtained by tracing the face image in the still image with a finger is mapped to the frame image of the frame Fx + m.
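- As a sketch of this mapping, the transformation between the two feature point groups can be approximated by a least-squares affine transform. The patent's 8 × 8 matrix H is a more general formulation, so the affine model and all names below are simplifying assumptions:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares affine map taking the feature points of frame Fx to the
    corresponding feature points of frame Fx+m."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    ones = np.ones((len(src), 1))
    a, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    return a  # 3x2 matrix: [x, y, 1] @ a -> [x', y']

def map_hand_painted(trajectory, a):
    """Deform a hand-painted trajectory drawn on frame Fx into frame Fx+m."""
    pts = np.asarray(trajectory, dtype=float)
    ones = np.ones((len(pts), 1))
    return np.hstack([pts, ones]) @ a

# Feature points of Fx and of Fx+m (here Fx+m is Fx translated by (2, 3)).
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
dst = [(2.0, 3.0), (3.0, 3.0), (2.0, 4.0), (3.0, 4.0)]
A = estimate_affine(src, dst)
moved = map_hand_painted([(0.5, 0.5)], A)
```

A trajectory point drawn at (0.5, 0.5) on Fx lands at (2.5, 3.5) on Fx+m, i.e. it follows the motion of the surrounding feature points.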
- FIG. 31 shows the process until the still image serving as the base of the operation mode is determined when the mode is switched from the confirmation mode to the operation mode.
- the first level shows the transition of the operation mode setting, the second level the frame images input in real time from the camera 102, the third level the facial part detection based on feature point extraction, and the fourth level the operations that can be accepted on the screen.
- a time point t1 indicates a time point when the operation mode is switched from the confirmation mode to the operation mode.
- the frame image Ft1 is a frame image captured at that time. Since the frame image captures the user at the moment of operating the GUI, the user's line of sight is averted and the face is not facing the front. If the feature point detection finds that the line-of-sight direction is off, a determination result that the facial part detection is NG is given, as shown in the fourth row.
- the frame image Ft5 is a frame image that captures the moment when the user closes his eyes. If it is determined by the feature point detection that the user's eyes are closed, the face part detection is NG as shown in the third row.
- Frame Ft9 is a frame image that captures the user facing the front of the camera. If the feature point extraction reveals that the user is facing the front and the eyes are open, the facial part detection is OK; the frame Ft9 then becomes the base of the operation mode and accepts the user's hand-painting operation.
- FIG. 32 is a flowchart showing a makeup processing procedure for moving images.
- the current mode is set to the operation mode (step S201), and the process proceeds to a loop of steps S202 to S203.
- steps S202 to S203 In this loop, individual frame images constituting the input moving image are displayed (S202), and it is determined whether switching to the still image based confirmation mode has occurred (step S203). If mode switching occurs, the frame image is captured as a still image (step S204), and the process proceeds to a loop composed of steps S205 to S206.
- This loop determines whether a touch on a makeup item has occurred (step S205), whether a touch on the face in the still image has occurred (step S206), and whether switching to the confirmation mode has occurred (step S207).
- If a touch on a makeup item occurs, step S205 becomes Yes and the touched makeup item is selected (step S208). If a touch on the face image occurs, the touched part is specified (step S209), and it is determined whether the finger-painting operation is continuing (step S210). If it continues, drawing of the hand-painted image following the operation is continued (step S211). If the operation is completed, the hand-painted image drawn by the preceding operation is determined and held (step S212).
- If switching from the operation mode to the confirmation mode occurs, step S207 becomes Yes and the process proceeds to step S213.
- Step S213 waits for the input of a subsequent frame image. When one is input, facial feature points are extracted from the subsequent frame image, and the mapping site in that frame image is specified according to these feature points (step S214). Thereafter, the hand-painted image is mapped to the mapping site of the frame image (step S215). Finally, the frame image to which the hand-painted image has been mapped is reproduced (step S216), and the process returns to step S213.
- a still image that is a base for makeup is selected from a plurality of frame images constituting a moving image, and makeup by hand-painting of a coating material is applied to the still image.
- the hand-painted image obtained by such hand-painting is combined with the subsequent frame images. Since users can see how they move after makeup, they are motivated to buy and use the makeup coating material, which can promote sales of cosmetic items.
- SIMD (Single Instruction, Multiple Data)
- the SIMD processor includes n processing elements and n input-n output selectors.
- Each processing element includes an arithmetic unit and a register file.
- the arithmetic operation unit includes a product-sum operation unit including a barrel shifter, a register, a multiplier, and an adder.
- the n input-n output selector executes n inputs / outputs simultaneously.
- Each input/output path in the n-input, n-output selector is defined with a register file in any processing element as its input source and an arithmetic unit in any processing element as its output destination.
- Arithmetic operation units in the n processing elements execute an operation using the elements stored in the register file as operands. Since an arithmetic operation unit exists in each of the n processing elements, n operations are executed in parallel.
- the register file in each processing element contains multiple registers. Since such a register file exists in each of the n processing elements, when n is 16, the image processing apparatus holds a 16 × 16 matrix of elements in 16 × 16 registers.
- the n input-n output selector supplies pixel bit values of n pixels existing in the horizontal direction to any arithmetic operation unit or register file of the n processing elements.
- the operation by the function min(,) is realized by the SIMD processor selecting any two pixels in the lip image, comparing the L components of the two selected pixels, and storing the smaller as Lmin. Specifically, the L components of two pixels are supplied to the arithmetic operation unit of one processing element through the horizontal input/output of the selector; the arithmetic operation unit compares the two, writes the smaller back to the register as Lmin, and the same processing is repeated for the stored Lmin and the L components of the other pixels.
- the operation by the function max(,) in step S34 in FIG. 14(a) is realized by the SIMD processor selecting any two pixels in the lip image, comparing the L components of the two selected pixels, and storing the larger as Lmax. Specifically, the L components of two pixels are supplied to the arithmetic operation unit of one processing element through the horizontal input/output of the selector; the arithmetic operation unit compares the two, writes the larger back to the register as Lmax, and the same processing is repeated for the stored Lmax and the L components of the other pixels.
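- The repeated two-way comparisons that the processing elements perform can be modeled as a pairwise tree reduction. The sketch below mimics that access pattern in NumPy; the padding strategy for odd-length rounds is an implementation choice, not from the source:

```python
import numpy as np

def simd_min_max(l_components):
    """Pairwise tree reduction mimicking how the SIMD processor repeatedly
    compares two L components and keeps the smaller (Lmin) / larger (Lmax)."""
    lmin = np.asarray(l_components, dtype=float)
    lmax = lmin.copy()
    while lmin.size > 1:
        if lmin.size % 2:  # pad odd-length rounds with the last element
            lmin = np.append(lmin, lmin[-1])
            lmax = np.append(lmax, lmax[-1])
        # one "instruction" compares n/2 pairs in parallel
        lmin = np.minimum(lmin[0::2], lmin[1::2])
        lmax = np.maximum(lmax[0::2], lmax[1::2])
    return float(lmin[0]), float(lmax[0])
```

Each round halves the number of candidates, so n values need about log2(n) parallel comparison steps rather than n sequential ones.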
- Since the n-input, n-output selector can supply the L components of n pixels existing in the horizontal direction to one processing element, the arithmetic operation unit of that processing element can perform a product-sum operation on the L components of the supplied pixels. As a result, the calculation of SUM shown in step S35 can be performed at high speed, and by dividing SUM by the number of pixels, aveLip, the average value of the lip region, can also be calculated at high speed.
- the calculation of L ′ [i] shown in the flowchart of FIG. 15B is performed as follows.
- the n processing elements included in the SIMD processor read in parallel the pixel bit values of n pixels existing in the horizontal direction in the range map, and supply them to each of the n processing elements.
- the arithmetic operation unit of each processing element performs the multiply-add process of multiplying the difference between the i-th pixel value L[i] of the original face image and aveLip by β and adding aveStick, and the operation results are written into the memory as the pixels of the made-up face image. Thereby, a face image with makeup is obtained.
- For the calculation of rmap[i] shown in FIG. 14B, the L components of n pixels existing in the horizontal direction in the original face image stored in the memory are read in parallel and stored in the register file.
- the arithmetic operation unit of each processing element executes a subtraction of Lmin from L[i] and a multiplication of the subtraction result by 1/(Lmax − Lmin), and writes the operation result to the memory as the value of each pixel of the range map. Thereby, the range map is obtained.
- the arithmetic unit of each processing element then performs a multiply-add process of multiplying rmap[i], the i-th value of the range map, by rangeL′ and adding L′min, and writes the result to the memory as the pixel bit value. Thereby, a face image with makeup is obtained.
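- Putting the two passes together, the per-pixel work described above (the range map of FIG. 14(b) followed by the multiply-add) is, in vectorized form, roughly the following; the function name is illustrative:

```python
import numpy as np

def make_up_luminance(l_orig, range_l_new, l_min_new):
    """Build the range map rmap[i] = (L[i] - Lmin) / (Lmax - Lmin), then map
    each pixel into the new color range as rmap[i] * rangeL' + L'min."""
    l_orig = np.asarray(l_orig, dtype=float)
    lmin, lmax = l_orig.min(), l_orig.max()
    rmap = (l_orig - lmin) / (lmax - lmin)   # normalized range map
    return rmap * range_l_new + l_min_new    # mapped luminance

l_new = make_up_luminance([20.0, 40.0, 60.0], range_l_new=80.0, l_min_new=10.0)
```

Because each output pixel depends only on its own input pixel (plus the two scalars Lmin and Lmax), the mapping parallelizes directly across the n processing elements.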
- Makeup coating materials are not limited to virtualized cosmetic items; color contact lenses, earrings, and ear or nose piercings may be used. Although the object including the target site for makeup has been described as a human face, it may be a hand, and the coating material may be nail polish. Further, the makeup target may be a pet animal instead of a human being, and the makeup by the image processing apparatus may be a car dress-up.
- the makeup simulator may have a function of creating a color variation from the lipstick item. These color variations are created by selecting a lipstick tool, selecting a color sample, and replacing the H and C components of the application color of the lip item with the H and C components of this color sample.
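- In an HCL triple this replacement keeps only the luminance of the original application color. A minimal sketch; the tuple layout (H, C, L) and the function name are assumptions:

```python
def color_variation(lip_color_hcl, sample_hcl):
    """Replace the H and C components of the lip item's application color
    with those of the selected color sample, keeping the original L."""
    _, _, l = lip_color_hcl
    sample_h, sample_c, _ = sample_hcl
    return (sample_h, sample_c, l)

# Lip item's application color vs. a chosen color sample.
variant = color_variation((350.0, 60.0, 40.0), (20.0, 80.0, 70.0))
```

Keeping L while swapping H and C changes the hue and vividness of the lipstick without disturbing the highlight/shadow structure already encoded in the luminance channel.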
- the color spaces to be converted have been described as RGB and HCL.
- however, any other three-channel color space of hue, saturation, and luminance may be used.
- HSV or HSL may be used.
- any one of a plurality of frame images constituting a moving image is selected and hand-painted makeup is performed.
- the part selected by the user on the initial screen is automatically detected from each frame image. Makeup may be implemented by extracting the corresponding part in each frame image and repainting it with the cosmetic item.
- Equations 1 through 5 do not denote mathematical concepts but numerical operations executed on a computer, and it goes without saying that modifications necessary for a computer to realize them are applied. For example, saturation operations or positive-value conversion operations for handling numerical values in integer, fixed-point, or floating-point types may be performed. Further, in the arithmetic processing based on the mathematical expressions shown in the embodiments, multiplication by a constant can be realized by a ROM multiplier using a constant ROM, in which the products of the multiplicand and the constant are calculated and stored in advance.
- When the multiplicand is 16 bits long, the multiplicand is divided into four 4-bit slices, and the product of each 4-bit slice and the constant, i.e., multiples of 0 to 15 of the constant, is stored in the above constant ROM.
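- The constant-ROM scheme can be sketched directly: precompute the constant's multiples 0 to 15, then assemble the product from the four 4-bit slices of the multiplicand. Names are illustrative:

```python
def rom_multiply(multiplicand, constant):
    """Multiply a 16-bit multiplicand by a constant using a lookup table of
    the constant's multiples 0..15, summing the shifted partial products."""
    rom = [constant * k for k in range(16)]       # the pre-stored products
    product = 0
    for slice_index in range(4):                  # four 4-bit slices
        nibble = (multiplicand >> (4 * slice_index)) & 0xF
        product += rom[nibble] << (4 * slice_index)
    return product
```

Each slice contributes rom[nibble] shifted by its bit position, so the full product is recovered from four table lookups and three additions instead of a hardware multiplier.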
- the term "arithmetic processing" in this specification does not mean pure arithmetic operation only; it also includes the reading of a recording medium such as a ROM in which previously calculated results are stored, the result being read according to the value of the operand.
- the tablet and the image processing apparatus may be connected via a network.
- the image processing apparatus receives a self-portrait (original image data) from the camera of the display device via the network and performs a makeup process.
- the image that is the makeup result is output to an external device for display.
- the image processing apparatus may perform image processing for make-up on a plurality of user images obtained by photographing a plurality of users.
- cloud server: a server on a cloud network
- the hypervisor activates the guest operating system (guest OS) in the cloud server.
- guest OS: guest operating system
- an application program for processing the configuration requirements of the image processing apparatus is loaded from the intra network existing in the company to the cloud server. By such loading, the processing described up to the above embodiments is executed for big data.
- each of the above devices is a computer system including a microprocessor, ROM, RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM or hard disk unit.
- Each device achieves its functions by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- Part or all of the constituent elements constituting each of the above-described devices may be configured by one system LSI (Large Scale Integration).
- the system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip.
- the system LSI is a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- the integrated circuit architecture is composed of: (1) a front-end processing circuit, composed of a preprogrammed DMA master circuit and the like, that performs stream processing in general; (2) a signal processing circuit, composed of a SIMD processor and the like, that performs signal processing in general; (3) a back-end circuit that performs pixel processing, image superposition, resizing, image format conversion, and AV output processing in general; (4) a media interface circuit that serves as an interface with the drive and the network; and (5) a memory controller circuit, a slave circuit for memory access, that reads and writes packets and data in response to requests from the front-end unit, the signal processing unit, and the back-end unit.
- QFP (Quad Flat Package)
- PGA (Pin Grid Array)
- the IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- the IC card or the module may include the super multifunctional LSI described above.
- the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
- the present invention may be the method described above. Further, the present invention may be a computer program that realizes these methods by a computer, or may be a digital signal composed of the computer program.
- the present invention also provides a computer-readable recording medium storing the computer program or the digital signal, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray (registered trademark) Disc), or a semiconductor memory.
- the digital signal may be recorded on these recording media.
- the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
- the present invention may be a computer system including a microprocessor and a memory, wherein the memory stores the computer program, and the microprocessor may operate according to the computer program.
- the program or the digital signal may be recorded on the recording medium and transferred, or transferred via the network or the like, and may be implemented by another independent computer system.
- the program code that causes the computer to perform the processes of FIGS. 12 to 16 and FIGS. 17 to 23 can be created as follows.
- a software developer uses a programming language to write a source program that implements each flowchart and functional components.
- the software developer describes a source program that embodies each flowchart and functional components using a class structure, a variable, an array variable, and an external function call according to the syntax of the programming language.
- the described source program is given to the compiler as a file.
- the compiler translates these source programs to generate an object program.
- the compiler's translation consists of processes such as syntax analysis, optimization, resource allocation, and code generation.
- In syntax analysis, lexical analysis, parsing, and semantic analysis of the source program are performed, and the source program is converted into an intermediate program.
- In optimization, operations such as basic block formation, control flow analysis, and data flow analysis are performed on the intermediate program.
- In resource allocation, in order to adapt to the instruction set of the target processor, variables in the intermediate program are allocated to registers or memory of the target processor.
- In code generation, each intermediate instruction in the intermediate program is converted into program code to obtain an object program.
- the object program generated here is composed of one or more program codes that cause a computer to execute the steps of the flowcharts shown in the embodiments and the individual procedures of the functional components.
- the program codes may take various forms, such as processor-native code and JAVA (registered trademark) bytecode.
- a call statement that calls the external function becomes a program code.
- a program code that realizes one step may belong to different object programs.
- each step of the flowchart may be realized by combining arithmetic operation instructions, logical operation instructions, branch instructions, and the like.
- When the object programs are generated, the programmer activates the linker for them.
- the linker allocates these object programs and related library programs to a memory space, and combines them into one to generate a load module.
- the load module generated in this manner is premised on reading by a computer, and causes the computer to execute the processing procedures and the functional component processing procedures shown in each flowchart.
- Such a computer program may be recorded on a non-transitory computer-readable recording medium and provided to the user.
- the image processing apparatus is characterized by comprising: designation means for designating, from the object shown in the original image, a portion to which the makeup coating material is to be applied;
- a generating unit that generates a coating material application layer composed of a plurality of pixels having a color range whose representative color is a sample color of the makeup coating material; and a synthesis unit that combines the coating material application layer with the original image.
- the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the portion of the object, according to the ratio between the representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and the value of the sample color pixel of the makeup coating material.
- the pixel value of each pixel of the coating material application layer is obtained by mapping the pixel value of the positionally corresponding pixel of the original image to the color range of the coating material application layer.
- Since the color range of the part image after application is obtained by expanding the color range of the original image, and the pixel bit value of each pixel of the coating material application layer is obtained from the pixel bit value of the positionally corresponding pixel of the original image, the pixel bit values of the plurality of pixels constituting the coating material application layer have enhanced contrast along the shape of the target portion in the original image, and this contrast appears in the coating material application layer.
- contrast enhancement makes it possible to realistically reproduce a state in which a makeup coating material having optical characteristics is applied to a target portion and ambient light is illuminated.
- the part to be applied means a part of the surface area of the building, creature, vehicle, still life, or human body shown in the image.
- Makeup coating materials when the target is a human body include lipstick, hair spray, nail polish, eye shadow, and foundation.
- Examples of the makeup coating material when the object is a building, a vehicle, or a still life include paint, spray paint, and fluorescent paint. Describing embodiments of all of these subordinate concepts would complicate the explanation, which is not desirable; therefore, the embodiments have been described on the assumption that the application target is the lips of a human body and that the makeup coating material is lipstick.
- the portion designated by the designation means in the original image includes pixels of a highlight portion produced by illumination with ambient light and pixels of a shadow portion produced where the ambient light is blocked.
- the representative value of the plurality of pixels constituting the original image portion is calculated by applying statistical processing to a plurality of pixels including the highlight portion pixels and the shadow portion pixels.
- in the prior art of Patent Document 1, rewriting of highlighted or shadowed portions is suppressed, whereas in the image processing apparatus of the above aspect, expanding the color range makes the highlighted and shadowed portions also subject to pixel-bit-value rewriting.
- consequently, the joints between the locations where highlights exist or shadows form in the original image and the other locations do not become unnatural.
- here, the statistical processing broadly includes average calculation, weighted average calculation, variance calculation, deviation calculation, and the like.
- the mapping by the generation means can be developed into a more specific form: a generalization of the range-map formula of the first embodiment (Formula 5). Specifically, the image processing apparatus further acquires a distribution map indicating how pixel values are distributed in the plane coordinate system corresponding to the original image, and the mapping by the generation means is performed by weighting the color range using the positionally corresponding value shown in the distribution map as a weighting coefficient and adding an offset to the weighted value. The distribution map shows what the pixel distribution in the applied image should be, and the determination of the pixel bit values of the coating material application layer is governed by this map, so the brightness distribution in the coating material application layer is expressed more realistically.
- the weight values and offsets shown in the distribution map can be developed into more specific forms: the limitation by rmap[i]*rangeL' and L'min shown in the first embodiment.
- the value at each pixel position indicated by the distribution map is the difference between the value of the positionally corresponding pixel among the plurality of pixels constituting the portion of the original image to which the coating material is to be applied and the lower limit value of the colors constituting the portion, normalized using the color range of the portion; the offset is the lower limit value of the color range of the repaint color.
- since the value of the distribution map is the difference from the lower limit normalized by the color range, multiplying it by the color range length gives the corresponding part of the coating material application layer a brightness distribution closer to the real thing. This makes possible a makeup that could be mistaken for the real thing.
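The distribution-map mechanism just described can be sketched as follows. This is an assumed reading of the rmap[i]*rangeL' + L'min description, not the literal Formula 5; the helper names are hypothetical.

```python
def build_rmap(pixels, l_min, l_max):
    # Distribution-map value per pixel: offset from the part's lower limit,
    # normalized by the part's color range (assumed form).
    range_len = l_max - l_min
    return [(v - l_min) / range_len for v in pixels]

def apply_rmap(rmap, new_min, new_range_len):
    # Weight the layer's color range length by the map value, then add the
    # layer's lower limit as the offset.
    return [new_min + w * new_range_len for w in rmap]
```

Reading the map from a sample file, as the next bullets describe, would simply replace `build_rmap` with a file load.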
- a sample file is recorded on a recording medium accessible to the image processing apparatus; the sample file is a sample of a distribution mode, i.e., of how pixel values are distributed in a plane.
- the distribution map is acquired by the generation means by reading the sample file from the recording medium. Since patterns of the planar distribution of pixel bit values are sampled and recorded on the recording medium in the form of sample files, the image processing apparatus can enjoy various makeup variations by reading the sample file for the desired makeup mode.
- the conversion of pixel bit values can be developed into a specific form using bit operations: a generalization of Formula 6 in the first embodiment. Specifically, the conversion is obtained by multiplying, by a predetermined coefficient, a relative value indicating what proportion the value of the positionally corresponding pixel, among the plurality of pixels constituting the portion of the original image to which the coating material is to be applied, occupies in the entire color range of the plurality of pixels constituting the portion, and adding the offset.
- the process of multiplying by the weighting coefficient and adding the offset can be implemented in hardware by a product-sum operation circuit using shift registers. Such hardware enables parallel and pipelined processing of a plurality of pixels.
- the relative value, weighting coefficient, and offset in the above aspect can be developed into more subordinate concepts: the limitation by (L[i]-Lave), β, and L'ave in Formula 6. The relative value is the difference between the value of a representative-color pixel in the portion of the original image to which the coating material is to be applied and the value of the positionally corresponding pixel among the plurality of pixels constituting the portion, normalized using the color range length of the pixels constituting the portion; the predetermined coefficient is the color range length of the plurality of pixels representing the portion; and the offset is the lower limit value of the color range of the colors constituting the portion.
- since the relative value indicates how far the pixel bit value of the target pixel departs from the representative color in the original image, mapping this relative value into the color range of the coating material application layer and adding the lower limit of that color range allows the part in the coating material application layer to reproduce a realistic texture that does not impair the way the original image is lit.
- the upper limit and lower limit of the color range in the coating material application layer can be developed into a subordinate concept: the limitation by Lmin and Lmax in the first embodiment.
- the upper limit value of the color range of the coating material application layer is calculated by multiplying the ratio between (a) the difference between the representative value of the plurality of pixels constituting the portion of the original image to which the coating material is to be applied and the upper limit value of the color range of that portion and (b) the range length of the color range of the pixels constituting the portion, by the range length of the color range of the coating material application layer, and adding the value of the sample color pixel of the coating material.
- the lower limit value of the color range of the coating material application layer is calculated by subtracting the range length of the color range of the coating material application layer from its upper limit value.
- since the upper limit of the color range in the coating material application layer is determined by multiplying the ratio between the difference from the representative color of the original image to its upper limit and the range length of the color range of the original image by the range length of the application layer, the difference between the original image and the representative color is suitably reflected in the color range of the coating material application layer.
- further, since the lower limit of the color range of the application layer is determined by subtracting the color range length of the application layer from the upper limit thus determined, the way dark portions darken also follows the ratio between the representative color of the original image and the representative color of the application layer. This makes it possible to realistically reproduce the state in which the makeup coating material is applied to the photographed user's face.
- the color range can be developed into a subordinate concept: a generalization of the fact that the color range is defined individually for H, C, and L.
- specifically, the color range of the portion of the original image to which the coating material is to be applied is the numerical range of a specific type among the plural types of pixel components contained in the pixels constituting the portion.
- the portion of the original image to which the coating material is to be applied includes a plurality of channel images.
- the values of the pixels constituting the coating material application layer are obtained by mapping the individual pixels of the channel image composed of the specific type of pixel component into the color range of the coating material application layer.
- since the color range is calculated for a specific one of the plural pixel components constituting a pixel, when some component should not be converted, excluding that component from processing preserves identity with the original image.
- the manner of synthesis by the synthesis means can be developed into a subordinate concept: a generalization of the synthesis parameters being defined individually for H, C, and L.
- specifically, the image processing apparatus has a synthesis parameter for each component type, and the synthesis by the synthesis means is performed by weighting the pixel components of a specific type among the plural channel images included in the coating material application layer and the pixel components of that type among the plural channel images, using the synthesis parameter corresponding to the type, and adding them together. Since the weighting coefficients given to the pixels of the original image and of the coating material application layer are varied per pixel component, an application layer in which, for example, the saturation and luminance components are emphasized can be obtained.
- the plural types of pixel components can be developed into a subordinate concept: the plural types are hue, saturation, and luminance, and the specific type of pixel component is any one of hue, saturation, and luminance, or a combination of two or more of them. By taking the saturation and luminance components as the targets of color range generation, mapping, and synthesis, a natural makeup image that maintains the identity of the hue component can be obtained.
- the image processing apparatus includes first conversion means for converting the first, second, and third pixel components constituting individual pixels in the portion of the original image to which the coating material is to be applied into a hue component, a saturation component, and a luminance component.
- the apparatus also includes second conversion means for converting the hue, saturation, and luminance components of the pixels constituting the portion with which the coating material application layer has been combined back into first, second, and third pixel components.
- since an original face image composed of RGB pixel components or an original image composed of YCrCb pixel components can thus be made up, image input from various types of cameras can be used as the makeup target.
- the locations where the weighting should be lowered can be defined more specifically: a subordinate concept of the weighting for the base shape.
- the part to which the coating material is to be applied is the lips.
- the designation means, in designating the part to be subjected to image processing, generates a contour shape line that defines the contour of the designated part.
- the generation means, in determining the color range of the individual pixels constituting the coating material application layer, identifies, among the contour shape lines defining the lip contour, the base shape line that defines the shape of the lip base.
- the weights of the lip pixels located around the identified base shape line are lowered. Because low weighting coefficients give shading to both sides of the lips forming the mouth corners and to the boundary between the upper and lower lips, the lip contour stands out and the appearance improves.
- the method in this aspect generates, by performing image processing on the original image, a makeup image showing a state in which the makeup coating material is applied to a part of the object shown in the original image.
- the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object, according to the ratio between the representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and the value of the sample color pixel of the makeup coating material.
- the values of the pixels of the coating material application layer are obtained by mapping the values of the positionally corresponding pixels of the original image into the color range of the coating material application layer.
- this method aspect can incorporate the improvements 2. through 13. described above.
- since such an image processing method can be used at sites used by enterprise users or by end users, the applications of the method invention belonging to the technical scope of the present application can be broadened.
- "computer system" broadly includes a computer system formed by a client computer and a server computer, a cloud network computer system in which the application is loaded on a cloud server, a computer system in which computers make peer-to-peer connections, and a grid computer system in which computers function as a grid and perform distributed processing.
- "terminal devices" broadly include laptop computers, notebook computers, smartphones, tablet terminals, and cash-register terminals operated with a pointing device such as a mouse or a keypad. Describing embodiments of all these subordinate concepts would complicate the explanation, which is undesirable; therefore, in the first embodiment, the tablet terminal is the terminal device.
- the program in this aspect causes a computer to execute processing that performs image processing on the original image to generate a makeup image showing a state in which the makeup coating material is applied to a part of the object shown in the original image.
- the program includes program code causing the computer to designate, from the object shown in the original image, a part to which the makeup coating material is to be applied, and to generate a coating material application layer composed of a plurality of pixels having a color range whose representative color is the sample color of the makeup coating material.
- the coating material application layer is an image to be combined with the original image, and the color range of the plurality of pixels constituting it is obtained by expanding the color range of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied.
- the values of the pixels of the coating material application layer are obtained by mapping the values of the positionally corresponding pixels of the original image into the color range of the coating material application layer.
- the present invention relates to the field of face image processing, and is particularly useful as an image processing apparatus for makeup simulation that applies lipstick to a face image using color information.
Abstract
Description
The inventors faced various technical barriers in implementing the present invention. The steps taken to overcome them are described below.
This embodiment discloses a makeup simulator that performs image processing on the L-channel image, composed of the L component among the H, C, and L components constituting each pixel in HCL space. This makeup simulator embodies the constituent elements of the above image processing apparatus using the hardware resources of a tablet terminal.
In the first embodiment, of the H, C, and L channels constituting the lips, only the L channel was the target of color range conversion and pixel bit value mapping; this embodiment also performs color range conversion and pixel bit value mapping on the H and C channels. The configuration of the image processing apparatus for this improvement is as shown in FIG. 17. In FIG. 17, the same reference numerals are used for the same constituent elements as in FIG. 6, and their description is omitted.
This relates to an improvement for giving the makeup face image highlights and shadows that the user prefers. Compared with the first embodiment, the image processing apparatus according to the third embodiment has the following differences.
This embodiment varies the weight value given to each pixel according to the contour shape of the lips.
This embodiment relates to an improvement that realizes makeup operations on moving images.
This embodiment discloses an improvement that uses a SIMD (Single Instruction Multiple Data) processor to calculate the range map values and L'[i].
The best embodiments known to the applicant as of the filing of the present application have been described above, but further improvements and modifications can be made to the technical topics shown below. Note that whether to implement as shown in each embodiment or to apply these improvements and modifications is entirely optional and left to the discretion of the implementer.
The makeup coating material is not limited to items that virtualize cosmetics. It may be colored contact lenses, earrings, or ear or nose piercings. Although the object containing the makeup target part was a human face, it may be a hand, and the coating material may be nail polish. The makeup object may also be a pet animal rather than a human. Furthermore, the makeup by the image processing apparatus may be the dressing-up of an automobile.
The makeup simulator may have a function of creating color variations from a lipstick item. These color variations are created by selecting a color sample after selecting the lipstick tool and replacing the H and C components of the application color of the lip item with the H and C components of the color sample.
The internal configurations, flowcharts, and operation examples of the apparatuses disclosed in the embodiments took as their processing target "the lips of a human face", one subordinate concept of the term "a part of the object". Likewise, they took as their processing target "lipstick", one subordinate concept of the term "makeup coating material". However, other subordinate concepts encompassed by these terms may also be processed: which subordinate concept is processed is merely a difference in the content of the data material and brings no change to the configuration of the apparatus or the processing.
Although the color spaces converted by the color space conversion unit 2 and the color space inverse conversion unit 7 were described as RGB and HCL, another three-channel color space of hue, saturation, and luminance, such as HSV or HSL, may be used instead of the HCL color space.
In the range map, the ratio of the current pixel value to the average value, L[i]/Lave, may also be used.
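This alternative range-map value is a one-liner; the sketch below is an illustration of the L[i]/Lave variant only, with hypothetical names.

```python
def rmap_ratio(pixels, l_ave):
    # Range-map variant: ratio of each pixel value to the part's average,
    # instead of the normalized offset from the lower limit.
    return [v / l_ave for v in pixels]
```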
In the second embodiment, one of the plurality of frame images constituting the moving image was selected and hand-painted makeup was applied; alternatively, the part selected by the user on the initial screen may be automatically extracted from each frame image, and makeup may be realized by repainting the corresponding part in each frame image with the makeup item.
The expressions of Formulas 1 to 5 denote not mathematical concepts but numerical operations executed on a computer, so modifications necessary for realization on a computer are of course applied; for example, saturation operations and positive-value conversion for handling values as integer, fixed-point, or floating-point types may naturally be performed. Further, of the formula-based operations shown in the embodiments, multiplication by a constant can be realized by a ROM multiplier using a constant ROM, in which the products of the multiplicand and the constant are precomputed and stored. For example, when the multiplicand is 16 bits long, it is divided into four 4-bit sections, and the products of each 4-bit section and the constant, that is, the multiples 0 to 15 of the constant, are stored in the constant ROM. The product of one 4-bit section and a 16-bit constant is 20 bits long, and since the four constants are stored at the same address, one word is 20 × 4 = 80 bits long. Since realization by a ROM multiplier is thus possible, "arithmetic processing" in this specification does not mean only pure arithmetic operations; it also encompasses reading a recording medium such as a ROM, in which stored operation results are read out according to the values of the operands.
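The ROM-multiplier scheme described above can be modeled directly: precompute the constant's multiples 0 to 15, split the 16-bit multiplicand into four 4-bit nibbles, and recombine the looked-up partial products with shifts. This is a software model of the circuit, not an implementation from the patent.

```python
def make_rom(constant):
    # "Constant ROM": precomputed products constant * 0 .. constant * 15.
    return [constant * n for n in range(16)]

def rom_multiply(rom, multiplicand16):
    # Split the 16-bit multiplicand into four 4-bit nibbles, look up each
    # partial product in the ROM, and recombine them with 4-bit shifts.
    acc = 0
    for i in range(4):
        nibble = (multiplicand16 >> (4 * i)) & 0xF
        acc += rom[nibble] << (4 * i)
    return acc
```

Because each nibble contributes `constant * nibble * 16**i`, the recombined sum equals the ordinary product.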
The tablet and the image processing apparatus may be connected via a network. In this case, the image processing apparatus receives, via the network, the self-portrait captured by the camera of the display device (the original image data) and performs the makeup processing, then outputs the resulting makeup image to the external device for display.
The image processing apparatus may apply makeup image processing to the person images of multiple users obtained by photographing them. In this case, since many person images must be processed, it is desirable to operate the image processing apparatus on a cloud network server (cloud server) capable of processing the face images of an unspecified number of users as big data. When makeup start is ordered in the cloud network, the hypervisor boots a guest operating system (guest OS) on the cloud server. After the guest OS has booted, an application program that performs the processing of the constituent elements of the image processing apparatus is loaded onto the cloud server from the intranet inside the enterprise. By this loading, the processing described in the embodiments so far is executed on the big data.
Specifically, each of the above apparatuses is a computer system composed of a microprocessor, ROM, RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like. A computer program is stored in the RAM or the hard disk unit, and each apparatus achieves its functions by the microprocessor operating in accordance with that computer program. Here, the computer program is configured by combining multiple instruction codes indicating commands to the computer in order to achieve predetermined functions.
Some or all of the constituent elements constituting each of the above apparatuses may be configured as a single system LSI (Large Scale Integration). A system LSI is a super-multifunctional LSI manufactured by integrating multiple components on one chip; specifically, it is a computer system including a microprocessor, ROM, RAM, and so on. A computer program is stored in the RAM, and the system LSI achieves its functions by the microprocessor operating in accordance with that program. The architecture of the integrated circuit consists of (1) a front-end processing circuit composed of preprogrammed DMA master circuits and the like, executing stream processing in general; (2) a signal processing circuit composed of SIMD processors and the like, executing signal processing in general; (3) a back-end circuit performing pixel processing, image superimposition, resizing, image format conversion, and AV output processing in general; (4) a media interface circuit that interfaces with drives and networks; and (5) a memory controller circuit, a slave circuit for memory access that reads and writes packets and data in response to requests from the front-end, signal processing, and back-end units. Focusing on package types, system LSIs come in types such as QFP (Quad Flat Package) and PGA (Pin Grid Array). A QFP is a system LSI with pins attached to the four side faces of the package; a PGA is a system LSI with many pins attached to the entire bottom face.
Some or all of the constituent elements constituting each of the above apparatuses may be configured as an IC card detachable from the apparatus or as a stand-alone module. The IC card or module is a computer system composed of a microprocessor, ROM, RAM, and so on, and may include the above super-multifunctional LSI. The IC card or module achieves its functions by the microprocessor operating in accordance with a computer program, and may be tamper-resistant.
The present invention may be the methods shown above. It may be a computer program that realizes these methods by a computer, or a digital signal composed of the computer program. The present invention may also be the computer program or the digital signal recorded on a computer-readable recording medium, for example, a flexible disk, hard disk, CD-ROM, MO, DVD, DVD-ROM, DVD-RAM, BD (Blu-ray (registered trademark) Disc), or semiconductor memory, or may be the digital signal recorded on such a recording medium.
The above embodiments and the above modifications may be combined with one another.
The technical ideas extracted from the specific forms of the image processing apparatus shown in the first to sixth embodiments form the following system of 1., 2., 3., 4., and so on. In this system, 1. is the basic aspect forming the trunk, and 2., 3., and onward are its derivative forms.
Here, the image processing apparatus forming the trunk of the above system comprises designation means for designating, from the object shown in the original image, a part to which the makeup coating material is to be applied; generation means for generating a coating material application layer composed of a plurality of pixels having a color range whose representative color is the sample color of the makeup coating material; and synthesis means for combining the coating material application layer with the original image.
It is characterized in that the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to the ratio between the representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and the value of the sample color pixel of the makeup coating material, and in that the values of the pixels of the coating material application layer are obtained by mapping the values of the positionally corresponding pixels of the original image into the color range of the coating material application layer.
In the above basic aspect, to eliminate the unnaturalness of the joints between the locations where highlights exist or shadows form and the other locations, the part image can be configured as follows.
While the prior art described in Patent Document 1 suppresses the rewriting of highlighted or shadowed portions, the image processing apparatus of the above aspect expands the color range so that the highlighted and shadowed portions are also subject to pixel-bit-value rewriting; therefore, in the present invention, the joints between the locations where highlights exist or shadows form in the original image and the other locations do not become unnatural. In determining the representative color, highlights and shadows are not excluded: the representative color is obtained by applying statistical processing to a pixel group that includes them, so the applied coating material application layer produces a more natural contrast change. Here, the statistical processing broadly includes average calculation, weighted average calculation, variance calculation, deviation calculation, and the like.
The mapping by the generation means can be developed into a more specific form: a generalization of the range-map formula of the first embodiment (Formula 5). Specifically, the image processing apparatus further acquires a distribution map indicating how pixel values are distributed in the plane coordinate system corresponding to the original image, and the mapping by the generation means is performed by weighting the color range using the positionally corresponding value shown in the distribution map as a weighting coefficient and adding an offset to the weighted value.
The distribution map shows what the pixel distribution in the applied image should be, and the determination of the pixel bit values of the coating material application layer is governed by this map, so the brightness distribution in the coating material application layer is expressed more realistically.
The weight values and offsets shown in the distribution map can be developed into more specific forms: the limitation by rmap[i]*rangeL' and L'min shown in the first embodiment.
How the distribution map is acquired can also be made more specific: the range map is acquired as a sample file.
The conversion of pixel bit values can be developed into a specific form using bit operations: a generalization of Formula 6 in the first embodiment. Specifically, the conversion is obtained by multiplying, by a predetermined coefficient, a relative value indicating what proportion the value of the positionally corresponding pixel, among the plurality of pixels constituting the portion of the original image to which the coating material is to be applied, occupies in the entire color range of the plurality of pixels constituting the portion, and adding the offset. The process of multiplying by a weighting coefficient and adding an offset can be implemented in hardware by a product-sum operation circuit using shift registers, enabling parallel and pipelined processing of multiple pixels.
The relative value, weighting coefficient, and offset in aspect 6. can be developed into more subordinate concepts: the limitation by (L[i]-Lave), β, and L'ave in Formula 6. The relative value is the difference between the value of a representative-color pixel in the portion of the original image to which the coating material is to be applied and the value of the positionally corresponding pixel among the plurality of pixels constituting the portion, normalized using the color range length of the pixels constituting the portion; the predetermined coefficient is the color range length of the plurality of pixels representing the portion; and the offset is the lower limit value of the color range of the colors constituting the portion. Since the relative value indicates how far the pixel bit value of the target pixel departs from the representative color in the original image, mapping this relative value into the color range of the coating material application layer and adding the lower limit of that color range allows the part in the coating material application layer to reproduce a realistic texture that does not impair the way the original image is lit.
Here, the upper and lower limits of the color range in the coating material application layer can be developed into a subordinate concept: the limitation by Lmin and Lmax in the first embodiment. Specifically, the upper limit value of the color range of the coating material application layer is calculated by multiplying the ratio between the difference between the representative value of the plurality of pixels constituting the portion of the original image to which the coating material is to be applied and the upper limit value of the color range of that portion, and the range length of the color range of the pixels constituting the portion, by the range length of the color range of the coating material application layer, and adding the value of the sample color pixel of the coating material.
The lower limit value of the color range of the coating material application layer is calculated by subtracting the range length of the color range of the application layer from its upper limit value. Since the upper limit is determined by multiplying the ratio between the difference from the representative color of the original image to its upper limit and the range length of the original image's color range by the range length of the application layer, the difference between the original image and the representative color is suitably reflected in the color range of the application layer. Further, since the lower limit is determined by subtracting the application layer's color range length from the upper limit thus determined, the way dark portions darken also follows the ratio between the representative color of the original image and the representative color of the application layer. This makes it possible to realistically reproduce the state in which the makeup coating material is applied to the photographed user's face.
The color range can be developed into a subordinate concept: a generalization of the fact that the color range is defined individually for H, C, and L. Specifically, the color range of the portion of the original image to which the coating material is to be applied is the numerical range of a specific type among the plural types of pixel components contained in the pixels constituting the portion; the portion includes a plurality of channel images; and the values of the pixels constituting the coating material application layer are obtained by mapping the individual pixels of the channel image composed of the specific type of pixel component into the color range of the coating material application layer. Since the color range is calculated for a specific one of the plural pixel components constituting a pixel, if some component should not be converted, excluding that component from processing preserves identity with the original image.
The manner of synthesis by the synthesis means can be developed into a subordinate concept: a generalization of the synthesis parameters being defined individually for H, C, and L. Specifically, the image processing apparatus has a synthesis parameter for each component type, and the synthesis by the synthesis means is performed by weighting the pixel components of a specific type among the plural channel images included in the coating material application layer and the pixel components of that type among the plural channel images, using the synthesis parameter corresponding to the type, and adding them together. Since the weighting coefficients given to the pixels of the original image and of the coating material application layer are varied per pixel component, an application layer in which, for example, the saturation and luminance components are emphasized can be obtained.
The plural types of pixel components can be developed into a subordinate concept: the plural types are hue, saturation, and luminance, and the specific type is any one of them, or a combination of two or more. By taking the saturation and luminance components as the targets of color range generation, mapping, and synthesis, a natural makeup image that maintains the identity of the hue component can be obtained.
Optional constituent elements can be added to the image processing apparatus. Specifically, the apparatus comprises first conversion means for converting the first, second, and third pixel components constituting individual pixels in the portion of the original image to which the coating material is to be applied into a hue component, a saturation component, and a luminance component, and
second conversion means for converting the hue, saturation, and luminance components of the pixels constituting the portion with which the coating material application layer has been combined back into first, second, and third pixel components. Since an original face image composed of RGB pixel components or an original image composed of YCrCb pixel components can thus be made up, image input from various types of cameras can be used as the makeup target.
The locations where the weighting should be lowered can be defined more specifically: a subordinate concept of the weighting for the base shape. Specifically, the part to which the coating material is to be applied is the lips; when designating the part to be image-processed, the designation means generates a contour shape line defining the contour of the designated part; and in determining the color range of the individual pixels constituting the coating material application layer, the generation means identifies, among the contour shape lines defining the lip contour, the base shape line defining the shape of the lip base, and lowers the weights of those lip pixels located around the identified base shape line. Because low weighting coefficients give shading to both sides of the lips forming the mouth corners and to the boundary between the upper and lower lips, the lip contour stands out and the appearance improves.
When implemented as a method invention, the method in that aspect is an image processing method in a computer system that generates, by performing image processing on an original image, a makeup image showing a state in which a makeup coating material is applied to a part of the object shown in the original image, the method comprising designating, from the object shown in the original image, a part to which the makeup coating material is to be applied, and generating a coating material application layer composed of a plurality of pixels having a color range whose representative color is the sample color of the makeup coating material, the coating material application layer being an image to be combined with the original image.
The color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to the ratio between the representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and the value of the sample color pixel of the makeup coating material, and the values of the pixels of the coating material application layer are obtained by mapping the values of the positionally corresponding pixels of the original image into the color range of the coating material application layer. This method aspect can incorporate the improvements 2. through 13. described above. Since such an image processing method can be used at sites used by enterprise users or by end users, the applications of the method invention belonging to the technical scope of the present application can be broadened.
When implemented as a program, the program in that aspect causes a computer to execute processing for generating, by performing image processing on an original image, a makeup image showing a state in which a makeup coating material is applied to a part of the object shown in the original image, and includes program code causing the computer to designate, from the object shown in the original image, a part to which the makeup coating material is to be applied and to generate a coating material application layer composed of a plurality of pixels having a color range whose representative color is the sample color of the makeup coating material.
The coating material application layer is an image to be combined with the original image; the color range of the plurality of pixels constituting it is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to the ratio between the representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and the value of the sample color pixel of the makeup coating material.
The values of the pixels of the coating material application layer are obtained by mapping the values of the positionally corresponding pixels of the original image into the color range of the coating material application layer. This program aspect can incorporate the improvements 2. through 13. described above. Since the program can be distributed through network provider servers and various recording media, the applications of the present invention can be extended to the general computer software and online service industries.
12 color space conversion unit
13 temporary memory
14 lip color range calculation unit
15 lipstick application layer generation unit
16 synthesis processing unit
17 color space inverse conversion unit
Claims (15)
- 1. An image processing apparatus that generates, by performing image processing on an original image, a makeup image showing a state in which a makeup coating material is applied to a part of an object shown in the original image, the apparatus comprising:
designation means for designating, from the object shown in the original image, a part to which the makeup coating material is to be applied;
generation means for generating a coating material application layer composed of a plurality of pixels having a color range whose representative color is a sample color of the makeup coating material; and
synthesis means for combining the coating material application layer with the original image,
wherein the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to a ratio between a representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and a value of a sample color pixel of the makeup coating material, and
the values of the pixels of the coating material application layer are obtained by mapping the values of positionally corresponding pixels of the original image into the color range of the coating material application layer.
- 2. The image processing apparatus according to claim 1, wherein the portion designated by the designation means in the original image includes pixels of a highlight portion produced by illumination with ambient light and pixels of a shadow portion produced where the ambient light is blocked, and
the representative value of the plurality of pixels constituting the original image portion is calculated by applying statistical processing to a plurality of pixels including the highlight portion pixels and the shadow portion pixels.
- 3. The image processing apparatus according to claim 1, wherein the image processing apparatus further acquires a distribution map indicating how pixel values are distributed in a plane coordinate system corresponding to the original image, and
the mapping by the generation means is performed by weighting the color range using the positionally corresponding value shown in the distribution map as a weighting coefficient and adding an offset to the weighted value.
- 4. The image processing apparatus according to claim 3, wherein the value at each pixel position indicated by the distribution map is the difference between the value of the positionally corresponding pixel among the plurality of pixels constituting the portion of the original image to which the coating material is to be applied and the lower limit value of the colors constituting the portion, normalized using the color range of the portion, and
the offset is the lower limit value of the color range of the repaint color.
- 5. The image processing apparatus according to claim 3, wherein a sample file is recorded on a recording medium accessible to the image processing apparatus, the sample file being a sample of a distribution mode of how pixel values are distributed in a plane, and
the acquisition of the distribution map by the generation means is performed by reading the sample file from the recording medium.
- 6. The image processing apparatus according to claim 1, wherein the conversion is obtained by multiplying, by a predetermined coefficient, a relative value indicating what proportion the value of the positionally corresponding pixel, among the plurality of pixels constituting the portion of the original image to which the coating material is to be applied, occupies in the entire color range of the plurality of pixels constituting the portion, and adding an offset.
- 7. The image processing apparatus according to claim 6, wherein the relative value is the difference between the value of a representative-color pixel in the portion of the original image to which the coating material is to be applied and the value of the positionally corresponding pixel among the plurality of pixels constituting the portion, normalized using the color range length of the pixels constituting the portion, the predetermined coefficient is the color range length of the plurality of pixels representing the portion, and the offset is the lower limit value of the color range of the colors constituting the portion.
- 8. The image processing apparatus according to claim 1, wherein the upper limit value of the color range of the coating material application layer is calculated by multiplying the ratio between the difference between the representative value of the plurality of pixels constituting the portion of the original image to which the coating material is to be applied and the upper limit value of the color range of the portion, and the range length of the color range of the pixels constituting the portion, by the range length of the color range of the coating material application layer, and adding the value of the sample color pixel of the coating material, and
the lower limit value of the color range of the coating material application layer is calculated by subtracting the range length of the color range of the coating material application layer from its upper limit value.
- 9. The image processing apparatus according to claim 1, wherein the color range of the portion of the original image to which the coating material is to be applied is the numerical range of a specific type among plural types of pixel components contained in the pixels constituting the portion,
the portion of the original image to which the coating material is to be applied includes a plurality of channel images, and
the values of the pixels constituting the coating material application layer are obtained by mapping the individual pixels of the channel image composed of the specific type of pixel component into the color range of the coating material application layer.
- 10. The image processing apparatus according to claim 9, wherein the image processing apparatus has a synthesis parameter for each component type, and
the synthesis by the synthesis means is performed by weighting the pixel components of the specific type among the plurality of channel images included in the coating material application layer and the pixel components of the specific type among the plurality of channel images, using the synthesis parameter corresponding to the specific type, and adding them to each other.
- 11. The image processing apparatus according to claim 9, wherein the plural types of pixel components are hue, saturation, and luminance, and the specific type of pixel component is any one of hue, saturation, and luminance, or a combination of two or more of them.
- 12. The image processing apparatus according to claim 1, comprising:
first conversion means for converting first, second, and third pixel components constituting individual pixels in the portion of the original image to which the coating material is to be applied into a hue component, a saturation component, and a luminance component; and
second conversion means for converting the hue, saturation, and luminance components of the pixels constituting the portion with which the coating material application layer has been combined into the first, second, and third pixel components.
- 13. The image processing apparatus according to claim 1, wherein the part to which the coating material is to be applied is the lips,
the designation means, in designating the part to be subjected to image processing, generates a contour shape line defining the contour shape of the designated part, and
the generation means, in determining the color range of the individual pixels constituting the coating material application layer, identifies, among the contour shape lines defining the contour shape of the lips, a base shape line defining the shape of the base portion of the lips, and lowers the weights of those pixels of the lips located around the identified base shape line.
- 14. An image processing method in a computer system that generates, by performing image processing on an original image, a makeup image showing a state in which a makeup coating material is applied to a part of an object shown in the original image, the method comprising:
designating, from the object shown in the original image, a part to which the makeup coating material is to be applied; and
generating a coating material application layer composed of a plurality of pixels having a color range whose representative color is a sample color of the makeup coating material,
wherein the coating material application layer is an image to be combined with the original image,
the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to a ratio between a representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and a value of a sample color pixel of the makeup coating material, and
the values of the pixels of the coating material application layer are obtained by mapping the values of positionally corresponding pixels of the original image into the color range of the coating material application layer.
- 15. A program causing a computer to execute processing for generating, by performing image processing on an original image, a makeup image showing a state in which a makeup coating material is applied to a part of an object shown in the original image, the program including program code causing the computer to perform:
designation, from the object shown in the original image, of a part to which the makeup coating material is to be applied; and
generation of a coating material application layer composed of a plurality of pixels having a color range whose representative color is a sample color of the makeup coating material,
wherein the coating material application layer is an image to be combined with the original image,
the color range of the plurality of pixels constituting the coating material application layer is obtained by expanding the color range of the plurality of pixels constituting the part of the object according to a ratio between a representative value of the plurality of pixels constituting the portion of the original image to which the makeup coating material is to be applied and a value of a sample color pixel of the makeup coating material, and
the values of the pixels of the coating material application layer are obtained by mapping the values of positionally corresponding pixels of the original image into the color range of the coating material application layer.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/405,169 US9603437B2 (en) | 2013-04-08 | 2014-04-07 | Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state |
EP14783201.8A EP2985732B1 (en) | 2013-04-08 | 2014-04-07 | Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state |
JP2015511106A JP6396890B2 (ja) | 2013-04-08 | 2014-04-07 | メイクアップ塗材が塗布された状態を仮想的に再現することができる画像処理装置、画像処理方法、プログラム |
CN201480001468.0A CN104380339B (zh) | 2013-04-08 | 2014-04-07 | 图像处理装置、图像处理方法、以及介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-080090 | 2013-04-08 | ||
JP2013080090 | 2013-04-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014167831A1 true WO2014167831A1 (ja) | 2014-10-16 |
Family
ID=51689253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/001991 WO2014167831A1 (ja) | 2013-04-08 | 2014-04-07 | メイクアップ塗材が塗布された状態を仮想的に再現することができる画像処理装置、画像処理方法、プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9603437B2 (ja) |
EP (1) | EP2985732B1 (ja) |
JP (1) | JP6396890B2 (ja) |
CN (1) | CN104380339B (ja) |
WO (1) | WO2014167831A1 (ja) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103489107B (zh) * | 2013-08-16 | 2015-11-25 | 北京京东尚科信息技术有限公司 | 一种制作虚拟试衣模特图像的方法和装置 |
JP6458570B2 (ja) * | 2015-03-12 | 2019-01-30 | オムロン株式会社 | 画像処理装置および画像処理方法 |
USD764498S1 (en) | 2015-06-07 | 2016-08-23 | Apple Inc. | Display screen or portion thereof with graphical user interface |
JP6249263B2 (ja) | 2015-09-08 | 2017-12-20 | NEC Corporation | Face recognition system, face recognition method, display control apparatus, display control method, and display control program |
CN106846240B (zh) * | 2015-12-03 | 2021-05-07 | Banma Zhixing Network (Hong Kong) Co., Ltd. | Method, apparatus, and device for adjusting blended material |
JP6778877B2 (ja) * | 2015-12-25 | 2020-11-04 | Panasonic Intellectual Property Management Co., Ltd. | Makeup part creation apparatus, makeup part use apparatus, makeup part creation method, makeup part use method, makeup part creation program, and makeup part use program |
USD810117S1 (en) * | 2016-01-14 | 2018-02-13 | Perfect Corp. | Display screen with transitional graphical user interface |
US20170263031A1 (en) * | 2016-03-09 | 2017-09-14 | Trendage, Inc. | Body visualization system |
CN107180453B (zh) | 2016-03-10 | 2019-08-16 | Tencent Technology (Shenzhen) Co., Ltd. | Method and apparatus for editing a human face model |
CN106780768A (zh) * | 2016-11-29 | 2017-05-31 | Shenzhen Kaimujin Technology Co., Ltd. | Remote real-time 3D simulated makeup system and method |
US10417738B2 (en) * | 2017-01-05 | 2019-09-17 | Perfect Corp. | System and method for displaying graphical effects based on determined facial positions |
US10565741B2 (en) * | 2017-02-06 | 2020-02-18 | L'oreal | System and method for light field correction of colored surfaces in an image |
JP6967065B2 (ja) * | 2017-03-09 | 2021-11-17 | Shiseido Co., Ltd. | Information processing apparatus, program, and information processing method |
CN108734070A (zh) * | 2017-04-24 | 2018-11-02 | Cal-Comp Big Data, Inc. | Blush guide apparatus and method |
WO2019014646A1 (en) | 2017-07-13 | 2019-01-17 | Shiseido Americas Corporation | REMOVAL OF VIRTUAL FACIAL MAKE-UP, FAST FACIAL DETECTION AND TRACK POINT TRACKING |
CN107545536A (zh) * | 2017-08-17 | 2018-01-05 | Shanghai Zhanyang Communication Technology Co., Ltd. | Image processing method and image processing system for a smart terminal |
CN109427075A (zh) * | 2017-08-24 | 2019-03-05 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and eye shadow analysis method thereof |
CN109427078A (zh) * | 2017-08-24 | 2019-03-05 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and lip makeup analysis method thereof |
CN109508581A (zh) * | 2017-09-15 | 2019-03-22 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and blush analysis method thereof |
CN109508587A (zh) * | 2017-09-15 | 2019-03-22 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and foundation makeup analysis method thereof |
CN109712065A (zh) * | 2017-10-25 | 2019-05-03 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and face shape simulation method thereof |
US10607264B2 (en) | 2018-02-02 | 2020-03-31 | Perfect Corp. | Systems and methods for virtual application of cosmetic effects to photo albums and product promotion |
CN108564526A (zh) * | 2018-03-30 | 2018-09-21 | Beijing Kingsoft Security Software Co., Ltd. | Image processing method, apparatus, electronic device, and medium |
KR102081947B1 (ko) * | 2018-04-24 | 2020-02-26 | LG Household & Health Care Ltd. | Mobile terminal and automatic cosmetics recognition system |
EP3628187A1 (en) | 2018-09-26 | 2020-04-01 | Chanel Parfums Beauté | Method for simulating the rendering of a make-up product on a body area |
CN111259696B (zh) * | 2018-11-30 | 2023-08-29 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for displaying an image |
TWI708183B (zh) * | 2019-03-29 | 2020-10-21 | Cal-Comp Big Data, Inc. | Personalized makeup information recommendation method |
USD916856S1 (en) | 2019-05-28 | 2021-04-20 | Apple Inc. | Display screen or portion thereof with graphical user interface |
TWI735901B (zh) * | 2019-07-02 | 2021-08-11 | ASUSTeK Computer Inc. | Input device |
CN112308944A (zh) * | 2019-07-29 | 2021-02-02 | Cal-Comp Big Data, Inc. | Augmented reality display method for simulated lip makeup |
US11212483B2 (en) | 2020-02-14 | 2021-12-28 | Perfect Mobile Corp. | Systems and methods for event-based playback control during virtual application of makeup effects |
US11798202B2 (en) * | 2020-09-28 | 2023-10-24 | Snap Inc. | Providing augmented reality-based makeup in a messaging system |
BR102020022162A2 (pt) | 2020-10-29 | 2022-05-10 | Botica Comercial Farmacêutica Ltda. | Method for detecting and segmenting the lip region |
US11825184B1 (en) | 2022-05-09 | 2023-11-21 | Perfect Mobile Corp. | Systems and methods for event-based playback control during virtual application of accessories |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3723349B2 (ja) | 1998-06-18 | 2005-12-07 | Shiseido Co., Ltd. | Lipstick color conversion system |
JP2010211497A (ja) * | 2009-03-10 | 2010-09-24 | Nikon Corp | Digital camera and image processing program |
WO2012056743A1 (ja) * | 2010-10-29 | 2012-05-03 | Omron Corporation | Image processing device, image processing method, and control program |
WO2012120697A1 (ja) * | 2011-03-10 | 2012-09-13 | Omron Corporation | Image processing device, image processing method, and control program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5974189A (en) * | 1993-05-24 | 1999-10-26 | Eastman Kodak Company | Method and apparatus for modifying electronic image data |
WO2002005249A2 (en) * | 2000-06-27 | 2002-01-17 | Rami Orpaz | Make-up and fashion accessory display and marketing system and method |
US20070019882A1 (en) * | 2004-01-30 | 2007-01-25 | Shoji Tanaka | Makeup simulation program, makeup simulation device, and makeup simulation method |
US20090234716A1 (en) * | 2008-03-17 | 2009-09-17 | Photometria, Inc. | Method of monetizing online personal beauty product selections |
US9118876B2 (en) * | 2012-03-30 | 2015-08-25 | Verizon Patent And Licensing Inc. | Automatic skin tone calibration for camera images |
2014
- 2014-04-07 US US14/405,169 patent/US9603437B2/en active Active
- 2014-04-07 CN CN201480001468.0A patent/CN104380339B/zh not_active Expired - Fee Related
- 2014-04-07 EP EP14783201.8A patent/EP2985732B1/en not_active Not-in-force
- 2014-04-07 WO PCT/JP2014/001991 patent/WO2014167831A1/ja active Application Filing
- 2014-04-07 JP JP2015511106A patent/JP6396890B2/ja active Active
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220150381A1 (en) * | 2013-08-23 | 2022-05-12 | Preemadonna Inc. | Apparatus for applying coating to nails |
JP2016139381A (ja) * | 2015-01-29 | 2016-08-04 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and image processing method |
WO2016121329A1 (ja) * | 2015-01-29 | 2016-08-04 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus, stylus, and image processing method |
CN107111861A (zh) * | 2015-01-29 | 2017-08-29 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus, stylus, and image processing method |
JP2016139382A (ja) * | 2015-01-29 | 2016-08-04 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus, stylus, and image processing method |
US9984281B2 (en) | 2015-01-29 | 2018-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus, stylus, and image processing method |
JP2019205205A (ja) * | 2015-04-15 | 2019-11-28 | Canon Inc. | Image processing apparatus, image processing method, and program |
US11106938B2 (en) | 2015-04-15 | 2021-08-31 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium for lighting processing on image using model data |
JP2017220158A (ja) * | 2016-06-10 | 2017-12-14 | Panasonic Intellectual Property Management Co., Ltd. | Virtual makeup apparatus, virtual makeup method, and virtual makeup program |
WO2017212878A1 (ja) * | 2016-06-10 | 2017-12-14 | Panasonic Intellectual Property Management Co., Ltd. | Virtual makeup apparatus and virtual makeup method |
JP2019525307A (ja) * | 2016-06-30 | 2019-09-05 | Keen Eye Technologies | Multimodal viewer |
JP2018063681A (ja) * | 2016-10-14 | 2018-04-19 | Panasonic Intellectual Property Management Co., Ltd. | Virtual makeup apparatus, virtual makeup method, and virtual makeup program |
JP2018185782A (ja) * | 2017-04-27 | 2018-11-22 | Cal-Comp Big Data, Inc. | Lip gloss guide apparatus and method thereof |
US11717070B2 (en) | 2017-10-04 | 2023-08-08 | Preemadonna Inc. | Systems and methods of adaptive nail printing and collaborative beauty platform hosting |
JP7364676B2 (ja) | 2018-11-15 | 2023-10-18 | Elmoznino, Eric | System and method for augmented reality using a generative image-to-image translation model with conditional cycle consistency |
JP2022514599A (ja) * | 2018-12-19 | 2022-02-14 | Koninklijke Philips N.V. | Mirror assembly |
JP2022539544A (ja) * | 2019-06-25 | 2022-09-12 | L'Oréal | Method for determining a specific value of input data from a set of physical elements |
JP7405874B2 (ja) | 2019-06-25 | 2023-12-26 | L'Oréal | Method for determining a specific value of input data from a set of physical elements |
JP2020074100A (ja) * | 2019-12-19 | 2020-05-14 | Casio Computer Co., Ltd. | Image processing apparatus, image processing method, and program |
JP7015009B2 (ja) | 2019-12-19 | 2022-02-02 | Casio Computer Co., Ltd. | Image processing apparatus, image processing method, and program |
JP2021132321A (ja) * | 2020-02-20 | 2021-09-09 | Sega Corporation | Photo-shooting amusement apparatus and program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014167831A1 (ja) | 2017-02-16 |
US20150145882A1 (en) | 2015-05-28 |
JP6396890B2 (ja) | 2018-09-26 |
EP2985732A4 (en) | 2016-04-13 |
EP2985732A1 (en) | 2016-02-17 |
US9603437B2 (en) | 2017-03-28 |
CN104380339A (zh) | 2015-02-25 |
EP2985732B1 (en) | 2017-08-02 |
CN104380339B (zh) | 2018-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6396890B2 (ja) | Image processing apparatus, image processing method, and program capable of virtually reproducing a state in which a makeup coating material has been applied | |
CN106548455B (zh) | Apparatus and method for adjusting brightness of an image | |
CN108876931B (zh) | Three-dimensional object color adjustment method and apparatus, computer device, and computer-readable storage medium | |
US9019310B2 (en) | Methods and apparatus for applying complex continuous gradients to images | |
CN109448089B (zh) | Rendering method and apparatus | |
US10922860B2 (en) | Line drawing generation | |
JP5299173B2 (ja) | Image processing apparatus, image processing method, and program | |
CN111666017A (zh) | Automation method and device, preprocessing method, and computer-readable medium | |
TW202234341A (zh) | Image processing method and apparatus, electronic device, storage medium, and program product | |
Kumaragurubaran | High dynamic range image processing toolkit for lighting simulations and analysis | |
US8842913B2 (en) | Saturation varying color space | |
JP2017157014A (ja) | Image processing apparatus, image processing method, image processing system, and program | |
JP2003168130A (ja) | Method for previewing a photorealistic rendering of a synthetic scene in real time | |
CN116363288A (zh) | Target object rendering method and apparatus, storage medium, and computer device | |
JP2001266176A (ja) | Image processing apparatus, image processing method, and recording medium | |
Li | Research and Implementation of Oil Painting Virtual Reality Based on Internet of Things | |
Lu Lu | Large Scale Immersive Holograms with Microsoft Hololens | |
KR101752701B1 (ko) | Method for reproducing a makeup effect in an image | |
WO2023227886A1 (en) | Simulating foundation makeup effect in augmented images | |
Pokorný et al. | A 3D Visualization of Zlín in the Eighteen–Nineties Using Virtual Reality | |
Rafi et al. | High dynamic range images: Evolution, applications and suggested processes | |
Ma | The direct manipulation of pasted surfaces | |
CN117745920A (zh) | Model texture mapping method, apparatus, device, and storage medium | |
CN117274460A (zh) | Makeup rendering method, apparatus, device, and storage medium for a virtual character | |
CN114549607A (zh) | Subject material determination method and apparatus, electronic device, and storage medium | |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14783201; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 14405169; Country of ref document: US |
ENP | Entry into the national phase | Ref document number: 2015511106; Country of ref document: JP; Kind code of ref document: A |
REEP | Request for entry into the european phase | Ref document number: 2014783201; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2014783201; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |