WO2012120697A1 - Image processing apparatus, image processing method, and control program - Google Patents
- Publication number
- WO2012120697A1 (PCT/JP2011/056793)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- lip
- candidate
- image
- weight
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
- H04N1/628—Memory colours, e.g. skin or sky
Definitions
- the present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for correcting a face image.
- Techniques are disclosed in which the contours of the upper and lower lips are extracted, the part surrounded by the detected inner contours of the upper and lower lips is identified as the part corresponding to the teeth, and the brightness of that part is adjusted.
- the pixel value of the portion corresponding to the teeth can be brightened to improve the appearance of the face.
- a lip contour is detected from a change in luminance value of each pixel of an image. Specifically, a plurality of detection lines in the vertical direction (the height direction of the face) are defined in the mouth area, and the change in the luminance value of the pixel is examined along each detection line.
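The prior-art scheme described above, which scans vertical detection lines and takes the largest luminance change on each line as a contour point, can be sketched as follows. This is a hypothetical illustration, not the cited publication's actual algorithm:

```python
import numpy as np

def contour_rows_by_luminance(gray_mouth):
    """For each vertical detection line (image column), return the row where
    the luminance value changes most strongly. That maximum is taken here as
    the contour candidate; as noted below, shadow boundaries on a real face
    can produce even larger luminance changes and defeat this heuristic."""
    # absolute vertical luminance differences along each column
    diff = np.abs(np.diff(np.asarray(gray_mouth, dtype=float), axis=0))
    # row index of the largest change per column
    return np.argmax(diff, axis=0)
```

On a clean image the maximum falls on the lip contour; the difficulty with shadows explained in the surrounding text is exactly that this argmax no longer lands on the contour.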
- Japanese Patent Laid-Open No. 2009-231879 (published October 8, 2009)
- the luminance value of the lip contour changes greatly, making it easier to identify the contour point.
- shadows are formed by the unevenness of the face, and a luminance change larger than the change of the luminance value in the lip contour occurs at the boundary portion of the shadow. As a result, it becomes difficult to distinguish the shadow boundary from the lip contour.
- The color of the lips in a typical photographed face image varies greatly with individual differences, lighting conditions, and the application of lipstick or lip gloss. Therefore, in general, the lip color of a face image is not known in advance. Depending on the conditions (such as the shooting environment and makeup), the color of the lips in a face image may be similar to the color of the skin, and it may be difficult to distinguish the color of the lips from that of the surrounding skin.
- The present invention has been made in view of the above problems, and its purpose is to identify the color of the lips in a face image photographed under various conditions, and to identify the region of the lips using the identified lip color.
- In order to solve the above problems, an image processing apparatus according to the present invention is an image processing apparatus that specifies lip characteristics from a face image including a person's mouth. It specifies a representative color of the skin of the face image, and includes a lip representative color specifying unit for specifying a representative color of the lips from a plurality of candidate colors according to the difference in hue and saturation between the representative color of the skin and each candidate color.
- In order to solve the above problem, an image processing method according to the present invention is an image processing method for specifying lip characteristics from a face image including a person's mouth, in which a representative color of the skin of the face image is specified.
- The representative color of the lips is identified from the candidate colors based on differences in hue and saturation within the face image, which includes both the lips and the skin. Therefore, the representative color of the lips, which can be any of various colors, can be accurately specified and distinguished from the skin color.
- The difference in hue and saturation refers to a difference between two colors in the hue-saturation plane, and includes a difference in hue, a difference in saturation, a distance in the hue-saturation plane, and the like.
- the representative color of the lips is specified from the candidate colors based on the difference in hue and saturation of the face image including the lips and the skin.
- The representative color of the lips, which can be any of various colors, can be accurately identified and distinguished from the skin color.
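Since the method works on the hue-saturation (CbCr) plane, the three kinds of difference named above (hue difference, saturation difference, and distance in the plane) reduce to simple geometry. A minimal sketch, assuming 8-bit YCbCr with the neutral point at (128, 128):

```python
import numpy as np

def hue_saturation_cbcr(cb, cr, center=128.0):
    """Hue angle and saturation of a color on the CbCr plane.
    Treating (128, 128) as the neutral point is an assumption for
    8-bit YCbCr values."""
    hue = np.arctan2(cr - center, cb - center)   # hue as an angle
    sat = np.hypot(cb - center, cr - center)     # saturation as a radius
    return hue, sat

def cbcr_distance(c1, c2):
    """Euclidean distance between two colors on the CbCr plane."""
    return float(np.hypot(c1[0] - c2[0], c1[1] - c2[1]))
```

Two colors with the same hue angle but different radii differ only in saturation; the plane distance combines both effects.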
- FIG. 1 is a block diagram illustrating a schematic configuration of a digital camera according to an embodiment of the present invention.
- (a) is an image showing the normalized mouth region, and (b) is an image showing the smoothed mouth region.
- (a) is a diagram showing the relationship between the weight Wa and the distance in the CbCr plane between a candidate color and the representative color of the skin; (b) is a corresponding image.
- (a) is a diagram showing the relationship between the weight Wb and the hue difference in the CbCr plane between a candidate color and the representative color of the skin; (b) is a corresponding image.
- FIG. 6 is an image corresponding to FIG.
- FIG. 2B is an image corresponding to FIG. 2B and showing the result of calculating the weight Wc of each pixel of the mouth image instead of the candidate color.
- FIG. 3B is an image corresponding to FIG. 2B and showing the result of calculating the lip color degree D1 from the pixel values of each pixel of the mouth image instead of the candidate color.
- FIG. 4 is an image corresponding to FIG. 2B and showing the result of calculating the weight Wd of each pixel of the mouth image instead of the candidate color.
- FIG. 5B is an image corresponding to FIG. 2B and showing the result of calculating the candidate evaluation value D2 from the pixel value of each pixel of the mouth image instead of the candidate color.
- (a) is a diagram showing the relationship between the first lip color similarity We and the distance in the CbCr plane between the color of each pixel and the representative color of the lips.
- (b) is an image, corresponding to (b) of FIG. 2, showing the result of calculating the first lip color similarity We for each pixel of the mouth image.
- (a) is a diagram showing the relationship between the second lip color similarity Wf and the hue difference in the CbCr plane between the color of each pixel and the representative color of the lips.
- (b) is an image, corresponding to (b) of FIG. 2, showing the result of calculating the second lip color similarity Wf for each pixel of the mouth image.
- FIG. 16 is an image showing a modeled lip region corresponding to FIG.
- FIG. 17 is an image corresponding to FIG. 16 and showing a correction weight Wg.
- FIG. 20 is an image corresponding to FIG. 19 and showing the corrected partial evaluation value D3.
- It is an image showing the luminance.
- FIG. 22 is an image corresponding to FIG. 21 and showing a glossy image having only a luminance component.
- (a) is a diagram showing the first tone curve for each extracted pixel, and (b) is a diagram showing the second tone curve for each extracted pixel.
- It is a diagram showing the composite tone curve.
- (a) is an image showing part of the face image before correction, and (b) is an image showing part of the face image after correction.
- (a) corresponds to (a) of FIG.
- FIG. 17 is an image corresponding to FIG. 16 and showing the mouth inner region and the correction weight Wh of the mouth inner region.
- It is an image corresponding to (b) of FIG. 2 and showing the result of calculating the tooth color similarity Wi for each pixel of the mouth image.
- FIG. 13 is an image corresponding to (b) of FIG.
- FIG. 12 is an image corresponding to (b) of FIG. 13 and showing the value of (1-Wf) of each pixel.
- FIG. 32 is an image corresponding to FIG. 31 and showing a tooth gloss image.
- In the present embodiment, an image processing apparatus will be described that is mainly mounted on a digital camera and performs processing on a face image included in a captured image. However, the image processing apparatus according to the present invention may be mounted on a photographing apparatus such as a digital video camera, a personal computer (PC) web camera, or a camera-equipped mobile phone, and may process images obtained by photographing with that apparatus.
- Alternatively, the image processing apparatus according to the present invention may process an image acquired from a communication path such as a network, or from an external storage device. Not only captured still images but also face images in, for example, moving images may be processed. Further, a preview image displayed on the display device of the digital camera during imaging may be processed.
- FIG. 1 is a block diagram showing a schematic configuration of a digital camera 1 according to the present embodiment.
- the digital camera 1 includes an instruction input device 2, an imaging device 3, an image storage device 4, a display device 5, and an image processing device 6.
- the instruction input device 2 includes an input device such as a button, a key, or a touch panel, receives an imaging instruction from a user, and outputs an imaging instruction to the imaging device 3.
- the instruction input device 2 receives a face image correction processing instruction from the user and outputs a correction processing instruction to the image processing device 6.
- the imaging device 3 includes an imaging element such as a CCD (charge coupled device) or a CMOS (complementary metal oxide semiconductor) imaging element.
- the imaging device 3 captures an image according to an imaging instruction, and outputs the captured image (image data) to the image storage device 4.
- the image storage device 4 stores various types of information, and includes a storage device such as an HDD (Hard Disk Drive) or a flash memory.
- The image storage device 4 stores the image received from the imaging device 3.
- the display device 5 includes a display, displays the input image, and presents it to the user.
- the display device 5 receives the corrected image from the image processing device 6 and displays the corrected image.
- The image processing apparatus 6 includes an image acquisition unit (instruction receiving unit) 11, a face detection unit 12, a feature detection unit 13, a suitability determination unit 14, a mouth image normalization unit 15, a smoothing unit 16, a skin representative color specifying unit 17, a candidate color specifying unit 18, a lip representative color specifying unit 19, a lip region specifying unit 20, an image correcting unit 21, a combining unit 22, and a display control unit 23.
- the image acquisition unit 11 receives an instruction for correction processing from the instruction input device 2.
- the correction processing instruction includes information indicating an image to be processed and information indicating what correction processing is to be performed. Examples of the type of correction processing include lip gloss correction for correcting an image as if lip gloss was applied to the lips, or tooth whitening correction for correcting an image so that teeth become white.
- the image acquisition unit 11 acquires an image to be processed from the image storage device 4 based on the received correction processing instruction. Note that the image acquisition unit 11 may directly receive an image captured from the imaging device 3.
- the image acquisition unit 11 outputs the acquired processing target image to the face detection unit 12, the feature detection unit 13, the suitability determination unit 14, the mouth image normalization unit 15, the skin representative color identification unit 17, and the synthesis unit 22. Further, the image acquisition unit 11 outputs the received correction processing instruction to the image correction unit 21.
- the face detection unit 12 detects a face image included in the image received from the image acquisition unit 11. When detecting the face image included in the image, the face detection unit 12 specifies the position of the face image. The position of the face image may indicate the coordinates of a predetermined point of the face image, or may indicate a region of the face image. The face detection unit 12 outputs the position of the face image to the feature detection unit 13, suitability determination unit 14, mouth image normalization unit 15, and skin representative color identification unit 17. The face detection unit 12 may detect a plurality of face images from the processing target image. When a plurality of face images are detected, the face detection unit 12 may specify the positions of the face images and output the positions of the plurality of face images to the respective units.
- the feature detection unit 13 detects the position of each feature of the face of the face image from the image to be processed received from the image acquisition unit 11 and the position of the face image received from the face detection unit 12.
- the feature detection unit 13 includes features of facial organs such as eyes (head of eyes, corners of eyes, etc.), mouth (mouth end points, mouth center points, etc.), and nose (vertex of nose, etc.), and faces.
- the features (feature points) such as the contours are detected and their positions are specified.
- the position of the feature may indicate the coordinates of the feature point, or may indicate an area including the feature. Each feature can be detected using a known technique.
- the feature detection unit 13 outputs the detected position of the facial feature to the suitability determination unit 14, the mouth image normalization unit 15, and the skin representative color identification unit 17.
- the feature detection unit 13 may specify the positions of the features of the plurality of face images and output the positions of the features of the plurality of face images to each of the above-described units.
- The suitability determination unit 14 determines, based on the processing target image received from the image acquisition unit 11, the face image position received from the face detection unit 12, and the facial feature positions received from the feature detection unit 13, whether the face image is suitable for correction processing. For example, the suitability determination unit 14 determines that a face image facing sideways, a face image in which the face is too small, and the like are inappropriate. A specific determination method will be described later. When a plurality of face images are included in the image to be processed, the suitability determination unit 14 may determine the suitability of each face image for correction processing, or may select a predetermined number (for example, one) of face images that are more suitable for correction processing. The suitability determination unit 14 outputs information indicating the face images determined to be appropriate processing targets to the mouth image normalization unit 15, the skin representative color specifying unit 17, and the candidate color specifying unit 18.
- the mouth image normalization unit 15 receives the image to be processed, the position of the face image, and the position of the facial feature from the image acquisition unit 11, the face detection unit 12, and the feature detection unit 13, respectively. Based on the received information, the mouth image normalization unit 15 extracts an image of the mouth region of the face image to be processed for the face image determined to be appropriate as the processing target by the suitability determination unit 14. In order to facilitate calculation in later image processing, the mouth image normalization unit 15 normalizes the image size so that the mouth region of the image to be processed has a predetermined size.
- The mouth image normalization unit 15 rotates and enlarges or reduces the face image to be processed as necessary so that the left and right end points of the mouth are located at predetermined coordinates, and cuts out a mouth region (a region including the mouth) of the predetermined size from the face image to be processed.
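The rotation and scaling that places the two mouth end points at predetermined coordinates is a similarity transform. A sketch follows, in which the target coordinates are illustrative assumptions (the text only says "predetermined"):

```python
import numpy as np

def mouth_similarity_transform(left_pt, right_pt, target_left, target_right):
    """2x3 similarity matrix (rotation + uniform scale + translation) that
    maps the detected mouth end points onto predetermined coordinates.
    The specific target coordinates passed in are assumptions."""
    src = np.subtract(right_pt, left_pt)
    dst = np.subtract(target_right, target_left)
    # complex-number trick: (scale * rotation) is the complex ratio dst / src
    z = complex(*dst) / complex(*src)
    a, b = z.real, z.imag
    R = np.array([[a, -b], [b, a]])
    t = np.asarray(target_left, float) - R @ np.asarray(left_pt, float)
    return np.hstack([R, t[:, None]])
```

Applying the matrix to homogeneous points `[x, y, 1]` sends the left and right mouth end points exactly onto the chosen target coordinates; the rest of the face image follows rigidly up to uniform scale.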
- (a) of FIG. 2 is an image showing the normalized mouth region.
- the mouth image normalization unit 15 outputs the normalized mouth region image (mouth image) to the smoothing unit 16 and the image correction unit 21.
- the smoothing unit 16 smoothes the mouth image received from the mouth image normalization unit 15. Specifically, the smoothing unit 16 generates a smoothed mouth image by applying a Gaussian filter or the like to the mouth image.
- FIG. 2B is an image showing a smoothed mouth area image. By using the smoothed mouth image, it is possible to eliminate noise and accurately specify a desired region such as a lip. Note that the normalized mouth image and the smoothed mouth image are color images, but FIG. 2 shows light and dark depending on the luminance value (Y value).
- the smoothing unit 16 outputs the smoothed mouth image to the candidate color specifying unit 18, the lip region specifying unit 20, and the image correcting unit 21.
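A minimal separable Gaussian smoothing of one image channel, standing in for the "Gaussian filter or the like" mentioned above (the kernel radius and sigma below are assumptions):

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Normalized 1-D Gaussian kernel of the given radius."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x * x / (2.0 * sigma * sigma))
    return k / k.sum()

def smooth(channel, sigma=2.0):
    """Separable Gaussian blur of a single image channel: filter each row,
    then each column. sigma and the 3*sigma radius are illustrative."""
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, channel)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)
```

Running this on each channel of the mouth image suppresses pixel-level noise before the candidate colors and lip region are computed, which is the stated purpose of the smoothing unit.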
- the skin representative color specifying unit 17 receives the image to be processed, the position of the face image, and the position of the facial feature from the image acquisition unit 11, the face detection unit 12, and the feature detection unit 13, respectively. Based on the received information, the skin representative color specifying unit 17 specifies the skin representative color of the face image to be processed for the face image determined to be appropriate as the processing target by the suitability determining unit 14.
- A partial color of the face area, for example the average color, median, or mode of the central part of the face area (near the nose), may be used as the representative color of the skin. Alternatively, the average color or the like of the entire face area may be used as the representative color of the skin.
- Alternatively, the average color of an area of the face may first be obtained; pixels in the area whose hue differs from the average color (whose angle to the average color in the CbCr plane exceeds a threshold value) and/or pixels whose color difference from the average color is large may then be excluded, and the average color calculated from the remaining pixels may be used as the representative color.
- the skin representative color specifying unit 17 obtains the degree of skin color dispersion.
- the skin representative color specifying unit 17 outputs the skin representative color to the candidate color specifying unit 18 and the lip representative color specifying unit 19.
- the skin representative color specifying unit 17 outputs the degree of skin color dispersion to the lip representative color specifying unit 19.
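The outlier-excluding average described above can be sketched as follows. The two thresholds are illustrative assumptions, and hue is measured as an angle about the neutral point of the CbCr plane:

```python
import numpy as np

def skin_representative_color(cbcr_pixels, hue_thresh=0.3, dist_thresh=20.0):
    """Average CbCr of a facial patch after excluding pixels whose hue angle
    to the initial average exceeds a threshold, or whose color difference
    from it is large. Both threshold values are illustrative assumptions."""
    p = np.asarray(cbcr_pixels, dtype=float) - 128.0  # center on neutral gray
    mean = p.mean(axis=0)
    # hue angle of each pixel relative to the average color in the CbCr plane
    ang = np.abs(np.arctan2(p[:, 1], p[:, 0]) - np.arctan2(mean[1], mean[0]))
    dist = np.hypot(*(p - mean).T)                    # color difference
    keep = (ang <= hue_thresh) & (dist <= dist_thresh)
    return p[keep].mean(axis=0) + 128.0
```

A single strongly deviating pixel (for example a specular highlight) is excluded and no longer pulls the representative color away from the true skin tone.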
- the candidate color specifying unit 18 specifies a plurality of candidate colors that are lip color candidates.
- the candidate color specifying unit 18 sets a plurality of areas in the mouth image, specifies a representative color of each area, and sets it as a candidate color.
- FIG. 3 is a diagram illustrating a plurality of regions in the mouth image. The crosses in the figure indicate the mouth end points detected by the feature detection unit 13. Specifically, the candidate color specifying unit 18 performs the following processing.
- The candidate color specifying unit 18 divides a predetermined region at the horizontal center of the mouth image into a plurality of regions arranged in the vertical (up-down) direction.
- The candidate color specifying unit 18 specifies the representative color (average color, median, mode, etc.) of each divided region as one of a plurality of lip color candidates. At least one of the regions thus divided is considered to consist mainly of lip. Therefore, at least one of the plurality of candidate colors is considered suitable as the representative color of the lips.
- the method of setting (dividing) each region is not limited to the above, and a plurality of regions may be set between two mouth end points that are considered to have lips.
- The size of the divided regions is not limited; each individual pixel may be treated as one of the regions.
- the candidate color specifying unit 18 specifies a candidate color using the smoothed mouth image.
- the candidate color specifying unit 18 may specify a candidate color using a mouth image that has not been smoothed.
- the candidate color specifying unit 18 obtains the degree of color dispersion of the divided areas as the degree of dispersion of the corresponding candidate color.
- the candidate color specifying unit 18 outputs a plurality of candidate colors to the lip representative color specifying unit 19.
- the candidate color specifying unit 18 outputs the degree of dispersion of the candidate colors to the lip representative color specifying unit 19.
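Putting the candidate color steps together: divide a horizontally central strip of the mouth image into vertically stacked regions, then take each region's average color as a candidate and its variance as the dispersion degree. The strip width and region count below are assumptions:

```python
import numpy as np

def lip_candidate_colors(mouth_cbcr, n_regions=5, center_frac=0.25):
    """Split a horizontally central strip of the mouth image (shape h x w x 2,
    channels Cb and Cr) into n_regions stacked vertically. Return each
    region's average CbCr as a lip color candidate together with its
    per-channel variance as the dispersion degree. n_regions and the strip
    width center_frac are illustrative assumptions."""
    h, w, _ = mouth_cbcr.shape
    x0 = int(w * (0.5 - center_frac / 2))
    x1 = int(w * (0.5 + center_frac / 2))
    strip = mouth_cbcr[:, x0:x1, :].astype(float)
    candidates, dispersions = [], []
    for band in np.array_split(strip, n_regions, axis=0):
        flat = band.reshape(-1, band.shape[-1])
        candidates.append(flat.mean(axis=0))   # candidate color of this band
        dispersions.append(flat.var(axis=0))   # its dispersion degree
    return np.array(candidates), np.array(dispersions)
```

Because the strip crosses the upper lip, teeth, and lower lip, at least one band should average to a lip-dominated color, matching the reasoning in the text.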
- the lip representative color specifying unit 19 specifies the representative color of the lips from among a plurality of candidate colors based on the representative color of the skin.
- the lip representative color specifying unit 19 specifies a candidate color having a large difference from the skin representative color as a representative color of the lip according to the difference in hue and saturation between the skin representative color and each candidate color.
- the lip representative color specifying unit 19 performs processing in a color space such as a YCbCr color space or an L * a * b * color space that expresses a color by luminance (or lightness), hue, and saturation.
- The lip representative color specifying unit 19 does not use luminance (or lightness) information; based on the information on the CbCr plane (hue-saturation plane) of the color space, it evaluates the lip color degree of each candidate color and specifies the representative color of the lips on the CbCr plane. Detailed processing for specifying the representative color of the lips will be described later.
- the lip representative color specifying unit 19 outputs the lip representative color on the specified CbCr plane to the lip region specifying unit 20 and the image correcting unit 21.
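Working on the CbCr plane means converting the image and discarding the luminance component. A sketch assuming full-range BT.601 conversion coefficients:

```python
import numpy as np

def rgb_to_cbcr(rgb):
    """Convert 8-bit RGB values to the (Cb, Cr) pair, discarding the
    luminance Y, matching the unit's use of only the hue-saturation plane.
    Full-range BT.601 coefficients are an assumption."""
    arr = np.asarray(rgb, dtype=float)
    r, g, b = arr[..., 0], arr[..., 1], arr[..., 2]
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([cb, cr], axis=-1)
```

Any neutral gray maps to the neutral point (128, 128) regardless of its brightness, which is exactly why dropping Y makes the comparison robust to lighting.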
- FIG. 4 is a diagram showing a simplified range of lip color candidates on the CbCr plane.
- the color of the lips is considered to be somewhat different from the representative color of the skin. Therefore, in the CbCr plane, the color in the range A that is close to the skin representative color is preferably excluded from the lip color candidates.
- the color of the lips is considered to have a hue different from the representative color of the skin. Therefore, in the CbCr plane, the color in the range B whose hue is close to the representative color of the skin should be excluded from the lip color candidates.
- Further, in the CbCr plane, the low-saturation range C is the white range and is preferably excluded from the lip color candidates. In other words, range A (close to the skin representative color) and range B (hue close to that of the skin representative color) form the skin color range, and range C (low saturation) is the white range; the lip color candidates are considered to lie in the range outside ranges A, B, and C.
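A candidate can be tested against the three ranges with the distances and angles defined on the CbCr plane. The threshold values for ranges A, B, and C below are illustrative assumptions:

```python
import numpy as np

def outside_skin_and_white_ranges(candidate, skin, dist_a=15.0,
                                  hue_b=0.25, sat_c=10.0):
    """True if a candidate CbCr color lies outside range A (near the skin
    representative color), range B (hue close to the skin color), and
    range C (low saturation, i.e. whitish). The three thresholds are
    illustrative assumptions, not values from the patent."""
    c = np.asarray(candidate, dtype=float) - 128.0
    s = np.asarray(skin, dtype=float) - 128.0
    if np.hypot(*(c - s)) <= dist_a:                   # range A: near skin
        return False
    hue_diff = abs(np.arctan2(c[1], c[0]) - np.arctan2(s[1], s[0]))
    if min(hue_diff, 2 * np.pi - hue_diff) <= hue_b:   # range B: skin-like hue
        return False
    if np.hypot(*c) <= sat_c:                          # range C: whitish
        return False
    return True
```

A strongly saturated color with a hue far from the skin tone passes all three tests and remains a lip color candidate; near-skin and near-white colors are rejected.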
- the lip area specifying unit 20 specifies an area that is a lip in the mouth image based on the smoothed mouth image and the representative color of the lips.
- the lip area specifying unit 20 specifies a color area similar to the representative color of the lips as the lip area according to the difference in hue and saturation with the representative color of the lips on the CbCr plane. Detailed processing for specifying the lip area will be described later.
- The lip region specifying unit 20 uses the smoothed mouth image in order to specify the lip region while eliminating image noise and the like.
- the present invention is not limited to this, and the lip region specifying unit 20 may use an unsmoothed mouth image.
- the lip region specifying unit 20 outputs information indicating the specified lip region, information indicating the lip candidate region, and the like to the image correcting unit 21.
- The image correction unit 21 corrects the appearance of the mouth image based on the correction processing instruction, the normalized mouth image, the smoothed mouth image, and the representative color of the lips, and generates a corrected mouth image. A method of correcting the mouth image will be described later.
- the image correction unit 21 outputs the corrected mouth image to the synthesis unit 22.
- The synthesizing unit 22 returns the corrected mouth image to its original, pre-normalization size (rotating and enlarging or reducing the corrected mouth image as necessary) and synthesizes it with the image to be processed to generate a corrected image. In this way, an image in which the appearance of the lips and the like has been corrected in the processing target image is obtained.
- the synthesizing unit 22 outputs the corrected image to the display control unit 23.
- the synthesizing unit 22 may output the corrected image to the image storage device 4 and store it.
- the display control unit 23 outputs the corrected image to the display device 5 and controls the display device 5 to display the corrected image.
- the user selects an image to be processed from, for example, images captured and stored in the image storage device 4 via the instruction input device 2. Further, the user selects the type of correction processing (lip gloss correction, tooth whitening correction, etc.) to be applied to the image to be processed from the plurality of candidates via the instruction input device 2.
- the instruction input device 2 outputs a correction processing instruction including information on the specified type of correction processing to the image acquisition unit 11 of the image processing device 6.
- FIG. 5 is a flowchart showing the flow of the lip color specifying process and the lip region specifying process in the image processing apparatus 6.
- When the image acquisition unit 11 receives a correction processing instruction from the instruction input device 2, it acquires an image to be processed from the image storage device 4 (S1).
- the face detection unit 12 detects a face image included in the image to be processed and specifies the position of the face image (S2).
- the face detection unit 12 may detect a plurality of face images included in the processing target image.
- The feature detection unit 13 detects the positions of the facial features included in the detected face image (S3).
- the feature detection unit 13 detects features (feature points) of facial organs such as eyes (head of the eyes, corners of the eyes, etc.), mouth (mouth end point, mouth center point, etc.), and nose (vertex of the nose, etc.) Identify their location.
- the feature detection unit 13 may detect a feature such as a face outline.
- the suitability determination unit 14 determines whether or not the face image is suitable for performing the correction process based on the position of the detected facial feature (S4). For example, the suitability determination unit 14 stores a face model created by learning in advance the luminance distribution features around the features of facial organs such as eyes, nose and mouth from a plurality of face image samples. The suitability determination unit 14 compares the face model with the detected face image to identify the reliability of the detected feature of the face image and the face orientation.
- When, for example, the reliability of the detected features is low, the face is facing sideways, or the face image is too small, the suitability determination unit 14 determines that the face image is not suitable for the correction processing.
- The skin representative color specifying unit 17 specifies the representative color of the skin of the face image determined to be an appropriate processing target (S5).
- the average color of the central part of the face area (near the nose) is used as the representative color of the skin.
- Further, the skin representative color specifying unit 17 obtains the degree of dispersion (standard deviation) of the skin color. Specifically, for the pixel values of the pixels in the area used to determine the average color (near the nose), it obtains the variance σbs² along the Cb axis of the color space, the variance σrs² along the Cr axis, and the hue variance σps².
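The three dispersion values can be computed directly from the patch's pixel values. Treating the hue variance as the variance of the hue angle in the CbCr plane is an assumption about the patent's definition:

```python
import numpy as np

def skin_color_dispersion(cb, cr):
    """Variances of the skin patch used for the average color: sigma_bs^2
    along the Cb axis, sigma_rs^2 along the Cr axis, and the hue variance
    sigma_ps^2, computed here as the variance of the hue angle in the CbCr
    plane (an assumption about the patent's definition)."""
    cb = np.asarray(cb, dtype=float) - 128.0
    cr = np.asarray(cr, dtype=float) - 128.0
    var_b = cb.var()                         # sigma_bs^2
    var_r = cr.var()                         # sigma_rs^2
    var_hue = np.arctan2(cr, cb).var()       # sigma_ps^2
    return var_b, var_r, var_hue
```

These dispersions are the quantities later passed to the lip representative color specifying unit.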
- The mouth image normalization unit 15 extracts an image of the mouth area of the face image to be processed and generates a mouth image whose size is normalized so that the mouth area has a predetermined size (S6). Specifically, the mouth image normalization unit 15 rotates and enlarges or reduces the face image to be processed as necessary so that the left and right end points of the mouth are located at predetermined coordinates, and cuts out a mouth region of the predetermined size from the face image to be processed.
- the smoothing unit 16 smoothes the normalized mouth image (S7).
- the candidate color specifying unit 18 divides a predetermined region in the center in the horizontal direction of the mouth image into a plurality of regions arranged in the vertical direction, and sets the representative colors of the divided regions as a plurality of candidate colors for the lips. Specify (S8). Here, for each region, the average color of the region is a candidate color.
- the skin representative color and the plurality of candidate colors need not include luminance information.
- in this embodiment, the representative color of the lips and the lip region are specified on the CbCr plane using the skin representative color and the plurality of candidate colors, without using the luminance (Y).
- the lip representative color specifying unit 19 obtains the degree of lip color (lip color degree) for each candidate color (S9).
- the lip color of the mouth image is considered to differ from the skin color and to have a different hue from the skin color. The lip color is also generally different from the teeth, which appear whitish. It is therefore assumed that the degree of lip color increases as the distance from the skin representative color on the CbCr plane increases, as the difference in hue from the skin representative color increases, and as the saturation increases.
- the lip representative color specifying unit 19 obtains, as a weight of the lip color degree, a first non-skin color degree (degree of not being the skin color) that increases according to the distance between the candidate color and the skin representative color on the CbCr plane.
- the lip representative color specifying unit 19 obtains, as a weight of the lip color degree, a second non-skin color degree that increases in accordance with the difference in hue between the candidate color and the skin representative color.
- the lip representative color specifying unit 19 obtains a non-tooth color degree (degree that is not a tooth color) that increases according to the saturation of the candidate color as a weight of the lip color degree.
- the weight Wa (first non-skin color degree) of the lip color degree according to the distance in the CbCr plane can be obtained by the following equation.
- Cbs and Crs are a Cb component and a Cr component, respectively, of the representative color (average color) of skin
- Cb and Cr are a Cb component and a Cr component, respectively, of candidate colors
- σbs and σrs are the standard deviation of the skin color on the Cb axis and the standard deviation of the skin color on the Cr axis, respectively.
- (a) of FIG. 6 shows the relationship between the weight Wa and the distance on the CbCr plane between a candidate color and the skin representative color.
- according to Equation (1), if the distance between the candidate color and the skin representative color on the CbCr plane is small, the weight Wa is close to 0; as that distance increases, the weight Wa increases and approaches 1.
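- Equation (1) itself is not reproduced in this text, so the following Gaussian form is an assumption chosen only to match the described behavior (Wa near 0 at zero distance, approaching 1 as the distance grows, scaled by the skin's standard deviations).

```python
import math

def weight_wa(cb, cr, cbs, crs, sbs, srs):
    """First non-skin color degree Wa (assumed form of Equation (1)).
    (cb, cr): candidate color; (cbs, crs): skin representative color;
    sbs, srs: skin standard deviations on the Cb and Cr axes."""
    # squared Mahalanobis-like distance on the CbCr plane
    d2 = ((cb - cbs) / sbs) ** 2 + ((cr - crs) / srs) ** 2
    return 1.0 - math.exp(-d2)
```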
- FIG. 6B corresponds to FIG. 2B and is an image showing the result of calculating the weight Wa by applying the pixel value of each pixel of the mouth image to Equation (1) instead of a candidate color.
- a bright part indicates that the weight Wa is large, and a dark part indicates that the weight Wa is small. This indicates that the lip area has a large weight Wa.
- however, the weight Wa also increases in the tooth region.
- the weight Wb (second non-skin color degree) of the lip color degree according to the difference in hue can be obtained by the following equation.
- Ps is the hue of the representative color (average color) of the skin, and is indicated by the phase angle on the CbCr plane.
- P is the hue of the candidate color.
- σps is the standard deviation of the hue of the skin color.
- ε is a predetermined constant that prevents the weight Wb from becoming zero when the hue of the candidate color and the hue of the skin representative color are the same.
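- Equation (2) is likewise not reproduced in this text; the form below is an assumption matching the description: Wb grows with the hue difference (phase angles on the CbCr plane), and the constant ε keeps Wb from reaching zero even when the two hues coincide.

```python
import math

def weight_wb(p, ps, sigma_ps, eps=0.1):
    """Second non-skin color degree Wb (assumed form of Equation (2)).
    p: candidate hue; ps: skin representative hue (both in radians);
    sigma_ps: standard deviation of the skin hue; eps: floor constant."""
    # wrap the hue difference into [-pi, pi]
    d = math.atan2(math.sin(p - ps), math.cos(p - ps))
    return eps + (1.0 - eps) * (1.0 - math.exp(-((d / sigma_ps) ** 2)))
```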
- (a) of FIG. 7 shows the relationship between the weight Wb and the difference in hue on the CbCr plane between a candidate color and the skin representative color.
- FIG. 7B corresponds to FIG. 2B and is an image showing the result of calculating the weight Wb by applying the pixel value of each pixel of the mouth image to equation (2) instead of the candidate color.
- a bright part indicates that the weight Wb is large, and a dark part indicates that the weight Wb is small. According to this, it can be seen that the lip area has a large weight Wb. However, the weight Wb is also increased in a partial region of the tooth.
- the weight Wc (non-tooth color degree) of the lip color degree according to the saturation can be obtained by the following equation.
- Cb and Cr are the Cb component and Cr component, respectively, of the candidate color.
- C is a predetermined constant.
- the numerator of exp in Equation (3) represents saturation.
- according to Equation (3), the weight Wc approaches 0 as the saturation of the candidate color decreases, and increases toward 1 as the saturation increases.
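- Equation (3) is not reproduced here either. The sketch below assumes the exponent's numerator is the squared saturation (distance of (Cb, Cr) from the neutral point 128), with c as the scale constant mentioned in the text, so that Wc is near 0 for whitish, low-saturation colors such as teeth and approaches 1 as saturation grows.

```python
import math

def weight_wc(cb, cr, c=30.0):
    """Non-tooth color degree Wc (assumed form of Equation (3)).
    (cb, cr): candidate color; c: predetermined scale constant."""
    # squared saturation: distance from the neutral point on the CbCr plane
    sat2 = (cb - 128.0) ** 2 + (cr - 128.0) ** 2
    return 1.0 - math.exp(-sat2 / c ** 2)
```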
- FIG. 8 is an image corresponding to (b) of FIG. 2 and showing the result of calculating the weight Wc by applying the pixel value of each pixel of the mouth image to Equation (3) instead of the candidate color.
- a bright part indicates that the weight Wc is large, and a dark part indicates that the weight Wc is small. According to this, it can be seen that the lip area has a large weight Wc.
- the weight Wc is small in the partial region of the tooth having a whitish color.
- the weight Wc is also small in dark shadow areas inside the mouth that often appear in photographs.
- in some cases, the distance in the color space or on the CbCr plane from the average skin color to the color of each skin point increases; that is, the dispersion of the skin color in the color space or on the CbCr plane may be large.
- the hue, however, does not change much under conditions such as lighting. Therefore, for some skin and lip colors, even when the values of the weight Wa according to the distance on the CbCr plane are approximately the same, the skin color and the lip color can be discriminated by the weight Wb according to the difference in hue.
- conversely, the value of the weight Wb corresponding to the difference in hue may be approximately the same for the skin and the lips.
- in such a case, the skin color and the lip color can be discriminated based on the weight Wa corresponding to the distance on the CbCr plane.
- for the lips, both the weight Wa according to the distance on the CbCr plane and the weight Wb according to the hue difference can be large.
- the tooth color is generally whitish and low in saturation, whereas the lip color is considered high in saturation. Therefore, the tooth color and the lip color can be discriminated based on the weight Wc corresponding to the saturation.
- since the saturation of a portion that appears dark as a shadow in the mouth is low, such a portion can also be distinguished from the lip color by the weight Wc corresponding to the saturation.
- the lip representative color specifying unit 19 obtains, for each candidate color, the product of the first non-skin color degree Wa, the second non-skin color degree Wb, and the non-tooth color degree Wc as the lip color degree D1.
- FIG. 9 is an image corresponding to FIG. 2B and showing the result of calculating the lip color degree D1 from the pixel value of each pixel of the mouth image instead of the candidate color.
- a bright part indicates that the lip color degree D1 is large, and a dark part indicates that the lip color degree D1 is small. It is considered that a candidate color having a large lip color degree D1 is likely to be a lip color.
- the candidate color obtained from the lip region has the largest lip color degree D1.
- the lip representative color specifying unit 19 selects a candidate color having the largest lip color degree D1 as the first lip color candidate (first selection candidate color).
- however, there is a possibility that a tooth color is selected as the first selection candidate color.
- therefore, a second candidate (second selection candidate color) having a large difference in hue and saturation from the first selection candidate color is selected from the remaining candidate colors, and one of the first selection candidate color and the second selection candidate color is specified as the lip color.
- for each of the remaining candidate colors excluding the first selection candidate color, the lip representative color specifying unit 19 obtains a weight Wd (degree of not being the first selection candidate color) that increases according to the distance between the candidate color and the first selection candidate color on the CbCr plane (S10).
- the weight Wd corresponding to the distance from the first selection candidate color can be obtained by the following equation.
- Cbd and Crd are the Cb component and Cr component of the first selection candidate color, respectively
- Cb and Cr are the Cb component and Cr component of the candidate color, respectively
- σbd and σrd are, respectively, the standard deviation of the first selection candidate color on the Cb axis of the color space (the standard deviation of the Cb component of each pixel in the region of the first selection candidate color) and the standard deviation of the first selection candidate color on the Cr axis (the standard deviation of the Cr component of each pixel in that region).
- the standard deviation of the first selection candidate color can be obtained from the pixel value of each pixel in the region corresponding to the selection candidate color (region divided by the candidate color specifying unit 18).
- FIG. 10 is an image corresponding to (b) of FIG. 2 and showing the result of calculating the weight Wd by applying the pixel value of each pixel of the mouth image to Equation (5) instead of the candidate color.
- a bright part indicates that the weight Wd is large, and a dark part indicates that the weight Wd is small.
- here, a candidate color obtained from the lip region is selected as the first selection candidate color, so the weight Wd of the pixels in the lip region is small.
- the lip representative color specifying unit 19 obtains the product of the lip color degree D1 and the weight Wd as a candidate evaluation value D2 for each candidate color.
- FIG. 11 is an image corresponding to (b) of FIG. 2 and showing the result of calculating the candidate evaluation value D2 from the pixel value of each pixel of the mouth image instead of the candidate color.
- a bright part indicates that the candidate evaluation value D2 is large, and a dark part indicates that the candidate evaluation value D2 is small.
- the lip representative color specifying unit 19 selects the candidate color having the largest candidate evaluation value D2 as the second lip color candidate (second selection candidate color). According to this, even if a tooth color was selected as the first selection candidate color, the candidate color obtained from the lip region is likely to be selected as the second selection candidate color. Due to the weight Wd, the first selection candidate color and the second selection candidate color are likely to be candidate colors obtained from different regions (different facial parts). By selecting two different candidate colors from among the plurality of candidate colors as selection candidate colors, it is likely that one of the selection candidate colors is an appropriate candidate for the lip color.
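- the two-stage selection of S9 and S10 can be sketched as below. The Gaussian form of Wd and the sigma values are assumptions (Equation (5) is not reproduced in this text); `lip_degree` stands in for the lip color degree D1 already computed per candidate.

```python
import math

def select_two_candidates(candidates, lip_degree, first_sigma=(10.0, 10.0)):
    """Pick the first selection candidate color as the one with the largest
    lip color degree D1, then the second as the one maximising
    D2 = D1 * Wd, where Wd grows with CbCr distance from the first candidate.
    candidates: list of (Cb, Cr); lip_degree: candidate -> D1."""
    first = max(candidates, key=lip_degree)
    sbd, srd = first_sigma  # assumed standard deviations of the first candidate
    def wd(c):
        d2 = ((c[0] - first[0]) / sbd) ** 2 + ((c[1] - first[1]) / srd) ** 2
        return 1.0 - math.exp(-d2)
    rest = [c for c in candidates if c != first]
    second = max(rest, key=lambda c: lip_degree(c) * wd(c))
    return first, second
```

Note how a candidate very close to the first one is suppressed by Wd even if its D1 is high, so the two selected colors tend to come from different facial parts.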
- the lip representative color specifying unit 19 specifies a selection candidate color that is more likely to be a lip color from the first and second selection candidate colors as the representative color of the lips (S11).
- the lip representative color specifying unit 19 specifies, as the representative color of the lips, one of the first and second selection candidate colors having higher saturation.
- the lip representative color specifying unit 19 may specify the luminance Y of the representative color of the lips, or may omit it; it is sufficient to specify at least the hue and saturation (or the Cb component and Cr component) of the representative color of the lips.
- the lip representative color specifying unit 19 may select one first selection candidate color according to the lip color degree D1 and specify it as the lip color. It may also select a plurality of first selection candidate colors according to the lip color degree D1 and specify the one with high saturation as the representative color of the lips. Alternatively, it may select a first selection candidate color according to the lip color degree D1, select a plurality of second selection candidate colors according to the candidate evaluation value D2, and specify, among the first selection candidate color and the plurality of second selection candidate colors, the one with the highest saturation as the representative color of the lips.
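- the final choice in S11 (pick the more saturated of the two selection candidate colors) can be sketched as follows, taking saturation as the distance of (Cb, Cr) from the assumed neutral point 128.

```python
def lip_representative_color(first, second):
    """Of the two selection candidate colors (Cb, Cr pairs), choose the one
    with higher saturation as the representative color of the lips (S11)."""
    def saturation2(c):
        # squared distance from the neutral point on the CbCr plane
        return (c[0] - 128) ** 2 + (c[1] - 128) ** 2
    return max((first, second), key=saturation2)
```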
- the lip representative color specifying unit 19 may specify a lip representative color that has a hue closest to a predetermined hue from the first and second selection candidate colors. Since it is often assumed that the lips are red, for example, the predetermined hue may be a hue close to red, which is a typical lip color.
- the lip area specifying unit 20 obtains the degree of similarity with the representative color of the lips for each pixel of the mouth image (S12). A color area similar to the representative color of the lips is considered to be a lip area.
- the lip area specifying unit 20 specifies an area similar to the representative color of the lips according to the difference in hue and saturation between the representative color of the lips and the color of each pixel. Specifically, for each pixel, the lip region specifying unit 20 determines the first lip color similarity We according to the distance between the color of each pixel and the representative color of the lips in the CbCr plane, and the color of each pixel. A second lip color similarity Wf corresponding to the hue difference from the representative color of the lips is obtained.
- the first lip color similarity We according to the distance in the CbCr plane can be obtained by the following equation.
- Cbl and Crl are the Cb component and Cr component of the representative color of the lips, respectively
- Cb and Cr are the Cb component and Cr component of the color of each pixel, respectively.
- σbl and σrl are the standard deviation of the lip color on the Cb axis and the standard deviation of the lip color on the Cr axis, respectively.
- the standard deviation of the lip color can be obtained from the color of each pixel in the region (the region divided by the candidate color specifying unit 18) corresponding to the representative color of the lips (the candidate color finally specified as the representative color of the lips).
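- Equation (7) is not reproduced in this text; the Gaussian form below is an assumption matching the description: We is near 1 when a pixel's color is close to the lip representative color on the CbCr plane and decays toward 0 as the distance grows, scaled by the lip color's standard deviations.

```python
import math

def similarity_we(cb, cr, cbl, crl, sbl, srl):
    """First lip color similarity We (assumed form of Equation (7)).
    (cb, cr): pixel color; (cbl, crl): lip representative color;
    sbl, srl: lip color standard deviations on the Cb and Cr axes."""
    d2 = ((cb - cbl) / sbl) ** 2 + ((cr - crl) / srl) ** 2
    return math.exp(-d2)
```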
- FIG. 12A shows the relationship between the distance between the color of each pixel and the representative color of the lips in the CbCr plane and the first lip color similarity We.
- FIG. 12B corresponds to FIG. 2B and is an image showing the result of calculating the first lip color similarity We by applying the color of each pixel of the mouth image to Equation (7).
- a bright part indicates that the first lip color similarity We is large, and a dark part indicates that the first lip color similarity We is small. According to this, it can be seen that the lip region has a large first lip color similarity We. However, since a part of the lip region is reflected by illumination or the like and has low saturation, the first lip color similarity We is small. Also, when there is a shadow or the like on the lips, the first lip color similarity We may be small.
- the second lip color similarity Wf corresponding to the hue difference can be obtained by the following equation.
- FIG. 13A is a diagram showing the relationship between the hue on the CbCr plane between the color of each pixel and the representative color of the lips and the second lip color similarity Wf. According to Equation (8), if the difference in hue between the color of each pixel and the representative color of the lips is small, the second lip color similarity Wf is close to 1, and the color of each pixel and the representative color of the lips As the hue difference increases, the second lip color similarity Wf decreases and approaches zero.
- FIG. 13B corresponds to FIG. 2B and is an image showing the result of calculating the second lip color similarity Wf by applying the color of each pixel of the mouth image to Equation (8).
- the second lip color similarity Wf corresponding to the difference in hue is less affected by illumination or the like than the first lip color similarity We, and a stable and accurate result can be obtained.
- the lips can be of various colors because lipstick or lip gloss is applied.
- when a lipstick having the same hue as the skin color is applied to the lips, it is difficult to accurately specify the lip area with the second lip color similarity Wf according to the difference in hue. Therefore, when the hue of the lip color is similar to the hue of the skin color, the first lip color similarity We can be a better index for determining the lip region.
- the lip region specifying unit 20 specifies lip candidate regions (lip candidate regions, first lip region) from the mouth image based on the first lip color similarity We and the second lip color similarity Wf. (S13).
- the lip candidate area can be said to be an area having a color similar to the representative color of the lips.
- the lip region specifying unit 20 determines pixels for which at least one of the first lip color similarity We and the second lip color similarity Wf has a large value to be the lip candidate region. Specifically, for each pixel, the lip region specifying unit 20 compares the first lip color similarity We with a predetermined threshold value and determines pixels whose first lip color similarity We is greater than the threshold value to be the lip candidate region, and likewise determines pixels whose second lip color similarity Wf is greater than a predetermined threshold value to be the lip candidate region.
- FIG. 14 is an image corresponding to (b) of FIG. 2 and showing pixels classified into lip candidate regions. A bright spot indicates a lip candidate area.
- the image shown in FIG. 14 corresponds to an image obtained by binarizing the image shown in FIG. 12B and the image shown in FIG. 13B.
- the lip region specifying unit 20 may specify the lip candidate region using only one of the first lip color similarity We and the second lip color similarity Wf.
- the lip region specifying unit 20 may specify a pixel having the first lip color similarity We greater than a threshold and the second lip color similarity Wf greater than another threshold as a lip candidate region.
- the image indicating the lip candidate region corresponds to an image obtained by binarizing and multiplying the image illustrated in FIG. 12B and the image illustrated in FIG. 13B.
- the lip region specifying unit 20 calculates the sum or product of the first lip color similarity We and the second lip color similarity Wf before binarization for each pixel, and determines the lip candidate region based on the result. You may specify.
- the lip area specifying unit 20 may specify a pixel for which 1 − (1 − We) × (1 − Wf) is larger than a predetermined threshold as the lip candidate area.
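- this combination rule can be sketched as below: a pixel becomes a lip candidate when 1 − (1 − We)(1 − Wf) exceeds a threshold, i.e. when at least one of the two similarities is high. The flat list representation and the threshold value are assumptions for illustration.

```python
def lip_candidate_mask(we_map, wf_map, threshold=0.5):
    """Classify each pixel as lip candidate (S13).
    we_map, wf_map: equal-length lists of per-pixel similarities in [0, 1]."""
    return [1 - (1 - we) * (1 - wf) > threshold
            for we, wf in zip(we_map, wf_map)]
```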
- the lip region specifying unit 20 may exclude from the lip candidate region a portion that can be clearly determined not to be a lip from the distribution of the lip candidate region.
- the lip area specifying unit 20 specifies the modeled lip area (second lip area) from the lip candidate areas of the mouth image (S14). There are various methods for specifying the modeled lip region: the boundary of the lip candidate region of the mouth image may be approximated by a model function (a higher-order function or the like), a lip shape model prepared in advance may be fitted to the spatial distribution of the lip candidate region, or the lip region may be specified by a segmentation technique based on a lip shape model prepared in advance.
- the lip shape model defines a lip-like shape by a function or a range, and may also be defined by a predetermined procedure that indicates the range of the lips.
- (a) and (b) of FIG. 15 are images corresponding to FIG. 14 and show the procedure for specifying the modeled lip region from the image indicating the lip candidate region.
- the x-axis is taken in the horizontal direction and the y-axis is taken in the vertical (vertical) direction.
- a modeled lip region is specified for the upper lip.
- the center x coordinate (x0) in the horizontal direction of the mouth is specified from the positions of the left and right mouth end points that are already known.
- the vertical center position (y coordinate y0) of the upper lip region is estimated from the upper end and lower end positions of the upper lip candidate region continuously distributed in the vertical direction. .
- a rectangle of a predetermined size is set as a search block with the coordinates (x0, y0) as the center position ((a) of FIG. 15).
- the search block preferably has a large size in the vertical direction so as to include the upper and lower ends of the upper lip candidate region.
- the search block is moved by ⁇ x to the mouth end side, and the search block is set with the coordinates (x1, y1) as the center position ((b) of FIG. 15).
- here, x1 = x0 + Δx.
- the processing may be continued until the search block reaches a predetermined position (for example, the mouth end point), or the processing may be continued until the position where the lip candidate region is interrupted.
- a curve indicating the vertical center position of the upper lip region can be obtained as a curve (a quadratic or higher-order curve) connecting the plurality of points indicating the vertical center positions of the upper lip region.
- a plurality of points indicating the center position in the vertical direction of the lip region of the lower lip can be obtained in the same manner.
- the lip region specifying unit 20 specifies a range of a predetermined width in the vertical direction around each point indicating the center position in the vertical direction of the lip region as a modeled lip region.
- a range of a predetermined width in the vertical direction centering on each point indicating the vertical center position of the lip region is a lip shape model representing a shape like a lip.
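- the per-column search inside each block can be sketched as follows: for one vertical column of the binary candidate mask, find the continuous run of candidate pixels and return its vertical center and width, which supply the center points and widths used above. The list-of-ints representation is an assumption.

```python
def column_center_and_width(mask_column):
    """Given one vertical column of the lip-candidate mask (list of 0/1,
    top to bottom), find the first continuous run of candidate pixels and
    return (vertical center, width).  Returns (None, 0) when the column
    holds no candidate pixel."""
    try:
        top = mask_column.index(1)          # upper end of the candidate run
    except ValueError:
        return None, 0
    bottom = top
    while bottom + 1 < len(mask_column) and mask_column[bottom + 1] == 1:
        bottom += 1                         # extend to the lower end
    return (top + bottom) / 2.0, bottom - top + 1
```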
- FIG. 16 is an image corresponding to (a) of FIG. 15 and showing the modeled lip region. Bright spots indicate lip areas.
- the distance from the upper end to the lower end of the lip candidate region continuously distributed in the vertical direction near the center in the horizontal direction of the mouth may be the vertical width of the modeled lip region.
- the lip region specifying unit 20 may obtain a curve connecting the points indicating the vertical center position of the lip region and specify a region having a predetermined width in the vertical direction of the curve as the modeled lip region.
- the lip region specifying unit 20 may specify the lip region based on each point indicating the vertical center position of the lip region so that the vertical width of the lip region becomes smaller toward the mouth end point side. In that case, the specified lip region has a more natural, model-like lip shape.
- the lip area specifying unit 20 specifies the lip area.
- the lip area specifying unit 20 may specify the pixels of the lip candidate area as the lip area.
- the lip area specifying unit 20 may specify, as the lip region, only the area that belongs to both the lip candidate area and the modeled lip area (the area obtained by multiplying the image of FIG. 14 and the image of FIG. 16). This completes the lip color specifying process and the lip area specifying process.
- lip gloss correction is performed on the specified lip region.
- the luminance of a partial region of the lips is increased, and the color change of the lip region is made smooth overall.
- a gloss image for adding luminance to some pixels in the lip region is prepared, and by superimposing (compositing) the gloss image on the mouth image in which the lip region has been smoothed, the gloss of part of the lips is enhanced.
- FIG. 17 is a flowchart showing the flow of correction processing in the image processing apparatus 6.
- the image correction unit 21 obtains a weight (correction weight Wg) for correcting the lip image for each pixel position of the lip region shown in FIG. 16 (S21). For example, the gloss enhancement correction is applied more strongly near the center of the lip region and more weakly near its periphery, thereby achieving natural-looking image correction.
- the image correction unit 21 sets the correction weight Wg for each of the upper and lower lips so that the weight is larger at the horizontal center position of the lip region and smaller toward the horizontal outer side (mouth end point side) of the lip region.
- the image correction unit 21 sets the correction weight Wg so that the weight is larger at the vertical center of the lip region and the weight is smaller at the vertical end of the lip region for each of the upper and lower lips.
- for example, the correction weight Wg is 1 at the horizontal center position and vertical center position of the lip region, and 0 at the horizontal center position and the vertical edges of the lip region.
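- the stated boundary values (1 at the horizontal and vertical center, 0 at the vertical edges, decreasing toward the mouth end points) can be realized by many functions; the separable linear falloff below is an assumed concrete shape, not the patent's exact definition of Wg.

```python
def correction_weight(x, y, x0, y_center, half_width_x, half_width_y):
    """Correction weight Wg at pixel (x, y) for a lip whose horizontal
    center is x0 and whose vertical center at this column is y_center
    (assumed tent-shaped realisation of S21)."""
    wx = max(0.0, 1.0 - abs(x - x0) / half_width_x)       # horizontal falloff
    wy = max(0.0, 1.0 - abs(y - y_center) / half_width_y)  # vertical falloff
    return wx * wy
```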
- FIG. 18 is a diagram showing the correction weight Wg at each horizontal position.
- the vertical axis represents the position in the vertical direction of the lip region, and the horizontal axis represents the correction weight Wg at each position.
- the left graph corresponds to the horizontal center position of the lip region, and the right graph corresponds to a horizontal position nearer the end of the lip region.
- the correction weight Wg is large at the center of the width of the lip region.
- FIG. 19 corresponds to FIG. 16 and is an image showing the correction weight Wg. A bright part indicates that the correction weight Wg is large. It can be seen that the correction weight Wg shown in FIG. 19 is large near the center of the lip region shown in FIG. 16. Note that this processing may be performed by the lip region specifying unit 20, and the lip region specifying unit 20 may specify a binarized version of the image shown in FIG. 19 as the lip region.
- the image correction unit 21 specifies a gloss correction area, in which the gloss (luminance) of the lip region is to be corrected, based on the correction weight Wg, the lip candidate region, the first lip color similarity We, and the second lip color similarity Wf (S22). Specifically, for each pixel, the image correction unit 21 obtains the product of the correction weight Wg (FIG. 19), the lip candidate region (FIG. 16), the first lip color similarity We (FIG. 12B), and the second lip color similarity Wf (FIG. 13B) as the correction partial evaluation value D3.
- FIG. 20 corresponds to FIG. 19 and is an image showing the corrected partial evaluation value D3. A bright part indicates that the correction part evaluation value D3 is large.
- the image correction unit 21 specifies an area (pixel) in which the correction partial evaluation value D3 is larger than a predetermined threshold as a gloss correction area that is a target for gloss correction.
- in the gloss correction area, correction is performed so that the luminance becomes higher at some of the brightest portions (areas).
- specifically, the luminance is increased for pixels around the pixel having the maximum luminance in the gloss correction region.
- to this end, a gloss image for adding luminance to some pixels of the lip region is prepared.
- the image correction unit 21 extracts only the pixels included in the gloss correction area from the smoothed mouth image (S23).
- FIG. 21 is an image corresponding to (b) of FIG. 2 and showing the luminance of the extracted pixel. A bright part indicates that the extracted pixel has a high luminance, and a dark part indicates that the extracted pixel has a low luminance. In addition, the position of the pixel which is not extracted is shown darkly.
- the image correction unit 21 multiplies the luminance value of each pixel by a concentric weight around the pixel having the maximum luminance among the extracted pixels. The concentric weight is 1 at the center and decreases as the distance from the center increases. In FIG. 21, the pixel having the maximum luminance is located at the center of the circle shown in the figure.
- FIG. 22 is a histogram of the luminance value of each extracted pixel as a result of weighting the luminance value of the extracted pixel.
- the image correction unit 21 generates a gloss image in which the luminance is increased for a predetermined ratio of the extracted pixels having high weighted luminance values (S24).
- FIG. 23 corresponds to FIG. 21 and is an image showing a glossy image having only a luminance component.
- in the gloss image, the luminance is large around the pixel having the maximum luminance.
- for example, the image correction unit 21 generates a gloss image in which the luminance is increased for the pixels whose weighted luminance values are in the top 4%. That is, the tone curve is adjusted for the image shown in FIG. 21 so that the luminance values of the pixels included in the top 4% of the histogram shown in FIG. 22 are increased.
- FIG. 24A is a diagram showing a first tone curve for each extracted pixel (FIG. 21), and FIG. 24B is a diagram showing a second tone curve for each extracted pixel.
- the horizontal axis indicates the input luminance
- the vertical axis indicates the output luminance by the tone curve
- the histogram of FIG. 22 is overlaid for reference.
- the luminance value takes a value from 0 to 255.
- in the first tone curve, with respect to the weighted luminance value of each extracted pixel, the output luminance of all pixels except those whose luminance values are in the top 4% is corrected to 0.
- the output luminance of the pixels whose luminance values are in the top 4% is corrected so that it changes linearly from 0 to 128 according to the luminance value, with the pixel having the highest luminance value mapped to 128.
- in the second tone curve, with respect to the weighted luminance value of each extracted pixel, the output luminance changes linearly so that the maximum input luminance 255 becomes the maximum output luminance 127.
- an image obtained by adding the outputs of the two tone curves (maximum luminance 255) and multiplying the result by a predetermined correction degree (for example, 0.1) is generated as the gloss image (FIG. 23).
- with this correction degree, the maximum luminance of the gloss image is about 25, for example.
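- the two tone curves and the final scaling can be sketched as below. Interpolation details between the stated endpoints (0 at the cutoff, 128 at the maximum for the first curve; 255 to 127 linearly for the second) are assumptions where the text is silent.

```python
def gloss_luminance(weighted_lums, top_ratio=0.04, degree=0.1):
    """Build the gloss-image luminance for each extracted pixel from its
    concentric-weighted luminance (S24).  First tone curve: pixels outside
    the top `top_ratio` map to 0, the top ones map linearly up to 128.
    Second tone curve: linear, 255 -> 127.  The sum is scaled by the
    correction degree."""
    n = len(weighted_lums)
    cutoff = sorted(weighted_lums)[max(0, int(n * (1.0 - top_ratio)))]
    top = max(weighted_lums)
    out = []
    for lum in weighted_lums:
        if lum >= cutoff and top > cutoff:
            first = 128.0 * (lum - cutoff) / (top - cutoff)  # first tone curve
        else:
            first = 0.0
        second = 127.0 * lum / 255.0                          # second tone curve
        out.append((first + second) * degree)
    return out
```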
- the luminance value of the glossy image is a luminance correction value for the mouth image.
- the image correction unit 21 combines the normalized mouth image shown in FIG. 2A, the smoothed mouth image shown in FIG. 2B, and the gloss image shown in FIG. 23 to generate a corrected mouth image (S25). Specifically, for each pixel, the image correction unit 21 uses, as the luminance value of the corrected mouth image, the sum of the luminance value of the smoothed mouth image (FIG. 2B) multiplied by the correction weight Wg shown in FIG. 19, the luminance value of the normalized mouth image (FIG. 2A) multiplied by the weight (1 − Wg), and the luminance value of the gloss image (FIG. 23).
- FIG. 25 is a diagram illustrating a synthesis process of a normalized mouth image, a smoothed mouth image, and a glossy image.
- The luminance value Yc of each pixel of the corrected mouth image is obtained by the following equation: Yc = Wg × Yf + (1 − Wg) × Yn + Yg
- Yf represents the luminance value of the pixel of the smoothed mouth image
- Yn represents the luminance value of the pixel of the normalized mouth image
- Yg represents the luminance value of the pixel of the glossy image
- the correction weight Wg takes a value from 0 to 1.
- When the luminance value Yc resulting from the addition exceeds the maximum luminance of 255, the luminance value of that pixel is set to 255.
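The per-pixel blend described above can be sketched as follows. This is an illustrative NumPy sketch under the stated weighting; the function name and array-based signature are assumptions.

```python
import numpy as np

def blend_corrected_luma(y_norm, y_smooth, y_gloss, w_g):
    """Per-pixel luminance of the corrected mouth image:
    Yc = Wg*Yf + (1 - Wg)*Yn + Yg, clipped at the maximum luminance 255."""
    y_norm = np.asarray(y_norm, dtype=np.float64)      # Yn: normalized mouth image
    y_smooth = np.asarray(y_smooth, dtype=np.float64)  # Yf: smoothed mouth image
    y_gloss = np.asarray(y_gloss, dtype=np.float64)    # Yg: glossy image
    w_g = np.asarray(w_g, dtype=np.float64)            # correction weight Wg in [0, 1]
    yc = w_g * y_smooth + (1.0 - w_g) * y_norm + y_gloss
    return np.clip(yc, 0.0, 255.0)
```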
- The synthesizing unit 22 returns the corrected mouth image to its original size before normalization (rotating and enlarging or reducing the corrected mouth image as necessary) and combines it with the image to be processed (the face image) to generate a corrected image (S26). In this way, an image in which the appearance of the lips and the like in the processing target image has been corrected is obtained.
- FIG. 26A is an image showing a part of the face image before correction.
- FIG. 26B is an image showing a part of the face image after correction. It can be seen that the appearance of the entire upper and lower lips is smoother and that the gloss of part of the lower lip is enhanced.
- In FIG. 26B, the tooth whitening correction described in the second embodiment is also applied.
- the display control unit 23 displays the corrected image on the display device 5 and ends the correction process.
- According to the present embodiment, the representative color of the lips, which can be any of various colors, is specified based on the hue and saturation information, excluding luminance, of the mouth image including the lips and the skin.
- The lip region can then be accurately specified based on the specified representative color of the lips. By performing correction processing on the specified lip region, appropriate correction processing can be applied to the lips of the person in the image.
- the method based on luminance can be applied to grayscale images.
- the Y component of the mouth image has a clearer edge than the Cb component and the Cr component, and the use of the Y component makes it easier to detect edges such as lips.
- FIG. 27A corresponds to FIG. 2A and is an image showing the value of the Cb component of the normalized mouth image.
- FIG. 27B is an image showing the value of the corresponding Cr component of the normalized mouth image shown in FIG. 2A. A bright portion indicates that the value of each component is large.
- Comparing FIGS. 27A and 27B, which show the CbCr components, with FIG. 2A, which shows the luminance Y, it can be seen that the image showing the luminance Y makes the lip edges clearer and easier to distinguish.
- Conventionally, the lip region and the lip color have been specified based on luminance.
- However, when the lighting conditions are poor (for example, partial shadows or excessively strong lighting),
- specifying the lip region and the lip color based on luminance as in the past causes false detection,
- and the lip region and the lip color may not be accurately specified.
- Nor has the lip region conventionally been specified based on hue or saturation.
- In contrast, the image processing apparatus 6 of the present embodiment can specify the representative color of the lips and then specify the lip region even when neither the lip color nor the lip region is known in advance.
- The image processing apparatus 6 specifies, as the representative color of the lips, a candidate color that differs greatly from the skin color and the tooth color in hue and saturation. Therefore, even when the skin and lips in the image have the same hue due to shadows, lighting, makeup, or the like, or when the skin color in the image varies greatly from place to place, the lips can be accurately distinguished from the skin and teeth, and the representative color of the lips can be accurately specified.
- The difference in hue and saturation refers to a difference in the hue saturation plane (CbCr plane) between two colors, and includes a difference in hue, a difference in saturation, a distance in the hue saturation plane, and the like.
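The CbCr-plane quantities named here (distance, hue difference, saturation difference) can be computed as angles and radii around the CbCr origin. A minimal sketch, assuming colors are given as (Cb, Cr) pairs with components centered on 0:

```python
import math

def cbcr_metrics(c1, c2):
    """Differences between two colors in the CbCr (hue-saturation) plane.

    Each color is a (Cb, Cr) pair. Returns (distance in the plane,
    hue difference in radians, saturation difference)."""
    cb1, cr1 = c1
    cb2, cr2 = c2
    dist = math.hypot(cb1 - cb2, cr1 - cr2)   # Euclidean distance in the plane
    hue1 = math.atan2(cr1, cb1)               # hue = angle around the origin
    hue2 = math.atan2(cr2, cb2)
    dhue = abs(hue1 - hue2)
    dhue = min(dhue, 2 * math.pi - dhue)      # wrap angle difference to [0, pi]
    dsat = abs(math.hypot(cb1, cr1) - math.hypot(cb2, cr2))  # saturation = radius
    return dist, dhue, dsat
```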
- The image processing apparatus 6 selects a plurality of candidate colors according to their differences in hue and saturation from the skin color, and selects, from among them, the candidate color most likely to be a lip color (for example, one with higher saturation) as the representative color. Therefore, the representative color of the lips can be specified more accurately.
- the image processing device 6 identifies a region similar to the representative color of the lips as the lip region according to the difference in hue and saturation between the representative color of the lips and each pixel.
- Due to shine at the time of photographing, part of the lips may become whitish.
- The glossy area where the lips become whitish has low saturation, while the other areas of the lips have high saturation, so the distance in the CbCr plane between the color of the glossy area and the colors of the other lip areas becomes large.
- However, since the hue of the glossy region does not change, the glossy region can be accurately determined to be part of the lip region by taking the hue into consideration.
- In addition, the lip region can be accurately specified by considering the distance in the CbCr plane between the representative color of the lips and the color of each pixel.
- The image processing device 6 fits a predetermined lip shape model to the spatial distribution of the pixels similar to the representative color of the lips (the pixels of the lip candidate region), and specifies the pixels at lip-like positions as the modeled lip region.
- Even pixels whose hue and saturation information has been lost by shining white at the time of photographing can be specified as lips if they are included in the lip shape model. Therefore, a lip region in which hue and saturation information is lost due to shine can be accurately identified as a lip region.
- According to the image processing device 6 of the present embodiment, it is thus possible to accurately specify the representative color of the lips and to specify the lip region even for an image shot under poor conditions.
- FIG. 28 is a block diagram showing a schematic configuration of the digital camera 30 according to the present embodiment.
- the digital camera 30 includes an instruction input device 2, an imaging device 3, an image storage device 4, a display device 5, and an image processing device 31.
- The image processing device 31 includes, in addition to the configuration of the image processing device 6 of the first embodiment, a mouth inner region specifying unit 32, a tooth candidate color specifying unit 33, and a tooth representative color specifying unit 34.
- The mouth inner region specifying unit 32 receives information indicating the lip regions from the lip region specifying unit 20 and specifies the region between the upper lip region and the lower lip region specified by the lip region specifying unit 20 as the mouth inner region.
- Since the lip region specifying unit 20 can accurately specify the upper and lower lip regions, the mouth inner region, which includes the teeth, can thereby be specified in the mouth image. If there is no gap between the upper and lower lip regions, the tooth whitening correction is not performed.
- the mouth inner region specifying unit 32 outputs information indicating the specified mouth inner region to the tooth candidate color specifying unit 33.
- the tooth candidate color specifying unit 33 specifies a plurality of tooth candidate colors that are tooth color candidates based on the information indicating the mouth inner region and the smoothed mouth image received from the smoothing unit 16.
- the tooth candidate color specifying unit 33 specifies a representative color of each region from a plurality of regions included in the mouth inner region of the mouth image, and sets it as a tooth candidate color.
- Specifically, the tooth candidate color specifying unit 33 divides the mouth image into a plurality of regions and specifies the representative colors (average color, median color, mode color, or the like) of the regions included in the mouth inner region as a plurality of candidate colors for the tooth color (tooth candidate colors). At least one of the regions divided in this way is considered to be a region mainly containing teeth.
- Note that the tooth candidate color specifying unit 33 specifies the tooth candidate colors using the smoothed mouth image. However, the present invention is not limited to this; the tooth candidate color specifying unit 33 may specify the tooth candidate colors using a mouth image that has not been smoothed.
- the tooth candidate color specifying unit 33 obtains the degree of color dispersion of the divided areas as the degree of dispersion of the corresponding tooth candidate color.
- the tooth candidate color specifying unit 33 outputs a plurality of tooth candidate colors and the degree of dispersion of the tooth candidate colors to the tooth representative color specifying unit 34. Further, the tooth candidate color specifying unit 33 outputs the degree of dispersion of the tooth candidate color to the image correcting unit 21.
- The tooth representative color specifying unit 34 specifies, from among the plurality of tooth candidate colors, the candidate color with the highest degree of being a tooth color as the tooth representative color. Specifically, the tooth representative color specifying unit 34 specifies the tooth candidate color with the lowest saturation as the tooth representative color. The tooth representative color specifying unit 34 outputs the tooth representative color to the image correction unit 21.
- The image correction unit 21 corrects the appearance of the mouth image based on the correction processing instruction, the normalized mouth image, the smoothed mouth image, the representative color of the lips, and the representative color of the teeth, and generates a corrected mouth image. The method of correcting the mouth image will be described later.
- the image correction unit 21 outputs the corrected mouth image to the synthesis unit 22.
- FIG. 29 is a flowchart showing the flow of the intraoral area specifying process and the correction process (tooth whitening process) in the image processing apparatus 31.
- the image processing device 31 corrects the teeth in the image so that they appear to shine white by increasing the luminance of the tooth region.
- The mouth inner region specifying unit 32 specifies the region between the upper lip region and the lower lip region specified by the lip region specifying unit 20 as the mouth inner region (S31). Specifically, the region sandwiched between the upper and lower lip regions is specified as the mouth inner region, which is considered to contain the tooth region of the mouth image.
- The tooth candidate color specifying unit 33 divides at least a part of the mouth inner region of the mouth image into a plurality of regions and specifies the representative color of each divided region as one of a plurality of candidate colors for the tooth color (tooth candidate colors) (S32).
- For example, the average color of each region is used as a tooth candidate color.
- the tooth representative color specifying unit 34 compares the saturation of each tooth candidate color, and specifies the tooth candidate color having the lowest saturation as the representative tooth color (S33).
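Steps S32–S33 can be sketched as follows. The division scheme (equal vertical strips) and the function name are assumptions for illustration; the patent only requires dividing the mouth inner region into a plurality of regions and picking the least-saturated average color.

```python
import numpy as np

def tooth_representative_color(mouth_ycbcr: np.ndarray, n_cols: int = 4):
    """Split a mouth-inner region into vertical strips (an assumed division
    scheme), take each strip's average YCbCr color as a tooth candidate color,
    and pick the candidate with the lowest saturation as the representative.

    mouth_ycbcr: H x W x 3 array of (Y, Cb, Cr), with Cb/Cr centered on 0."""
    strips = np.array_split(mouth_ycbcr, n_cols, axis=1)
    candidates = [s.reshape(-1, 3).mean(axis=0) for s in strips]
    # Saturation in the CbCr plane is the radius sqrt(Cb^2 + Cr^2);
    # teeth are nearly achromatic, so the least-saturated candidate wins.
    sat = [np.hypot(c[1], c[2]) for c in candidates]
    return candidates[int(np.argmin(sat))]
```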
- The image correction unit 21 obtains a weight (correction weight Wh) for correcting the tooth image for each pixel position in the mouth inner region (S34). For example, the parts of the mouth inner region near the mouth end points are considered to often be shaded inside the mouth. For this reason, the correction processing for whitening the teeth may be concentrated near the horizontal center of the mouth inner region. This results in natural-looking image correction.
- the image correction unit 21 sets the correction weight Wh so that the weight is larger at the center position in the horizontal direction of the mouth inner region and is smaller toward the outer side of the mouth inner region in the horizontal direction (mouth end point side).
- the correction weight is 1 at the horizontal center position of the mouth inner region, and the correction weight is 0 at the outer edge of the mouth inner region in the horizontal direction.
- Between these, the correction weight may be varied linearly, for example.
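The linear weight described above can be sketched per column. This is a minimal sketch; the function name and 0-indexed column convention are assumptions.

```python
def horizontal_weight(x: int, width: int) -> float:
    """Correction weight Wh for column x in a mouth-inner region of the given
    width: 1 at the horizontal center, falling linearly to 0 at both edges."""
    if width <= 1:
        return 1.0
    center = (width - 1) / 2.0
    return 1.0 - abs(x - center) / center
```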
- FIG. 30 is an image corresponding to FIG. 16 and showing the mouth inner region and the correction weight Wh of the mouth inner region. A bright spot indicates that the correction weight Wh is large.
- the image correction unit 21 obtains the degree of similarity with the representative tooth color for each pixel in the mouth inner region (S35). Specifically, the image correction unit 21 calculates the tooth color similarity Wi according to the distance in the color space between the color of each pixel and the representative color of the tooth.
- The tooth color similarity Wi according to the distance in the color space can be obtained by the following equation (10): Wi = exp{−[(Y − Yt)² / σyt² + (Cb − Cbt)² / σbt² + (Cr − Crt)² / σrt²]}
- Yt, Cbt, and Crt are the luminance Y component, Cb component, and Cr component of the representative color of the teeth, respectively.
- Y, Cb, and Cr are the Y component, Cb component, and Cr component of the color of each pixel, respectively.
- σyt, σbt, and σrt are the standard deviations of the tooth color on the Y axis, the Cb axis, and the Cr axis of the color space, respectively.
- The standard deviation of the tooth color can be obtained from the colors of the pixels in the region corresponding to the tooth representative color, that is, the region, among those divided by the tooth candidate color specifying unit 33, whose tooth candidate color was finally specified as the tooth representative color.
- Note that a dark pixel has low saturation, as does a whitish pixel in the tooth region. Therefore, when obtaining the tooth color similarity Wi, the luminance Y is taken into consideration so that dark pixels have a lower tooth color similarity Wi.
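Since the text only describes the inputs of Equation (10) (per-axis distances scaled by the tooth color's standard deviations, including luminance), the following is a plausible Gaussian-style reading of it, not a verbatim transcription: Wi is 1 for the representative tooth color and decays toward 0 with scaled distance in (Y, Cb, Cr).

```python
import math

def tooth_similarity(pixel, rep, sigmas):
    """Hypothetical Gaussian-style tooth color similarity Wi.

    pixel, rep: (Y, Cb, Cr) of the pixel and of the tooth representative color.
    sigmas: per-axis standard deviations of the tooth color (sigma_yt,
    sigma_bt, sigma_rt). Including the Y term makes dark pixels score low."""
    y, cb, cr = pixel
    yt, cbt, crt = rep
    sy, sb, sr = sigmas
    d2 = ((y - yt) / sy) ** 2 + ((cb - cbt) / sb) ** 2 + ((cr - crt) / sr) ** 2
    return math.exp(-d2)   # Wi in (0, 1], largest at the representative color
```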
- FIG. 31 is an image corresponding to FIG. 2B and showing the result of calculating the tooth color similarity Wi by applying the color of each pixel of the mouth image to Equation (10).
- a bright portion indicates that the tooth color similarity Wi is large, and a dark portion indicates that the tooth color similarity Wi is small. According to this, it can be seen that the tooth region has a large tooth color similarity Wi.
- the tooth color similarity Wi is small in areas other than the teeth in the mouth inner area, that is, the gums, the back of the mouth, the tongue, and the like.
- In the following processing, a tooth gloss image for adding luminance to the pixels of the tooth region is prepared.
- The image correction unit 21 generates a tooth gloss image for the mouth inner region based on the correction weight Wh, the tooth color similarity Wi, the first lip color similarity We, and the second lip color similarity Wf (S36). Specifically, for each pixel, the product of the correction weight Wh of the mouth inner region and the tooth color similarity Wi is multiplied by (1 − We) and (1 − Wf), and the result is further multiplied by a predetermined correction degree (for example, 20); the resulting image is generated as the tooth gloss image.
- FIG. 32 is an image corresponding to (b) of FIG. 12 and showing the value of (1-We) of each pixel.
- FIG. 33 is an image corresponding to (b) of FIG. 13 and showing the value of (1-Wf) of each pixel.
- FIG. 34 corresponds to FIG. 31 and is an image showing a tooth gloss image. A bright spot indicates a large value.
- the correction weight Wh, the tooth color similarity Wi, the first lip color similarity We, and the second lip color similarity Wf of the mouth inner region take values from 0 to 1. For example, when the correction degree is 20, each pixel of the tooth gloss image takes a value from 0 to 20.
- the value of each pixel of the tooth gloss image is a luminance correction value when correcting the mouth image. Note that by taking the product of the correction weight Wh of the mouth inner region and the tooth color similarity Wi, the correction weight near the center in the horizontal direction of the region that is the tooth of the mouth inner region can be increased.
- When the mouth is open in the image but the teeth are not visible, the color of the tongue may be specified as the representative color of the teeth.
- The color of the tongue is considered to be relatively similar to the color of the lips. Even in such a case, by taking (1 − We) and (1 − Wf) into account so that the color of the tongue is not brightened by the correction, it is possible to avoid correcting regions whose hue and saturation are similar to those of the lips (for example, the tongue).
- The image correction unit 21 combines, for each pixel, the luminance values of the normalized mouth image shown in FIG. 2A and the tooth gloss image shown in FIG. 34, and generates a corrected mouth image (S37). Specifically, the image correction unit 21 uses, for each pixel, the sum of the luminance value of the normalized mouth image (FIG. 2A) and the luminance value of the tooth gloss image (FIG. 34) as the luminance value of the corrected mouth image. When the luminance value resulting from the addition exceeds the maximum luminance of 255, the luminance value of that pixel is set to 255.
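Steps S36 and S37 can be sketched together as follows, assuming all weights are per-pixel arrays in [0, 1]; the function name and array-based signature are assumptions.

```python
import numpy as np

def whiten_teeth_luma(y_norm, w_h, w_i, w_e, w_f, correction: float = 20.0):
    """Tooth whitening: build the tooth gloss image as
    Wh * Wi * (1 - We) * (1 - Wf) * correction, then add it to the
    normalized mouth image's luminance and clip at 255."""
    gloss = (np.asarray(w_h) * np.asarray(w_i)
             * (1.0 - np.asarray(w_e)) * (1.0 - np.asarray(w_f)) * correction)
    return np.clip(np.asarray(y_norm, dtype=np.float64) + gloss, 0.0, 255.0)
```

The (1 − We) and (1 − Wf) factors zero out the gloss on lip-like colors, which is how tongue pixels are protected from brightening.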
- the synthesizing unit 22 synthesizes the corrected mouth image with the processing target image (face image).
- the subsequent processing is the same as in the first embodiment.
- When FIG. 26A and FIG. 26B are compared, it can be seen in the image showing a part of the corrected face image in FIG. 26B that the brightness of the teeth, particularly in the central region in the horizontal direction, has increased.
- According to the present embodiment, the mouth inner region can be accurately specified based on the lip region. Then, by performing correction processing on regions within the mouth inner region that have a high degree of being teeth, appropriate correction processing can be applied to the person's teeth in the image.
- the tooth gloss image shown in FIG. 34 shows a tooth region.
- the tooth region can be specified accurately.
- An image processing apparatus according to the present invention is an image processing apparatus that specifies characteristics of lips from a face image including a person's mouth, and includes: a skin representative color specifying unit that specifies a representative color of the skin of the face image; a candidate color specifying unit that sets a plurality of regions in the face image so that at least one region includes a part of the lips and specifies the representative color of each region as a candidate color; and a lip representative color specifying unit that specifies a representative color of the lips from the plurality of candidate colors according to the difference in hue and saturation between the representative color of the skin and each candidate color.
- An image processing method according to the present invention is an image processing method for specifying characteristics of lips from a face image including a person's mouth, and includes: a skin representative color specifying step of specifying a representative color of the skin of the face image; a candidate color specifying step of setting a plurality of regions in the face image so that at least one region includes a part of the lips and specifying the representative color of each region as a candidate color; and a lip representative color specifying step of specifying a representative color of the lips from the plurality of candidate colors according to the difference in hue and saturation between the representative color of the skin and each candidate color.
- According to the above configuration, the representative color of the lips is identified from the candidate colors based on differences in hue and saturation in the face image including the lips and the skin. Therefore, the representative color of the lips, which can be any of various colors, can be accurately specified and distinguished from the skin color.
- the difference in hue and saturation refers to a difference in the hue saturation plane of two colors, and includes a difference in hue, a difference in saturation, a distance in the hue saturation plane, and the like.
- The lip representative color specifying unit may specify the representative color of the lips according to the difference in hue and saturation, excluding luminance or lightness, between the representative color of the skin and each candidate color.
- If the lip region and the lip color are specified based on luminance as in the past, false detection may occur, and the lip region and the lip color may not be accurately specified in some cases.
- According to the above configuration, the representative color of the lips is specified according to the difference in hue and saturation without using luminance or lightness information. Therefore, even when the skin and lips in the image have the same hue due to shadows, lighting, makeup, or the like, or when the skin color in the image varies greatly from place to place, the skin and the lips can be accurately distinguished and the representative color of the lips can be accurately identified.
- The lip representative color specifying unit may specify the representative color of the lips according to the distance in the hue saturation plane of the color space between each candidate color and the representative color of the skin, the difference in hue between each candidate color and the representative color of the skin, and the saturation level of each candidate color.
- The lip representative color specifying unit may obtain, for each candidate color, a first weight according to the distance in the hue saturation plane of the color space between the candidate color and the representative color of the skin, a second weight according to the difference in hue between the candidate color and the representative color of the skin, and a third weight according to the saturation level of the candidate color; obtain a lip color degree based on the first weight, the second weight, and the third weight; and specify the candidate color having a large lip color degree as the representative color of the lips. The first weight increases as the distance in the hue saturation plane of the color space between the candidate color and the skin representative color increases, the second weight increases as the difference in hue between the candidate color and the skin representative color increases, and the third weight increases as the saturation of the candidate color increases. The lip color degree may be made larger in accordance with the first weight, the second weight, and the third weight.
- According to the above configuration, the skin and the lips can be distinguished by the distance in the hue saturation plane between each candidate color and the representative color of the skin and by the difference in hue between each candidate color and the representative color of the skin. Further, the teeth and the lips can be distinguished according to the saturation of each candidate color. Therefore, the representative color of the lips can be accurately identified by distinguishing the skin and teeth from the lips.
- The lip representative color specifying unit may obtain, for each candidate color, a first weight according to the distance in the hue saturation plane of the color space between the candidate color and the representative color of the skin, a second weight according to the difference in hue between the candidate color and the representative color of the skin, and a third weight according to the saturation level of the candidate color; obtain a lip color degree based on the first weight, the second weight, and the third weight; and select the candidate color having a large lip color degree as a first selection candidate color. The first weight increases as the distance in the hue saturation plane of the color space between the candidate color and the skin representative color increases, the second weight increases as the difference in hue between the candidate color and the skin representative color increases, the third weight increases as the saturation of the candidate color increases, and the lip color degree is made larger in accordance with the first weight, the second weight, and the third weight.
- The lip representative color specifying unit may further obtain, for each candidate color excluding the first selection candidate color, a fourth weight according to the distance in the hue saturation plane of the color space between the candidate color and the first selection candidate color; obtain a candidate evaluation value based on the lip color degree and the fourth weight; and select, from the candidate colors excluding the first selection candidate color, at least one candidate color having a large candidate evaluation value as a second selection candidate color. The fourth weight increases as the distance in the hue saturation plane of the color space between the candidate color and the first selection candidate color increases.
- From among the candidate colors selected as the first selection candidate color or the second selection candidate color, the lip representative color specifying unit may specify, as the representative color of the lips, the candidate color having the highest saturation or the hue closest to a predetermined hue.
- According to the above configuration, a candidate color having a large difference from the skin color is selected as the first selection candidate color according to the distance in the hue saturation plane between each candidate color and the representative color of the skin, the difference in hue between each candidate color and the representative color of the skin, and the saturation of each candidate color. Further, from the candidate colors excluding the first selection candidate color, a candidate color having a large difference from the first selection candidate color is selected as the second selection candidate color according to the distance in the hue saturation plane between that candidate color and the first selection candidate color. Of the candidate colors selected as the first selection candidate color or the second selection candidate color, the candidate color having the highest saturation or the hue closest to the predetermined hue is specified as the representative color of the lips.
- In this way, a plurality of candidate colors are selected according to their differences in hue and saturation from the skin color, and the candidate color that seems most likely to be a lip color (for example, one whose saturation is high or whose hue is close to red) is selected as the representative color of the lips. Therefore, even if the skin and the lips are similar in color, it is possible to prevent the tooth color from being mistakenly set as the representative color of the lips, to distinguish the skin and teeth from the lips, and to specify the representative color of the lips more accurately.
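The two-stage selection can be sketched as follows. The exponential weight formulas below are illustrative assumptions: the text fixes only their monotonic behavior (each weight grows with its distance, hue difference, or saturation), not their exact form, and the final pick here uses the highest-saturation rule.

```python
import math

def pick_lip_representative(candidates, skin):
    """Two-stage lip representative color selection sketch.

    candidates and skin are (Cb, Cr) pairs with components centered on 0."""
    def sat(c):
        return math.hypot(c[0], c[1])                 # saturation = radius

    def hue(c):
        return math.atan2(c[1], c[0])                 # hue = angle

    def hue_diff(a, b):
        d = abs(hue(a) - hue(b))
        return min(d, 2 * math.pi - d)

    def lip_degree(c):
        # First weight: grows with CbCr distance from the skin color.
        w1 = 1 - math.exp(-math.hypot(c[0] - skin[0], c[1] - skin[1]) / 20)
        # Second weight: grows with hue difference from the skin color.
        w2 = hue_diff(c, skin) / math.pi
        # Third weight: grows with saturation (rules out near-gray teeth).
        w3 = 1 - math.exp(-sat(c) / 20)
        return w1 * w2 * w3

    # First selection: the candidate with the largest lip color degree.
    first = max(candidates, key=lip_degree)
    rest = [c for c in candidates if c is not first]

    # Second selection: among the rest, also favor distance from the first.
    def evaluation(c):
        w4 = 1 - math.exp(-math.hypot(c[0] - first[0], c[1] - first[1]) / 20)
        return lip_degree(c) * w4

    second = max(rest, key=evaluation) if rest else first
    # Final pick: the more saturated of the selected candidates.
    return max([first, second], key=sat)
```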
- The image processing apparatus may further include a lip region specifying unit that specifies, as a first lip region, a region whose color is similar to the representative color of the lips, according to the difference in hue and saturation from the representative color of the lips.
- Due to reflection of light, part of the lips may become whitish, and a partially bright area may appear on the lips.
- With the above configuration, even such a partially bright region can be accurately determined to be a lip region by taking the hue into consideration.
- In addition, by considering the distance in the hue saturation plane between the representative color of the lips and the color of each region (for example, each pixel), the lip region can be accurately specified. Therefore, the lip region can be accurately specified based on the specified representative color of the lips.
- The lip region specifying unit may specify, as the first lip region, a region whose distance from the representative color of the lips in the hue saturation plane of the color space is equal to or less than a first threshold and/or whose difference in hue from the representative color of the lips is equal to or less than a second threshold.
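The threshold test above can be sketched per pixel. The threshold values here are illustrative placeholders, not values from the patent, and the "or" combination is one of the claimed "and/or" variants.

```python
import math

def is_first_lip_region(pixel, lip_rep, dist_thresh=15.0, hue_thresh=0.3):
    """Classify a (Cb, Cr) pixel as first-lip-region when its CbCr-plane
    distance to the lip representative color is within a first threshold,
    or its hue difference is within a second threshold (in radians).
    The hue-only branch keeps whitish glossy lip pixels in the region."""
    dist = math.hypot(pixel[0] - lip_rep[0], pixel[1] - lip_rep[1])
    dhue = abs(math.atan2(pixel[1], pixel[0]) - math.atan2(lip_rep[1], lip_rep[0]))
    dhue = min(dhue, 2 * math.pi - dhue)
    return dist <= dist_thresh or dhue <= hue_thresh
```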
- The lip region specifying unit may fit a predetermined lip shape model, which defines the shape of the lips, to the first lip region, and may specify the region indicated by the fitted lip shape model as a second lip region.
- the first lip region which is a region similar to the representative color of the lips, may include pixels (noise) that are similar to the representative color of the lips in the skin or mouth but are not lips.
- According to the above configuration, the shape and/or position of the lip shape model, which defines a lip-like shape, is fitted to the first lip region, the region whose color is similar to the representative color of the lips. The region indicated by the fitted lip shape model is a set of pixels that are likely to be lips, excluding the noise pixels. Therefore, only pixels at more lip-like positions can be specified as the (second) lip region.
- a lip region in which hue / saturation information is lost due to shining white at the time of photographing can be accurately identified as a lip region.
- The lip region specifying unit may specify, based on the first lip region, a center point in the vertical direction of the upper lip or the lower lip at each of a plurality of horizontal positions, and may specify a predetermined vertical range around each center point as the second lip region.
- the center point in the vertical direction of the upper lip or the lower lip can be specified, and a more lip-like area can be specified as the second lip area based on each center point.
- Note that the image processing apparatus may be partially realized by a computer. In that case, a control program that realizes the image processing apparatus on a computer by causing the computer to operate as each of the above units, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
- Each block of the image processing apparatuses 6 and 31, including the tooth representative color specifying unit 34, may be configured by hardware logic, or may be realized by software using a CPU (central processing unit) as follows.
- In the latter case, the image processing apparatuses 6 and 31 include a CPU that executes the instructions of the control program realizing each function, a ROM (read-only memory) that stores the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data.
- An object of the present invention can also be achieved by supplying the image processing apparatuses 6 and 31 with a recording medium on which the program code (executable program, intermediate code program, or source program) of the control program of the image processing apparatuses 6 and 31, which is software realizing the functions described above, is recorded so as to be readable by a computer, and by having the computer (or a CPU or MPU (microprocessor unit)) read and execute the program code recorded on the recording medium.
- Examples of the recording medium include tape systems such as magnetic tape and cassette tape; disk systems including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical discs such as CD-ROM (compact disc read-only memory), MO (magneto-optical disc), MD (Mini Disc), DVD (digital versatile disc), and CD-R (CD Recordable); card systems such as IC cards (including memory cards) and optical cards; and semiconductor memory systems such as mask ROM, EPROM (erasable programmable read-only memory), EEPROM (electrically erasable programmable read-only memory), and flash ROM.
- the image processing devices 6 and 31 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network.
- the communication network is not particularly limited.
- For example, the Internet, an intranet, an extranet, a LAN (local area network), an ISDN (integrated services digital network), a VAN (value-added network), a CATV (community antenna television) communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, and the like can be used.
- the transmission medium constituting the communication network is not particularly limited.
- For example, wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL (asymmetric digital subscriber line) lines, and wireless media such as Bluetooth (registered trademark), IEEE 802.11 wireless, HDR (high data rate), mobile phone networks, satellite lines, and terrestrial digital networks can be used.
- the present invention can be used for a digital camera equipped with an image processing apparatus.
Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a block diagram showing the schematic configuration of a digital camera 1 according to the present embodiment. The digital camera 1 includes an instruction input device 2, an imaging device 3, an image storage device 4, a display device 5, and an image processing device 6.
The image processing device 6 includes an image acquisition unit (instruction receiving unit) 11, a face detection unit 12, a feature detection unit 13, a suitability determination unit 14, a mouth image normalization unit 15, a smoothing unit 16, a skin representative color identification unit 17, a candidate color identification unit 18, a lip representative color identification unit 19, a lip region identification unit 20, an image correction unit 21, a synthesis unit 22, and a display control unit 23.
The flow of the image correction processing in the digital camera 1 is described below.
According to the image processing device 6 of the present embodiment, a representative color of the lips, which may be any of various colors, is identified on the basis of the hue and saturation information of a mouth image containing the lips and skin, excluding luminance, and the lip region can then be identified accurately on the basis of the identified lip representative color. By applying correction processing to the identified lip region, an appropriate correction can be applied to the lips of the person in the image.
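The hue-and-saturation scoring used to pick the lip representative color can be sketched as follows. This is a minimal illustration, not the patented implementation: the multiplicative combination, the function names, and the interpretation of the hue-saturation plane as 2D vectors (angle encodes hue, length encodes saturation, as in a CbCr plane) are our assumptions; the claims only require that the score grow with each weight.

```python
import math

def lip_color_degree(candidate, skin):
    """Score how lip-like a candidate color is, relative to the skin color.

    Colors are 2D points in a hue-saturation plane, where the angle of
    the vector encodes hue and its length encodes saturation. Each weight
    grows with its underlying quantity, so the combined degree grows with
    every weight.
    """
    cx, cy = candidate
    sx, sy = skin
    # First weight: distance from the skin color in the hue-saturation plane.
    w1 = math.hypot(cx - sx, cy - sy)
    # Second weight: hue (angle) difference from the skin color.
    d = abs(math.atan2(cy, cx) - math.atan2(sy, sx))
    w2 = min(d, 2 * math.pi - d)
    # Third weight: saturation (vector length) of the candidate itself.
    w3 = math.hypot(cx, cy)
    # One simple monotone combination: the product of the three weights.
    return w1 * w2 * w3

def pick_lip_representative(candidates, skin):
    """Return the candidate color with the largest lip color degree."""
    return max(candidates, key=lambda c: lip_color_degree(c, skin))
```

A candidate close to the skin color in both position and hue scores near zero, so a saturated, differently-hued candidate (typically a lip color) wins even though luminance is never consulted.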
In the present embodiment, a case where teeth whitening correction is performed as the correction processing will be described. For convenience of explanation, members and components having the same functions as those in the drawings described in Embodiment 1 are given the same reference numerals, and their detailed description is omitted. Hereinafter, the present embodiment will be described in detail with reference to the drawings.
FIG. 28 is a block diagram showing the schematic configuration of a digital camera 30 according to the present embodiment. The digital camera 30 includes the instruction input device 2, the imaging device 3, the image storage device 4, the display device 5, and an image processing device 31.
In addition to the components of the image processing device 6 of Embodiment 1, the image processing device 31 includes a mouth interior region identification unit 32, a tooth candidate color identification unit 33, and a tooth representative color identification unit 34.
The flow of the image correction processing in the digital camera 30 is described below. The processing up to the identification of the lip region (the processing shown in FIG. 5) is the same as in Embodiment 1.
According to the image processing device 31 of the present embodiment, the mouth interior region can be identified accurately on the basis of the lip region. By applying correction processing to the parts of the mouth interior region that have a high degree of being teeth, an appropriate correction can be applied to the teeth of the person in the image.
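The excerpt does not give the whitening formula itself; as a hedged sketch under our own assumptions (the blend-toward-white rule, the function name, and the parameters are not from the patent), a per-pixel correction weighted by a tooth-likeness degree might look like:

```python
def whiten_pixel(rgb, tooth_degree, strength=0.5):
    """Blend one pixel toward white in proportion to its tooth likeness.

    rgb          : (r, g, b) floats in [0, 1]
    tooth_degree : degree of being a tooth in [0, 1], evaluated only for
                   pixels inside the mouth interior region
    strength     : global correction strength in [0, 1]
    """
    w = strength * tooth_degree
    # Convex blend: degree 0 leaves the pixel untouched; a high degree
    # moves it toward pure white (1, 1, 1).
    return tuple((1.0 - w) * c + w * 1.0 for c in rgb)
```

Because the weight is continuous, lips and gums (low tooth degree) remain untouched while tooth pixels are gradually brightened, avoiding a hard mask boundary.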
An image processing device according to the present invention is an image processing device that identifies lip features from a face image including a person's mouth, and includes: a skin representative color identification unit that identifies a representative color of the skin of the face image; a candidate color identification unit that sets a plurality of regions in the face image such that at least one of the regions includes part of the lips, and identifies a representative color of each region as a candidate color; and a lip representative color identification unit that identifies a representative color of the lips from among the plurality of candidate colors according to the differences in hue and saturation between the skin representative color and each candidate color.
The image processing device may be partly realized by a computer. In this case, a control program that realizes the image processing device on a computer by operating the computer as each of the above units, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
2 Instruction input device
3 Imaging device
4 Image storage device
5 Display device
6, 31 Image processing device
11 Image acquisition unit (instruction receiving unit)
12 Face detection unit
13 Feature detection unit
14 Suitability determination unit
15 Mouth image normalization unit
16 Smoothing unit
17 Skin representative color identification unit
18 Candidate color identification unit
19 Lip representative color identification unit
20 Lip region identification unit
21 Image correction unit
22 Synthesis unit
23 Display control unit
32 Mouth interior region identification unit
33 Tooth candidate color identification unit
34 Tooth representative color identification unit
Claims (11)
- An image processing device that identifies lip features from a face image including a person's mouth, comprising:
a skin representative color identification unit that identifies a representative color of the skin of the face image;
a candidate color identification unit that sets a plurality of regions in the face image such that at least one of the regions includes part of the lips, and identifies a representative color of each region as a candidate color; and
a lip representative color identification unit that identifies a representative color of the lips from among the plurality of candidate colors according to the differences in hue and saturation between the skin representative color and each candidate color.
- The image processing device according to claim 1, wherein the lip representative color identification unit identifies the lip representative color according to the differences in hue and saturation, excluding luminance or lightness, between the skin representative color and each candidate color.
- The image processing device according to claim 1 or 2, wherein the lip representative color identification unit identifies the lip representative color according to the distance between each candidate color and the skin representative color in the hue-saturation plane of a color space, the hue difference between each candidate color and the skin representative color, and the magnitude of the saturation of each candidate color.
- The image processing device according to any one of claims 1 to 3, wherein the lip representative color identification unit obtains, for each candidate color, a first weight according to the distance between the candidate color and the skin representative color in the hue-saturation plane of a color space, a second weight according to the hue difference between the candidate color and the skin representative color, and a third weight according to the magnitude of the saturation of the candidate color; obtains a lip color degree based on the first weight, the second weight, and the third weight; and identifies a candidate color having a large lip color degree as the lip representative color,
wherein the first weight increases as the distance between the candidate color and the skin representative color in the hue-saturation plane of the color space increases, the second weight increases as the hue difference between the candidate color and the skin representative color increases, the third weight increases as the saturation of the candidate color increases, and the lip color degree increases according to the first weight, the second weight, and the third weight.
- The image processing device according to claim 1 or 2, wherein the lip representative color identification unit obtains, for each candidate color, a first weight according to the distance between the candidate color and the skin representative color in the hue-saturation plane of a color space, a second weight according to the hue difference between the candidate color and the skin representative color, and a third weight according to the magnitude of the saturation of the candidate color; obtains a lip color degree based on the first weight, the second weight, and the third weight; and selects a candidate color having a large lip color degree as a first selection candidate color,
wherein the first weight increases as the distance between the candidate color and the skin representative color in the hue-saturation plane of the color space increases, the second weight increases as the hue difference between the candidate color and the skin representative color increases, the third weight increases as the saturation of the candidate color increases, and the lip color degree increases according to the first weight, the second weight, and the third weight,
wherein the lip representative color identification unit obtains, for each candidate color other than the first selection candidate color, a fourth weight according to the distance between the candidate color and the first selection candidate color in the hue-saturation plane of the color space; obtains a candidate evaluation value based on the lip color degree and the fourth weight; and selects, from the candidate colors other than the first selection candidate color, at least one candidate color having a large candidate evaluation value as a second selection candidate color,
wherein the fourth weight increases as the distance between the candidate color and the first selection candidate color in the hue-saturation plane of the color space increases, and the candidate evaluation value increases according to the lip color degree and the fourth weight,
and wherein the lip representative color identification unit identifies, from among the candidate colors selected as the first selection candidate color or the second selection candidate color, the candidate color whose saturation is the largest or whose hue is closest to a predetermined hue as the lip representative color.
- The image processing device according to any one of claims 1 to 5, comprising a lip region identification unit that identifies, as a first lip region, a region having a color similar to the lip representative color, according to the differences in hue and saturation from the lip representative color.
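The second-stage selection of claim 5 can be sketched as follows. This is a hedged illustration: the product combination of the lip color degree and the fourth weight, and all names, are our assumptions (the claim only requires the evaluation value to increase with both quantities).

```python
import math

def second_selection(candidates, degrees, first_idx, n_second=1):
    """Second-stage candidate selection sketched from claim 5.

    candidates : list of 2D points in a hue-saturation plane
    degrees    : precomputed lip color degree of each candidate
    first_idx  : index of the first selection candidate color
    Returns the indices of up to n_second second selection candidate
    colors, ranked by the candidate evaluation value.
    """
    fx, fy = candidates[first_idx]
    scores = {}
    for i, ((cx, cy), deg) in enumerate(zip(candidates, degrees)):
        if i == first_idx:
            continue
        # Fourth weight: distance from the first selection candidate, so a
        # color close to the first pick is penalized even if very lip-like.
        w4 = math.hypot(cx - fx, cy - fy)
        scores[i] = deg * w4  # candidate evaluation value
    return sorted(scores, key=scores.get, reverse=True)[:n_second]
```

The fourth weight steers the second pick away from the first one, so the final choice (by largest saturation or closest hue to a predetermined hue) is made among genuinely distinct candidates rather than near-duplicates.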
- The image processing device according to claim 6, wherein the lip region identification unit identifies, as the first lip region, a region having a color whose distance from the lip representative color in the hue-saturation plane of a color space is not more than a first threshold and/or whose hue difference from the lip representative color is not more than a second threshold.
- The image processing device according to claim 6 or 7, wherein the lip region identification unit fits a predetermined lip shape model, which defines the shape of the lips, to the first lip region, and identifies the region indicated by the fitted lip shape model as a second lip region.
- The image processing device according to claim 6 or 7, wherein the lip region identification unit identifies, on the basis of the first lip region, the vertical center point of the upper lip or the lower lip at each of a plurality of horizontal positions, and identifies the regions of a predetermined vertical width centered on each center point as a second lip region.
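The thresholding of claim 7 can be sketched as a per-pixel membership test. This is a hedged sketch: the AND combination (the claim also allows OR), the 2D hue-saturation representation, and the names are our choices.

```python
import math

def in_first_lip_region(color, lip_color, dist_thresh, hue_thresh):
    """Claim 7 style membership test for the first lip region.

    color, lip_color : 2D points in a hue-saturation plane
    A pixel belongs to the region when its distance to the lip
    representative color is at most dist_thresh AND its hue (angle)
    difference is at most hue_thresh.
    """
    cx, cy = color
    lx, ly = lip_color
    # Distance in the hue-saturation plane of the color space.
    dist = math.hypot(cx - lx, cy - ly)
    # Hue difference as the wrapped angle between the two vectors.
    d = abs(math.atan2(cy, cx) - math.atan2(ly, lx))
    hue_diff = min(d, 2 * math.pi - d)
    return dist <= dist_thresh and hue_diff <= hue_thresh
```

Applying this test to every pixel of the mouth image yields a binary mask, which the later claims then refine with a lip shape model or per-column center points.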
- An image processing method for identifying lip features from a face image including a person's mouth, comprising:
a skin representative color identification step of identifying a representative color of the skin of the face image;
a candidate color identification step of setting a plurality of regions in the face image such that at least one of the regions includes part of the lips, and identifying a representative color of each region as a candidate color; and
a lip representative color identification step of identifying a representative color of the lips from among the plurality of candidate colors according to the differences in hue and saturation between the skin representative color and each candidate color.
- A control program for an image processing device that identifies lip features from a face image including a person's mouth, the control program causing a computer to execute:
a skin representative color identification step of identifying a representative color of the skin of the face image;
a candidate color identification step of setting a plurality of regions in the face image such that at least one of the regions includes part of the lips, and identifying a representative color of each region as a candidate color; and
a lip representative color identification step of identifying a representative color of the lips from among the plurality of candidate colors according to the differences in hue and saturation between the skin representative color and each candidate color.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020137020730A KR101554403B1 (ko) | 2011-03-10 | 2011-03-22 | 화상 처리 장치, 화상 처리 방법, 및 제어 프로그램이 기록된 기억 매체 |
EP11860381.0A EP2685419B1 (en) | 2011-03-10 | 2011-03-22 | Image processing device, image processing method, and computer-readable medium |
US13/984,863 US9111132B2 (en) | 2011-03-10 | 2011-03-22 | Image processing device, image processing method, and control program |
CN201180068788.4A CN103430208B (zh) | 2011-03-10 | 2011-03-22 | 图像处理装置、图像处理方法以及控制程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011053617A JP4831259B1 (ja) | 2011-03-10 | 2011-03-10 | 画像処理装置、画像処理方法、および制御プログラム |
JP2011-053617 | 2011-03-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012120697A1 true WO2012120697A1 (ja) | 2012-09-13 |
Family
ID=45418141
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/056793 WO2012120697A1 (ja) | 2011-03-10 | 2011-03-22 | 画像処理装置、画像処理方法、および制御プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US9111132B2 (ja) |
EP (1) | EP2685419B1 (ja) |
JP (1) | JP4831259B1 (ja) |
KR (1) | KR101554403B1 (ja) |
CN (1) | CN103430208B (ja) |
WO (1) | WO2012120697A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014167831A1 (ja) * | 2013-04-08 | 2014-10-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | メイクアップ塗材が塗布された状態を仮想的に再現することができる画像処理装置、画像処理方法、プログラム |
JP2017194421A (ja) * | 2016-04-22 | 2017-10-26 | 株式会社リコー | 色差算出装置、色差算出方法及びプログラム |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011221812A (ja) * | 2010-04-09 | 2011-11-04 | Sony Corp | 情報処理装置及び方法、並びにプログラム |
JP5273208B2 (ja) * | 2011-06-07 | 2013-08-28 | オムロン株式会社 | 画像処理装置、画像処理方法、および制御プログラム |
US9251588B2 (en) * | 2011-06-20 | 2016-02-02 | Nokia Technologies Oy | Methods, apparatuses and computer program products for performing accurate pose estimation of objects |
JP6326808B2 (ja) * | 2013-12-18 | 2018-05-23 | カシオ計算機株式会社 | 顔画像処理装置、投影システム、画像処理方法及びプログラム |
TWI503789B (zh) * | 2013-12-19 | 2015-10-11 | Nat Inst Chung Shan Science & Technology | An Automatic Threshold Selection Method for Image Processing |
GB201413047D0 (en) * | 2014-07-23 | 2014-09-03 | Boots Co Plc | Method of selecting colour cosmetics |
US9378543B2 (en) * | 2014-07-28 | 2016-06-28 | Disney Enterprises, Inc. | Temporally coherent local tone mapping of high dynamic range video |
JP6369246B2 (ja) | 2014-09-08 | 2018-08-08 | オムロン株式会社 | 似顔絵生成装置、似顔絵生成方法 |
CN105741327B (zh) * | 2014-12-10 | 2019-06-11 | 阿里巴巴集团控股有限公司 | 提取图片的主色和醒目色的方法和装置 |
CN104573391B (zh) * | 2015-01-27 | 2017-09-22 | 福建医科大学附属口腔医院 | 一种基于线性回归的牙齿选色推测方法 |
JP6458570B2 (ja) | 2015-03-12 | 2019-01-30 | オムロン株式会社 | 画像処理装置および画像処理方法 |
US10068346B2 (en) * | 2016-01-13 | 2018-09-04 | Thomson Licensing | Color distance determination |
JP6720882B2 (ja) * | 2017-01-19 | 2020-07-08 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
DE102017202702A1 (de) | 2017-02-20 | 2018-08-23 | Henkel Ag & Co. Kgaa | Verfahren und Einrichtung zum Ermitteln einer Homogenität von Hautfarbe |
JP6677221B2 (ja) * | 2017-06-06 | 2020-04-08 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP7200139B2 (ja) * | 2017-07-13 | 2023-01-06 | 株式会社 資生堂 | 仮想顔化粧の除去、高速顔検出およびランドマーク追跡 |
CN109427078A (zh) * | 2017-08-24 | 2019-03-05 | 丽宝大数据股份有限公司 | 身体信息分析装置及其唇妆分析方法 |
CN109816741B (zh) * | 2017-11-22 | 2023-04-28 | 北京紫光展锐通信技术有限公司 | 一种自适应虚拟唇彩的生成方法及系统 |
JP7229881B2 (ja) * | 2018-08-14 | 2023-02-28 | キヤノン株式会社 | 医用画像処理装置、学習済モデル、医用画像処理方法及びプログラム |
CN109344752B (zh) * | 2018-09-20 | 2019-12-10 | 北京字节跳动网络技术有限公司 | 用于处理嘴部图像的方法和装置 |
JP6908013B2 (ja) * | 2018-10-11 | 2021-07-21 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
EP3664035B1 (en) * | 2018-12-03 | 2021-03-03 | Chanel Parfums Beauté | Method for simulating the realistic rendering of a makeup product |
JP7248820B2 (ja) | 2019-04-23 | 2023-03-29 | ザ プロクター アンド ギャンブル カンパニー | 美容的皮膚属性を決定するための機器及び方法 |
WO2020219612A1 (en) | 2019-04-23 | 2020-10-29 | The Procter & Gamble Company | Apparatus and method for visualizing cosmetic skin attributes |
US11341685B2 (en) * | 2019-05-03 | 2022-05-24 | NipLips, LLC | Color-matching body part to lip product |
DE102019134799B3 (de) * | 2019-12-17 | 2021-03-25 | Schölly Fiberoptic GmbH | Bildaufnahmeverfahren unter Verwendung einer Farbtransformation sowie zugehöriges medizinisches Bildaufnahmesystem |
JP7015009B2 (ja) * | 2019-12-19 | 2022-02-02 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP6969622B2 (ja) * | 2020-02-20 | 2021-11-24 | 株式会社セガ | 撮影遊戯装置およびプログラム |
CN113674177B (zh) * | 2021-08-25 | 2024-03-26 | 咪咕视讯科技有限公司 | 一种人像唇部的自动上妆方法、装置、设备和存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005276182A (ja) * | 2004-02-26 | 2005-10-06 | Dainippon Printing Co Ltd | 人物の肌および唇領域マスクデータの作成方法および作成装置 |
JP2008160474A (ja) * | 2006-12-22 | 2008-07-10 | Canon Inc | 画像処理装置およびその方法 |
JP2009231879A (ja) | 2008-03-19 | 2009-10-08 | Seiko Epson Corp | 画像処理装置および画像処理方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6035867A (en) * | 1998-03-16 | 2000-03-14 | Barrick; Judith I. | Lip color sampling screen |
JP3902887B2 (ja) | 1999-06-04 | 2007-04-11 | 松下電器産業株式会社 | 唇抽出方法 |
US6504546B1 (en) * | 2000-02-08 | 2003-01-07 | At&T Corp. | Method of modeling objects to synthesize three-dimensional, photo-realistic animations |
JP4849761B2 (ja) * | 2002-09-02 | 2012-01-11 | 株式会社 資生堂 | 質感に基づく化粧方法及び質感イメージマップ |
JP2004348674A (ja) * | 2003-05-26 | 2004-12-09 | Noritsu Koki Co Ltd | 領域検出方法及びその装置 |
ES2373366T3 (es) * | 2004-10-22 | 2012-02-02 | Shiseido Co., Ltd. | Procedimiento de maquillaje de labios. |
WO2008115405A2 (en) * | 2007-03-16 | 2008-09-25 | Sti Medicals Systems, Llc | A method of image quality assessment to procuce standardized imaging data |
JP2009064423A (ja) * | 2007-08-10 | 2009-03-26 | Shiseido Co Ltd | メイクアップシミュレーションシステム、メイクアップシミュレーション装置、メイクアップシミュレーション方法およびメイクアップシミュレーションプログラム |
JP5290585B2 (ja) * | 2008-01-17 | 2013-09-18 | 株式会社 資生堂 | 肌色評価方法、肌色評価装置、肌色評価プログラム、及び該プログラムが記録された記録媒体 |
KR101035768B1 (ko) * | 2009-01-02 | 2011-05-20 | 전남대학교산학협력단 | 립 리딩을 위한 입술 영역 설정 방법 및 장치 |
-
2011
- 2011-03-10 JP JP2011053617A patent/JP4831259B1/ja active Active
- 2011-03-22 KR KR1020137020730A patent/KR101554403B1/ko active IP Right Grant
- 2011-03-22 EP EP11860381.0A patent/EP2685419B1/en active Active
- 2011-03-22 CN CN201180068788.4A patent/CN103430208B/zh active Active
- 2011-03-22 US US13/984,863 patent/US9111132B2/en active Active
- 2011-03-22 WO PCT/JP2011/056793 patent/WO2012120697A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005276182A (ja) * | 2004-02-26 | 2005-10-06 | Dainippon Printing Co Ltd | 人物の肌および唇領域マスクデータの作成方法および作成装置 |
JP2008160474A (ja) * | 2006-12-22 | 2008-07-10 | Canon Inc | 画像処理装置およびその方法 |
JP2009231879A (ja) | 2008-03-19 | 2009-10-08 | Seiko Epson Corp | 画像処理装置および画像処理方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2685419A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014167831A1 (ja) * | 2013-04-08 | 2014-10-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | メイクアップ塗材が塗布された状態を仮想的に再現することができる画像処理装置、画像処理方法、プログラム |
US9603437B2 (en) | 2013-04-08 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state |
JP2017194421A (ja) * | 2016-04-22 | 2017-10-26 | 株式会社リコー | 色差算出装置、色差算出方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20130343647A1 (en) | 2013-12-26 |
JP2012190287A (ja) | 2012-10-04 |
CN103430208A (zh) | 2013-12-04 |
EP2685419A1 (en) | 2014-01-15 |
KR20130108456A (ko) | 2013-10-02 |
US9111132B2 (en) | 2015-08-18 |
KR101554403B1 (ko) | 2015-09-18 |
CN103430208B (zh) | 2016-04-06 |
EP2685419A4 (en) | 2014-09-17 |
JP4831259B1 (ja) | 2011-12-07 |
EP2685419B1 (en) | 2018-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4831259B1 (ja) | 画像処理装置、画像処理方法、および制御プログラム | |
JP5273208B2 (ja) | 画像処理装置、画像処理方法、および制御プログラム | |
EP2615576B1 (en) | Image-processing device, image-processing method, and control program | |
EP2615577B1 (en) | Image-processing device, image-processing method, and control program | |
US8525847B2 (en) | Enhancing images using known characteristics of image subjects | |
US10304166B2 (en) | Eye beautification under inaccurate localization | |
JP6312714B2 (ja) | 陰影検出および減衰のためのマルチスペクトル撮像システム | |
US20170323465A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP4722923B2 (ja) | コンピュータビジョンによりシーンをモデル化する方法 | |
JP2020526809A (ja) | 仮想顔化粧の除去、高速顔検出およびランドマーク追跡 | |
JP4893863B1 (ja) | 画像処理装置、および画像処理方法 | |
WO2012124142A1 (ja) | 画像処理装置、および画像処理方法 | |
US20110002506A1 (en) | Eye Beautification | |
US11030799B2 (en) | Image processing apparatus, image processing method and storage medium. With estimation of parameter of real illumination based on normal information on pixel included in high luminance area | |
JP2004005384A (ja) | 画像処理方法、画像処理装置、プログラム及び記録媒体、自動トリミング装置、並びに肖像写真撮影装置 | |
Abebe et al. | Towards an automatic correction of over-exposure in photographs: Application to tone-mapping | |
CN108965646A (zh) | 图像处理装置、图像处理方法和存储介质 | |
JP5272775B2 (ja) | 電子スチルカメラ | |
You et al. | Saturation enhancement of blue sky for increasing preference of scenery images | |
Choi et al. | Skin-representative region in a face for finding real skin color | |
JP2009205469A (ja) | オブジェクト領域の抽出システムおよび方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180068788.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11860381 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20137020730 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13984863 Country of ref document: US |