US9697618B2 - Image processing apparatus, image processing method, program, and image processing system - Google Patents


Info

Publication number
US9697618B2
US9697618B2
Authority
US
United States
Prior art keywords
feature
image
skin
calculation unit
value calculation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/420,522
Other languages
English (en)
Other versions
US20150213619A1 (en)
Inventor
Yusuke Nakamura
Shinichiro Gomi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION (assignment of assignors' interest). Assignors: NAKAMURA, YUSUKE; GOMI, SHINICHIRO
Publication of US20150213619A1 publication Critical patent/US20150213619A1/en
Application granted granted Critical
Publication of US9697618B2 publication Critical patent/US9697618B2/en
Legal status: Active (expiration adjusted)

Classifications

    • A61B5/0059: Measuring for diagnostic purposes; identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/441: Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/444: Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • G06K9/4652
    • G06T5/002
    • G06T5/70: Denoising; Smoothing
    • G06T7/408
    • G06T7/0012: Biomedical image inspection
    • G06T7/0016: Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T2207/10024: Color image
    • G06T2207/10048: Infrared image
    • G06T2207/10152: Varying illumination
    • G06T2207/20028: Bilateral filtering
    • G06T2207/20192: Edge enhancement; Edge preservation
    • G06T2207/30088: Skin; Dermal

Definitions

  • the present technique relates to an image processing apparatus, an image processing method, a program, and an image processing system that enable accurate detection of feature sections from a skin image.
  • in Patent Literature 1, a smoothed image is subtracted from an original image, and binarization is performed on the image on which the subtraction has been performed, so as to divide the image into areas indicating the pores and other areas and to measure the number, size, and the like of the areas indicating the pores.
  • Patent Literature 1 JP H7-55447A
  • an object of the present technique is to provide an image processing apparatus, an image processing method, a program, and an image processing system that are capable of accurately detecting feature sections from a skin image.
  • an image processing apparatus including a feature value calculation unit configured to calculate, as feature values, a polarity related to gradation of a skin image and a scale indicating image areas having pixel values that are similar to each other and that are different from the surroundings of the image areas, and a feature section extraction unit configured to extract a feature section of the skin image on a basis of the feature values calculated by the feature value calculation unit.
  • brightness information separation processing is performed on a skin image, the skin-surface reflection of which has been removed by, for example, configuring polarizing filters provided in a light source and in an imaging unit to have an orthogonal relationship with each other, so as to acquire global brightness information that indicates the structural components of the skin.
  • Calculation of the feature values is performed using the global brightness information. For example, polarities related to gradation of the skin image and scales indicating image areas having pixel values that are similar to each other and that are different from the surroundings of the image areas are calculated as the feature values. Furthermore, intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas may be further calculated as the feature values.
  • a statistic is calculated to generate information related to at least one of a number, a size, and a color density of the feature sections.
  • a melanin analysis unit that performs analysis of melanin is provided and extraction of the feature sections is performed with the feature values that have been calculated on the basis of the skin image and with the melanin analysis result of the skin image.
  • extraction of the feature sections is performed with the feature values that have been calculated on the basis of a skin image taken with a white light and with the melanin analysis result of a skin image that has been taken with a red light and a near infrared light, or with an ultraviolet light.
  • an image positioning unit that matches the position of the skin image of the past and the position of the skin image of the present is provided, for example, to match the feature sections. With the matching of the feature sections performed using the second feature values indicating polarities of the intensity change, the positions of the skin images are matched so that the corresponding feature sections of the skin image of the past and of the skin image of the present coincide. Furthermore, when the number of corresponding feature sections is equivalent to or smaller than a predetermined number, a piece of advice is presented for moving an imaging area so that the positions of the corresponding feature sections of the second skin image are positioned at the positions of the corresponding feature sections of the first skin image.
  • an image processing method including the steps of calculating, as feature values, a polarity related to gradation of a skin image and a scale indicating image areas having pixel values that are similar to each other and that are different from the surroundings of the image areas, and extracting a feature section of the skin image on a basis of the calculated feature values.
  • a program for causing a computer to execute skin image processing, the program causing the computer to execute calculating, as feature values, a polarity related to gradation of a skin image and a scale indicating image areas having pixel values that are similar to each other and that are different from the surroundings of the image areas, and extracting a feature section of the skin image on a basis of the calculated feature values.
  • the program of the present technique can be provided, in a computer-readable form, to a general-purpose computer that can execute various program codes, using a storage medium or a communication medium, for example, a storage medium such as an optical disc, a magnetic disk, or a semiconductor memory, or a communication medium such as a network.
  • polarities related to gradation of the skin image and scales indicating image areas having pixel values that are similar to each other and that are different from the surroundings of the image areas are calculated as feature values, and on the basis of the feature values, the feature sections of the skin image are extracted. Accordingly, the pores, the pimples, and the like of the skin can be accurately detected as the feature sections and, thus, various pieces of advice and the like can be given in an appropriate manner in accordance with the skin condition.
  • the effects described in the present description are merely examples; the effects are not limited to those described in the present description, and there may be additional effects.
  • FIG. 1 is a diagram exemplifying a configuration of an image processing system.
  • FIG. 2 includes diagrams illustrating a configuration of an imaging device in a schematic manner.
  • FIG. 3 includes diagrams each illustrating an exemplification of light sources of an attachment.
  • FIG. 4 is a diagram illustrating a position of the imaging device when an image is taken.
  • FIG. 5 is a diagram illustrating a configuration of a first embodiment of the image processing apparatus.
  • FIG. 6 includes diagrams exemplifying function C.
  • FIG. 7 is a diagram exemplifying blend ratios BR_S and BR_M.
  • FIG. 8 is a diagram illustrating a configuration of a feature section analysis unit.
  • FIG. 9 is a flow chart illustrating an operation extracting feature sections.
  • FIG. 10 is a diagram illustrating a relationship between detected feature sections, light sources, and extraction conditions of the feature sections.
  • FIG. 11 is a diagram exemplifying an operation of a presentation unit.
  • FIG. 12 is a diagram exemplifying light sources of an attachment.
  • FIG. 13 is a diagram illustrating a configuration of a second embodiment of the image processing apparatus.
  • FIG. 14 is a diagram illustrating a configuration of a melanin analysis unit.
  • FIG. 15 is a diagram illustrating a transfer characteristic that transfers the melanin distribution Mx into the melanin index ID_Mx.
  • FIG. 16 is a flowchart illustrating an operation extracting the feature sections.
  • FIG. 17 is a diagram illustrating a configuration of a third embodiment of the image processing apparatus.
  • FIG. 18 is a diagram illustrating a configuration of a feature section analysis unit.
  • FIG. 19 is a flowchart illustrating an operation of the third embodiment.
  • FIG. 20 is a diagram exemplifying an operation of a presentation unit.
  • FIG. 21 exemplifies a piece of advice given when the number of pairs of feature sections is equivalent to or smaller than a predetermined number.
  • FIG. 1 exemplifies a configuration of an image processing system of the present technique.
  • An image processing system 10 is constituted by employing a piece of equipment (hereinafter referred to as “an imaging device”) 11 that has an imaging function, information processing devices 15 and 16 , and the like.
  • the imaging device 11 and the information processing device (a personal computer device, for example) 15 can be directly coupled to each other through a wired or wireless transmission path.
  • the imaging device 11 and the information processing device (a server device, for example) 16 can be coupled to each other through a public telecommunication network and the like.
  • the imaging device 11 includes an imaging unit 112 that generates a captured image of the skin.
  • An image processing apparatus 20 that detects feature sections of the skin is provided in any of the imaging device 11 and the information processing devices 15 and 16.
  • the image processing apparatus 20 includes a feature value calculation unit and a feature section extraction unit.
  • the feature value calculation unit calculates, as feature values, polarities related to gradation and scales indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas from the captured image of the skin (hereinafter, referred to as a “skin image”).
  • the feature section extraction unit extracts feature sections of the skin on the basis of the calculated feature values.
  • a presentation unit 50 that presents an extraction result of the feature sections of the skin is provided in any of the imaging device 11 and the information processing devices 15 and 16.
  • the skin image used to detect the feature sections of the skin with the image processing apparatus 20 is not limited to the skin image that is output from the imaging unit 112 and may be a skin image that has been generated in the imaging unit 112 and that has been stored in a recording medium or the like.
  • FIG. 2 illustrates, in a schematic manner, a configuration of the imaging device that takes an image of the skin.
  • FIG. 2(A) is a front view of the imaging device 11
  • FIG. 2(B) is a side view of the imaging device 11 .
  • An attachment 12 is provided at the distal end of a lens barrel 111 of the imaging device 11 .
  • the attachment 12 may be integrally formed with the lens barrel 111 or may be configured so as to be attachable to and detachable from the lens barrel 111 .
  • a plurality of light sources 121 (light emitting diodes (LEDs) 1 to n, for example) that constitute an illumination unit are arranged in a ring shape in the attachment 12.
  • a white LED is preferable as the light source.
  • white LEDs 121-1 provided with polarizing filters each having a predetermined polarization plane and white LEDs 121-2 provided with polarizing filters each having a polarization plane that is, for example, orthogonal to the predetermined polarization plane are provided.
  • impurities in pores serving as feature sections may be detected with near-ultraviolet light LEDs 121-v.
  • the imaging device 11 provided with the attachment 12 captures a skin image by being in close contact with the skin. Furthermore, when the light sources that are provided with the polarizing filters are used, a polarizing filter 113, the polarization plane of which is orthogonal to the predetermined polarization plane, is provided in the optical path extending to the imaging unit 112. Since the polarizing filter is provided in the above-described manner, by taking an image after turning on the white LEDs 121-1, an image from which the skin-surface reflection component has been removed can be obtained, and by taking an image after turning on the white LEDs 121-2, an image in which the skin-surface reflection component remains can be obtained.
  • FIG. 5 illustrates a configuration of the first embodiment of an image processing apparatus.
  • the image processing apparatus 20 includes a preprocessing unit 21 , a brightness information separation unit 22 , and a feature section analysis unit 24 .
  • the preprocessing unit 21 acquires a skin image and performs preprocessing.
  • the preprocessing unit 21 applies contrast enhancement processing to the acquired brightness information of the skin image so that shadows are strongly emphasized. Histogram equalization processing, for example, is employed as the method of performing the contrast enhancement processing. Furthermore, when noise stands out, noise removal may be performed before the contrast enhancement, and when shades stand out, shading compensation may be performed before the contrast enhancement.
  • the brightness information separation unit 22 separates the brightness information that has been obtained by the preprocessing unit 21 into global brightness information and local brightness information.
  • the global brightness information is information that indicates lighting components included in the image and structural components of the skin.
  • the local brightness information is information that indicates detailed patterns of the skin, such as texture.
  • the global brightness information may be separated by performing low-pass processing on the input brightness information; for example, the separation method disclosed in JP 2009-98925A may be employed.
  • brightness information f_G′ is generated by performing low-pass filtering on the acquired brightness information (input brightness information) f_in^W (see expression (1)).
  • only the low-amplitude components of the differential between the input brightness information f_in^W and the brightness information f_G′ are set as the local brightness information f_L^W. This is because, when simple low-pass filtering is performed on a portion with high edge intensity, an intense edge would otherwise be included in the local brightness information.
  • the local brightness information f_L^W is therefore generated by extracting only the low-amplitude components using function C, which suppresses high-amplitude components (see expression (2)).
  • the global brightness information f_G^W is generated by calculating the differential between the input brightness information f_in^W and the local brightness information f_L^W (see expression (3)).
  • f_G′(x, y) = lpf ⊛ f_in^W(x, y)   (1)
  • f_L^W(x, y) = C(f_in^W(x, y) − f_G′(x, y))   (2)
  • f_G^W(x, y) = f_in^W(x, y) − f_L^W(x, y)   (3)
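As a concrete illustration of expressions (1) to (3), the following is a minimal sketch of the separation. A Gaussian filter is assumed for lpf and a simple clipping form is assumed for function C; the patent leaves both choices open.

```python
import numpy as np
import cv2

def separate_brightness(f_in_w, ksize=31, i_min=-10.0, i_max=10.0):
    """Separate input brightness into global and local components (expr. (1)-(3))."""
    f = f_in_w.astype(np.float32)
    # Expression (1): low-pass filtering of the input brightness information.
    f_g_prime = cv2.GaussianBlur(f, (ksize, ksize), 0)
    # Expression (2): function C modelled as clipping, so that only the
    # low-amplitude components of the differential become local information.
    f_l_w = np.clip(f - f_g_prime, i_min, i_max)
    # Expression (3): the global brightness information is the remainder.
    f_g_w = f - f_l_w
    return f_g_w, f_l_w
```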
  • FIG. 6 exemplifies function C.
  • with function C, processing that suppresses amplitude is performed on signals whose amplitudes fall outside the range "i_min to i_max". Note that FIGS. 6(A) to 6(D) exemplify functions C having different amplitude suppression characteristics.
  • the limitation of amplitude is performed by applying function C to the differentials between the input brightness information and each of the pieces of brightness information on which filter processing has been performed by a plurality of low-pass filters having different pass frequency bands.
  • the global brightness information can be generated using the local brightness information that is an integration of the brightness information on which the limitation of amplitude has been performed.
  • brightness information f_Gj^W is generated by performing low-pass filtering on the input brightness information f_in^W with three low-pass filters having different pass frequency bands (see expression (4)).
  • here, j = S, M, L indicates the number of taps of the low-pass filters, and the bands of the low-pass filters become narrower in this order.
  • with function C, only the low-amplitude components are extracted from the differential between the input brightness information f_in^W and the brightness information f_Gj^W to generate brightness information f_Lj^W (see expression (5)).
  • the local brightness information f_L^W is generated by integrating the brightness information f_Lj^W with the blend ratios BR_S and BR_M corresponding to the edge intensities (see expression (6)).
  • the global brightness information f_G^W may then be generated from expression (3) with the local brightness information f_L^W generated as above.
  • FIG. 7 exemplifies the blend ratios BR_S and BR_M.
  • the blend ratio BR_S and the blend ratio BR_M are both high when the edge intensity is high.
  • the blend ratio BR_M is higher than the blend ratio BR_S when the edge intensity is moderate.
  • the blend ratio BR_S and the blend ratio BR_M are both low when the edge intensity is low.
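The multi-band variant of expressions (4) to (6) might look as follows. The Gaussian filters, the edge-intensity measure, the piecewise-linear blend ratios, and the hierarchical blending form are all assumptions chosen only to reproduce the qualitative behaviour of FIG. 7.

```python
import numpy as np
import cv2

def separate_multiband(f_in_w, i_min=-10.0, i_max=10.0):
    """Multi-band brightness separation (expressions (4)-(6)), a sketch."""
    f = f_in_w.astype(np.float32)
    # Expression (4): three low-pass filters with different pass bands (taps S < M < L).
    f_g = [cv2.GaussianBlur(f, (k, k), 0) for k in (5, 15, 31)]
    # Expression (5): function C (clipping) keeps only low-amplitude components.
    f_l_s, f_l_m, f_l_l = [np.clip(f - g, i_min, i_max) for g in f_g]
    # Edge intensity from a smoothed gradient magnitude (an assumed choice).
    gx = cv2.Sobel(f, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(f, cv2.CV_32F, 0, 1)
    e = cv2.GaussianBlur(cv2.magnitude(gx, gy), (15, 15), 0)
    e = e / (e.max() + 1e-6)
    # Blend ratios roughly following FIG. 7: BR_S high only at strong edges,
    # BR_M already high at moderate edges (piecewise-linear stand-ins).
    br_s = np.interp(e, [0.0, 0.5, 1.0], [0.0, 0.2, 1.0])
    br_m = np.interp(e, [0.0, 0.5, 1.0], [0.1, 0.8, 1.0])
    # Expression (6): integrate the band-limited components with the blend ratios
    # (hierarchical form assumed; the narrow-band output dominates near edges).
    f_l_w = br_s * f_l_s + (1 - br_s) * (br_m * f_l_m + (1 - br_m) * f_l_l)
    return f - f_l_w, f_l_w  # global, local brightness information
```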
  • the separation of the brightness information is not limited to the methods described above; for example, a known edge-preserving smoothing filter, such as a bilateral filter, may be applied as the low-pass filter.
  • the feature section analysis unit 24 calculates the feature values from the skin image and, on the basis of the calculated feature values, analyzes the feature sections of the skin.
  • FIG. 8 illustrates a configuration of the feature section analysis unit 24 .
  • the feature section analysis unit 24 includes a feature value calculation unit 241 , a feature section extraction unit 243 , and a statistic calculation unit 244 .
  • the feature value calculation unit 241 calculates the feature values.
  • the feature value calculation unit 241 calculates the feature values using the global brightness information generated in the brightness information separation unit 22 , for example.
  • the feature value calculation unit 241 calculates, as feature values, polarities related to gradation of the skin image, and scales indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas.
  • the feature value calculation unit 241 may further calculate, as feature values, intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas.
  • Calculation of the feature values will be described next. Calculation of the feature points is performed using an image feature extraction technique. Techniques such as Speeded Up Robust Features (SURF) and Scale Invariant Feature Transform (SIFT) are known as image feature extraction techniques.
  • the feature points (white circles and black circles) in the image are selected in the following manner. Specifically, the feature points in the image are detected by searching for points at which the determinant of the Hessian matrix becomes maximum while the standard deviation σ of the Gaussian function is changed. In order to increase the speed of calculating the determinant of the Hessian matrix, the computation of expression (7), which employs an approximation filter, is performed. Note that in expression (7), D_xx, D_yy, and D_xy indicate convolutions of the image with box-filter approximations of the second-order Gaussian derivatives that are the components of the Hessian matrix.
  • det(H_approx) = D_xx D_yy − (0.9 D_xy)^2   (7)
  • the feature value calculation unit 241 calculates, as feature values, scales indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas; in other words, the feature value calculation unit 241 calculates, as feature values, the scales at which the determinant calculated in expression (7) becomes maximum. Furthermore, the feature value calculation unit 241 sets the polarities related to gradation of the skin image as feature values; in other words, the feature value calculation unit 241 sets the Laplacian values of the feature points from which the feature values have been calculated as feature values.
  • the feature value calculation unit 241 may also include, in the feature values, the intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas, such as the maximum of the determinant values that have been calculated in expression (7). Note that the feature value calculation unit 241 may use another feature extraction technique, such as those described above, to calculate the feature values.
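As an illustration of this calculation, the following is a minimal sketch of scale-space blob detection in the spirit of expression (7). It approximates the second-order Gaussian derivatives with Sobel operators on a blurred image rather than with the box filters named above, and the scale set and the normalization are assumptions.

```python
import numpy as np
import cv2

def hessian_feature_maps(f_g_w, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Scale-space responses in the spirit of expression (7) (a sketch)."""
    img = f_g_w.astype(np.float32)
    dets, laps = [], []
    for s in sigmas:
        g = cv2.GaussianBlur(img, (0, 0), s)
        # Second-order derivatives standing in for the box-filter versions.
        d_xx = cv2.Sobel(g, cv2.CV_32F, 2, 0, ksize=5)
        d_yy = cv2.Sobel(g, cv2.CV_32F, 0, 2, ksize=5)
        d_xy = cv2.Sobel(g, cv2.CV_32F, 1, 1, ksize=5)
        # Expression (7): det(H_approx) = Dxx*Dyy - (0.9*Dxy)^2,
        # scale-normalised so that responses are comparable across sigmas.
        dets.append((s ** 4) * (d_xx * d_yy - (0.9 * d_xy) ** 2))
        # Sign of the Laplacian: the polarity related to gradation.
        laps.append(np.sign(d_xx + d_yy))
    det = np.stack(dets)        # intensity (determinant value) per scale
    lap = np.stack(laps)        # polarity per scale
    best = det.argmax(axis=0)   # scale index maximising the response
    return det, lap, best
```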
  • the feature section extraction unit 243 extracts the feature sections on the basis of the feature values that have been obtained by the feature value calculation unit 241. For example, when the pore portions are extracted as the feature sections, the feature section extraction unit 243 sets a Laplacian of "1" as an extraction condition. This matches the characteristics of the pores, since the Laplacian indicates the polarity related to gradation of the skin image and black pixels are surrounded by white pixels when the Laplacian is "1". Furthermore, the feature section extraction unit 243 limits the scales that indicate the pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas. It is known that the size of a pore is about 0.1 mm to 0.3 mm. Accordingly, the feature section extraction unit 243 sets an extraction condition that the scale of the feature values obtained by the feature value calculation unit 241 correspond to the size of the pores.
  • the feature section extraction unit 243 may limit the intensities.
  • an area where a pore stands out needs a certain amount of signal intensity (contrast difference) with respect to the other areas. Accordingly, the feature section extraction unit 243 sets, as an extraction condition, an intensity (a determinant value) that is capable of extracting pores and the like that stand out.
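A minimal sketch of applying these extraction conditions follows. The candidate structure, the mm_per_pixel conversion from scale to physical size, and the min_det threshold are assumptions for illustration.

```python
def extract_pores(candidates, mm_per_pixel, min_det=0.01):
    """Keep candidates satisfying the pore extraction conditions above."""
    pores = []
    for c in candidates:  # each c: {"x", "y", "polarity", "scale", "intensity"}
        size_mm = c["scale"] * mm_per_pixel  # scale-to-size conversion assumed
        if (c["polarity"] == 1               # dark area surrounded by brighter skin
                and 0.1 <= size_mm <= 0.3    # known pore size range
                and c["intensity"] >= min_det):  # sufficient contrast difference
            pores.append(c)
    return pores
```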
  • the statistic calculation unit 244 calculates statistics of the feature sections that have been extracted by the feature section extraction unit 243. For example, when the pores are extracted as the feature sections, the statistic calculation unit 244 measures the number of extracted feature sections and calculates the number of pores. Furthermore, the statistic calculation unit 244 calculates an average value of the scales of the extracted feature sections to obtain a statistic of the size of the pores from the average value. Furthermore, the statistic calculation unit 244 calculates an average value of the intensities (Hessian determinant values, for example) of the extracted feature sections to set a color density of the pores. Moreover, not limited to the average value, the statistic calculation unit 244 may calculate a maximum value, a variance, and the like.
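Continuing the sketch above, the statistics might be computed as follows (the field names are the same assumed candidate structure):

```python
import numpy as np

def pore_statistics(pores):
    """Statistics of the extracted feature sections (a sketch of unit 244)."""
    if not pores:
        return {"count": 0}
    scales = np.array([p["scale"] for p in pores], dtype=np.float32)
    dets = np.array([p["intensity"] for p in pores], dtype=np.float32)
    return {
        "count": len(pores),                 # number of pores
        "mean_size": float(scales.mean()),   # statistic of the pore size
        "max_size": float(scales.max()),
        "size_variance": float(scales.var()),
        "mean_density": float(dets.mean()),  # colour density of the pores
    }
```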
  • FIG. 9 is a flowchart illustrating an operation extracting the feature sections.
  • In step ST1, the image processing apparatus 20 acquires a feature value. The image processing apparatus 20 acquires a feature value that has been calculated using the global brightness information and that indicates, for example, a polarity related to gradation of the skin image, a scale indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas, and an intensity (a contrast difference), and proceeds to step ST2.
  • In step ST2, the image processing apparatus 20 discriminates whether the feature value has a predetermined polarity. The image processing apparatus 20 proceeds to step ST3 when the feature value indicates the polarity of the feature sections, and proceeds to step ST7 when the feature value does not indicate the polarity of the feature sections.
  • In step ST3, the image processing apparatus 20 discriminates whether the feature value is of a predetermined scale, proceeding to step ST4 when the feature value indicates the predetermined scale and to step ST7 when it does not.
  • In step ST4, the image processing apparatus 20 discriminates whether the feature value has a predetermined intensity. When the intensity indicated by the feature value corresponds to, for example, the contrast difference between the pore portions and the skin portion, the image processing apparatus 20 proceeds to step ST5. When the feature value does not have the predetermined intensity, the image processing apparatus 20 proceeds to step ST7.
  • In step ST5, the image processing apparatus 20 determines that the feature value indicates a feature section, and proceeds to step ST6.
  • In step ST6, the image processing apparatus 20 discriminates whether the extraction processing of the feature sections has been completed. If a feature value that has not yet been determined to be a feature section or not remains, the image processing apparatus 20 proceeds to step ST7; if no such feature value remains, the image processing apparatus 20 ends the operation of extracting the feature sections.
  • In step ST7, the image processing apparatus 20 acquires a new feature value. The image processing apparatus 20 acquires a feature value that has not yet been determined to be a feature section or not and returns to step ST2.
  • the image processing apparatus 20 extracts the pore portions from the skin image in an accurate manner. Furthermore, when an extraction result of the feature sections is presented on the presentation unit 50 , the result is presented in a manner that the user can easily understand.
  • FIG. 10 illustrates the relationship between the detected feature sections, the light sources, and the extraction conditions of the feature sections.
  • to detect pores, a white light source is used. The extraction conditions are set such that the polarity indicates features that are surrounded by pixels with high brightness and such that the range of the scale is about 0.1 to 0.3 mm.
  • to detect pimples, a white light source is used. The extraction conditions are set such that the polarity indicates features that are surrounded by pixels with low brightness and such that the range of the scale is about 0.5 to 1.0 mm.
  • to detect impurities in pores, a near-ultraviolet light source is used. It is known that a portion with sebum emits green light under near-ultraviolet light and that a portion with porphyrin created by acne bacteria emits orange light under near-ultraviolet light. Accordingly, the extraction conditions are set such that the polarity indicates features that are surrounded by pixels with low brightness when the skin image is gray-scaled, such that a green or an orange emission is observed under near-ultraviolet light, and such that the range of the scale is about 0.2 to 0.5 mm.
  • FIG. 11 exemplifies an operation of the presentation unit.
  • FIG. 11(A) illustrates a state in which the pores are plotted on the skin image according to their sizes, for example, and
  • FIG. 11(B) is a distribution (a histogram) of the sizes of the pores displayed according to their sizes.
  • FIG. 11(C) illustrates a state in which the pores are plotted on the captured image according to their color densities.
  • FIG. 11(D) is a distribution (a histogram) of the color densities of the pores displayed according to their color densities.
  • in this manner, the extraction result of the feature sections is presented on the presentation unit 50 in a manner that can be easily understood by the user.
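As an illustration of such a presentation, a histogram of pore sizes in the manner of FIG. 11(B) could be drawn as follows; matplotlib and the candidate structure from the earlier sketches are assumed.

```python
import matplotlib.pyplot as plt

def present_pore_sizes(pores, mm_per_pixel):
    """Histogram of pore sizes, in the manner of FIG. 11(B) (a sketch)."""
    sizes = [p["scale"] * mm_per_pixel for p in pores]
    plt.hist(sizes, bins=10, range=(0.1, 0.3))  # known pore-size range in mm
    plt.xlabel("pore size [mm]")
    plt.ylabel("number of pores")
    plt.title("Distribution of pore sizes")
    plt.show()
```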
  • in the second embodiment, red LEDs 121-r and near infrared LEDs 121-ir are further provided in the attachment 12, and the imaging device 11 takes an image of the skin after changing the light source that is used for lighting.
  • a captured image that has been taken while irradiating ultraviolet rays may be used to perform analysis of the melanin and the melanin analysis may be performed using a spectral reflection factor of the skin.
  • a case in which analysis of melanin is performed using images that have been obtained while switching between red light and near infrared light will be described below.
  • FIG. 13 illustrates a configuration of the second embodiment of the image processing apparatus.
  • the image processing apparatus 20 includes the preprocessing unit 21 , the brightness information separation unit 22 , a melanin analysis unit 23 , and the feature section analysis unit 24 .
  • the preprocessing unit 21 acquires a skin image that has been taken with a white light source and performs preprocessing. Similar to the first embodiment, the preprocessing unit 21 applies contrast enhancement processing to the acquired brightness information of the skin image to emphasize the shadows. Furthermore, when noise stands out, noise removal may be performed before the contrast enhancement, and when shades stand out, shading compensation may be performed before the contrast enhancement. Note that as the brightness information, a blue (B) channel signal is preferably used for a skin image taken with a white light source, and a red (R) channel signal is preferably used for a skin image taken with a red light source and for a skin image taken with a near infrared light source.
  • the brightness information separation unit 22 separates the brightness information that has been obtained by the preprocessing unit 21 into global brightness information and local brightness information. In a similar manner to that of the first embodiment, the brightness information separation unit 22 separates the global brightness information, which is information indicating the lighting components included in the image and the structural components of the skin, and the local brightness information, which indicates detailed patterns of the skin, such as texture, from each other.
  • the melanin analysis unit 23 acquires both the skin image that has been taken using a red light source and the skin image that has been taken using a near infrared light source and performs analysis of melanin on the basis of the acquired skin images.
  • FIG. 14 illustrates a configuration of the melanin analysis unit 23 .
  • the melanin analysis unit 23 includes a melanin distribution calculation unit 231 and a melanin index calculation unit 232 .
  • the melanin distribution calculation unit 231 calculates an average value Avg_std^R of the brightness information (red component) f_std^R from a captured image that is obtained by irradiating red light onto a standard diffuse reflection plate used to perform calibration (see expression (8)). Similarly, an average value Avg_std^IR of the brightness information (near infrared component) f_std^IR is calculated from the captured image that is obtained by irradiating near infrared light onto the standard diffuse reflection plate (see expression (9)). Here, N is the number of pixels over which the sums are taken.
  • Avg_std^R = Σ_{x,y} f_std^R(x, y) / N   (8)
  • Avg_std^IR = Σ_{x,y} f_std^IR(x, y) / N   (9)
  • next, the melanin distribution Mx is calculated from the input brightness information (red component) f_in^R of a captured image that is obtained by irradiating red light onto the skin, the input brightness information (near infrared component) f_in^IR of a captured image that is obtained by irradiating near infrared light onto the skin, and the average values Avg_std^R and Avg_std^IR obtained when the standard diffuse reflection plate is used (see expression (10)).
  • Mx(x, y) = k { log[ f_in^R(x, y) / Avg_std^R ] − log[ f_in^IR(x, y) / Avg_std^IR ] } + q   (10)
  • the melanin index calculation unit 232 calculates a melanin index by normalizing the melanin distribution that has been calculated by the melanin distribution calculation unit 231 .
  • the melanin index calculation unit 232 calculates an average value Avg_in^R of the input brightness information (red component) f_in^R and an average value Avg_in^IR of the input brightness information (near infrared component) f_in^IR (see expressions (11) and (12)).
  • an average melanin amount Mx_avg is calculated from the average value Avg_in^R of the input brightness information (red component) f_in^R, the average value Avg_in^IR of the input brightness information (near infrared component) f_in^IR, and the average values Avg_std^R and Avg_std^IR obtained when the standard diffuse reflection plate had been used (see expression (13)).
  • threshold values Mx_th1 and Mx_th2 are set using the average melanin amount Mx_avg (see expressions (14) and (15)).
  • Mx_th1 = Mx_avg − e   (14)
  • Mx_th2 = Mx_avg + e   (15)
  • here, "e" is a parameter that defines the range of the melanin distribution Mx over which the melanin index runs from "0" to "1". Note that the method of setting the threshold values is not limited to the method described above and, for example, the threshold values may be set as fixed values.
  • the melanin index calculation unit 232 sets the threshold values Mx_th1 and Mx_th2 and, with the transfer characteristic illustrated in FIG. 15, transfers the melanin distribution Mx into a melanin index ID_Mx.
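Putting expressions (10) and (13) to (15) together, a sketch of the melanin index computation follows. The linear ramp between the two thresholds is an assumed form of the transfer characteristic of FIG. 15, and k, q, and e are free parameters.

```python
import numpy as np

def melanin_index(f_in_r, f_in_ir, avg_std_r, avg_std_ir, k=1.0, q=0.0, e=0.5):
    """Melanin distribution and index (expressions (10), (13)-(15), FIG. 15)."""
    # Expression (10): melanin distribution from the red / near-infrared images
    # (float arrays with strictly positive values are assumed).
    mx = k * (np.log(f_in_r / avg_std_r) - np.log(f_in_ir / avg_std_ir)) + q
    # Expression (13): average melanin amount from the image averages.
    mx_avg = k * (np.log(f_in_r.mean() / avg_std_r)
                  - np.log(f_in_ir.mean() / avg_std_ir)) + q
    # Expressions (14) and (15): thresholds around the average melanin amount.
    mx_th1, mx_th2 = mx_avg - e, mx_avg + e
    # Transfer of Mx into ID_Mx in [0, 1]; a linear ramp between the thresholds
    # is assumed here for the transfer characteristic of FIG. 15.
    return np.clip((mx - mx_th1) / (mx_th2 - mx_th1), 0.0, 1.0)
```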
  • the feature section analysis unit 24 calculates the feature values from the skin image and, on the basis of the calculated feature values, extracts the feature sections of the skin. Similar to the first embodiment, the feature section analysis unit 24 includes the feature value calculation unit 241 , the feature section extraction unit 243 , and the statistic calculation unit 244 .
  • the feature value calculation unit 241 calculates the feature values.
  • the feature value calculation unit 241 calculates the feature values using the global brightness information generated in the brightness information separation unit 22 , for example.
  • the feature value calculation unit 241 calculates, as feature values, polarities related to gradation of the skin image, and scales indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas.
  • the feature value calculation unit 241 may further calculate, as feature values, intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas.
  • the feature section extraction unit 243 extracts the feature sections on the basis of the feature values that have been obtained by the feature value calculation unit 241. For example, when the pore portions are extracted as the feature sections, the feature section extraction unit 243 sets a Laplacian of "1" as an extraction condition. This matches the characteristics of the pores, since the Laplacian indicates the polarity related to gradation of the skin image and black pixels are surrounded by white pixels when the Laplacian is "1". Furthermore, the feature section extraction unit 243 limits the scales that indicate the pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas. It is known that the size of a pore is about 0.1 mm to 0.3 mm. Accordingly, the feature section extraction unit 243 sets an extraction condition that the scale of the feature values obtained by the feature value calculation unit 241 correspond to the size of the pores.
  • the feature section extraction unit 243 may limit the intensities.
  • an area where a pore stands out needs a certain amount of signal intensity (contrast difference) with respect to the other areas. Accordingly, the feature section extraction unit 243 sets, as an extraction condition, an intensity (a determinant value) that is capable of extracting pores and the like that stand out.
  • the feature section extraction unit 243 determines, on the basis of the melanin index ID_Mx of an extracted feature section, whether the feature section is an area of a pore or an area of a blemish. For example, when the melanin index ID_Mx is equivalent to or smaller than a predetermined value, the feature section is determined as an area of a pore, and when larger than the predetermined value, the feature section is determined as an area of a blemish.
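A minimal sketch of this determination, assuming a per-pixel melanin index map and a hypothetical threshold value:

```python
def classify_feature_section(feature, id_mx_map, threshold=0.5):
    """Pore / blemish decision for one extracted feature section (a sketch)."""
    # Melanin index at the feature section's centre; both the map lookup and
    # the threshold value are assumptions for illustration.
    id_mx = id_mx_map[feature["y"], feature["x"]]
    return "pore" if id_mx <= threshold else "blemish"
```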
  • the statistic calculation unit 244 calculates statistics of the feature sections that have been extracted by the feature section extraction unit 243. For example, when the pores are extracted as the feature sections while being distinguished from the blemishes, the statistic calculation unit 244 measures the number of extracted feature sections and calculates the number of pores. Furthermore, the statistic calculation unit 244 calculates an average value of the scales of the feature sections of the pores to obtain a statistic of the size of the pores from the average value. Furthermore, the statistic calculation unit 244 calculates an average value of the intensities (Hessian determinant values, for example) of the extracted feature sections to set a color density of the pores. Moreover, the statistic calculation unit 244 may obtain a rate of the areas with a high melanin index.
  • the presentation unit 50 that is provided in the imaging device 11 , the information processing device 15 , or the like presents the analysis result of the feature section analysis unit 24 to a user. For example, the presentation unit 50 displays the statistic that has been calculated in the feature section analysis unit 24 on a screen.
  • FIG. 16 is a flowchart illustrating an operation extracting the feature sections.
  • In step ST11, the image processing apparatus 20 acquires a feature value. The image processing apparatus 20 acquires a feature value that has been calculated using the brightness information and that indicates, for example, a polarity related to gradation of the skin image, a scale indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas, and an intensity (a contrast difference), and proceeds to step ST12.
  • In step ST12, the image processing apparatus 20 discriminates whether the feature value has a predetermined polarity. The image processing apparatus 20 proceeds to step ST13 when the feature value indicates the polarity of the feature sections, and proceeds to step ST18 when it does not.
  • In step ST13, the image processing apparatus 20 discriminates whether the feature value is of a predetermined scale. The image processing apparatus 20 proceeds to step ST14 when the feature value indicates the predetermined scale, and proceeds to step ST18 when it does not.
  • In step ST14, the image processing apparatus 20 discriminates whether the feature value has a predetermined intensity. When the intensity indicated by the feature value corresponds to, for example, the contrast difference between the pore portions and the skin portion, the image processing apparatus 20 proceeds to step ST15. When the feature value does not have the predetermined intensity, the image processing apparatus 20 proceeds to step ST18.
  • In step ST15, the image processing apparatus 20 discriminates whether the melanin index is equivalent to or smaller than a predetermined value. The image processing apparatus 20 proceeds to step ST16 when the melanin index of the feature section is equivalent to or smaller than the predetermined value, and proceeds to step ST18 when the melanin index is larger than the predetermined value.
  • In step ST16, the image processing apparatus 20 determines that the feature value indicates a feature section. For example, when the feature value satisfies the extraction conditions of the pores (blemishes) in steps ST12 through ST14 and the melanin index is equivalent to or smaller than the predetermined value in step ST15, the image processing apparatus 20 determines that the feature value is a feature section that indicates a pore and proceeds to step ST17. Note that when the melanin index is larger than the predetermined value, the feature value may be discriminated as a blemish and the process may proceed to step ST18.
  • In step ST17, the image processing apparatus 20 discriminates whether the extraction processing of the feature sections has been completed. If a feature value that has not yet been determined to be a feature section or not remains, the image processing apparatus 20 proceeds to step ST18; if no such feature value remains, the image processing apparatus 20 ends the operation of extracting the feature sections.
  • In step ST18, the image processing apparatus 20 acquires a new feature value. The image processing apparatus 20 acquires a feature value that has not yet been determined to be a feature section or not and returns to step ST12.
  • the order in which the polarities, the scales, and the intensities are determined is not limited to the order illustrated in FIG. 16 but may be ordered in a different manner. Furthermore, the feature sections may be extracted without performing any determination of the intensity.
  • the image processing apparatus 20 is made capable of distinguishing the pores from the blemishes and, accordingly, the pores are extracted accurately. Furthermore, when presenting the extraction result of the feature sections on the presentation unit 50 , in addition to the first embodiment, the area with high melanin index may be displayed with a different color and the rate of the area with high melanin index that has been obtained by the statistic calculation unit 244 may be displayed.
  • the pores and the blemishes can be distinguished from each other with the melanin analysis result and the pores can be extracted in an accurate manner. Furthermore, since the pores can be distinguished from the blemishes, the states of the pores and blemishes can be presented in an accurate manner.
  • While processing that detects the feature sections of the skin has been described in the first embodiment and the second embodiment, it is desirable that a change in the feature sections of the skin with the elapse of time can be discriminated. For example, when a treatment of reducing the size of the pores or a treatment of getting rid of pimples and the like is performed, it is desirable that a comparison of how the pores and pimples have changed can be performed. Accordingly, in the third embodiment, a description will be given of an image processing apparatus that is capable of positioning images so that corresponding feature sections are positioned at the same positions in a plurality of skin images of different times.
  • FIG. 17 illustrates a configuration of the third embodiment of the image processing apparatus.
  • the image processing apparatus 20 includes the preprocessing unit 21 , the brightness information separation unit 22 , and the feature section analysis unit 24 .
  • the preprocessing unit 21 acquires a skin image and performs preprocessing. Similar to the first embodiment, the preprocessing unit 21 applies contrast enhancement processing to the acquired brightness information of the skin image to emphasize the shadows. Furthermore, when noise stands out, noise removal may be performed before the contrast enhancement, and when shades stand out, shading compensation may be performed before the contrast enhancement.
  • the brightness information separation unit 22 separates the brightness information that has been obtained by the preprocessing unit 21 into global brightness information and local brightness information. In a similar manner to that of the first embodiment, the brightness information separation unit 22 separates the global brightness information, which is information indicating the lighting components included in the image and the structural components of the skin, and the local brightness information, which indicates detailed patterns of the skin, such as texture, from each other.
  • the feature section analysis unit 24 calculates the feature values from the skin image and, on the basis of the calculated feature values, analyzes the feature sections of the skin. As illustrated in FIG. 18 , the feature section analysis unit 24 includes the feature value calculation unit 241 , a positioning unit 242 , the feature section extraction unit 243 , and the statistic calculation unit 244 .
  • the feature value calculation unit 241 calculates the feature values.
  • the feature value calculation unit 241 calculates the feature values using the global brightness information generated in the brightness information separation unit 22 , for example.
  • the feature value calculation unit 241 calculates, as feature values, polarities related to gradation of the skin image, and scales indicating pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas.
  • the feature value calculation unit 241 may further calculate, as feature values, intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas.
  • the feature value calculation unit 241 calculates second feature values used to match the feature sections in order to match the position of the skin image of the past and the position of the skin image of the present to each other.
  • the feature value calculation unit 241 calculates, as the second feature values, feature values that indicate polarities of the intensity change, for example.
  • a square area, for example a "20s × 20s" area centered on a feature section, is rotated by the angle of the orientation and divided into "4 × 4" sub-areas.
  • each of the sub-areas is further divided, and Haar wavelets of size "2s" are applied so as to obtain gradient vectors.
  • four-dimensional vectors (Σdx, Σdy, Σ|dx|, Σ|dy|) are then calculated from the gradient vectors of each sub-area.
  • the four-dimensional vectors indicate, for example, that a sub-area has large gradients in the x direction and that the gradient direction is positive when Σdx and Σ|dx| are large.
  • in this way, the four-dimensional vectors become information that indicates the polarities of the intensity change. Accordingly, the four-dimensional vectors are obtained from the 16 sub-areas, and the resulting 64 feature values are used as the second feature values.
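The following sketch computes such a 64-dimensional descriptor. The Haar-wavelet responses are approximated by first-order differences at scale 2s, boundary handling is omitted, and the exact sampling of the patent (and of SURF) differs in detail.

```python
import numpy as np
import cv2

def second_feature_values(f_g_w, x, y, s, angle_deg=0.0):
    """64-dimensional second feature values around one feature section (a sketch)."""
    img = f_g_w.astype(np.float32)
    # Rotate the image about the feature section (integer coordinates x, y)
    # by the orientation angle, then crop the "20s x 20s" square area.
    m = cv2.getRotationMatrix2D((float(x), float(y)), angle_deg, 1.0)
    rot = cv2.warpAffine(img, m, (img.shape[1], img.shape[0]))
    half = int(round(10 * s))
    patch = rot[y - half:y + half, x - half:x + half]  # boundary handling omitted
    # Haar-wavelet-like responses approximated by first differences at scale 2s.
    k = max(int(round(2 * s)), 1)
    dx = patch[:, k:] - patch[:, :-k]
    dy = patch[k:, :] - patch[:-k, :]
    n = min(dx.shape[0], dy.shape[0], dx.shape[1], dy.shape[1])
    dx, dy = dx[:n, :n], dy[:n, :n]
    desc = []
    for i in range(4):                       # 4 x 4 sub-areas
        for j in range(4):
            sub = np.s_[i * n // 4:(i + 1) * n // 4, j * n // 4:(j + 1) * n // 4]
            # (sum dx, sum dy, sum |dx|, sum |dy|) per sub-area.
            desc += [dx[sub].sum(), dy[sub].sum(),
                     np.abs(dx[sub]).sum(), np.abs(dy[sub]).sum()]
    return np.asarray(desc, dtype=np.float32)  # 16 sub-areas x 4 = 64 values
```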
  • the feature value calculation unit 241 stores the global brightness information and the second feature values that have been calculated from the global brightness information so as to allow the position to be matched with the skin image that is taken subsequently.
  • An information storage unit 25 stores the global brightness information and the second feature values that have been calculated from the global brightness information. Note that the skin image, the skin image after preprocessing, or the global brightness information may be stored in the information storage unit 25 and processing as described above may be performed in the feature value calculation unit 241 and the like to calculate the second feature values, and the calculated second feature values may be output to the positioning unit 242 .
  • the positioning unit 242 performs projective transformation and the like and matches the positions of the new skin image and the skin image of the past. For example, by obtaining a homography matrix from the pairs of feature sections and by performing projective transformation, the position of the new skin image is matched with the position of the skin image of the past. Furthermore, when the number of pairs of feature sections that have been found through matching is equivalent to or smaller than a predetermined number, the positioning unit 242 presents a piece of advice, on the basis of the positions of the paired feature sections, to move the imaging area in a direction in which more pairs of feature sections can be found.
  • in FIG. 21, for example, a piece of advice is presented to move the imaging area to the left so that the positions of the corresponding feature sections of the skin image of the past and those of the skin image of the present are at similar positions.
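A sketch of the matching and positioning, assuming the 64-dimensional descriptors above, brute-force matching, and RANSAC-based homography estimation via OpenCV; min_pairs is a hypothetical stand-in for the predetermined number.

```python
import numpy as np
import cv2

def align_skin_images(pts_past, desc_past, pts_now, desc_now, min_pairs=10):
    """Match feature sections of two skin images and estimate a homography."""
    # Brute-force matching of the 64-dimensional descriptors (float32 arrays).
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(desc_now, desc_past)
    if len(matches) <= min_pairs:
        # Too few pairs: the caller should present advice on moving the
        # imaging area, as in step ST26 / FIG. 21.
        return None
    src = np.float32([pts_now[m.queryIdx] for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([pts_past[m.trainIdx] for m in matches]).reshape(-1, 1, 2)
    # RANSAC-based homography relating the present image to the past image.
    h, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return h  # apply with cv2.warpPerspective(present_image, h, (width, height))
```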
  • the feature section extraction unit 243 extracts the feature sections on the basis of the feature values that have been obtained by the feature value calculation unit 241. For example, when the pore portions are extracted as the feature sections, the feature section extraction unit 243 sets a Laplacian of "1" as an extraction condition. This matches the characteristics of the pores, since the Laplacian indicates the polarity related to gradation of the skin image and black pixels are surrounded by white pixels when the Laplacian is "1". Furthermore, the feature section extraction unit 243 limits the scales that indicate the pixel areas having pixel values that are similar to each other and that are different from the surroundings of the pixel areas. It is known that the size of a pore is about 0.1 mm to 0.3 mm. Accordingly, the feature section extraction unit 243 sets an extraction condition that the scale of the feature values obtained by the feature value calculation unit 241 correspond to the size of the pores.
  • the feature section extraction unit 243 may limit the intensities.
  • an area where a pore stands out needs a certain amount of signal intensity (contrast difference) with respect to the other areas. Accordingly, the feature section extraction unit 243 sets, as an extraction condition, an intensity (a determinant value) that is capable of extracting pores and the like that stand out.
  • by extracting the feature points that satisfy all of the extraction conditions that are set as above as the feature sections of the skin, the feature section extraction unit 243 becomes capable of extracting only the pore areas from the skin image. Note that, as described later, the extraction conditions may be changed so as to extract pimples and the like as feature sections of the skin.
  • the feature section extraction unit 243 extracts the feature sections of the pores, pimples, and the like of the skin by extracting feature points with the polarities related to gradation of the skin image and the scales.
  • the statistic calculation unit 244 calculates statistics of the feature sections that have been extracted by the feature section extraction unit 243. For example, when the pores are extracted as the feature sections, the statistic calculation unit 244 measures the number of extracted feature sections and calculates the number of pores. Furthermore, the statistic calculation unit 244 calculates an average value of the scales of the extracted feature sections to obtain a statistic of the size of the pores from the average value. Furthermore, the statistic calculation unit 244 calculates an average value of the intensities (Hessian determinant values, for example) of the extracted feature sections to set a color density of the pores.
  • the presentation unit 50 that is provided in the imaging device 11 , the information processing device 15 , or the like presents the analysis result of the feature section analysis unit 24 to a user. For example, the presentation unit 50 displays the statistic that has been calculated in the feature section analysis unit 24 on a screen.
  • FIG. 19 is a flowchart illustrating an operation of the third embodiment.
  • In step ST21, the image processing apparatus 20 acquires a skin image. The image processing apparatus 20 acquires the skin image generated by the imaging device 11 and proceeds to step ST22.
  • In step ST22, the image processing apparatus 20 performs preprocessing.
  • the image processing apparatus 20 performs contrast enhancement processing and the like on the brightness information of the acquired skin image so as to emphasize the shadows, and proceeds to step ST23.
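  • The patent does not prescribe a particular enhancement method; as one hedged example, adaptive histogram equalization (CLAHE, via OpenCV) on the brightness channel would emphasize shadow structure in this way.

```python
import cv2

def preprocess_brightness(gray_u8):
    """Contrast-enhance the brightness information so that shadows (pores,
    wrinkles) stand out; CLAHE is an assumed stand-in for the 'contrast
    enhancement processing and the like' of step ST22."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray_u8)  # expects an 8-bit single-channel image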
  • In step ST23, the image processing apparatus 20 separates the brightness information.
  • the image processing apparatus 20 separates the preprocessed brightness information into global brightness information and local brightness information, and proceeds to step ST24.
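  • A minimal sketch of such a separation, assuming a Gaussian low-pass filter stands in for whatever separation filter the apparatus actually uses (the sigma is illustrative):

```python
import numpy as np
from scipy import ndimage

def separate_brightness(luma):
    """Split brightness into a global (low-frequency) and a local (detail)
    component; step ST24 computes the feature values on the global part."""
    luma = np.asarray(luma, dtype=float)
    global_part = ndimage.gaussian_filter(luma, sigma=8.0)  # sigma is illustrative
    local_part = luma - global_part
    return global_part, local_part
```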
  • In step ST24, the image processing apparatus 20 calculates the feature values.
  • using the global brightness information, the image processing apparatus 20 calculates, as feature values, polarities related to gradation of the skin image and scales indicating pixel areas having pixel values that are similar to each other and that are different from their surroundings, and proceeds to step ST25.
  • the image processing apparatus 20 may further calculate intensities indicating signal differences between the image areas that have pixel values similar to each other and the surroundings of the image areas.
  • In step ST25, the image processing apparatus 20 discriminates whether the number of pairs of feature sections is larger than a predetermined number.
  • the image processing apparatus 20 discriminates whether the number of pairs, each consisting of a feature section detected in the past skin image and the corresponding feature section in the present skin image, is larger than the predetermined number. When the number is equal to or smaller than the predetermined number, the image processing apparatus 20 proceeds to step ST26; when the number is larger than the predetermined number, it proceeds to step ST27.
  • In step ST26, the image processing apparatus 20 presents a piece of advice.
  • the image processing apparatus 20 presents a piece of advice, such as a direction in which to move the imaging area, on the presentation unit 50 so that the positions of the paired feature sections in the past skin image and those in the skin image generated by the imaging device 11 become substantially the same, and then returns to step ST21.
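  • One plausible way to derive such advice, not specified by the patent, is from the mean displacement of the matched pairs; the wording and pixel units below are illustrative.

```python
import numpy as np

def movement_advice(past_pts, present_pts):
    """Turn the average offset between matched feature sections into a
    movement direction for the imaging area."""
    # Mean displacement of past features relative to the present ones.
    dx, dy = np.mean(np.asarray(past_pts, float) - np.asarray(present_pts, float), axis=0)
    horizontal = "right" if dx > 0 else "left"
    vertical = "down" if dy > 0 else "up"
    return f"Move the imaging area {horizontal}/{vertical} (offset: {dx:.0f}, {dy:.0f} px)."
```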
  • In step ST27, the image processing apparatus 20 calculates a homography matrix.
  • the image processing apparatus 20 calculates a homography matrix that indicates the positional correspondence between the past skin image and the skin image generated by the imaging device 11, and proceeds to step ST28.
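  • As a sketch of step ST27, the homography can be estimated with a standard RANSAC fit over the matched pairs (OpenCV usage shown; at least four pairs are required, which is one reason the pair count is checked in step ST25).

```python
import cv2
import numpy as np

def homography_between(past_pts, present_pts):
    """Estimate the matrix H mapping present-image coordinates onto the
    past image, so the two skin images can be compared position by position."""
    src = np.asarray(present_pts, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(past_pts, dtype=np.float32).reshape(-1, 1, 2)
    H, _inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```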
  • FIG. 20 exemplifies an operation of the presentation unit.
  • FIG. 20(A) illustrates a state in which the pores are plotted on the past skin image according to their sizes, for example, and
  • FIG. 20(B) shows the distribution (a histogram) of the pore sizes in the past skin image.
  • FIG. 20(C) illustrates a state in which the pores are plotted on the skin image newly generated by the imaging device according to their sizes, for example, and
  • FIG. 20(D) shows the distribution (a histogram) of the pore sizes in the newly generated skin image.
  • changes from before may be presented to the user.
  • the processing sequence explained in the specification can be implemented by hardware, by software, or by a configuration that combines hardware and software.
  • when the processing is implemented by software, a program in which the processing sequence is encoded can be installed in memory within a computer incorporated into dedicated hardware and executed. It is also possible to install the program in a general-purpose computer capable of performing various types of processing and to execute it there.
  • the program can be not only installed on a computer from a removable recording medium, but also transferred to the computer, wirelessly or by wire, from a download site via a network such as a LAN (Local Area Network) or the Internet.
  • a program transferred in the aforementioned manner can be received and installed on a recording medium such as a built-in hard disk.
  • a feature value calculation unit configured to calculate, as feature values, a polarity related to gradation of a skin image and a scale indicating image areas each having pixel values that are similar to each other and that are different from the surroundings of the image areas;
  • a feature section extraction unit configured to extract a feature section of the skin image on a basis of the feature values calculated by the feature value calculation unit.
  • the image processing apparatus further including:
  • a brightness information separation unit configured to perform brightness information separation processing on the skin image so as to acquire global brightness information
  • the feature value calculation unit calculates, as the feature values, intensities indicating signal differences between the image areas each having pixel values that are similar to each other and their surroundings.
  • the image processing apparatus according to any one of (1) to (3), further including:
  • a melanin analysis unit configured to analyze melanin
  • the feature section extraction unit extracts a feature section using the feature values calculated by the feature value calculation unit and an analysis result obtained by the melanin analysis unit.
  • the image processing apparatus according to any one of (1) to (4), further including:
  • an image positioning unit configured to match positions of first and second skin images in a manner that feature sections are consistent with each other
  • the feature value calculation unit calculates a second feature value that indicates a polarity of an intensity change
  • the image positioning unit matches the positions of the skin images by performing matching of the feature sections using the second feature value in a manner that corresponding feature sections of the first and second skin images are consistent with each other.
  • the image positioning unit presents a piece of advice on moving an imaging area in a manner that the position of the corresponding feature section of the second skin image coincides with the position of the corresponding feature section of the first skin image.
  • the feature section extraction unit extracts, as the feature section, at least one of pores, pimples, blemishes, and impurities in pores.
  • the feature section extraction unit extracts at least one of the pores, the pimples, and the blemishes on a basis of feature values calculated by the feature value calculation unit from a skin image taken using white light, and extracts the impurities in pores on a basis of feature values calculated by the feature value calculation unit from a skin image taken using near-ultraviolet light.
  • a statistic calculation unit configured to generate information related to at least one of numbers, sizes, and color densities of feature sections by calculating a statistic on a basis of an extraction result of the feature sections having feature values satisfying an extraction condition set in advance.
  • the skin image is an image in which a surface reflection of the skin has been removed by configuring a polarizing filter provided on a light source and a polarizing filter provided on an imaging unit to have an orthogonal relationship with each other.
  • In the image processing apparatus, the image processing method, the program, and the image processing system of the present technology, polarities related to gradation of a skin image and scales indicating image areas having pixel values that are similar to each other and that are different from their surroundings are calculated as feature values, and feature sections of the skin image are extracted on the basis of the feature values. Accordingly, pores, pimples, and the like of the skin can be accurately detected as feature sections, and various pieces of advice can be given appropriately in accordance with the skin condition. The present technique is therefore suitable for electronic devices that include a skin imaging function, such as digital cameras and portable terminal devices, and for information processing devices and the like that provide various services through a network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Dermatology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
US14/420,522 2012-08-17 2013-07-02 Image processing apparatus, image processing method, program, and image processing system Active 2034-01-09 US9697618B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012180861 2012-08-17
JP2012-180861 2012-08-17
PCT/JP2013/068140 WO2014027522A1 (ja) 2012-08-17 2013-07-02 画像処理装置、画像処理方法、プログラムおよび画像処理システム

Publications (2)

Publication Number Publication Date
US20150213619A1 US20150213619A1 (en) 2015-07-30
US9697618B2 true US9697618B2 (en) 2017-07-04

Family

ID=50685532

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/420,522 Active 2034-01-09 US9697618B2 (en) 2012-08-17 2013-07-02 Image processing apparatus, image processing method, program, and image processing system

Country Status (4)

Country Link
US (1) US9697618B2 (ja)
JP (1) JP6299594B2 (ja)
CN (1) CN104540445B (ja)
WO (1) WO2014027522A1 (ja)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104540444B (zh) * 2012-08-17 2017-08-11 索尼公司 图像处理设备、图像处理方法、以及图像处理系统
GB201321748D0 (en) * 2013-12-09 2014-01-22 Ucb Pharma Sa Therapeutic agents
TW201540264A (zh) * 2014-04-18 2015-11-01 Sony Corp 資訊處理裝置、資訊處理方法、及程式
JP6003963B2 (ja) 2014-11-07 2016-10-05 カシオ計算機株式会社 診断装置並びに当該診断装置における画像処理方法及びそのプログラム
US9881368B2 (en) * 2014-11-07 2018-01-30 Casio Computer Co., Ltd. Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method
JP6003964B2 (ja) 2014-11-07 2016-10-05 カシオ計算機株式会社 診断装置並びに当該診断装置における画像処理方法及びそのプログラム
TWI662356B (zh) * 2014-11-12 2019-06-11 日商新力股份有限公司 Information processing device, information processing method and computer program product
TWI701018B (zh) 2015-01-29 2020-08-11 日商新力股份有限公司 資訊處理裝置、資訊處理方法、及程式
US20160366316A1 (en) * 2015-06-12 2016-12-15 Htc Corporation Skin analysis device and image capturing module thereof
JP6756524B2 (ja) * 2015-06-16 2020-09-16 株式会社シーボン 美容施術効果の解析方法
CN105787929B (zh) * 2016-02-15 2018-11-27 天津大学 基于斑点检测的皮肤疹点提取方法
JP6650819B2 (ja) * 2016-04-15 2020-02-19 株式会社 資生堂 色ムラ部位の評価方法、色ムラ部位評価装置及び色ムラ部位評価プログラム
CN106264463A (zh) * 2016-08-05 2017-01-04 深圳美立知科技有限公司 一种皮肤敏感度分析方法及装置
TWI657799B (zh) * 2016-09-29 2019-05-01 麗寶大數據股份有限公司 電子裝置與其提供膚質檢測資訊的方法
KR102528866B1 (ko) * 2016-12-20 2023-05-04 가부시키가이샤 시세이도 도포 제어 장치, 도포 제어 방법, 프로그램 및 기록매체
EP3384829A1 (en) * 2017-04-05 2018-10-10 Koninklijke Philips N.V. Skin gloss measurement for quantitative estimation of skin gloss
CN108784647B (zh) * 2017-04-27 2021-07-27 立特克科技股份有限公司 皮肤检测装置与其检测方法
JP6898150B2 (ja) * 2017-05-23 2021-07-07 花王株式会社 毛穴検出方法及び毛穴検出装置
CN109064438A (zh) * 2017-06-09 2018-12-21 丽宝大数据股份有限公司 皮肤状态检测方法、电子装置与皮肤状态检测系统
CN110013102B (zh) * 2017-12-22 2021-12-10 卡西欧计算机株式会社 图像处理装置、图像处理方法以及记录介质
CN109978810B (zh) * 2017-12-26 2024-03-12 南通罗伯特医疗科技有限公司 痣的检测方法、系统、设备及存储介质
EP3840646B1 (en) * 2018-08-21 2024-07-03 The Procter & Gamble Company Methods for identifying pore color
TWI770480B (zh) * 2019-03-20 2022-07-11 學校法人慶應義塾 估計方法、估計模型的生成方法、記憶介質和估計裝置
CN110298815B (zh) * 2019-03-27 2023-04-14 天津财经大学 一种皮肤毛孔检测与评价的方法
WO2020218180A1 (ja) * 2019-04-24 2020-10-29 ソニー株式会社 画像処理装置、方法及び電子機器
US11961608B2 (en) * 2019-11-11 2024-04-16 Healthy.Io Ltd. Image processing systems and methods for caring for skin features
US20210196186A1 (en) * 2019-12-30 2021-07-01 L'oreal Acne detection using image analysis
CN115516503A (zh) 2020-05-08 2022-12-23 宝洁公司 用于识别树突状毛孔的方法
CN112274110B (zh) * 2020-10-10 2023-06-16 苏州万微光电科技有限公司 一种基于皮肤荧光图像的毛孔检测系统、装置及方法
CN115358930B (zh) * 2022-10-19 2023-02-03 成都菁蓉联创科技有限公司 一种基于多无人机实时图像拼接方法及目标检测方法

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0755447A (ja) 1993-06-30 1995-03-03 Shiseido Co Ltd 肌の表面状態の解析システム
JPH1176173A (ja) 1997-09-12 1999-03-23 Kenji Hashimoto 肌診断器具
US6215893B1 (en) * 1998-05-24 2001-04-10 Romedix Ltd. Apparatus and method for measurement and temporal comparison of skin surface images
JP2006305184A (ja) 2005-04-28 2006-11-09 Shiseido Co Ltd 肌状態解析方法、肌状態解析装置、肌状態解析プログラム、及び該プログラムが記録された記録媒体
JP2007130329A (ja) 2005-11-11 2007-05-31 Matsushita Electric Works Ltd 画像による外観検査方法
JP2008293325A (ja) 2007-05-25 2008-12-04 Noritsu Koki Co Ltd 顔画像解析システム
US20090054744A1 (en) 2005-04-28 2009-02-26 Naomi Kitamura Skin state analyzing method, skin state analyzing apparatus, and computer-readable medium storing skin state analyzing program
JP2009098925A (ja) 2007-10-17 2009-05-07 Sony Corp 画像処理装置、画像処理方法、および、プログラム
JP2010119431A (ja) 2008-11-17 2010-06-03 Shiseido Co Ltd シワ評価方法、シワ評価装置、シワ評価プログラム、及び該プログラムが記録された記録媒体
JP2011118671A (ja) 2009-12-03 2011-06-16 Kao Corp 画像処理装置、画像処理方法、画像処理システム、肌評価方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003070749A (ja) * 2001-09-07 2003-03-11 Scalar Corp 診断システム、肌の診断方法、及びそれに用いられる情報処理装置、並びにプログラム、記録媒体
JP3919722B2 (ja) * 2003-09-11 2007-05-30 花王株式会社 肌形状計測方法及び肌形状計測装置
JP4548076B2 (ja) * 2003-10-02 2010-09-22 パナソニック電工株式会社 光式生体情報測定装置
JP2007061307A (ja) * 2005-08-30 2007-03-15 Shiseido Co Ltd シミの分類方法
JP5080060B2 (ja) * 2005-11-08 2012-11-21 株式会社 資生堂 肌状態解析方法、肌状態解析装置、肌状態解析プログラム、及び該プログラムが記録された記録媒体
US20080194928A1 (en) * 2007-01-05 2008-08-14 Jadran Bandic System, device, and method for dermal imaging
JP5426475B2 (ja) * 2010-05-21 2014-02-26 株式会社 資生堂 肌の色ムラ解析装置、肌の色ムラ解析方法、及び肌の色ムラ解析プログラム
WO2012012576A1 (en) * 2010-07-20 2012-01-26 Lockheed Martin Corporation Image analysis systems using non-linear data processing techniques and methods using same

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0755447A (ja) 1993-06-30 1995-03-03 Shiseido Co Ltd 肌の表面状態の解析システム
JPH1176173A (ja) 1997-09-12 1999-03-23 Kenji Hashimoto 肌診断器具
US6215893B1 (en) * 1998-05-24 2001-04-10 Romedix Ltd. Apparatus and method for measurement and temporal comparison of skin surface images
JP2006305184A (ja) 2005-04-28 2006-11-09 Shiseido Co Ltd 肌状態解析方法、肌状態解析装置、肌状態解析プログラム、及び該プログラムが記録された記録媒体
US20090054744A1 (en) 2005-04-28 2009-02-26 Naomi Kitamura Skin state analyzing method, skin state analyzing apparatus, and computer-readable medium storing skin state analyzing program
US8591414B2 (en) 2005-04-28 2013-11-26 Shiseido Company, Ltd. Skin state analyzing method, skin state analyzing apparatus, and computer-readable medium storing skin state analyzing program
JP2007130329A (ja) 2005-11-11 2007-05-31 Matsushita Electric Works Ltd 画像による外観検査方法
JP2008293325A (ja) 2007-05-25 2008-12-04 Noritsu Koki Co Ltd 顔画像解析システム
JP2009098925A (ja) 2007-10-17 2009-05-07 Sony Corp 画像処理装置、画像処理方法、および、プログラム
US8265417B2 (en) 2007-10-17 2012-09-11 Sony Corporation Image processing apparatus, method, and program for adding shadow information to images
JP2010119431A (ja) 2008-11-17 2010-06-03 Shiseido Co Ltd シワ評価方法、シワ評価装置、シワ評価プログラム、及び該プログラムが記録された記録媒体
JP2011118671A (ja) 2009-12-03 2011-06-16 Kao Corp 画像処理装置、画像処理方法、画像処理システム、肌評価方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report from International Publication No. PCT/JP2013/068140 mailed Aug. 6, 2013.

Also Published As

Publication number Publication date
CN104540445B (zh) 2017-05-17
WO2014027522A1 (ja) 2014-02-20
US20150213619A1 (en) 2015-07-30
JPWO2014027522A1 (ja) 2016-07-25
CN104540445A (zh) 2015-04-22
JP6299594B2 (ja) 2018-03-28

Similar Documents

Publication Publication Date Title
US9697618B2 (en) Image processing apparatus, image processing method, program, and image processing system
US9811905B2 (en) Anomaly detection in medical imagery
Li et al. Multifocus image fusion via fixed window technique of multiscale images and non-local means filtering
Ajmal et al. A comparison of RGB and HSV colour spaces for visual attention models
CN103914708B (zh) 基于机器视觉的食品品种检测方法及系统
CN110033040B (zh) 一种火焰识别方法、系统、介质和设备
CN103544484A (zh) 一种基于surf的交通标志识别方法及系统
CN106442556A (zh) 一种板状带孔工件表面缺陷检测装置和方法
Qu et al. Detect digital image splicing with visual cues
US20200141804A1 (en) Method and system for hyperspectral light field imaging
CN107529963B (zh) 图像处理装置、图像处理方法和存储介质
Paulsen et al. Introduction to medical image analysis
CN102542564A (zh) 图像处理装置以及图像处理方法
Cho et al. Hyperspectral face recognition using improved inter-channel alignment based on qualitative prediction models
Fang et al. Detection of building shadow in remote sensing imagery of urban areas with fine spatial resolution based on saturation and near-infrared information
Figueiredo et al. An intelligent system for polyp detection in wireless capsule endoscopy images
Roy et al. Detection of retinal microaneurysms using fractal analysis and feature extraction technique
CN107529962B (zh) 图像处理装置、图像处理方法和记录介质
Burge et al. Multispectral iris fusion and cross-spectrum matching
Kar et al. Video shot boundary detection based on Hilbert and wavelet transform
Palm et al. Color texture analysis of moving vocal cords using approaches from statistics and signal theory
Guan et al. A new metric for latent fingerprint image preprocessing
KR101327482B1 (ko) 이파리 영상의 잎맥 검출 및 특징 추출 방법 및 장치
Chen et al. Saliency‐Based Bleeding Localization for Wireless Capsule Endoscopy Diagnosis
Fahn et al. A cross-dataset evaluation of anti-face-spoofing methods using random forests and convolutional neural networks

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAMURA, YUSUKE;GOMI, SHINICHIRO;SIGNING DATES FROM 20141215 TO 20141224;REEL/FRAME:034937/0856

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4