EP2532301B1 - Image processing device, image processing method, image processing system, program, and recording medium - Google Patents
- Publication number: EP2532301B1 (application EP12169199.2A)
- Authority: EP (European Patent Office)
- Prior art keywords: image, unit, measurement, capturing, image processing
- Legal status: Not-in-force (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B5/441: Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- G06T7/44: Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
- G06T2207/10148: Special mode during image acquisition; varying focus
- G06T2207/10152: Special mode during image acquisition; varying illumination
- G06T2207/20081: Training; learning
- G06T2207/30088: Skin; dermal
- G06T2207/30201: Face
- G16H30/20: ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
Definitions
- the present invention relates to an image processing device, an image processing method, an image processing system, a program, and a recording medium.
- a diagnosis system that irradiates a plurality of illumination lights having different wavelengths on a part of a subject to be diagnosed and determines a healthy state and an abnormal state on the basis of spectral reflectances of the respective wavelengths on the part is proposed (see International Publication No. 2006/064635 ).
- US 7657101B2 describes a device for acquiring first and subsequent images of a suspect area on a patient.
- US 2009/0196475 A1 describes automatic mask design and registration and feature detection for computer-aided skin analysis.
- Embodiments of the invention relate to an image processing device, an image processing method, an image processing system, a program, and a recording medium that are suitable to measure a state of skin. According to a first aspect of the present technology, there is provided an image processing device according to claim 1.
- the image processing device may further include a measurement unit configured to measure the state of the skin of the specified part on the basis of the second image captured under the set capturing condition.
- the measurement unit may measure at least one of texture, a wrinkle, a spot, and a keratin plug of the skin in the specified part.
- the image processing device may further include a display control unit configured to control display of a measurement result of the part.
- the setting unit may set at least one of a wavelength of light used to capture the second image, a depth of a focus location from a surface of the part, a relationship between a polarization direction of illumination light and a polarization direction of incident light incident on an image sensor, and a capturing magnification on the basis of at least one of the part and measurement items in the part.
- the image processing device may further include a characteristic analysis unit configured to extract a characteristic amount of the first image.
- the specifying unit may specify the part of the living body within the first image on the basis of the characteristic amount extracted by the characteristic analysis unit.
- the image processing device may further include a learning unit configured to learn identification information for identifying the part of the living body on the basis of the characteristic amount.
- the specifying unit may specify the part of the living body within the first image on the basis of the characteristic amount extracted by the characteristic analysis unit and the identification information learned by the learning unit.
- the image processing device may further include a capturing unit configured to capture the living body.
- a part of a living body within a first image is specified on the basis of a characteristic amount of the first image capturing skin of the living body, and a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part is set.
- a part of a living body within a first image is specified on the basis of a characteristic amount of the first image capturing skin of the living body, and a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part is set on the capturing device.
- the skin state can be simply and correctly recognized.
- FIG. 1 is a block diagram illustrating a first embodiment of the image processing system 1 to which the present technology is applied. The image processing system 1 is a system that captures a part of a subject 2 such as cheeks, a nose, a forehead, or a back of the hand (hereinafter referred to as a target part), measures the state of the skin of the target part of the subject 2, and analyzes the measurement result (e.g., processes various statistics and so forth) on the basis of the captured image (hereinafter referred to as a part image).
- the image processing system 1 measures and analyzes not only the state of the skin surface but also a state inside the skin in which the spot or the like is present and on which the spot or the like appears in the future, for example.
- the image processing system 1 includes a capturing device 11, an image processing device 12, and a display device 13.
- the capturing device 11 and the image processing device 12 perform wired or wireless communication.
- communication methods between the capturing device 11 and the image processing device 12 are not limited to a specific method but any communication methods including the wired or wireless communication method may be employed.
- In FIG. 1, it is illustrated that the capturing device 11 and the image processing device 12 are connected by a cable to perform wired communication.
- FIG. 2 is a block diagram illustrating a configuration example of the capturing device 11.
- the capturing device 11 includes a capturing unit 31, an illumination unit 32, an image processing unit 33, a communication unit 34, and a control unit 35.
- the capturing unit 31 has an image sensor and so forth, captures the skin of the subject, and supplies the obtained part image to the image processing unit 33 under control of the control unit 35.
- the capturing unit 31 has an image sensor capable of capturing light in a wavelength band ranging at least from visible light to UV light.
- the illumination unit 32 irradiates illumination light on a region including the target part of the subject captured by the capturing unit 31 under control of the control unit 35.
- the control unit 35 controls the illumination unit 32 to control the illumination light.
- the image processing unit 33 performs a predetermined image process such as noise removal on the part image and supplies the processed result to the communication unit 34.
- the communication unit 34 is in communication with the image processing device 12 using a predetermined communication method. The communication unit 34 then transmits the part image to the image processing device 12. In addition, the communication unit 34 receives capturing condition setting information transmitted from the image processing device 12 and supplies the capturing condition setting information to the control unit 35.
- the control unit 35 sets the capturing condition of the part image for the capturing unit 31 and the illumination unit 32 on the basis of the capturing condition setting information while controlling the capturing unit 31 and the illumination unit 32.
- FIG. 3 is a block diagram illustrating a configuration example of functions of the image processing device 12.
- the image processing device 12 includes a communication unit 51, a characteristic analysis unit 52, a learning unit 53, a storage unit 54, a part specifying unit 55, a capturing condition setting unit 56, a skin measurement and analysis unit 57, and a display control unit 58.
- the skin measurement and analysis unit 57 includes a texture measurement and analysis unit 71, a keratin plug measurement and analysis unit 72, a spot measurement and analysis unit 73, and a wrinkle measurement and analysis unit 74.
- the communication unit 51 is in communication with the capturing device 11 using a predetermined communication method.
- the characteristic analysis unit 52 receives the part image from the capturing device 11 through the communication unit 51.
- the characteristic analysis unit 52 analyzes the characteristic of the part image and extracts the characteristic amount of the part image as will be described below.
- the characteristic analysis unit 52 supplies information indicating the extracted characteristic amount of the part image to the learning unit 53 or the part specifying unit 55.
- The learning unit 53 learns identification information used to identify, on the basis of the characteristic amount, the part of the subject reflected in the part image, using the information indicating the characteristic amount of the part image supplied from the characteristic analysis unit 52 and correct data given from the outside, as will be described below.
- the learning unit 53 causes learning data for generating the identification information, and identification information obtained from the learning to be stored in the storage unit 54.
- the storage unit 54 stores a measurement condition setting table in addition to the learning data and the identification information described above.
- the measurement condition setting table acts to set the capturing condition, the measurement items, and so forth of the part image that are used to measure the skin state of the target part.
- the measurement condition setting table is created by a user or provided by a manufacturer, a seller, and so forth of the image processing system 1.
- the part specifying unit 55 specifies a part (target part) of the subject within the part image on the basis of the characteristic amount extracted from the part image by the characteristic analysis unit 52 and the identification information stored in the storage unit 54.
- the part specifying unit 55 notifies the capturing condition setting unit 56 and the skin measurement and analysis unit 57 of the specified target part.
- the capturing condition setting unit 56 sets the capturing condition of the capturing device on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54. The capturing condition setting unit 56 then transmits capturing condition setting information indicating the set capturing condition to the capturing device 11 through the communication unit 51.
- the skin measurement and analysis unit 57 receives the part image from the capturing device 11 through the communication unit 51.
- the skin measurement and analysis unit 57 specifies the item to be measured in the target part on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54.
- the skin measurement and analysis unit 57 measures the item to be measured in the target part on the basis of the part image and also analyzes the measurement result.
- the skin measurement and analysis unit 57 then supplies the results of measurement and analysis of the target part and the part image (hereinafter referred to as a measurement and analysis result) to the display control unit 58.
- the items to be measured include, for example, texture, a keratin plug, a spot, and a wrinkle.
- the texture measurement and analysis unit 71, the keratin plug measurement and analysis unit 72, the spot measurement and analysis unit 73, and the wrinkle measurement and analysis unit 74 measure states of the texture, the keratin plug, the spot, and the wrinkle in the target part on the basis of the part image, and analyze the measurement results, respectively.
- the display control unit 58 generates data indicating the measurement and analysis result of the skin state of the target part on the basis of the part image and the measurement and analysis result, and supplies the data to the display device 13.
- FIG. 4 is a block diagram illustrating a configuration example of functions of the characteristic analysis unit 52.
- the characteristic analysis unit 52 includes a filtering unit 81, an averaging unit 82, and a vectorization unit 83.
- The filtering unit 81 applies a plurality of Gabor filters with different parameters to the part image, as will be described below.
- the filtering unit 81 then supplies a plurality of obtained images to the averaging unit 82.
- the averaging unit 82 divides each of the images into small blocks, and calculates an average value of pixel values per small block.
- the averaging unit 82 supplies the calculated averages to the vectorization unit 83.
- the vectorization unit 83 generates the vector including the average values supplied from the averaging unit 82 as a characteristic amount vector indicating the characteristic amount of the part image.
- the vectorization unit 83 supplies the generated characteristic amount vector to the learning unit 53 or the part specifying unit 55.
- this process is initiated when an instruction to initiate the learning process is input to the image processing system 1 through an input unit not shown.
- step S1 the image processing system 1 captures the part image.
- a user captures at least one of cheeks, a nose, a forehead, and a back of the hand of the subject using the capturing device 11.
- the user (capturer) and the subject may be the same person or may not be the same person.
- the capturing unit 31 then supplies the captured part image to the image processing unit 33.
- The image processing unit 33 performs the image process such as noise removal on the part image and also cuts out a predetermined region of the central portion of the part image. Hereinafter, it is assumed that a region of 128 × 128 pixels at the central portion of the part image is cut out.
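- As a rough illustration of this cut-out step, the following Python sketch extracts the central 128 × 128 region from a part image with NumPy; the function name and the example image size are hypothetical.

```python
import numpy as np

def cut_out_center(part_image: np.ndarray, size: int = 128) -> np.ndarray:
    """Cut a size x size region out of the center of the part image."""
    h, w = part_image.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return part_image[top:top + size, left:left + size]

# Example with a simulated 480 x 640 grayscale part image.
part_image = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
print(cut_out_center(part_image).shape)  # (128, 128)
```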
- the image processing unit 33 supplies the part image subjected to the image process to the communication unit 34.
- the communication unit 34 transmits the part image to the image processing device 12.
- the communication unit 51 of the image processing device 12 receives the part image transmitted from the capturing device 11, and supplies the part image to the filtering unit 81 of the characteristic analysis unit 52.
- In step S2, the characteristic analysis unit 52 analyzes the characteristic of the part image.
- the filtering unit 81 of the characteristic analysis unit 52 applies the Gabor filter to the part image by carrying out an operation as in equation (1) below.
- g(x, y) = exp(-(x'² + γ²y'²) / (2σ²)) · cos(2πx'/λ + ψ) ... (1)
- g(x, y) indicates the pixel value of the coordinates (x, y) of the image subjected to the Gabor filter.
- x' and y' are expressed as in equations (2) and (3) below using the x and y coordinates of the part image.
- x' = x cos θ + y sin θ ... (2)
- y' = -x sin θ + y cos θ ... (3)
- λ indicates the wavelength of the cosine component
- θ indicates the direction of the striped pattern of the Gabor function
- ψ indicates the phase offset
- σ indicates the scale
- γ indicates the spatial aspect ratio
- the filtering unit 81 selects ⁇ from four kinds such as 0°, 45°, 90°, and 135° and ⁇ from two kinds such as 1.0 and 5.0 as parameters of the Gabor filter, and applies eight kinds of the Gabor filter as a total combination of ⁇ and ⁇ . Eight result images are thus generated from the part image.
- the filtering unit 81 supplies the generated eight result images to the averaging unit 82.
- The averaging unit 82 divides each of the result images into sixteen small blocks b1 to b16 each having 32 × 32 pixels, as shown in FIG. 6, and obtains an average value of the pixel values of each small block.
- the averaging unit 82 supplies the obtained average values to the vectorization unit 83.
- The vectorization unit 83 generates a 128-dimensional vector whose elements are the 128 average values supplied from the averaging unit 82, as a characteristic amount vector indicating the characteristic of the part image.
- When the characteristic amount vector is denoted by v, for example, v is expressed as in equation (4) below.
- v = (g1b1, g1b2, ..., g1b16, g2b1, ..., g8b16) ... (4)
- gibj indicates the average of the pixel values of the j-th small block of the i-th result image.
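- The following Python sketch illustrates this feature-extraction pipeline with OpenCV and NumPy: eight Gabor filters (four orientations, two values of the second parameter), sixteen 32 × 32 block averages per result image, and a 128-dimensional characteristic amount vector. The kernel size and sigma are assumptions; the patent does not specify them.

```python
import numpy as np
import cv2

def extract_characteristic_vector(part_image: np.ndarray) -> np.ndarray:
    """Gabor filtering -> 32x32 block averaging -> 128-dimensional vector."""
    thetas = [0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]  # 0, 45, 90, 135 degrees
    lambdas = [1.0, 5.0]                                  # the two values given in the text
    averages = []
    for lam in lambdas:
        for theta in thetas:
            # ksize and sigma are illustrative assumptions.
            kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                        lambd=lam, gamma=1.0, psi=0.0)
            result = cv2.filter2D(part_image.astype(np.float32), cv2.CV_32F, kernel)
            # Sixteen small blocks b1..b16 of 32 x 32 pixels each.
            for by in range(4):
                for bx in range(4):
                    block = result[32 * by:32 * (by + 1), 32 * bx:32 * (bx + 1)]
                    averages.append(float(block.mean()))
    return np.array(averages)  # 8 result images x 16 blocks = 128 dimensions

part_image = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(extract_characteristic_vector(part_image).shape)  # (128,)
```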
- the vectorization unit 83 supplies the generated characteristic amount vector to the learning unit 53.
- In step S3, the learning unit 53 accumulates the learning data.
- the user inputs correct data indicating which one is captured among cheeks, a nose, a forehead, and a back of the hand as the part image to the learning unit 53.
- the learning unit 53 causes the storage unit 54 to store the learning data of which the acquired correct data and the characteristic amount vector are associated with each other.
- step S4 the learning unit 53 determines whether or not the learning data is sufficiently accumulated. When it is determined that the learning data is not yet sufficiently accumulated, the process returns to step S1.
- Processes from step S1 to step S4 are then repeatedly carried out until it is determined that the learning data is sufficiently accumulated.
- the user captures a plurality of part images by shifting the location of the part of each of cheeks, a nose, a forehead, and a back of the hand of the subject little by little.
- the image processing system 1 then generates and accumulates the learning data in each of the captured part images.
- the process described above is performed on a plurality of subjects.
- the learning data in a plurality of locations of each part of each of the plurality of subjects is thus accumulated.
- When it is determined in step S4 that the learning data is sufficiently accumulated, the process proceeds to step S5.
- step S5 the learning unit 53 generates a discriminator.
- the learning unit 53 generates a discriminator using a support vector machine (SVM) based on the learning data accumulated in the storage unit 54 as identification information used to identify the part within the part image on the basis of the characteristic amount vector extracted from the part image.
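- A minimal sketch of this learning step with scikit-learn is shown below; the data is randomly generated stand-in data, and the labels, kernel, and array shapes are assumptions rather than the patent's specifics.

```python
import numpy as np
from sklearn.svm import SVC

PARTS = ["cheek", "nose", "forehead", "back_of_hand"]

# Stand-in learning data: one 128-dimensional characteristic amount vector per
# captured part image, associated with the correct part label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))
y = rng.integers(0, len(PARTS), size=200)

# Generate the discriminator with a support vector machine.
discriminator = SVC(kernel="linear")
discriminator.fit(X, y)

# Part specification: classify the characteristic amount vector of a new part image.
v = rng.normal(size=(1, 128))
print(PARTS[int(discriminator.predict(v)[0])])
```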
- the learning process is then finished.
- this process is initiated when an instruction to initiate the skin measurement and analysis process is input to the image processing system 1 through an input unit not shown.
- In step S101, a part image of the part that is the target of the skin state measurement of the subject is captured in the same manner as in the process of step S1 of FIG. 5.
- step S102 the characteristic of the part image is analyzed in the same manner as in the process of step S2 of FIG. 5 .
- the characteristic amount vector obtained from the process is then supplied from the characteristic analysis unit 52 to the part specifying unit 55.
- In step S103, the part specifying unit 55 specifies the target part.
- the part specifying unit 55 specifies the part of the subject (target part) reflected in the part image from cheeks, a nose, a forehead, and a back of the hand on the basis of the characteristic amount vector extracted by the characteristic analysis unit 52 and the discriminator stored in the storage unit 54.
- the part specifying unit 55 notifies the capturing condition setting unit 56 and the skin measurement and analysis unit 57 of the specified target part.
- step S104 the image processing system 1 sets the capturing condition.
- the capturing condition setting unit 56 sets the capturing condition of the capturing device 11 on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54.
- FIG. 8 illustrates an example of the measurement condition setting table.
- the measurement condition setting table includes six items such as target part, wavelength, measurement depth, polarization direction, capturing range, and measurement item.
- the wavelength, the measurement depth, the polarization direction, and the capturing range are items for setting the capturing condition of the capturing device 11, and values of each of the items are determined by at least one of the target part and the measurement items.
- the wavelength indicates the wavelength of light used to capture the part image, and is set to a value suitable for capturing elements (e.g., skin ridge, keratin plug, and so forth) necessary to measure the measurement item of the target part, for example.
- two kinds of set values such as white light and UV light (wavelength 390 nm) are illustrated.
- the measurement depth indicates how deep the measurement is performed from the skin surface.
- the depth is set to 0 mm (skin surface) for all cases.
- the polarization direction indicates a relationship between the polarization direction of illumination light emitted from the illumination unit 32 and the polarization direction of incident light incident on the image sensor of the capturing unit 31.
- three kinds of set values such as parallel, orthogonal, and - (no polarization) are illustrated.
- parallel indicates that the polarization direction of the illumination light is made to be parallel to the polarization direction of the incident light, in other words, the polarization direction of the illumination light is made to be equal to the polarization direction of the incident light. It is thus possible to extract and capture light reflected from the surface of the target part by causing the polarization direction of the illumination light to be parallel to the polarization direction of the incident light.
- Orthogonal indicates that the polarization direction of the illumination light is made to be orthogonal to the polarization direction of the incident light. It is thus possible to block out light reflected from the surface of the target part, and it is thus possible to extract light having components (e.g., components reflected inside the skin) other than the surface-reflected light. "-" (no polarization) indicates that no polarization is applied between the illumination light and the incident light.
- the polarization direction is thus set on the basis of on which one measurement is performed between the surface and the inside of the target part, for example.
- The capturing range indicates the range in which the part image is captured, and, for example, is set to an area that enables elements necessary to measure the measurement items of the target part to be captured. In this case, three kinds of set values such as 1 cm × 1 cm, 3 cm × 3 cm, and 5 cm × 5 cm are illustrated.
- For example, when the target part is the cheeks, the measurement item is the texture, and the number of skin ridges or the like is measured as the texture state as will be described below. Since the size of a skin ridge is about 0.1 to 0.5 mm, the capturing range is set to 1 cm × 1 cm, for example. Similarly, when the target part is the forehead, the measurement item is the wrinkle; since the length of a wrinkle of the forehead is about several cm, the capturing range is set to 5 cm × 5 cm, for example.
- the measurement item indicates the item to be measured on the target part.
- four kinds of set values such as texture, a keratin plug, a spot, and a wrinkle are illustrated.
- the capturing condition setting unit 56 sets the capturing condition on the basis of wavelength, measurement depth, polarization direction, and capturing range with respect to the specified target part. For example, the capturing condition setting unit 56 sets the kind of the light source or the filter to be applied to capturing on the basis of the wavelength corresponding to the target part. In addition, for example, the capturing condition setting unit 56 sets the depth of the focus location of the capturing unit 31 from a surface of the target part on the basis of the measurement depth corresponding to the specified target part. In addition, for example, the capturing condition setting unit 56 sets the polarization direction of the capturing unit 31 and the polarization direction of the illumination unit 32 on the basis of the polarization direction corresponding to the specified target part. In addition, the capturing condition setting unit 56 sets capturing magnification of the capturing unit 31 on the basis of the capturing range corresponding to the specified target part.
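- The sketch below models this lookup in Python. The table entries are assumptions pieced together from the measurement items and capturing ranges mentioned in the text (cheeks/texture, nose/keratin plug, back of the hand/spot, forehead/wrinkle); the actual values of FIG. 8 are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class CapturingCondition:
    wavelength: str            # light used to capture the part image
    measurement_depth_mm: float
    polarization: str          # "parallel", "orthogonal", or "none"
    capturing_range_cm: float  # side length of the captured area
    measurement_item: str

# Hypothetical measurement condition setting table in the spirit of FIG. 8.
MEASUREMENT_CONDITION_TABLE = {
    "cheek":        CapturingCondition("white light", 0.0, "parallel",   1.0, "texture"),
    "nose":         CapturingCondition("UV light",    0.0, "none",       1.0, "keratin plug"),
    "back_of_hand": CapturingCondition("white light", 0.0, "orthogonal", 3.0, "spot"),
    "forehead":     CapturingCondition("white light", 0.0, "parallel",   5.0, "wrinkle"),
}

def look_up_capturing_condition(target_part: str) -> CapturingCondition:
    """Return the capturing condition for the specified target part."""
    return MEASUREMENT_CONDITION_TABLE[target_part]

print(look_up_capturing_condition("forehead"))
```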
- the capturing condition setting unit 56 transmits capturing condition setting information indicating the set capturing conditions to the capturing device 11 through the communication unit 51.
- the control unit 35 of the capturing device 11 then receives the capturing condition setting information through the communication unit 34.
- The control unit 35 then performs setting on the capturing unit 31 and the illumination unit 32 on the basis of the received capturing condition setting information.
- For example, the control unit 35 sets the kind of the light source or the filter to be employed, and thus the wavelength of light used to capture the part image, on the basis of the capturing condition setting information.
- light sources 111a to 111d having different wavelengths from each other and a light source switching circuit 112 may be disposed in the illumination unit 32, and the light source to be used may be switched.
- the wavelength of the illumination light irradiated on the subject and the incident light incident on the image sensor 101 of the capturing unit 31 is thus switched.
- the light sources 111a to 111d include light emitting diodes (LEDs).
- a turret 131 including band pass filters (BPFs) 132a to 132d having different transmission wavelength ranges from each other may be disposed in front of the image sensor 101, and a light source 141 emitting light in a wavelength band ranging from visible light to UV light may be disposed in the illumination unit 32.
- the wavelength of the incident light incident on the image sensor 101 may be switched by rotating the turret 131 and thus adjusting the location of the BPFs 132a to 132d.
- the light source 141 includes an LED.
- a four-way dividing filter 151 including BPFs 151a to 151d having different transmission wavelengths from each other may be disposed such that transmitting light of each BPF is incident on a different region of the image sensor 101, and the region of the image sensor 101 used to capture the part image may thus be switched.
- BPFs 171a to 171d having different transmission wavelength ranges from each other may be disposed so as to correspond to four image sensors 101a to 101d, and the image sensor used for capturing may thus be switched.
- The control unit 35 sets the focus location of the capturing unit 31 on the basis of the capturing condition setting information. In this case, since the measurement depth of all target parts is 0 mm, the focus location is set to the surface of the target part regardless of the individual target part.
- The control unit 35 also sets the polarization direction of illumination light emitted from the illumination unit 32 and the polarization direction of incident light incident on the image sensor of the capturing unit 31 on the basis of the capturing condition setting information.
- The control unit 35 also sets the capturing magnification of the zoom lens of the capturing unit 31 on the basis of the capturing condition setting information.
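- On the device side, the control unit 35 translates the received setting information into concrete hardware settings. The Python sketch below is illustrative only; the mapping from wavelength to light source and from capturing range to zoom magnification is an assumption, not the patent's.

```python
def apply_capturing_condition(wavelength: str, depth_mm: float,
                              polarization: str, range_cm: float) -> dict:
    """Illustrative device-side application of the capturing condition setting information."""
    return {
        # e.g. switch between a white-light LED and a UV (390 nm) LED, or rotate a
        # band pass filter turret in front of the image sensor
        "light_source": "uv_led" if wavelength == "UV light" else "white_led",
        # a measurement depth of 0 mm means focusing on the skin surface itself
        "focus_depth_mm": depth_mm,
        # relationship between illumination and sensor polarization directions
        "polarization": polarization,
        # a smaller capturing range implies a higher zoom magnification (5 cm reference assumed)
        "zoom_magnification": 5.0 / range_cm,
    }

print(apply_capturing_condition("UV light", 0.0, "none", 1.0))
```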
- In step S105, the image processing system 1 performs the same process as in step S1 of FIG. 5, and captures the part image again under the capturing condition set in response to the target part.
- the captured part image is transmitted from the capturing device 11 to the image processing device 12 and then supplied to the skin measurement and analysis unit 57.
- step S106 the skin measurement and analysis unit 57 performs measurement and analysis on the target part.
- the skin measurement and analysis unit 57 specifies items to be measured and analyzed on the target part on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54.
- the texture measurement and analysis unit 71 of the skin measurement and analysis unit 57 measures the texture state of the cheeks and analyzes the measurement result.
- details of the texture measurement and analysis process carried out by the texture measurement and analysis unit 71 will be described with reference to the flowchart of FIG. 13 .
- step S121 the texture measurement and analysis unit 71 detects a high brightness region.
- the skin ridge regions tend to be brighter than other regions such as skin grooves in the part image.
- the texture measurement and analysis unit 71 thus binarizes the part image using a predetermined threshold value and extracts the high brightness regions from the binarized image.
- step S122 the texture measurement and analysis unit 71 performs a labeling process. As a result, the extracted high brightness regions are individually identified.
- step S123 the texture measurement and analysis unit 71 measures the number, size, shape, and direction of the skin ridges.
- the texture measurement and analysis unit 71 obtains the number of skin ridges within the part image by counting the number of identified high brightness regions.
- the texture measurement and analysis unit 71 measures the shape and size of each high brightness region as the shape and size of the skin ridge.
- the texture measurement and analysis unit 71 measures a direction of an edge portion of each high brightness region.
- step S124 the texture measurement and analysis unit 71 calculates the similarity of the size and shape of the skin ridges and the distribution of edge directions of the skin ridges.
- fineness of the texture of the target part is evaluated by the number of skin ridges.
- uniformity of the texture of the target part is evaluated by the similarity of size and shape of the skin ridges and the distribution of edge directions of the skin ridges.
- the texture measurement and analysis process is then finished.
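- A compact Python/OpenCV sketch of steps S121 to S124 is shown below. The binarization threshold and the similarity measure are assumptions, and the shape and edge-direction statistics described above are simplified to a single size-uniformity value.

```python
import numpy as np
import cv2

def measure_texture(part_image: np.ndarray, threshold: int = 128):
    """Detect high brightness (skin ridge) regions, label them, and evaluate the texture."""
    _, binary = cv2.threshold(part_image, threshold, 255, cv2.THRESH_BINARY)
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    ridge_count = num_labels - 1                  # label 0 is the background
    sizes = stats[1:, cv2.CC_STAT_AREA].astype(float)
    # Fineness is evaluated by the number of ridges; uniformity is approximated
    # here by how similar the ridge sizes are.
    uniformity = 1.0 / (1.0 + sizes.std() / sizes.mean()) if ridge_count else 0.0
    return ridge_count, uniformity

part_image = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(measure_texture(part_image))
```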
- the keratin plug measurement and analysis unit 72 of the skin measurement and analysis unit 57 measures the state of the keratin plug of the nose and analyzes the measurement result.
- details of the keratin plug measurement and analysis process carried out by the keratin plug measurement and analysis unit 72 will be described with reference to the flowchart of FIG. 14 .
- step S141 the keratin plug measurement and analysis unit 72 extracts orange or green regions.
- FIG. 15 schematically illustrates the part image of which a surface of the nose is captured using UV light.
- portions illustrated in gray in FIG. 15 are regions of the keratin plugs.
- regions of the keratin plugs illustrated in gray in FIG. 15 exhibit colors close to orange or green, and stand out against the others.
- one color such as gray is illustrated in FIG. 15 , however, color or brightness actually changes within the region.
- the keratin plug measurement and analysis unit 72 thus extracts the orange or green regions as regions at which the keratin plugs within the part image are reflected.
- step S142 the keratin plug measurement and analysis unit 72 performs the labeling process. As a result, the extracted orange or green regions are individually identified.
- In step S143, the keratin plug measurement and analysis unit 72 measures the number and size of the keratin plugs.
- the keratin plug measurement and analysis unit 72 obtains the number of keratin plugs within the part image by counting the number of identified orange or green regions.
- the keratin plug measurement and analysis unit 72 measures the size of each of the orange or green regions as the size of the keratin plug.
- In step S144, the keratin plug measurement and analysis unit 72 calculates an average of the sizes of the keratin plugs.
- an amount of the keratin plugs of the target part is evaluated by the number of keratin plugs and the average of the sizes of the keratin plugs.
- the keratin plug measurement and analysis process is then finished.
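- The sketch below approximates steps S141 to S144 in Python/OpenCV; the orange and green hue ranges are assumptions for an image captured under UV illumination, not values from the patent.

```python
import numpy as np
import cv2

def measure_keratin_plugs(uv_part_image_bgr: np.ndarray):
    """Extract orange or green regions, label them, and report their count and average size."""
    hsv = cv2.cvtColor(uv_part_image_bgr, cv2.COLOR_BGR2HSV)
    orange = cv2.inRange(hsv, (5, 80, 80), (25, 255, 255))   # assumed orange hue range
    green = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))   # assumed green hue range
    mask = cv2.bitwise_or(orange, green)
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    count = num_labels - 1                                    # label 0 is the background
    sizes = stats[1:, cv2.CC_STAT_AREA]
    return count, float(sizes.mean()) if count else 0.0

uv_image = np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8)
print(measure_keratin_plugs(uv_image))
```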
- the spot measurement and analysis unit 73 of the skin measurement and analysis unit 57 measures the spot of the back of the hand and analyzes the measurement result.
- details of the spot measurement and analysis process carried out by the spot measurement and analysis unit 73 will be described with reference to the flowchart of FIG. 16 .
- step S161 the spot measurement and analysis unit 73 extracts low brightness regions.
- spot regions appear as blackish regions in the part image.
- the spot measurement and analysis unit 73 thus binarizes the part image, for example, using a predetermined threshold value, and extracts the low brightness regions from the binarized image.
- step S162 the spot measurement and analysis unit 73 performs the labeling process. As a result, the extracted low brightness regions are individually identified.
- In step S163, the spot measurement and analysis unit 73 measures the number and sizes of the spots.
- the spot measurement and analysis unit 73 obtains the number of spots within the part image by counting the number of identified low brightness regions.
- the spot measurement and analysis unit 73 measures the size of each low brightness region as the size of the spot.
- step S164 the spot measurement and analysis unit 73 calculates an average of the sizes of the spot regions.
- An amount of the spots of the target part is thus evaluated by the number of spots and the average of the sizes, for example.
- the spot measurement and analysis process is then finished.
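- Steps S161 to S164 mirror the texture sketch with the threshold inverted; a minimal version (threshold value assumed) is shown below.

```python
import numpy as np
import cv2

def measure_spots(part_image: np.ndarray, threshold: int = 80):
    """Extract low brightness (blackish) regions, label them, and report count and average size."""
    _, binary = cv2.threshold(part_image, threshold, 255, cv2.THRESH_BINARY_INV)
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    count = num_labels - 1                        # label 0 is the background
    sizes = stats[1:, cv2.CC_STAT_AREA]
    return count, float(sizes.mean()) if count else 0.0

part_image = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(measure_spots(part_image))
```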
- the wrinkle measurement and analysis unit 74 of the skin measurement and analysis unit 57 measures the wrinkle of the forehead and analyzes the measurement result.
- details of the wrinkle measurement and analysis process carried out by the wrinkle measurement and analysis unit 74 will be described with reference to the flowchart of FIG. 17 .
- step S181 the wrinkle measurement and analysis unit 74 extracts edges within the part image. That is, the wrinkle measurement and analysis unit 74 extracts edge regions within the part image using a Sobel filter or the like so as to extract the wrinkle reflected in the part image as shown in FIG. 18 .
- step S182 the wrinkle measurement and analysis unit 74 performs a labeling process. As a result, each of the extracted edge regions is individually identified.
- step S183 the wrinkle measurement and analysis unit 74 measures the number and the sizes of the wrinkles.
- the wrinkle measurement and analysis unit 74 obtains the number of wrinkles within the part image by counting the number of identified edge regions.
- the wrinkle measurement and analysis unit 74 measures the length of each wrinkle within the part image by counting connection pixels of each edge region.
- In step S184, the wrinkle measurement and analysis unit 74 calculates an average of the lengths of the wrinkles.
- An amount of the wrinkles of the target part is thus evaluated by the number of wrinkles and the average of the lengths of the wrinkles, for example.
- the wrinkle measurement and analysis process is then finished.
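- A Python/OpenCV sketch of steps S181 to S184 is shown below; the edge threshold is an assumption, and the wrinkle length is approximated by the number of connected edge pixels as described above.

```python
import numpy as np
import cv2

def measure_wrinkles(part_image: np.ndarray, edge_threshold: float = 100.0):
    """Extract edge regions with a Sobel filter, label them, and report count and average length."""
    gx = cv2.Sobel(part_image, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(part_image, cv2.CV_32F, 0, 1)
    edges = (cv2.magnitude(gx, gy) > edge_threshold).astype(np.uint8) * 255
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(edges)
    count = num_labels - 1                        # label 0 is the background
    lengths = stats[1:, cv2.CC_STAT_AREA]         # connected pixels per edge region
    return count, float(lengths.mean()) if count else 0.0

part_image = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(measure_wrinkles(part_image))
```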
- step S107 the image processing system 1 displays the measurement and analysis result.
- The skin measurement and analysis unit 57 supplies the part image and the measurement and analysis result of the target part to the display control unit 58.
- The display control unit 58 generates data indicating the measurement and analysis result of the skin state of the target part on the basis of the part image and the measurement and analysis result of the target part, and transmits the data to the display device 13 through the communication unit 51.
- the display device 13 for example, displays the measurement and analysis result of the skin state of the target part along with the part image on the basis of the received data.
- the skin measurement and analysis process is then finished.
- As described above, it is possible to automatically specify the target part, capture the part image under the condition suitable for the target part, and perform measurement and analysis on items according to the target part.
- the user can thus correctly and simply measure and analyze the skin state of a desired part without fail.
- The capturing conditions and the measurement items in the measurement condition setting table of FIG. 19 differ from those in the measurement condition setting table of FIG. 8.
- In the measurement condition setting table of FIG. 19, the wavelength is set to UV light, the measurement depth is set to 0.2 mm, and the polarization direction is set to the orthogonal direction. Accordingly, when the target part is the cheeks, the part image of the cheeks is captured under these set conditions.
- the measurement depth of the cheeks is different from the measurement depth of other parts. This difference will be described with reference to FIG. 20 .
- FIG. 20 is a diagram schematically illustrating states of the image sensor 201 and the lens 202 of the capturing unit 31 of the capturing device 11.
- When the measurement depth is 0 mm, the focus location of the lens 202 is set to the surface of the target part, as shown on the left side of FIG. 20.
- When the measurement depth is 0.2 mm, the focus location of the lens 202 is set to a location 0.2 mm deep from the surface of the target part, as shown on the right side of FIG. 20. Since the average thickness of the human epidermis is about 0.2 mm, it is possible to detect spots that are slightly inside the surface of the cheeks and will appear on the surface in the future.
- the present technology may be embodied by other configurations.
- the capturing device 11, the image processing device 12, and the display device 13 may be integrated to be one device.
- two of the capturing device 11, the image processing device 12, and the display device 13 may be integrated.
- The disposition of the functions of the capturing device 11 and the image processing device 12 is not limited to the examples described above; for example, some of the functions of the capturing device 11 may be disposed in the image processing device 12, and some of the functions of the image processing device 12 may be disposed in the capturing device 11.
- The display device 13 may be a display dedicated to the image processing system 1, or a display of another device such as a television receiver or a cellular phone may be employed.
- the present technology may be embodied by the system that performs a remote process through the network such as the image processing system 301 of FIG. 21 .
- the image processing system 301 includes a capturing device 311, a server 312, and a display device 313.
- the capturing device 311, the server 312, and the display device 313 correspond to the capturing device 11, the image processing device 12, and the display device 13, and perform almost the same processes, respectively.
- the capturing device 311, the server 312, and the display device 313 are connected to each other through the network 314 to perform communication.
- communication methods between the capturing device 311, the server 312, and the display device 313 are not limited to particular methods, but arbitrary wired or wireless communication methods may be employed.
- FIG. 22 is a block diagram illustrating a configuration example of the functions of the capturing device 311.
- Portions of FIG. 22 corresponding to those of FIG. 2 are denoted with the same reference numerals, and the description of the portions at which the same processes are carried out will be omitted.
- the capturing device 311 differs from the capturing device 11 of FIG. 2 in that an encryption unit 331 is disposed between the image processing unit 33 and the communication unit 34.
- the encryption unit 331 encrypts or scrambles the part image captured by the capturing unit 31 using a predetermined method so as to ensure security in a transmission path between the capturing device 311 and the server 312.
- the encryption unit 331 supplies the encrypted or scrambled part image to the communication unit 34.
- FIG. 23 is a block diagram illustrating a configuration example of the functions of the server 312.
- Portions of FIG. 23 corresponding to those of FIG. 3 are denoted with the same reference numerals, and the description of the portions at which the same processes are carried out will be omitted.
- the server 312 differs from the image processing device 12 of FIG. 3 in that a decoding unit 551 is disposed between the communication unit 51, the characteristic analysis unit 52, and the skin measurement and analysis unit 57.
- the decoding unit 551 receives the part image from the capturing device 311 through the network 314 and the communication unit 51.
- the decoding unit 551 decodes the encrypted or scrambled part image, and supplies the decoded part image to the characteristic analysis unit 52 or the skin measurement and analysis unit 57.
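- The patent only states that a predetermined method is used to encrypt or scramble the part image; as one possible stand-in, the sketch below uses the third-party cryptography package's Fernet scheme with a key shared by the encryption unit 331 and the decoding unit 551.

```python
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()  # assumed to be shared between device 311 and server 312

def encrypt_part_image(raw_bytes: bytes) -> bytes:
    """Capturing-device side: secure the part image for the transmission path."""
    return Fernet(shared_key).encrypt(raw_bytes)

def decode_part_image(encrypted: bytes) -> bytes:
    """Server side: recover the part image before characteristic analysis."""
    return Fernet(shared_key).decrypt(encrypted)

payload = encrypt_part_image(b"...part image bytes...")
assert decode_part_image(payload) == b"...part image bytes..."
```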
- Methods of analyzing the characteristic of the part image of the characteristic analysis unit 52 are not limited to those described above, but other methods, for example, those of extracting color information or the like as the characteristic amount using other edge extraction filters, may be employed.
- Learning methods of the learning unit 53 are not limited to the SVM mentioned above, but other learning methods, for example, linear discriminant analysis, a neural network, and so forth, may be employed.
- the learning unit 53 may not be disposed in the image processing device 12, and the result of another device carrying out the learning process may be given to the image processing device 12.
- target parts and the measurement items described above are merely examples, and arbitrary target parts and measurement items may be added or deleted.
- combinations of the target part and the measurement items may be set to be different from the examples described above, or a plurality of measurement items may be set for one target part.
- the capturing conditions are merely examples, and arbitrary conditions may be added or deleted.
- intensity of illumination light, exposure of the capturing unit 31, capturing angle, and so forth may be set.
- the present technology may also be applied to a case of measuring and analyzing the skin state of a living body other than a human being.
- the present technology may also be applied to cases not of analyzing the measurement result but only of measuring the skin state, displaying the measurement result only, recording the measurement result only, or supplying the measurement result to other devices.
- the present technology may also be applied to a case of not performing measurement and analysis on the skin state.
- the present technology may also be applied to a case of displaying and recording the part image captured under the capturing condition in response to the specified part, or supplying the part image to other devices.
- the series of processes described above may be carried out by software or hardware.
- a program constituting the software is installed in a computer.
- the computer includes, for example, a computer built in dedicated hardware, and a computer in which various programs are installed and various functions are carried out such as a general purpose personal computer.
- FIG. 24 is a block diagram illustrating a configuration example of the hardware of a computer that carries out the series of processes described above.
- a central processing unit (CPU) 401, a read only memory (ROM) 402, and a random access memory (RAM) 403 are connected to each other by a bus 404 in the computer.
- an input and output interface 405 is connected to the bus 404.
- An input unit 406, an output unit 407, a storage unit 408, a communication unit 409, and a drive 410 are connected to the input and output interface 405.
- the input unit 406 includes a keyboard, a mouse, a microphone, and so forth.
- the output unit 407 includes a display, a speaker, and so forth.
- the storage unit 408 includes a hard disk, a non-volatile memory, and so forth.
- the communication unit 409 includes a network interface, and so forth.
- the drive 410 drives removable media 411 such as a magnetic disk, an optical disc, a magneto-optical disc, and a semiconductor memory.
- the series of processes are carried out by the CPU 401 that causes the program stored in the storage unit 408 to be loaded onto the RAM 403 through the input and output interface 405 and the bus 404 and then executed.
- the program carried out by the computer (CPU 401), for example, may be recorded on the removable media 411 as package media and provided.
- the program may be provided through wired or wireless transmission media such as a local area network, the Internet, and digital satellite broadcasting.
- the program may be installed in the storage unit 408 through the input and output interface 405 by mounting the removable media 411 on the drive 410.
- the program may be received at the communication unit 409 through the wired or wireless transmission medium and then installed on the storage unit 408.
- the program may be installed on the ROM 402 or the storage unit 408 in advance.
- the program carried out by the computer may be a program of which the processes are carried out in time series in accordance with the order described in the present specification, or may be a program of which the processes are carried out in parallel or at a required timing when called upon.
- In the present specification, system means an overall configuration that includes a plurality of devices, means, and so forth.
- the present technology may include the configurations below.
Description
- Typically, devices that measure the state of skin on the basis of an image capturing the skin have been proposed.
- It is desired to be able to simply and correctly recognize the state of the skin through the technique of measuring the state of the skin.
- US 2007/0064985 describes a method and system for analyzing skin conditions in digital images.
- Various respective aspects and features of the invention are defined in the appended claims. Combinations of features from the dependent claims may be combined with features of the independent claims as appropriate and not merely as explicitly set out in the claims.
- The present technology enables the state of the skin to be simply and correctly recognized.
- According to a first aspect of the present technology, there is provided an image processing method according to claim 9.
- According to a first aspect of the present technology, there is provided a program according to claim 10.
- According to a second aspect of the present technology, there is provided an image processing system according to claim 12.
- Embodiments of the invention will now be described with reference to the accompanying drawings, throughout which like parts are referred to by like references, and in which:
-
FIG. 1 is a block diagram illustrating a first embodiment of an image processing system to which the present technology is applied; -
FIG. 2 is a block diagram illustrating a configuration example of functions of a capturing device; -
FIG. 3 is a block diagram illustrating a configuration example of functions of an image processing device; -
FIG. 4 is a block diagram illustrating a configuration example of functions of a characteristic analysis unit; -
FIG. 5 is a flowchart illustrating a learning process; -
FIG. 6 is a diagram illustrating a method of generating a characteristic amount vector; -
FIG. 7 is a flowchart illustrating a skin measurement analysis process; -
FIG. 8 is a diagram illustrating a first example of a measurement condition setting table; -
FIG. 9 is a diagram illustrating a first method of setting a wavelength of light used to capture a part image; -
FIG. 10 is a diagram illustrating a second method of setting a wavelength of light used to capture a part image; -
FIG. 11 is a diagram illustrating a third method of setting a wavelength of light used to capture a part image; -
FIG. 12 is a diagram illustrating a fourth method of setting a wavelength of light used to capture a part image; -
FIG. 13 is a flowchart illustrating a texture measurement analysis process; -
FIG. 14 is a flowchart illustrating a keratin plug measurement analysis process; -
FIG. 15 is a diagram schematically illustrating a part image in which a surface of a nose is captured using ultraviolet (UV) light; -
FIG. 16 is a flowchart illustrating a spot measurement analysis process; -
FIG. 17 is a flowchart illustrating a wrinkle measurement analysis process; -
FIG. 18 is a diagram illustrating a flow of a wrinkle measurement analysis process; -
FIG. 19 is a diagram illustrating a second example of a measurement condition setting table; -
FIG. 20 is a diagram illustrating an example of a focus setting location of a capturing device with respect to a measurement depth; -
FIG. 21 is a diagram illustrating a second embodiment of an image processing system to which the present technology is applied; -
FIG. 22 is a diagram illustrating a configuration example of functions of a capturing device; -
FIG. 23 is a block diagram illustrating a configuration example of functions of a server; and -
FIG. 24 is a block diagram illustrating a configuration example of a computer. -
FIG. 1 is a block diagram illustrating a first embodiment of an image processing system 1 to which the present technology is applied. - The image processing system 1, for example, is a system that captures a part of a subject 2 such as cheeks, a nose, a forehead, and a back of the hand (hereinafter referred to as a target part), measures a state of the skin of the target part of the subject 2 on the basis of the captured image (hereinafter referred to as a part image), and analyzes the measurement result (e.g., computes various statistics and so forth). In addition, the image processing system 1 measures and analyzes not only the state of the skin surface but also, for example, the state inside the skin where a spot or the like is present but has not yet appeared on the surface.
- The image processing system 1 includes a capturing
device 11, an image processing device 12, and a display device 13. The capturing device 11 and the image processing device 12 perform wired or wireless communication. The communication method between the capturing device 11 and the image processing device 12 is not limited to a specific one; any wired or wireless communication method may be employed. In FIG. 1, the capturing device 11 and the image processing device 12 are illustrated as being connected by a cable to perform wired communication. -
FIG. 2 is a block diagram illustrating a configuration example of the capturing device 11. - The capturing
device 11 includes a capturing unit 31, an illumination unit 32, an image processing unit 33, a communication unit 34, and a control unit 35. - The capturing
unit 31 has an image sensor and so forth, captures the skin of the subject, and supplies the obtained part image to the image processing unit 33 under control of the control unit 35. The image sensor of the capturing unit 31 is capable of capturing light in a wavelength band ranging at least from visible light to UV light. - The
illumination unit 32 irradiates illumination light onto a region including the target part of the subject captured by the capturing unit 31 under control of the control unit 35. In addition, a specific example of the configuration of the illumination unit 32 will be described later with reference to FIG. 9. - The
image processing unit 33 performs a predetermined image process such as noise removal on the part image and supplies the processed result to the communication unit 34. - The
communication unit 34 is in communication with the image processing device 12 using a predetermined communication method. The communication unit 34 then transmits the part image to the image processing device 12. In addition, the communication unit 34 receives capturing condition setting information transmitted from the image processing device 12 and supplies the capturing condition setting information to the control unit 35. - The
control unit 35 sets the capturing condition of the part image for the capturing unit 31 and the illumination unit 32 on the basis of the capturing condition setting information while controlling the capturing unit 31 and the illumination unit 32. -
FIG. 3 is a block diagram illustrating a configuration example of functions of the image processing device 12. - The
image processing device 12 includes a communication unit 51, a characteristic analysis unit 52, a learning unit 53, a storage unit 54, a part specifying unit 55, a capturing condition setting unit 56, a skin measurement and analysis unit 57, and a display control unit 58. In addition, the skin measurement and analysis unit 57 includes a texture measurement and analysis unit 71, a keratin plug measurement and analysis unit 72, a spot measurement and analysis unit 73, and a wrinkle measurement and analysis unit 74. - The
communication unit 51 is in communication with the capturing device 11 using a predetermined communication method. - The
characteristic analysis unit 52 receives the part image from the capturing device 11 through the communication unit 51. The characteristic analysis unit 52 analyzes the characteristic of the part image and extracts the characteristic amount of the part image as will be described below. The characteristic analysis unit 52 supplies information indicating the extracted characteristic amount of the part image to the learning unit 53 or the part specifying unit 55. - The
learning unit 53 learns identification information used to identify, on the basis of the characteristic amount, the part of the subject reflected in the part image, using the information indicating the characteristic amount of the part image supplied from the characteristic analysis unit 52 and correct data given from the outside, as will be described below. The learning unit 53 causes the learning data used to generate the identification information, and the identification information obtained from the learning, to be stored in the storage unit 54. - The
storage unit 54 stores a measurement condition setting table in addition to the learning data and the identification information described above. The measurement condition setting table acts to set the capturing condition, the measurement items, and so forth of the part image that are used to measure the skin state of the target part. For example, the measurement condition setting table is created by a user or provided by a manufacturer, a seller, and so forth of the image processing system 1. - In addition, a specific example of the measurement condition setting table will be described below with reference to
FIG. 8 and so forth. - The
part specifying unit 55 specifies a part (target part) of the subject within the part image on the basis of the characteristic amount extracted from the part image by the characteristic analysis unit 52 and the identification information stored in the storage unit 54. The part specifying unit 55 notifies the capturing condition setting unit 56 and the skin measurement and analysis unit 57 of the specified target part. - The capturing
condition setting unit 56 sets the capturing condition of the capturing device on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54. The capturing condition setting unit 56 then transmits capturing condition setting information indicating the set capturing condition to the capturing device 11 through the communication unit 51. - The skin measurement and
analysis unit 57 receives the part image from the capturing device 11 through the communication unit 51. The skin measurement and analysis unit 57 specifies the item to be measured in the target part on the basis of the target part specified by the part specifying unit 55 and the measurement condition setting table stored in the storage unit 54. In addition, the skin measurement and analysis unit 57 measures the item to be measured in the target part on the basis of the part image and also analyzes the measurement result. The skin measurement and analysis unit 57 then supplies the results of measurement and analysis of the target part and the part image (hereinafter referred to as a measurement and analysis result) to the display control unit 58. - In addition, the items to be measured include, for example, texture, a keratin plug, a spot, and a wrinkle. The texture measurement and
analysis unit 71, the keratin plug measurement and analysis unit 72, the spot measurement and analysis unit 73, and the wrinkle measurement and analysis unit 74 measure states of the texture, the keratin plug, the spot, and the wrinkle in the target part on the basis of the part image, and analyze the measurement results, respectively. - The
display control unit 58 generates data indicating the measurement and analysis result of the skin state of the target part on the basis of the part image and the measurement and analysis result, and supplies the data to the display device 13. -
FIG. 4 is a block diagram illustrating a configuration example of functions of the characteristic analysis unit 52. - The
characteristic analysis unit 52 includes a filtering unit 81, an averaging unit 82, and a vectorization unit 83. - The
filtering unit 81 applies a plurality of Gabor filters of different parameters to the part image, as will be described below. The filtering unit 81 then supplies the plurality of obtained images to the averaging unit 82. - The averaging
unit 82 divides each of the images into small blocks, and calculates an average value of the pixel values per small block. The averaging unit 82 supplies the calculated average values to the vectorization unit 83. - The
vectorization unit 83 generates a vector made up of the average values supplied from the averaging unit 82 as a characteristic amount vector indicating the characteristic amount of the part image. The vectorization unit 83 supplies the generated characteristic amount vector to the learning unit 53 or the part specifying unit 55. - Next, processes of the image processing system 1 will be described with reference to
FIGS. 5 to 20 . - First, the learning process carried out by the image processing system 1 will be described with reference to the flowchart of
FIG. 5 . - In addition, for example, this process is initiated when an instruction to initiate the learning process is input to the image processing system 1 through an input unit not shown.
- In step S1, the image processing system 1 captures the part image. In particular, a user captures at least one of cheeks, a nose, a forehead, and a back of the hand of the subject using the
capturing device 11. In addition, the user (capturer) and the subject may be the same person or may not be the same person. - The capturing
unit 31 then supplies the captured part image to the image processing unit 33. - The
image processing unit 33 performs the image process such as noise removal on the part image and also cuts out a predetermined region of the central portion of the part image. Hereinafter, it is assumed that a region of 128 × 128 pixels at the central portion of the part image is cut out. The image processing unit 33 supplies the part image subjected to the image process to the communication unit 34. - The
communication unit 34 transmits the part image to the image processing device 12. - The
communication unit 51 of theimage processing device 12 receives the part image transmitted from the capturingdevice 11, and supplies the part image to thefiltering unit 81 of thecharacteristic analysis unit 52. - In step S2, the
characteristic analysis unit 52 analyzes the characteristic ofthe part image. -
-
- In addition, λ indicates a cosine component of the wavelength, θ indicates a direction of a striped pattern of the Gabor function, ψ indicates a phase offset, σ indicates a scale, and γ indicates a spatial aspect ratio.
- In this case, for example, the
filtering unit 81 selects θ from four kinds such as 0°, 45°, 90°, and 135° and σ from two kinds such as 1.0 and 5.0 as parameters of the Gabor filter, and applies eight kinds of the Gabor filter as the total combination of θ and σ. Eight result images are thus generated from the part image. The filtering unit 81 supplies the generated eight result images to the averaging unit 82. The averaging unit 82 divides each of the result images into sixteen small blocks b1 to b16 each having 32 × 32 pixels as shown in FIG. 6, and obtains an average value of the pixel values of each small block. As a result, 128 average values of the pixel values of the small blocks are obtained in total from the eight result images. The averaging unit 82 supplies the obtained average values to the vectorization unit 83. The vectorization unit 83 generates a 128-dimensional vector made up of the 128 average values supplied from the averaging unit 82 as a characteristic amount vector indicating the characteristic of the part image. When the characteristic amount vector is denoted by v, for example, the characteristic amount vector v is expressed as in equation (4) below.
vectorization unit 83 supplies the generated characteristic amount vector to thelearning unit 53. In step S3, thelearning unit 53 accumulates learning data. In particular, for example, the user inputs correct data indicating which one is captured among cheeks, a nose, a forehead, and a back of the hand as the part image to thelearning unit 53. Thelearning unit 53 causes thestorage unit 54 to store the learning data of which the acquired correct data and the characteristic amount vector are associated with each other. - In step S4, the
learning unit 53 determines whether or not the learning data is sufficiently accumulated. When it is determined that the learning data is not yet sufficiently accumulated, the process returns to step S1. - In step S4, processes from step S1 to step S4 are then repeatedly carried out until it is determined that the learning data is sufficiently accumulated.
- For example, the user captures a plurality of part images by shifting the location of the part of each of cheeks, a nose, a forehead, and a back of the hand of the subject little by little. The image processing system 1 then generates and accumulates the learning data in each of the captured part images. The process described above is performed on a plurality of subjects. The learning data in a plurality of locations of each part of each of the plurality of subjects is thus accumulated.
- In step S4, when it is determined that the learning data is sufficiently accumulated, the process proceeds to step S5.
- In step S5, the
learning unit 53 generates a discriminator. - For example, the
learning unit 53 generates a discriminator using a support vector machine (SVM) based on the learning data accumulated in thestorage unit 54 as identification information used to identify the part within the part image on the basis of the characteristic amount vector extracted from the part image. Thelearning unit 53 then causes the generated discriminator to be stored in thestorage unit 54. - The learning process is then finished.
- Next, the skin measurement and analysis process carried out by the image processing system 1 will be described with reference to the flowchart of
FIG.7 . - In addition, for example, this process is initiated when an instruction to initiate the skin measurement and analysis process is input to the image processing system 1 through an input unit not shown.
- In step S101, the part image of the part as the target where the skin state of the subject is measured is captured in the same manner as in the process of step S1 of
FIG. 5 . - In step S102, the characteristic of the part image is analyzed in the same manner as in the process of step S2 of
FIG. 5 . The characteristic amount vector obtained from the process is then supplied from thecharacteristic analysis unit 52 to thepart specifying unit 55. - In step S103, the
part specifying unit 55 specifics the target part. In particular, thepart specifying unit 55 specifies the part of the subject (target part) reflected in the part image from cheeks, a nose, a forehead, and a back of the hand on the basis of the characteristic amount vector extracted by thecharacteristic analysis unit 52 and the discriminator stored in thestorage unit 54. Thepart specifying unit 55 notifies the capturingcondition setting unit 56 and the skin measurement andanalysis unit 57 of the specified target part. - In step S104, the image processing system 1 sets the capturing condition. In particular, the capturing
condition setting unit 56 sets the capturing condition of the capturingdevice 11 on the basis of the target part specified by thepart specifying unit 55 and the measurement condition setting table stored in thestorage unit 54. -
FIG. 8 illustrates an example of the measurement condition setting table. For example, the measurement condition setting table includes six items such as target part, wavelength, measurement depth, polarization direction, capturing range, and measurement item. Among these items, the wavelength, the measurement depth, the polarization direction, and the capturing range are items for setting the capturing condition of the capturingdevice 11, and values of each of the items are determined by at least one of the target part and the measurement items. - The wavelength indicates the wavelength of light used to capture the part image, and is set to a value suitable for capturing elements (e.g., skin ridge, keratin plug, and so forth) necessary to measure the measurement item of the target part, for example. In this example, two kinds of set values such as white light and UV light (wavelength 390 nm) are illustrated.
- The measurement depth indicates how deep the measurement is performed from the skin surface. In this example, the depth is set to 0 mm (skin surface) for all cases.
- The polarization direction indicates a relationship between the polarization direction of illumination light emitted from the
illumination unit 32 and the polarization direction of incident light incident on the image sensor of the capturingunit 31. In this example, three kinds of set values such as parallel, orthogonal, and - (no polarization) are illustrated. - Here, parallel indicates that the polarization direction of the illumination light is made to be parallel to the polarization direction of the incident light, in other words, the polarization direction of the illumination light is made to be equal to the polarization direction of the incident light. It is thus possible to extract and capture light reflected from the surface of the target part by causing the polarization direction of the illumination light to be parallel to the polarization direction of the incident light.
- Orthogonal indicates that the polarization direction of the illumination light is made to be orthogonal to the polarization direction of the incident light. It is thus possible to block out light reflected from the surface of the target part by causing the polarization direction of the illumination light to be orthogonal to the polarization direction of the incident light, and is thus possible to extract light having components (e.g., components reflected inside the skin) other than the reflected light. (No polarization) indicates that there is no polarization between the illumination light and the incident light.
- >
- The polarization direction is thus set on the basis of on which one measurement is performed between the surface and the inside of the target part, for example.
- The capturing range indicates the range in which the part image is captured, and, for example, is set to an area that enables elements necessary to measure the measurement items of the target part to be captured. In this case, three kinds of set values such as 1 cm×1 cm, 3 cm×3 cm, and 5 cm×5 cm are illustrated.
- For example, the measurement item is the texture when the target part is cheeks, however, the number of skin ridges or the like is measured as the texture state as will be described below. Since the size of the skin ridge is about 0.1 to 0.5 mm, the capturing range is set to 1 cm×1 cm, for example. In addition, the measurement item is the wrinkle when the target part is the forehead, however, since the length of the wrinkle of the forehead is about several cm, the capturing range is set to 5 cm×5 cm, for example.
- The measurement item indicates the item to be measured on the target part. In this case, four kinds of set values such as texture, a keratin plug, a spot, and a wrinkle are illustrated.
- The capturing
condition setting unit 56 sets the capturing condition on the basis of wavelength, measurement depth, polarization direction, and capturing range with respect to the specified target part. For example, the capturingcondition setting unit 56 sets the kind of the light source or the filter to be applied to capturing on the basis of the wavelength corresponding to the target part. In addition, for example, the capturingcondition setting unit 56 sets the depth of the focus location of the capturingunit 31 from a surface of the target part on the basis of the measurement depth corresponding to the specified target part. In addition, for example, the capturingcondition setting unit 56 sets the polarization direction of the capturingunit 31 and the polarization direction of theillumination unit 32 on the basis of the polarization direction corresponding to the specified target part. In addition, the capturingcondition setting unit 56 sets capturing magnification of the capturingunit 31 on the basis of the capturing range corresponding to the specified target part. - The capturing
condition setting unit 56 transmits capturing condition setting information indicating the set capturing conditions to the capturingdevice 11 through thecommunication unit 51. Thecontrol unit 35 of the capturingdevice 11 then receives the capturing condition setting information through thecommunication unit 34. Thecontrol unit 35 then performs setting on the capturingunit 31 and theillumination unit 32 on the basis of the received capturing condition setting information - For example, the
control unit 35 sets the kind of the light source or the filter to be employed and sets the wavelength of light used to capture the part image on the basis of the capturing condition setting information - Here, a specific example of a method of setting the wavelength of light used to capture the part image will be described with reference to
FIGS. 9 to 12 . In addition, hereinafter, an example of setting the wavelength of light used to capture the part image from four kinds will be described. - For example, as shown in
FIG. 9 ,light sources 111a to 111d having different wavelengths from each other and a lightsource switching circuit 112 may be disposed in theillumination unit 32, and the light source to be used may be switched. The wavelength of the illumination light irradiated on the subject and the incident light incident on theimage sensor 101 of the capturingunit 31 is thus switched. - In addition, for example, the
light sources 111a to 111d include light emitting diodes (LEDs). - In addition, for example, as shown in
FIG, 10 , aturret 131 including band pass filters (BPFs) 132a to 132d having different transmission wavelength ranges from each other may be disposed in front of theimage sensor 101, and alight source 141 emitting light in a wavelength band ranging from visible light to UV light may be disposed in theillumination unit 32. The wavelength of the incident light incident on theimage sensor 101 may be switched by rotating theturret 131 and thus adjusting the location of theBPFs 132a to 132d. In addition, for example, thelight source 141 includes an LED. - In addition, for example, as shown in
FIG. 11 , a four-way dividing filter 151 includingBPFs 151a to 151d having different transmission wavelengths from each other may be disposed such that transmitting light of each BPF is incident on a different region of theimage sensor 101, and the region of theimage sensor 101 used to capture the part image may thus be switched. - In addition, for example, as shown in
FIG. 12 ,BPFs 171a to 171d having different transmission wavelength ranges from each other may be disposed so as to correspond to fourimage sensors 101a to 101d, and the image sensor used for capturing may thus be switched. - In addition, the
control unit 35 sets the focus location of the capturingunit 31 on the basis of the capturing condition setting information. In this case, since the measurement depth of all target parts is 0 mm, the focus location is set to surfaces of all of the target parts regardless of the individual target parts. - In addition, the
control unit 35 sets the polarization direction of illumination light emitted from theillumination unit 32 and the polarization direction of incident light incident on the image sensor of the capturingunit 31 on the basis of the capturing condition setting information. - In addition, the
control unit 35 sets the capturing magnification of the zoom lens of the capturingunit 31 on the basis of the capturing condition setting information. - Referring back to
FIG. 7 , in step S105, the image processing system 1 performs the same process as in step S 1 ofFIG. 5 , and captures the part image again under the capturing condition set in response to the target part. The captured part image is transmitted from the capturingdevice 11 to theimage processing device 12 and then supplied to the skin measurement andanalysis unit 57. - In step S106, the skin measurement and
analysis unit 57 performs measurement and analysis on the target part. In particular, the skin measurement andanalysis unit 57 specifies items to be measured and analyzed on the target part on the basis of the target part specified by thepart specifying unit 55 and the measurement condition setting table stored in thestorage unit 54. - When the target part is the cheeks, the texture measurement and
analysis unit 71 of the skin measurement andanalysis unit 57 measures the texture state of the cheeks and analyzes the measurement result. Here, details of the texture measurement and analysis process carried out by the texture measurement andanalysis unit 71 will be described with reference to the flowchart ofFIG. 13 . - In step S121, the texture measurement and
analysis unit 71 detects a high brightness region. In particular, the skin ridge regions tend to be brighter than other regions such as skin grooves in the part image. For example, the texture measurement andanalysis unit 71 thus binarizes the part image using a predetermined threshold value and extracts the high brightness regions from the binarized image. - In step S122, the texture measurement and
analysis unit 71 performs a labeling process. As a result, the extracted high brightness regions are individually identified. - In step S123, the texture measurement and
analysis unit 71 measures the number, size, shape, and direction of the skin ridges. In particular, the texture measurement andanalysis unit 71 obtains the number of skin ridges within the part image by counting the number of identified high brightness regions. In addition, the texture measurement andanalysis unit 71 measures the shape and size of each high brightness region as the shape and size of the skin ridge. In addition, the texture measurement andanalysis unit 71 measures a direction of an edge portion of each high brightness region. - In step S124, the texture measurement and
analysis unit 71 calculates the similarity of the size and shape of the skin ridges and the distribution of edge directions of the skin ridges. - For example, fineness of the texture of the target part is evaluated by the number of skin ridges. In addition, for example, uniformity of the texture of the target part is evaluated by the similarity of size and shape of the skin ridges and the distribution of edge directions of the skin ridges.
- The texture measurement and analysis process is then finished.
- In addition, when the target part is the nose, the keratin plug measurement and
analysis unit 72 of the skin measurement andanalysis unit 57 measures the state of the keratin plug of the nose and analyzes the measurement result. Here, details of the keratin plug measurement and analysis process carried out by the keratin plug measurement andanalysis unit 72 will be described with reference to the flowchart ofFIG. 14 . - In step S141, the keratin plug measurement and
analysis unit 72 extracts orange or green regions. -
FIG. 15 schematically illustrates the part image of which a surface of the nose is captured using UV light. In addition, portions illustrated in gray inFIG. 15 are regions of the keratin plugs. - When the surface of the nose is captured using the UV light, regions of the keratin plugs illustrated in gray in
FIG. 15 exhibit colors close to orange or green, and stand out against the others. In addition, one color such as gray is illustrated inFIG. 15 , however, color or brightness actually changes within the region. - The keratin plug measurement and
analysis unit 72 thus extracts the orange or green regions as regions at which the keratin plugs within the part image are reflected. - In step S142, the keratin plug measurement and
analysis unit 72 performs the labeling process. As a result, the extracted orange or green regions are individually identified. - In step S 143, the keratin plug measurement and
analysis unit 72 measures the number and size of the keratin plugs. In particular, the keratin plug measurement andanalysis unit 72 obtains the number of keratin plugs within the part image by counting the number of identified orange or green regions. In addition, the keratin plug measurement andanalysis unit 72 measures the size of each of the orange or green regions as the size of the keratin plug. - In step S 144, the keratin plug measurement and
analysis unit 72 calculates an average of the sizes of the keratin plugs. - For example, an amount of the keratin plugs of the target part is evaluated by the number of keratin plugs and the average of the sizes of the keratin plugs.
- The keratin plug measurement and analysis process is then finished.
- In addition, when the target part is the back of the hand, the spot measurement and
analysis unit 73 of the skin measurement andanalysis unit 57 measures the spot of the back of the hand and analyzes the measurement result. Here, details of the spot measurement and analysis process carried out by the spot measurement andanalysis unit 73 will be described with reference to the flowchart ofFIG. 16 . - In step S161, the spot measurement and
analysis unit 73 extracts low brightness regions. In particular, spot regions appear as blackish regions in the part image. The spot measurement andanalysis unit 73 thus binarizes the part image, for example, using a predetermined threshold value, and extracts the low brightness regions from the binarized image. - In step S162, the spot measurement and
analysis unit 73 performs the labeling process. As a result, the extracted low brightness regions are individually identified. - In step S163, the spot measurement and
analysis unit 73 measures the number and sizes of the spots. In particular, the spot measurement andanalysis unit 73 obtains the number of spots within the part image by counting the number of identified low brightness regions. In addition, the spot measurement andanalysis unit 73 measures the size of each low brightness region as the size of the spot. - In step S164, the spot measurement and
analysis unit 73 calculates an average of the sizes of the spot regions. - An amount of the spots of the target part is thus evaluated by the number of spots and the average ofthe sizes, for example.
- The spot measurement and analysis process is then finished.
- In addition, when the target part is the forehead, the wrinkle measurement and
analysis unit 74 of the skin measurement andanalysis unit 57 measures the wrinkle of the forehead and analyzes the measurement result. Here, details of the wrinkle measurement and analysis process carried out by the wrinkle measurement andanalysis unit 74 will be described with reference to the flowchart ofFIG. 17 . - In step S181, the wrinkle measurement and
analysis unit 74 extracts edges within the part image. That is, the wrinkle measurement andanalysis unit 74 extracts edge regions within the part image using a Sobel filter or the like so as to extract the wrinkle reflected in the part image as shown inFIG. 18 . - In step S182, the wrinkle measurement and
analysis unit 74 performs a labeling process. As a result, each of the extracted edge regions is individually identified. - In step S183, the wrinkle measurement and
analysis unit 74 measures the number and the sizes of the wrinkles. In particular, the wrinkle measurement andanalysis unit 74 obtains the number of wrinkles within the part image by counting the number of identified edge regions. In addition, the wrinkle measurement andanalysis unit 74 measures the length of each wrinkle within the part image by counting connection pixels of each edge region. - In step S 184, the wrinkle measurement and
analysis unit 74 calculates an average of the lengths of the wrinkles. - An amount of the wrinkles of the target part is thus evaluated by the number of wrinkles and the average of the lengths of the wrinkles, for example.
- The wrinkle measurement and analysis process is then finished.
- Referring back to
FIG. 7 , in step S107, the image processing system 1 displays the measurement and analysis result. - In particular, the skin measurement and
analysis unit 57 supplies the part image and the measurement and analysis result ofthe target part to thedisplay control unit 58. Thedisplay control unit 58 generates data indicating the measurement and analysis result ofthe skin state of the target part on the basis of the part image and the measurement and analysis result of the target part, and transmits the data to thedisplay device 13 through thecommunication unit 51. Thedisplay device 13, for example, displays the measurement and analysis result of the skin state of the target part along with the part image on the basis of the received data. - The skin measurement and analysis process is then finished.
- As described above, it is possible to automatically specify the target part, capture the part image under the condition suitable for the target part, and perform measurement and analysis on items according to the target part. The user can thus correctly and simply measure and analyze the skin state of a desired part without fail.
- In addition, it is possible to simply increase the kind of the target part by increasing the part on which the learning process is performed.
- In addition, it is possible to simply change the measurement item or the capturing condition by changing the measurement condition setting table.
- Here, a case of using the measurement condition setting table of
FIG. 19 instead of the measurement condition setting table ofFIG. 8 will be described. - The capturing condition and the measurement item of the measurement condition setting table of
FIG. 19 is different from the measurement condition setting table ofFIG. 8 . In particular, as the capturing condition of the cheeks, the wavelength is set to UV light, the measurement depth is set to 0.2 mm, and the polarization direction is set to an orthogonal direction in the measurement condition setting table ofFIG. 19 . Accordingly, when the target part is the cheeks, the part image of the cheeks is captured under the set condition. - In addition, the measurement depth of the cheeks is different from the measurement depth of other parts. This difference will be described with reference to
FIG. 20 . -
FIG. 20 is a diagram schematically illustrating states of theimage sensor 201 and thelens 202 of the capturingunit 31 of the capturingdevice 11. - Since the measurement depth is 0.0 mm when the target part is a part other than the cheeks, the focus location of the
lens 202 is set to a surface of the target part as shown in the left side ofFIG. 20 . - On the other hand, since the measurement depth is 0.2 mm when the target part is the cheeks, the focus location of the
lens 202 is set to a location 0.2 mm deep from the surface of the target part as shown in the right side ofFIG. 20 . Since an average thickness of the human epidermis is about 0.2 mm, it is possible to detect spots that are slightly inside the surface of the cheeks and will appear on the surface in the future. - Hereinafter, modified examples of the embodiments of the present technology will be described.
- Although the image processing system 1 including the capturing
device 11, theimage processing device 12, and thedisplay device 13 has been described, the present technology may be embodied by other configurations. - For example, the capturing
device 11, theimage processing device 12, and thedisplay device 13 may be integrated to be one device. In addition, for example, two of the capturingdevice 11, theimage processing device 12, and thedisplay device 13 may be integrated. - In addition, disposition of functions of the capturing
device 11 and theimage processing device 12 are not limited to the examples described above, for example, some of the functions of the capturingdevice 11 may be disposed in theimage processing device 12 and some of the functions of theimage processing device 12 may be disposed in the capturingdevice 11. - In addition, the
display device 13 may be disposed to be dedicated to the image processing system 1, and display devices of other devices such as a television receiver or a cellular phone may also be employed. - In addition, for example, the present technology may be embodied by the system that performs a remote process through the network such as the
image processing system 301 ofFIG. 21 . - The
image processing system 301 includes acapturing device 311, aserver 312, and adisplay device 313. Thecapturing device 311, theserver 312, and thedisplay device 313 correspond to the capturingdevice 11, theimage processing device 12, and thedisplay device 13, and perform almost the same processes, respectively. - The
capturing device 311, theserver 312, and thedisplay device 313 are connected to each other through thenetwork 314 to perform communication. In addition, communication methods between the capturingdevice 311, theserver 312, and thedisplay device 313 are not limited to particular methods, but arbitrary wired or wireless communication methods may be employed. -
FIG. 22 is a block diagram illustrating a configuration example of the functions of thecapturing device 311. In addition, portions ofFIG. 22 corresponding to those ofFIG. 2 are denoted with the same numerals, and the description of the portions at which the same processes are carried out will not be repeated, but rather, omitted. - The
capturing device 311 differs from the capturingdevice 11 ofFIG. 2 in that anencryption unit 331 is disposed between theimage processing unit 33 and thecommunication unit 34. - The
encryption unit 331 encrypts or scrambles the part image captured by the capturingunit 31 using a predetermined method so as to ensure security in a transmission path between the capturingdevice 311 and theserver 312. Theencryption unit 331 supplies the encrypted or scrambled part image to thecommunication unit 34. -
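The encryption method used by the encryption unit 331 is not specified; as one possible illustration, a sketch using symmetric encryption with Fernet from the Python cryptography package is shown below. The key handling and function names are assumptions, and how the shared key is provisioned to the capturing device 311 and the server 312 is outside this sketch.

```python
from cryptography.fernet import Fernet

# A shared key must be available to both the capturing device 311 and the server 312.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_part_image(encoded_image_bytes):
    """Capturing device 311 side: protect the encoded part image before transmission."""
    return cipher.encrypt(encoded_image_bytes)

def decrypt_part_image(token):
    """Decoding unit 551 side: recover the encoded part image bytes."""
    return cipher.decrypt(token)
```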
FIG. 23 is a block diagram illustrating a configuration example of the functions of theserver 312. In addition, portions ofFIG. 23 corresponding to those ofFIG. 3 are denoted with the same numerals, and the description of the portions where the same processes are carried out will not be repeated, but rather, omitted. - The
server 312 differs from theimage processing device 12 ofFIG. 3 in that adecoding unit 551 is disposed between thecommunication unit 51, thecharacteristic analysis unit 52, and the skin measurement andanalysis unit 57. - The
decoding unit 551 receives the part image from thecapturing device 311 through thenetwork 314 and thecommunication unit 51. Thedecoding unit 551 decodes the encrypted or scrambled part image, and supplies the decoded part image to thecharacteristic analysis unit 52 or the skin measurement andanalysis unit 57. - Methods of analyzing the characteristic of the part image of the
characteristic analysis unit 52 are not limited to those described above, but other methods, for example, those of extracting color information or the like as the characteristic amount using other edge extraction filters, may be employed. Learning methods of thelearning unit 53 are not limited to the SVM mentioned above, but other learning methods, for example, linear discriminant analysis, neural net, and so forth, may be employed. - In addition, the
learning unit 53 may not be disposed in theimage processing device 12, and the result of another device carrying out the learning process may be given to theimage processing device 12. - The target parts and the measurement items described above are merely examples, and arbitrary target parts and measurement items may be added or deleted. In addition, combinations of the target part and the measurement items may be set to be different from the examples described above, or a plurality of measurement items may be set for one target part.
- The capturing conditions are merely examples, and arbitrary conditions may be added or deleted. For example, intensity of illumination light, exposure of the capturing
unit 31, capturing angle, and so forth may be set. - The present technology may also be applied to a case of measuring and analyzing the skin state of a living body other than a human being.
- In addition, for example, the present technology may also be applied to cases not of analyzing the measurement result but only of measuring the skin state, displaying the measurement result only, recording the measurement result only, or supplying the measurement result to other devices.
- In addition, for example, the present technology may also be applied to a case of not performing measurement and analysis on the skin state. For example, the present technology may also be applied to a case of displaying and recording the part image captured under the capturing condition in response to the specified part, or supplying the part image to other devices.
- The series of processes described above may be carried out by software or hardware. When the series of processes are carried out by the software, a program constituting the software is installed in a computer. Here, the computer includes, for example, a computer built in dedicated hardware, and a computer in which various programs are installed and various functions are carried out such as a general purpose personal computer.
-
FIG. 24 is a block diagram illustrating a configuration example of the hardware of a computer that carries out the series of processes described above. -
bus 404 in the computer. - In addition, an input and
output interface 405 is connected to thebus 404. Aninput unit 406, anoutput unit 407, astorage unit 408, acommunication unit 409, and adrive 410 are connected to the input andoutput interface 405. - The
input unit 406 includes a keyboard, a mouse, a microphone, and so forth. Theoutput unit 407 includes a display, a speaker, and so forth. Thestorage unit 408 includes a hard disk, a non-volatile memory, and so forth. Thecommunication unit 409 includes a network interface, and so forth. Thedrive 410 drivesremovable media 411 such as a magnetic disk, an optical disc, a magneto-optical disc, and a semiconductor memory. - In the computer described above, for example, the series of processes are carried out by the
CPU 401 that causes the program stored in thestorage unit 408 to be loaded onto theRAM 403 through the input andoutput interface 405 and thebus 404 and then executed. - The program carried out by the computer (CPU 401), for example, may be recorded on the
removable media 411 as package media and provided. In addition, the program may be provided through wired or wireless transmission media such as a local area network, the Internet, and digital satellite broadcasting. - In the computer, the program may be installed in the
storage unit 408 through the input andoutput interface 405 by mounting theremovable media 411 on thedrive 410. In addition, the program may be received at thecommunication unit 409 through the wired or wireless transmission medium and then installed on thestorage unit 408. Also, the program may be installed on theROM 402 or thestorage unit 408 in advance. - In addition, the program carried out by the computer may be a program of which the processes are carried out in time series in accordance with the order described in the present specification, or may be a program of which the processes are carried out in parallel or at a required timing when called upon.
- In addition, the term system means a general device configured to include a plurality of devices, means, and so forth in the present specification.
- In addition, the embodiments of the present technology are not limited to the embodiments described above, but may be variously changed within the scope without departing from the subject matter of the present technology as set out in the claims.
- In so far as the embodiments of the invention described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a transmission, storage or other medium by which such a computer program is provided are envisaged as aspects of the present invention.
- In addition, for example, the present technology may include the configurations below.
- (1) An image processing device including:
- a specifying unit configured to specify a part of a living body within a first image on the basis of a characteristic amount of the first image capturing skin of the living body; and
- a setting unit configured to set a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part.
- (2) The image processing device according to (1), further including:
- a measurement unit configured to measure the state of the skin of the specified part on the basis of the second image captured under the set capturing condition.
- (3) The image processing device according to (2), wherein the measurement unit measures at least one of texture, a wrinkle, a spot, and a keratin plug of the skin in the specified part.
- (4) The image processing device according to (2), further including:
- a display control unit configured to control display of a measurement result of the part.
- (5) The image processing device according to any one of (1) to (4), wherein the setting unit sets at least one of a wavelength of light used to capture the second image, a depth of a focus location from a surface of the part, a relationship between a polarization direction of illumination light and a polarization direction of incident light incident on an image sensor, and a capturing magnification on the basis of at least one of the part and measurement items in the part.
- (6) The image processing device according to any one of (1) to (5), further including:
- a characteristic analysis unit configured to extract a characteristic amount of the first image,
- wherein the specifying unit specifies the part of the living body within the first image on the basis of the characteristic amount extracted by the characteristic analysis unit.
- (7) The image processing device according to (6), further including:
- a learning unit configured to learn identification information for identifying the part of the living body on the basis of the characteristic amount, wherein the specifying unit specifies the part of the living body within the first image on the basis of the characteristic amount extracted by the characteristic analysis unit and the identification information learned by the learning unit.
- (8) The image processing device according to any one of (1) to (7), further including:
- a capturing unit configured to capture the living body.
- (9) An image processing method including:
- by an image processing device, specifying a part of a living body within a first image on the basis of a characteristic amount of the first image capturing skin of the living body; and
- setting a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part.
- (10) A program for causing a computer to execute processes including:
- specifying a part of a living body within a first image on the basis of a characteristic amount of the first image capturing skin of the living body; and
setting a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part.
- (11) A computer-readable recording medium having the program according to (10) recorded thereon.
- (12) An image processing system including:
- a capturing device; and
- an image processing device,
wherein the image processing device includes
a specifying unit configured to specify a part of a living body within a first image on the basis of a characteristic amount of the first image of skin of the living body captured by the capturing device; and
a setting unit configured to set, on the capturing device, a capturing condition under which a second image indicating a state of the skin of the part is captured in response to the specified part.
Claims (12)
- An image processing device (12) comprising: a specifying unit (55) configured to specify a part of a living body within a first image on the basis of a characteristic amount of the first image capturing skin of the living body and identification information for identifying the part of the living body on the basis of the characteristic amount; and a setting unit (56) configured to set a capturing condition under which a second image indicating a state of the skin of the part is captured; and a storage unit (54) configured to store a measurement condition setting table identifying capturing conditions corresponding to a plurality of parts of the living body; wherein the setting unit (56) is configured to set the capturing condition under which the second image is captured to the capturing condition of the measurement condition setting table corresponding to the part of the living body identified by the specifying unit (55).
- The image processing device according to claim 1, further comprising: a measurement unit (57) configured to measure the state of the skin of the specified part on the basis of the second image captured under the set capturing condition.
- The image processing device according to claim 2, wherein the measurement unit (57) measures at least one of texture, a wrinkle, a spot, and a keratin plug of the skin in the specified part.
- The image processing device according to claim 2, further comprising: a display control unit (58) configured to control display of a measurement result of the part.
- The image processing device according to claim 1, wherein the setting unit (56) sets at least one of a wavelength of light used to capture the second image, a depth of a focus location from a surface of the part, a relationship between a polarization direction of illumination light and a polarization direction of incident light incident on an image sensor, and a capturing magnification on the basis of at least one of the part and measurement items in the part.
- The image processing device according to claim 1, further comprising: a characteristic analysis unit (52) configured to extract a characteristic amount of the first image, wherein the specifying unit (55) specifies the part of the living body within the first image on the basis of the characteristic amount extracted by the characteristic analysis unit.
- The image processing device according to claim 6, further comprising: a learning unit (53) configured to learn the identification information for identifying the part of the living body on the basis of the characteristic amount.
- The image processing device according to claim 1, further comprising: a capturing unit (31) configured to capture the first image and the second image.
- An image processing method by an image processing device comprising: storing a measurement condition setting table identifying capturing conditions corresponding to a plurality of parts of the living body; specifying (S103) a part of a living body within a first image on the basis of a characteristic amount of the first image capturing skin of the living body and identification information for identifying the part of the living body on the basis of the characteristic amount; and setting (S104) a capturing condition under which a second image indicating a state of the skin of the part is captured; wherein the capturing condition under which the second image is captured is set to the capturing condition of the measurement condition setting table corresponding to the identified part of the living body.
- A program for causing a computer to execute the method according to claim 9.
- A computer-readable recording medium having the program according to claim 10 recorded thereon.
- An image processing system comprising: a capturing device (11); and an image processing device (12) according to claim 1.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011126021A JP5733032B2 (en) | 2011-06-06 | 2011-06-06 | Image processing apparatus and method, image processing system, program, and recording medium |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2532301A1 EP2532301A1 (en) | 2012-12-12 |
EP2532301B1 true EP2532301B1 (en) | 2014-12-03 |
Family
ID=46651327
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12169199.2A Not-in-force EP2532301B1 (en) | 2011-06-06 | 2012-05-24 | Image processing device, image processing method, image processing system, program, and recording medium |
Country Status (4)
Country | Link |
---|---|
US (1) | US9517029B2 (en) |
EP (1) | EP2532301B1 (en) |
JP (1) | JP5733032B2 (en) |
CN (1) | CN102973242B (en) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8184901B2 (en) * | 2007-02-12 | 2012-05-22 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
FR2969915B1 (en) * | 2011-01-04 | 2014-01-03 | Ergylink | SYSTEM AND METHOD FOR ANALYZING AT LEAST ONE CHARACTERISTIC OF THE SKIN |
TWI556116B (en) * | 2012-02-15 | 2016-11-01 | Hitachi Maxell | Skin condition analysis and analysis information management system, skin condition analysis and analysis information management method, and data management server |
JP6251489B2 (en) * | 2013-03-28 | 2017-12-20 | 株式会社 資生堂 | Image analysis apparatus, image analysis method, and image analysis program |
JP6379519B2 (en) * | 2014-02-27 | 2018-08-29 | カシオ計算機株式会社 | Skin treatment device, skin treatment method and program |
TW201540264A (en) * | 2014-04-18 | 2015-11-01 | Sony Corp | Information processing device, information processing method, and program |
TWI669103B (en) * | 2014-11-14 | 2019-08-21 | 日商新力股份有限公司 | Information processing device, information processing method and program |
US9918662B2 (en) * | 2015-07-13 | 2018-03-20 | Victoria Anna Breitner | Measuring cervical spine posture using nostril tracking |
CN108885134A (en) | 2016-02-08 | 2018-11-23 | 平等化妆品公司 | For preparing and outpouring the device and method of visual customization cosmetics |
KR102429838B1 (en) * | 2016-03-11 | 2022-08-05 | (주)아모레퍼시픽 | Evaluation device for skin texture based on skin blob and method thereof |
US11122206B2 (en) | 2016-11-08 | 2021-09-14 | Preh Holding, Llc | Personal care device with camera |
US10511777B2 (en) | 2016-11-08 | 2019-12-17 | Thomas Nichols | Personal care device with camera |
CN110178014B (en) * | 2016-11-14 | 2023-05-02 | 美国西门子医学诊断股份有限公司 | Method and apparatus for characterizing a sample using pattern illumination |
CN110191661B (en) * | 2016-12-20 | 2022-07-05 | 株式会社资生堂 | Coating control device, coating control method, and recording medium |
JP6918584B2 (en) * | 2017-06-08 | 2021-08-11 | 花王株式会社 | Pore clogging evaluation method and pore clogging evaluation device |
CN109493310A (en) * | 2017-09-08 | 2019-03-19 | 丽宝大数据股份有限公司 | Biological information analytical equipment and its hand skin analysis method |
JP7010057B2 (en) * | 2018-02-26 | 2022-01-26 | オムロン株式会社 | Image processing system and setting method |
CN110547866B (en) * | 2018-06-01 | 2022-01-11 | 福美生技有限公司 | Feedback energy release system and method of operation thereof |
US10575623B2 (en) | 2018-06-29 | 2020-03-03 | Sephora USA, Inc. | Color capture system and device |
EP3782536A1 (en) * | 2019-08-20 | 2021-02-24 | Koninklijke Philips N.V. | Identifying a body part |
USD1000624S1 (en) | 2019-12-27 | 2023-10-03 | Thomas Nichols | Personal care device with camera |
JP7264296B1 (en) | 2022-04-20 | 2023-04-25 | 堺化学工業株式会社 | Condition determining method, condition determining device, and condition determining program for determining condition of hair |
JP7505714B2 (en) * | 2022-05-08 | 2024-06-25 | 知行 宍戸 | AI skin analysis method, device, or system for skin including bedsore scars, and AI model for identifying skin units |
JP7348448B1 (en) | 2022-06-01 | 2023-09-21 | 株式会社 サティス製薬 | AI skin moisture analysis method, device, or system and trained AI skin moisture analysis model |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2973676B2 (en) * | 1992-01-23 | 1999-11-08 | 松下電器産業株式会社 | Face image feature point extraction device |
JPH0775629A (en) * | 1993-09-07 | 1995-03-20 | Kao Corp | Method and device for observing surface of skin |
JP2004073802A (en) * | 2002-08-20 | 2004-03-11 | Forest Denki Kk | Multipurpose measuring unit for skin |
US20040207743A1 (en) * | 2003-04-15 | 2004-10-21 | Nikon Corporation | Digital camera system |
JP2005296280A (en) * | 2004-04-09 | 2005-10-27 | Kao Corp | Method for preparing photo scale |
JP4761924B2 (en) * | 2004-10-22 | 2011-08-31 | 株式会社 資生堂 | Skin condition diagnosis system and beauty counseling system |
JPWO2006064635A1 (en) | 2004-12-17 | 2008-06-12 | コニカミノルタホールディングス株式会社 | Diagnostic system |
JP2006218291A (en) * | 2005-01-13 | 2006-08-24 | Nidec Copal Corp | Skin observation apparatus |
JP4498224B2 (en) * | 2005-06-14 | 2010-07-07 | キヤノン株式会社 | Image processing apparatus and method |
US7454046B2 (en) * | 2005-09-20 | 2008-11-18 | Brightex Bio-Photonics, Llc | Method and system for analyzing skin conditions using digital images |
JP5080060B2 (en) * | 2005-11-08 | 2012-11-21 | 株式会社 資生堂 | Skin condition analysis method, skin condition analysis apparatus, skin condition analysis program, and recording medium on which the program is recorded |
JP2008011994A (en) * | 2006-07-04 | 2008-01-24 | Shiseido Co Ltd | Spots recurrence discrimination method |
JP4222388B2 (en) * | 2006-07-10 | 2009-02-12 | ソニー株式会社 | Image management apparatus, image recording apparatus, imaging apparatus, image management system, image analysis information management method, and program |
US20100185064A1 (en) * | 2007-01-05 | 2010-07-22 | Jadran Bandic | Skin analysis methods |
US20090245603A1 (en) * | 2007-01-05 | 2009-10-01 | Djuro Koruga | System and method for analysis of light-matter interaction based on spectral convolution |
JP5080116B2 (en) * | 2007-03-23 | 2012-11-21 | 株式会社 資生堂 | Skin imaging device |
US8718336B2 (en) * | 2007-03-30 | 2014-05-06 | Fujifilm Corporation | Image correction apparatus, image correction method, skin diagnosis method, and computer program product |
JP2009000494A (en) * | 2007-05-23 | 2009-01-08 | Noritsu Koki Co Ltd | Porphyrin detection method, porphyrin display method, and porphyrin detector |
JP2009000410A (en) * | 2007-06-25 | 2009-01-08 | Noritsu Koki Co Ltd | Image processor and image processing method |
JP5290585B2 (en) * | 2008-01-17 | 2013-09-18 | 株式会社 資生堂 | Skin color evaluation method, skin color evaluation device, skin color evaluation program, and recording medium on which the program is recorded |
US8218862B2 (en) * | 2008-02-01 | 2012-07-10 | Canfield Scientific, Incorporated | Automatic mask design and registration and feature detection for computer-aided skin analysis |
KR101475684B1 (en) * | 2008-10-17 | 2014-12-23 | 삼성전자주식회사 | Apparatus and method for improving face image in digital image processing device |
JP2010097126A (en) * | 2008-10-20 | 2010-04-30 | Shiseido Co Ltd | Face illuminating device and color measuring device using the same |
US8373859B2 (en) * | 2009-03-27 | 2013-02-12 | Brightex Bio-Photonics Llc | Methods and systems for imaging skin using polarized lighting |
JP2010250420A (en) * | 2009-04-13 | 2010-11-04 | Seiko Epson Corp | Image processing apparatus for detecting coordinate position of characteristic part of face |
TW201116257A (en) * | 2009-11-13 | 2011-05-16 | Inst Information Industry | System and method for analysis of facial defects and computer program product |
- 2011
  - 2011-06-06 JP JP2011126021A patent/JP5733032B2/en active Active
- 2012
  - 2012-05-24 EP EP12169199.2A patent/EP2532301B1/en not_active Not-in-force
  - 2012-05-30 US US13/483,446 patent/US9517029B2/en active Active
  - 2012-06-06 CN CN201210184838.1A patent/CN102973242B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JP2012249917A (en) | 2012-12-20 |
EP2532301A1 (en) | 2012-12-12 |
CN102973242B (en) | 2017-09-29 |
JP5733032B2 (en) | 2015-06-10 |
US9517029B2 (en) | 2016-12-13 |
US20120307032A1 (en) | 2012-12-06 |
CN102973242A (en) | 2013-03-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2532301B1 (en) | Image processing device, image processing method, image processing system, program, and recording medium | |
USRE47921E1 (en) | Reflectance imaging and analysis for evaluating tissue pigmentation | |
EP3270354B1 (en) | Living body detection device, living body detection method, and recording medium | |
Wang et al. | Material classification using BRDF slices | |
US9697618B2 (en) | Image processing apparatus, image processing method, program, and image processing system | |
US11278236B2 (en) | Imaging-based methods and apparatuses for assessing skin pigmentation | |
Lyu | Natural image statistics in digital image forensics | |
US20120269417A1 (en) | Computer-aided staining of multispectral images | |
EP3707670A1 (en) | Enhancing pigmentation in dermoscopy images | |
CN110717446B (en) | Detection apparatus, detection method, and recording medium | |
Min et al. | Development and evaluation of an automatic acne lesion detection program using digital image processing | |
US8577150B2 (en) | System and method for removing specularity from an image | |
WO2015150292A1 (en) | Image processing method and apparatus | |
JP6273640B2 (en) | Captured image display device | |
WO2013098512A1 (en) | Method and device for detecting and quantifying cutaneous signs on an area of skin | |
CA2975054A1 (en) | Method for evaluating the authenticity of a painting as well as a corresponding use | |
DE112008001530T5 (en) | Contactless multispectral biometric acquisition | |
KR20160097209A (en) | Medical imaging | |
KR102558937B1 (en) | Systems and method for vision inspection with multiple types of light | |
Jost et al. | Contribution of depth to visual attention: comparison of a computer model and human behavior | |
Nouri et al. | Efficient tissue discrimination during surgical interventions using hyperspectral imaging | |
EP3430472B1 (en) | Method of producing video images that are independent of the background lighting | |
Wang et al. | Illumination scheme for high-contrast contactless fingerprint images | |
EP4101368A1 (en) | Determining specular reflection information | |
JP7343020B2 (en) | information processing equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20120524 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602012004056 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: A61B0005000000 Ipc: A61B0005103000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 5/103 20060101AFI20130823BHEP |
|
17Q | First examination report despatched |
Effective date: 20140214 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20140625 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: AT Ref legal event code: REF Ref document number: 698872 Country of ref document: AT Kind code of ref document: T Effective date: 20141215 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602012004056 Country of ref document: DE Effective date: 20150108 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: VDEP Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 698872 Country of ref document: AT Kind code of ref document: T Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150303 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 4 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150304 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150403 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150403 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012004056 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
26N | No opposition filed |
Effective date: 20150904 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150524 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150531 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150531 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150524 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 5 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20120524 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20141203 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20220426 Year of fee payment: 11 Ref country code: FR Payment date: 20220421 Year of fee payment: 11 Ref country code: DE Payment date: 20220420 Year of fee payment: 11 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230527 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602012004056 Country of ref document: DE |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20230524 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20231201 Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230524 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230531 |