WO2022059596A1 - 皮表解析装置及び皮表解析方法 - Google Patents
皮表解析装置及び皮表解析方法 Download PDFInfo
- Publication number
- WO2022059596A1 (application PCT/JP2021/033184)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- skin
- unit
- skin surface
- sweat
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 62
- 238000005211 surface analysis Methods 0.000 title claims description 53
- 239000000463 material Substances 0.000 claims abstract description 42
- 238000010801 machine learning Methods 0.000 claims abstract description 39
- 238000012546 transfer Methods 0.000 claims abstract description 37
- 230000011218 segmentation Effects 0.000 claims abstract description 35
- 230000008569 process Effects 0.000 claims abstract description 26
- 210000004243 sweat Anatomy 0.000 claims description 160
- 238000012545 processing Methods 0.000 claims description 65
- 238000004458 analytical method Methods 0.000 claims description 64
- 238000000605 extraction Methods 0.000 claims description 44
- 239000000284 extract Substances 0.000 claims description 19
- 230000002194 synthesizing effect Effects 0.000 claims description 7
- 241001270131 Agaricus moelleri Species 0.000 claims description 4
- 208000008454 Hyperhidrosis Diseases 0.000 description 40
- 230000035900 sweating Effects 0.000 description 40
- 206010012438 Dermatitis atopic Diseases 0.000 description 31
- 201000008937 atopic dermatitis Diseases 0.000 description 31
- 229920001296 polysiloxane Polymers 0.000 description 19
- 230000000007 visual effect Effects 0.000 description 13
- 238000004891 communication Methods 0.000 description 8
- 210000000245 forearm Anatomy 0.000 description 7
- 210000000689 upper leg Anatomy 0.000 description 6
- 238000010586 diagram Methods 0.000 description 5
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 5
- 210000001061 forehead Anatomy 0.000 description 5
- 230000005484 gravity Effects 0.000 description 5
- 208000024891 symptom Diseases 0.000 description 5
- 208000035475 disorder Diseases 0.000 description 4
- 230000001575 pathological effect Effects 0.000 description 4
- 230000002708 enhancing effect Effects 0.000 description 3
- 238000003384 imaging method Methods 0.000 description 3
- 230000004888 barrier function Effects 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 238000013135 deep learning Methods 0.000 description 2
- 238000003745 diagnosis Methods 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000000691 measurement method Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 230000036548 skin texture Effects 0.000 description 2
- 201000004624 Dermatitis Diseases 0.000 description 1
- 102000011782 Keratins Human genes 0.000 description 1
- 108010076876 Keratins Proteins 0.000 description 1
- 206010037083 Prurigo Diseases 0.000 description 1
- 241000872198 Serjania polyphylla Species 0.000 description 1
- 206010046740 Urticaria cholinergic Diseases 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 201000005681 cholinergic urticaria Diseases 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 230000006835 compression Effects 0.000 description 1
- 238000007906 compression Methods 0.000 description 1
- 238000012850 discrimination method Methods 0.000 description 1
- 201000010099 disease Diseases 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000010365 information processing Effects 0.000 description 1
- 210000002414 leg Anatomy 0.000 description 1
- 208000015413 lichen amyloidosis Diseases 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000011002 quantification Methods 0.000 description 1
- 208000017520 skin disease Diseases 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000001225 therapeutic effect Effects 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/14507—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue specially adapted for measuring characteristics of body fluids other than blood
- A61B5/14517—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue specially adapted for measuring characteristics of body fluids other than blood for sweat
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/60—Image enhancement or restoration using machine learning, e.g. neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/442—Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
Definitions
- This disclosure relates to a skin surface analysis device and a skin surface analysis method for analyzing a human skin surface.
- On the surface of human skin (the skin surface), there are groove-like parts called skin grooves and raised parts called skin hills, which are separated from one another by the skin grooves.
- A person sweats a small amount even at rest; this sweating at rest is called basal sweating.
- Sweat secreted during basal sweating flows mainly into the skin grooves, contributes to the water content of the stratum corneum, and plays an important role in maintaining the barrier function of the skin.
- Inflammatory skin diseases such as atopic dermatitis, cholinergic urticaria, prurigo, and lichen amyloidosis arise from a decrease in the skin's barrier function, that is, from a basal sweating disorder, or their symptoms may be worsened by such a disorder. If a patient's basal sweating can be detected, it becomes possible to formulate a treatment policy, alleviate symptoms, and judge the degree of cure, which is useful for diagnosis and treatment.
- The impression mold technique (also referred to as IMT or the IM method) is a quantitative method for measuring skin surface structure and sweating function: a dental silicone impression material is applied to the skin surface as a thin film, left for a predetermined time, and then peeled off, capturing the skin surface structure and the sweating state in the cured silicone.
- In IMT, skin hills and sweat droplets can be discriminated from an enlarged image of the transfer surface of the silicone material. Specifically, the transfer surface is imaged at magnification with an optical microscope and displayed on a monitor; the inspector then identifies the skin hills and skin grooves while viewing the image, colors the areas corresponding to the skin hills, and calculates the area of the colored regions. Likewise, the inspector finds the sweat droplets, colors the corresponding parts, and calculates their area. In this way the state of the skin surface can be quantified, but the approach has the problems described below.
- The structure of the skin surface is complicated and differs greatly depending on the skin disease present. It therefore takes the inspector time to decide which parts of the image are skin grooves and which are skin hills, limiting the number of samples that can be processed within a given period.
- The silicone may contain air bubbles, which are difficult to distinguish from sweat droplets, so discriminating the sweat droplets is time-consuming and laborious.
- The number of sweat droplets differs between parts of the skin surface even for the same person; if the measured part does not have an average number of sweat droplets, the analysis result may be unrepresentative.
- This disclosure was made in view of these points, and its purpose is to improve the accuracy of analysis of the state of the skin surface and to shorten the time required for the analysis.
- In a skin surface analyzer that analyzes the skin surface using a transfer material to which the human skin surface structure has been transferred, an image obtained by capturing the transfer material is input, and a local image enhancement process is executed on the input image to generate an enhanced image.
- The device includes a patch image generation unit that divides the generated enhanced image into a plurality of patch images, and a machine learning classifier that receives each patch image generated by the patch image generation unit and executes segmentation of each input patch image.
- Based on the segmentation results from the classifier, a whole image generation unit synthesizes the segmented patch images output by the machine learning classifier into a whole image.
- From the whole image generated by the whole image generation unit, a likelihood map generation unit generates a likelihood map image of the skin hills, and a binarization unit generates a binarized image by executing binarization processing on that likelihood map image.
- The image before the local image enhancement process is executed may be a color image or a grayscale image.
- each patch image is segmented.
- Segmentation of each patch image follows a conventional deep learning method; the segmentation determines, for example, which category each pixel belongs to, such as skin hill, skin groove, sweat droplet, or other.
- the likelihood map image of the skin hill is generated from the whole image based on the segmentation result.
- Once a binarized image is generated from the likelihood map image, the skin hill region can be discriminated by extracting, for example, the white region when white denotes skin hill. By calculating the area of the extracted skin hill region, the skin surface can be analyzed.
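The binarize-then-measure step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the threshold of 0.5 and the calibration of 10 µm per pixel are assumed values for the example only.

```python
import numpy as np

def skin_hill_area(likelihood, threshold=0.5, um_per_px=10.0):
    """Binarize a skin-hill likelihood map and return the total
    skin-hill area in square micrometres plus the binary mask.

    likelihood : 2-D float array of per-pixel values in [0, 1]
    threshold  : likelihood above which a pixel counts as skin hill
    um_per_px  : assumed physical size of one pixel edge
    """
    binary = likelihood >= threshold          # white = skin hill region
    n_px = int(binary.sum())                  # pixel count of the region
    return n_px * um_per_px ** 2, binary

# Toy 4x4 likelihood map: the left half is confidently "skin hill".
lh = np.array([[0.9, 0.8, 0.1, 0.0],
               [0.7, 0.6, 0.2, 0.1],
               [0.9, 0.9, 0.0, 0.0],
               [0.8, 0.7, 0.1, 0.2]])
area, mask = skin_hill_area(lh, um_per_px=10.0)
print(area)   # 8 pixels x 100 um^2 = 800.0
```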
- The device further includes a likelihood map generation unit that generates a sweat droplet likelihood map image based on the segmentation result from the whole image generated by the whole image generation unit, a sweat droplet extraction unit that extracts sweat droplets based on the generated likelihood map image, and a sweat droplet analysis unit that calculates the distribution of the sweat droplets extracted by the sweat droplet extraction unit.
- the likelihood map image of the sweat drop is generated from the whole image based on the segmentation result.
- When white in the likelihood map image denotes a sweat droplet, the droplets can be extracted as the white regions.
- By calculating the distribution of the extracted sweat droplets, the skin surface can be analyzed.
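Extracting individual droplets from a binarized droplet mask amounts to connected-component labeling. A minimal pure-Python/NumPy sketch (the patent does not specify the labeling algorithm; 4-connectivity is an assumption here):

```python
import numpy as np
from collections import deque

def label_droplets(mask):
    """4-connected component labeling of a binary sweat-droplet mask.
    Returns one (pixel_count, centroid_row, centroid_col) per droplet."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    droplets = []
    for r in range(h):
        for c in range(w):
            if mask[r, c] and not seen[r, c]:
                q, px = deque([(r, c)]), []
                seen[r, c] = True
                while q:                       # flood fill one droplet
                    y, x = q.popleft()
                    px.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys, xs = zip(*px)
                droplets.append((len(px), sum(ys) / len(px), sum(xs) / len(px)))
    return droplets

m = np.array([[1, 1, 0, 0, 0],
              [1, 1, 0, 0, 1],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 0, 0]], dtype=bool)
print(len(label_droplets(m)))   # two separate droplets
```

From the per-droplet pixel counts and centroids, the number, area, and spatial distribution of droplets follow directly.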
- The transfer material is acquired by the impression mold technique, and the device includes a grayscale processing unit that converts the captured image of the transfer material to grayscale.
- The silicone may be colored, for example pink; with this configuration the captured image of the transfer material is converted by the grayscale processing unit into a grayscale image suitable for analysis, which also increases processing speed.
- The patch image generation unit can generate patch images so that adjacent patch images partially overlap each other.
- The machine learning classifier can keep the resolution of the output image equal to that of the input image. With this configuration, for example, the shape of fine skin hills and the size of sweat droplets can be output accurately.
- the skin hill analysis unit can set a plurality of grids of a predetermined size on the image and calculate the ratio of the skin hill region and the skin groove region in each grid.
- When the ratio of the skin hill area exceeds a specified value, this can serve as one guideline for judging that the skin texture is rough; when it is below the specified value, it can serve as one guideline for judging that the texture is fine.
- the skin hill analysis unit can calculate the frequency distribution (histogram) by quantifying the ratio of the skin hill region and the skin groove region in each grid.
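The grid-wise ratio and its histogram can be sketched as below; the grid size and histogram bins are illustrative choices, not values from the source.

```python
import numpy as np

def grid_hill_ratios(hill_mask, grid=4):
    """Split a binary skin-hill mask into grid x grid pixel cells and
    return the fraction of skin-hill pixels in each cell."""
    h, w = hill_mask.shape
    h, w = h - h % grid, w - w % grid          # drop incomplete edge cells
    cells = hill_mask[:h, :w].reshape(h // grid, grid, w // grid, grid)
    return cells.mean(axis=(1, 3)).ravel()     # one ratio per cell

mask = np.zeros((8, 8), dtype=float)
mask[:4, :4] = 1.0                             # one fully "hill" quadrant
ratios = grid_hill_ratios(mask, grid=4)
hist, _ = np.histogram(ratios, bins=[0, 0.5, 1.01])
print(ratios.tolist(), hist.tolist())          # [1.0, 0.0, 0.0, 0.0] [3, 1]
```

The resulting frequency distribution of per-cell ratios is the histogram the analysis unit would report.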
- After extracting the skin hill region, the region extraction unit determines whether each portion of the extracted region is convex, and can split the skin hill region at portions determined to be non-convex.
- A groove may be formed in part of a skin hill; in this case a non-convex portion, that is, a concavity, exists in the extracted skin hill region.
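One simple way to flag such concavities is a necessary condition for convexity of a pixel region: in every row and every column the region's pixels must form a single contiguous run. This is only a heuristic sketch, not the patent's convexity test.

```python
import numpy as np

def is_orthoconvex(mask):
    """Necessary condition for convexity of a pixel region: in every row
    and every column, the region's pixels form one contiguous run.
    A groove or notch cut into the region breaks contiguity somewhere."""
    for axis in (0, 1):
        m = mask if axis == 0 else mask.T
        for line in m:
            idx = np.flatnonzero(line)
            if idx.size and (idx[-1] - idx[0] + 1) != idx.size:
                return False
    return True

solid = np.ones((4, 4), dtype=bool)     # convex square region
notched = solid.copy()
notched[1:3, 2] = False                 # groove cut into the region
print(is_orthoconvex(solid), is_orthoconvex(notched))   # True False
```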
- The tenth disclosure includes an information output unit that generates and outputs information on the shape of the skin hill region extracted by the region extraction unit, so that this information can be presented to medical professionals and others for use in diagnosis and the like.
- A machine learning classifier can thus be used to generate a likelihood map image of the skin surface, and the likelihood map image can be used to discriminate the skin hill region and the sweat droplets. This eliminates individual differences between inspectors, improves the accuracy of analysis of the state of the skin surface, and shortens the time required for analysis.
- FIG. 1 is a schematic diagram showing the configuration of the skin surface analysis device 1 according to the embodiment of the present invention.
- The skin surface analysis device 1 analyzes the skin surface using a transfer material 100 to which the human skin surface structure has been transferred; by using this device, the skin surface analysis method according to the present invention can be executed.
- The transfer material 100 may also be obtained by transferring the human skin surface structure by a method other than IMT.
- IMT (the impression mold technique) is a quantitative method for measuring skin surface structure and sweating function: a dental silicone impression material is applied to the skin surface as a thin film, left for a predetermined time, and then peeled off the skin. Since IMT has long been used as a method for detecting basal sweating, a detailed description is omitted. Dental silicone impression materials may be colored, for example pink.
- FIG. 1 illustrates a case where the silicone is spread on the forearm, left for a predetermined time to cure, and then peeled off the skin to obtain the transfer material 100; however, the present invention is not limited to this, and the transfer material may capture the skin surface structure of any site, such as the legs, chest, or back.
- In IMT the skin surface structure is precisely transferred to the film-like silicone material, so the skin hills can be identified and their area measured; the sweat droplets are also precisely transferred, so their number, diameter, and area can be measured as well.
- FIG. 5A is a diagram illustrating the prior art, and shows a procedure for discriminating the skin hills by IMT and measuring the area of the skin hills.
- This figure is based on an image of the transfer surface of the transfer material 100 taken at magnification with a reflective stereomicroscope 101 (shown in FIG. 1). The inspector displays the image on a monitor and discriminates the skin hill regions from the skin groove regions using shading and brightness as clues.
- the area of the skin hill can be obtained by drawing a figure so as to surround the area determined to be the skin hill area and measuring the area of the drawn figure.
- FIG. 5B is a diagram for explaining the prior art, and shows a procedure for discriminating sweat droplets by IMT and measuring the number, diameter, and area of sweat droplets.
- The inspector uses an image of the transfer surface of the transfer material 100 magnified with the stereomicroscope 101; the image is displayed on a monitor, and sweat droplets are discriminated using shade, brightness, and shape as clues.
- Sweat droplets are marked with circles, and the mark color is changed to distinguish droplets on the skin hills from droplets in the skin grooves. This makes it possible to measure the number, diameter, and area of the sweat droplets.
- Because the silicone may contain air bubbles, a region with a diameter of, for example, 40 μm or less and a nearly circular shape is discriminated as an air bubble.
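The small-and-circular rule can be expressed with the standard equivalent-diameter and circularity measures. The 40 µm limit comes from the text; the circularity threshold of 0.85 is an assumed value for illustration only.

```python
import math

def looks_like_bubble(area_um2, perimeter_um, max_diameter_um=40.0,
                      min_circularity=0.85):
    """Heuristic version of the described rule: a region with equivalent
    diameter at or below ~40 um and a nearly circular outline is treated
    as an air bubble rather than a sweat droplet."""
    diameter = 2.0 * math.sqrt(area_um2 / math.pi)        # equivalent diameter
    circularity = 4.0 * math.pi * area_um2 / perimeter_um ** 2  # 1.0 = circle
    return diameter <= max_diameter_um and circularity >= min_circularity

# A ~30 um circle: area = pi*15^2, perimeter = 2*pi*15 -> circularity 1.0
print(looks_like_bubble(math.pi * 15**2, 2 * math.pi * 15))   # True
print(looks_like_bubble(math.pi * 40**2, 2 * math.pi * 40))   # False: too big
```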
- The above is the conventional method by which the inspector discriminates skin hills, skin grooves, and sweat droplets.
- However, the structure of the skin surface is complicated and varies greatly with the skin disease present, so it takes the inspector time to decide which parts of the image are skin grooves and which are skin hills, and the number of samples that can be processed within a given period is limited.
- In addition, the silicone may contain air bubbles that are difficult to distinguish from sweat droplets, making the discrimination of sweat droplets time-consuming and laborious.
- In contrast, the skin surface analysis device 1 uses the machine learning classifier 24 described later to generate a likelihood map image of the skin surface even for images such as those in FIGS. 5A and 5B, discriminates the skin hill regions and sweat droplets using the likelihood map image, and can thereby improve the analysis accuracy of the state of the skin surface and shorten the time required for analysis.
- the skin surface analysis device 1 can be configured by, for example, a personal computer or the like, and includes a main body 10, a monitor 11, a keyboard 12, and a mouse 13.
- the skin surface analysis device 1 can be obtained by installing a program for executing control contents, image processing, arithmetic processing, and statistical processing, which will be described later, on a general-purpose personal computer.
- the skin surface analysis device 1 may be configured with dedicated hardware on which the program is mounted.
- the monitor 11 displays various images, user interface images for setting, and the like, and can be configured by, for example, a liquid crystal display.
- the keyboard 12 and the mouse 13 have been conventionally used as operating means of a personal computer or the like. Instead of the keyboard 12 and the mouse 13, or in addition to the keyboard 12 and the mouse 13, a touch operation panel or the like may be provided.
- the main body 10, the monitor 11, and the operating means may be integrated.
- the main body unit 10 includes a communication unit 10a, a control unit 10b, and a storage unit 10c.
- The communication unit 10a exchanges data with the outside and is composed of various communication modules and the like. By connecting to a network such as the Internet via the communication unit 10a, data can be read from outside and data from the main body 10 can be sent out.
- the storage unit 10c is composed of, for example, a hard disk, SSD (Solid State Drive), or the like, and can store various images, setting information, analysis results, statistical processing results, and the like.
- the storage unit 10c may be configured by an external storage device, or may be configured by a so-called cloud server or the like.
- The control unit 10b can be configured by, for example, a system LSI, MPU, GPU, DSP, or dedicated hardware; it performs numerical calculation and information processing based on various programs and controls each hardware unit. The hardware units are connected for two-way or one-way communication via electrical communication paths (wiring) such as a bus.
- The control unit 10b is configured to perform the various processes described later; these processes may be realized by a logic circuit or by executing software.
- the processing that can be executed by the control unit 10b includes various general image processing. Further, the control unit 10b can be configured by a combination of hardware and software.
- control unit 10b First, the configuration of the control unit 10b will be described, and then a skin surface analysis method by the control unit 10b will be described by showing a specific image example.
- the control unit 10b can capture an image from the outside via the communication unit 10a or directly.
- the captured image can be stored in the storage unit 10c.
- the captured image is an image taken by magnifying the transfer surface of the transfer material 100 with a stereomicroscope 101, and is, for example, an image that is the basis of FIGS. 5A and 5B.
- the captured image may be a color image or a grayscaled grayscale image.
- the control unit 10b includes an image input unit 20 for inputting a color image or a grayscaled image.
- An image grayscaled by the grayscale processing unit 21 described later may be input to the image input unit 20, or an image grayscaled in advance outside the skin surface analysis device 1 may be input.
- Reading an image into the grayscale processing unit 21 can be performed by the user of the skin surface analysis device 1 in the same way as the image reading described above.
- a color image can also be input to the image input unit 20.
- The control unit 10b includes a grayscale processing unit 21 that converts the captured image to grayscale when it is a color image. It is not always necessary to grayscale a color image; the local image enhancement processing and subsequent steps described later may be executed on the color image as it is.
- image capture can be performed by the user of the skin surface analysis device 1.
- In this example, the image data output from the image sensor is saved in JPEG format, but the present invention is not limited to this; the image data may be compressed in another format or may be a RAW image.
- In this example the size of the image is set to 1600 × 1200 pixels, but this too can be set arbitrarily.
- The grayscale processing unit 21 converts a color image to grayscale with, for example, 8-bit gradation. Specifically, it converts the sample value of each pixel into a value that carries no information other than luminance. Grayscale differs from a binary image in that it expresses the image from white (strongest luminance) to black (weakest luminance), including intermediate shades of gray. The gradation is not limited to 8 bits and can be set arbitrarily.
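An 8-bit grayscale conversion of this kind can be sketched as follows; the BT.601 luminance weights are the usual choice, though the patent does not specify which weights the unit uses.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 uint8 color image to an 8-bit grayscale
    image using the ITU-R BT.601 luminance weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.clip(rgb.astype(float) @ weights, 0, 255).astype(np.uint8)

# A pinkish, silicone-like color collapses to one luminance value.
pink = np.full((2, 2, 3), (255, 182, 193), dtype=np.uint8)
print(to_grayscale(pink)[0, 0])   # 205
```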
- the control unit 10b includes a local image enhancement processing unit 22.
- The local image enhancement processing unit 22 executes a local image enhancement process that raises the contrast of local regions of the grayscale image input to the image input unit 20, generating an enhanced image. This improves the visibility of the image's details.
- Examples of local image enhancement processing include processes that improve the visibility of details by boosting the contrast of local regions of the image, such as histogram equalization.
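A CDF-based histogram equalization, the building block of such contrast enhancement, can be sketched as below. For simplicity this version equalizes globally; the local variant mentioned in the text would apply the same mapping per tile.

```python
import numpy as np

def equalize(gray):
    """Histogram equalization of an 8-bit grayscale image via the
    cumulative distribution function (CDF) of its pixel values."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                 # first non-empty bin
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[gray]                          # remap every pixel

# A low-contrast image (values 100..103) is stretched to the full range.
img = np.array([[100, 101], [102, 103]], dtype=np.uint8)
out = equalize(img)
print(out.min(), out.max())                   # 0 255
```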
- the control unit 10b includes a patch image generation unit 23.
- The patch image generation unit 23 divides the enhanced image generated by the local image enhancement processing unit 22 into a plurality of patch images. For example, given an enhanced image of 1600 × 1200 pixels, it divides the image into patch images of 256 × 256 pixels.
- The patch image generation unit 23 can also generate patch images so that adjacent patch images partially overlap; the overlapping range can be set to, for example, about 64 pixels.
- The setting of this overlapping range can be called, for example, a "64-pixel stride".
- the above-mentioned value of pixel is an example, and other values can be used.
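The overlapping tiling described above can be sketched as follows. The edge handling (shifting the last tile back so it stays inside the image) is one reasonable choice; the patent does not specify how partial tiles at the borders are treated.

```python
import numpy as np

def split_into_patches(image, patch=256, overlap=64):
    """Divide an image into patch x patch tiles whose neighbours overlap
    by `overlap` pixels; the last tile in each direction is shifted back
    so that it still lies fully inside the image."""
    step = patch - overlap
    h, w = image.shape[:2]
    ys = list(range(0, h - patch, step)) + [h - patch]
    xs = list(range(0, w - patch, step)) + [w - patch]
    return [(y, x, image[y:y + patch, x:x + patch]) for y in ys for x in xs]

img = np.zeros((1200, 1600), dtype=np.uint8)   # the 1600 x 1200 example size
tiles = split_into_patches(img)
print(len(tiles), tiles[0][2].shape)
```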
- If adjacent patch images do not overlap when the image is divided, the edges of skin hills or sweat droplets may happen to fall on the boundaries between adjacent patch images, and the accuracy of discrimination by the machine learning classifier 24 (described later) for skin hills and sweat droplets straddling those boundaries may decrease.
- By overlapping adjacent patch images, even skin hills and sweat droplets at such boundary positions can be accurately discriminated by the machine learning classifier 24 described later.
- the control unit 10b includes a machine learning classifier 24.
- The machine learning classifier 24 is the part that receives each patch image generated by the patch image generation unit 23 and executes segmentation of each input patch image.
- The machine learning classifier 24 segments an input image according to a well-known deep learning method; through this segmentation, it determines, for example, which category each pixel belongs to and outputs the result as an output image.
- the machine learning classifier 24 has an input layer into which an input image is input and an output layer for outputting an output image, and also has a plurality of hidden layers between the input layer and the output layer.
- The machine learning classifier 24 automatically extracts common features by learning from a large amount of training data, enabling flexible judgment; in this example, its training has already been completed.
- In the machine learning classifier 24, the resolution of the output image is the same as the resolution of the input image.
- In general segmentation networks, the resolution of the output image is often lower than that of the input image; in this example, however, the shapes of fine skin hills and the sizes of sweat droplets must be determined accurately, so the resolution of the output image is not reduced.
- When a patch image of 256×256 pixels is input to the input layer of the machine learning classifier 24, an output image of 256×256 pixels is output from the output layer.
- the machine learning classifier 24 of this example is configured to be able to simultaneously detect skin hills / grooves and sweat droplets. That is, the machine learning classifier 24 has a skin hill / skin groove detector 24a for detecting a skin hill / skin groove and a sweat drop detector 24b for detecting a sweat drop.
- The skin hill/skin groove detector 24a and the sweat drop detector 24b can each be constructed using, for example, U-Net as the network architecture.
- the control unit 10b includes an overall image generation unit 25.
- The whole image generation unit 25 is the part that synthesizes the segmented patch images output from the machine learning classifier 24 to generate a whole image. Specifically, the whole image generation unit 25 synthesizes the patch images output from the skin hill/skin groove detector 24a in the same arrangement as before division, generating a whole image for discriminating skin hills and skin grooves.
- The patch images output from the sweat drop detector 24b are similarly combined to generate a whole image for discriminating sweat droplets. Each whole image has the same size as the image before division.
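The reverse operation, synthesizing segmented patches back into a whole image, can be sketched as below. The patent does not say how the overlapping areas of adjacent patches are merged; averaging the overlapping predictions is one common choice and is an assumption here, as are the function name and data layout.

```python
import numpy as np

def stitch_patches(patches, shape, patch=256):
    """Recompose a full-size image from (y, x, patch) outputs, averaging
    the regions where neighbouring patches overlap."""
    acc = np.zeros(shape, dtype=np.float64)   # sum of patch values
    cnt = np.zeros(shape, dtype=np.float64)   # how many patches cover each pixel
    for y, x, p in patches:
        acc[y:y + patch, x:x + patch] += p
        cnt[y:y + patch, x:x + patch] += 1
    return (acc / cnt).astype(np.uint8)
```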
- the control unit 10b includes a likelihood map generation unit 26.
- the likelihood map generation unit 26 generates a likelihood map image of the skin hill based on the segmentation result by the machine learning classifier 24 from the whole image for discriminating the skin hill / skin groove generated by the whole image generation unit 25.
- The likelihood map image is an image color-coded according to the likelihood of each pixel, indicating relatively which pixels have high or low likelihood. For example, the pixel with the highest likelihood is red, the pixel with the lowest likelihood is blue, and a color map image expressing the range between them with 8-bit gradation can be used as the likelihood map image of the skin hills/grooves. Note that this display form is an example: the map may be displayed in grayscale or in a form in which brightness varies, and the gradation may be other than 8 bits.
- the likelihood map generation unit 26 generates a likelihood map image of sweat droplets based on the segmentation result by the machine learning classifier 24 from the overall image for discriminating sweat droplets generated by the overall image generation unit 25.
- For example, the pixel with the highest likelihood of a sweat droplet is red, the pixel with the lowest likelihood is blue, and a color map image expressing the range between them with 8-bit gradation is used as the sweat droplet likelihood map image.
- Like the skin hill/groove map, the sweat droplet likelihood map image may be displayed in grayscale or in a form in which brightness varies, and the gradation may be other than 8 bits.
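A blue-to-red color coding of an 8-bit likelihood image, as described above, might look like the following sketch. The exact colormap (here a simple linear blue–red ramp with a green mid-band) is an assumption, since the patent only specifies red for the highest likelihood and blue for the lowest.

```python
import numpy as np

def likelihood_to_colormap(lik: np.ndarray) -> np.ndarray:
    """Map an 8-bit likelihood image to RGB: low likelihood -> blue,
    high likelihood -> red, green peaking in the middle of the range."""
    l = lik.astype(np.float64) / 255.0
    rgb = np.empty(lik.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = np.round(255 * l)                        # red grows with likelihood
    rgb[..., 1] = np.round(255 * (1 - np.abs(2 * l - 1)))  # green mid-band
    rgb[..., 2] = np.round(255 * (1 - l))                  # blue falls with likelihood
    return rgb
```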
- the control unit 10b has a binarization processing unit 27.
- The binarization processing unit 27 is the part that executes binarization processing on the likelihood map image generated by the likelihood map generation unit 26 to generate a binarized image (black-and-white image).
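The patent does not state how the binarization threshold is chosen; Otsu's method, which picks the threshold maximizing the between-class variance of the histogram, is one common automatic choice and is used in this illustrative sketch.

```python
import numpy as np

def otsu_binarize(gray: np.ndarray) -> np.ndarray:
    """Binarize an 8-bit image with Otsu's automatically chosen threshold."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    levels = np.arange(256)
    w0 = hist.cumsum()                    # pixel count at or below each level
    w1 = total - w0
    sum0 = (hist * levels).cumsum()
    mu0 = np.where(w0 > 0, sum0 / np.maximum(w0, 1), 0)           # class means
    mu1 = np.where(w1 > 0, (sum0[-1] - sum0) / np.maximum(w1, 1), 0)
    between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance per threshold
    t = int(np.argmax(between))
    return (gray > t).astype(np.uint8) * 255
```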
- the control unit 10b includes a region extraction unit 28.
- The region extraction unit 28 is the part that extracts skin hill regions based on the binarized image generated by the binarization processing unit 27. Specifically, when the white pixels of the binarized image represent skin hills, groups of white pixels are extracted as skin hill regions. The region extraction unit 28 may also extract skin groove regions based on the binarized image generated by the binarization processing unit 27; in this case, with the black pixels of the binarized image representing skin hills, groups of white pixels are extracted as skin groove regions. The region extraction unit 28 may also extract the skin grooves first and then treat the remaining regions as skin hill regions, or extract the skin hills first and then treat the remaining regions as skin groove regions.
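Extracting a "group of white pixels" as a region amounts to connected-component labeling. A minimal 4-connected BFS implementation is sketched below; production code would typically use a library routine instead, and the function name is illustrative.

```python
import numpy as np
from collections import deque

def extract_regions(binary: np.ndarray, foreground: int = 255):
    """Label 4-connected foreground components of a binarized image and
    return one list of (row, col) pixel coordinates per region."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] != foreground or seen[sy, sx]:
                continue
            q, pixels = deque([(sy, sx)]), []
            seen[sy, sx] = True
            while q:  # breadth-first flood fill of one component
                y, x = q.popleft()
                pixels.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny, nx] \
                            and binary[ny, nx] == foreground:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            regions.append(pixels)
    return regions
```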
- the control unit 10b includes a sweat droplet extraction unit 29.
- The sweat droplet extraction unit 29 is the part that extracts sweat droplets based on the sweat droplet likelihood map image. Specifically, when white (or red) in the sweat droplet likelihood map image represents a sweat droplet, groups of white (or red) pixels of the map are extracted as sweat droplets. The sweat droplet extraction unit 29 may also extract the portions other than sweat droplets based on the likelihood map image; in this case, assuming that black (or blue) in the map represents the portions other than sweat droplets, groups of black (or blue) pixels are extracted as those portions. The sweat droplet extraction unit 29 may also extract the portions other than sweat droplets first and then extract the remaining regions as sweat droplets.
- The transfer material 100 may contain air bubbles, and these bubbles could be misjudged as sweat droplets.
- A discrimination method using dimensions is therefore also applied. For example, with "40 µm" set as the discrimination threshold, a small region with a diameter of 40 µm or less is judged to be a bubble, and only a region whose diameter exceeds 40 µm is judged to be a sweat droplet.
- Area can also be used: the area of a circle 40 µm in diameter is computed, a small region below that area is treated as a bubble, and only a region exceeding that area is judged to be a sweat droplet.
- The above "diameter" may be, for example, the major axis when an ellipse approximation is used.
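The area-based variant of the bubble test described above can be sketched as follows, treating each candidate region as a list of pixel coordinates. The pixel pitch parameter is an assumption needed to convert the 40 µm threshold into pixels; it is not given in the patent.

```python
import math

def filter_bubbles(regions, pixel_um=1.0, min_diameter_um=40.0):
    """Drop small regions that are likely air bubbles: a region counts as a
    sweat droplet only if its pixel area exceeds the area of a circle with
    the threshold diameter (40 um by default)."""
    min_area_px = math.pi * (min_diameter_um / (2 * pixel_um)) ** 2
    return [r for r in regions if len(r) > min_area_px]
```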
- the control unit 10b includes a skin hill analysis unit 30.
- the skin hill analysis unit 30 is a part that calculates the area of the skin hill region extracted by the region extraction unit 28.
- the skin hill analysis unit 30 can acquire the shape of the skin hill by generating, for example, a contour line surrounding the skin hill region extracted by the region extraction unit 28.
- the skin hill analysis unit 30 can calculate the area of the skin hill by obtaining the area of the region surrounded by the contour line of the skin hill.
- the skin hill analysis unit 30 can also acquire the shape of the skin groove by generating a contour line surrounding the skin groove region extracted by the region extraction unit 28, for example.
- the skin hill analysis unit 30 can also calculate the area of the skin groove by obtaining the area of the region surrounded by the contour line of the skin groove.
- The skin hill analysis unit 30 sets a plurality of grids of a predetermined size on the binarized image or the grayscale image and calculates the ratio of skin hill regions to skin groove regions within each grid. Specifically, when, as an example, the skin hill analysis unit 30 sets the grid so as to divide the binarized image into nine equal parts (the first to ninth divided images), it calculates the area of the skin hill regions and the area of the skin groove regions contained in each divided image, and then the ratio between the two. For example, when evaluating the fineness of the skin surface, the evaluation can be based on this ratio within the grids set on the binarized or grayscale image.
- When the ratio of the skin hill area is greater than a predetermined value, this can serve as a criterion for judging that the texture is rough; when it is less than the predetermined value, it can serve as a criterion for judging that the texture is fine.
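The grid-based ratio computation described above can be sketched as follows. The grid size and the convention that 255 marks skin-hill pixels are illustrative assumptions, not values from the patent.

```python
import numpy as np

def grid_hill_ratios(binary, grid=100, hill_value=255):
    """Tile the binarized image with grid x grid cells and return, per cell,
    the fraction of pixels belonging to the skin-hill class."""
    h, w = binary.shape
    ratios = []
    for y in range(0, h - grid + 1, grid):
        for x in range(0, w - grid + 1, grid):
            cell = binary[y:y + grid, x:x + grid]
            ratios.append(float((cell == hill_value).mean()))
    return ratios
```

The list of per-cell ratios is what a histogram or frequency-distribution table would then be built from.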
- The skin hill analysis unit 30 can also analyze the skin hills and skin grooves using the grayscale image.
- In healthy skin, the boundary between skin hills and skin grooves is clear and the area of the skin hills can be measured directly, but in atopic dermatitis the boundary between skin hills and skin grooves is indistinct.
- In such cases, by using the grayscale image as it is for analysis, the image can be divided into grids of multiple sizes and the ratio of skin hills to skin grooves estimated from the gray levels of the pixels within each grid.
- The ratio of skin hills to skin grooves obtained in this way, displayed as a histogram, can serve as a criterion for judging the fineness of the skin (described later).
- The skin hill analysis unit 30 calculates a frequency distribution by quantifying the ratio between the skin hill regions and the skin groove regions in each of the above grids. Specifically, after calculating the ratio of the area of the skin hill regions to the area of the skin groove regions, this value is quantified and the data are aggregated in the form of a frequency distribution table. The skin hill analysis unit 30 can also calculate the position of the center of gravity of each skin hill region, the perimeter of the skin hill region, rectangle approximation, ellipse approximation, circularity, aspect ratio, consistency density, and the like.
- A groove may be formed in part of a skin hill; in that case, a non-convex portion, that is, a concave portion, exists in the extracted skin hill region.
- Therefore, after extracting the skin hill regions, the skin hill analysis unit 30 determines whether each part of an extracted skin hill region is convex, and divides the region at the portions determined not to be convex.
- That is, when a groove-shaped portion exists within a skin hill region, not all of the region is convex: the groove-shaped portion is concave. Since a portion determined to be non-convex, that is, concave, corresponds to a groove-shaped portion, dividing the skin hill region at that portion turns one skin hill region into a plurality of skin hill regions.
- the control unit 10b includes a sweat drop analysis unit 31.
- the sweat drop analysis unit 31 calculates the distribution of sweat drops extracted by the sweat drop extraction unit 29.
- The sweat drop analysis unit 31 can calculate, for example, the number of sweat droplets per unit area (1 mm², 1 cm², etc.) of the skin surface, the size (diameter) of each sweat droplet, the area of each sweat droplet, and the like.
- the sweat drop analysis unit 31 can also calculate the total area of sweat drops existing per unit area of the skin surface.
- the control unit 10b includes an information output unit 32.
- the information output unit 32 generates and outputs information on the shape of the skin hill region extracted by the region extraction unit 28 and information on the sweat droplets extracted by the sweat droplet extraction unit 29.
- The information regarding the shape of the skin hill regions includes the results calculated by the skin hill analysis unit 30, for example, the area of each skin hill region, the position of its center of gravity, its perimeter, rectangle approximation, ellipse approximation, circularity, aspect ratio, consistency density, and the like.
- The information on the sweat droplets includes the results calculated by the sweat droplet analysis unit 31, for example, the number of sweat droplets per unit area and the total area of sweat droplets per unit area.
- In step S1 of the flowchart shown in FIG. 3, IMT is performed.
- In this step, a dental silicone impression material is applied to the skin surface as a film and left for a predetermined time; the silicone impression material is then peeled off the skin to transfer the human skin surface structure.
- The transfer material 100 is thereby obtained.
- In step S2, the transfer material 100 is set on the stereomicroscope 101 and observed at a predetermined magnification, and the observation field of view is imaged by the image pickup device. As a result, a color image (1600×1200 pixels) in JPEG format is acquired. The process then proceeds to step S3, where the color image captured by the image pickup device is read into the control unit 10b of the skin surface analysis device 1. After that, the process proceeds to step S4, where the color image read in step S3 is converted by the grayscale processing unit 21 (shown in FIG. 2) into an 8-bit grayscale image. An example of the generated grayscale image is shown in FIG.
- In this grayscale image, the light-colored parts are skin hills and the dark-colored parts are skin grooves, but the boundary is not clear. When an inspector made the determination visually, it took time to decide which parts of the image were skin grooves and which were skin hills, limiting the number of samples that could be processed within a given period. If the image read by the control unit 10b is already a grayscale image, the grayscale processing is unnecessary.
- In step S5, the grayscale image is input to the image input unit 20.
- In step S6, the local image enhancement processing unit 22 executes the local image enhancement processing on the grayscale image input in step S5.
- This step S6 is the local image enhancement processing step.
- FIG. 7 shows an image in which the local image enhancement process has been executed. It can be seen that in the image shown in FIG. 7, the contrast in the local region is emphasized and the visibility of details is enhanced as compared with the image shown in FIG.
- In step S7, the patch image generation unit 23 divides the enhanced image generated in step S6 into a plurality of patch images.
- FIG. 8 shows how the patch image is divided, and the grid-like lines correspond to the boundaries of the patch image.
- The patch images adjacent to each other in the vertical and horizontal directions in the figure overlap each other by about 64 pixels.
- This step is a patch image generation step.
- After generating the patch images, the process proceeds to step S8.
- In step S8, each patch image generated in step S7 is input to the machine learning classifier 24, and the machine learning classifier 24 executes segmentation of each input patch image.
- the same patch image is input to both the skin hill / skin groove detector 24a and the sweat drop detector 24b (steps S9 and S10). This step is the segmentation step.
- That is, the same patch images input to the skin hill/groove detector 24a are also input to the sweat drop detector 24b.
- The skin hill/groove detector 24a generates and outputs an output image in which the color of each pixel is set so that the higher the likelihood of a skin hill, the whiter the pixel, and the lower the likelihood of a skin hill (the higher the likelihood of a skin groove), the blacker the pixel. Similarly, the sweat drop detector 24b generates and outputs an output image in which each pixel is set whiter the higher the likelihood of a sweat droplet and blacker the lower that likelihood.
- FIG. 9 shows an example of a skin hill / skin groove output image output from the skin hill / skin groove detector 24a and a sweat drop output image output from the sweat drop detector 24b.
- the white part in the skin hill / skin groove output image is the skin hill area, and the black part is the skin groove area.
- the white part in the sweat drop output image is the sweat drop.
- As described above, when the image is divided into a plurality of patch images in step S7, adjacent patch images overlap. If the patch images did not overlap, the edges of skin hills or sweat droplets could happen to fall on the boundaries between adjacent patch images, and the discrimination accuracy for skin hills and sweat droplets straddling those boundaries could deteriorate. In this example, since adjacent patch images partially overlap, even skin hills and sweat droplets at such positions can be accurately discriminated.
- Next, the process proceeds to step S11, where the skin hill/skin groove output images (patch images) from step S9 are combined to generate a whole image as shown in FIG. Also in step S11, the sweat droplet output images (patch images) from step S10 are combined to generate a whole image as shown in FIG.
- The number of pixels of each whole image is the same as that of the image input in step S5. This step is the whole image generation step.
- In step S12 shown in FIG. 4, the likelihood map generation unit 26 generates the likelihood map image of the skin hills and the likelihood map image of the sweat droplets from the whole images generated in step S11, based on the segmentation results.
- This step is the likelihood map generation step.
- FIG. 12 shows an example of the likelihood map image of the skin hill. In this figure, a grayscale image is used for convenience, but in this example, the pixel with the highest likelihood of skin hills is red, the pixel with the lowest likelihood of skin hills is blue, and the space between them is represented by 8-bit gradation. It is a color image. This facilitates the distinction between the skin hill region and the skin groove region.
- FIG. 13 shows an example of a likelihood map image of sweat droplets.
- This image is also originally in color: the pixel with the highest likelihood of a sweat droplet is red, the pixel with the lowest likelihood is blue, and the range between them is expressed with 8-bit gradation. This facilitates the discrimination of sweat droplets.
- In step S13, the likelihood map image of the skin hills generated in step S12 is binarized to generate a binarized image.
- This step is executed by the binarization processing unit 27, and is a binarization processing step.
- FIG. 14 shows a binarized image generated by performing a binarization process on the likelihood map image of the skin hill.
- In step S14, the region extraction unit 28 extracts the skin hill regions based on the binarized image generated in step S13.
- the skin groove region may be extracted.
- FIG. 15 is an image obtained by extracting the skin hill and the skin groove, and is displayed by surrounding the skin hill region with a black line. This step is the area extraction step.
- In step S15, the sweat droplet extraction unit 29 extracts the sweat droplets based on the likelihood map image of the sweat droplets generated in step S12.
- This step is a sweat drop extraction step.
- FIG. 16 is an image obtained by extracting sweat droplets, and the sweat droplets are displayed by surrounding them with a black line.
- In step S16, the positions of the sweat droplets are compared with the skin hills/grooves.
- the position and range of sweat droplets can be specified by the XY coordinates on the image.
- The positions and ranges of the skin hills and grooves can likewise be specified by XY coordinates on the image. Since the image specifying the positions and ranges of the sweat droplets and the image specifying those of the skin hills/grooves derive from the same original image, the sweat droplets can be superimposed on the skin hill/skin groove image, as shown in FIG. As a result, the relative positional relationship between the sweat droplets and the skin hills/grooves can be obtained. Here, the skin hill regions and the coordinates of the centers of gravity of the sweat droplets can be used.
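Because both segmentation outputs share one coordinate system, deciding whether each droplet lies on a skin hill or in a groove reduces to a mask lookup at the droplet's center of gravity, as in this sketch (names and data layout are illustrative, not from the patent):

```python
import numpy as np

def classify_droplets(hill_mask, centroids):
    """Split sweat droplets by whether their centre of gravity falls on a
    skin hill (mask True) or in a skin groove (mask False), using the
    shared image coordinate system of both segmentation outputs."""
    on_hill, in_groove = [], []
    for cy, cx in centroids:
        target = on_hill if hill_mask[int(round(cy)), int(round(cx))] else in_groove
        target.append((cy, cx))
    return on_hill, in_groove
```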
- In step S17, the sweat droplets on the skin hills and the sweat droplets in the skin grooves are identified.
- FIG. 18 is an image in which the sweat droplets in the skin hill / groove are identified, which makes it possible to distinguish between the sweat droplets in the skin hill and the sweat droplets in the skin groove.
- sweat droplets have a shape close to a circle.
- In step S18, a histogram of skin hill information is created and displayed on the monitor 11.
- Specifically, the skin hill analysis unit 30 calculates the area of each skin hill region extracted in step S14.
- A histogram is then created with area on the horizontal axis and frequency on the vertical axis.
- This step is the skin hill analysis step. This makes it possible to grasp the distribution of the area of the skin hill area. For example, in the case of atopic dermatitis, the area of one skin hill tends to be large, and if the frequency of the large area is high, it can be seen that the tendency of sweating disorder of atopic dermatitis is strong.
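The frequency distribution of skin hill areas described in step S18 can be computed as in this sketch; the bin width is an arbitrary illustrative choice, not a value from the patent.

```python
import numpy as np

def hill_area_histogram(areas_px, bin_width_px=500):
    """Bin skin-hill areas (in pixels) into a frequency table: area on the
    x-axis, count on the y-axis."""
    areas = np.asarray(areas_px)
    edges = np.arange(0, areas.max() + bin_width_px, bin_width_px)
    counts, edges = np.histogram(areas, bins=edges)
    return counts, edges
```

A strong tail of large-area bins in the resulting histogram would correspond to the coarse skin hills described for atopic dermatitis.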
- In step S19, a heat map image of sweat droplets is created and displayed on the monitor 11.
- Specifically, the sweat drop analysis unit 31 calculates the distribution of the sweat droplets extracted in step S15. For example, as shown in FIG. 20, a grid is set over the image of the transfer material 100, and the number of sweat droplets in each grid cell is counted. This is done by determining in which grid cell the center-of-gravity coordinates of each sweat droplet extracted in step S15 lie. Cells with no sweat droplets, one sweat droplet, two sweat droplets, three sweat droplets, and so on can each be displayed in a different color, so that the distribution of sweat droplets can be grasped; an image displayed in different colors in this way can be called a heat map image.
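The per-grid droplet counting behind the heat map of step S19 can be sketched as follows; the grid size and the centroid representation are illustrative assumptions.

```python
import numpy as np

def droplet_heatmap(centroids, shape, grid=100):
    """Count sweat-droplet centroids falling in each grid cell of the image,
    producing the count matrix behind a colour-coded heat map."""
    rows = (shape[0] + grid - 1) // grid   # ceil division to cover the image
    cols = (shape[1] + grid - 1) // grid
    counts = np.zeros((rows, cols), dtype=int)
    for cy, cx in centroids:
        counts[int(cy) // grid, int(cx) // grid] += 1
    return counts
```

Each count can then be mapped to a color to render the heat map.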
- This step is a sweat drop analysis step. If the distribution of sweat droplets is sparse, it can be seen that there is a strong tendency for sweating disorders in atopic dermatitis.
- The heat map images can be arranged in chronological order and displayed on the monitor 11. For example, by generating heat map images for a patient with atopic dermatitis one week, two weeks, and three weeks after starting treatment and displaying them in a list format, it is possible to judge whether the symptoms are improving and to assess the progress quantitatively.
- FIG. 21 is an example of a skin hill region image showing a line surrounding each skin hill region extracted in step S14.
- the image shown in this figure is generated by the skin hill analysis unit 30 and can be displayed on the monitor 11.
- The skin hill analysis unit 30 also compiles the measurement results of the specification values in table format and displays them on the monitor 11.
- These specification values can be calculated by the skin hill analysis unit 30 using, for example, image analysis software. By using not just one index but a plurality of indexes in this way, judgments can be made in correspondence with clinical information. Since these indexes can also contribute to determining the fineness of the skin surface, the fineness of the skin surface can likewise be determined using the machine learning classifier 24. Further, as shown in FIG. 22, statistical processing (totals, maxima, minima, deviations, etc.) can also be performed.
- FIG. 23 is a graph showing a two-dimensional distribution of skin hills and skin grooves per a grid of 128 ⁇ 128 pixels, and such a graph can also be generated by the skin hill analysis unit 30 and displayed on the monitor 11.
- For example, an 8-bit color image can be displayed with the skin hill regions in red and the skin groove regions in blue.
- This can be used as one way of expressing the fineness of the skin surface and the improvement of symptoms; it may be displayed as a heat map, or quantified as the ratio of the areas of the skin hills and skin grooves.
- When the skin surface is fine, the frequency around the median is high, whereas in atopic dermatitis the distribution is wide with broad tails. This makes it possible to quantify the two-dimensional information and use it as diagnostic information.
- FIG. 24 shows a case where the skin hill analysis unit 30 sets a plurality of grids of a predetermined size (18 in this example) on the image and calculates the ratio of the skin hill region and the skin groove region in the grid.
- the frequency distribution can be calculated by quantifying the ratio between the skin hill region and the skin groove region in each grid, and can be displayed on the monitor 11 in the form of a histogram.
- To evaluate the fineness of the skin surface, using only the area of the skin hills is conceivable; in that case, however, two skin hills that are very close together may be identified as a single skin hill, making that hill appear about twice as large and the analysis result inaccurate.
- By using the ratio of skin hill to skin groove regions, the fineness of the skin surface can be quantitatively calculated.
- Each of the above analyses is preferably performed on an image of a field of view with average sweating selected from a wide range. When only a single field of view is examined, it cannot be determined whether that field shows little sweating, heavy sweating, or average sweating; by grasping the sweating across a wide area of about nine fields of view, an average field of view, that is, a field suitable for skin surface analysis, can be selected while excluding fields with little or heavy sweating.
- The analysis result therefore becomes accurate.
- Conventionally, the number of fields of view that could be processed was limited to about three due to time constraints, but the present invention enables analysis of a number of fields of view far exceeding three, allowing more accurate skin surface analysis.
- The skin hill analysis unit 30 can also display the images shown in FIG. 25 on the monitor 11 in chronological order. For example, by generating images like FIG. 25 one week, two weeks, and three weeks after a patient with atopic dermatitis starts treatment and displaying them on the monitor 11 in a list format, it is possible to judge whether the symptoms are improving and to assess the progress quantitatively.
- FIG. 26 is a graph (histogram) showing the ratio of the skin hills and skin grooves of the forearm of a healthy person, and shows a case where a grid having a size of 100 ⁇ 100 pixels is set in a gray scale image.
- the horizontal axis is the ratio of the skin hill and the skin groove, and the vertical axis is the number.
- the graph on the right side of FIG. 26 also shows a graph of kernel density estimation.
- Similarly, FIG. 27 shows the case where a grid of 150×150 pixels is set, FIG. 28 the case of 200×200 pixels, and FIG. 29 the case of 250×250 pixels.
- In the case of a healthy subject's forearm, the distribution has a peak near the center at each of the 100×100, 150×150, 200×200, and 250×250 pixel grid sizes. Moreover, since the ratio of skin hills to skin grooves is obtained for each grid size, not only the skin hills but also the size of the skin grooves can be quantified.
- 30 to 33 are graphs showing the ratio of the thigh skin hills and skin grooves of patients with atopic dermatitis, and correspond to FIGS. 26 to 29, respectively.
- the peaks may be offset from the center or multiple peaks may be formed.
- FIGS. 34 to 37 are graphs showing the ratio of the skin hill and the skin groove of the forehead of the atopic dermatitis patient, and correspond to FIGS. 26 to 29, respectively.
- the peaks are generally shifted to the right side (the side where the ratio of skin hills / grooves is large), or multiple peaks are formed.
- FIGS. 38 to 41 are graphs showing the ratio of the skin hills and the skin grooves of the elbows of patients with atopic dermatitis, and correspond to FIGS. 26 to 29, respectively.
- the peaks may be offset from the center or multiple peaks may be formed.
- In this way, the difference in skin texture between healthy subjects and patients with atopic dermatitis, and the condition of the skin of a patient with atopic dermatitis, can be grasped, so the therapeutic effect can be presented as an objective index during follow-up.
- the machine learning classifier 24 is used to generate a likelihood map image of the skin surface, and the likelihood map image is used to discriminate the skin hill region and sweat droplets. Therefore, it is possible to eliminate individual differences during analysis, improve the accuracy of analysis of the state of the skin surface, and shorten the time required for analysis.
- the skin surface analysis device and the skin surface analysis method according to the present invention can be used, for example, when analyzing a human skin surface.
Description
The control unit 10b can also take in images from outside, either via the communication unit 10a or directly. The captured image can be stored in the storage unit 10c. The image to be taken in is an image obtained by magnifying the transfer surface of the transfer material 100 with the stereomicroscope 101 and capturing it, for example the images on which FIGS. 5A and 5B are based. The image taken in may be a color image or a grayscale image.
Next, the skin surface analysis method performed using the skin surface analysis device 1 configured as described above will be explained with concrete image examples. The flow of the skin surface analysis method is as shown in the flowcharts of FIGS. 3 and 4. In step S1 of the flowchart shown in FIG. 3, IMT is performed. In this step, as shown in FIG. 1, a dental silicone impression material is applied to the skin surface as a film and left for a predetermined time, after which the silicone impression material is peeled off the skin to obtain the transfer material 100 onto which the human skin surface structure has been transferred.
FIG. 26 is a graph (histogram) showing the ratio of skin hills to skin grooves on the forearm of a healthy subject, for the case where a grid of 100×100 pixels is set on the grayscale image. The horizontal axis is the ratio of skin hills to skin grooves, and the vertical axis is the count. The graph on the right of FIG. 26 also shows a kernel density estimate. Similarly, FIG. 27 shows the case where a 150×150 pixel grid is set, FIG. 28 a 200×200 pixel grid, and FIG. 29 a 250×250 pixel grid.
As described above, according to this embodiment, the machine learning classifier 24 is used to generate a likelihood map image of the skin surface, and skin hill regions and sweat droplets can be discriminated using the likelihood map image. It is therefore possible to eliminate individual differences during analysis, improve the accuracy of analysis of the state of the skin surface, and shorten the time required for analysis.
20 Image input unit
21 Grayscale processing unit
22 Image enhancement processing unit
23 Patch image generation unit
24 Machine learning classifier
24a Skin hill/skin groove detector
24b Sweat droplet detector
25 Whole image generation unit
26 Likelihood map generation unit
27 Binarization processing unit
28 Region extraction unit
29 Sweat droplet extraction unit
30 Skin hill analysis unit
31 Sweat droplet analysis unit
100 Transfer material
Claims (12)
- In a skin surface analysis device for analyzing a skin surface using a transfer material onto which a human skin surface structure has been transferred, the skin surface analysis device comprising:
an image input unit to which an image obtained by imaging the transfer material is input;
a local image enhancement processing unit that executes local image enhancement processing for enhancing the contrast of local regions of the image input to the image input unit to generate an enhanced image;
a patch image generation unit that divides the enhanced image generated by the local image enhancement processing unit into a plurality of patch images;
a machine learning classifier to which each patch image generated by the patch image generation unit is input and which executes segmentation of each input patch image;
a whole image generation unit that synthesizes the segmented patch images output from the machine learning classifier to generate a whole image;
a likelihood map generation unit that generates a likelihood map image of skin hills from the whole image generated by the whole image generation unit on the basis of the segmentation results;
a binarization processing unit that executes binarization processing on the likelihood map image generated by the likelihood map generation unit to generate a binarized image;
a region extraction unit that extracts skin hill regions on the basis of the binarized image generated by the binarization processing unit; and
a skin hill analysis unit that calculates the area of the skin hill regions extracted by the region extraction unit.
- In a skin surface analysis device for analyzing a skin surface using a transfer material onto which a human skin surface structure has been transferred, the skin surface analysis device comprising:
an image input unit to which an image obtained by imaging the transfer material is input;
a local image enhancement processing unit that executes local image enhancement processing for enhancing the contrast of local regions of the image input to the image input unit to generate an enhanced image;
a patch image generation unit that divides the enhanced image generated by the local image enhancement processing unit into a plurality of patch images;
a machine learning classifier to which each patch image generated by the patch image generation unit is input and which executes segmentation of each input patch image;
a whole image generation unit that synthesizes the segmented patch images output from the machine learning classifier to generate a whole image;
a likelihood map generation unit that generates a likelihood map image of sweat droplets from the whole image generated by the whole image generation unit on the basis of the segmentation results;
a sweat droplet extraction unit that extracts sweat droplets on the basis of the likelihood map image generated by the likelihood map generation unit; and
a sweat droplet analysis unit that calculates the distribution of the sweat droplets extracted by the sweat droplet extraction unit.
- The skin surface analysis device according to claim 1, further comprising:
a likelihood map generation unit that generates a likelihood map image of sweat droplets from the whole image generated by the whole image generation unit on the basis of the segmentation results;
a sweat droplet extraction unit that extracts sweat droplets on the basis of the likelihood map image generated by the likelihood map generation unit; and
a sweat droplet analysis unit that calculates the distribution of the sweat droplets extracted by the sweat droplet extraction unit.
- The skin surface analysis device according to any one of claims 1 to 3, wherein
the transfer material is obtained by the impression mold technique, and
the device comprises a grayscale processing unit that converts the image obtained by imaging the transfer material into a grayscale image.
- The skin surface analysis device according to any one of claims 1 to 4, wherein
the patch image generation unit generates the patch images such that adjacent patch images partially overlap one another.
- The skin surface analysis device according to any one of claims 1 to 5, wherein
the machine learning classifier makes the resolution of the output image the same as the resolution of the input image.
- The skin surface analysis device according to claim 1, wherein
the skin hill analysis unit sets a plurality of grids of a predetermined size on the image and calculates the ratio of the skin hill region to the skin groove region within each grid.
- The skin surface analysis device according to claim 7, wherein
the skin hill analysis unit quantifies the ratio of the skin hill region to the skin groove region within each grid and calculates a frequency distribution.
- The skin surface analysis device according to claim 1, wherein
the region extraction unit, after extracting the skin hill regions, determines whether each part of an extracted skin hill region is convex and divides the skin hill region at the parts determined not to be convex.
- The skin surface analysis device according to claim 3, further comprising
an information output unit that generates and outputs information on the shape of the skin hill regions extracted by the region extraction unit.
- In a skin surface analysis method for analyzing a skin surface using a transfer material onto which a human skin surface structure has been transferred, the skin surface analysis method comprising:
an image input step of inputting an image obtained by imaging the transfer material;
a local image enhancement processing step of executing local image enhancement processing for enhancing the contrast of local regions of the image input in the image input step to generate an enhanced image;
a patch image generation step of dividing the enhanced image generated in the local image enhancement processing step into a plurality of patch images;
a segmentation step of inputting each patch image generated in the patch image generation step into a machine learning classifier and executing segmentation of each input patch image with the machine learning classifier;
a whole image generation step of synthesizing the patch images after the segmentation step to generate a whole image;
a likelihood map generation step of generating a likelihood map image of skin hills from the whole image generated in the whole image generation step on the basis of the segmentation results;
a binarization processing step of executing binarization processing on the likelihood map image generated in the likelihood map generation step to generate a binarized image;
a region extraction step of extracting skin hill regions on the basis of the binarized image generated in the binarization processing step; and
a skin hill analysis step of calculating the area of the skin hill regions extracted in the region extraction step.
- In a skin surface analysis method for analyzing a skin surface using a transfer material onto which a human skin surface structure has been transferred, the skin surface analysis method comprising:
an image input step of inputting an image obtained by imaging the transfer material;
a local image enhancement processing step of executing local image enhancement processing for enhancing the contrast of local regions of the image input in the image input step to generate an enhanced image;
a patch image generation step of dividing the enhanced image generated in the local image enhancement processing step into a plurality of patch images;
a segmentation step of inputting each patch image generated in the patch image generation step into a machine learning classifier and executing segmentation of each input patch image with the machine learning classifier;
a whole image generation step of synthesizing the patch images after the segmentation step to generate a whole image;
a likelihood map generation step of generating a likelihood map image of sweat droplets from the whole image generated in the whole image generation step on the basis of the segmentation results;
a sweat droplet extraction step of extracting sweat droplets on the basis of the likelihood map image generated in the likelihood map generation step; and
a sweat droplet analysis step of calculating the distribution of the sweat droplets extracted in the sweat droplet extraction step.
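The patch handling recited in the claims, i.e. splitting the enhanced image into overlapping patches and synthesizing the segmented patches back into a whole image, can be sketched as follows (an illustrative Python/NumPy sketch; the machine learning classifier itself is omitted because the claims do not fix its architecture, and all function names are hypothetical):

```python
import numpy as np

def split_into_patches(img: np.ndarray, patch: int = 256, overlap: int = 32):
    """Divide the image into patches whose neighbours overlap by `overlap`
    pixels, so that seams can be averaged away when restitching."""
    step = patch - overlap
    h, w = img.shape
    patches = []
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            patches.append(((y, x), img[y:y + patch, x:x + patch]))
    return patches

def stitch_likelihood(patches, shape):
    """Average the (possibly overlapping) per-patch outputs back into one
    whole likelihood image of the original shape."""
    acc = np.zeros(shape, dtype=np.float64)
    cnt = np.zeros(shape, dtype=np.float64)
    for (y, x), p in patches:
        acc[y:y + p.shape[0], x:x + p.shape[1]] += p
        cnt[y:y + p.shape[0], x:x + p.shape[1]] += 1.0
    return acc / np.maximum(cnt, 1.0)

def hill_area(likelihood: np.ndarray, threshold: float = 0.5,
              pixel_area_mm2: float = 1.0) -> float:
    """Binarize the likelihood map and report the skin-hill area as the
    foreground pixel count times the per-pixel physical area."""
    return float((likelihood >= threshold).sum()) * pixel_area_mm2
```

In a real pipeline the classifier would map each patch to a same-resolution likelihood patch before stitching; here an identity mapping stands in for it, which is enough to show that the overlap-aware restitching reconstructs a seamless whole image.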
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21869283.8A EP4216153A4 (en) | 2020-09-17 | 2021-09-09 | SKIN SURFACE ANALYSIS DEVICE AND SKIN SURFACE ANALYSIS METHOD |
JP2022550513A JPWO2022059596A1 (ja) | 2020-09-17 | 2021-09-09 | |
CN202180061633.1A CN116113984A (zh) | 2020-09-17 | 2021-09-09 | 皮肤表面分析装置及皮肤表面分析方法 |
KR1020237011964A KR20230069953A (ko) | 2020-09-17 | 2021-09-09 | 피부 표면 해석 장치 및 피부 표면 해석 방법 |
US18/120,366 US20230214970A1 (en) | 2020-09-17 | 2023-03-11 | Skin surface analysis device and skin surface analysis method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-156214 | 2020-09-17 | ||
JP2020156214 | 2020-09-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/120,366 Continuation US20230214970A1 (en) | 2020-09-17 | 2023-03-11 | Skin surface analysis device and skin surface analysis method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022059596A1 true WO2022059596A1 (ja) | 2022-03-24 |
Family
ID=80777006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/033184 WO2022059596A1 (ja) Skin surface analysis device and skin surface analysis method
Country Status (6)
Country | Link |
---|---|
US (1) | US20230214970A1 (ja) |
EP (1) | EP4216153A4 (ja) |
JP (1) | JPWO2022059596A1 (ja) |
KR (1) | KR20230069953A (ja) |
CN (1) | CN116113984A (ja) |
WO (1) | WO2022059596A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7264296B1 (ja) | 2022-04-20 | 2023-04-25 | Sakai Chemical Industry Co., Ltd. | Condition determination method, condition determination device, and condition determination program for determining the condition of hair |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09308619A (ja) * | 1996-05-23 | 1997-12-02 | Kao Corp | Skin surface analysis method and device |
JP2013188325A (ja) * | 2012-03-13 | 2013-09-26 | Shiseido Co Ltd | Skin condition analysis method, skin condition analysis device, skin condition analysis system, program for executing the skin condition analysis method, and recording medium recording the program |
WO2018230733A1 (ja) | 2017-06-16 | 2018-12-20 | Maruho Co., Ltd. | External skin preparation |
US20200234444A1 (en) * | 2019-01-18 | 2020-07-23 | Tissue Analytics, Inc. | Systems and methods for the analysis of skin conditions |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0793549A (ja) * | 1993-09-24 | 1995-04-07 | Shiseido Co Ltd | Physical information classification device using inductive learning and method for generating physical information classification rules using inductive learning |
JP6546852B2 (ja) * | 2013-08-30 | 2019-07-17 | Nutrition Act Co., Ltd. | Analysis device, analysis method, and program |
DK179723B1 (en) | 2017-02-15 | 2019-04-12 | Vkr Holding A/S | A method for attaching a pane element to a sash and a pane module including a pane element |
2021
- 2021-09-09 JP JP2022550513A patent/JPWO2022059596A1/ja active Pending
- 2021-09-09 EP EP21869283.8A patent/EP4216153A4/en active Pending
- 2021-09-09 WO PCT/JP2021/033184 patent/WO2022059596A1/ja unknown
- 2021-09-09 KR KR1020237011964A patent/KR20230069953A/ko unknown
- 2021-09-09 CN CN202180061633.1A patent/CN116113984A/zh active Pending
2023
- 2023-03-11 US US18/120,366 patent/US20230214970A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09308619A (ja) * | 1996-05-23 | 1997-12-02 | Kao Corp | Skin surface analysis method and device |
JP2013188325A (ja) * | 2012-03-13 | 2013-09-26 | Shiseido Co Ltd | Skin condition analysis method, skin condition analysis device, skin condition analysis system, program for executing the skin condition analysis method, and recording medium recording the program |
WO2018230733A1 (ja) | 2017-06-16 | 2018-12-20 | Maruho Co., Ltd. | External skin preparation |
US20200234444A1 (en) * | 2019-01-18 | 2020-07-23 | Tissue Analytics, Inc. | Systems and methods for the analysis of skin conditions |
Non-Patent Citations (1)
Title |
---|
See also references of EP4216153A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7264296B1 (ja) | 2022-04-20 | 2023-04-25 | Sakai Chemical Industry Co., Ltd. | Condition determination method, condition determination device, and condition determination program for determining the condition of hair |
JP2023159529A (ja) | 2022-04-20 | 2023-11-01 | Sakai Chemical Industry Co., Ltd. | Condition determination method, condition determination device, and condition determination program for determining the condition of hair |
Also Published As
Publication number | Publication date |
---|---|
EP4216153A4 (en) | 2024-04-03 |
CN116113984A (zh) | 2023-05-12 |
US20230214970A1 (en) | 2023-07-06 |
EP4216153A1 (en) | 2023-07-26 |
JPWO2022059596A1 (ja) | 2022-03-24 |
KR20230069953A (ko) | 2023-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Mall et al. | Glcm based feature extraction and medical x-ray image classification using machine learning techniques | |
Yilmaz et al. | Computer-aided diagnosis of periapical cyst and keratocystic odontogenic tumor on cone beam computed tomography | |
Garnavi et al. | Automatic segmentation of dermoscopy images using histogram thresholding on optimal color channels | |
Sumithra et al. | Segmentation and classification of skin lesions for disease diagnosis | |
EP1875863B1 (en) | Skin state analyzing method, skin state analyzing device, and recording medium on which skin state analyzing program is recorded | |
Lee et al. | Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy | |
JP5576775B2 (ja) | Image processing device, image processing method, and image processing program | |
US9418422B2 (en) | Skin image analysis | |
Ghaedi et al. | An automated dental caries detection and scoring system for optical images of tooth occlusal surface | |
KR20200108686A (ko) | Sarcopenia analysis program and application using a deep learning algorithm | |
JPWO2019044095A1 (ja) | Medical information display device, method, and program | |
WO2022059596A1 (ja) | Skin surface analysis device and skin surface analysis method | |
JPWO2019044081A1 (ja) | Medical image display device, method, and program | |
Shah et al. | Algorithm mediated early detection of oral cancer from image analysis | |
JP5640280B2 (ja) | Osteoporosis diagnosis support device and osteoporosis diagnosis support program | |
Marcal et al. | Evaluation of the Menzies method potential for automatic dermoscopic image analysis. | |
JP2007252892A (ja) | Method for estimating visual evaluation values of the three-dimensional shape of a skin surface | |
Selvarasu et al. | Image processing techniques and neural networks for automated cancer analysis from breast thermographs-A review | |
TWM527991U (zh) | Medical image processing device | |
KR20190104001A (ko) | Automatic skin disease classification device and automatic skin disease classification method | |
Iyatomi et al. | Automated color normalization for dermoscopy images | |
EP2980757B1 (en) | Quantification and imaging methods of the echo-texture feature | |
Mankar et al. | Comparison of different imaging techniques used for chronic wounds | |
Mishra et al. | Automatic separation of basal cell carcinoma from benign lesions in dermoscopy images with border thresholding techniques | |
JP6390458B2 (ja) | Information processing device, information processing system, information processing method, and information processing program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21869283 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022550513 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20237011964 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021869283 Country of ref document: EP Effective date: 20230417 |