WO2020189754A1 - Estimation method, method for generating an estimation model, program, and estimation device
- Publication number: WO2020189754A1 (PCT/JP2020/012249)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/443—Evaluating skin constituents, e.g. elastin, melanin, water
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
Description
- the present invention relates to an estimation method, an estimation model generation method, a program, and an estimation device.
- Patent Document 1 discloses a method of operating an optical transmission diagnostic device in which a plurality of LEDs (Light Emitting Diodes), each emitting light of a different wavelength, are arranged at different angles with respect to the skin, and benign tissue and malignant tissue are classified from the measured reflectance spectrum, thereby assisting in distinguishing between the two.
- The method of operating the optical transmission diagnostic device relates to an optical method for determining some of the morphological parameters and physiological characteristics of living tissue, in particular the morphological parameters and physiological characteristics of benign and malignant tissue lesions.
- Patent Document 2 discloses a skin condition analysis method for analyzing the condition of the skin surface based on the shape of the skin groove on the skin surface.
- In this skin condition analysis method, a plurality of optical cross-sectional images, which are three-dimensional shape data of the skin grooves on the skin surface, are acquired using a confocal microscope, and the condition of the skin surface is evaluated.
- TEWL (transepidermal water loss) is one parameter related to skin function, and serves as an index of the skin barrier function.
- Patent Document 1 and Patent Document 2 consider analyzing the state of living tissue, but not the function of the living tissue, including the skin barrier function. These prior arts therefore do not consider estimating the function of living tissue. Meanwhile, there has been a demand to accurately estimate parameters related to skin function, including TEWL, in order to accurately assess the function of living tissue, including the skin barrier function.
- An object of the present invention, made in view of such problems, is to provide an estimation method, an estimation model generation method, a program, and an estimation device capable of accurately estimating parameters related to skin function.
- The estimation method according to an embodiment of the present invention is an estimation method for estimating parameters related to skin function, and includes:
- an image acquisition step of acquiring a skin image showing the unevenness of the skin surface;
- an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature amount vector based on the topological information of the skin image;
- an estimation step of estimating the parameters related to the skin function based on the feature amount vector extracted in the extraction step, using an estimation model constructed based on past actual measurement data in which the feature amount vector and the parameters related to the skin function are associated with each other; and
- a presentation step of presenting the parameters related to the skin function estimated in the estimation step.
- The estimation model generation method according to an embodiment of the present invention is a method for generating the estimation model used in the above estimation method, and includes an acquisition step of acquiring past actual measurement data in which the feature amount vector and the parameter related to the skin function are associated with each other, and a construction step of constructing the estimation model based on the acquired data.
- The program according to an embodiment of the present invention causes an information processing apparatus to execute the above estimation method or the above estimation model generation method.
- The estimation device according to an embodiment of the present invention is an estimation device that estimates parameters related to skin function, and includes:
- an image acquisition unit that acquires a skin image showing the unevenness of the skin surface;
- a control unit that extracts, from the skin image acquired by the image acquisition unit, a feature amount vector based on the topological information of the skin image, and estimates the parameters related to the skin function based on the extracted feature amount vector using an estimation model constructed based on past actual measurement data in which the feature amount vector and the parameters related to the skin function are associated with each other; and
- a presentation unit that presents the parameters related to the skin function estimated by the control unit.
- According to the estimation method, the estimation model generation method, the program, and the estimation device of the embodiments of the present invention, it is possible to accurately estimate parameters related to skin function.
- FIG. 1 is a block diagram showing a schematic configuration of an estimation device 1 according to an embodiment of the present invention. The configuration and function of the estimation device 1 according to the embodiment of the present invention will be mainly described with reference to FIG.
- the estimation device 1 acquires a skin image showing irregularities on the skin surface.
- the estimation device 1 extracts a feature amount vector based on the topological information of the skin image from the acquired skin image.
- the estimation device 1 estimates the parameters related to skin function based on the extracted feature vector using an estimation model constructed based on the past actual measurement data in which the feature vector and the parameters related to skin function are associated with each other.
- the estimation device 1 presents parameters related to the estimated skin function.
- Parameters related to skin function include, for example, TEWL.
- the parameters relating to the skin function may include any index associated with the function of the living tissue including the skin barrier function and the like.
- parameters related to skin function may include the amount of water in the skin.
- the estimation device 1 is, for example, an electronic device that estimates parameters related to skin function based on a skin image showing irregularities on the surface of human skin.
- the estimation device 1 may be a dedicated electronic device, or may be any general-purpose electronic device such as a smartphone, a PC (Personal Computer), and a server device.
- the estimation device 1 itself may image a human skin surface to acquire a skin image, and estimate parameters related to skin function based on the skin image.
- However, the estimation device 1 is not limited to this; for example, the estimation device 1 may acquire, by any means such as communication, a skin image of the human skin surface captured by another image pickup device or the like, and estimate parameters related to skin function based on that skin image.
- the estimation device 1 includes a control unit 11, a communication unit 12, a storage unit 13, an image acquisition unit 14, a data acquisition unit 15, and a presentation unit 16.
- the control unit 11 includes one or more processors.
- The "processor" is, for example, a general-purpose processor or a dedicated processor specialized for particular processing, but is not limited thereto.
- the control unit 11 is communicably connected to each of the constituent units constituting the estimation device 1 and controls the operation of the entire estimation device 1.
- the control unit 11 may control the communication unit 12 and transmit the estimation result by the estimation device 1 to any other information processing device.
- the control unit 11 may control the storage unit 13 to store the estimation result by the estimation device 1 and the acquired skin image.
- the control unit 11 may control the image acquisition unit 14 to acquire a skin image in which the unevenness of the skin surface is reflected.
- the control unit 11 may control the data acquisition unit 15 to acquire past actual measurement data in which the feature amount vector and the parameter related to the skin function are associated with each other.
- the control unit 11 may control the presentation unit 16 to present the estimation result by the estimation device 1 to the user.
- the communication unit 12 includes a communication module that connects to a network including a mobile communication network and the Internet.
- the communication unit 12 may include a communication module corresponding to a mobile communication standard such as 4G (4th Generation) and 5G (5th Generation).
- the communication unit 12 may include a communication module corresponding to a wired LAN (Local Area Network) standard.
- the storage unit 13 includes one or more memories.
- the "memory” is, for example, a semiconductor memory, a magnetic memory, an optical memory, or the like, but is not limited thereto.
- Each memory included in the storage unit 13 may function as, for example, a main storage device, an auxiliary storage device, or a cache memory.
- the storage unit 13 stores arbitrary information used for the operation of the estimation device 1.
- the storage unit 13 may store the system program, the application program, various information acquired by the estimation device 1, the estimation result by the estimation device 1, and the like.
- the information stored in the storage unit 13 may be updatable with information acquired from the network via, for example, the communication unit 12.
- the image acquisition unit 14 includes an arbitrary imaging device such as a camera.
- the image acquisition unit 14 may acquire a skin image in which the unevenness of the skin surface is captured, for example, by imaging using the image pickup device of the image acquisition unit 14 itself.
- the image acquisition unit 14 may acquire a skin image in which the unevenness of the skin surface is reflected by any method.
- the image acquisition unit 14 may acquire a skin image of the skin surface imaged by another image pickup device or the like from the other image pickup device or the like by any means such as communication.
- the data acquisition unit 15 includes, for example, an arbitrary interface capable of acquiring past actual measurement data in which a feature quantity vector and a parameter related to skin function are associated with each other.
- the data acquisition unit 15 may include an arbitrary input interface capable of accepting an input operation by the user, and may acquire actual measurement data based on the input by the user.
- the data acquisition unit 15 may include an arbitrary communication interface and acquire actual measurement data from an external device or the like by an arbitrary communication protocol.
- the presentation unit 16 includes, for example, an arbitrary output interface for outputting an image.
- the presentation unit 16 includes an arbitrary display such as a liquid crystal display and an organic EL (Electro Luminescence) display.
- the presentation unit 16 presents the estimation result by the estimation device 1 to the user or the like.
- the presentation unit 16 presents parameters related to skin function estimated by the control unit 11 of the estimation device 1.
- FIG. 2 is a flowchart showing an example of the first operation by the estimation device 1 of FIG.
- FIG. 2 shows a flow in which the estimation device 1 generates an estimation model based on past actual measurement data. That is, FIG. 2 shows a method of generating an estimation model used in the estimation method described later using the estimation device 1.
- In step S101, the control unit 11 of the estimation device 1 uses the data acquisition unit 15 to acquire past actual measurement data in which the feature amount vector and the parameters related to the skin function are associated with each other.
- In step S102, the control unit 11 constructs an estimation model that estimates parameters related to skin function from the feature amount vector, based on the past actual measurement data acquired in step S101.
- the estimation model may be, for example, a machine learning model including a random forest model learned based on the past actual measurement data acquired in step S101.
- the estimation model is not limited to this, and may be any machine learning model including a neural network, a local regression model, a kernel regression model, and the like.
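As a concrete illustration, the model construction of steps S101 and S102 could be sketched as below. This is a minimal sketch with synthetic data using scikit-learn's RandomForestRegressor; the feature dimensionality, the hyperparameters, and all data values are illustrative assumptions and are not specified by the patent.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for past actual measurement data: each row of X is a
# feature amount vector extracted from one skin image, and each entry of y
# is the TEWL value measured for the same skin site (all values invented).
X = rng.random((60, 8))
y = X @ rng.random(8) + 0.05 * rng.standard_normal(60)

# Step S102: construct the estimation model from the past measurement data.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Estimation phase (step S203): estimate the parameter for a new vector.
x_new = rng.random((1, 8))
estimate = model.predict(x_new)
print(estimate.shape)  # (1,)
```

The same interface would accommodate the other model families named above (e.g. a neural network or kernel regression) by swapping the regressor.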
- FIG. 3 is a flowchart showing an example of the second operation by the estimation device 1 of FIG.
- FIG. 3 mainly shows a flow in which the estimation device 1 estimates parameters related to skin function using the estimation model constructed by the flow of FIG. 2. That is, FIG. 3 shows an estimation method for estimating parameters related to skin function using the estimation device 1.
- In step S201, the control unit 11 of the estimation device 1 uses the image acquisition unit 14 to acquire a skin image showing the unevenness of the skin surface.
- In step S202, the control unit 11 extracts, from the skin image acquired in step S201, a feature amount vector based on the topological information of the skin image. Since step S202 includes a more detailed flow, described later with reference to FIG. 4, the box of step S202 is drawn with a double line in FIG. 3.
- In step S203, the control unit 11 estimates the parameters related to the skin function based on the feature amount vector extracted in step S202, using the estimation model constructed by the flow of FIG. 2.
- In step S204, the control unit 11 presents the parameters related to the skin function estimated in step S203, using the presentation unit 16.
- FIG. 4 is a flowchart showing an example of the third operation by the estimation device 1 of FIG.
- FIG. 4 shows the flow in step S202 of FIG. 3 in more detail.
- the flow until the control unit 11 of the estimation device 1 extracts the feature amount vector based on the acquired skin image will be described in more detail.
- In step S301, the control unit 11 of the estimation device 1 generates a corrected image by performing brightness correction processing and binarization processing on the skin image acquired in step S201 of FIG. 3.
- FIG. 5 is a schematic diagram showing an example of the corrected image generated in step S301 of FIG.
- the control unit 11 uses, for example, a wavelet transform to generate a corrected image as shown in FIG. 5, which includes only information in a predetermined frequency domain. By generating such a corrected image, the control unit 11 removes extra information that may be noise, which is not related to the unevenness of the skin surface, from the skin image acquired in step S201 of FIG.
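The correction of step S301 can be illustrated as follows. The patent specifies a wavelet transform; this sketch substitutes a simple FFT band-pass filter as a stand-in, since any band-pass that keeps only the spatial-frequency band of the skin grooves conveys the idea. The band limits, the threshold, and the synthetic "skin image" are illustrative assumptions.

```python
import numpy as np

def corrected_image(img, low=2, high=12, threshold=0.0):
    """Keep only a band of spatial frequencies, then binarize.

    A stand-in for the wavelet-based correction of step S301: information
    outside the predetermined frequency band (illumination drift, fine
    noise) is removed before binarization.
    """
    f = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)        # radial frequency index
    f[(r < low) | (r > high)] = 0               # band-pass mask
    band = np.real(np.fft.ifft2(np.fft.ifftshift(f)))
    return (band > threshold).astype(np.uint8)  # binarize: 1 = white pixel

# Synthetic "skin image": a grooved texture plus slow illumination drift.
y, x = np.mgrid[0:64, 0:64]
img = np.sin(x * 0.8) + 0.5 * np.sin(y * 0.02)  # grooves + low-freq drift
binary = corrected_image(img)
print(binary.shape, sorted(int(v) for v in np.unique(binary)))
```

The low-frequency drift falls outside the retained band, so only the groove pattern survives into the binarized corrected image.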
- In step S302 of FIG. 4, the control unit 11 acquires information on the 0-dimensional feature amounts and the 1-dimensional feature amounts extracted from the corrected image generated in step S301.
- This information on the 0-dimensional and 1-dimensional feature amounts constitutes the above-mentioned topological information.
- FIG. 6 is a schematic diagram showing an example of a method for acquiring topological information in step S302 of FIG. With reference to FIG. 6, a method in which the control unit 11 extracts the 0-dimensional feature amount and the 1-dimensional feature amount based on the corrected image generated in step S301 will be mainly described.
- the control unit 11 executes the density estimation of white pixels on the corrected image generated in step S301, and generates an image representing the density of white pixels with respect to the pixel area like a topographic map.
- the density change of white pixels is represented as a peak in the pixel region where the density of white pixels is high and as a valley in the pixel region where the density of black pixels is high.
- FIG. 6 is a schematic diagram showing one-dimensionally the density change of white pixels along a predetermined row of pixels in such an image.
- In FIG. 6, the vertical axis shows the density of white pixels, and the horizontal axis indicates the pixel position.
- the control unit 11 changes the threshold value t of the density of white pixels with respect to the graph showing the density change of white pixels as shown in FIG. 6, for example.
- For example, in a pixel region where the straight line corresponding to the threshold value t (shown by the broken line in FIG. 6) intersects the graph and the density of white pixels in the graph is larger than the threshold value t, the control unit 11 determines all the pixels to be white. For the other pixel regions, the control unit 11 determines all the pixels to be black.
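This thresholding can be illustrated on a one-dimensional density profile (the profile values are illustrative, not taken from FIG. 6):

```python
import numpy as np

# A 1-D white-pixel density profile along one row of pixels, as in FIG. 6.
density = np.array([0.1, 0.3, 0.8, 0.6, 0.2, 0.7, 0.9, 0.4, 0.1])

def superlevel(density, t):
    """Pixels whose density exceeds threshold t are declared white (True)."""
    return density > t

# Lowering t step by step lets white regions appear, grow, and merge,
# which is the sweep the control unit performs.
for t in (0.85, 0.65, 0.25):
    print(t, superlevel(density, t).astype(int))
```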
- FIG. 7 is a schematic diagram showing an example of the changes in the image and in the topological information when the threshold value t is changed stepwise. More specifically, the set of images in the upper part of FIG. 7 shows how the connectivity of the white regions changes as the threshold value t is changed stepwise. The middle part of FIG. 7 shows the change in the 0-dimensional feature amounts, and the lower part shows the change in the 1-dimensional feature amounts, as the threshold value t is changed stepwise.
- When the threshold value t is t1 in FIG. 6, the straight line corresponding to t1 does not intersect the graph, so the control unit 11 determines all the pixels in the image to be black. Therefore, as shown in the upper part of FIG. 7, the image at the threshold value t1 is entirely filled with black.
- When the threshold value t is t2 in FIG. 6, the straight line corresponding to t2 intersects the graph in the region R2, and in R2 the density of white pixels in the graph is larger than t2. The control unit 11 therefore determines all the pixels in the region R2 to be white and all the pixels outside R2 to be black. As shown in the upper part of FIG. 7, the image at the threshold value t2 is one in which a small white region has begun to appear but which is still mostly black.
- When the threshold value t is t3, the straight line corresponding to t3 intersects the graph in the region R3, where the density of white pixels is larger than t3. The control unit 11 determines all the pixels in R3 to be white and all other pixels to be black; as shown in the upper part of FIG. 7, the image at t3 has more white area than the image at t2.
- When the threshold value t is t4, the straight line corresponding to t4 intersects the graph in the region R4, where the density of white pixels is larger than t4. The control unit 11 determines all the pixels in R4 to be white and all other pixels to be black; the image at t4 has more white area than the image at t3.
- When the threshold value t is t5, the straight line corresponding to t5 intersects the graph in the region R5, where the density of white pixels is larger than t5. The control unit 11 determines all the pixels in R5 to be white and all other pixels to be black; the image at t5 still contains a small black region but is mostly white.
- When the threshold value t is t6, the density of white pixels in the graph is larger than t6 over the entire region R6, so the control unit 11 determines all the pixels in R6 to be white. Therefore, as shown in the upper part of FIG. 7, the image at the threshold value t6 is entirely filled with white.
- The control unit 11 gradually changes the threshold value t in this way and acquires a set of a series of images showing how the connectivity of the white regions changes.
- the control unit 11 extracts the topological information including the 0-dimensional feature amount and the 1-dimensional feature amount from the acquired series of images.
- the control unit 11 extracts a portion in which white pixels are connected as a 0-dimensional feature amount from the acquired series of images.
- the zero-dimensional features correspond to the connected components in the set of images. For example, in an image having a threshold value t1, the number of zero-dimensional features is 0. For example, in an image having a threshold value of t6, the number of zero-dimensional features is 1.
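Along a single row, counting the 0-dimensional features reduces to counting maximal runs of white pixels; a minimal sketch (the `count_components_1d` helper is hypothetical, not from the patent):

```python
import numpy as np

def count_components_1d(binary_row):
    """Count maximal runs of white pixels (the 0-dimensional features)
    along one row, matching the 1-D illustration of FIGS. 6 and 7."""
    b = np.asarray(binary_row, dtype=int)
    # A run starts wherever a white pixel follows a black one (or the edge).
    starts = np.diff(np.concatenate(([0], b))) == 1
    return int(starts.sum())

print(count_components_1d([0, 0, 0, 0]))        # all black (threshold t1)
print(count_components_1d([0, 1, 1, 0, 1, 0]))  # two separate white runs
print(count_components_1d([1, 1, 1, 1]))        # all white (threshold t6)
```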
- Further, from the acquired set of the series of images, the control unit 11 traces connected white pixels and extracts, as a 1-dimensional feature amount, a portion in which the white pixels enclose black pixels at their center.
- the one-dimensional feature corresponds to a hole in a set of images. For example, in the images of the threshold values t1 and t6, the number of one-dimensional features is 0.
- The connected components and holes extracted from the set of images shown in the upper part of FIG. 7 are created and destroyed as the threshold value t changes. That is, a given connected component created at a certain threshold tb_c disappears at another threshold td_c whose value is smaller than tb_c. Similarly, a given hole created at a certain threshold tb_h disappears at another threshold td_h whose value is smaller than tb_h.
- The control unit 11 stores the pair of threshold values tb_c and td_c in the storage unit 13 for each connected component. Similarly, the control unit 11 stores the pair of threshold values tb_h and td_h in the storage unit 13 for each hole.
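The sweep over the threshold value t and the recording of appearance/disappearance pairs described above can be sketched as a superlevel-set filtration tracked with a union-find structure. This is an illustrative reconstruction, not the patent's actual implementation: the toy pixel values, the 4-connectivity, and the elder rule used on merges are all assumptions.

```python
import numpy as np

def zero_dim_persistence(img):
    """Track 0-dimensional feature amounts (connected white components) while
    the threshold value t sweeps downward over a grayscale image.

    A pixel turns white once t drops to its intensity, so each component is
    born at some threshold tb_c and dies (merges into an older component, or
    reaches the minimum intensity) at a smaller threshold td_c.
    Returns a list of (tb_c, td_c) pairs. 4-connectivity is assumed.
    """
    h, w = img.shape
    parent, birth, pairs = {}, {}, []

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # insert pixels from the brightest down, mirroring the threshold sweep
    for idx in np.argsort(img.ravel())[::-1]:
        r, c = divmod(int(idx), w)
        parent[(r, c)] = (r, c)
        birth[(r, c)] = float(img[r, c])
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb in parent:
                ra, rb = find((r, c)), find(nb)
                if ra != rb:
                    if birth[ra] < birth[rb]:
                        ra, rb = rb, ra
                    # the younger component (smaller birth threshold) dies here
                    pairs.append((birth[rb], float(img[r, c])))
                    parent[rb] = ra
    # surviving components persist down to the minimum intensity
    for root in {find(p) for p in parent}:
        pairs.append((birth[root], float(img.min())))
    return pairs
```

On a small test array with four bright pixels on a dark background, this yields one (tb_c, td_c) pair per bright spot, each dying when the background connects everything. The 1-dimensional (hole) pairs would need a cubical-complex library such as GUDHI and are omitted here.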
- In step S303 of FIG. 4, the control unit 11 generates, based on the pairs of threshold values tb_c and td_c stored in the storage unit 13, a distribution map showing the persistence of each 0-dimensional feature amount. Similarly, based on the pairs of threshold values tb_h and td_h stored in the storage unit 13, the control unit 11 generates a distribution map showing the persistence of each 1-dimensional feature amount. The control unit 11 may generate the distribution maps for the 0-dimensional and 1-dimensional feature amounts based on, for example, the single skin image acquired in step S201 of FIG. 3. Alternatively, the control unit 11 may generate the distribution maps based on, for example, a plurality of skin images acquired in step S201 of FIG. 3.
- FIG. 8 is a distribution map showing an example of the persistence of the 0-dimensional feature amounts.
- In FIG. 8, the vertical axis represents the difference between the threshold values tb_c and td_c. That is, the vertical axis of FIG. 8 indicates how long each 0-dimensional feature amount persists as the threshold value t changes.
- The horizontal axis represents the average of the threshold values tb_c and td_c. That is, the horizontal axis of FIG. 8 indicates around which values of the threshold value t each 0-dimensional feature amount exists.
- Each point can take any density value; that is, a given number of 0-dimensional feature amounts may overlap at each point.
- FIG. 9 is a distribution map showing an example of the persistence of the 1-dimensional feature amounts.
- In FIG. 9, the vertical axis represents the difference between the threshold values tb_h and td_h. That is, the vertical axis of FIG. 9 indicates how long each 1-dimensional feature amount persists as the threshold value t changes.
- The horizontal axis represents the average of the threshold values tb_h and td_h. That is, the horizontal axis of FIG. 9 indicates around which values of the threshold value t each 1-dimensional feature amount exists.
- Each point can take any density value; that is, a given number of 1-dimensional feature amounts may overlap at each point.
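The axes of FIGS. 8 and 9 amount to a simple change of coordinates on the stored (tb, td) pairs. A minimal sketch (the function name is illustrative, not from the patent):

```python
def to_distribution_points(pairs):
    """Map each (tb, td) pair to the axes used in FIGS. 8 and 9:
    horizontal = (tb + td) / 2  (around which threshold the feature exists),
    vertical   =  tb - td       (how long the feature persists)."""
    return [((tb + td) / 2.0, tb - td) for tb, td in pairs]
```

A long-lived feature such as (6, 2) lands high on the vertical axis, while a pair like (5, 5) with zero lifetime sits on the horizontal axis and is a candidate for noise.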
- In step S304 of FIG. 4, the control unit 11 extracts the feature amount vector based on the distribution maps generated in step S303.
- FIG. 10 is a schematic diagram showing an estimation model according to an embodiment based on a random forest.
- Here, with reference to FIG. 10, an example of the method of extracting the feature amount vector in step S304 of FIG. 4 and of the method of estimating the parameters related to the skin function in step S203 of FIG. 3 will be described.
- In step S304 of FIG. 4, the control unit 11 defines a grid on each of the distribution maps of the 0-dimensional and 1-dimensional feature amounts generated in step S303, thereby setting a plurality of regions G.
- The control unit 11 counts the number of points included in each of the set regions G.
- The control unit 11 extracts, as the feature amount vector, a vector in which the counted numbers of points are arranged region by region. At this time, the density value of each point in the distribution maps of the 0-dimensional and 1-dimensional feature amounts may be taken into account.
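The counting of points per grid region G can be sketched as a 2-D histogram over the distribution map. The bin count and value range below are arbitrary assumptions for illustration; the patent does not specify the grid dimensions.

```python
import numpy as np

def distribution_to_vector(points, bins=2, extent=(0.0, 100.0)):
    """Overlay a grid of regions G on a distribution map (FIG. 8 or FIG. 9)
    and arrange the per-region point counts into a flat feature amount vector."""
    xs = np.array([p[0] for p in points])  # average of tb and td
    ys = np.array([p[1] for p in points])  # difference tb - td (persistence)
    counts, _, _ = np.histogram2d(xs, ys, bins=bins, range=[extent, extent])
    return counts.ravel()
```

In practice the 0-dimensional and 1-dimensional vectors would be concatenated, and per-point density values could be used as histogram weights instead of unit counts.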
- The control unit 11 estimates the parameters related to the skin function based on the feature amount vector extracted through the flow of FIG. 4, using the estimation model constructed by the flow of FIG. More specifically, the control unit 11 estimates the parameters related to the skin function using a machine learning model including a random forest model. At this time, the control unit 11 may estimate the parameters related to the skin function based on, for example, the attributes of the subject in addition to the feature amount vector extracted through the flow of FIG. 4. The subject's attributes may include, for example, the subject's age and gender.
- The control unit 11 randomly selects one or more components of the feature amount vector extracted through the flow of FIG. 4. For example, the control unit 11 associates the randomly selected components with decision tree 1, and executes the same processing for each of the decision trees 2 through N. The control unit 11 estimates a TEWL value for each decision tree, using the components associated with that tree as variables, and averages the values estimated by the plurality of decision trees to obtain the final TEWL estimate.
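The scheme of FIG. 10 — each tree receiving a random subset of feature amount vector components and the per-tree TEWL estimates being averaged — is what an off-the-shelf random forest regressor does internally. The sketch below uses scikit-learn with purely synthetic data; the feature dimensions, coefficients, and attribute columns are assumptions for illustration, not measured values from the patent.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: rows are subjects, columns are feature amount
# vector components (persistence-map bin counts) plus age and gender.
X = np.column_stack([
    rng.uniform(0, 10, size=(300, 8)),   # 8 illustrative bin counts
    rng.integers(0, 80, size=300),       # age
    rng.integers(0, 2, size=300),        # gender (0/1)
])
# Assumed ground truth: TEWL driven by two bins and weakly by age.
tewl = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.05 * X[:, 8] + rng.normal(0, 0.5, 300)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, tewl)

predicted = model.predict(X)              # per-tree estimates, averaged
importances = model.feature_importances_  # variable importance, as in FIG. 12
```

The `feature_importances_` attribute is the mechanism behind FIG. 12's variable-importance chart: it ranks the vector components and subject attributes by their contribution to the trees' splits.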
- FIG. 11 is a scatter diagram showing an example of the first estimation result by the estimation device 1 of FIG.
- the vertical axis shows the measured value of TEWL.
- the horizontal axis shows the estimated value of TEWL.
- Black circles show data when using skin images of adult men.
- Adult men include men over the age of 20.
- White circles show data when using skin images of minor boys.
- Minor boys include boys aged 0 to 19 years.
- White triangles show data when skin images of minor girls are used. Underage girls include girls between the ages of 0 and 19.
- As shown in FIG. 11, the measured and estimated TEWL values are in good agreement. That is, the difference between the TEWL value estimated by the estimation device 1 and the measured TEWL value falls within a predetermined error range.
- the coefficient of determination at this time was 0.667.
- a strong correlation was found between the two.
- the same analysis was performed on the water content of the skin, and as a result, a correlation was found between the two.
- FIG. 12 is a graph showing an example of a second estimation result by the estimation device 1 of FIG.
- the vertical axis indicates the type of variable.
- the horizontal axis shows the importance of the variable.
- The estimation device 1 can also present the importance of the variables used in the estimation. For example, when the control unit 11 estimates the parameters related to the skin function based on the attributes of the subject in addition to the feature amount vector, the estimation device 1 can use age and gender as variables alongside the feature amount vector components, and can calculate the importance of these variables. FIG. 12 shows that, in the estimation result by the estimation device 1, age is a more important variable than gender. Although FIG. 12 shows the feature amount vector components and the subject attributes including age and gender as variables, the variables used for estimating TEWL may include any other variables; for example, they may include the skin moisture content.
- As described above, the estimation device 1 can accurately estimate the parameters related to the skin function. More specifically, the estimation device 1 estimates these parameters using an estimation model constructed based on past measured data in which feature amount vectors are associated with parameters related to the skin function. As a result, the trained estimation model allows the estimation device 1 to estimate the parameters accurately. For example, the estimation device 1 can accurately estimate the parameters related to the skin function with a machine learning model, including a random forest model, trained on the acquired past measured data.
- Since the estimation device 1 can accurately estimate the parameters related to the skin function, it can accurately estimate functions of biological tissue, including the skin barrier function. The estimation device 1 can therefore be utilized in a wide range of fields such as medical treatment and cosmetics. For example, the estimation device 1 can contribute to the diagnosis and evaluation of skin health, to the verification of the effects of skin treatments and skin care, and to the prediction of the onset of skin diseases.
- In conventional TEWL measurement and skin conductance measurement, it is necessary to wash the measurement site beforehand and to perform the measurement stably in an environment of constant temperature and humidity. Further, in conventional TEWL measurement, the measurement site must be kept still for about ten seconds during the measurement. It has therefore been difficult, with the prior art, to measure in environments where temperature and humidity cannot be controlled, or to measure newborns and infants who cannot keep the measurement site still. In these respects, measuring devices based on the prior art offer limited convenience.
- In contrast, the estimation device 1 accurately estimates the parameters related to the skin function from a skin image showing the unevenness of the skin surface by means of machine learning, so the stable measurement conditions required by the prior art are unnecessary. That is, a user of the estimation device 1 only needs to acquire a skin image showing the unevenness of the skin surface, and estimation is possible without restrictions on the environment or the subject.
- For example, the estimation device 1 can be applied both to cases where a skin image is acquired directly at a medical facility or a beauty-related store and to cases where a skin image of a subject at a remote location is acquired via communication. In some cases, estimation is possible without washing the measurement site. In these respects, the estimation device 1 improves user convenience in estimating the parameters related to the skin function.
- Since the estimation device 1 can present the values of the parameters related to the skin function to the user more easily than the prior art, it can also be applied to the creation and use of guidelines indicating, for example, which moisturizers or medicines should be applied to which kinds of people. That is, unlike the prior art, the estimation device 1 makes it possible to measure the parameters related to the skin function frequently, so such guidelines can be created and used easily.
- The estimation device 1 generates a corrected image by subjecting the acquired skin image to brightness correction and binarization, thereby removing from the skin image extraneous information that is unrelated to the unevenness of the skin surface and may act as noise. As a result, the estimation device 1 can estimate the parameters related to the skin function with higher accuracy.
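As an illustration of this kind of preprocessing, the sketch below flattens uneven illumination with a local-mean background estimate and then binarizes. The window size and the zero threshold are assumptions; the patent does not specify the exact brightness-correction algorithm.

```python
import numpy as np

def correct_and_binarize(img, window=3):
    """Brightness correction followed by binarization (illustrative sketch).

    The local mean over a (window x window) neighborhood serves as an
    estimate of the illumination background. Subtracting it removes
    brightness gradients unrelated to the skin-surface unevenness, and
    thresholding the result at zero yields the binary (white/black) image.
    """
    img = np.asarray(img, dtype=float)
    pad = window // 2
    padded = np.pad(img, pad, mode='edge')
    background = np.empty_like(img)
    h, w = img.shape
    for r in range(h):
        for c in range(w):
            background[r, c] = padded[r:r + window, c:c + window].mean()
    corrected = img - background
    return (corrected > 0).astype(np.uint8)  # 1 = white, 0 = black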
- By acquiring the series of image sets in step S302 of FIG. 4, the estimation device 1 can accurately separate essential information, such as topological information, from noise. For example, when only a single image is used, it is difficult to determine which of the connected components and holes contained in the image are essential features and which are noise.
- By changing the threshold value t stepwise and acquiring a series of image sets, the estimation device 1 can determine, for example, the persistence of each connected component or hole in a given region, and can accurately separate essential information from noise based on that persistence.
- By first extracting the feature amount vector from the skin image and then estimating the parameters related to the skin function with the machine learning model, the estimation device 1 reduces the number of required samples, suppresses the amount of computation, and makes it easier to interpret which features of the skin image relate to the estimated parameters.
- In step S203 of FIG. 3, the estimation device 1 estimates the parameters related to the skin function based on the attributes of the subject in addition to the feature amount vector, so the parameters can be estimated more accurately in accordance with those attributes.
- The steps of the estimation method using the estimation device 1 described above, and the functions included in each step, may be rearranged in any logically consistent order; the order of the steps may be changed, and a plurality of steps may be combined into one or a single step may be divided.
- The present invention can also be realized as a program describing the processing that realizes each function of the estimation device 1 described above, or as a storage medium on which the program is recorded. It should be understood that these are also included within the scope of the present invention.
Abstract
Description
An estimation method for estimating parameters related to skin function includes:
an image acquisition step of acquiring a skin image showing the unevenness of the skin surface;
an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature amount vector based on topological information of the skin image;
an estimation step of estimating the parameters related to the skin function based on the feature amount vector extracted in the extraction step, using an estimation model constructed based on past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and
a presentation step of presenting the parameters related to the skin function estimated in the estimation step.
A method of generating the estimation model used in the above estimation method includes:
an acquisition step of acquiring past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and
a construction step of constructing, based on the past measured data acquired in the acquisition step, the estimation model that estimates the parameters related to the skin function based on the feature amount vector.
A program causes an information processing device to execute the above estimation method or the above method of generating the estimation model.
An estimation device for estimating parameters related to skin function includes:
an image acquisition unit that acquires a skin image showing the unevenness of the skin surface;
a control unit that extracts, from the skin image acquired by the image acquisition unit, a feature amount vector based on topological information of the skin image, and estimates the parameters related to the skin function based on the extracted feature amount vector, using an estimation model constructed based on past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and
a presentation unit that presents the parameters related to the skin function estimated by the control unit.
11 control unit
12 communication unit
13 storage unit
14 image acquisition unit
15 data acquisition unit
16 presentation unit
G, R2, R3, R4, R5, R6 regions
t, t1, t2, t3, t4, t5, t6, tb_c, td_c, tb_h, td_h threshold values
Claims (11)
- 1. An estimation method for estimating parameters related to skin function, the estimation method comprising: an image acquisition step of acquiring a skin image showing the unevenness of the skin surface; an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature amount vector based on topological information of the skin image; an estimation step of estimating the parameters related to the skin function based on the feature amount vector extracted in the extraction step, using an estimation model constructed based on past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and a presentation step of presenting the parameters related to the skin function estimated in the estimation step.
- 2. The estimation method according to claim 1, wherein, in the extraction step, a corrected image is generated by subjecting the acquired skin image to a brightness correction process and a binarization process.
- 3. The estimation method according to claim 2, wherein the topological information includes information on 0-dimensional feature amounts and 1-dimensional feature amounts extracted based on the generated corrected image.
- 4. The estimation method according to claim 3, wherein, in the extraction step, a distribution map showing the persistence of each feature amount is generated for each of the 0-dimensional feature amounts and the 1-dimensional feature amounts, and the feature amount vector is extracted based on the generated distribution maps.
- 5. The estimation method according to any one of claims 1 to 4, wherein, in the estimation step, the parameters related to the skin function are estimated based on attributes of a subject.
- 6. The estimation method according to any one of claims 1 to 5, wherein the parameters related to the skin function include the transepidermal water loss.
- 7. The estimation method according to any one of claims 1 to 5, wherein the parameters related to the skin function include the water content of the skin.
- 8. A method of generating the estimation model used in the estimation method according to any one of claims 1 to 7, the method comprising: an acquisition step of acquiring past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and a construction step of constructing, based on the past measured data acquired in the acquisition step, the estimation model that estimates the parameters related to the skin function based on the feature amount vector.
- 9. The method of generating the estimation model according to claim 8, wherein the estimation model is a machine learning model including a random forest model trained based on the past measured data acquired in the acquisition step.
- 10. A program that causes an information processing device to execute the estimation method according to any one of claims 1 to 7 or the method of generating the estimation model according to claim 8 or 9.
- 11. An estimation device for estimating parameters related to skin function, the estimation device comprising: an image acquisition unit that acquires a skin image showing the unevenness of the skin surface; a control unit that extracts, from the skin image acquired by the image acquisition unit, a feature amount vector based on topological information of the skin image, and estimates the parameters related to the skin function based on the extracted feature amount vector, using an estimation model constructed based on past measured data in which the feature amount vector and the parameters related to the skin function are associated with each other; and a presentation unit that presents the parameters related to the skin function estimated by the control unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/593,346 US11534105B2 (en) | 2019-03-20 | 2020-03-19 | Estimation method, estimation model generation method, program, and estimation device |
JP2021507415A JP6983357B2 (ja) | 2019-03-20 | 2020-03-19 | 推定方法、推定モデルの生成方法、プログラム、及び推定装置 |
KR1020217028038A KR102411108B1 (ko) | 2019-03-20 | 2020-03-19 | 추정 방법, 추정 모델의 생성 방법, 프로그램, 및 추정 장치 |
CN202080020708.7A CN113574564B (zh) | 2019-03-20 | 2020-03-19 | 推测方法、推测模型的生成方法、程序和推测装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019053776 | 2019-03-20 | ||
JP2019-053776 | 2019-03-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020189754A1 true WO2020189754A1 (ja) | 2020-09-24 |
Family
ID=72519107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/012249 WO2020189754A1 (ja) | 2019-03-20 | 2020-03-19 | 推定方法、推定モデルの生成方法、プログラム、及び推定装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11534105B2 (ja) |
JP (1) | JP6983357B2 (ja) |
KR (1) | KR102411108B1 (ja) |
CN (1) | CN113574564B (ja) |
TW (1) | TWI770480B (ja) |
WO (1) | WO2020189754A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7348448B1 (ja) | 2022-06-01 | 2023-09-21 | 株式会社 サティス製薬 | Ai肌水分量解析方法、装置、またはシステムおよび学習済みai肌水分量解析モデル |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012519894A (ja) * | 2009-03-06 | 2012-08-30 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 少なくとも1の生物体の画像の処理 |
US20180035941A1 (en) * | 2016-08-04 | 2018-02-08 | Samsung Electronics Co., Ltd. | Apparatus and method for estimating skin condition |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6035268B2 (ja) | 1976-04-06 | 1985-08-13 | 松下電器産業株式会社 | 布の貼着け方法 |
JPS6058902B2 (ja) | 1980-06-16 | 1985-12-23 | セイコーエプソン株式会社 | 液晶性エステル化合物 |
JP4808975B2 (ja) | 2005-02-15 | 2011-11-02 | 株式会社 資生堂 | 肌の凹凸状態を評価・模擬するシミュレーション装置及びシミュレーション方法 |
US9823189B2 (en) | 2008-03-18 | 2017-11-21 | Balter, As. | Optical method for determining morphological parameters and physiological properties of tissue |
JP5733570B2 (ja) * | 2011-05-23 | 2015-06-10 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、および、記録媒体 |
JP5794889B2 (ja) | 2011-10-25 | 2015-10-14 | 富士フイルム株式会社 | シミ種別分類装置の作動方法、シミ種別分類装置およびシミ種別分類プログラム |
JP6058902B2 (ja) | 2012-03-13 | 2017-01-11 | 株式会社 資生堂 | 肌状態解析方法、肌状態解析装置、及び、肌状態解析システム、並びに、該肌状態解析方法を実行させるためのプログラム、及び、該プログラムを記録した記録媒体 |
WO2014027522A1 (ja) * | 2012-08-17 | 2014-02-20 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラムおよび画像処理システム |
US9256963B2 (en) | 2013-04-09 | 2016-02-09 | Elc Management Llc | Skin diagnostic and image processing systems, apparatus and articles |
US10667744B2 (en) * | 2013-06-28 | 2020-06-02 | Panasonic Intellectual Property Corporation Of America | Skin function evaluation device and skin evaluation method |
CN104887183B (zh) * | 2015-05-22 | 2017-12-22 | 杭州雪肌科技有限公司 | 基于光学的肌肤健康监测和预诊断智能方法 |
EP3236840B1 (en) * | 2015-06-15 | 2024-03-27 | E.S.I. Novel Ltd. | System and method for adaptive cosmetic skin treatment |
CN105427306B (zh) * | 2015-11-19 | 2018-02-23 | 上海家化联合股份有限公司 | 皮肤光泽度的图像分析方法和装置 |
JP6245590B1 (ja) * | 2016-06-20 | 2017-12-13 | 公立大学法人大阪市立大学 | 皮膚診断装置、皮膚状態出力方法、プログラムおよび記録媒体 |
KR101992403B1 (ko) * | 2016-10-18 | 2019-06-24 | 성균관대학교산학협력단 | 영상을 이용한 피부 수분도 측정 방법 |
US20180103915A1 (en) * | 2016-10-18 | 2018-04-19 | Alcare Co., Ltd. | Operation processing aparatus calculating numerical value representing skin barrier function, equipument, computer readable medium, and method for evaluating skin barrier function |
US10425633B2 (en) * | 2016-12-30 | 2019-09-24 | Konica Minolta Laboratory U.S.A., Inc. | Method and system for capturing images for wound assessment with moisture detection |
JP7444362B2 (ja) * | 2019-02-04 | 2024-03-06 | 国立研究開発法人理化学研究所 | 皮膚疾患の予防、改善または治療用医薬組成物 |
KR20210057506A (ko) * | 2019-11-12 | 2021-05-21 | 삼성전자주식회사 | 피부 장벽 기능 추정 장치 및 방법 |
-
2020
- 2020-03-19 WO PCT/JP2020/012249 patent/WO2020189754A1/ja active Application Filing
- 2020-03-19 JP JP2021507415A patent/JP6983357B2/ja active Active
- 2020-03-19 KR KR1020217028038A patent/KR102411108B1/ko active IP Right Grant
- 2020-03-19 TW TW109109282A patent/TWI770480B/zh active
- 2020-03-19 US US17/593,346 patent/US11534105B2/en active Active
- 2020-03-19 CN CN202080020708.7A patent/CN113574564B/zh active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012519894A (ja) * | 2009-03-06 | 2012-08-30 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 少なくとも1の生物体の画像の処理 |
US20180035941A1 (en) * | 2016-08-04 | 2018-02-08 | Samsung Electronics Co., Ltd. | Apparatus and method for estimating skin condition |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7348448B1 (ja) | 2022-06-01 | 2023-09-21 | 株式会社 サティス製薬 | Ai肌水分量解析方法、装置、またはシステムおよび学習済みai肌水分量解析モデル |
Also Published As
Publication number | Publication date |
---|---|
TWI770480B (zh) | 2022-07-11 |
CN113574564B (zh) | 2022-09-13 |
CN113574564A (zh) | 2021-10-29 |
JPWO2020189754A1 (ja) | 2021-11-11 |
JP6983357B2 (ja) | 2021-12-17 |
KR20210118192A (ko) | 2021-09-29 |
TW202100098A (zh) | 2021-01-01 |
US11534105B2 (en) | 2022-12-27 |
US20220142561A1 (en) | 2022-05-12 |
KR102411108B1 (ko) | 2022-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11670143B2 (en) | Method for estimating a quantity of a blood component in a fluid receiver and corresponding error | |
US9619883B2 (en) | Systems and methods for evaluating hyperspectral imaging data using a two layer media model of human tissue | |
Lee et al. | Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy | |
US9870625B2 (en) | Method for estimating a quantity of a blood component in a fluid receiver and corresponding error | |
US20160338603A1 (en) | Signal processing device, signal processing method, and computer-readable recording medium | |
JP6544648B2 (ja) | 骨粗鬆症診断支援装置 | |
JP6743129B2 (ja) | 心機能測定装置、心機能測定方法および心機能測定プログラム | |
US20110243415A1 (en) | Image processing apparatus, control method thereof, and program | |
Onat et al. | The contributions of image content and behavioral relevancy to overt attention | |
US20210133473A1 (en) | Learning apparatus and learning method | |
US20150110376A1 (en) | Methods, Systems and Computer Program Products for Dynamic Optical Histology Using Optical Coherence Tomography | |
JP2018521764A (ja) | 光干渉断層像スキャンの処理 | |
Li et al. | Retinal vessel detection and measurement for computer-aided medical diagnosis | |
US20220192521A1 (en) | Systems and methods for processing laser speckle signals | |
WO2020189754A1 (ja) | 推定方法、推定モデルの生成方法、プログラム、及び推定装置 | |
US20230113721A1 (en) | Functional measurements of vessels using a temporal feature | |
TWI528296B (zh) | 皮脂量的推定方法、皮脂量推定裝置及皮脂量推定程式 | |
CN111598966A (zh) | 一种基于生成对抗网络的磁共振成像方法及装置 | |
KR102239575B1 (ko) | 피부 진단 장치 및 방법 | |
Hani et al. | Non-invasive contrast enhancement for retinal fundus imaging | |
KR102340152B1 (ko) | 피판술 후 조직 괴사 및 손상의 조기 예측 시스템 및 그 방법 | |
Krishnamurthy et al. | Spatio-temporal feature analysis of laser speckle images for simultaneous quantification of skin thickness and perfusion demonstrated using in-vitro scleroderma phantoms | |
US20230180999A1 (en) | Learning apparatus, learning method, program, trained model, and endoscope system | |
Ti et al. | Contrast measurement for MRI images using histogram of second-order derivatives | |
Thompson et al. | Measuring Doppler-like power spectra and dermal perfusion using laser speckle contrast with multiple exposures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20773526 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021507415 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20217028038 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20773526 Country of ref document: EP Kind code of ref document: A1 |