US20220036549A1 - Method and apparatus for providing information associated with immune phenotypes for pathology slide image
- Publication number: US20220036549A1 (application US17/502,661)
- Authority: US (United States)
- Prior art keywords: immune, ROIs, image, pathology slide, phenotype
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G16H 30/20 — ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
- G16H 30/40 — ICT specially adapted for processing medical images, e.g. editing
- G16H 50/30 — ICT specially adapted for calculating health indices; for individual health risk assessment
- G16H 50/50 — ICT specially adapted for simulation or modelling of medical disorders
- G16H 50/70 — ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
- G16B 20/00 — ICT specially adapted for functional genomics or proteomics, e.g. genotype-phenotype associations
- G01N 1/30 — Staining; Impregnating; Fixation; Dehydration; Multistep processes for preparing samples of tissue, cell or nucleic acid material and the like for analysis
- G06T 7/0012 — Biomedical image inspection
- G06T 7/11 — Region-based segmentation
- G06T 7/12 — Edge-based segmentation
- G06T 2207/30024 — Cell structures in vitro; Tissue sections in vitro
- G06V 10/25 — Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V 10/774 — Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V 10/82 — Image or video recognition or understanding using neural networks
- G06V 20/69 — Microscopic objects, e.g. biological cells or cellular parts
- G06N 3/02 — Neural networks
Definitions
- the present disclosure relates to a method and a device for providing information associated with an immune phenotype for a pathology slide image, and more specifically, to a method and a device for generating and outputting an image indicative of information associated with the immune phenotype for one or more regions of interest (ROIs) in a pathology slide image.
- the immune anticancer drug may refer to any drug that prevents cancer cells from evading the body's immune system or that helps immune cells better recognize and attack cancer cells. Since it acts through the body's immune system, it causes fewer side effects than conventional anticancer drugs, and the survival period of cancer patients treated with an immune checkpoint inhibitor may be longer than that of patients treated with other anticancer drugs. However, immune checkpoint inhibitors are not effective for all cancer patients. Therefore, it is important to predict the response rate to the immune checkpoint inhibitor in order to predict its effect on the current cancer patient.
- a user (e.g., a doctor, a patient, and the like) may be provided with immune response information generated from a pathology slide image of the patient's tissue.
- however, when the user is provided with information on the immune response (e.g., immune cell expression information, and the like) for each of a plurality of patches in the pathology slide image, the immune response information may also be generated for a patch among the plurality of patches that is substantially unnecessary for predicting the response to the immune checkpoint inhibitor.
- the present disclosure provides a method and a device for providing information associated with an immune phenotype for a pathology slide image in order to solve the problems described above.
- the present disclosure may be implemented in various ways, including a method, a device (system), a computer readable storage medium storing instructions, or a computer program.
- a method, performed by at least one computing device, for providing information associated with an immune phenotype for a pathology slide image includes obtaining information associated with the immune phenotype for one or more regions of interest (ROIs) in the pathology slide image, generating, based on the information associated with the immune phenotype for the one or more ROIs, an image indicative of the information associated with the immune phenotype, and outputting the image indicative of the information associated with the immune phenotype.
- the one or more ROIs are determined based on a detection result for one or more target items for the pathology slide image.
- the one or more ROIs include at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items.
- the one or more ROIs are regions being output upon input of the detection result for the one or more target items for the pathology slide image or the pathology slide image to an ROI extraction model, and the ROI extraction model is trained to output a reference ROI upon input of the detection result for one or more target items for a reference pathology slide image or a reference pathology slide image.
- the obtaining includes obtaining the immune phenotype of the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the immune phenotype of the one or more ROIs, and the immune phenotype includes at least one of immune inflamed, immune excluded, or immune desert.
- the obtaining includes obtaining one or more immune phenotype scores for the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the one or more immune phenotype scores, and the one or more immune phenotype scores include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert.
- the obtaining includes obtaining a feature associated with one or more immune phenotypes for the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the feature associated with the one or more immune phenotypes, and the feature associated with the one or more immune phenotypes includes at least one of a statistical value or a vector associated with the immune phenotype.
- the outputting includes outputting one or more ROIs in the pathology slide image together with an image including a visual representation.
- the outputting may include overlaying the image including the visual representation on one or more ROIs in the pathology slide image.
- the method further includes obtaining a detection result for one or more target items from the pathology slide image, generating an image indicative of the detection result for one or more target items, and outputting the image indicative of the detection result for one or more target items.
- according to an embodiment of the present disclosure, there is provided a computer program stored in a computer-readable recording medium for executing, on a computer, the method for providing the information associated with the immune phenotype for the pathology slide image described above.
- a computing device may include a memory storing one or more instructions, and a processor configured to execute the stored one or more instructions to obtain information associated with immune phenotype for one or more ROIs in the pathology slide image, generate, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype, and output the image indicative of the information associated with the immune phenotype.
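- The following is a minimal, self-contained sketch of the three operations named above (obtain, generate, output). All names, data shapes, and the text-based rendering are illustrative assumptions and do not describe the actual implementation of the disclosure.

```python
# Hypothetical sketch: obtain immune-phenotype information for ROIs,
# generate a representation of it, and output it. Names and shapes are assumptions.

def obtain_immune_phenotype_info(slide_id: str):
    """Stand-in for obtaining information associated with immune phenotype for one or more ROIs."""
    return [{"roi": (0, 0, 1024, 1024), "phenotype": "immune inflamed", "score": 0.81},
            {"roi": (1024, 0, 2048, 1024), "phenotype": "immune desert", "score": 0.64}]

def generate_phenotype_representation(roi_infos):
    """Stand-in for generating an image indicative of the information (rendered here as text)."""
    return ["ROI {}: {} (score {:.2f})".format(info["roi"], info["phenotype"], info["score"])
            for info in roi_infos]

def output_representation(lines):
    """Stand-in for outputting the generated representation on a display device."""
    for line in lines:
        print(line)

output_representation(generate_phenotype_representation(obtain_immune_phenotype_info("slide-001")))
```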
- by providing the user with an image visually representing the information associated with the immune phenotype, it is possible to enable the user to intuitively recognize the information associated with the immune phenotype for each region.
- by overlaying a visual representation indicative of the information associated with the immune phenotype on a corresponding ROI in the pathology slide image, it is possible to enable the user to recognize at a glance which region of the pathology slide image corresponds to the information indicated by the corresponding visual representation.
- the ROI in the pathology slide image which actually requires analysis may be determined. That is, instead of analyzing the entire pathology slide image, the information processing system and/or the user terminal may perform processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the ROIs while excluding the regions where analysis is unnecessary, such that computer resources, processing costs, and the like can be minimized.
- more accurate results can be provided by processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the significant region when determining the immune phenotype and/or determining response or non-response to the immune checkpoint inhibitor.
- FIG. 1 is an exemplary configuration diagram illustrating a system in which an information processing system provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- FIG. 2 is a block diagram illustrating an internal configuration of the information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram illustrating an internal configuration of a user terminal according to an embodiment of the present disclosure.
- FIG. 4 is a flowchart illustrating a method for providing information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example of determining one or more ROIs in a pathology slide image according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of generating an immune phenotype determination result according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of outputting the immune phenotype determination result according to an embodiment.
- FIG. 8 is a diagram illustrating an example of outputting an immune phenotype determination result according to an embodiment of the present disclosure.
- FIG. 9 is an exemplary diagram illustrating an artificial neural network model according to an exemplary embodiment.
- FIG. 10 is a configuration diagram of an exemplary computing device (e.g., user terminal) that provides information associated with immune phenotype for pathology slide image according to an embodiment.
- FIG. 11 is a diagram illustrating an example of outputting a detection result for target item according to an embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of outputting a detection result for target item and an immune phenotype determination result according to an embodiment.
- the term “module” or “unit” used herein refers to a software or hardware component, and the “module” or “unit” performs certain roles.
- the “module” or “unit” may be configured to be stored in an addressable storage medium or configured to execute on one or more processors.
- the “module” or “unit” may include components such as software components, object-oriented software components, class components, and task components, and at least one of processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, micro-codes, circuits, data, databases, data structures, tables, arrays, and variables.
- functions provided in the components and the “modules” or “units” may be combined into a smaller number of components and “modules” or “units”, or further divided into additional components and “modules” or “units.”
- the “module” or “unit” may be implemented as a processor and a memory.
- the “processor” should be interpreted broadly to encompass a general-purpose processor, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a controller, a microcontroller, a state machine, and so forth.
- the “processor” may refer to an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a field-programmable gate array (FPGA), and so on.
- the “processor” may refer to a combination of processing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other combination of such configurations.
- the “memory” should be interpreted broadly to encompass any electronic component capable of storing electronic information.
- the “memory” may refer to various types of processor-readable media such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable PROM (EEPROM), flash memory, magnetic or optical data storage, registers, and so on.
- the “system” may refer to at least one of a server device and a cloud device, but is not limited thereto.
- the system may include one or more server devices.
- the system may include one or more cloud devices.
- the system may be configured together with both a server device and a cloud device and operated.
- target data may refer to any data or data item that can be used for training of a machine learning model, and may include, for example, data indicative of an image, data indicative of voice or voice characteristics, and the like, but is not limited thereto.
- the whole pathology slide image and/or at least one patch (or region) included in the pathology slide image are explained as the target data, but embodiments are not limited thereto, and any data that can be used for training a machine learning model may correspond to the target data.
- the target data may be tagged with label information through an annotation task.
- the “pathology slide image” refers to an image obtained by capturing a pathological slide fixed and stained through a series of chemical treatments in order to observe a tissue removed from a human body with a microscope.
- the pathology slide image may refer to a digital image captured with a microscope, and may include information on cells, tissues, and/or structures in the human body.
- the pathology slide image may include one or more patches, and the one or more patches may be tagged with label information (e.g., information on immune phenotype) through the annotation work.
- the “pathology slide image” may include H&E-stained tissue slides and/or IHC-stained tissue slides, but is not limited thereto, and tissue slides applied with various staining methods (e.g., chromogenic in situ hybridization (CISH), Fluorescent in situ hybridization (FISH), Multiplex IHC, and the like), or unstained tissue slides may also be included.
- the “pathology slide image” may be a patient's tissue slide generated to predict a response to immune checkpoint inhibitor, and it may include a tissue slide of a patient before treatment with immune checkpoint inhibitor and/or a tissue slide of a patient after treatment with immune checkpoint inhibitor.
- a “biomarker” may be defined as a marker that can objectively measure a normal or pathology state, a degree of response to a drug, and the like.
- the biomarker may include PD-L1, which is a biomarker associated with the immune checkpoint inhibitor, but is not limited thereto, and may include a Tumor Mutation Burden (TMB) value, a Microsatellite Instability (MSI) value, a Homologous Recombination Deficiency (HRD) value, CD3, CD8, CD68, FOXP3, CD20, CD4, CD45, CD163, and other various biomarkers related to immune cells.
- the “patch” may refer to a small region within the pathology slide image.
- the patch may include a region corresponding to a semantic object extracted by performing segmentation on the pathology slide image.
- the patch may refer to a combination of pixels associated with the label information generated by analyzing the pathology slide image.
- the “regions of interest (ROIs)” may refer to at least some regions to be analyzed in the pathology slide image.
- the ROIs may refer to at least some regions in the pathology slide image that include a target item.
- the ROIs may refer to at least some of a plurality of patches generated by segmenting the pathology slide image.
- a “machine learning model” and/or an “artificial neural network model” may include any model that is used for inferring an answer to a given input.
- the machine learning model may include an artificial neural network model including an input layer, a plurality of hidden layers, and an output layer.
- each layer may include a plurality of nodes.
- the machine learning model may be trained to infer label information for pathology slide images and/or at least one patch included in the pathology slides.
- the label information generated through the annotation task may be used to train the machine learning model.
- the machine learning model may include weights associated with a plurality of nodes included in the machine learning model. In an example, the weight may include any parameter associated with the machine learning model.
- training may refer to any process of changing a weight associated with the machine learning model using at least one patch and the label information.
- the training may refer to a process of changing or updating weights associated with the machine learning model through one or more of forward propagation and backward propagation of the machine learning model using at least one patch and the label information.
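- As a concrete illustration of the training described above, the following is a minimal sketch of one weight update through forward and backward propagation on a batch of patches and label information. PyTorch and the toy architecture are assumptions; the disclosure does not prescribe a framework or architecture.

```python
# Minimal training-step sketch (assumed PyTorch; toy architecture).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 3))   # toy classifier over 3 immune phenotypes
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def training_step(patches: torch.Tensor, labels: torch.Tensor) -> float:
    """One update of the weights associated with the model from patches and label information."""
    logits = model(patches)            # forward propagation
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()                    # backward propagation
    optimizer.step()                   # weights (parameters) are changed/updated
    return loss.item()

# usage with random stand-in data: a batch of eight 64x64 RGB patches and their labels
loss_value = training_step(torch.randn(8, 3, 64, 64), torch.randint(0, 3, (8,)))
```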
- the “label information” is the correct-answer information for a data sample, which is obtained as a result of the annotation task.
- the label or label information may be used interchangeably with terms such as annotation, tag, and so on as used in the art.
- the “annotation” may refer to an annotation work and/or annotation information (e.g., label information, and the like) determined by performing the annotation work.
- annotation information may refer to information for the annotation work and/or information generated by the annotation work (e.g., label information).
- the “target item” may refer to data/information, an image region, an object, and the like to be detected in the pathology slide image.
- the target item may include a target to be detected from the pathology slide image for diagnosis, treatment, prevention, or the like of a disease (e.g., cancer).
- the “target item” may include a target item in units of cells and a target item in units of areas.
- “each of a plurality of A” and/or “respective ones of a plurality of A” may refer to each of all components included in the plurality of A, or may refer to each of some of the components included in the plurality of A.
- each of the plurality of ROIs may refer to each of all ROIs included in the plurality of ROIs or may refer to each of some ROIs included in the plurality of ROIs.
- instructions may refer to one or more instructions grouped based on functions, which are the components of a computer program and executed by the processor.
- a “user” may refer to a person who uses a user terminal.
- the user may include an annotator who performs an annotation work.
- the user may include a doctor, a patient, and the like who is provided with the information associated with immune phenotype and/or a prediction result of a response to immune checkpoint inhibitor (e.g., a prediction result as to whether or not the patient responds to immune checkpoint inhibitor).
- the user may refer to the user terminal, or conversely, the user terminal may refer to the user. That is, the user and the user terminal may be interchangeably used herein.
- FIG. 1 is an exemplary configuration diagram illustrating a system in which an information processing system 100 provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- the system for providing information associated with immune phenotype for pathology slide image may include the information processing system 100 , a user terminal 110 , and a storage system 120 .
- the information processing system 100 may be configured to be connected to each of the user terminal 110 and the storage system 120 for communication.
- although FIG. 1 illustrates one user terminal 110, the present disclosure is not limited thereto, and in an exemplary configuration, a plurality of user terminals 110 may be connected to the information processing system 100 for communication.
- although the information processing system 100 is shown as one computing device in FIG. 1, embodiments are not limited thereto, and the information processing system 100 may be configured to process information and/or data in a distributed manner through a plurality of computing devices.
- although the storage system 120 is shown as a single device in FIG. 1, embodiments are not limited thereto, and the system may be configured with a plurality of storage devices or as a system that supports a cloud.
- the respective components of the system for providing information associated with immune phenotype of a pathology slide image illustrated in FIG. 1 represent functional components divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other.
- the information processing system 100 and the user terminal 110 are any computing devices used to generate and provide the information associated with immune phenotype for pathology slide image.
- the computing device may refer to any type of device equipped with a computing function, and may be a notebook, a desktop, a laptop, a server, a cloud system, and the like, for example, but is not limited thereto.
- the information processing system 100 may receive a pathology slide image.
- the information processing system 100 may receive the pathology slide image from the storage system 120 and/or the user terminal 110 .
- the information processing system 100 may generate information associated with immune phenotype of the pathology slide image and provide it to the user 130 through the user terminal 110 .
- the information processing system 100 may determine one or more ROIs in the pathology slide image and generate information associated with immune phenotype for the one or more ROIs.
- the information associated with immune phenotype may include at least one of an immune phenotype of one or more ROIs, an immune phenotype score of one or more ROIs, and a feature associated with the immune phenotype of one or more ROIs.
- the information processing system 100 may determine one or more ROIs based on a detection result for one or more target items for the pathology slide image. For example, the information processing system 100 may determine one or more ROIs including at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items. Additionally or alternatively, upon input of the detection result for one or more target items for a pathology slide image (e.g., a pathology slide image including a detection result for target item) to a region-of-interest (ROI) extraction model, the information processing system 100 may determine regions being output as one or more ROIs.
- the ROI extraction model may correspond to a model trained to output a reference ROI when the detection result for one or more target items for reference pathology slide image (e.g., a reference pathology slide image including the detection result for target item) is input.
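- A minimal sketch of the rule-based variant described above (ROIs as regions satisfying a condition on the detection result) is shown below. The per-patch count representation and the threshold value are illustrative assumptions, not the claimed ROI extraction model.

```python
# Hypothetical sketch: determine ROIs as grid patches whose target-item count meets a condition.
import numpy as np

def determine_rois(target_counts: np.ndarray, min_count: int = 50):
    """Return (row, col) indices of patches satisfying the condition (count >= min_count)."""
    rows, cols = np.where(target_counts >= min_count)
    return list(zip(rows.tolist(), cols.tolist()))

# usage: a 4x4 grid of patches with detected target-item counts (e.g., tumour cells per patch)
counts = np.array([[ 0, 12, 80, 95],
                   [ 3, 60,  5,  0],
                   [ 0,  0, 70, 10],
                   [ 1,  2,  4,  8]])
rois = determine_rois(counts)   # -> [(0, 2), (0, 3), (1, 1), (2, 2)]
```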
- the user terminal 110 may obtain from the information processing system 100 the information associated with immune phenotype for one or more ROIs in the pathology slide image.
- the information associated with immune phenotype for one or more ROIs may include an immune phenotype (e.g., at least one of immune inflamed, immune excluded, or immune desert) for one or more ROIs.
- the information associated with immune phenotype for one or more ROIs may include one or more immune phenotype scores for the one or more ROIs (e.g., the immune phenotype score is at least one of a score for immune inflamed, a score for immune excluded or a score for immune desert).
- the information associated with immune phenotype for one or more ROIs may include a feature associated with one or more immune phenotypes for one or more ROIs (e.g., at least one of statistical value or vector associated with the immune phenotype).
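- One possible way to carry the information listed above (immune phenotype, scores, and features per ROI) from the information processing system 100 to the user terminal 110 is sketched below. The field names and types are illustrative assumptions, not a defined interface of the disclosure.

```python
# Hypothetical container for the information associated with immune phenotype for one ROI.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class RoiImmunePhenotypeInfo:
    bbox: Tuple[int, int, int, int]                           # ROI location in slide coordinates (x0, y0, x1, y1)
    phenotype: str                                            # "immune inflamed", "immune excluded", or "immune desert"
    scores: Dict[str, float] = field(default_factory=dict)    # score per immune phenotype (e.g., probabilities)
    features: List[float] = field(default_factory=list)       # statistical values / vector associated with the phenotype

info = RoiImmunePhenotypeInfo(bbox=(0, 0, 2000, 2000), phenotype="immune inflamed",
                              scores={"immune inflamed": 0.8, "immune excluded": 0.15, "immune desert": 0.05})
```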
- the user terminal 110 may generate an image indicative of the information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs.
- the user terminal 110 may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs.
- the user terminal 110 may generate an image including a visual representation corresponding to the one or more immune phenotype scores.
- the visual representation may include color (e.g., hue, brightness, saturation, and the like), text, image, mark, figure, and the like.
- the user terminal 110 may output an image indicative of the information associated with the generated immune phenotype.
- the user terminal 110 may output one or more ROIs in the pathology slide image together with an image including a visual representation. That is, the one or more ROIs in the pathology slide image and the image including the visual representation may be displayed together on the display device associated with the user terminal 110 .
- the user terminal 110 may overlay the image including the visual representation on one or more ROIs in the pathology slide image. That is, one or more ROIs in the pathology slide image overlaid with the image including the visual representation may be displayed on the display device associated with the user terminal 110 .
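- A minimal sketch of such an overlay is shown below, assuming Pillow and an assumed semi-transparent colour per immune phenotype (the specific colours are not prescribed by the disclosure).

```python
# Hypothetical overlay of a visual representation on ROIs of the pathology slide image.
from PIL import Image, ImageDraw

PHENOTYPE_RGBA = {"immune inflamed": (220, 40, 40, 90),    # assumed colours
                  "immune excluded": (240, 160, 30, 90),
                  "immune desert":   (60, 90, 220, 90)}

def overlay_phenotypes(slide: Image.Image, roi_infos) -> Image.Image:
    """roi_infos: iterable of ((x0, y0, x1, y1), phenotype) pairs in slide coordinates."""
    layer = Image.new("RGBA", slide.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(layer)
    for bbox, phenotype in roi_infos:
        draw.rectangle(bbox, fill=PHENOTYPE_RGBA[phenotype])
    return Image.alpha_composite(slide.convert("RGBA"), layer)

# usage with a blank stand-in slide image
displayed = overlay_phenotypes(Image.new("RGB", (1024, 768), "white"),
                               [((100, 100, 400, 400), "immune inflamed")])
```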
- the storage system 120 is a device or a cloud system that stores and manages pathology slide images associated with a target patient and various data associated with a machine learning model to provide information associated with immune phenotype for the pathology slide image.
- the storage system 120 may store and manage various types of data using a database.
- the various data may include any data associated with the machine learning model, and include, for example, a file of the target data, meta information of the target data, label information for the target data that is the result of the annotation work, data related to the annotation work, a machine learning model (e.g., an artificial neural network model), and the like, but are not limited thereto. While FIG. 1 shows the information processing system 100 and the storage system 120 as separate systems, embodiments are not limited thereto, and they may be incorporated into one system.
- the ROI in the pathology slide image which actually requires analysis may be determined.
- the information processing system 100 and/or the user terminal 110 may perform processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the ROIs while excluding the regions where analysis is unnecessary, such that computer resources, processing costs, and the like can be minimized.
- more accurate prediction results may be provided by processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the significant region when determining the immune phenotype and/or determining response or non-response to the immune checkpoint inhibitor.
- FIG. 2 is a block diagram illustrating an internal configuration of the information processing system 100 according to an embodiment of the present disclosure.
- the information processing system 100 may generate the information associated with immune phenotype for the pathology slide image.
- the information processing system 100 may include a target item detection unit 210 , an ROI determination unit 220 , and an immune phenotype determination unit 230 .
- Respective components of the information processing system 100 illustrated in FIG. 2 represent functional components that can be divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other.
- the target item detection unit 210 may receive the pathology slide image (e.g., H&E-stained pathology slide image, IHC-stained pathology slide image, and the like), and detect one or more target items in the received pathology slide image.
- the target item detection unit 210 may use the artificial neural network model for target item detection to detect one or more target items in the pathology slide image.
- the artificial neural network model for target item detection may correspond to a model trained to detect one or more reference target items from the reference pathology slide image.
- the target item detection unit 210 may detect a target item in units of cells and/or a target item in units of regions in the pathology slide image.
- the target item detection unit 210 may detect tumor cells, lymphocytes, macrophages, dendritic cells, fibroblasts, endothelial cells, blood vessels, cancer stroma, cancer epithelium, a cancer area, a normal area (e.g., a normal lymph node architecture region), and the like, as the target items in the pathology slide image.
- the ROI determination unit 220 may determine one or more ROIs in the pathology slide image.
- the ROI may include a region in which one or more target items are detected in the pathology slide image.
- the ROI determination unit 220 may determine, as the region of interest, a patch including one or more target items from among a plurality of patches forming the pathology slide image.
- the ROI determination unit 220 may determine one or more ROIs based on a detection result for one or more target items for a pathology slide image.
- the ROI determination unit 220 may determine one or more ROIs including at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items.
- the ROI determination unit 220 may determine the regions being output as one or more ROIs upon input of the detection result for one or more target items for a pathology slide image and/or the pathology slide image to an ROI extraction model.
- the immune phenotype determination unit 230 may generate the information associated with immune phenotype for one or more ROIs in the pathology slide image. In an embodiment, the immune phenotype determination unit 230 may determine the immune phenotype of one or more ROIs based on the detection result for one or more target items. For example, the immune phenotype determination unit 230 may determine whether the immune phenotype of one or more ROIs is immune inflamed, immune excluded, or immune desert, based on the detection result for one or more target items. In another embodiment, the immune phenotype determination unit 230 may calculate the immune phenotype score of one or more ROIs based on the detection result for one or more target items.
- the immune phenotype determination unit 230 may calculate a score for immune inflamed, a score for immune excluded, and/or a score for immune desert of one or more ROIs, based on the detection result for one or more target items. To this end, the immune phenotype determination unit 230 may calculate a score indicating a probability that the immune phenotype of one or more ROIs is immune inflamed, a score indicating a probability that it is immune excluded, and/or a score indicating a probability that it is immune desert, as illustrated by the sketch below.
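- A simplified sketch of one such rule is given below. The density-based cut-off and the way the scores are normalised are illustrative assumptions only and are not the determination rule claimed by the disclosure.

```python
# Hypothetical rule: derive an immune phenotype and per-phenotype scores for one ROI
# from immune-cell densities in the cancer area and the cancer stroma.
def immune_phenotype_scores(til_density_cancer_area: float,
                            til_density_cancer_stroma: float,
                            cutoff: float = 100.0):
    """Densities in cells per mm^2 (assumed unit); returns (phenotype, scores)."""
    if til_density_cancer_area >= cutoff:
        phenotype = "immune inflamed"
    elif til_density_cancer_stroma >= cutoff:
        phenotype = "immune excluded"
    else:
        phenotype = "immune desert"

    # crude pseudo-probabilities derived from the two densities (assumption, not a trained model)
    total = til_density_cancer_area + til_density_cancer_stroma + cutoff
    scores = {"immune inflamed": til_density_cancer_area / total,
              "immune excluded": til_density_cancer_stroma / total,
              "immune desert": cutoff / total}
    return phenotype, scores

phenotype, scores = immune_phenotype_scores(250.0, 40.0)   # -> "immune inflamed"
```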
- the immune phenotype determination unit 230 may generate a feature associated with one or more immune phenotypes for one or more ROIs.
- the feature associated with one or more immune phenotypes may include at least one of a statistical value or a vector associated with the immune phenotype.
- the feature associated with one or more immune phenotypes may include score values related to the immune phenotype output from an artificial neural network model or a machine learning model. That is, it may include score values output in the process of determining the immune phenotype for one or more ROIs.
- the feature associated with one or more immune phenotypes may include a density value, number, or various statistics of immune cells corresponding to a threshold (or cut-off) for an immune phenotype or a vector value or the like expressing the distribution of immune cells.
- the feature associated with one or more immune phenotypes may include a scalar value or a vector value including a relative relationship (e.g., a histogram vector or a graph expression vector considering the direction and distance) or relative statistics (e.g., the ratio of the number of immune cells to the number of specific cells, and the like) between an immune cell or cancer cell and a specific cell (e.g., cancer cells, immune cells, fibroblasts, lymphocytes, plasma cells, macrophage, endothelial cells, and the like).
- the feature associated with one or more immune phenotypes may include scalar values or vector values including statistics (e.g., the ratio of the number of immune cells to the cancer stromal region, and the like) or distributions (e.g., histogram vector or graph representation vector, and the like) of immune cells or cancer cells in a specific region (e.g., cancer area, cancer stromal region, tertiary lymphoid structure, normal region, necrosis, fat, blood vessel, high endothelial venule, lymphatic vessel, nerve, and the like).
- the feature associated with one or more immune phenotypes may include scalar values or vector values including relative relationship (e.g., histogram vector or graph expression vector considering the direction and distance) or relative statistics (e.g., the ratio of the number of immune cells to the number of specific cells, and the like) between positive/negative cells according to the expression amount of the biomarker and the specific cells (e.g., cancer cells, immune cells, fibroblasts, lymphocytes, plasma cells, macrophage, endothelial cells, and the like).
- the feature associated with one or more immune phenotypes may include scalar values or vector values including statistics (e.g., the ratio of the number of immune cells to the cancer stromal region, and the like) or distributions (e.g., histogram vector or graph representation vector, and the like) of positive/negative cells according to the expression amount of the biomarker in a specific region (e.g., cancer area, cancer stromal region, tertiary lymphoid structure, normal region, necrosis, fat, blood vessel, high endothelial venule, lymphatic vessel, nerve, and the like).
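- A small sketch of a few of the features listed above (a density, a relative statistic, and a distribution vector) is given below. The bin edges, units, and the particular combination of features are illustrative assumptions.

```python
# Hypothetical feature vector combining an immune-cell density, a relative statistic,
# and a normalised histogram of immune-cell distances to the nearest cancer cell.
import numpy as np

def immune_feature_vector(n_immune_cells: int,
                          n_cancer_cells: int,
                          cancer_stroma_area_mm2: float,
                          immune_cell_distances_um: np.ndarray) -> np.ndarray:
    density = n_immune_cells / max(cancer_stroma_area_mm2, 1e-6)    # immune cells per mm^2 of cancer stroma
    ratio = n_immune_cells / max(n_cancer_cells, 1)                  # immune cells per cancer cell
    hist, _ = np.histogram(immune_cell_distances_um, bins=[0, 25, 50, 100, 200, np.inf])
    hist = hist / max(hist.sum(), 1)                                 # distribution of distances
    return np.concatenate([[density, ratio], hist])

features = immune_feature_vector(320, 800, 1.6, np.random.exponential(60.0, size=320))
```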
- the information processing system 100 includes the target item detection unit 210 , the ROI determination unit 220 , and the immune phenotype determination unit 230 , but embodiments are not limited thereto, and some components may be omitted or other components may be added.
- the information processing system 100 may further include a response predicting unit (not illustrated) for immune checkpoint inhibitor, and this response prediction unit for immune checkpoint inhibitor may generate a prediction result for whether the patient responds to the immune checkpoint inhibitor or not based on the information associated with immune phenotype.
- the information processing system 100 may further include an output unit (not illustrated), and this output unit may output at least one of a detection result for the one or more target items, an immune phenotype of one or more ROIs, a prediction result as to whether or not the patient responds to immune checkpoint inhibitor, or a density of immune cells in each of one or more ROIs.
- FIG. 3 is a block diagram illustrating an internal configuration of the user terminal 110 according to an embodiment of the present disclosure.
- the user terminal 110 may include an image generation unit 310 and an image output unit 320 .
- Respective components of the user terminal 110 illustrated in FIG. 3 represent functional components that can be divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other.
- the image generation unit 310 may obtain the information associated with immune phenotype for one or more ROIs in the pathology slide image. For example, the image generation unit 310 may receive the information associated with immune phenotype for one or more ROIs generated by the information processing system. Additionally or alternatively, as the user terminal 110 generates the information associated with immune phenotype for one or more ROIs, the image generation unit 310 may obtain the information associated with immune phenotype for one or more ROIs. Additionally or alternatively, the image generation unit 310 may receive the information associated with immune phenotype for one or more ROIs stored in an internal and/or external device of the user terminal 110 .
- the image generation unit 310 may generate an image indicative of the information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs. In an embodiment, when obtaining the immune phenotype of one or more ROIs, the image generation unit 310 may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs. In another embodiment, when receiving one or more immune phenotype scores for one or more ROIs, the image generation unit 310 may generate an image including a visual representation corresponding to one or more immune phenotype scores.
- the image generation unit 310 may generate an image including a visual representation corresponding to a feature (e.g., a value of a feature) associated with one or more immune phenotypes.
- in an example, the image generated by the image generation unit 310 may include a classification map, which may be a map visualizing a result of classifying a tumor proportion score (TPS) and/or a combined proportion score (CPS), which are information related to the expression of PD-L1, based on a specific threshold.
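- As an illustration of such a classification map, the sketch below computes TPS and CPS from cell counts and classifies the result against thresholds. The count-based definitions of TPS and CPS used here are the commonly reported ones, and the 1%/50% cut-offs are assumptions rather than limitations of the disclosure.

```python
# Hypothetical PD-L1 expression scoring and threshold-based classification.
def tps(pdl1_pos_tumor_cells: int, viable_tumor_cells: int) -> float:
    """Tumor Proportion Score: percentage of viable tumour cells that are PD-L1 positive."""
    return 100.0 * pdl1_pos_tumor_cells / max(viable_tumor_cells, 1)

def cps(pdl1_pos_tumor_cells: int, pdl1_pos_immune_cells: int, viable_tumor_cells: int) -> float:
    """Combined Positive Score: PD-L1 positive tumour and immune cells per 100 tumour cells, capped at 100."""
    return min(100.0, 100.0 * (pdl1_pos_tumor_cells + pdl1_pos_immune_cells) / max(viable_tumor_cells, 1))

def classify_expression(score: float, low: float = 1.0, high: float = 50.0) -> str:
    """Map a TPS/CPS value to a classification-map label (assumed thresholds)."""
    if score < low:
        return "negative"
    return "low" if score < high else "high"

label = classify_expression(tps(120, 400))   # TPS = 30.0 -> "low"
```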
- the image output unit 320 may output through the display device an image indicative of the information associated with immune phenotype.
- the image output unit 320 may output through the display device one or more ROIs in the pathology slide image and an image including a visual representation.
- the image output unit 320 may overlay an image including a visual representation on one or more ROIs in the pathology slide image.
- FIG. 3 illustrates that the image generation unit 310 is included in the user terminal 110 , but embodiments are not limited thereto, and the image generation unit 310 may be included in any external device (e.g., in the information processing system 100 , and the like) capable of communicating with the user terminal 110 by wire and/or wirelessly.
- the image output unit 320 of the user terminal 110 may receive an image generated from the external device, and display the received image on a display device connected to the user terminal 110 by wire and/or wirelessly.
- FIGS. 2 and 3 illustrate that the target item detection unit 210 , the ROI determination unit 220 , the immune phenotype determination unit 230 , the image generation unit 310 , and the image output unit 320 are separately executed in the information processing system 100 and the user terminal 110 , but embodiments are not limited thereto, and these components may be executed by one device. In another embodiment, these components may be distributed and processed in any combination by a plurality of any devices (e.g., the information processing system 100 , the user terminal 110 , and the like).
- FIG. 4 is a flowchart illustrating a method 400 for providing information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- the method 400 for providing information associated with immune phenotype for pathology slide image may be performed by a processor (e.g., at least one processor of the user terminal and/or at least one processor of the information processing system).
- the method 400 for providing information associated with immune phenotype for pathology slide image may be initiated by the processor obtaining the information associated with immune phenotype for one or more ROIs in the pathology slide image (S 410 ).
- the information associated with immune phenotype for one or more ROIs may include an immune phenotype for one or more ROIs (e.g., at least one of immune inflamed, immune excluded, or immune desert), and/or an immune phenotype score (e.g., at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert) for one or more ROIs.
- the information associated with immune phenotype for one or more ROIs may include a feature associated with one or more immune phenotypes for one or more ROIs (e.g., statistical value or vector associated with the immune phenotype).
- the one or more ROIs may be determined based on a detection result for one or more target items for a pathology slide image.
- the one or more ROIs may include at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items.
- the one or more ROIs may be the regions being output upon input of the detection result for one or more target items for the pathology slide image and/or the pathology slide image to the ROI extraction model.
- the ROI extraction model may correspond to a model that is trained to output a reference ROI upon input of the detection result for one or more target items for reference pathology slide image (e.g., a reference pathology slide image including the detection result for target item), and/or a reference pathology slide image.
- the processor may generate an image indicative of information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs (S 420 ).
- the processor may generate an image including a visual representation corresponding to immune phenotype of one or more ROIs.
- the processor may generate an image including a visual representation corresponding to one or more immune phenotype scores.
- the processor may generate an image including a visual representation corresponding to a score for immune inflamed.
- the processor may generate an image including a visual representation corresponding to a score for immune excluded.
- the processor may generate an image including a visual representation corresponding to a score for immune desert.
- the processor may generate an image including a visual representation corresponding to a feature associated with one or more immune phenotypes.
- the processor may generate an image including a visual representation corresponding to a value of a feature associated with one or more immune phenotypes.
- the processor may output an image indicative of the information associated with immune phenotype (S 430 ).
- the processor may output one or more ROIs in the pathology slide image together with an image including a visual representation.
- the processor may overlay an image including a visual representation on one or more ROIs in the pathology slide image.
- the processor may obtain a detection result for one or more target items from the pathology slide image, and generate an image indicative of the detection result for one or more target items. Then, the processor may output the image indicative of the detection result for one or more target items.
- FIG. 5 is a diagram illustrating an example of determining one or more ROIs 522 _ 1 , 522 _ 2 , 522 _ 3 , 522 _ 4 , 524 , and 526 in a pathology slide image 510 according to an embodiment of the present disclosure.
- a user (e.g., a doctor, a researcher, and the like) may obtain a patient's tissue (e.g., tissue immediately prior to treatment, tissue after treatment with an immune checkpoint inhibitor, and the like).
- the user may perform H&E staining on the obtained patient's tissue and digitize the H&E-stained tissue slide through a scanner to generate a pathology slide image.
- the user may perform IHC staining on the obtained patient's tissue and digitize the IHC-stained tissue slide through a scanner to generate a pathology slide image.
- the ROI determination unit 220 of the information processing system may determine one or more ROIs in the pathology slide image.
- the ROIs may correspond to the regions having various shapes, such as circle, square, rectangle, polygon, contour, and the like.
- the ROI determination unit 220 may determine one or more ROIs based on the detection result for one or more target items for a pathology slide image. For example, among a plurality of patches (e.g., 1 mm² sized patches) generated by dividing the pathology slide image into N grids (where N is any natural number), the ROI determination unit 220 may determine a patch, from which one or more target items (e.g., items and/or immune cells associated with a cancer) are detected, as the ROI.
- the ROI determination unit 220 may determine one or more ROIs such that the ROIs include at least some regions in the pathology slide image that satisfy a condition associated with one or more target items. For example, the ROI determination unit 220 may determine a region, which has the number and/or area of target items (e.g., tumor cells, immune cells, cancer area, cancer stroma, and the like) equal to or greater than a reference value in the pathology slide image, as the ROI. Additionally or alternatively, the ROI determination unit 220 may determine a region, which has a numerical value such as a ratio, a density, and the like of the target item equal to or greater than a reference value in the pathology slide image, as the ROI.
- the target item may correspond to a cell and/or region detected to determine the immune phenotype.
- the reference value may refer to a numerical value set for the target item to define a statistically significant immune phenotype and/or a clinically significant immune phenotype.
- the ROI determination unit 220 may determine a region, which has any area in the pathology slide image, as the ROI.
- the area of the ROI may be dynamically determined to satisfy the condition associated with one or more target items described above. That is, the area of the ROI is not fixedly predetermined, and may be dynamically determined as the ROI determination unit 220 determines the region, which has the number, area, ratio, and/or density and the like of the target items equal to or greater than the reference value, as the ROI.
- the ROI determination unit 220 may determine one or more ROIs such that the area of the ROIs has a statically predetermined value.
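- As a non-limiting sketch of the patch-based ROI determination described above, the following Python code marks grid patches whose target-item count reaches a reference value as ROIs; the grid size, the per-patch counts, and the reference value are illustrative assumptions.

```python
import numpy as np

def determine_roi_patches(target_item_counts: np.ndarray, reference_value: int):
    """Return (row, col) indices of grid patches whose target-item count is
    equal to or greater than the reference value; each such patch is treated as an ROI."""
    rows, cols = np.where(target_item_counts >= reference_value)
    return list(zip(rows.tolist(), cols.tolist()))

# Example: a 4 x 4 grid of 1 mm^2 patches with hypothetical tumor-cell counts per patch
counts = np.array([
    [ 12,   0,  3,  90],
    [  7, 150, 40,   5],
    [  0,  60,  2,   0],
    [ 33,   1,  0, 210],
])
rois = determine_roi_patches(counts, reference_value=50)
print(rois)  # [(0, 3), (1, 1), (2, 1), (3, 3)]
```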
- the ROI determination unit 220 may determine the regions output from the ROI extraction model as the one or more ROIs.
- the ROI extraction model may correspond to a machine learning model (e.g., Neural Network, CNN, SVM, and the like) trained to output a reference ROI upon input of a detection result for one or more target items for reference pathology slide image and/or a reference pathology slide image.
- the area of the ROI may be determined dynamically and/or statically.
- the ROI determination unit 220 may determine the plurality of ROIs such that at least some of the plurality of ROIs overlap with each other. For example, when the ROI determination unit 220 determines a first ROI and a second ROI for the pathology slide image, at least some regions of the first ROI and at least some regions of the second ROI may be regions overlapping with each other. In another embodiment, when a plurality of ROIs for one pathology slide image is determined, the ROI determination unit 220 may determine the plurality of ROIs such that at least some of the plurality of ROIs do not overlap with each other.
- the ROI determination unit 220 may receive the pathology slide image 510 (e.g., pathology slide image including a detection result for target item), and determine one or more ROIs 522 _ 1 , 522 _ 2 , 522 _ 3 , 522 _ 4 , 524 , and 526 in the pathology slide image.
- the ROI determination unit 220 may determine the regions 522 _ 1 , 522 _ 2 , 522 _ 3 and 522 _ 4 , which have a specific area (e.g., 1 mm² as a predetermined area) that satisfies the condition associated with the target item, as the ROIs.
- the ROI determination unit 220 may determine a region 524 , which satisfies the condition associated with the target item, as the ROI, and the area of the ROI may be dynamically determined.
- the ROI determination unit 220 may determine an elliptical region 526 , which satisfies the condition associated with the target item, as the ROI.
- FIG. 6 is a diagram illustrating an example of generating an immune phenotype determination result 620 according to an embodiment of the present disclosure.
- the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 of one or more ROIs based on the detection result for one or more target items (e.g., items and/or immune cells associated with a cancer) in the one or more ROIs.
- the immune phenotype determination unit 230 may determine the immune phenotype of the corresponding ROI as at least one of immune inflamed, immune excluded, or immune desert based on the detection result for one or more target items in one or more ROIs.
- the immune phenotype determination unit 230 may calculate an immune phenotype score for one or more ROIs based on the detection result for one or more target items in one or more ROIs. For example, the immune phenotype determination unit 230 may calculate at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert in the corresponding ROI based on the detection result for one or more target items in one or more ROIs. In this case, the immune phenotype determination unit 230 may determine an immune phenotype of the corresponding ROI based on at least one of a score for immune inflamed, a score for immune excluded, and a score for immune desert for one or more ROIs. For example, when the score for immune inflamed of a specific ROI is equal to or greater than a threshold value, the immune phenotype determination unit 230 may determine the immune phenotype of the corresponding ROI as immune inflamed.
- the immune phenotype determination unit 230 may calculate at least one of the number, distribution, or density of target items in one or more ROIs, and may determine an immune phenotype and/or an immune phenotype score of one or more ROIs based on at least one of the calculated number, distribution, or density of the immune cells. For example, the immune phenotype determination unit 230 may calculate, within one or more ROIs, a density of lymphocytes in the cancer area and a density of lymphocytes in the cancer stroma area, and determine the immune phenotype of the one or more ROIs based on at least one of the density of immune cells in the cancer area or the density of immune cells in the cancer stroma.
- the immune phenotype determination unit 230 may determine the immune phenotype of one or more ROIs as one of immune inflamed, immune excluded, or immune desert, by referring to the number of immune cells included in the specific region in the cancer area.
- the immune phenotype determination unit 230 may determine the immune phenotype of a first ROI 612 , which has a density of immune cells in the cancer area equal to or greater than a first threshold density, as immune inflamed.
- the immune phenotype determination unit 230 may determine the immune phenotype of a second ROI 614 , which has a density of immune cells in the cancer area less than the first threshold density and a density of immune cells in the cancer stroma equal to or greater than the second threshold density, as immune excluded.
- the immune phenotype determination unit 230 may determine the immune phenotype of a third ROI 616 , which has a density of immune cells in the cancer area less than the first threshold density and a density of immune cells in the cancer stroma less than the second threshold density, as immune desert.
- the first threshold density may be determined based on a distribution of the density of the immune cells in the cancer area in each of a plurality of ROIs in the plurality of pathology slide images.
- the second threshold density may be determined based on a distribution of the density of the immune cells in the cancer stroma in each of the plurality of ROIs in the plurality of pathology slide images.
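- The density-based rule described above may be sketched, for example, as follows; the use of an upper-quartile cutoff to derive the first and second threshold densities from the density distributions is an illustrative assumption, not a prescribed choice.

```python
import numpy as np

def immune_phenotype(density_cancer: float, density_stroma: float,
                     thr_cancer: float, thr_stroma: float) -> str:
    """Rule described above: immune inflamed if the immune-cell density in the cancer
    area reaches the first threshold; otherwise immune excluded if the density in the
    cancer stroma reaches the second threshold; otherwise immune desert."""
    if density_cancer >= thr_cancer:
        return "immune inflamed"
    if density_stroma >= thr_stroma:
        return "immune excluded"
    return "immune desert"

# The thresholds may be derived from density distributions over many ROIs in many slides.
# Using the 75th percentile here is purely an illustrative assumption.
cancer_densities_all_rois = np.array([5.0, 12.0, 80.0, 3.0, 150.0, 40.0])
stroma_densities_all_rois = np.array([10.0, 90.0, 20.0, 110.0, 15.0, 60.0])
thr_cancer = float(np.percentile(cancer_densities_all_rois, 75))
thr_stroma = float(np.percentile(stroma_densities_all_rois, 75))

print(immune_phenotype(120.0, 10.0, thr_cancer, thr_stroma))  # immune inflamed
print(immune_phenotype(5.0, 95.0, thr_cancer, thr_stroma))    # immune excluded
```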
- the immune phenotype determination unit 230 may input the feature for each of the one or more ROIs to the artificial neural network immune phenotype classification model to determine the immune phenotype and/or immune phenotype score of each of the one or more ROIs.
- the artificial neural network immune phenotype classification model may correspond to a classifier that is trained to determine the immune phenotype of the reference ROI as one of immune inflamed, immune excluded, or immune desert upon input of the feature for the reference ROI.
- the feature for each of one or more ROIs may include a statistical feature for one or more target items in each of one or more ROIs (e.g., density, number, and the like, of specific target items in the ROI), a geometric feature for one or more target items (e.g., a feature including relative position information between specific target items, and the like), and/or an image feature, and the like corresponding to each of the one or more ROIs (e.g., a feature extracted from a plurality of pixels included in the ROIs, an image vector corresponding to the ROIs, and the like).
- the feature for each of the one or more ROIs may include a feature obtained by concatenating two or more features from among the statistical feature for one or more target items in each of the one or more ROIs, the geometric feature for the one or more target items, or the image feature corresponding to each of the one or more ROIs.
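- As an illustration of combining per-ROI features by concatenation before classification, a minimal Python sketch is given below; the feature names and dimensions are assumptions.

```python
import numpy as np

def build_roi_feature(statistical: np.ndarray,
                      geometric: np.ndarray,
                      image_embedding: np.ndarray) -> np.ndarray:
    """Concatenate per-ROI features into a single input vector for the
    immune phenotype classification model."""
    return np.concatenate([statistical, geometric, image_embedding], axis=0)

# Illustrative per-ROI features (dimensions are arbitrary assumptions)
statistical = np.array([412.0, 35.7, 0.12])      # e.g., counts/densities of target items
geometric = np.array([18.4, 0.66])               # e.g., relative-position statistics
image_embedding = np.random.rand(16)             # e.g., an image vector for the ROI

feature = build_roi_feature(statistical, geometric, image_embedding)
print(feature.shape)  # (21,)
```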
- the immune phenotype determination unit 230 may receive one or more ROIs 612 , 614 , and 616 , and generate the immune phenotype determination result 620 .
- one or more ROIs 612 , 614 , and 616 may include the detection result for target item for the corresponding ROI.
- the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 for the ROI that includes the detection result for target item.
- the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 including the immune phenotype of the corresponding ROI and/or the immune phenotype score of the corresponding ROI.
- the generated immune phenotype determination result 620 and/or one or more ROIs 612 , 614 , and 616 may be provided to the user terminal.
- FIG. 7 is a diagram illustrating an example of outputting the immune phenotype determination result according to an embodiment of the present disclosure.
- the information processing system (e.g., at least one processor of the information processing system) may provide the information associated with the immune phenotype for one or more ROIs to the user terminal, and the user terminal may output the received information through an output device to provide it to the user.
- the information associated with immune phenotype for one or more ROIs may include the immune phenotype score for one or more ROIs.
- the user terminal (e.g., at least one processor of the user terminal) may generate an image including a visual representation corresponding to the one or more immune phenotype scores for the one or more ROIs.
- the one or more immune phenotype scores may include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert.
- the visual representation may include color (e.g., hue, brightness, saturation, and the like), text, image, mark, figure, and the like.
- the user terminal may generate an image including a color with higher saturation for a region corresponding to ROI having a higher immune inflamed score, and generate an image including a color with lower saturation for a region corresponding to ROI having a lower immune inflamed score.
- the user terminal may generate an image including a first visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a first score section, including a second visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a second score section, and including a third visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a third score section.
- the first visual representation, the second visual representation, and the third visual representation may be different from each other.
- the user terminal may output one or more ROIs in the pathology slide image together with an image including a visual representation.
- the user terminal may simultaneously display the image (e.g., the image including a visual representation) generated as described above and at least some regions of the pathology slide image (e.g., the region corresponding to the generated image) on a display device.
- the user terminal may overlay the image including the visual representation on one or more ROIs in the pathology slide image.
- the user terminal may overlap the image (e.g., the image including a visual representation) generated as described above on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display the result on the display device.
- the user terminal may generate an image 710 in which the region corresponding to the ROI having the immune inflamed score falling in the first score section is displayed in white, the region corresponding to the ROI having the immune inflamed score falling in the second score section is displayed in light gray, the region corresponding to the ROI having the immune inflamed score falling in the third score section is displayed in dark gray, and the regions other than the ROIs are displayed in black. Then, the user terminal may arrange, in parallel, the image 710 including a visual representation and a region 720 in the pathology slide image which corresponds to the generated image 710 , and display them together on the user interface.
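- A minimal sketch of the score-section-to-grayscale mapping described above is given below; the section boundaries and gray levels are illustrative assumptions.

```python
import numpy as np

def inflamed_score_map(scores: np.ndarray, roi_mask: np.ndarray) -> np.ndarray:
    """Map per-patch immune inflamed scores to grayscale values:
    high-score section -> white, middle section -> light gray, low section -> dark gray,
    non-ROI regions -> black. The section boundaries (0.66, 0.33) are assumptions."""
    out = np.zeros_like(scores, dtype=np.uint8)                  # black background
    out[roi_mask & (scores >= 0.66)] = 255                       # first score section: white
    out[roi_mask & (scores >= 0.33) & (scores < 0.66)] = 180     # second score section: light gray
    out[roi_mask & (scores < 0.33)] = 90                         # third score section: dark gray
    return out

scores = np.array([[0.9, 0.4], [0.1, 0.7]])
roi_mask = np.array([[True, True], [False, True]])
print(inflamed_score_map(scores, roi_mask))
# [[255 180]
#  [  0 255]]
```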
- the user terminal may further display on the user interface the information associated with immune phenotype, such as text, numerical values, graphs, and the like, for a “total tissue region” (e.g., a tissue region in a pathology slide image), a “cancer region” (e.g., a cancer area in a pathology slide image), an “analyzable region” (e.g., an ROI in the pathology slide image), and an “immune phenotype proportion” (e.g., the proportion of each immune phenotype).
- a region in the pathology slide images, which is displayed on the display device may include the detection result for one or more target items for the corresponding region. That is, when outputting an image of at least some regions of the pathology slide image, the user terminal may output an image of at least some regions displaying the detection results for one or more target items through the display device.
- FIG. 8 is a diagram illustrating an example of outputting an immune phenotype determination result according to an embodiment of the present disclosure.
- the information associated with immune phenotype for one or more ROIs obtained by the user terminal may include the immune phenotypes of the one or more ROIs.
- the user terminal may output an image indicative of the information associated with immune phenotype for one or more ROIs.
- the user terminal may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs.
- the user terminal may generate an image including a visual representation corresponding to the immune phenotype of the corresponding ROI in a region corresponding to one or more ROIs. That is, an image may be generated, which includes the first visual representation in a region corresponding to the ROI having the immune phenotype of immune inflamed, includes the second visual representation in a region corresponding to the ROI having the immune phenotype of immune excluded, and includes the third visual representation in a region corresponding to the ROI having the immune phenotype of immune desert.
- the visual representation may include color (e.g., hue, brightness, saturation, and the like), text, image, mark, figure, and the like, to distinguish each immune phenotype.
- the first visual representation indicating immune inflamed may be red color
- the second visual representation indicating immune excluded may be green color
- the third visual representation indicating immune desert may be blue color.
- the first visual representation indicating immune inflamed may be a circle mark
- the second visual representation indicating immune excluded may be a triangle mark
- the third visual representation indicating immune desert may be an X mark.
- the user terminal may output one or more ROIs in the pathology slide image together with an image including the visual representation.
- the user terminal may display the image (e.g., the image including a visual representation) generated as described above together with at least some regions of the pathology slide image (e.g., the region corresponding to the generated image) on the display device.
- the user terminal may overlay the image including the visual representation on one or more ROIs in the pathology slide image.
- the user terminal may overlap the image (e.g., the image including a visual representation) generated as described above transparently, translucently, or opaquely on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display the result on the display device.
- the user terminal may generate an image that includes a diagonal marker in a region corresponding to the ROI having the immune phenotype of immune inflamed, includes a vertical line marker in a region corresponding to the ROI having the immune phenotype of immune excluded, and includes a horizontal line marker in a region corresponding to the ROI having the immune phenotype of immune desert. Then, the user terminal may display on the user interface an image 810 including the image including a visual representation overlaid on a corresponding region of the pathology slide image.
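- As one possible sketch of overlaying a phenotype-dependent visual representation on the ROIs, the following Python code translucently blends a color per immune phenotype into rectangular ROI regions; the color assignment, the ROI box format, and the blending factor are assumptions.

```python
import numpy as np

# Assumed color coding (one option mentioned above): inflamed=red, excluded=green, desert=blue
PHENOTYPE_COLORS = {
    "inflamed": np.array([255, 0, 0], dtype=np.float32),
    "excluded": np.array([0, 255, 0], dtype=np.float32),
    "desert":   np.array([0, 0, 255], dtype=np.float32),
}

def overlay_phenotypes(slide_rgb: np.ndarray, roi_boxes, alpha: float = 0.4) -> np.ndarray:
    """Translucently blend a phenotype color into each ROI of the slide image.
    roi_boxes: list of (row0, row1, col0, col1, phenotype) tuples (illustrative format)."""
    out = slide_rgb.astype(np.float32).copy()
    for r0, r1, c0, c1, phenotype in roi_boxes:
        color = PHENOTYPE_COLORS[phenotype]
        out[r0:r1, c0:c1] = (1.0 - alpha) * out[r0:r1, c0:c1] + alpha * color
    return out.astype(np.uint8)

slide = np.full((100, 100, 3), 220, dtype=np.uint8)   # stand-in for a pathology slide region
rois = [(10, 40, 10, 40, "inflamed"), (60, 90, 60, 90, "desert")]
blended = overlay_phenotypes(slide, rois)
print(blended.shape)  # (100, 100, 3)
```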
- the user terminal may also display on the user interface an image 820 of at least some regions of the pathology slide image (e.g., region corresponding to an image including a visual representation and/or the entire pathology slide image) separately (e.g., in a form of a minimap).
- the user terminal may display on the user interface the information associated with immune phenotype, such as text, numerical values, graphs, and the like for “analysis summary”, “biomarker findings”, “score” (e.g., immune inflamed score, and the like), “cutoff” (e.g., reference value used in determining the immune phenotype, and the like), “total tissue region”, “cancer region”, “analyzable region”, “immune phenotype proportion”, “tumor infiltrating lymphocyte density” (e.g., immune cell density in the cancer area, immune cell density in the cancer stromal region, and the like).
- the region in the pathology slide images which is displayed on the display device, may include the detection result for one or more target items for the corresponding region. That is, when outputting an image of at least some regions of the pathology slide image, the user terminal may output an image of at least some regions displaying the detection results for one or more target items through the display device.
- FIG. 9 is an exemplary diagram illustrating an artificial neural network model 900 according to an embodiment of the present disclosure.
- an artificial neural network model 900, as an example of the machine learning model, refers to a statistical learning algorithm implemented based on the structure of a biological neural network, or to a structure that executes such an algorithm.
- the artificial neural network model 900 may represent a machine learning model that acquires problem-solving ability as the nodes, which are artificial neurons forming a network through synaptic connections as in a biological neural network, repeatedly adjust their synaptic weights through training so as to reduce the error between a target output corresponding to a specific input and the inferred output.
- the artificial neural network model 900 may include any probability model, neural network model, and the like, that is used in artificial intelligence learning methods such as machine learning and deep learning.
- the artificial neural network model 900 may include an artificial neural network model configured to detect one or more target items from a pathology slide image being inputted. Additionally or alternatively, the artificial neural network model 900 may include an artificial neural network model configured to determine one or more ROIs from an input pathology slide image.
- the artificial neural network model 900 may be implemented as a multilayer perceptron (MLP) formed of multiple nodes and the connections between them.
- the artificial neural network model 900 may be implemented using one of various artificial neural network model structures including the MLP.
- the artificial neural network model 900 includes an input layer 920 receiving an input signal or data 910 from the outside, an output layer 940 outputting an output signal or data 950 corresponding to the input data, and (n) number of hidden layers 930 _ 1 to 930 _ n (where n is a positive integer) positioned between the input layer 920 and the output layer 940 to receive a signal from the input layer 920 , extract the features, and transmit the features to the output layer 940 .
- the output layer 940 receives signals from the hidden layers 930 _ 1 to 930 _ n and outputs them to the outside.
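- For illustration, a multilayer perceptron with an input layer, hidden layers, and an output layer of the kind shown in FIG. 9 may be sketched in PyTorch as follows; the layer sizes, the number of hidden layers, and the three-class output are assumptions, and the disclosure is not limited to this structure.

```python
import torch
import torch.nn as nn

class SimpleMLP(nn.Module):
    """Minimal multilayer perceptron: an input layer, n hidden layers, and an output layer."""
    def __init__(self, in_dim: int = 21, hidden_dim: int = 64, n_hidden: int = 2, out_dim: int = 3):
        super().__init__()
        layers = [nn.Linear(in_dim, hidden_dim), nn.ReLU()]
        for _ in range(n_hidden - 1):
            layers += [nn.Linear(hidden_dim, hidden_dim), nn.ReLU()]
        layers.append(nn.Linear(hidden_dim, out_dim))   # e.g., 3 outputs: inflamed / excluded / desert
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SimpleMLP()
dummy_input = torch.randn(4, 21)        # a batch of 4 ROI feature vectors (assumed dimension)
print(model(dummy_input).shape)         # torch.Size([4, 3])
```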
- the method of training the artificial neural network model 900 includes supervised learning, which trains the model to solve a problem using teacher signals (correct answers) as inputs, and unsupervised learning, which does not require a teacher signal.
- the information processing system may train the artificial neural network model 900 by supervised learning and/or unsupervised learning to detect one or more target items from a pathology slide image.
- the information processing system may train the artificial neural network model 900 by supervised learning to detect one or more target items from the pathology slide image by using a reference pathology slide image and label information for one or more reference target items.
- the information processing system may train the artificial neural network model 900 by supervised learning and/or unsupervised learning to determine one or more ROIs from the pathology slide image.
- the information processing system may train the artificial neural network model 900 by supervised learning to determine one or more ROIs from the pathology slide image using the reference pathology slide image and/or the detection result for one or more target items for the reference pathology slide image (e.g., reference pathology slide image including the detection result for target item), and the label information on the reference ROI.
- the ROI and/or the reference ROI may include at least some regions in the pathology slide image and/or the reference pathology slide image that satisfy the condition associated with one or more target items.
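- A hedged sketch of such supervised training with reference features and label information is shown below; the synthetic data, the cross-entropy loss, and the Adam optimizer are illustrative assumptions rather than prescribed choices.

```python
import torch
import torch.nn as nn

# Synthetic "reference" data standing in for labeled ROIs:
# per-ROI feature vectors and reference immune phenotype labels.
features = torch.randn(128, 21)
labels = torch.randint(0, 3, (128,))     # 0=inflamed, 1=excluded, 2=desert (assumed encoding)

model = nn.Sequential(nn.Linear(21, 64), nn.ReLU(), nn.Linear(64, 3))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):                  # number of epochs is arbitrary
    optimizer.zero_grad()
    logits = model(features)
    loss = criterion(logits, labels)     # error between inferred output and target output
    loss.backward()                      # compute gradients of the error
    optimizer.step()                     # adjust weights (synaptic values) to reduce the error
print(float(loss))
```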
- the artificial neural network model 900 trained as described above may be stored in a memory (not illustrated) of the information processing system, and in response to an input for the pathology slide image received from the communication module and/or the memory, may detect one or more target items in the pathology slide image. Additionally or alternatively, the artificial neural network model 900 may determine one or more ROIs from the pathology slide image, in response to the detection result for the one or more target items for the pathology slide image and/or an input to the pathology slide image.
- the input variable of the artificial neural network model for detecting the target item may be one or more pathology slide images (e.g., H&E-stained pathology slide images, IHC-stained pathology slide images).
- the input variable input to the input layer 920 of the artificial neural network model 900 may be the image vector 910 which may be one or more pathology slide images configured as one vector data element.
- the output variable output from the output layer 940 of the artificial neural network model 900 in response to the input of the image may be a vector 950 representing or characterizing one or more target items detected in the pathology slide image. That is, the output layer 940 of the artificial neural network model 900 may be configured to output a vector representing or characterizing one or more target items detected from the pathology slide image.
- the output variable of the artificial neural network model 900 is not limited to the types described above, and may include any information/data indicative of one or more target items detected from the pathology slide image.
- the output layer 940 of the artificial neural network model 900 may be configured to output a vector indicative of the reliability and/or accuracy of the output detection result for target item and the like.
- the input variable of the machine learning model for determining the ROI may be the detection result for one or more target items for the pathology slide image (e.g., detection data for the target item in the pathology slide image) and/or the pathology slide image.
- the input variable input to the input layer 920 of the artificial neural network model 900 may be the image vector 910 which may be the detection result for the one or more target items for the pathology slide image and/or the pathology slide image configured as one vector data element.
- An output variable output from the output layer 940 of the artificial neural network model 900 in response to input for the detection result for one or more target items for the pathology slide image and/or the pathology slide image may be the vector 950 that indicates or characterizes one or more ROIs.
- the output variable of the artificial neural network model 900 is not limited to the types described above, and may include any information/data indicative of one or more ROIs.
- the input layer 920 and the output layer 940 of the artificial neural network model 900 are matched with a plurality of input variables and a plurality of corresponding output variables, respectively, and the synaptic values between the nodes included in the input layer 920 , the hidden layers 930 _ 1 to 930 _ n , and the output layer 940 are adjusted through training so that a correct output corresponding to a specific input can be extracted.
- the features hidden in the input variables of the artificial neural network model 900 may be confirmed, and the synaptic values (or weights) between the nodes of the artificial neural network model 900 may be adjusted so as to reduce the errors between the output variable calculated based on the input variable and the target output.
- by using the artificial neural network model 900 , the detection result for target item may be output in response to input of the pathology slide image. Additionally or alternatively, by using the artificial neural network model 900 , one or more ROIs may be output in response to input of the pathology slide image and/or the detection result for one or more target items for the pathology slide image (e.g., the pathology slide image including the detection result for target item).
- FIG. 10 is a configuration diagram of an exemplary computing device 1000 (e.g., user terminal) that provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- the computing device 1000 may include one or more processors 1010 , a bus 1030 , a communication interface 1040 , a memory 1020 that loads a computer program 1060 executable by the processors 1010 , and a storage module 1050 storing the computer program 1060 .
- In FIG. 10 , only the components related to the embodiment of the present disclosure are illustrated. Accordingly, those of ordinary skill in the art to which the present disclosure pertains will be able to recognize that other general-purpose components may be further included in addition to the components shown in FIG. 10 .
- the processors 1010 control the overall operation of components of the computing device 1000 .
- the processors 1010 may be configured to include a central processing unit (CPU), a microprocessor unit (MPU), a micro controller unit (MCU), a graphic processing unit (GPU), or any type of processor well known in the technical field of the present disclosure.
- the processors 1010 may perform an arithmetic operation on at least one application or program for executing the method according to the embodiments of the present disclosure.
- the computing device 1000 may include one or more processors.
- the memory 1020 may store various types of data, commands, and/or information.
- the memory 1020 may load one or more computer programs 1060 from the storage module 1050 in order to execute a method/operation according to various embodiments of the present disclosure.
- the memory 1020 may be implemented as a volatile memory such as RAM, but the technical scope of the present disclosure is not limited thereto.
- the bus 1030 may provide a communication function between components of the computing device 1000 .
- the bus 1030 may be implemented as various types of buses such as an address bus, a data bus, a control bus, or the like.
- the communication interface 1040 may support wired/wireless Internet communication of the computing device 1000 .
- the communication interface 1040 may support various other communication methods in addition to the Internet communication.
- the communication interface 1040 may be configured to include a communication module well known in the technical field of the present disclosure.
- the storage module 1050 may non-transitorily store one or more computer programs 1060 .
- the storage module 1050 may be configured to include a nonvolatile memory such as a read only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, and the like, a hard disk, a detachable disk, or any type of computer-readable recording medium well known in the art to which the present disclosure pertains.
- the computer program 1060 may include one or more instructions that, when loaded into the memory 1020 , cause the processors 1010 to perform an operation/method in accordance with various embodiments of the present disclosure. That is, the processors 1010 may perform operations/methods according to various embodiments of the present disclosure by executing one or more instructions.
- the computer program 1060 may include one or more instructions for causing the following operations to be performed: obtaining information associated with immune phenotype for one or more ROIs in the pathology slide image; generating an image indicative of information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs; outputting the image indicative of the information associated with immune phenotype, and the like.
- a system for predicting a response to immune checkpoint inhibitor may be implemented through the computing device 1000 .
- FIG. 11 is a diagram illustrating an example of outputting a detection result for target item according to an embodiment of the present disclosure.
- the processor (e.g., at least one processor of the user terminal) may obtain the detection result for one or more target items for the pathology slide image. For example, the information processing system may use a target item detection model to detect one or more target items from the pathology slide image, and provide the detection result for target item to the user terminal.
- the target item detection model may include a model trained to detect one or more reference target items from the reference pathology slide image.
- the processor may generate an image indicating the detection result for one or more target items, and output the generated image indicating the detection result for one or more target items.
- the processor may generate and output an image including visual representations capable of distinguishing respective target items, based on the detection result for one or more target items from the pathology slide image.
- the processor may generate an image including a visual representation indicating each of the target items (e.g., cancer area, cancer stromal region, blood vessel, cancer cell, immune cell, positive/negative cells according to the expression amount of the biomarker, and the like) that may be considered in determining the immune phenotype.
- the processor may generate an image including a segmentation map for the target item in units of regions, a contour for the target item having a specific structure (e.g., a blood vessel, and the like), a center point of the target item in units of cells, or a contour indicative of the shape of the target item in units of cells, and the like.
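- As a non-limiting sketch, a region-level segmentation map and cell-level center points might be rendered as an RGB visualization as follows; the class encoding and the colors are assumptions.

```python
import numpy as np

# Assumed class encoding for a region-level segmentation map
CLASS_COLORS = {
    0: (0, 0, 0),         # background other than the target items
    1: (200, 120, 200),   # cancer epithelium area
    2: (120, 200, 120),   # cancer stroma area
}

def render_detection(label_map: np.ndarray, cell_centers, cell_color=(255, 255, 0)) -> np.ndarray:
    """Render a region-level segmentation map and cell-level center points as an RGB image."""
    h, w = label_map.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    for cls, color in CLASS_COLORS.items():
        rgb[label_map == cls] = color
    for r, c in cell_centers:            # e.g., detected immune-cell center points
        rgb[r, c] = cell_color
    return rgb

label_map = np.zeros((50, 50), dtype=np.int64)
label_map[10:30, 10:30] = 1
label_map[30:45, 10:40] = 2
img = render_detection(label_map, cell_centers=[(15, 15), (35, 20)])
print(img.shape)  # (50, 50, 3)
```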
- the processor may output one or more ROIs in the pathology slide image and an image indicative of the detection result for one or more target items for one or more ROIs.
- the processor may output the image generated as described above (e.g., the image including a visual representation) together with at least some regions of the pathology slide image (e.g., region corresponding to the generated image).
- the processor may overlay the image indicative of the detection result for one or more target items (e.g., the image including a visual representation) on one or more ROIs in the pathology slide image.
- the user terminal may overlap the image (e.g., the image including a visual representation) generated as described above on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display it on a display device connected to the user terminal.
- the processor may display on the user interface an image 1110 displaying a first color indicating a background other than the target item, a second color indicating a cancer epithelium area, a third color indicating a cancer stromal region, a fourth color indicating an immune cell (e.g., a central point of an immune cell), and a fifth color indicating a cancer cell (e.g., a center point of a cancer cell) on the corresponding regions of the pathology slide image.
- the image 1110 may correspond to an image obtained by overlaying (or merging) the image indicative of the detection result for one or more target items on a corresponding region of the pathology slide image.
- the first color, the second color, the third color, the fourth color and/or the fifth color may correspond to different colors that can be distinguished from each other.
- the user terminal may display on the user interface visual representations corresponding to respective target items (e.g., color information corresponding to respective target items, and the like), text, numerical values, markers, graphs, and the like for “analysis summary”. Additionally or alternatively, the user terminal may output a user interface through which the user can select a target item to be displayed with a visual representation in a corresponding region in the pathology slide image. In FIG. 11 , color is used as an example of the visual representation to indicate the information associated with immune phenotype, but embodiments are not limited thereto.
- FIG. 12 is a diagram illustrating an example of outputting a detection result for target item and an immune phenotype determination result according to an embodiment of the present disclosure.
- the processor (e.g., at least one processor of the user terminal) may output an image indicative of the detection result for one or more target items.
- the processor may overlay (or merge) an image indicative of the information associated with immune phenotype for one or more ROIs and/or the detection result for one or more target items on a corresponding region of the pathology slide image and output the result.
- the processor may display on the user interface an image 1210 in which a first color indicative of immune desert, a second color indicative of immune excluded, a third color indicative of immune inflamed, a fourth color indicative of a cancer epithelium region, a fifth color indicative of a cancer stromal region, a sixth color indicative of an immune cell (e.g., a center point of an immune cell), and a seventh color indicative of a cancer cell (e.g., a center point of a cancer cell) are displayed on the corresponding regions in the pathology slide image.
- the image 1210 may correspond to an image in which an image (e.g., an immune phenotype map) indicative of the information associated with immune phenotype for one or more ROIs and an image indicative of the detection result for one or more target items are overlaid on (or merged with) the corresponding region in the pathology slide image.
- the first color, the second color, the third color, the fourth color, the fifth color, the sixth color and/or the seventh color may correspond to different colors that can be distinguished from each other.
- the processor may also display on the user interface an image 1220 of at least some regions of the pathology slide image (e.g., regions corresponding to the image including the visual representation and/or the entire pathology slide image) separately (e.g., in a form of a minimap). Additionally or alternatively, as the information associated with immune phenotype, the processor may display on the user interface the visual representations corresponding to the respective target items (e.g., information on color corresponding to respective target items, and the like), text, numerical values, graphs, and the like for “analysis summary”.
- a user interface may be output, through which the user may select the information associated with target item and/or immune phenotype to mark with a visual representation, in a corresponding region in the pathology slide image.
- In FIG. 12 , color is used as an example of the visual representation to indicate the information associated with immune phenotype and/or target item, but embodiments are not limited thereto.
- Although example implementations may refer to utilizing aspects of the presently disclosed subject matter in the context of one or more standalone computer systems, the subject matter is not so limited, and may be implemented in conjunction with any computing environment, such as a network or a distributed computing environment. Furthermore, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be distributed across a plurality of devices. Such devices may include PCs, network servers, and handheld devices.
Abstract
The present disclosure relates to a method, performed by at least one computing device, for providing information associated with immune phenotype for pathology slide image. The method may include obtaining information associated with immune phenotype for one or more regions of interest (ROIs) in a pathology slide image, generating, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype, and outputting the image indicative of the information associated with immune phenotype.
Description
- This application is a continuation of International Application No. PCT/KR2021/005772 filed on May 7, 2021 which claims priority to Korean Patent Application No. 10-2020-0055483 filed on May 8, 2020, Korean Patent Application No. 10-2021-0054206 filed on Apr. 27, 2021, and Korean Patent Application No. 10-2021-0059519 filed on May 7, 2021, the entire contents of which are herein incorporated by reference.
- The present disclosure relates to a method and a device for providing information associated with immune phenotype for pathology slide image, and more specifically, to a method and a device for generating and outputting an image indicative of information associated with immune phenotype for one or more regions of interest (ROIs) in a pathology slide image.
- Recently, a third-generation anticancer drug for cancer treatment, that is, the immune checkpoint inhibitor that utilizes the immune system of the patient's body, has gained attention. The immune anticancer drug may refer to any drug that prevents cancer cells from evading the body's immune system or helps immune cells better recognize and attack cancer cells. Since it acts through the body's immune system, it causes fewer side effects than other anticancer drugs, and the survival period of cancer patients treated with the immune checkpoint inhibitor may be longer than when treated with other anticancer drugs. However, the immune checkpoint inhibitor is not always effective for all cancer patients. Therefore, it is important to predict the response rate of the immune checkpoint inhibitor in order to predict the effect of the immune checkpoint inhibitor on a given cancer patient.
- Meanwhile, in order to predict the response to the immune checkpoint inhibitor, a user (e.g., a doctor, a patient, and the like) may be provided with immune response information generated from a pathology slide image of a patient's tissue. According to the related art, the user (e.g., doctor, patient, and the like) may be provided with information on the immune response (e.g., immune cell expression information, and the like) for each of the plurality of patches included in the pathology slide image in order to predict the responsiveness to the immune checkpoint inhibitor. In this case, it may be difficult for the user to intuitively recognize the immune response information for each of the numerous patches included in the pathology slide image. In addition, the immune response information may also be generated for patches that are substantially unnecessary for predicting the response to the immune checkpoint inhibitor.
- The present disclosure provides a method and a device for providing information associated with immune phenotype for pathology slide image to solve the problems described above.
- The present disclosure may be implemented in various ways, including a method, a device (system), a computer readable storage medium storing instructions, or a computer program.
- According to an embodiment of the present disclosure, a method, performed by at least one computing device, for providing information associated with immune phenotype for pathology slide image includes obtaining information associated with immune phenotype for one or more regions of interest (ROIs) in a pathology slide image, generating, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype, and outputting the image indicative of the information associated with immune phenotype.
- According to an embodiment, the one or more ROIs are determined based on a detection result for one or more target items for the pathology slide image.
- According to an embodiment, the one or more ROIs include at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items.
- According to an embodiment, the one or more ROIs are regions being output upon input of the detection result for the one or more target items for the pathology slide image or the pathology slide image to an ROI extraction model, and the ROI extraction model is trained to output a reference ROI upon input of the detection result for one or more target items for a reference pathology slide image or a reference pathology slide image.
- According to an embodiment, the obtaining includes obtaining the immune phenotype of the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the immune phenotype of the one or more ROIs, and the immune phenotype includes at least one of immune inflamed, immune excluded, or immune desert.
- According to an embodiment, the obtaining includes obtaining one or more immune phenotype scores for the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the one or more immune phenotype scores, and the one or more immune phenotype scores include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert.
- According to an embodiment, the obtaining includes obtaining a feature associated with one or more immune phenotypes for the one or more ROIs, the generating includes generating an image including a visual representation corresponding to the feature associated with the one or more immune phenotypes, and the feature associated with the one or more immune phenotypes includes at least one of a statistical value or a vector associated with the immune phenotype.
- According to an embodiment, the outputting includes outputting one or more ROIs in the pathology slide image together with an image including a visual representation.
- According to an embodiment, the outputting may include overlaying the image including the visual representation on one or more ROIs in the pathology slide image.
- According to an embodiment, the method further includes obtaining a detection result for one or more target items from the pathology slide image, generating an image indicative of the detection result for one or more target items, and outputting the image indicative of the detection result for one or more target items.
- There may be provided a computer program stored in a computer-readable recording medium for executing, on a computer, the method for providing the information associated with the immune phenotype for the pathology slide image described above according to an embodiment of the present disclosure.
- A computing device according to an embodiment may include a memory storing one or more instructions, and a processor configured to execute the stored one or more instructions to obtain information associated with immune phenotype for one or more ROIs in the pathology slide image, generate, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype, and output the image indicative of the information associated with the immune phenotype.
- According to some embodiments of the present disclosure, by providing the user with an image visually representing the information associated with immune phenotype, it is possible to enable the user to intuitively recognize the information associated with immune phenotype for each region. In addition, by providing a visual representation indicative of the information associated with immune phenotype by overlaying it on a corresponding ROI in a pathology slide image, it is possible to enable the user to recognize at a glance which region of the pathology slide image corresponds to the information indicated by the corresponding visual representation.
- According to some embodiments of the present disclosure, in order to determine the immune phenotype and/or response or non-response to the immune checkpoint inhibitor, the ROI in the pathology slide image which actually requires analysis may be determined. That is, instead of analyzing the entire pathology slide image, the information processing system and/or the user terminal may perform processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the ROIs while excluding the regions where analysis is unnecessary, such that computer resources, processing costs, and the like can be minimized.
- According to some embodiments of the present disclosure, more accurate results can be provided by processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the significant region when determining the immune phenotype and/or determining response or non-response to the immune checkpoint inhibitor.
- The effects of the present disclosure are not limited to the effects described above, and other effects not described will be able to be clearly understood by those of ordinary skill in the art (hereinafter, referred to as “ordinary technician”) from the description of the claims.
- Embodiments of the present disclosure will be described with reference to the accompanying drawings described below, in which like reference numerals denote like elements, but are not limited thereto.
- FIG. 1 is an exemplary configuration diagram illustrating a system in which an information processing system provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- FIG. 2 is a block diagram illustrating an internal configuration of the information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram illustrating an internal configuration of a user terminal according to an embodiment of the present disclosure.
- FIG. 4 is a flowchart illustrating a method for providing information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example of determining one or more ROIs in a pathology slide image according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of generating an immune phenotype determination result according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of outputting the immune phenotype determination result according to an embodiment.
- FIG. 8 is a diagram illustrating an example of outputting an immune phenotype determination result according to an embodiment of the present disclosure.
- FIG. 9 is an exemplary diagram illustrating an artificial neural network model according to an exemplary embodiment.
- FIG. 10 is a configuration diagram of an exemplary computing device (e.g., user terminal) that provides information associated with immune phenotype for pathology slide image according to an embodiment.
- FIG. 11 is a diagram illustrating an example of outputting a detection result for target item according to an embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of outputting a detection result for target item and an immune phenotype determination result according to an embodiment.
- Hereinafter, specific details for the practice of the present disclosure will be described in detail with reference to the accompanying drawings. However, in the following description, detailed descriptions of well-known functions or configurations will be omitted when it may make the subject matter of the present disclosure rather unclear.
- In the accompanying drawings, the same or corresponding elements are assigned the same reference numerals. In addition, in the following description of the embodiments, duplicate descriptions of the same or corresponding components may be omitted. However, even if descriptions of elements are omitted, it is not intended that such elements are not included in any embodiment.
- Advantages and features of the disclosed embodiments and methods of accomplishing the same will be apparent by referring to embodiments described below in connection with the accompanying drawings. However, the present disclosure is not limited to the embodiments disclosed below, and may be implemented in various different forms, and the present embodiments are merely provided to make the present disclosure complete, and to fully disclose the scope of the invention to those skilled in the art to which the present disclosure pertains.
- The terms used herein will be briefly described prior to describing the disclosed embodiments in detail. The terms used herein have been selected as general terms which are widely used at present in consideration of the functions of the present disclosure, and this may be altered according to the intent of an operator skilled in the art, conventional practice, or introduction of new technology. In addition, in a specific case, a term is arbitrarily selected by the applicant, and the meaning of the term will be described in detail in a corresponding description of the embodiments. Therefore, the terms used in the present disclosure should be defined based on the meaning of the terms and the overall contents of the present disclosure rather than a simple name of each of the terms.
- As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Likewise, the plural forms are intended to include the singular forms as well, unless the context clearly indicates otherwise. As used throughout the description, when one part is referred to as "comprising" (or "including" or "having") other elements, the part may comprise (or include or have) those elements alone or together with other elements, unless specifically described otherwise.
- Further, the term "module" or "unit" used herein refers to a software or hardware component, and a "module" or "unit" performs certain roles. However, the meaning of the "module" or "unit" is not limited to software or hardware. The "module" or "unit" may be configured to reside in an addressable storage medium or configured to execute on one or more processors. Accordingly, as an example, the "module" or "unit" may include components such as software components, object-oriented software components, class components, and task components, and at least one of processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, micro-codes, circuits, data, a database, data structures, tables, arrays, and variables. Furthermore, functions provided in the components and the "modules" or "units" may be combined into a smaller number of components and "modules" or "units", or further divided into additional components and "modules" or "units."
- According to an embodiment, the “module” or “unit” may be implemented as a processor and a memory. The “processor” should be interpreted broadly to encompass a general-purpose processor, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a controller, a microcontroller, a state machine, and so forth. Under some circumstances, the “processor” may refer to an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a field-programmable gate array (FPGA), and so on. The “processor” may refer to a combination of processing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other combination of such configurations. In addition, the “memory” should be interpreted broadly to encompass any electronic component capable of storing electronic information. The “memory” may refer to various types of processor-readable media such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable PROM (EEPROM), flash memory, magnetic or optical data storage, registers, and so on. The memory is said to be in electronic communication with a processor if the processor can read information from and/or write information to the memory. The memory integrated with a processor is in electronic communication with the processor.
- In the present disclosure, the "system" may refer to at least one of a server device and a cloud device, but is not limited thereto. For example, the system may include one or more server devices. As another example, the system may include one or more cloud devices. As another example, the system may be configured and operated with both a server device and a cloud device.
- In the present disclosure, "target data" may refer to any data or data item that can be used for training of a machine learning model, and may include, for example, data indicative of an image, data indicative of voice or voice characteristics, and the like, but is not limited thereto. In the present disclosure, the whole pathology slide image and/or at least one patch (or region) included in the pathology slide image is explained as the target data, but the target data is not limited thereto, and any data that can be used for training a machine learning model may correspond to the target data. In addition, the target data may be tagged with label information through an annotation task.
- In the present disclosure, the “pathology slide image” refers to an image obtained by capturing a pathological slide fixed and stained through a series of chemical treatments in order to observe a tissue removed from a human body with a microscope. For example, the pathology slide image may refer to a digital image captured with a microscope, and may include information on cells, tissues, and/or structures in the human body. In addition, the pathology slide image may include one or more patches, and the one or more patches may be tagged with label information (e.g., information on immune phenotype) through the annotation work. For example, the “pathology slide image” may include H&E-stained tissue slides and/or IHC-stained tissue slides, but is not limited thereto, and tissue slides applied with various staining methods (e.g., chromogenic in situ hybridization (CISH), Fluorescent in situ hybridization (FISH), Multiplex IHC, and the like), or unstained tissue slides may also be included. As another example, the “pathology slide image” may be a patient's tissue slide generated to predict a response to immune checkpoint inhibitor, and it may include a tissue slide of a patient before treatment with immune checkpoint inhibitor and/or a tissue slide of a patient after treatment with immune checkpoint inhibitor.
- In the present disclosure, a "biomarker" may be defined as a marker that can objectively measure a normal or pathological state, a degree of response to a drug, and the like. For example, the biomarker may include PD-L1, which is a biomarker associated with immune checkpoint inhibitors, but is not limited thereto, and may include a Tumor Mutation Burden (TMB) value, a Microsatellite Instability (MSI) value, a Homologous Recombination Deficiency (HRD) value, CD3, CD8, CD68, FOXP3, CD20, CD4, CD45, CD163, and other various biomarkers related to immune cells.
- In the present disclosure, the “patch” may refer to a small region within the pathology slide image. For example, the patch may include a region corresponding to a semantic object extracted by performing segmentation on the pathology slide image. As another example, the patch may refer to a combination of pixels associated with the label information generated by analyzing the pathology slide image.
- In the present disclosure, the “regions of interest (ROIs)” may refer to at least some regions to be analyzed in the pathology slide image. For example, the ROIs may refer to at least some regions in the pathology slide image that include a target item. As another example, the ROIs may refer to at least some of a plurality of patches generated by segmenting the pathology slide image.
- In the present disclosure, a "machine learning model" and/or an "artificial neural network model" may include any model that is used for inferring an answer to a given input. According to an embodiment, the machine learning model may include an artificial neural network model including an input layer, a plurality of hidden layers, and an output layer. In an example, each layer may include a plurality of nodes. For example, the machine learning model may be trained to infer label information for pathology slide images and/or at least one patch included in the pathology slide images. In this case, the label information generated through the annotation task may be used to train the machine learning model. In addition, the machine learning model may include weights associated with a plurality of nodes included in the machine learning model. In an example, the weight may include any parameter associated with the machine learning model.
- In the present disclosure, “training” may refer to any process of changing a weight associated with the machine learning model using at least one patch and the label information. According to an embodiment, the training may refer to a process of changing or updating weights associated with the machine learning model through one or more of forward propagation and backward propagation of the machine learning model using at least one patch and the label information.
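- As a minimal, non-authoritative illustration of such a weight update, the sketch below performs one forward and backward propagation step for a small artificial neural network. The layer sizes, optimizer, loss function, and stand-in data are assumptions introduced for this sketch and are not part of the disclosed embodiments.

```python
# Minimal sketch (not the disclosed implementation): one training step that
# changes the weights of a small artificial neural network through forward and
# backward propagation. Layer sizes, loss, and optimizer are assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(                 # input layer -> hidden layers -> output layer
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 3),                  # e.g., three label classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def training_step(patch_features: torch.Tensor, labels: torch.Tensor) -> float:
    """One weight update using a batch of patch features and their label information."""
    optimizer.zero_grad()
    logits = model(patch_features)     # forward propagation
    loss = loss_fn(logits, labels)     # error between deduced output and target output
    loss.backward()                    # backward propagation of the error
    optimizer.step()                   # weights associated with the nodes are updated
    return loss.item()

# Usage with random stand-in tensors (real inputs would be patch features and labels).
print(training_step(torch.randn(8, 64), torch.randint(0, 3, (8,))))
```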
- In the present disclosure, the "label information" refers to the correct answer information for a data sample, which is obtained as a result of the annotation task. The label or label information may be used interchangeably with terms such as annotation, tag, and so on as used in the art. In the present disclosure, the "annotation" may refer to an annotation work and/or annotation information (e.g., label information, and the like) determined by performing the annotation work. In the present disclosure, the "annotation information" may refer to information for the annotation work and/or information generated by the annotation work (e.g., label information).
- In the present disclosure, the “target item” may refer to data/information, an image region, an object, and the like to be detected in the pathology slide image. According to an embodiment, the target item may include a target to be detected from the pathology slide image for diagnosis, treatment, prevention, or the like of a disease (e.g., cancer). For example, the “target item” may include a target item in units of cells and a target item in units of areas.
- In the present disclosure, “each of a plurality of A” and/or “respective ones of a plurality of A” may refer to each of all components included in the plurality of A, or may refer to each of some of the components included in a plurality of A. For example, each of the plurality of ROIs may refer to each of all ROIs included in the plurality of ROIs or may refer to each of some ROIs included in the plurality of ROIs.
- In the present disclosure, “instructions” may refer to one or more instructions grouped based on functions, which are the components of a computer program and executed by the processor.
- In the present disclosure, a “user” may refer to a person who uses a user terminal. For example, the user may include an annotator who performs an annotation work. As another example, the user may include a doctor, a patient, and the like who is provided with the information associated with immune phenotype and/or a prediction result of a response to immune checkpoint inhibitor (e.g., a prediction result as to whether or not the patient responds to immune checkpoint inhibitor). In addition, the user may refer to the user terminal, or conversely, the user terminal may refer to the user. That is, the user and the user terminal may be interchangeably used herein.
-
FIG. 1 is an exemplary configuration diagram illustrating a system in which an information processing system 100 provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure. As illustrated, the system for providing information associated with immune phenotype for pathology slide image may include the information processing system 100, a user terminal 110, and a storage system 120. In an example, the information processing system 100 may be configured to be connected to each of the user terminal 110 and the storage system 120 for communication. While FIG. 1 illustrates one user terminal 110, the present disclosure is not limited thereto, and in an exemplary configuration, a plurality of user terminals 110 may be connected to the information processing system 100 for communication. In addition, while the information processing system 100 is shown as one computing device in FIG. 1, embodiments are not limited thereto, and the information processing system 100 may be configured to process information and/or data in a distributed manner through a plurality of computing devices. In addition, while the storage system 120 is shown as a single device in FIG. 1, embodiments are not limited thereto, and the system may be configured with a plurality of storage devices or as a system that supports a cloud. In addition, the respective components of the system for providing information associated with immune phenotype of a pathology slide image illustrated in FIG. 1 represent functional components divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other. - The
information processing system 100 and the user terminal 110 are any computing devices used to generate and provide the information associated with immune phenotype for pathology slide image. In an example, the computing device may refer to any type of device equipped with a computing function, and may be a notebook, a desktop, a laptop, a server, a cloud system, and the like, for example, but is not limited thereto. - The
information processing system 100 may receive a pathology slide image. For example, the information processing system 100 may receive the pathology slide image from the storage system 120 and/or the user terminal 110. The information processing system 100 may generate information associated with immune phenotype of the pathology slide image and provide it to the user 130 through the user terminal 110. In an embodiment, the information processing system 100 may determine one or more ROIs in the pathology slide image and generate information associated with immune phenotype for the one or more ROIs. In this example, the information associated with immune phenotype may include at least one of an immune phenotype of one or more ROIs, an immune phenotype score of one or more ROIs, and a feature associated with the immune phenotype of one or more ROIs. - In an embodiment, the
information processing system 100 may determine one or more ROIs based on a detection result for one or more target items for the pathology slide image. For example, the information processing system 100 may determine one or more ROIs including at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items. Additionally or alternatively, upon input of the detection result for one or more target items for a pathology slide image (e.g., a pathology slide image including a detection result for target item) to a region-of-interest (ROI) extraction model, the information processing system 100 may determine the regions being output as one or more ROIs. In this example, the ROI extraction model may correspond to a model trained to output a reference ROI when the detection result for one or more target items for reference pathology slide image (e.g., a reference pathology slide image including the detection result for target item) is input. - The
user terminal 110 may obtain from the information processing system 100 the information associated with immune phenotype for one or more ROIs in the pathology slide image. For example, the information associated with immune phenotype for one or more ROIs may include an immune phenotype (e.g., at least one of immune inflamed, immune excluded, or immune desert) for one or more ROIs. Additionally or alternatively, the information associated with immune phenotype for one or more ROIs may include one or more immune phenotype scores for the one or more ROIs (e.g., at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert). Additionally or alternatively, the information associated with immune phenotype for one or more ROIs may include a feature associated with one or more immune phenotypes for one or more ROIs (e.g., at least one of a statistical value or a vector associated with the immune phenotype). - Then, the
user terminal 110 may generate an image indicative of the information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs. In an embodiment, the user terminal 110 may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs. In another embodiment, the user terminal 110 may generate an image including a visual representation corresponding to the one or more immune phenotype scores. For example, the visual representation may include color (e.g., color, brightness, saturation, and the like), text, image, mark, figure, and the like. - The
user terminal 110 may output the generated image indicative of the information associated with immune phenotype. In an embodiment, the user terminal 110 may output one or more ROIs in the pathology slide image together with an image including a visual representation. That is, the one or more ROIs in the pathology slide image and the image including the visual representation may be displayed together on the display device associated with the user terminal 110. In another embodiment, the user terminal 110 may overlay the image including the visual representation on one or more ROIs in the pathology slide image. That is, one or more ROIs in the pathology slide image overlaid with the image including the visual representation may be displayed on the display device associated with the user terminal 110. Accordingly, the user 130 (e.g., doctor, patient, and the like) may be provided with an image indicative of information associated with immune phenotype through the user terminal 110. - The
storage system 120 is a device or a cloud system that stores and manages pathology slide images associated with a target patient and various data associated with a machine learning model to provide information associated with immune phenotype for the pathology slide image. For efficient data management, the storage system 120 may store and manage various types of data using a database. In this example, the various data may include any data associated with the machine learning model, and include, for example, a file of the target data, meta information of the target data, label information for the target data that is the result of the annotation work, data related to the annotation work, a machine learning model (e.g., an artificial neural network model), and the like, but are not limited thereto. While FIG. 1 shows the information processing system 100 and the storage system 120 as separate systems, embodiments are not limited thereto, and they may be incorporated into one system. - According to some embodiments of the present disclosure, by providing the
user 130 with an image visually representing the information associated with immune phenotype, it is possible to enable the user 130 to intuitively recognize the information associated with immune phenotype for each region. In addition, according to some embodiments of the present disclosure, in order to determine the immune phenotype and/or response or non-response to the immune checkpoint inhibitor, the ROI in the pathology slide image which actually requires analysis may be determined. That is, instead of analyzing the entire pathology slide image, the information processing system 100 and/or the user terminal 110 may perform processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the ROIs while excluding the regions where analysis is unnecessary, such that computer resources, processing costs, and the like can be minimized. In addition, more accurate prediction results may be provided by processing (e.g., determining an immune phenotype and/or calculating an immune phenotype score, and the like) only on the significant region when determining the immune phenotype and/or determining response or non-response to the immune checkpoint inhibitor. -
FIG. 2 is a block diagram illustrating an internal configuration of the information processing system 100 according to an embodiment of the present disclosure. In order to provide the information associated with immune phenotype for the pathology slide image, the information processing system 100 may generate the information associated with immune phenotype for the pathology slide image. According to an embodiment, as illustrated, the information processing system 100 may include a target item detection unit 210, an ROI determination unit 220, and an immune phenotype determination unit 230. Respective components of the information processing system 100 illustrated in FIG. 2 represent functional components that can be divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other. - The target
item detection unit 210 may receive the pathology slide image (e.g., an H&E-stained pathology slide image, an IHC-stained pathology slide image, and the like), and detect one or more target items in the received pathology slide image. In an embodiment, the target item detection unit 210 may use the artificial neural network model for target item detection to detect one or more target items in the pathology slide image. In this example, the artificial neural network model for target item detection may correspond to a model trained to detect one or more reference target items from the reference pathology slide image. For example, the target item detection unit 210 may detect a target item in units of cells and/or a target item in units of regions in the pathology slide image. That is, the target item detection unit 210 may detect tumor cells, lymphocytes, macrophages, dendritic cells, fibroblasts, endothelial cells, blood vessels, cancer stroma, cancer epithelium, cancer area, normal area (e.g., normal lymph node architecture region), and the like, as the target items in the pathology slide image.
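- By way of a non-limiting illustration only, the sketch below shows how a trained detection model might be applied to a pathology slide image tile by tile to obtain such a detection result. The callable detection_model, the tile size, and the per-pixel class map format are assumptions introduced for this sketch and are not part of the disclosed artificial neural network model for target item detection.

```python
# Hypothetical sketch: applying a trained target item detection model tile by
# tile over a whole pathology slide image. `detection_model` is assumed to be a
# callable that returns a per-pixel class-index map for an RGB tile.
import numpy as np

TARGET_CLASSES = ["background", "tumor_cell", "lymphocyte", "cancer_area", "cancer_stroma"]

def detect_target_items(slide_rgb: np.ndarray, detection_model, tile_size: int = 512) -> np.ndarray:
    """Return a class-index map with the same height and width as the slide."""
    height, width, _ = slide_rgb.shape
    class_map = np.zeros((height, width), dtype=np.int32)
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            tile = slide_rgb[y:y + tile_size, x:x + tile_size]
            class_map[y:y + tile.shape[0], x:x + tile.shape[1]] = detection_model(tile)
    return class_map
```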
- The ROI determination unit 220 may determine one or more ROIs in the pathology slide image. In this example, the ROI may include a region in which one or more target items are detected in the pathology slide image. For example, the ROI determination unit 220 may determine, as the region of interest, a patch including one or more target items from among a plurality of patches forming the pathology slide image. In an embodiment, the ROI determination unit 220 may determine one or more ROIs based on a detection result for one or more target items for a pathology slide image. For example, the ROI determination unit 220 may determine one or more ROIs including at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items. Additionally or alternatively, upon input of the detection result for one or more target items for a pathology slide image and/or the pathology slide image to an ROI extraction model, the ROI determination unit 220 may determine the regions being output as one or more ROIs. - The immune
phenotype determination unit 230 may generate the information associated with immune phenotype for one or more ROIs in the pathology slide image. In an embodiment, the immune phenotype determination unit 230 may determine the immune phenotype of one or more ROIs based on the detection result for one or more target items. For example, the immune phenotype determination unit 230 may determine whether the immune phenotype of one or more ROIs is immune inflamed, immune excluded, or immune desert, based on the detection result for one or more target items. In another embodiment, the immune phenotype determination unit 230 may calculate the immune phenotype score of one or more ROIs based on the detection result for one or more target items. For example, the immune phenotype determination unit 230 may calculate a score for immune inflamed of one or more ROIs, a score for immune excluded, and/or a score for immune desert, based on the detection result for one or more target items. To this end, the immune phenotype determination unit 230 may calculate a score indicating a probability that the immune phenotype of one or more ROIs is immune inflamed, a score indicating a probability that it is immune excluded, and/or a score indicating a probability that it is immune desert. - In another embodiment, the immune
phenotype determination unit 230 may generate a feature associated with one or more immune phenotypes for one or more ROIs. In this example, the feature associated with one or more immune phenotypes may include at least one of a statistical value or a vector associated with the immune phenotype. For example, the feature associated with one or more immune phenotypes may include score values related to the immune phenotype output from an artificial neural network model or a machine learning model. That is, it may include score values output in the process of determining the immune phenotype for one or more ROIs. As another example, the feature associated with one or more immune phenotypes may include a density value, number, or various statistics of immune cells corresponding to a threshold (or cut-off) for an immune phenotype or a vector value or the like expressing the distribution of immune cells. - As another example, the feature associated with one or more immune phenotypes may include a scalar value or a vector value including a relative relationship (e.g., a histogram vector or a graph expression vector considering the direction and distance) or relative statistics (e.g., the ratio of the number of immune cells to the number of specific cells, and the like) between an immune cell or cancer cell and a specific cell (e.g., cancer cells, immune cells, fibroblasts, lymphocytes, plasma cells, macrophage, endothelial cells, and the like). As another example, the feature associated with one or more immune phenotypes may include scalar values or vector values including statistics (e.g., the ratio of the number of immune cells to the cancer stromal region, and the like) or distributions (e.g., histogram vector or graph representation vector, and the like) of immune cells or cancer cells in a specific region (e.g., cancer area, cancer stromal region, tertiary lymphoid structure, normal region, necrosis, fat, blood vessel, high endothelial venule, lymphatic vessel, nerve, and the like).
- As another example, the feature associated with one or more immune phenotypes may include scalar values or vector values including relative relationship (e.g., histogram vector or graph expression vector considering the direction and distance) or relative statistics (e.g., the ratio of the number of immune cells to the number of specific cells, and the like) between positive/negative cells according to the expression amount of the biomarker and the specific cells (e.g., cancer cells, immune cells, fibroblasts, lymphocytes, plasma cells, macrophage, endothelial cells, and the like). As another example, the feature associated with one or more immune phenotypes may include scalar values or vector values including statistics (e.g., the ratio of the number of immune cells to the cancer stromal region, and the like) or distributions (e.g., histogram vector or graph representation vector, and the like) of positive/negative cells according to the expression amount of the biomarker in a specific region (e.g., cancer area, cancer stromal region, tertiary lymphoid structure, normal region, necrosis, fat, blood vessel, high endothelial venule, lymphatic vessel, nerve, and the like).
- In
FIG. 2 , theinformation processing system 100 includes the targetitem detection unit 210, theROI determination unit 220, and the immunephenotype determination unit 230, but embodiments are not limited thereto, and some components may be omitted or other components may be added. In an embodiment, theinformation processing system 100 may further include a response predicting unit (not illustrated) for immune checkpoint inhibitor, and this response prediction unit for immune checkpoint inhibitor may generate a prediction result for whether the patient responds to the immune checkpoint inhibitor or not based on the information associated with immune phenotype. In another embodiment, theinformation processing system 100 may further include an output unit (not illustrated), and this output unit may output at least one of a detection result for the one or more target items, an immune phenotype of one or more ROIs, a prediction result as to whether or not the patient responds to immune checkpoint inhibitor, or a density of immune cells in each of one or more ROIs. -
FIG. 3 is a block diagram illustrating an internal configuration of the user terminal 110 according to an embodiment of the present disclosure. According to an embodiment, as illustrated, the user terminal 110 may include an image generation unit 310 and an image output unit 320. Respective components of the user terminal 110 illustrated in FIG. 3 represent functional components that can be divided on the basis of functions, and in an actual physical environment, a plurality of components may be implemented as being incorporated with each other. - The
image generation unit 310 may obtain the information associated with immune phenotype for one or more ROIs in the pathology slide image. For example, the image generation unit 310 may receive the information associated with immune phenotype for one or more ROIs generated by the information processing system. Additionally or alternatively, as the user terminal 110 generates the information associated with immune phenotype for one or more ROIs, the image generation unit 310 may obtain the information associated with immune phenotype for one or more ROIs. Additionally or alternatively, the image generation unit 310 may receive the information associated with immune phenotype for one or more ROIs stored in an internal and/or external device of the user terminal 110. - The
image generation unit 310 may generate an image indicative of the information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs. In an embodiment, when obtaining the immune phenotype of one or more ROIs, the image generation unit 310 may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs. In another embodiment, when receiving one or more immune phenotype scores for one or more ROIs, the image generation unit 310 may generate an image including a visual representation corresponding to one or more immune phenotype scores. - In another embodiment, the
image generation unit 310 may generate an image including a visual representation corresponding to a feature (e.g., a value of a feature) associated with one or more immune phenotypes. For example, it is possible to generate an image including a heatmap according to the expression rate of a biomarker, a classification map classified based on a specific threshold (e.g., cut-off), and the like. In this example, the classification map may include a map visualizing a result of classifying a tumor proportion score (TPS) and/or a combined proportion score (CPS), which are information related to the expression of PD-L1, based on a specific threshold.
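- A minimal, non-authoritative sketch of such a classification map is shown below: each ROI's PD-L1 TPS value is mapped to one of three levels using cut-offs. The cut-off values (1% and 50%) and the three-level coding are assumptions for illustration and are not prescribed by the present disclosure.

```python
# Hedged sketch: building a simple classification map from per-ROI PD-L1 TPS
# values using cut-off thresholds. The cut-offs and level coding are assumptions.
import numpy as np

def classification_map(tps_percent: np.ndarray,
                       cutoff_low: float = 1.0,
                       cutoff_high: float = 50.0) -> np.ndarray:
    """Map each ROI's TPS (%) to 0 (negative), 1 (low expression), or 2 (high expression)."""
    levels = np.zeros_like(tps_percent, dtype=np.int32)
    levels[tps_percent >= cutoff_low] = 1
    levels[tps_percent >= cutoff_high] = 2
    return levels

print(classification_map(np.array([0.0, 5.0, 70.0])))  # -> [0 1 2]
```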
- The image output unit 320 may output through the display device an image indicative of the information associated with immune phenotype. In an embodiment, the image output unit 320 may output through the display device one or more ROIs in the pathology slide image and an image including a visual representation. Alternatively, the image output unit 320 may overlay an image including a visual representation on one or more ROIs in the pathology slide image. -
FIG. 3 illustrates that the image generation unit 310 is included in the user terminal 110, but embodiments are not limited thereto, and the image generation unit 310 may be included in any external device (e.g., in the information processing system 100, and the like) capable of communicating with the user terminal 110 by wire and/or wirelessly. According to this configuration, the image output unit 320 of the user terminal 110 may receive an image generated from the external device, and display the received image on a display device connected to the user terminal 110 by wire and/or wirelessly. - In addition,
FIGS. 2 and 3 illustrate that the target item detection unit 210, the ROI determination unit 220, the immune phenotype determination unit 230, the image generation unit 310, and the image output unit 320 are separately executed in the information processing system 100 and the user terminal 110, but embodiments are not limited thereto, and these components may be executed by one device. In another embodiment, these components may be distributed and processed in any combination by a plurality of any devices (e.g., the information processing system 100, the user terminal 110, and the like). -
FIG. 4 is a flowchart illustrating a method 400 for providing information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure. In an embodiment, the method 400 for providing information associated with immune phenotype for pathology slide image may be performed by a processor (e.g., at least one processor of the user terminal and/or at least one processor of the information processing system). The method 400 for providing information associated with immune phenotype for pathology slide image may be initiated by the processor obtaining the information associated with immune phenotype for one or more ROIs in the pathology slide image (S410). In this example, the information associated with immune phenotype for one or more ROIs may include an immune phenotype for one or more ROIs (e.g., at least one of immune inflamed, immune excluded, or immune desert), and/or an immune phenotype score (e.g., at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert) for one or more ROIs. Additionally or alternatively, the information associated with immune phenotype for one or more ROIs may include a feature associated with one or more immune phenotypes for one or more ROIs (e.g., statistical value or vector associated with the immune phenotype).
- The processor may generate an image indicative of information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs (S420). In an embodiment, the processor may generate an image including a visual representation corresponding to immune phenotype of one or more ROIs. In another embodiment, the processor may generate an image including a visual representation corresponding to one or more immune phenotype scores. For example, the processor may generate an image including a visual representation corresponding to a score for immune inflamed. As another example, the processor may generate an image including a visual representation corresponding to a score for immune excluded. As another example, the processor may generate an image including a visual representation corresponding to a score for immune desert. In still another embodiment, the processor may generate an image including a visual representation corresponding to a feature associated with one or more immune phenotypes. For example, the processor may generate an image including a visual representation corresponding to a value of a feature associated with one or more immune phenotypes.
- Then, the processor may output an image indicative of the information associated with immune phenotype (S430). In an embodiment, the processor may output one or more ROIs in the pathology slide image together with an image including a visual representation. In another embodiment, the processor may overlay an image including a visual representation on one or more ROIs in the pathology slide image.
- In an embodiment, the processor may obtain a detection result for one or more target items from the pathology slide image, and generate an image indicative of the detection result for one or more target items. Then, the processor may output the image indicative of the detection result for one or more target items.
-
FIG. 5 is a diagram illustrating an example of determining one or more ROIs 522_1, 522_2, 522_3, 522_4, 524, and 526 in a pathology slide image 510 according to an embodiment of the present disclosure. To predict whether or not a patient will respond to immune checkpoint inhibitor, a user (e.g., a doctor, a researcher, and the like) may obtain a patient's tissue (e.g., tissue immediately prior to treatment, tissue after treatment with immune checkpoint inhibitor, and the like) and generate one or more pathology slide images. For example, the user may perform H&E staining on the obtained patient's tissue and digitize the H&E-stained tissue slide through a scanner to generate a pathology slide image. As another example, the user may perform IHC staining on the obtained patient's tissue and digitize the IHC-stained tissue slide through a scanner to generate a pathology slide image. - The
ROI determination unit 220 of the information processing system may determine one or more ROIs in the pathology slide image. In this example, the ROIs may correspond to the regions having various shapes, such as circle, square, rectangle, polygon, contour, and the like. In an embodiment, the ROI determination unit 220 may determine one or more ROIs based on the detection result for one or more target items for a pathology slide image. For example, among a plurality of patches (e.g., 1 mm² sized patches) generated by dividing the pathology slide image into N grids (where N is any natural number), the ROI determination unit 220 may determine a patch, from which one or more target items (e.g., items and/or immune cells associated with a cancer) are detected, as the ROI.
ROI determination unit 220 may determine one or more ROIs such that the ROIs include at least some regions in the pathology slide image that satisfy a condition associated with one or more target items. For example, theROI determination unit 220 may determine a region, which has the number and/or area of target items (e.g., tumor cells, immune cells, cancer area, cancer stroma, and the like) equal to or greater than a reference value in the pathology slide image, as the ROI. Additionally or alternatively, theROI determination unit 220 may determine a region, which has a numerical value such as a ratio, a density, and the like of the target item equal to or greater than a reference value in the pathology slide image, as the ROI. In this example, the target item may correspond to a cell and/or region detected to determine the immune phenotype. In addition, the reference value may refer to a numerical value set for the target item to define a statistically significant immune phenotype and/or a clinically significant immune phenotype. - In this case, the
ROI determination unit 220 may determine a region, which has any area in the pathology slide image, as the ROI. For example, the area of the ROI may be dynamically determined to satisfy the condition associated with one or more target items described above. That is, the area of the ROI is not fixedly predetermined, and may be dynamically determined as the ROI determination unit 200 determines the region, which has the number, area, ratio, and/or density and the like of the target items equal to or greater than the reference value, as the ROI. Alternatively, theROI determination unit 220 may determine one or more ROIs such that the area of the ROIs has a statically predetermined value. - In another embodiment, upon input of the detection result for one or more target items for the pathology slide image and/or the pathology slide image to an ROI extraction model, the
ROI determination unit 220 may determine the regions being output as one or more ROIs. In this example, the ROI extraction model may correspond to a machine learning model (e.g., Neural Network, CNN, SVM, and the like) trained to output a reference ROI upon input of a detection result for one or more target items for reference pathology slide image and/or a reference pathology slide image. Even in this case, the area of the ROI may be determined dynamically and/or statically. - In an embodiment, when a plurality of ROIs are determined for the pathology slide image, the
ROI determination unit 220 may determine the plurality of ROIs such that at least some of the plurality of ROIs overlap with each other. For example, when theROI determination unit 220 determines a first ROI and a second ROI for the pathology slide image, at least some regions of the first ROI and at least some regions of the second ROI may be regions overlapping with each other. In another embodiment, when a plurality of ROIs for one pathology slide image is determined, theROI determination unit 220 may determine the plurality of ROIs such that at least some of the plurality of ROIs do not overlap with each other. - As illustrated, the
ROI determination unit 220 may receive the pathology slide image 510 (e.g., pathology slide image including a detection result for target item), and determine one or more ROIs in the pathology slide images 522_1, 522_2, 522_3, 522_4, 524, and 526. For example, theROI determination unit 220 may determine the regions 522_1, 522_2, 522_3 and 522_4, which have a specific area (e.g., 1 mm2 as a predetermined area) that satisfies the condition associated with the target item, as the ROIs. As another example, theROI determination unit 220 may determine aregion 524, which satisfies the condition associated with the target item, as the ROI, and the area of the ROI may be dynamically determined. As still another example, theROI determination unit 220 may determine anelliptical region 526, which satisfies the condition associated with the target item, as the ROI. -
FIG. 6 is a diagram illustrating an example of generating an immune phenotype determination result 620 according to an embodiment of the present disclosure. In an embodiment, the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 of one or more ROIs based on the detection result for one or more target items (e.g., items and/or immune cells associated with a cancer) in the one or more ROIs. For example, the immune phenotype determination unit 230 may determine the immune phenotype of the corresponding ROI as at least one of immune inflamed, immune excluded, or immune desert based on the detection result for one or more target items in one or more ROIs. - In another embodiment, the immune
phenotype determination unit 230 may calculate an immune phenotype score for one or more ROIs based on the detection result for one or more target items in one or more ROIs. For example, the immune phenotype determination unit 230 may calculate at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert in the corresponding ROI based on the detection result for one or more target items in one or more ROIs. In this case, the immune phenotype determination unit 230 may determine an immune phenotype of the corresponding ROI based on at least one of a score for immune inflamed, a score for immune excluded, and a score for immune desert for one or more ROIs. For example, when the score for immune inflamed of a specific ROI is equal to or greater than a threshold value, the immune phenotype determination unit 230 may determine the immune phenotype of the corresponding ROI as immune inflamed. - In an embodiment, the immune
phenotype determination unit 230 may calculate at least one of the number, distribution, or density of target items in one or more ROIs, and may determine an immune phenotype and/or an immune phenotype score of one or more ROIs based on at least one of the calculated number, distribution, or density of the immune cells. For example, the immune phenotype determination unit 230 may calculate, within one or more ROIs, a density of lymphocytes in the cancer area and a density of lymphocytes in the cancer stroma area, and determine the immune phenotype of the one or more ROIs based on at least one of the density of immune cells in the cancer area or the density of immune cells in the cancer stroma. Additionally or alternatively, the immune phenotype determination unit 230 may determine the immune phenotype of one or more ROIs as one of immune inflamed, immune excluded, or immune desert, by referring to the number of immune cells included in the specific region in the cancer area. - For example, the immune
phenotype determination unit 230 may determine the immune phenotype of a first ROI 612, which has a density of immune cells in the cancer area equal to or greater than a first threshold density, as immune inflamed. In addition, the immune phenotype determination unit 230 may determine the immune phenotype of a second ROI 614, which has a density of immune cells in the cancer area less than the first threshold density and a density of immune cells in the cancer stroma equal to or greater than the second threshold density, as immune excluded. In addition, the immune phenotype determination unit 230 may determine the immune phenotype of a third ROI 616, which has a density of immune cells in the cancer area less than the first threshold density and a density of immune cells in the cancer stroma less than the second threshold density, as immune desert. In this example, the first threshold density may be determined based on a distribution of the density of the immune cells in the cancer area in each of a plurality of ROIs in the plurality of pathology slide images. Likewise, the second threshold density may be determined based on a distribution of the density of the immune cells in the cancer stroma in each of the plurality of ROIs in the plurality of pathology slide images.
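- The decision rule just described can be sketched as below, with the threshold densities taken as percentiles of the density distributions across many ROIs. The percentile value is an assumption introduced for this sketch; the disclosure states only that the thresholds are determined from those distributions.

```python
# Non-authoritative sketch of the density-based decision rule described above.
# The percentile used to derive the thresholds is an illustrative assumption.
import numpy as np

def immune_phenotype(density_in_cancer_area: float, density_in_cancer_stroma: float,
                     first_threshold: float, second_threshold: float) -> str:
    if density_in_cancer_area >= first_threshold:
        return "immune inflamed"
    if density_in_cancer_stroma >= second_threshold:
        return "immune excluded"
    return "immune desert"

def thresholds_from_distributions(cancer_area_densities: np.ndarray,
                                  cancer_stroma_densities: np.ndarray,
                                  percentile: float = 75.0):
    """Derive the two threshold densities from per-ROI density distributions."""
    return (np.percentile(cancer_area_densities, percentile),
            np.percentile(cancer_stroma_densities, percentile))

t1, t2 = thresholds_from_distributions(np.random.rand(500), np.random.rand(500))
print(immune_phenotype(0.9, 0.2, t1, t2))
```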
- In another embodiment, the immune phenotype determination unit 230 may input the feature for each of the one or more ROIs to the artificial neural network immune phenotype classification model to determine the immune phenotype and/or immune phenotype score of each of the one or more ROIs. In this example, the artificial neural network immune phenotype classification model may correspond to a classifier that is trained to determine the immune phenotype of the reference ROI as one of immune inflamed, immune excluded, or immune desert upon input of the feature for the reference ROI. In addition, in this example, the feature for each of one or more ROIs may include a statistical feature for one or more target items in each of one or more ROIs (e.g., density, number, and the like, of specific target items in the ROI), a geometric feature for one or more target items (e.g., a feature including relative position information between specific target items, and the like), and/or an image feature corresponding to each of the one or more ROIs (e.g., a feature extracted from a plurality of pixels included in the ROIs, an image vector corresponding to the ROIs, and the like). Additionally or alternatively, the feature for each of the one or more ROIs may include a feature obtained by concatenating two or more features from among the statistical feature for one or more target items in each of the one or more ROIs, the geometric feature for the one or more target items, or the image feature corresponding to each of the one or more ROIs.
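- As a hedged sketch only, an artificial neural network immune phenotype classification model of this kind might look as below: statistical, geometric, and image features for an ROI are concatenated and mapped to three scores (immune inflamed, immune excluded, immune desert). The feature dimensions and the architecture are assumptions and do not describe the disclosed model.

```python
# Illustrative sketch: a classifier over a concatenated ROI feature vector that
# outputs scores for immune inflamed / immune excluded / immune desert.
# Feature sizes and layer widths are assumptions.
import torch
import torch.nn as nn

class ImmunePhenotypeClassifier(nn.Module):
    def __init__(self, stat_dim: int = 8, geom_dim: int = 16, img_dim: int = 128, num_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(stat_dim + geom_dim + img_dim, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, stat_feature, geom_feature, img_feature):
        roi_feature = torch.cat([stat_feature, geom_feature, img_feature], dim=-1)
        return torch.softmax(self.net(roi_feature), dim=-1)   # immune phenotype scores

# Usage with random stand-in features for a single ROI.
model = ImmunePhenotypeClassifier()
scores = model(torch.randn(1, 8), torch.randn(1, 16), torch.randn(1, 128))
print(scores)  # e.g., scores for inflamed / excluded / desert
```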
- As illustrated, the immune phenotype determination unit 230 may receive one or more ROIs 612, 614, and 616 and generate an immune phenotype determination result 620. In this example, the one or more ROIs 612, 614, and 616 may include the detection result for target item, and the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 for the ROI that includes the detection result for target item. For example, the immune phenotype determination unit 230 may generate the immune phenotype determination result 620 including the immune phenotype of the corresponding ROI and/or the immune phenotype score of the corresponding ROI. The generated immune phenotype determination result 620 and/or the one or more ROIs 612, 614, and 616 may be provided to the user terminal. -
FIG. 7 is a diagram illustrating an example of outputting the immune phenotype determination result according to an embodiment of the present disclosure. As the information processing system (e.g., at least one processor of the information processing system) provides the user terminal with information associated with immune phenotype for one or more ROIs in the pathology slide image, the user terminal may output the received information through an output device to provide it to the user. In this example, the information associated with immune phenotype for one or more ROIs may include the immune phenotype score for one or more ROIs. The user terminal (e.g., at least one processor of the user terminal) may output an image indicative of the information associated with immune phenotype for one or more ROIs. - To this end, the user terminal may generate an image including a visual representation corresponding to one or more immune phenotype scores for one or more ROIs. For example, in a region corresponding to one or more ROIs, the user terminal may generate an image including a visual representation corresponding to one or more immune phenotype scores for the ROI. In this example, the one or more immune phenotype scores may include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert. In addition, the visual representation may include color (e.g., color, brightness, saturation, and the like), text, image, mark, figure, and the like.
- For example, with respect to the score for immune inflamed (that is, the immune inflamed score), the user terminal may generate an image including a color with higher saturation for a region corresponding to ROI having a higher immune inflamed score, and generate an image including a color with lower saturation for a region corresponding to ROI having a lower immune inflamed score. As another example, with respect to the score for the immune inflamed, the user terminal may generate an image including a first visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a first score section, including a second visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a second score section, and including a third visual representation in a region corresponding to the ROI having the immune inflamed score corresponding to a third score section. In this example, the first visual representation, the second visual representation, and the third visual representation may be different from each other.
- In an embodiment, the user terminal may output one or more ROIs in the pathology slide image together with an image including a visual representation. For example, the user terminal may simultaneously display the image (e.g., the image including a visual representation) generated as described above and at least some regions of the pathology slide image (e.g., the region corresponding to the generated image) on a display device. In another embodiment, the user terminal may overlay the image including the visual representation on one or more ROIs in the pathology slide image. For example, the user terminal may overlap the image (e.g., the image including a visual representation) generated as described above on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display on the display device.
- For example, as illustrated, the user terminal may generate an
image 710 in which the region corresponding to the ROI having the immune inflamed score falling in the first score section is displayed in white, the region corresponding to the ROI having the immune inflamed score falling in the second score section is displayed in light gray, the region corresponding to the ROI having the immune inflamed score falling in the third score section is displayed in dark gray, and the regions other than the ROIs are displayed in black. Then, the user terminal may arrange, in parallel, the image 710 including a visual representation and a region 720 in the pathology slide image which corresponds to the generated image 710, and display them together on the user interface.
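- A minimal, non-authoritative sketch of this gray-level coding is given below: each grid patch's immune inflamed score is mapped to white, light gray, or dark gray, and non-ROI patches are set to black. The score section boundaries and gray values are assumptions chosen only for illustration.

```python
# Sketch only: mapping per-patch immune inflamed scores to the gray-level image
# described above. Section boundaries (0.66, 0.33) and gray values are assumptions.
import numpy as np

def score_section_image(score_grid: np.ndarray, roi_mask: np.ndarray) -> np.ndarray:
    """score_grid holds one immune inflamed score per grid patch; roi_mask marks ROI patches."""
    image = np.zeros(score_grid.shape, dtype=np.uint8)                  # non-ROI regions: black
    image[roi_mask & (score_grid >= 0.66)] = 255                        # first score section: white
    image[roi_mask & (score_grid >= 0.33) & (score_grid < 0.66)] = 192  # second section: light gray
    image[roi_mask & (score_grid < 0.33)] = 96                          # third section: dark gray
    return image

scores = np.array([[0.9, 0.5], [0.1, 0.7]])
rois = np.array([[True, True], [True, False]])
print(score_section_image(scores, rois))
```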
-
FIG. 8 is a diagram illustrating an example of outputting an immune phenotype determination result according to an embodiment of the present disclosure. In an embodiment, the information associated with immune phenotype for one or more ROIs obtained by the user terminal (e.g., at least one processor of the user terminal) may include the immune phenotypes of the one or more ROIs. For example, the user terminal may output an image indicative of the information associated with immune phenotype for one or more ROIs. - To this end, the user terminal may generate an image including a visual representation corresponding to the immune phenotype of one or more ROIs. For example, the user terminal may generate an image including a visual representation corresponding to the immune phenotype of the corresponding ROI in a region corresponding to one or more ROIs. That is, an image may be generated, which includes the first visual representation in a region corresponding to the ROI having the immune phenotype of immune inflamed, includes the second visual representation in a region corresponding to the ROI having the immune phenotype of immune excluded, and includes the third visual representation in a region corresponding to the ROI having the immune phenotype of immune desert. In this case, the visual representation may include color (e.g., color, brightness, saturation, and the like), text, image, mark, figure, and the like, to distinguish each immune phenotype. For example, the first visual representation indicating immune inflamed may be red color, the second visual representation indicating immune excluded may be green color, and the third visual representation indicating immune desert may be blue color. Additionally or alternatively, the first visual representation indicating immune inflamed may be a circle mark, the second visual representation indicating immune excluded may be a triangle mark, and the third visual representation indicating immune desert may be an X mark.
- In an embodiment, the user terminal may output one or more ROIs in the pathology slide image together with an image including the visual representation. For example, the user terminal may display the image (e.g., the image including a visual representation) generated as described above together with at least some regions of the pathology slide image (e.g., the region corresponding to the generated image) on the display device. In another embodiment, the user terminal may overlay the image including the visual representation on one or more ROIs in the pathology slide image. For example, the user terminal may overlay the image (e.g., the image including a visual representation) generated as described above transparently, translucently, or opaquely on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display the result on the display device.
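One way such a translucent overlay could be realized is sketched below, assuming both the slide region and the visual-representation image are RGB uint8 arrays of equal size and that black pixels in the visual-representation image denote non-ROI background; the blending scheme itself is only an illustrative choice, not the disclosed method.

```python
import numpy as np

def overlay_on_slide_region(slide_region, phenotype_map, alpha=0.4):
    """Translucently overlay a visual-representation image on the
    corresponding slide region (both HxWx3 uint8 arrays of equal size).

    alpha=0.0 leaves the slide untouched and alpha=1.0 shows the overlay
    opaquely; only pixels that belong to an ROI are blended, so regions
    outside the ROIs keep the original slide appearance.
    """
    slide = slide_region.astype(np.float32)
    overlay = phenotype_map.astype(np.float32)
    roi_pixels = phenotype_map.any(axis=-1, keepdims=True)  # non-black = ROI
    blended = np.where(roi_pixels,
                       (1.0 - alpha) * slide + alpha * overlay,
                       slide)
    return blended.astype(np.uint8)
```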
- For example, as illustrated, the user terminal may generate an image that includes a diagonal marker in a region corresponding to the ROI having the immune phenotype of immune inflamed, includes a vertical line marker in a region corresponding to the ROI having the immune phenotype of immune excluded, and includes a horizontal line marker in a region corresponding to the ROI having the immune phenotype of immune desert. Then, the user terminal may display on the user interface an
image 810 in which the image including a visual representation is overlaid on a corresponding region of the pathology slide image. Additionally, the user terminal may also display on the user interface an image 820 of at least some regions of the pathology slide image (e.g., region corresponding to an image including a visual representation and/or the entire pathology slide image) separately (e.g., in a form of a minimap). - Additionally or alternatively, the user terminal may display on the user interface the information associated with immune phenotype, such as text, numerical values, graphs, and the like for "analysis summary", "biomarker findings", "score" (e.g., immune inflamed score, and the like), "cutoff" (e.g., reference value used in determining the immune phenotype, and the like), "total tissue region", "cancer region", "analyzable region", "immune phenotype proportion", and "tumor infiltrating lymphocyte density" (e.g., immune cell density in the cancer area, immune cell density in the cancer stromal region, and the like). Additionally or alternatively, the region in the pathology slide image, which is displayed on the display device, may include the detection result for one or more target items for the corresponding region. That is, when outputting an image of at least some regions of the pathology slide image, the user terminal may output an image of at least some regions displaying the detection results for one or more target items through the display device.
-
FIG. 9 is an exemplary diagram illustrating an artificial neural network model 900 according to an embodiment of the present disclosure. In machine learning technology and cognitive science, an artificial neural network model 900 as an example of the machine learning model refers to a statistical learning algorithm implemented based on a structure of a biological neural network, or to a structure that executes such an algorithm. - According to an embodiment, the artificial
neural network model 900 may represent a machine learning model that acquires a problem-solving ability through training: its nodes, which are artificial neurons connected through synaptic combinations as in a biological neural network, repeatedly adjust the synaptic weights so as to reduce the error between a target output corresponding to a specific input and the deduced output. For example, the artificial neural network model 900 may include any probability model, neural network model, and the like, that is used in artificial intelligence learning methods such as machine learning and deep learning. - According to an embodiment, the artificial
neural network model 900 may include an artificial neural network model configured to detect one or more target items from an input pathology slide image. Additionally or alternatively, the artificial neural network model 900 may include an artificial neural network model configured to determine one or more ROIs from an input pathology slide image. - The artificial
neural network model 900 is implemented as a multilayer perceptron (MLP) formed of multiple nodes and connections between them. The artificial neural network model 900 according to an embodiment may be implemented using one of various artificial neural network model structures including the MLP. As shown in FIG. 9, the artificial neural network model 900 includes an input layer 920 receiving an input signal or data 910 from the outside, an output layer 940 outputting an output signal or data 950 corresponding to the input data, and (n) number of hidden layers 930_1 to 930_n (where n is a positive integer) positioned between the input layer 920 and the output layer 940 to receive a signal from the input layer 920, extract the features, and transmit the features to the output layer 940. In an example, the output layer 940 receives signals from the hidden layers 930_1 to 930_n and outputs them to the outside.
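As a rough, non-authoritative sketch of the layered structure just described, the toy model below wires an input layer, hidden layers, and an output layer; the layer sizes, activation functions, and the use of PyTorch are placeholder assumptions rather than the disclosed architecture.

```python
import torch
import torch.nn as nn

class SimpleMLP(nn.Module):
    """Toy multilayer perceptron mirroring the structure in FIG. 9:
    an input layer, n hidden layers that extract features, and an
    output layer that emits the result vector."""

    def __init__(self, in_dim=1024, hidden_dim=256, n_hidden=3, out_dim=8):
        super().__init__()
        layers = [nn.Linear(in_dim, hidden_dim), nn.ReLU()]     # input layer 920
        for _ in range(n_hidden - 1):                           # hidden layers 930_1 .. 930_n
            layers += [nn.Linear(hidden_dim, hidden_dim), nn.ReLU()]
        layers += [nn.Linear(hidden_dim, out_dim)]              # output layer 940
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# A flattened image patch (or feature vector) goes in, a vector comes out.
model = SimpleMLP()
output = model(torch.randn(1, 1024))  # shape: (1, 8)
```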
- The method of training the artificial neural network model 900 includes supervised learning, which trains the model to solve a problem using teacher signals (correct answers) as inputs, and unsupervised learning, which does not require teacher signals. In an embodiment, the information processing system may train the artificial neural network model 900 by supervised learning and/or unsupervised learning to detect one or more target items from a pathology slide image. For example, the information processing system may train the artificial neural network model 900 by supervised learning to detect one or more target items from the pathology slide image by using a reference pathology slide image and label information for one or more reference target items. - In another embodiment, the information processing system may train the artificial
neural network model 900 by supervised learning and/or unsupervised learning to determine one or more ROIs from the pathology slide image. For example, the information processing system may train the artificial neural network model 900 by supervised learning to determine one or more ROIs from the pathology slide image using the reference pathology slide image and/or the detection result for one or more target items for the reference pathology slide image (e.g., reference pathology slide image including the detection result for target item), and the label information on the reference ROI. In this example, the ROI and/or the reference ROI may include at least some regions in the pathology slide image and/or the reference pathology slide image that satisfy the condition associated with one or more target items.
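A supervised training step of the kind described, in which reference inputs and their label information are used to reduce the error between the deduced output and the target output, might look roughly like the sketch below; the loss function, optimizer usage, and patch-level data loader are assumptions for illustration and are not the disclosed training procedure.

```python
import torch
import torch.nn as nn

def train_one_epoch(model, loader, optimizer):
    """One supervised pass: reference inputs and their label information are
    fed to the model, and the weights are adjusted to reduce the error
    between the deduced output and the target output."""
    criterion = nn.CrossEntropyLoss()
    model.train()
    for patches, labels in loader:        # e.g., reference-slide patches and target-item labels
        optimizer.zero_grad()
        logits = model(patches)           # deduced output
        loss = criterion(logits, labels)  # error relative to the target output
        loss.backward()                   # propagate the error
        optimizer.step()                  # adjust the (synaptic) weights

# Typical usage (hypothetical): optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
```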
- The artificial neural network model 900 trained as described above may be stored in a memory (not illustrated) of the information processing system, and in response to an input for the pathology slide image received from the communication module and/or the memory, may detect one or more target items in the pathology slide image. Additionally or alternatively, the artificial neural network model 900 may determine one or more ROIs from the pathology slide image, in response to the detection result for the one or more target items for the pathology slide image and/or an input to the pathology slide image. - According to an embodiment, the input variable of the artificial neural network model for detecting the target item may be one or more pathology slide images (e.g., H&E-stained pathology slide images, IHC-stained pathology slide images). For example, the input variable input to the
input layer 920 of the artificial neural network model 900 may be the image vector 910 which may be one or more pathology slide images configured as one vector data element. The output variable output from the output layer 940 of the artificial neural network model 900 in response to the input of the image may be a vector 950 representing or characterizing one or more target items detected in the pathology slide image. That is, the output layer 940 of the artificial neural network model 900 may be configured to output a vector representing or characterizing one or more target items detected from the pathology slide image. In the present disclosure, the output variable of the artificial neural network model 900 is not limited to the types described above, and may include any information/data indicative of one or more target items detected from the pathology slide image. In addition, the output layer 940 of the artificial neural network model 900 may be configured to output a vector indicative of the reliability and/or accuracy of the output detection result for target item and the like. - In another embodiment, the input variable of the machine learning model for determining the ROI, that is, the input variable of the artificial
neural network model 900 may be the detection result for one or more target items for the pathology slide image (e.g., detection data for the target item in the pathology slide image) and/or the pathology slide image. For example, the input variable input to the input layer 920 of the artificial neural network model 900 may be the image vector 910 which may be the detection result for the one or more target items for the pathology slide image and/or the pathology slide image configured as one vector data element. An output variable output from the output layer 940 of the artificial neural network model 900 in response to input for the detection result for one or more target items for the pathology slide image and/or the pathology slide image may be the vector 950 that indicates or characterizes one or more ROIs. In the present disclosure, the output variable of the artificial neural network model 900 is not limited to the types described above, and may include any information/data indicative of one or more ROIs.
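Functionally, the two uses of the model compose: the detection model maps the (vectorized) slide to a result characterizing the target items, and the ROI model consumes the slide and/or that result. The sketch below shows only this data flow; the model objects, call signatures, and tensor shapes are hypothetical and not taken from the disclosure.

```python
import torch

def detect_and_determine_rois(slide_tensor, detection_model, roi_model):
    """Hypothetical two-stage flow: the detection model maps the vectorized
    slide to a result characterizing the target items, and the ROI model
    consumes the slide together with that result to indicate the ROIs."""
    with torch.no_grad():
        target_items = detection_model(slide_tensor)   # e.g., a vector characterizing target items
        rois = roi_model(slide_tensor, target_items)   # e.g., a vector indicating the ROIs
    return target_items, rois
```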
- As described above, the input layer 920 and the output layer 940 of the artificial neural network model 900 are respectively matched with a plurality of output variables corresponding to a plurality of input variables, and the synaptic values between nodes included in the input layer 920, the hidden layers 930_1 to 930_n, and the output layer 940 are adjusted through training so that a correct output corresponding to a specific input can be extracted. Through this training process, the features hidden in the input variables of the artificial neural network model 900 may be confirmed, and the synaptic values (or weights) between the nodes of the artificial neural network model 900 may be adjusted so as to reduce the errors between the output variable calculated based on the input variable and the target output. By using the artificial neural network model 900 trained in this way, in response to input of the pathology slide image, the detection result for target item may be output. Additionally or alternatively, by using the artificial neural network model 900, one or more ROIs may be output in response to input of the pathology slide image and/or the detection result for one or more target items for the pathology slide image (e.g., the pathology slide image including the detection result for target item). -
FIG. 10 is a configuration diagram of an exemplary computing device 1000 (e.g., user terminal) that provides information associated with immune phenotype for pathology slide image according to an embodiment of the present disclosure. As illustrated, the computing device 1000 may include one or more processors 1010, a bus 1030, a communication interface 1040, a memory 1020 that loads a computer program 1060 executable by the processors 1010, and a storage module 1050 storing the computer program 1060. However, only the components related to the embodiment of the present disclosure are illustrated in FIG. 10. Accordingly, those of ordinary skill in the art to which the present disclosure pertains will be able to recognize that other general-purpose components may be further included in addition to the components shown in FIG. 10. - The
processors 1010 control the overall operation of components of the computing device 1000. The processors 1010 may be configured to include a central processing unit (CPU), a microprocessor unit (MPU), a microcontroller unit (MCU), a graphics processing unit (GPU), or any type of processor well known in the technical field of the present disclosure. In addition, the processors 1010 may perform an arithmetic operation on at least one application or program for executing the method according to the embodiments of the present disclosure. The computing device 1000 may include one or more processors. - The
memory 1020 may store various types of data, commands, and/or information. The memory 1020 may load one or more computer programs 1060 from the storage module 1050 in order to execute a method/operation according to various embodiments of the present disclosure. The memory 1020 may be implemented as a volatile memory such as RAM, but the technical scope of the present disclosure is not limited thereto. - The
bus 1030 may provide a communication function between components of the computing device 1000. The bus 1030 may be implemented as various types of buses such as an address bus, a data bus, a control bus, or the like. - The
communication interface 1040 may support wired/wireless Internet communication of the computing device 1000. In addition, the communication interface 1040 may support various other communication methods in addition to the Internet communication. To this end, the communication interface 1040 may be configured to include a communication module well known in the technical field of the present disclosure. - The
storage module 1050 may non-temporarily store one or more computer programs 1060. The storage module 1050 may be configured to include a nonvolatile memory such as a read only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, and the like, a hard disk, a detachable disk, or any type of computer-readable recording medium well known in the art to which the present disclosure pertains. - The
computer program 1060 may include one or more instructions that, when loaded into the memory 1020, cause the processors 1010 to perform an operation/method in accordance with various embodiments of the present disclosure. That is, the processors 1010 may perform operations/methods according to various embodiments of the present disclosure by executing one or more instructions. - For example, the
computer program 1060 may include one or more instructions for causing the following operations to be performed: obtaining information associated with immune phenotype for one or more ROIs in the pathology slide image; generating an image indicative of the information associated with immune phenotype based on the information associated with immune phenotype for one or more ROIs; and outputting the image indicative of the information associated with immune phenotype, and the like. In this case, a system for predicting a response to an immune checkpoint inhibitor according to some embodiments of the present disclosure may be implemented through the computing device 1000.
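Read as a pipeline, the three instruction groups listed above could be organized roughly as follows; the `analyzer` and `display` objects and their methods are hypothetical stand-ins, the helpers reused here are the earlier sketches, and none of this is the actual program stored in the storage module 1050.

```python
def provide_immune_phenotype_information(slide_image, analyzer, display):
    """Hypothetical pipeline mirroring the three listed operations:
    obtain the information, generate an image from it, output the image."""
    # 1. Obtain the information associated with immune phenotype for the ROIs
    #    (here delegated to a hypothetical `analyzer` object).
    roi_info = analyzer.get_immune_phenotypes(slide_image)

    # 2. Generate an image indicative of that information, reusing the
    #    phenotype-map sketch shown earlier.
    phenotype_image = render_phenotype_map(
        slide_image.shape[0], slide_image.shape[1], roi_info)

    # 3. Output the image, e.g., overlaid on the corresponding slide region
    #    via the overlay sketch shown earlier and a hypothetical display object.
    display.show(overlay_on_slide_region(slide_image, phenotype_image))
```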
FIG. 11 is a diagram illustrating an example of outputting a detection result for target item according to an embodiment of the present disclosure. The processor (e.g., at least one processor of the user terminal) may obtain a detection result for one or more target items from the pathology slide image. For example, the information processing system may use a target item detection model to detect one or more target items from the pathology slide image, and provide the detection result for target item to the user terminal. In this example, the target item detection model may include a model trained to detect one or more reference target items from the reference pathology slide image. - The processor may generate an image indicating the detection result for one or more target items, and output the generated image indicating the detection result for one or more target items. In an embodiment, the processor may generate and output an image including visual representations capable of distinguishing respective target items, based on the detection result for one or more target items from the pathology slide image. For example, the processor may generate an image including a visual representation indicating each of the target items (e.g., cancer area, cancer stromal region, blood vessel, cancer cell, immune cell, positive/negative cells according to the expression amount of the biomarker, and the like) that may be considered in determining the immune phenotype. Additionally or alternatively, based on the detection result for one or more target items, the processor may generate an image including a segmentation map for the target item in units of regions, a contour for the target item having a specific structure (e.g., a blood vessel, and the like), a center point of the target item in units of cells, or a contour indicative of the shape of the target item in units of cells, and the like.
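To make these renderings concrete, the sketch below paints a region-level segmentation map and then marks cell center points on top; the class indices, colors, dot size, and input formats are assumptions chosen for the example rather than the disclosed rendering logic.

```python
import numpy as np

# Assumed class indices for a region-level segmentation map.
REGION_COLORS = {
    0: (0, 0, 0),        # background other than the target items
    1: (255, 200, 200),  # cancer area / cancer epithelium
    2: (200, 255, 200),  # cancer stroma
}

def render_detection_results(segmentation, cell_centers):
    """Paint target-item detections: `segmentation` is an HxW array of
    region class indices, `cell_centers` is a list of (row, col, color)
    tuples for cell-level items such as immune or cancer cells."""
    h, w = segmentation.shape
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    for cls, color in REGION_COLORS.items():
        canvas[segmentation == cls] = color
    for row, col, color in cell_centers:
        rr = slice(max(row - 2, 0), min(row + 3, h))  # small square dot
        cc = slice(max(col - 2, 0), min(col + 3, w))
        canvas[rr, cc] = color
    return canvas
```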
- In an embodiment, the processor may output one or more ROIs in the pathology slide image and an image indicative of the detection result for one or more target items for one or more ROIs. For example, the processor may output the image generated as described above (e.g., the image including a visual representation) together with at least some regions of the pathology slide image (e.g., region corresponding to the generated image). As another example, the processor may overlay the image indicative of the detection result for one or more target items (e.g., the image including a visual representation) on one or more ROIs in the pathology slide image. For example, the user terminal may overlay the image (e.g., the image including a visual representation) generated as described above on at least some regions of the pathology slide image (e.g., the region corresponding to the generated image), and display it on a display device connected to the user terminal.
- For example, as illustrated, the processor may display on the user interface an
image 1110 displaying a first color indicating a background other than the target item, a second color indicating a cancer epithelium area, a third color indicating a cancer stromal region, a fourth color indicating an immune cell (e.g., a central point of an immune cell), and a fifth color indicating a cancer cell (e.g., a center point of a cancer cell) on the corresponding regions of the pathology slide image. In this example, the image 1110 may correspond to an image obtained by overlaying (or merging) the image indicative of the detection result for one or more target items on a corresponding region of the pathology slide image. Further, the first color, the second color, the third color, the fourth color and/or the fifth color may correspond to different colors that can be distinguished from each other. - Additionally or alternatively, as the information associated with immune phenotype, the user terminal may display on the user interface visual representations corresponding to respective target items (e.g., color information corresponding to respective target items, and the like), text, numerical values, markers, graphs, and the like for "analysis summary". Additionally or alternatively, the user terminal may output a user interface through which the user can select a target item to be displayed with a visual representation in a corresponding region in the pathology slide image. In
FIG. 11, color is used as an example of the visual representation to indicate the information associated with immune phenotype, but embodiments are not limited thereto. -
FIG. 12 is a diagram illustrating an example of outputting a detection result for target item and an immune phenotype determination result according to an embodiment of the present disclosure. The processor (e.g., at least one processor of the user terminal) may output an image indicative of information (e.g., an immune phenotype map) associated with immune phenotype for one or more ROIs. Additionally, the processor may output an image indicative of the detection result for one or more target items. In an embodiment, the processor may overlay (or merge) an image indicative of the information associated with immune phenotype for one or more ROIs and/or the detection result for one or more target items on a corresponding region of the pathology slide image and output the result. - For example, as illustrated, the processor may display on the user interface an
image 1210 displaying a first color indicative of immune desert, a second color indicative of immune excluded, a third color indicative of immune inflamed, a fourth color indicative of a cancer epithelium region, a fifth color indicative of a cancer stromal region, a sixth color indicative of an immune cell (e.g., a center point of an immune cell), and a seventh color indicative of a cancer cell (e.g., a center point of a cancer cell) on the corresponding region in the pathology slide image. In this example, the image 1210 may correspond to an image in which an image (e.g., an immune phenotype map) indicative of the information associated with immune phenotype for one or more ROIs and an image indicative of the detection result for one or more target items are overlaid on (or merged with) the corresponding region in the pathology slide image. Further, the first color, the second color, the third color, the fourth color, the fifth color, the sixth color and/or the seventh color may correspond to different colors that can be distinguished from each other. - Additionally or alternatively, the processor may also display on the user interface an
image 1220 of at least some regions of the pathology slide image (e.g., regions corresponding to the image including the visual representation and/or the entire pathology slide image) separately (e.g., in a form of a minimap). Additionally or alternatively, as the information associated with immune phenotype, the processor may display on the user interface the visual representations corresponding to the respective target items (e.g., information on color corresponding to respective target items, and the like), text, numerical values, graphs, and the like for "analysis summary". Additionally or alternatively, a user interface may be output, through which the user may select the information associated with target item and/or immune phenotype to mark with a visual representation, in a corresponding region in the pathology slide image. In FIG. 12, color is used as an example of the visual representation to indicate the information associated with immune phenotype and/or target item, but embodiments are not limited thereto. - The above description of the present disclosure is provided to enable those skilled in the art to make or use the present disclosure. Various modifications of the present disclosure will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to various modifications without departing from the spirit or scope of the present disclosure. Thus, the present disclosure is not intended to be limited to the examples described herein but is intended to be accorded the broadest scope consistent with the principles and novel features disclosed herein.
- Although example implementations may refer to utilizing aspects of the presently disclosed subject matter in the context of one or more standalone computer systems, the subject matter is not so limited, and it may be implemented in conjunction with any computing environment, such as a network or distributed computing environment. Furthermore, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be effected across a plurality of devices. Such devices may include PCs, network servers, and handheld devices.
- Although the present disclosure has been described in connection with some embodiments herein, it should be understood by those skilled in the art to which the present disclosure pertains that various modifications and changes can be made without departing from the scope of the present disclosure. In addition, such modifications and changes should be considered within the scope of the claims appended herein.
Claims (20)
1. A method, performed by at least one computing device, for providing information associated with immune phenotype for pathology slide image, comprising:
obtaining information associated with immune phenotype for one or more regions of interest (ROIs) in a pathology slide image;
generating, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype; and
outputting the image indicative of the information associated with the immune phenotype.
2. The method according to claim 1, wherein the one or more ROIs are determined based on a detection result for one or more target items for the pathology slide image.
3. The method according to claim 2, wherein the one or more ROIs include at least some regions in the pathology slide image that satisfy a condition associated with the one or more target items.
4. The method according to claim 2, wherein the one or more ROIs are regions being output upon input of the detection result for the one or more target items for the pathology slide image or the pathology slide image to a region-of-interest (ROI) extraction model, and
the ROI extraction model is trained to output a reference ROI upon input of the detection result for one or more target items for a reference pathology slide image or a reference pathology slide image.
5. The method according to claim 1, wherein the obtaining includes obtaining the immune phenotype of the one or more ROIs,
the generating includes generating an image including a visual representation corresponding to the immune phenotype of the one or more ROIs, and
the immune phenotype includes at least one of immune inflamed, immune excluded, or immune desert.
6. The method according to claim 5, wherein the outputting includes outputting the one or more ROIs in the pathology slide image together with the image including the visual representation.
7. The method according to claim 5, wherein the outputting includes overlaying the image including the visual representation on the one or more ROIs in the pathology slide image.
8. The method according to claim 1, wherein the obtaining includes obtaining one or more immune phenotype scores for the one or more ROIs,
the generating includes generating an image including a visual representation corresponding to the one or more immune phenotype scores, and
the one or more immune phenotype scores include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert.
9. The method according to claim 8, wherein the outputting includes outputting the one or more ROIs in the pathology slide image together with the image including the visual representation.
10. The method according to claim 8, wherein the outputting includes overlaying the image including the visual representation on the one or more ROIs in the pathology slide image.
11. The method according to claim 1, wherein the obtaining includes obtaining a feature associated with one or more immune phenotypes for the one or more ROIs,
the generating includes generating an image including a visual representation corresponding to the feature associated with the one or more immune phenotypes, and
the feature associated with the one or more immune phenotypes includes at least one of a statistical value or a vector associated with the immune phenotype.
12. The method according to claim 11, wherein the outputting includes outputting the one or more ROIs in the pathology slide image together with the image including the visual representation.
13. The method according to claim 11, wherein the outputting includes overlaying the image including the visual representation on the one or more ROIs in the pathology slide image.
14. The method according to claim 1, further comprising:
obtaining a detection result for one or more target items from the pathology slide image;
generating an image indicative of the detection result for one or more target items; and
outputting the image indicative of the detection result for one or more target items.
15. A non-transitory computer-readable recording medium storing a computer program for executing, on a computer, the method for providing the information associated with the immune phenotype for the pathology slide image according to claim 1.
16. A computing device comprising:
a memory storing one or more instructions; and
a processor configured to execute the stored one or more instructions to:
obtain information associated with immune phenotype for one or more regions of interest (ROIs) in a pathology slide image;
generate, based on the information associated with the immune phenotype for one or more ROIs, an image indicative of the information associated with the immune phenotype; and
output the image indicative of the information associated with the immune phenotype.
17. The computing device according to claim 16, wherein the processor is further configured to:
obtain the immune phenotype of the one or more ROIs; and
generate an image including a visual representation corresponding to the immune phenotype of the one or more ROIs, and
the immune phenotype includes at least one of immune inflamed, immune excluded, or immune desert.
18. The computing device according to claim 17, wherein the processor is further configured to output the one or more ROIs in the pathology slide image together with the image including the visual representation.
19. The computing device according to claim 17, wherein the processor is further configured to overlay the image including the visual representation on the one or more ROIs in the pathology slide image.
20. The computing device according to claim 16, wherein the processor is further configured to:
obtain one or more immune phenotype scores for the one or more ROIs; and
generate an image including a visual representation corresponding to the one or more immune phenotype scores, and
the one or more immune phenotype scores include at least one of a score for immune inflamed, a score for immune excluded, or a score for immune desert.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/463,912 US20230419492A1 (en) | 2020-05-08 | 2023-09-08 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20200055483 | 2020-05-08 | ||
KR10-2020-0055483 | 2020-05-08 | ||
KR20210054206 | 2021-04-27 | ||
KR10-2021-0054206 | 2021-04-27 | ||
KR1020210059519A KR102591696B1 (en) | 2020-05-08 | 2021-05-07 | Method and apparatus for providing information associateed with immune phenotype for whole slide image |
PCT/KR2021/005772 WO2021225422A1 (en) | 2020-05-08 | 2021-05-07 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
KR10-2021-0059519 | 2021-05-07 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/005772 Continuation WO2021225422A1 (en) | 2020-05-08 | 2021-05-07 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/463,912 Continuation US20230419492A1 (en) | 2020-05-08 | 2023-09-08 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220036549A1 (en) | 2022-02-03
Family
ID=78468162
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/502,661 Pending US20220036549A1 (en) | 2020-05-08 | 2021-10-15 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
US18/463,912 Pending US20230419492A1 (en) | 2020-05-08 | 2023-09-08 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/463,912 Pending US20230419492A1 (en) | 2020-05-08 | 2023-09-08 | Method and apparatus for providing information associated with immune phenotypes for pathology slide image |
Country Status (4)
Country | Link |
---|---|
US (2) | US20220036549A1 (en) |
JP (1) | JP2023525465A (en) |
KR (1) | KR20230150776A (en) |
WO (1) | WO2021225422A1 (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004534206A (en) * | 2001-01-05 | 2004-11-11 | イムニベスト・コーポレイション | Device and method for imaging an object |
SG11201404313YA (en) * | 2012-01-25 | 2014-10-30 | Dnatrix Inc | Biomarkers and combination therapies using oncolytic virus and immunomodulation |
WO2017061449A1 (en) | 2015-10-05 | 2017-04-13 | 国立研究開発法人産業技術総合研究所 | Method for detecting cancer cells, reagent for introducing substance into cancer cells, and composition for treating cancer |
CN111094977B (en) * | 2017-07-13 | 2024-02-13 | 古斯塔夫·鲁西研究所 | Imaging tools based on image histology for monitoring tumor lymphocyte infiltration and prognosis in anti-PD-1/PD-L1 treated tumor patients |
KR101889723B1 (en) * | 2018-07-04 | 2018-08-20 | 주식회사 루닛 | Method and Apparatus for Diagnosing Malignant Tumor |
KR102068279B1 (en) * | 2019-10-04 | 2020-01-20 | 주식회사 루닛 | Method and System for analysing image |
-
2021
- 2021-05-07 JP JP2022561179A patent/JP2023525465A/en active Pending
- 2021-05-07 WO PCT/KR2021/005772 patent/WO2021225422A1/en unknown
- 2021-10-15 US US17/502,661 patent/US20220036549A1/en active Pending
-
2023
- 2023-09-08 US US18/463,912 patent/US20230419492A1/en active Pending
- 2023-10-16 KR KR1020230138057A patent/KR20230150776A/en active Application Filing
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220351366A1 (en) * | 2021-05-03 | 2022-11-03 | Rakuten Group, Inc. | Deep learning model to predict data from an image |
Also Published As
Publication number | Publication date |
---|---|
WO2021225422A1 (en) | 2021-11-11 |
KR20230150776A (en) | 2023-10-31 |
US20230419492A1 (en) | 2023-12-28 |
JP2023525465A (en) | 2023-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP4148746A1 (en) | Method and apparatus for providing information associated with immune phenotypes for pathology slide image | |
US10332254B2 (en) | Computer Aided Diagnosis (CAD) apparatus and method | |
Lu et al. | Estimating sheep pain level using facial action unit detection | |
US20230420072A1 (en) | Method and system for predicting response to immune checkpoint inhibitor | |
JP5868231B2 (en) | Medical image diagnosis support apparatus, medical image diagnosis support method, and computer program | |
CN109859168A (en) | A kind of X-ray rabat picture quality determines method and device | |
US10007834B2 (en) | Detection control device, detection system, non-transitory storage medium, and detection control method | |
Xia et al. | A multi-scale segmentation-to-classification network for tiny microaneurysm detection in fundus images | |
US20230419492A1 (en) | Method and apparatus for providing information associated with immune phenotypes for pathology slide image | |
KR102580419B1 (en) | Method and system for detecting region of interest in pathological slide image | |
US20210366594A1 (en) | Method and system for refining label information | |
US20190259154A1 (en) | Predicting response to immunotherapy using computer extracted features of cancer nuclei from hematoxylin and eosin (h&e) stained images of non-small cell lung cancer (nsclc) | |
CN113096080A (en) | Image analysis method and system | |
JP2018185265A (en) | Information processor, method for control, and program | |
Les et al. | Fusion of FISH image analysis methods of HER2 status determination in breast cancer | |
US10902256B2 (en) | Predicting response to immunotherapy using computer extracted features relating to spatial arrangement of tumor infiltrating lymphocytes in non-small cell lung cancer | |
US20220145401A1 (en) | Method and system for predicting responsiveness to therapy for cancer patient | |
US20220262513A1 (en) | Method and system for training machine learning model for detecting abnormal region in pathological slide image | |
US20220261988A1 (en) | Method and system for detecting region of interest in pathological slide image | |
US11861836B2 (en) | Image analysis method, estimating device, estimating system, and storage medium | |
Arjmand et al. | Fat Droplets Identification in Liver Biopsies using Supervised Learning Techniques | |
Prasetyo et al. | Channel Attention HalfU-Net for Skin Lesion Segmentation | |
Hao et al. | Automatic detection of breast nodule in the ultrasound images using CNN | |
KR20230120992A (en) | Method and apparatus for selecting medical data for annotation | |
Sarshar et al. | Advancing Brain MRI Images Classification: Integrating VGG16 and ResNet50 with a Multi-verse Optimization Method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: LUNIT INC., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YOO, DONGGEUN;OCK, CHANYOUNG;PAENG, KYUNGHYUN;REEL/FRAME:057832/0437 Effective date: 20211012 |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |