WO2022149894A1 - Method for training an artificial neural network that provides judgment results for pathology specimens, and computing system for performing the same - Google Patents
Method for training an artificial neural network that provides judgment results for pathology specimens, and computing system for performing the same
- Publication number
- WO2022149894A1 (PCT/KR2022/000269)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- slide image
- pathology
- neural network
- image
- pathological
- Prior art date
Classifications
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
- G06T7/0012—Biomedical image inspection
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V10/82—Image or video recognition using neural networks
- G06V20/698—Microscopic objects: Matching; Classification
- G16H30/40—ICT specially adapted for processing medical images, e.g. editing
- G16H50/20—ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
- G06T2207/10024—Color image
- G06T2207/10056—Microscopic image
- G06T2207/20076—Probabilistic image processing
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
- G06T2207/30096—Tumor; Lesion
Definitions
- The present invention relates to a method for training an artificial neural network that provides judgment results for pathology specimens, and to a computing system for performing the same. More specifically, it relates to a method that enables highly accurate judgments about diseases by training an artificial neural network with pathology slides in which serial sections of a single specimen are stained with a variety of different staining reagents, and to a computing system for performing that method.
- Neighboring cells that surround a tumor, interact with its tumor cells, and influence the growth of the tumor are collectively called the tumor microenvironment.
- Study of the tumor microenvironment is very important for diagnosing the current state of a cancer, predicting its prognosis and its responsiveness to specific treatment methods, and developing new treatment methods.
- Conventionally, IHC staining reagents targeting specific immune cells or proteins expected to exist in the tumor microenvironment were used for its analysis. That is, a pathologist stained a pathology specimen with an IHC staining reagent for a specific target and read the staining result with the naked eye through an optical microscope, in order to determine the positional relationships among the targets and to quantify their amounts.
- The technical task of the present invention is to provide a method and system that train artificial neural networks on pathology slides in which serial sections of a single specimen are stained with H&E or with IHC staining reagents for various targets, so that the network can comprehensively analyze the tumor microenvironment, thereby enabling the current state of a cancer, its prognosis, and its responsiveness to specific treatment methods to be identified with high accuracy.
- In one aspect, the method includes: acquiring a first pathology slide image to an Nth pathology slide image (where N is a natural number of 2 or more), wherein the first to Nth pathology slide images are pathology slide images in which serial sections of a single pathology specimen are stained with different staining reagents; and generating the m-th learning data based on the first to Nth pathology slide images.
- In an embodiment, generating the m-th learning data based on the first to Nth pathology slide images comprises converting the first to Nth pathology slide images into one multi-channel image through channel stacking, and the m-th learning data may include the multi-channel image.
- In an embodiment, the training data includes N channels, and the first to Nth pathology slide images are converted into one multi-channel image through channel stacking.
- In an embodiment, generating the m-th learning data based on the first to Nth pathology slide images includes: specifying the biological tissue region present in each of the first to Nth pathology slide images; registering the first to Nth pathology slide images so that the positions and shapes of the biological tissue regions present in each image match; and converting the registered first to Nth pathology slide images into one multi-channel image through channel stacking, wherein the m-th learning data may include the multi-channel image.
- In an embodiment, the training data includes N channels, and the registered first to Nth pathology slide images are converted into one multi-channel image through channel stacking.
- Another aspect provides a method of providing a judgment result for a predetermined judgment target pathology specimen through an artificial neural network trained by the above-described training method, the method comprising: acquiring, by a computing system, a first judgment target pathology slide image to an Nth judgment target pathology slide image (where N is a natural number of 2 or more), wherein the first to Nth judgment target pathology slide images are pathology slide images in which serial sections of the judgment target pathology specimen are stained with different staining reagents; and outputting, by the computing system, the judgment result determined by the artificial neural network for the judgment target pathology specimen based on the first to Nth judgment target pathology slide images.
- Another aspect provides a computer program installed in a data processing apparatus and recorded on a medium for performing the above-described method.
- Another aspect provides a computer-readable recording medium on which a computer program for performing the above-described method is recorded.
- Another aspect provides an artificial neural network learning system comprising: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of training an artificial neural network, the method comprising: acquiring a first pathology slide image to an Nth pathology slide image, wherein the first to Nth pathology slide images are pathology slide images in which serial sections of a single pathology specimen are stained with different staining reagents; and generating the m-th learning data based on the first to Nth pathology slide images.
- Another aspect provides a system comprising a processor and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of providing a judgment result for a pathology specimen through the artificial neural network trained by the above-described training method, the method comprising: acquiring a first judgment target pathology slide image to an Nth judgment target pathology slide image (where N is a natural number of 2 or more), wherein the first to Nth judgment target pathology slide images are pathology slide images in which serial sections of a predetermined judgment target pathology specimen are stained with different staining reagents; and outputting the judgment result determined by the artificial neural network for the judgment target pathology specimen based on the first to Nth judgment target pathology slide images.
- According to the technical idea of the present invention, by allowing the trained artificial neural network to comprehensively analyze the tumor microenvironment, it is possible to provide a method and system that identify the current state of a cancer, its prognosis, and its responsiveness to specific treatment methods with high accuracy.
- In addition, an effect similar to multiplex IHC can be obtained by overlapping and utilizing multiple pathology slide images generated by widely used staining methods instead of a high-cost, non-standard method such as multiplex IHC. This also increases the accuracy of tumor microenvironment analysis by eliminating the errors that can occur when separating per-target location information from a multiplex IHC result through color filtering.
- FIG. 1 is a diagram schematically illustrating an environment in which a method for learning an artificial neural network and a method for providing a judgment result for a pathological specimen according to the technical spirit of the present invention are performed.
- FIG. 2 is a flowchart illustrating a method for learning a neural network according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating one multi-channel image generated from a plurality of pathology slide images represented in an RGB color model through channel stacking.
- FIG. 4 is a diagram illustrating an example of a process of generating individual learning data according to an embodiment of the present invention.
- FIG. 5 is a flowchart illustrating an example of a method for providing a determination result for a pathological specimen according to an embodiment of the present invention.
- FIG. 6 is a diagram illustrating a schematic configuration of an artificial neural network learning system according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating a schematic configuration of a system for providing a determination result according to an embodiment of the present invention.
- In the present specification, when any one component 'transmits' data to another component, this means that the component may transmit the data to the other component directly or through at least one intermediate component. Conversely, when one component 'directly transmits' data to another component, it means that the data is transmitted from the component to the other component without passing through any intermediate component.
- FIG. 1 is a diagram schematically illustrating an environment in which a method for learning an artificial neural network and a method for providing a judgment result for a pathological specimen according to the technical spirit of the present invention are performed.
- Referring to FIG. 1, the artificial neural network training method according to an embodiment of the present invention may be performed by the neural network learning system 100, and a method for providing a judgment result for a pathology specimen according to an embodiment of the present invention may be performed by the judgment result providing system 200 for the pathology specimen (hereinafter, 'judgment result providing system').
- The neural network learning system 100 may train the artificial neural network 300 to provide diagnostic information, prognosis information, and/or treatment response information for a pathology specimen, and the judgment result providing system 200 may make various judgments (e.g., whether a disease is manifested, prognosis, suitable treatment method, etc.) about a target specimen using the trained artificial neural network 300.
- The neural network learning system 100 and/or the determination result providing system 200 may be computing systems, that is, data processing devices having the computational power to implement the technical idea of the present invention. In general, each may include not only a server, i.e., a data processing device that clients can access through a network, but also a computing device such as a personal computer or a portable terminal. The neural network learning system 100 and/or the determination result providing system 200 may each be implemented as a single physical device, but an average expert in the technical field of the present invention can easily infer that, if necessary, a plurality of physical devices may be organically combined with each other to implement them according to the technical idea of the present invention.
- the neural network learning system 100 may learn the neural network 300 based on training data generated from a plurality of pathological specimens.
- the pathological specimen may be a biopsy collected from various organs of the human body or a living tissue excised by surgery.
- Specifically, the neural network learning system 100 may generate individual training data using digital pathology slide images of serial sections of a pathology specimen, and may train the neural network 300 by feeding the data to its input layer.
- the neural network 300 may be an artificial neural network trained to output a probability value of whether or not a disease is present with respect to a predetermined disease.
- The neural network 300 may output a numerical value, i.e., a probability value, indicating the judgment result (e.g., the likelihood of disease manifestation) for a target specimen based on data input through its input layer.
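The patent does not specify how this probability value is produced; by convention, a binary disease-manifestation output is obtained by passing the network's raw score through a sigmoid (or softmax, in the multi-class case). A minimal illustrative sketch, not the patent's actual implementation:

```python
import math

def sigmoid(z):
    """Map a raw network output (logit) to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative only: a logit of 0 corresponds to a 50% probability of
# disease manifestation; larger logits give probabilities closer to 1.
print(sigmoid(0.0))  # 0.5
```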
- an artificial neural network is a neural network artificially constructed based on the operation principle of human neurons.
- the artificial neural network may be a convolutional neural network or may include a convolutional neural network.
- the learned neural network 300 may be stored in the judgment result providing system 200, and the judgment result providing system 200 may make a judgment on a predetermined diagnostic target sample using the learned artificial neural network.
- the neural network learning system 100 and/or the determination result providing system 200 may be implemented in the form of a subsystem of a predetermined parent system 10 .
- the parent system 10 may be a server.
- Here, the server 10 means a data processing device having the computational capability to implement the technical idea of the present invention. An average expert in the art of the present invention can easily infer that any device capable of performing a specific service, including not only data processing devices that clients can access through a network but also personal computers and mobile terminals, can be defined as a server.
- the neural network learning system 100 and the determination result providing system 200 may be implemented in a separate form.
- FIG. 2 is a flowchart illustrating a method for learning a neural network according to an embodiment of the present invention.
- Referring to FIG. 2, the neural network learning system 100 may acquire a first pathology slide image to an Nth pathology slide image, where N is a natural number of 2 or more (S110).
- the first pathology slide image to the Nth pathology slide image may be pathology slide images in which serial sections of a single pathology specimen are stained with different staining reagents.
- Each section of the pathology specimen may be one slice of the specimen prepared for producing a digital slide image; by sequentially slicing the pathology specimen to make a plurality of glass slides, staining them with different staining reagents, and digitizing them, the first to Nth pathology slide images may be generated.
- the staining reagent may be a reagent for hematoxylin and eosin (H&E) staining or a reagent for IHC (immunohistochemistry) staining of a specific target.
- For example, one pathology specimen is sliced successively, and each sliced section is stained in turn with an H&E staining reagent, a first IHC staining reagent, a second IHC staining reagent, and so on, to make glass slides, which are then digitally imaged to generate a plurality of pathology slide images corresponding to the pathology specimen.
- Meanwhile, the neural network learning system 100 may receive the first to Nth pathology slide images corresponding to a predetermined pathology specimen from an external terminal, or may acquire the first to Nth pathology slide images corresponding to the pathology specimen from a memory device in which they are pre-stored.
- the neural network learning system 100 may generate the m-th learning data based on the first pathological slide image to the N-th pathological slide image (S120).
- In an embodiment, the neural network learning system 100 may generate the m-th learning data through channel stacking. That is, the neural network learning system 100 may convert the first to Nth pathology slide images into one multi-channel image by stacking their channels, and the m-th learning data may include the multi-channel image.
- FIG. 3 is a diagram illustrating one multi-channel image generated from a plurality of pathology slide images represented in an RGB color model through channel stacking. FIG. 3 shows a case of converting slide images of four consecutive sections extracted from a single pathology specimen into a multi-channel image.
- the multi-channel image 20 may be composed of 12 channels, which is the product of 4, which is the number of slide images, and 3, which is the number of channels constituting each slide image.
- the first channel 21-1 may be composed of the R channel value of each pixel of the first slide image
- the second channel 21-2 may be composed of the G channel value of each pixel of the first slide image
- the third channel 21-3 may be composed of the B channel value of each pixel of the first slide image
- the fourth channel 22-1 may be composed of the R channel value of each pixel of the second slide image
- the fifth channel 22-2 may consist of the G channel value of each pixel of the second slide image
- the sixth channel 22-3 may include the B channel value of each pixel of the second slide image.
- the seventh channel 23-1 may be composed of the R channel value of each pixel of the third slide image
- the eighth channel 23-2 may be composed of the G channel value of each pixel of the third slide image
- the ninth channel 23-3 may be composed of the B channel value of each pixel of the third slide image
- the tenth channel 24-1 may be composed of the R channel value of each pixel of the fourth slide image
- the eleventh channel 24-2 may be composed of the G channel value of each pixel of the fourth slide image
- the twelfth channel 24-3 may be composed of the B channel value of each pixel of the fourth slide image.
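The channel-stacking step just described can be sketched in plain Python. The patent prescribes no implementation, so `stack_channels` and the nested-list image representation (each image as H x W rows of (R, G, B) tuples) are illustrative assumptions:

```python
def stack_channels(slides):
    """Stack N RGB slide images (each H x W x 3, as nested lists) into one
    H x W x (3N) multi-channel image, ordered R, G, B per slide."""
    height = len(slides[0])
    width = len(slides[0][0])
    return [[[channel
              for slide in slides          # slide 1, slide 2, ...
              for channel in slide[y][x]]  # that slide's R, G, B at (y, x)
             for x in range(width)]
            for y in range(height)]

# Four 1x1 RGB "slides" yield a single 12-channel pixel, matching the
# 4 x 3 = 12 channel count of the multi-channel image 20 in FIG. 3.
slides = [[[(10 * i + 1, 10 * i + 2, 10 * i + 3)]] for i in range(4)]
multi = stack_channels(slides)
print(len(multi[0][0]))  # 12
```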
- each pathological slide image extracted from a single pathological specimen may be slightly shifted in position or direction in the process of producing the corresponding slide image.
- Therefore, before channel stacking, a process of registering the pathology slide images to one another may need to be performed. A flowchart for this case is shown in FIG. 4.
- the neural network learning system 100 may specify a biological tissue region existing in the first pathology slide image to the Nth pathology slide image ( S121 ).
- a method for the neural network learning system 100 to specify a biological tissue region from an image may vary.
- For example, if information on the biological tissue region has been specified in advance for each pathology slide image, the neural network learning system 100 may specify the biological tissue region using that information.
- the neural network learning system 100 may specify a biological tissue region by using a pre-learned neural network for determining the biological tissue region.
- the biological tissue region may be specified through various known methods.
- The neural network learning system 100 may register the first to Nth pathology slide images so that the positions and shapes of the biological tissue regions present in each of them match (S122).
- Image registration is a widely used technique in the image processing field and refers to a processing technique that transforms different images so that they can be displayed in one coordinate system.
- For example, the method of registering two images may include transforming them so that the contours of the tissue regions included in the images become as similar as possible, or transforming them so that the feature points within the tissue regions match as closely as possible.
- For registration, a matching algorithm based on the similarity between two images, measured through SIFT (Scale-Invariant Feature Transform), Sum of Squared Differences (SSD), Sum of Absolute Differences (SAD), Normalized Cross-Correlation (NCC), and the like, may be used.
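The similarity measures named above (SSD, SAD, NCC) are standard; a minimal sketch over flat lists of pixel intensities follows (the function names are illustrative, and a real registration pipeline would apply them to candidate alignments of whole images):

```python
def ssd(a, b):
    """Sum of Squared Differences: lower means more similar."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def sad(a, b):
    """Sum of Absolute Differences: lower means more similar."""
    return sum(abs(x - y) for x, y in zip(a, b))

def ncc(a, b):
    """Normalized Cross-Correlation: 1.0 means identical up to
    brightness/contrast, which makes it robust to staining differences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den = (sum((x - mean_a) ** 2 for x in a)
           * sum((y - mean_b) ** 2 for y in b)) ** 0.5
    return num / den

# A registration search would warp one image over a set of candidate
# transformations and keep the one maximizing NCC (or minimizing SSD/SAD).
a = [1.0, 2.0, 3.0, 4.0]
print(ncc(a, [2.0, 4.0, 6.0, 8.0]))  # 1.0 (identical up to intensity scale)
```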
- Here, the transformation relation corresponding to the i-th pathology slide image is the transformation relation between the i-th pathology slide image and the registered i-th pathology slide image corresponding to it.
- Thereafter, the neural network learning system 100 may convert the registered first to Nth pathology slide images into one multi-channel image through channel stacking (S123). Since this is similar to what was described above with reference to FIG. 3, a detailed description thereof is omitted.
- a lesion region may be pre-annotated in each pathology slide image, and in this case, the pre-annotated lesion annotation region may be additionally included in the training data.
- the m-th training data may further include the multi-channel lesion annotation region.
- In addition, the neural network learning system 100 may set the known judgment information for the pathology specimen as the label of the m-th learning data.
- After generating a training data set consisting of individual training data for each pathology specimen, the neural network learning system 100 may train the neural network 300 by inputting the generated training data set to the input layer of the neural network 300 (S130 in FIG. 2).
- As described above, the neural network learning method according to the technical idea of the present invention can obtain an effect similar to multiplex IHC by overlapping and utilizing multiple pathology slide images generated by widely used staining methods instead of a high-cost, non-standard method such as multiplex IHC.
- FIG. 5 is a flowchart illustrating an example of a method for providing a determination result for a pathological specimen according to an embodiment of the present invention.
- The method for providing a determination result for a pathology specimen according to FIG. 5 may be performed by the determination result providing system 200, in which the artificial neural network 300 trained by the neural network learning system 100 may be stored.
- the determination result providing system 200 may acquire a first determination target pathology slide image to an Nth determination target pathology slide image of a predetermined determination target pathology specimen ( S210 ).
- Here, the first to Nth judgment target pathology slide images are pathology slide images in which serial sections of the judgment target pathology specimen are stained with different staining reagents.
- The determination result providing system 200 may generate input data based on the first to Nth pathology slide images of the judgment target specimen (S220). Since the process of generating the input data corresponding to the first to Nth pathology slide images of the target specimen is very similar to the process described with reference to FIGS. 3 and 4, a separate description is omitted.
- The determination result providing system 200 may input the input data to the artificial neural network 300 and output a determination result for the judgment target pathology specimen based on the result output by the artificial neural network (S230).
- FIG. 6 is a diagram illustrating a schematic configuration of the artificial neural network learning system 100 according to an embodiment of the present invention, and FIG. 7 is a diagram illustrating a schematic configuration of the determination result providing system 200 according to an embodiment of the present invention.
- the artificial neural network learning system 100 and the determination result providing system 200 may refer to logical configurations having the hardware resources and/or software necessary to implement the technical idea of the present invention, and do not necessarily mean a single physical component or a single device. That is, each system may mean a logical combination of hardware and/or software provided to implement the technical idea of the present invention and, if necessary, may be implemented as a set of logical configurations installed in devices spaced apart from each other, each performing its own function. In addition, each system may refer to a set of components implemented separately for each function or role for implementing the technical idea of the present invention.
- Each component of the artificial neural network learning system 100 and the determination result providing system 200 may be located in different physical devices or may be located in the same physical device.
- the combinations of software and/or hardware constituting each component of the artificial neural network learning system 100 and the determination result providing system 200 may also be located in different physical devices, and the components located in different physical devices may be organically combined with each other to implement each module.
- a module may mean a functional and structural combination of hardware for carrying out the technical idea of the present invention and software for driving the hardware.
- the module may mean a logical unit of predetermined code together with the hardware resources for executing that code, and does not necessarily mean physically connected code or a single type of hardware, as can be easily inferred by an average expert in the technical field of the present invention.
- the artificial neural network learning system 100 may include a storage module 110 , an acquisition module 120 , a generation module 130 , and a learning module 140 .
- of course, according to an embodiment, the artificial neural network learning system 100 may include more components than these.
- the artificial neural network learning system 100 may further include a communication module (not shown) for communicating with an external device, and a control module (not shown) for controlling the components and resources of the artificial neural network learning system 100.
- the storage module 110 may store the artificial neural network 300 to be trained.
- the acquisition module 120 may acquire a first pathology slide image to an Nth pathology slide image obtained by staining serial sections of each single pathological specimen with different staining reagents.
- the generation module 130 may generate individual training data based on the first pathology slide image to the Nth pathology slide image, and may construct a training data set including a plurality of individual training data.
- the learning module 140 may train the artificial neural network 300 based on the training data set.
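The division of labor between the generation module 130 and the learning module 140 can be sketched as follows. This is a minimal illustration under assumptions of my own (NumPy arrays, a generic per-sample `model_step` callable); the patent does not specify a concrete framework or update rule.

```python
import numpy as np

def build_training_set(specimens):
    # Generation module 130: for each specimen, channel-stack its N stained
    # serial-section images into one multi-channel sample paired with a label.
    return [(np.concatenate(images, axis=-1), label) for images, label in specimens]

def train(model_step, dataset, epochs=2):
    # Learning module 140: feed every individual training datum to the
    # network's update step (a callable returning the loss for one sample).
    losses = []
    for _ in range(epochs):
        for x, y in dataset:
            losses.append(model_step(x, y))
    return losses

# Toy usage: one specimen with two stained 2x2 "slides" and a dummy step.
dataset = build_training_set([([np.zeros((2, 2, 3)), np.ones((2, 2, 3))], 1)])
losses = train(lambda x, y: 0.5, dataset, epochs=3)
print(len(losses))  # 3
```

Here `model_step` stands in for one forward/backward pass of the artificial neural network 300; any real implementation would replace it with an actual optimizer step.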
- the determination result providing system 200 may include a storage module 210 , an acquisition module 220 , a generation module 230 , and a determination module 240 .
- some of the above-described components may not necessarily be essential to the implementation of the present invention, and, of course, according to an embodiment, the determination result providing system 200 may include more components than these.
- the determination result providing system 200 may further include a communication module (not shown) for communicating with an external device, and a control module (not shown) for controlling the components and resources of the determination result providing system 200.
- the storage module 210 may store the trained artificial neural network 300.
- the acquisition module 220 may acquire a first determination target pathology slide image to an Nth determination target pathology slide image in which serial sections of a predetermined determination target pathology specimen are stained with different staining reagents.
- the generation module 230 may generate input data based on the first determination target pathology slide image to the Nth determination target pathology slide image.
- the determination module 240 may input the input data into the artificial neural network 300 and make a determination on the determination target based on the predicted value output from the artificial neural network 300.
- the artificial neural network learning system 100 and the determination result providing system 200 may include a processor and a memory for storing a program executed by the processor.
- the processor may include a single-core CPU or a multi-core CPU.
- the memory may include high-speed random access memory and may include non-volatile memory such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory by the processor and other components may be controlled by a memory controller.
- the method according to an embodiment of the present invention may be implemented in the form of computer-readable program instructions and stored in a computer-readable recording medium, and the control program and the target program according to an embodiment of the present invention may also be stored in a computer-readable recording medium.
- the computer-readable recording medium includes all types of recording devices in which data readable by a computer system is stored.
- the program instructions recorded on the recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the software field.
- Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- the computer-readable recording medium may be distributed over computer systems connected through a network, so that computer-readable code can be stored and executed in a distributed manner.
- Examples of the program instructions include not only machine code such as that generated by a compiler, but also high-level language code that can be executed by a device that electronically processes information, such as a computer, using an interpreter or the like.
- the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
- the present invention can be applied to a method for learning an artificial neural network that provides a judgment result for a pathological specimen, and a computing system for performing the same.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- Biophysics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- Mathematical Physics (AREA)
- Computational Linguistics (AREA)
- Multimedia (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Biotechnology (AREA)
- Evolutionary Biology (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Bioethics (AREA)
Abstract
Description
Claims (12)
- A method of training an artificial neural network, comprising: generating, by a neural network learning system, a training data set including M individual training data (where M is a natural number of 2 or more); and training, by the neural network learning system, the artificial neural network based on the training data set (the neural network learning system training the artificial neural network by inputting each of the M individual training data included in the training data set into an input layer of the artificial neural network), wherein the generating of the training data set including the M individual training data includes generating, for every m with 1<=m<=M, m-th training data to be included in the training data set, and the generating of the m-th training data includes: acquiring a first pathology slide image to an N-th pathology slide image (where N is a natural number of 2 or more), the first to N-th pathology slide images being pathology slide images in which serial sections of a single pathological specimen are stained with different staining reagents; and generating the m-th training data based on the first to N-th pathology slide images.
- The artificial neural network training method of claim 1, wherein the generating of the m-th training data based on the first to N-th pathology slide images includes transforming the first to N-th pathology slide images into one multi-channel image through channel stacking, and the m-th training data includes the multi-channel image.
- The artificial neural network training method of claim 1, wherein the generating of the m-th training data based on the first to N-th pathology slide images includes: specifying biological tissue regions present in the first to N-th pathology slide images; registering the first to N-th pathology slide images so that the positions and shapes of the biological tissue regions present in each of the first to N-th pathology slide images coincide; and transforming the registered first to N-th pathology slide images into one multi-channel image through channel stacking, and the m-th training data includes the multi-channel image.
- The artificial neural network training method of claim 3, wherein the registering of the first to N-th pathology slide images so that the positions and shapes of the biological tissue regions present in each of the images coincide includes calculating, for every natural number i with 1<=i<=N, a transformation relation corresponding to the i-th pathology slide image (the transformation relation corresponding to the i-th pathology slide image being the transformation relation between the i-th pathology slide image and the registered i-th pathology slide image corresponding thereto), and the generating of the m-th training data based on the first to N-th pathology slide images further includes: transforming, for every natural number j with 1<=j<=N, a lesion annotation region assigned to the j-th pathology slide image by using the transformation relation corresponding to the j-th pathology slide image; and transforming the transformed lesion annotation region of the first pathology slide image to the transformed lesion annotation region of the N-th pathology slide image into one multi-channel lesion annotation region through channel stacking, and the m-th training data further includes the multi-channel lesion annotation region.
- A method of providing a determination result for a predetermined determination target pathological specimen through an artificial neural network trained by the artificial neural network training method of claim 1, the method comprising: acquiring, by a computing system, a first determination target pathology slide image to an N-th determination target pathology slide image (where N is a natural number of 2 or more), the first to N-th determination target pathology slide images being pathology slide images in which serial sections of the determination target pathological specimen are stained with different staining reagents; and outputting, by the computing system, a determination result for the determination target pathological specimen determined by the artificial neural network based on the first to N-th determination target pathology slide images.
- A computer program installed in a data processing apparatus and recorded on a medium for performing the method of any one of claims 1 to 5.
- A computer-readable recording medium on which a computer program for performing the method of any one of claims 1 to 5 is recorded.
- An artificial neural network learning system comprising: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of training an artificial neural network, the method comprising: generating, by the neural network learning system, a training data set including M individual training data (where M is a natural number of 2 or more); and training, by the neural network learning system, the artificial neural network based on the training data set, wherein the generating of the training data set including the M individual training data includes generating, for every m with 1<=m<=M, m-th training data to be included in the training data set, and the generating of the m-th training data includes: acquiring a first pathology slide image to an N-th pathology slide image (where N is a natural number of 2 or more), the first to N-th pathology slide images being pathology slide images in which serial sections of a single pathological specimen are stained with different staining reagents; and generating the m-th training data based on the first to N-th pathology slide images.
- The artificial neural network learning system of claim 8, wherein the generating of the m-th training data based on the first to N-th pathology slide images includes transforming the first to N-th pathology slide images into one multi-channel image through channel stacking, and the m-th training data includes the multi-channel image.
- The artificial neural network learning system of claim 8, wherein the generating of the m-th training data based on the first to N-th pathology slide images includes: specifying biological tissue regions present in the first to N-th pathology slide images; registering the first to N-th pathology slide images so that the positions and shapes of the biological tissue regions present in each of the first to N-th pathology slide images coincide; and transforming the registered first to N-th pathology slide images into one multi-channel image through channel stacking, and the m-th training data includes the multi-channel image.
- The artificial neural network learning system of claim 10, wherein the registering of the first to N-th pathology slide images so that the positions and shapes of the biological tissue regions present in each of the images coincide includes calculating, for every natural number i with 1<=i<=N, a transformation relation corresponding to the i-th pathology slide image (the transformation relation corresponding to the i-th pathology slide image being the transformation relation between the i-th pathology slide image and the registered i-th pathology slide image corresponding thereto), and the generating of the m-th training data based on the first to N-th pathology slide images further includes: transforming, for every natural number j with 1<=j<=N, a lesion annotation region assigned to the j-th pathology slide image by using the transformation relation corresponding to the j-th pathology slide image; and transforming the transformed lesion annotation region of the first pathology slide image to the transformed lesion annotation region of the N-th pathology slide image into one multi-channel lesion annotation region through channel stacking, and the m-th training data further includes the multi-channel lesion annotation region.
- A system for providing a determination result for a pathological specimen, comprising: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of providing a determination result for a pathological specimen through an artificial neural network trained by the artificial neural network training method of claim 1, the method comprising: acquiring a first determination target pathology slide image to an N-th determination target pathology slide image (where N is a natural number of 2 or more), the first to N-th determination target pathology slide images being pathology slide images in which serial sections of a predetermined determination target pathological specimen are stained with different staining reagents; and outputting a determination result for the determination target pathological specimen determined by the artificial neural network based on the first to N-th determination target pathology slide images.
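The registration and annotation-transformation steps recited in claims 3-4 and 10-11 can be sketched in simplified form. This is an assumption-laden illustration: the claims do not fix a transformation model, so the "transformation relation" is reduced here to a pure translation aligning tissue-region centroids, and all function names are hypothetical; a real registration would fit a fuller (e.g. affine) model.

```python
import numpy as np

def tissue_centroid(mask):
    # Centroid (row, col) of the tissue region specified in a binary mask.
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def estimate_translation(ref_mask, mask):
    # Per-image "transformation relation", reduced to the translation that
    # aligns this slide's tissue centroid with the reference slide's.
    return tissue_centroid(ref_mask) - tissue_centroid(mask)

def apply_translation(mask, t):
    # Warp a binary mask (tissue region or lesion annotation) by the
    # integer-rounded translation t, discarding pixels shifted off-canvas.
    dy, dx = int(round(t[0])), int(round(t[1]))
    out = np.zeros_like(mask)
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    ys, xs = ys + dy, xs + dx
    keep = (ys >= 0) & (ys < h) & (xs >= 0) & (xs < w)
    out[ys[keep], xs[keep]] = 1
    return out

# Toy usage: a slide whose tissue sits one pixel down-right of the
# reference; its lesion annotation is carried through the same transform.
ref = np.zeros((5, 5), dtype=int); ref[1, 1] = 1
moving = np.zeros((5, 5), dtype=int); moving[2, 2] = 1
t = estimate_translation(ref, moving)
annotation = apply_translation(moving, t)  # now overlaps the reference
print(annotation[1, 1])  # 1
```

The key point the claims make is that the same per-slide transformation relation is applied to both the slide image and its lesion annotation region, so the stacked channels and the stacked annotations stay spatially aligned.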
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22736871.9A EP4258283A4 (en) | 2021-01-07 | 2022-01-07 | METHOD FOR TRAINING AN ARTIFICIAL NEURAL NETWORK USING DETERMINATION RESULTS OF A PATHOLOGICAL SAMPLE AND COMPUTER SYSTEM FOR CARRYING OUT THE METHOD |
JP2023540030A JP2024502806A (ja) | 2021-01-07 | 2022-01-07 | 病理検体に対する判断結果を提供する人工ニューラルネットワークの学習方法、及びこれを行うコンピューティングシステム |
US18/271,233 US20240281653A1 (en) | 2021-01-07 | 2022-01-07 | Method for training artificial neural network providing determination result of pathological specimen, and computing system for performing same |
CN202280009439.3A CN116783662A (zh) | 2021-01-07 | 2022-01-07 | 对病理样本提供判断结果的人工神经网络的学习方法及执行其的计算系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020210001912A KR102246319B1 (ko) | 2021-01-07 | 2021-01-07 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
KR10-2021-0001912 | 2021-01-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022149894A1 true WO2022149894A1 (ko) | 2022-07-14 |
Family
ID=75910856
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/000269 WO2022149894A1 (ko) | 2021-01-07 | 2022-01-07 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240281653A1 (ko) |
EP (1) | EP4258283A4 (ko) |
JP (1) | JP2024502806A (ko) |
KR (1) | KR102246319B1 (ko) |
CN (1) | CN116783662A (ko) |
WO (1) | WO2022149894A1 (ko) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102246319B1 (ko) * | 2021-01-07 | 2021-05-03 | 주식회사 딥바이오 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
WO2023287235A1 (ko) * | 2021-07-14 | 2023-01-19 | 주식회사 루닛 | 병리 이미지 분석 방법 및 시스템 |
KR102485414B1 (ko) * | 2021-12-13 | 2023-01-06 | 주식회사 딥바이오 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
KR102393957B1 (ko) | 2021-12-30 | 2022-05-03 | 서울대학교병원 | 전립선암 분석 장치 및 프로그램, 이의 동작 방법 |
WO2023167448A1 (ko) * | 2022-03-03 | 2023-09-07 | 주식회사 루닛 | 병리 슬라이드 이미지를 분석하는 방법 및 장치 |
KR102471515B1 (ko) * | 2022-03-31 | 2022-11-28 | 주식회사 딥바이오 | 면역조직화학 염색 이미지를 분석하기 위한 기계학습모델을 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 |
KR102579826B1 (ko) * | 2022-12-09 | 2023-09-18 | (주) 브이픽스메디칼 | 인공지능 기반 진단 보조 정보 제공 방법, 장치 및 시스템 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0350453B2 (ko) | 1985-03-22 | 1991-08-01 | Nippon Denki Aishii Maikon Shisutemu Kk | |
KR20180066983A (ko) * | 2016-12-11 | 2018-06-20 | 주식회사 딥바이오 | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 그 방법 |
KR20190143510A (ko) * | 2018-06-04 | 2019-12-31 | 주식회사 딥바이오 | 투 페이스 질병 진단 시스템 및 그 방법 |
KR20200016658A (ko) * | 2018-08-07 | 2020-02-17 | 주식회사 딥바이오 | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 방법 |
KR20200044183A (ko) * | 2018-10-05 | 2020-04-29 | 주식회사 딥바이오 | 병리 이미지 검색을 위한 시스템 및 방법 |
KR102108050B1 (ko) * | 2019-10-21 | 2020-05-07 | 가천대학교 산학협력단 | 증강 컨볼루션 네트워크를 통한 유방암 조직학 이미지 분류 방법 및 그 장치 |
KR20200101540A (ko) * | 2019-02-01 | 2020-08-28 | 장현재 | 피부 이미지 기반의 인공지능 딥러닝을 이용한 피부질환 판별용 api 엔진을 구성하는 스마트 피부질환 판별 플랫폼시스템 |
KR102246319B1 (ko) * | 2021-01-07 | 2021-05-03 | 주식회사 딥바이오 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016087592A1 (en) | 2014-12-03 | 2016-06-09 | Ventana Medical Systems, Inc. | Systems and methods for early-stage cancer prognosis |
US11783603B2 (en) * | 2018-03-07 | 2023-10-10 | Verily Life Sciences Llc | Virtual staining for tissue slide images |
-
2021
- 2021-01-07 KR KR1020210001912A patent/KR102246319B1/ko active IP Right Grant
-
2022
- 2022-01-07 CN CN202280009439.3A patent/CN116783662A/zh active Pending
- 2022-01-07 EP EP22736871.9A patent/EP4258283A4/en active Pending
- 2022-01-07 US US18/271,233 patent/US20240281653A1/en active Pending
- 2022-01-07 JP JP2023540030A patent/JP2024502806A/ja active Pending
- 2022-01-07 WO PCT/KR2022/000269 patent/WO2022149894A1/ko active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0350453B2 (ko) | 1985-03-22 | 1991-08-01 | Nippon Denki Aishii Maikon Shisutemu Kk | |
KR20180066983A (ko) * | 2016-12-11 | 2018-06-20 | 주식회사 딥바이오 | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 그 방법 |
KR20190143510A (ko) * | 2018-06-04 | 2019-12-31 | 주식회사 딥바이오 | 투 페이스 질병 진단 시스템 및 그 방법 |
KR20200016658A (ko) * | 2018-08-07 | 2020-02-17 | 주식회사 딥바이오 | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 방법 |
KR20200044183A (ko) * | 2018-10-05 | 2020-04-29 | 주식회사 딥바이오 | 병리 이미지 검색을 위한 시스템 및 방법 |
KR20200101540A (ko) * | 2019-02-01 | 2020-08-28 | 장현재 | 피부 이미지 기반의 인공지능 딥러닝을 이용한 피부질환 판별용 api 엔진을 구성하는 스마트 피부질환 판별 플랫폼시스템 |
KR102108050B1 (ko) * | 2019-10-21 | 2020-05-07 | 가천대학교 산학협력단 | 증강 컨볼루션 네트워크를 통한 유방암 조직학 이미지 분류 방법 및 그 장치 |
KR102246319B1 (ko) * | 2021-01-07 | 2021-05-03 | 주식회사 딥바이오 | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
Also Published As
Publication number | Publication date |
---|---|
EP4258283A4 (en) | 2024-10-16 |
US20240281653A1 (en) | 2024-08-22 |
CN116783662A (zh) | 2023-09-19 |
EP4258283A1 (en) | 2023-10-11 |
JP2024502806A (ja) | 2024-01-23 |
KR102246319B1 (ko) | 2021-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022149894A1 (ko) | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 | |
WO2021230457A1 (en) | Learning method and learning device for training an object detection network by using attention maps and testing method and testing device using the same | |
WO2017022882A1 (ko) | 의료 영상의 병리 진단 분류 장치 및 이를 이용한 병리 진단 시스템 | |
WO2021049729A1 (ko) | 인공지능 모델을 이용한 폐암 발병 가능성 예측 방법 및 분석 장치 | |
WO2017164478A1 (ko) | 미세 얼굴 다이나믹의 딥 러닝 분석을 통한 미세 표정 인식 방법 및 장치 | |
WO2020067632A1 (ko) | 인공지능 영상 학습을 위한 동영상의 학습 대상 프레임 이미지 샘플링 방법, 장치, 프로그램 및 그 영상 학습 방법 | |
WO2019050108A1 (ko) | 데이터 이미지화를 이용한 딥러닝 기반 시스템 이상행위 분석 기술 | |
WO2022163996A1 (ko) | 자기주의 기반 심층 신경망 모델을 이용한 약물-표적 상호작용 예측 장치 및 그 방법 | |
WO2019235828A1 (ko) | 투 페이스 질병 진단 시스템 및 그 방법 | |
WO2022231200A1 (ko) | 유방암 병변 영역을 판별하기 위한 인공 신경망을 학습하기 위한 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 | |
WO2021034138A1 (ko) | 치매 평가 방법 및 이를 이용한 장치 | |
WO2021091027A1 (ko) | 3차원 굴절률 영상과 딥러닝을 활용한 비표지 방식의 3차원 분자 영상 생성 방법 및 장치 | |
WO2022124725A1 (ko) | 화합물과 단백질의 상호작용 예측 방법, 장치 및 컴퓨터 프로그램 | |
CN112151179B (zh) | 影像数据评估方法、装置、设备及存储介质 | |
WO2023191472A1 (ko) | 면역조직화학 염색 이미지를 분석하기 위한 기계학습모델을 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 | |
WO2021235682A1 (en) | Method and device for performing behavior prediction by using explainable self-focused attention | |
WO2022158843A1 (ko) | 조직 검체 이미지 정제 방법, 및 이를 수행하는 컴퓨팅 시스템 | |
CN111242922A (zh) | 一种蛋白质图像分类方法、装置、设备及介质 | |
WO2022197044A1 (ko) | 뉴럴 네트워크를 이용한 방광병변 진단 방법 및 그 시스템 | |
WO2020138549A1 (ko) | 음향 방출 결함 신호 검출 장치 및 방법 | |
WO2019189972A1 (ko) | 치매를 진단을 하기 위해 홍채 영상을 인공지능으로 분석하는 방법 | |
WO2021080043A1 (ko) | 시퀀싱 플랫폼 특이적인 오류를 줄인 체성 돌연변이 검출 장치 및 방법 | |
WO2018221816A1 (en) | Method for determining whether examinee is infected by microorganism and apparatus using the same | |
WO2017010612A1 (ko) | 의료 영상 분석 기반의 병리 진단 예측 시스템 및 방법 | |
WO2022191539A1 (ko) | Turp 병리 이미지로부터 전립선암을 검출하기 위한 용도의 인공 뉴럴 네트워크를 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22736871 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023540030 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18271233 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280009439.3 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2022736871 Country of ref document: EP Effective date: 20230704 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |