WO2022191539A1 - Method for training an artificial neural network for detecting prostate cancer from TURP pathology images, and computing system for performing the same - Google Patents
- Publication number
- WO2022191539A1 (PCT/KR2022/003178)
- Authority
- WO
- WIPO (PCT)
Classifications
- G16H50/20 — ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
- G06N3/08 — Computing arrangements based on biological models; neural networks; learning methods
- G06N3/0464 — Convolutional networks [CNN, ConvNet]
- G06T7/0012 — Image analysis; biomedical image inspection
- G16H30/20 — ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
- G16H30/40 — ICT specially adapted for processing medical images, e.g. editing
- G06T2207/20081 — Training; Learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30024 — Cell structures in vitro; tissue sections in vitro
- G06T2207/30081 — Prostate
- G06T2207/30096 — Tumor; Lesion
Definitions
- The present invention relates to a method for training an artificial neural network for detecting prostate cancer from TURP pathology images, and a computing system for performing the same. More particularly, it relates to a method of training an artificial neural network capable of effectively detecting prostate cancer lesions in TURP pathology images while taking into account the tissue shapes that characteristically appear in such images, and a computing system for performing the same.
- The existing pathology diagnosis method consists of a pathologist visually observing and reading, through an optical microscope, a pathology slide prepared from a specimen.
- the method of converting pathological slides into digital images using a microscope camera connected to a computer and then observing and reading them on a monitor can be said to be the beginning of digital pathology.
- With the advent of digital slide scanners, a method of converting an entire pathology slide into a single digital image (a pathology slide image) and then observing and reading it on a computer monitor has become widespread.
- More recently, such reading has been assisted by a neural network, which is a type of machine learning model (e.g., a deep learning method using a convolutional neural network (CNN)).
- Diagnosis through deep learning using a neural network does not simply automate the experience and knowledge of skilled medical personnel; it finds characteristic features on its own through learning in order to derive the desired answer, and in some cases it discovers disease-related features in the image that are unknown even to skilled medical personnel.
- In general, the diagnosis of a disease through a neural network using a biological image uses a piece of the biological image (e.g., a biological tissue slide image), that is, a patch (also called a tile). A skilled medical practitioner annotates each patch with the state of a specific disease (e.g., whether cancer is expressed), and a plurality of such annotated patches are used as training data to train the neural network.
- A convolutional neural network (CNN) may be used as the neural network.
- Transurethral resection of the prostate (TURP) is often performed to treat benign diseases such as benign prostatic hyperplasia.
- The scale of TURP pathology images is very large. An average of N glass slides of 2 cm x 3 cm are produced per patient (N is an integer of 2 or more), and when these are scanned at 400x magnification into digital images, images totaling N x 80,000 x 120,000 pixels are produced.
- Furthermore, the size of the prostate cancer lesion area within a TURP pathology image is very small. Prostate cancer is detected in less than 20% of patients undergoing the TURP procedure, and even when prostate cancer is found, it mostly appears in only one localized area of the pathology image. In other words, the size and volume of the training data are very large, but the ratio of the area marked as prostate cancer to the entire tissue is very small, so a severe class-imbalance problem may occur.
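To make the scale described above concrete, the tiling arithmetic for one whole-slide image can be sketched in Python. This is an illustrative sketch, not part of the disclosure; the 512-pixel patch size is an assumption.

```python
def tile_coordinates(width, height, tile=512):
    """Top-left corners of patches covering a whole-slide image.

    The final row/column of patches is clipped at the image border.
    """
    xs = range(0, width, tile)
    ys = range(0, height, tile)
    return [(x, y) for y in ys for x in xs]

# One 400x-scanned slide as described above: 80,000 x 120,000 pixels.
coords = tile_coordinates(80_000, 120_000)
print(len(coords))  # 36895 patches from a single slide
```

Even a single slide yields tens of thousands of patches, of which typically only a handful (if any) contain a prostate cancer lesion, which is the class-imbalance problem noted above.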
- the technical task of the present invention is to solve the above-mentioned problems and provide a method for training a machine learning model that can detect prostate cancer in TURP pathological images with high performance. More specifically, it is to provide an efficient method for training a machine learning model that can effectively detect prostate cancer lesions within a TURP pathology image while considering the tissue shape characteristically displayed on the TURP pathology image.
- According to one aspect, the neural network learning system acquires a plurality of pathological images for primary learning, each of which is either a prostate needle biopsy pathology image, which is an image obtained by scanning a slide of a pathological specimen secured through prostate needle biopsy, or a prostatectomy pathology image, which is an image obtained by scanning a slide of a pathological specimen obtained through radical prostatectomy; the neural network learning system primarily trains, using the plurality of pathological images for primary learning, an artificial neural network for judging prostate cancer, i.e., an artificial neural network for detecting prostate cancer from pathology images; the neural network learning system acquires a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP); and the neural network learning system secondarily trains the primarily trained artificial neural network using the plurality of TURP pathology images.
- the artificial neural network may be any one of U-Net, DeepLabv3+, Mask R-CNN, and DenseNet.
- According to another aspect, there is provided a method of providing a determination result for a predetermined determination target TURP pathology image through an artificial neural network trained by the above-described method, the method comprising: obtaining, by the computing system, the determination target TURP pathology image; and outputting, by the artificial neural network, a prostate cancer detection result determined based on the determination target TURP pathology image.
- a computer program installed in a data processing apparatus and recorded in a medium for performing the above-described method.
- a computer-readable recording medium in which a computer program for performing the above-described method is recorded.
- According to another aspect, there is provided a neural network learning system comprising a processor and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the neural network learning system to perform the method of training the artificial neural network described above.
- According to another aspect, there is provided a computing system for providing a determination result for a predetermined determination target TURP pathology image, comprising a processor and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to provide a determination result for the determination target TURP pathology image through the artificial neural network trained by the artificial neural network learning method described above.
- According to another aspect, the neural network learning system acquires a plurality of prostate needle biopsy pathology images, which are images obtained by scanning slides of pathological specimens secured through prostate needle biopsy, and a plurality of prostatectomy pathology images, which are images obtained by scanning slides of pathological specimens obtained through radical prostatectomy; the neural network learning system primarily trains an artificial neural network for judging prostate cancer (an artificial neural network for detecting prostate cancer from pathology images) using the plurality of prostate needle biopsy pathology images and the plurality of prostatectomy pathology images; the neural network learning system acquires a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP); and the neural network learning system secondarily trains the primarily trained artificial neural network using the plurality of TURP pathology images.
- FIG. 1 is a diagram schematically illustrating an environment in which a method for learning an artificial neural network and a method for providing a judgment result for a pathological specimen according to the technical spirit of the present invention are performed.
- FIG. 2 is a flowchart illustrating a method for learning a neural network according to an embodiment of the present invention.
- FIG. 3 is an example of a pathological image of a prostate needle biopsy specimen.
- FIG. 4A is an example of a TURP pathology image including a cauterized prostate tissue region, and FIG. 4B is an enlarged view of the cauterized prostate tissue region.
- FIG. 5A is an example of a TURP pathology image including a non-prostate tissue region, and FIG. 5B is an enlarged view of the non-prostate tissue region.
- FIG. 6 is a diagram illustrating an example of a method for providing a determination result according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating a schematic configuration of an artificial neural network learning system according to an embodiment of the present invention.
- FIG. 8 is a diagram illustrating a schematic configuration of a system for providing a determination result according to an embodiment of the present invention.
- In this specification, when any one component 'transmits' data to another component, this means that the component may transmit the data to the other component directly or through at least one intermediate component. Conversely, when one component 'directly transmits' data to another component, this means that the data is transmitted from the component to the other component without passing through any other component.
- Referring to FIG. 1, the artificial neural network learning method may be performed by the neural network learning system 100, and the method for providing a determination result for a pathological specimen according to an embodiment of the present invention may be performed by the determination result providing system 200 for the pathological specimen (hereinafter, the 'determination result providing system').
- the artificial neural network learning system 100 may learn the artificial neural network 300 .
- the artificial neural network 300 may be a neural network for providing diagnostic information on a pathological specimen obtained through transurethral resection of prostate (TURP).
- the pathological specimen may be a biopsy collected from various organs of the human body or a living tissue excised by surgery.
- the artificial neural network 300 may be an artificial neural network for receiving a TURP pathological image and detecting prostate cancer from the received TURP pathological image.
- The determination result providing system 200 may determine whether prostate cancer is detected in a TURP pathology image using the trained artificial neural network 300.
- The TURP pathology image may refer to an image (or a part of an image) obtained by scanning a slide of a pathological specimen obtained through transurethral resection of the prostate (TURP).
- The TURP pathology image may refer to a whole-slide image obtained by scanning a pathology slide, or, according to an embodiment, to a patch (or tile) obtained by dividing the whole-slide image into a predetermined unit size.
- the neural network 300 may be a machine learning model trained to output a probability value of whether prostate cancer is expressed.
- The artificial neural network 300 may output a numerical value, i.e., a probability value, indicating the determination result (e.g., the possibility of disease expression) for a target sample, based on data input through its input layer.
- The artificial neural network 300 may be a machine learning model that receives a whole-slide image and determines the presence or absence of a lesion due to prostate cancer or detects a lesion area, or, according to an embodiment, a machine learning model that receives a patch and determines the presence or absence of a prostate cancer lesion, or a lesion area, within that patch.
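The probability-value output described above can be illustrated with a minimal sketch. The logistic mapping and the 0.5 decision threshold here are assumptions for illustration, not details taken from the disclosure.

```python
import math

def lesion_probability(logit):
    """Map a raw model output (logit) to a probability of prostate cancer."""
    return 1.0 / (1.0 + math.exp(-logit))

def has_lesion(logit, threshold=0.5):
    """Binary patch-level decision from the probability value."""
    return lesion_probability(logit) >= threshold

print(lesion_probability(0.0))        # 0.5
print(has_lesion(2.0), has_lesion(-2.0))  # True False
```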
- an artificial neural network is a neural network artificially constructed based on the operation principle of human neurons.
- the artificial neural network 300 may be a convolutional neural network mainly used for image analysis or may include a convolutional neural network.
- U-Net (Ronneberger et al., 2015)
- DeepLabv3+ (Chen et al., 2018)
- Mask R-CNN (He et al., 2017)
- Alternatively, a classification model such as ResNet (He et al., 2016) or DenseNet (Huang et al., 2017), trained to determine the presence or absence of prostate cancer on a per-patch basis, may be used.
- The neural network learning system 100 and/or the determination result providing system 200 may be a computing system, that is, a data processing device having the computational capability to implement the technical idea of the present invention. In general, it may include not only a server, i.e., a data processing device accessible by clients through a network, but also computing devices such as personal computers and portable terminals.
- The neural network learning system 100 and/or the determination result providing system 200 may each be implemented as a single physical device, or, if necessary, a plurality of physical devices may be organically combined to implement them, as an average expert in the technical field of the present invention can easily infer.
- the neural network learning system 100 may learn the artificial neural network 300 based on learning data generated from a plurality of pathological specimens.
- The neural network learning system 100 may generate individual training data using a scanned image of a pathological specimen slide, or a part (i.e., a patch) of the scanned slide image, and train the artificial neural network 300 by inputting it into the input layer of the artificial neural network 300.
- the pathological specimen may be a specimen obtained by prostate needle biopsy or prostatectomy, or a specimen obtained by TURP procedure.
- The trained artificial neural network 300 may be stored in the determination result providing system 200, and the determination result providing system 200 may judge a predetermined diagnosis target TURP specimen using the trained artificial neural network.
- the neural network learning system 100 and/or the determination result providing system 200 may be implemented in the form of a subsystem of a predetermined parent system 10 .
- the parent system 10 may be a server.
- The server 10 means a data processing device having the computational capability to implement the technical idea of the present invention. In general, not only a data processing device accessible by a client through a network but also any device capable of providing a specific service, such as a personal computer or a mobile terminal, can be defined as a server, as an average expert in the art of the present invention can easily infer.
- the neural network learning system 100 and the determination result providing system 200 may be implemented in a separate form.
- the artificial neural network 300 may be trained through the learning method of FIG. 2 , and as mentioned above, the artificial neural network 300 is an artificial neural network for detecting prostate cancer from pathological images.
- The artificial neural network 300 is a neural network that performs diagnosis of prostate cancer in TURP pathology images; however, in the process of training the artificial neural network 300, not only TURP pathology images but also images of specimens obtained by prostate needle biopsy or prostatectomy may be used.
- The neural network learning system 100 may acquire a plurality of pathological images for primary learning (S100).
- Each of the plurality of pathological images for primary learning may be either a prostate needle biopsy pathology image, which is an image obtained by scanning a slide of a pathological specimen obtained through prostate needle biopsy, or a prostatectomy pathology image, which is an image obtained by scanning a slide of a pathological specimen obtained through radical prostatectomy.
- a lesion caused by prostate cancer may be previously annotated in each of the plurality of pathological images for primary learning, and the annotated information may be used as a label of the learning data.
- Prostate needle biopsy refers to a method of collecting living prostate tissue through a needle. An example of a pathology image of a prostate needle biopsy specimen is shown in FIG. 3.
- The neural network learning system 100 may primarily train the artificial neural network 300 using the plurality of pathological images for primary learning (S110).
- That is, the artificial neural network 300 is primarily trained with the annotated prostate needle biopsy pathology images and/or the annotated prostatectomy pathology images.
- A predetermined ratio or more of the plurality of pathological images for primary learning may consist of images including a lesion area due to prostate cancer; by doing so, the sensitivity of the artificial neural network 300 may be raised to a certain level or higher.
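Composing the primary-learning set so that at least a predetermined ratio of images contain a lesion can be sketched as follows. This is an illustrative sketch; the 50% ratio, the set size, and the naming scheme are assumptions, not values given in the disclosure.

```python
import random

def compose_primary_set(lesion_images, benign_images,
                        min_lesion_ratio=0.5, size=100, seed=0):
    """Compose a primary-learning set in which at least min_lesion_ratio
    of the images contain a prostate cancer lesion region."""
    rng = random.Random(seed)
    n_lesion = int(size * min_lesion_ratio)
    n_benign = size - n_lesion
    chosen = (rng.sample(lesion_images, n_lesion)
              + rng.sample(benign_images, n_benign))
    rng.shuffle(chosen)
    return chosen

# Hypothetical image identifiers for illustration only.
lesion = [f"lesion_{i}" for i in range(200)]
benign = [f"benign_{i}" for i in range(200)]
batch = compose_primary_set(lesion, benign, min_lesion_ratio=0.5, size=100)
ratio = sum(name.startswith("lesion") for name in batch) / len(batch)
print(ratio)  # 0.5
```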
- The neural network learning system 100 may obtain a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP) (S120).
- Each of the plurality of TURP pathology images used for secondary learning must include at least one of a non-prostate tissue region or a cauterized prostate tissue region, and must not include any prostate cancer lesion region.
- FIGS. 4A and 5A are diagrams illustrating examples of TURP pathology images used for secondary learning.
- FIG. 4A is an example of a TURP pathology image including a cauterized prostate tissue region, and FIG. 4B is an enlarged view of the cauterized prostate tissue region.
- FIG. 5A is an example of a TURP pathology image including a non-prostate tissue region, and FIG. 5B is an enlarged view of the non-prostate tissue region.
- Some of the plurality of TURP pathology images used for secondary learning may be images that include a non-prostate tissue region but no prostate cancer lesion region at all, and the rest may be images that include a cauterized prostate tissue region but no prostate cancer lesion region at all.
- Since none of the plurality of TURP pathology images used for secondary learning includes a prostate cancer lesion region, they can be collectively labeled as benign; that is, they constitute training data that does not require annotation of a separate lesion region.
- The neural network learning system 100 may secondarily train the primarily trained artificial neural network 300 using the plurality of TURP pathology images (S130).
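The overall two-stage procedure (S100 to S130) can be summarized in the following sketch, in which `train` is a placeholder standing in for an actual training pass (model, image identifiers, and labeling convention are hypothetical, not the patented implementation):

```python
def train(model, images, labels):
    """Placeholder for one training pass; records what the model saw."""
    model["history"].append((len(images), set(labels)))
    return model

def two_stage_training(primary_images, turp_images):
    model = {"history": []}
    # S100/S110: primary learning on annotated needle-biopsy and
    # prostatectomy pathology images (labels come from lesion annotations).
    primary_labels = ["cancer" if "cancer" in im else "benign"
                      for im in primary_images]
    model = train(model, primary_images, primary_labels)
    # S120/S130: secondary learning on TURP pathology images that contain
    # no lesion region, so all of them are collectively labeled benign.
    model = train(model, turp_images, ["benign"] * len(turp_images))
    return model

m = two_stage_training(["biopsy_cancer_1", "biopsy_benign_1"],
                       ["turp_1", "turp_2", "turp_3"])
print(m["history"])
```

The second pass needs only the collective benign label, reflecting the point above that the TURP images require no per-lesion annotation.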
- FIG. 6 is a diagram illustrating an example of a method for providing a determination result according to an embodiment of the present invention.
- the determination result providing system 200 may acquire a predetermined determination target TURP pathology image (S200).
- The determination target TURP pathology image is an image obtained by scanning a slide of a predetermined determination target pathological specimen obtained through transurethral resection of the prostate (TURP).
- The determination result providing system 200 may input the determination target TURP pathology image to the artificial neural network 300, and the artificial neural network 300 may output a prostate cancer detection result determined based on the determination target TURP pathology image (S210).
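One simple way to turn patch-level outputs into a slide-level detection result, as in step S210, is to report the patches whose probability exceeds a threshold. This aggregation rule is an illustrative assumption, not one specified by the disclosure.

```python
def detect_prostate_cancer(patch_probs, threshold=0.5):
    """Slide-level detection from per-patch probabilities.

    Returns whether any patch exceeds the threshold and which ones did.
    """
    positives = [i for i, p in enumerate(patch_probs) if p >= threshold]
    return {"cancer_detected": bool(positives), "positive_patches": positives}

result = detect_prostate_cancer([0.02, 0.10, 0.91, 0.04])
print(result)  # {'cancer_detected': True, 'positive_patches': [2]}
```

Keeping the indices of positive patches also allows the lesion to be localized on the slide, consistent with a model that detects a lesion area rather than only a yes/no result.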
- FIG. 7 is a diagram illustrating a schematic configuration of the artificial neural network learning system 100 according to an embodiment of the present invention, and FIG. 8 is a diagram illustrating a schematic configuration of the determination result providing system 200 according to an embodiment of the present invention.
- The artificial neural network learning system 100 and the determination result providing system 200 may mean logical configurations having the hardware resources and/or software necessary to implement the technical idea of the present invention; they do not necessarily mean a single physical component or a single device. That is, each may be a logical combination of hardware and/or software provided to implement the technical idea of the present invention and, if necessary, may be implemented as a set of logical configurations installed in devices spaced apart from each other, each performing its own function. In addition, each may refer to a set of components implemented separately for each function or role for implementing the technical idea of the present invention.
- Each component of the artificial neural network learning system 100 and the determination result providing system 200 may be located in different physical devices or may be located in the same physical device.
- The combination of software and/or hardware constituting each component of the artificial neural network learning system 100 and the determination result providing system 200 may likewise be distributed across different physical devices, and the components located in different physical devices may be organically combined with each other to implement each module.
- a module may mean a functional and structural combination of hardware for carrying out the technical idea of the present invention and software for driving the hardware.
- The module may mean a logical unit of predetermined code and the hardware resources for executing that code, and does not necessarily mean physically connected code or a single type of hardware, as an average expert in the technical field of the present invention can easily infer.
- The artificial neural network learning system 100 may include a storage module 110, an acquisition module 120, and a learning module 130.
- Some of the above-described components may not be essential to the implementation of the present invention, and, according to an embodiment, the artificial neural network learning system 100 may of course include more components than these.
- The artificial neural network learning system 100 may further include a communication module (not shown) for communicating with an external device, and a control module (not shown) for controlling the components and resources of the artificial neural network learning system 100.
- the storage module 110 may store the artificial neural network 300 to be learned.
- The acquisition module 120 may acquire a plurality of pathological images for primary learning, each of which may be either a prostate needle biopsy pathology image, which is an image obtained by scanning a slide of a pathological specimen obtained through prostate needle biopsy, or a prostatectomy pathology image, which is an image obtained by scanning a slide of a pathological specimen obtained through radical prostatectomy.
- The acquisition module 120 may acquire a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP).
- Each of the plurality of TURP pathology images necessarily includes at least one of a non-prostate tissue region or a cauterized prostate tissue region, and includes no prostate cancer lesion region at all.
- The learning module 130 may primarily train an artificial neural network for judging prostate cancer using the plurality of pathological images for primary learning, and may secondarily train the primarily trained artificial neural network using the plurality of TURP pathology images.
- the determination result providing system 200 may include a storage module 210 , an acquisition module 220 , and a determination module 230 .
- some of the above-described components may not necessarily be essential to the implementation of the present invention, and depending on the embodiment, the determination result providing system 200 may of course include more components than these.
- the determination result providing system 200 may further include a communication module (not shown) for communicating with an external device, and a control module (not shown) for controlling the components and resources of the determination result providing system 200.
- the storage module 210 may store the learned artificial neural network 300 .
- the acquisition module 220 may acquire a predetermined determination target TURP pathology image.
- the determination module 230 may input the determination target TURP pathology image to the artificial neural network 300, and may output the prostate cancer detection result determined by the artificial neural network 300 based on the determination target TURP pathology image.
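The determination flow just described can be sketched as follows. All names here (`provide_determination`, `DeterminationResult`, the patch-based aggregation) are illustrative assumptions, not taken from the patent; it only shows the shape of the step: feed the determination-target TURP image to a trained network and emit a detection result.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class DeterminationResult:
    has_cancer: bool           # did any patch exceed the threshold?
    patch_scores: List[float]  # per-patch cancer probabilities

def provide_determination(patches: List[List[float]],
                          neural_net: Callable[[List[float]], float],
                          threshold: float = 0.5) -> DeterminationResult:
    """Feed the determination-target TURP pathology image (split into patches)
    to the trained network and aggregate per-patch scores into one result."""
    scores = [neural_net(p) for p in patches]
    return DeterminationResult(any(s >= threshold for s in scores), scores)

# Toy stand-in network: scores a patch by its mean pixel intensity.
toy_net = lambda patch: sum(patch) / len(patch)
result = provide_determination([[0.1, 0.2], [0.7, 0.9]], toy_net)
```

Aggregating patch-level scores into a slide-level decision (here a simple `any` over a threshold) is one common design choice; the patent itself does not prescribe the aggregation rule.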
- the artificial neural network learning system 100 and the determination result providing system 200 may include a processor and a memory for storing a program executed by the processor.
- the processor may include a single-core CPU or a multi-core CPU.
- the memory may include high-speed random access memory, and may also include non-volatile memory such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to the memory by the processor and other components may be controlled by a memory controller.
- the method according to the embodiment of the present invention may be implemented in the form of computer-readable program instructions and stored in a computer-readable recording medium, and the control program and the target program according to the embodiment of the present invention may also be stored in a computer-readable recording medium.
- the computer-readable recording medium includes all types of recording devices in which data readable by a computer system is stored.
- the program instructions recorded on the recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the software field.
- Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical recording media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- the computer-readable recording medium is distributed in a computer system connected through a network, so that the computer-readable code can be stored and executed in a distributed manner.
- Examples of the program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a device that electronically processes information, such as a computer, using an interpreter or the like.
- the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
- the present invention can be used in a method of training an artificial neural network for detecting prostate cancer from a TURP pathology image, and in a computing system for performing the same.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Pathology (AREA)
- Databases & Information Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Life Sciences & Earth Sciences (AREA)
- Molecular Biology (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Biophysics (AREA)
- Artificial Intelligence (AREA)
- Computational Linguistics (AREA)
- Image Analysis (AREA)
- Business, Economics & Management (AREA)
- Image Processing (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Entrepreneurship & Innovation (AREA)
- Gynecology & Obstetrics (AREA)
Abstract
Description
Claims (6)
- A method of training an artificial neural network for detecting prostate cancer from TURP pathology images, comprising: acquiring, by a neural network learning system, a plurality of pathology images for primary learning, wherein each of the plurality of pathology images for primary learning is either a prostate needle biopsy pathology image, which is an image obtained by scanning a slide of a pathological specimen secured through prostate needle biopsy, or a radical prostatectomy pathology image, which is an image obtained by scanning a slide of a pathological specimen secured through radical prostatectomy; primarily training, by the neural network learning system, an artificial neural network for prostate cancer determination using the plurality of pathology images for primary learning, the artificial neural network for prostate cancer determination being an artificial neural network for detecting prostate cancer from pathology images; acquiring, by the neural network learning system, a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP); and secondarily training, by the neural network learning system, the primarily trained artificial neural network using the plurality of TURP pathology images, wherein each of the plurality of TURP pathology images necessarily includes at least one of a non-prostate tissue region or a cauterized prostate tissue region, and includes no prostate cancer lesion region at all.
- A method of providing a determination result for a given determination-target TURP pathology image through an artificial neural network trained by the method of claim 1, comprising: acquiring, by a computing system, the determination-target TURP pathology image; and outputting, by the computing system, a prostate cancer detection result determined by the artificial neural network based on the determination-target TURP pathology image.
- A computer program installed in a data processing device and recorded on a medium for performing the method of claim 1 or claim 2.
- A computer-readable recording medium on which a computer program for performing the method of claim 1 or claim 2 is recorded.
- A neural network learning system, comprising: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of training an artificial neural network, the method comprising: acquiring a plurality of pathology images for primary learning, wherein each of the plurality of pathology images for primary learning is either a prostate needle biopsy pathology image, which is an image obtained by scanning a slide of a pathological specimen secured through prostate needle biopsy, or a radical prostatectomy pathology image, which is an image obtained by scanning a slide of a pathological specimen secured through radical prostatectomy; primarily training an artificial neural network for prostate cancer determination using the plurality of pathology images for primary learning, the artificial neural network for prostate cancer determination being an artificial neural network for detecting prostate cancer from pathology images; acquiring a plurality of TURP pathology images, which are images obtained by scanning slides of pathological specimens secured through transurethral resection of the prostate (TURP); and secondarily training the primarily trained artificial neural network using the plurality of TURP pathology images, wherein each of the plurality of TURP pathology images necessarily includes at least one of a non-prostate tissue region or a cauterized prostate tissue region, and includes no prostate cancer lesion region at all.
- A computing system for providing a determination result for a given determination-target TURP pathology image, comprising: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the computing system to perform a method of providing a determination result for the determination-target TURP pathology image through an artificial neural network trained by the artificial neural network training method of claim 1, the method comprising: acquiring the determination-target TURP pathology image; and outputting a prostate cancer detection result determined by the artificial neural network based on the determination-target TURP pathology image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22767435.5A EP4290529A1 (en) | 2021-03-08 | 2022-03-07 | Method for training artificial neural network having use for detecting prostate cancer from turp pathological images, and computing system performing same |
US18/280,662 US20240153073A1 (en) | 2021-03-08 | 2022-03-07 | Method for training artificial neural network having use for detecting prostate cancer from turp pathological images, and computing system performing same |
JP2023551750A JP2024509105A (ja) | 2021-03-08 | 2022-03-07 | Turpの病理画像から前立腺癌を検出するための人工ニューラルネットワークを学習する方法、及びこれを行うコンピューティングシステム |
CN202280020209.7A CN117337473A (zh) | 2021-03-08 | 2022-03-07 | 学习用于从turp病理图像中检测前列腺癌的人工神经网络的方法及执行该方法的计算系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2021-0029876 | 2021-03-08 | ||
KR1020210029876A KR102316525B1 (ko) | 2021-03-08 | 2021-03-08 | Turp 병리 이미지로부터 전립선암을 검출하기 위한 용도의 인공 뉴럴 네트워크를 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 |
Publications (1)
Publication Number | Publication Date |
---|---|
- WO2022191539A1 (ko) | 2022-09-15 |
Family
ID=78275729
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/003178 WO2022191539A1 (ko) | 2021-03-08 | 2022-03-07 | Turp 병리 이미지로부터 전립선암을 검출하기 위한 용도의 인공 뉴럴 네트워크를 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240153073A1 (ko) |
EP (1) | EP4290529A1 (ko) |
JP (1) | JP2024509105A (ko) |
KR (1) | KR102316525B1 (ko) |
CN (1) | CN117337473A (ko) |
WO (1) | WO2022191539A1 (ko) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102395664B1 (ko) | 2022-01-26 | 2022-05-09 | 주식회사 몰팩바이오 | Hsv 색 공간과 clahe를 이용한 병리 슬라이드 이미지 색감 표준화 방법 및 장치 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6004267A (en) * | 1997-03-07 | 1999-12-21 | University Of Florida | Method for diagnosing and staging prostate cancer |
JP2019148473A (ja) * | 2018-02-27 | 2019-09-05 | シスメックス株式会社 | 画像解析方法、画像解析装置、プログラム、学習済み深層学習アルゴリズムの製造方法および学習済み深層学習アルゴリズム |
KR20190143510A (ko) * | 2018-06-04 | 2019-12-31 | 주식회사 딥바이오 | 투 페이스 질병 진단 시스템 및 그 방법 |
WO2020005815A1 (en) * | 2018-06-29 | 2020-01-02 | Miraki Innovation Think Tank, Llc | Miniaturized intra-body controllable medical device employing machine learning and artificial intelligence |
KR20200092803A (ko) * | 2019-01-25 | 2020-08-04 | 주식회사 딥바이오 | 준-지도학습을 이용하여 질병의 발병 영역에 대한 어노테이션을 수행하기 위한 방법 및 이를 수행하는 진단 시스템 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101944536B1 (ko) | 2016-12-11 | 2019-02-01 | 주식회사 딥바이오 | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 그 방법 |
KR102100698B1 (ko) * | 2019-05-29 | 2020-05-18 | (주)제이엘케이 | 앙상블 학습 알고리즘을 이용한 인공지능 기반 진단 보조 시스템 |
- 2021
- 2021-03-08 KR KR1020210029876A patent/KR102316525B1/ko active IP Right Grant
- 2022
- 2022-03-07 JP JP2023551750A patent/JP2024509105A/ja active Pending
- 2022-03-07 WO PCT/KR2022/003178 patent/WO2022191539A1/ko active Application Filing
- 2022-03-07 CN CN202280020209.7A patent/CN117337473A/zh active Pending
- 2022-03-07 US US18/280,662 patent/US20240153073A1/en active Pending
- 2022-03-07 EP EP22767435.5A patent/EP4290529A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240153073A1 (en) | 2024-05-09 |
KR102316525B1 (ko) | 2021-10-22 |
CN117337473A (zh) | 2024-01-02 |
JP2024509105A (ja) | 2024-02-29 |
EP4290529A1 (en) | 2023-12-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021073380A1 (zh) | 一种图像识别模型训练的方法、图像识别的方法及装置 | |
JP2021519663A (ja) | 内視鏡画像の処理方法、システム、コンピュータデバイス及びコンピュータプログラム | |
WO2018106005A1 (ko) | 뉴럴 네트워크를 이용한 질병의 진단 시스템 및 그 방법 | |
WO2021182889A2 (ko) | 영상 기반의 안질환 진단 장치 및 방법 | |
WO2017069596A1 (ko) | Cad기반 디지털 엑스레이의 자동 결핵 진단 예측 시스템 | |
WO2022149894A1 (ko) | 병리 검체에 대한 판단 결과를 제공하는 인공 뉴럴 네트워크의 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 | |
WO2021071288A1 (ko) | 골절 진단모델의 학습 방법 및 장치 | |
WO2020032562A2 (ko) | 생체 이미지 진단 시스템, 생체 이미지 진단 방법, 및 이를 수행하기 위한 단말 | |
WO2019235828A1 (ko) | 투 페이스 질병 진단 시스템 및 그 방법 | |
WO2019098415A1 (ko) | 자궁경부암에 대한 피검체의 발병 여부를 판정하는 방법 및 이를 이용한 장치 | |
WO2020111754A9 (ko) | 세미 슈퍼바이즈드 학습을 이용한 진단 시스템 제공방법 및 이를 이용하는 진단 시스템 | |
WO2022191539A1 (ko) | Turp 병리 이미지로부터 전립선암을 검출하기 위한 용도의 인공 뉴럴 네트워크를 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 | |
WO2022197044A1 (ko) | 뉴럴 네트워크를 이용한 방광병변 진단 방법 및 그 시스템 | |
CN109460717A (zh) | 消化道共聚焦激光显微内镜病变图像识别方法及装置 | |
WO2021040327A1 (ko) | 심혈관 질환 위험 인자 예측 장치 및 방법 | |
WO2023191472A1 (ko) | 면역조직화학 염색 이미지를 분석하기 위한 기계학습모델을 학습하는 방법 및 이를 수행하는 컴퓨팅 시스템 | |
Xu et al. | Upper gastrointestinal anatomy detection with multi‐task convolutional neural networks | |
WO2022231329A1 (ko) | 생체 이미지 조직 표시 방법 및 장치 | |
WO2021201582A1 (ko) | 피부 병변의 원인 분석 방법 및 장치 | |
WO2022158843A1 (ko) | 조직 검체 이미지 정제 방법, 및 이를 수행하는 컴퓨팅 시스템 | |
WO2016204535A1 (ko) | 의료 검사를 위한 이미지 분석 관리 방법 및 서버 | |
WO2021002669A1 (ko) | 병변 통합 학습 모델을 구축하는 장치와 방법, 및 상기 병변 통합 학습 모델을 사용하여 병변을 진단하는 장치와 방법 | |
WO2023121051A1 (ko) | 환자 정보 제공 방법, 환자 정보 제공 장치, 및 컴퓨터 판독 가능한 기록 매체 | |
WO2020246676A1 (ko) | 자궁경부암 자동 진단 시스템 | |
WO2022231200A1 (ko) | 유방암 병변 영역을 판별하기 위한 인공 신경망을 학습하기 위한 학습 방법, 및 이를 수행하는 컴퓨팅 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22767435 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023551750 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18280662 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280020209.7 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2022767435 Country of ref document: EP Effective date: 20230907 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |