WO2019098415A1 - Method for determining whether subject has developed cervical cancer, and device using same - Google Patents


Info

Publication number
WO2019098415A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
cervical cancer
computing device
cervical
analysis information
Prior art date
Application number
PCT/KR2017/013015
Other languages
French (fr)
Korean (ko)
Inventor
정재훈
최성원
Original Assignee
주식회사 버즈폴
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 버즈폴
Priority to CN201780004364.9A priority Critical patent/CN110352461A/en
Priority to PCT/KR2017/013015 priority patent/WO2019098415A1/en
Priority to KR1020177033520A priority patent/KR20190087681A/en
Publication of WO2019098415A1 publication Critical patent/WO2019098415A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/43 Detecting, measuring or recording for evaluating the reproductive systems
    • A61B5/4306 Detecting, measuring or recording for evaluating the reproductive systems for evaluating the female reproductive systems, e.g. gynaecological evaluations
    • A61B5/4318 Evaluation of the lower reproductive system
    • A61B5/4331 Evaluation of the lower reproductive system of the cervix
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/606 Protecting data by securing the transmission between two devices or processes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/40 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Definitions

  • The present invention relates to a method for determining whether a subject has developed cervical cancer, and to a determination apparatus using the same.
  • The determination apparatus according to the present invention acquires a photographed image of the subject's cervix, generates, from the input of the acquired cervical image and based on a machine learning model, analysis information on the incidence of cervical cancer, and provides the generated analysis information so that the user of the determination apparatus, or a user at a remote location, can read whether cervical cancer corresponding to the analysis information has occurred. As a result of the reading, the apparatus acquires and stores evaluation information on the analysis information, and outputs the evaluation information.
  • The determination apparatus according to the present invention can re-learn the machine learning model based on the evaluation information.
  • Cervical cancer ranks among the most common cancers in Korean women; its treatment, such as hysterectomy, can affect pregnancy and childbirth and can cause a sense of loss as a woman. According to statistics for 2013, the number of cervical cancer patients in Korea was 26,207, ranking 4th among female cancers (Ministry of Health and Welfare data). It is included in Korea's screening recommendations for the seven major cancers, was added to the national cancer screening project in 1999, and the rate of early diagnosis is increasing. In recent years, cervical intraepithelial neoplasia, the precancerous condition called "stage 0" cervical cancer, is also on the rise, and annual checkups are recommended for women who have had sexual experience.
  • Because the proportion of cervical intraepithelial neoplasia in young women is increasing, the screening age was lowered from 30 to 20 starting in 2016.
  • Health insurance benefits apply to the cost of screening by cervical cytology examination.
  • Because the false-negative rate of cytology screening (i.e., the rate at which actual disease is missed) reaches 55%, it is recommended that colposcopy be conducted in parallel with the cytology screening test.
  • The worldwide market for cervical cancer screening is estimated at 6.86 trillion won; of this, colposcopy accounts for 30%, about 2 trillion won.
  • FIG. 1 is a conceptual diagram schematically showing cervical cytology examination and colposcopy, which have conventionally been performed to diagnose cervical cancer. Referring to the bottom of FIG. 1, by photographing the cervix (for example, through the colposcope shown in FIG. 1), analyzing the resulting image, and using the analysis result, the misdiagnosis rate of examinations for cervical cancer can be lowered.
  • Conventionally, the medical staff confirm, based on their education and experience, whether cervical cancer has developed from the image of the cervix. This method is repetitive and subjective: reading can take a long time, and accuracy can drop as a result.
  • The market for clinical decision support systems (CDSS) amounts to KRW 24 trillion, growing 25% on average.
  • Accordingly, a CDSS specialized for cervical cancer is proposed: a method that, through a computing device, improves the efficiency of cervical cancer screening and prevents misdiagnosis, enabling medical staff to diagnose cervical cancer more quickly and accurately, together with a determination apparatus for this purpose.
  • According to one aspect of the present invention, a method of determining whether a subject has developed cervical cancer comprises: (a) the computing device acquiring a photographed image of the cervix of the subject, or supporting another device associated with the computing device to acquire it; (b) the computing device generating, from the input of the cervical image and based on a machine learning model of cervical cancer, analysis information on the incidence of cervical cancer in the subject, or supporting the other device to generate it; (c) the computing device providing the generated analysis information, or providing it to the other device, thereby supporting a user of the computing device, or a user at a remote location, in reading whether cervical cancer corresponding to the analysis information has occurred; and (d) the computing device (i) acquiring and storing evaluation information on the analysis information as a result of performing step (c), or supporting the other device to acquire and store it, and (ii) performing a process of outputting the evaluation information, or supporting the other device in outputting it.
  • The method may further comprise the step of (e) re-learning the machine learning model, or allowing the other device to re-learn the machine learning model, based on the evaluation information.
  • Also provided is a computer program recorded on a machine-readable non-volatile storage medium, comprising instructions implemented to perform the above-described method.
  • Also provided is a computing device for determining whether a subject is affected by cervical cancer, the computing device comprising: a communication unit for acquiring a photographed image of the cervix of the subject; and a processor for generating, from the input of the cervical image and based on a machine learning model of cervical cancer, analysis information on the onset of cervical cancer in the subject, or supporting another device to generate it.
  • The processor provides the analysis information, or provides it to the other apparatus, thereby supporting the user of the computing device or a remote user in reading whether cervical cancer corresponding to the analysis information is present. As a result of the reading, the processor (i) acquires and stores evaluation information on the analysis information, or supports the other device to acquire and store it, and (ii) outputs the evaluation information or causes the other device to output it.
  • the processor of the computing device re-learns the machine learning model based on the evaluation information.
  • Compared with the conventional method in which medical staff directly observe a cervical image obtained through a colposcope and individually confirm the state of the cervix based on education and experience, the present invention has the effect of quickly and accurately determining the onset of cervical cancer.
  • The present invention can also facilitate the division of labor in the medical field by making it possible to read cervical images for cervical cancer even at a remote place away from the photographing site.
  • Furthermore, the present invention has the effect of continuously improving determination performance through re-learning according to the method of the present invention.
  • FIG. 1 is a conceptual diagram schematically showing a method of cervical cytology examination and cervical dilatation examination which were conventionally performed to diagnose cervical cancer.
  • FIG. 2 is a diagram showing a main concept for explaining a CNN (convolutional neural network) which is one of the machine learning models used in the present invention.
  • FIG. 3 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a method for determining whether a subject develops cervical cancer according to the present invention.
  • FIG. 4 is a flowchart illustrating an exemplary method for determining the incidence of cervical cancer according to the present invention.
  • 5A to 5E are diagrams illustrating exemplary user interfaces (UIs) provided at respective steps of the method for determining the incidence of cervical cancer according to the present invention.
  • In this specification, 'learning' is a term referring to performing machine learning according to a procedure; it is not intended to refer to mental processes such as human educational activity.
  • FIG. 2 is a diagram showing a main concept for explaining a CNN (convolutional neural network) which is one of the machine learning models used in the present invention.
  • A CNN (convolutional neural network) model can be briefly described as an artificial neural network stacked in multiple layers; in the sense of a network of deep structure, it is referred to as a deep neural network.
  • By learning a large amount of data in a multi-layer network structure, a CNN automatically learns features and trains the network so as to minimize the error of an objective function. It is also described as modeling the connections between nerve cells of the human brain, and is thus a representative artificial intelligence technique.
  • A CNN is a model suitable for the classification of two-dimensional data such as images.
  • A CNN repeats a convolutional layer, which creates feature maps using a plurality of filters (e.g., for points, lines, and surfaces), and a pooling layer (a sub-sampling layer), which reduces the size of the feature maps and extracts features that are invariant to changes in position or rotation. This makes it possible to extract features at various levels, from simple low-level features to complex and meaningful high-level features. Finally, if the features extracted through a fully connected layer are used as input values to an existing classification model, a classifier of higher accuracy can be constructed.
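  • The convolution-plus-pooling mechanism described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the patent's actual model: the 2x2 edge filter and the 6x6 "image" are invented for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image to build a feature map."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fm, size=2):
    """Non-overlapping max pooling: keep the strongest response per window."""
    h, w = fm.shape[0] // size, fm.shape[1] // size
    return fm[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# A vertical-edge filter applied to a toy 6x6 "image" (right half bright).
image = np.zeros((6, 6))
image[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])
fm = conv2d(image, edge_kernel)   # 5x5 feature map; responds only at the edge column
pooled = max_pool(fm)             # 2x2 map: pooling shrinks the map but keeps the edge response
```

The pooled map still registers the edge even though its position is now coarser, which is the position-invariance property the text describes.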
  • FIG. 3 is a conceptual diagram schematically showing an exemplary configuration of a computing device that performs the method of determining whether a subject has developed cervical cancer according to the present invention (hereinafter, the 'cervical cancer incidence determination method').
  • a computing device 300 includes a communication unit 310 and a processor 320.
  • The communication unit 310 can communicate with an external computing device (not shown).
  • Specifically, the computing device 300 may achieve the desired system performance by combining conventional computer hardware (e.g., a computer processor, memory, storage, input and output devices; electronic communication devices such as routers and switches; electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) with computer software (i.e., instructions that cause the computing device to function in a particular manner).
  • the communication unit 310 of such a computing device can send and receive requests and responses to and from other interworking computing devices.
  • The requests and responses may be made over the same TCP session, but are not limited thereto; for example, they may be sent and received as UDP datagrams.
  • the communication unit 310 may include a keyboard, a mouse, and other external input devices for receiving commands or instructions.
  • The processor 320 of the computing device may include a hardware configuration such as a micro processing unit (MPU) or central processing unit (CPU), a cache memory, and a data bus. It may further include the software configuration of an operating system and applications that serve specific purposes.
  • FIG. 4 is a flowchart illustrating an exemplary method for determining the incidence of cervical cancer according to the present invention.
  • Referring to FIG. 4, the method for determining the incidence of cervical cancer according to the present invention begins with the communication unit 310 of the computing device 300 acquiring a photographed image of the cervix of the subject, or supporting another device to acquire it (step S410).
  • For example, the photographed image may be acquired by a predetermined photographing module linked to the computing device 300.
  • Alternatively, the image may be captured by another apparatus located far from the place where the computing device 300 performs the cervical cancer incidence determination method, and then transmitted so that the computing device 300 can acquire it.
  • 5A to 5E are diagrams illustrating exemplary user interfaces (UIs) provided at respective steps of the method for determining the incidence of cervical cancer according to the present invention.
  • For example, as illustrated for step S410, when a press of the 'Request' button 512 is detected, the photographed image 514 obtained by the other device may be transmitted to the computing device 300, so that the computing device 300 acquires the photographed image 514.
  • Information on the subject corresponding to the photographed image 514 (that is, subject information serving as patient information) and information on the input time point at which the image was acquired can also be obtained together; this information may be transmitted to the computing device 300 along with the photographed image 514.
  • Next, an analysis module (not shown) implemented by the processor 320 of the computing device 300 generates, from the input of the photographed image and based on a machine learning model of cervical cancer, analysis information on the onset of cervical cancer in the subject, or supports the other apparatus to generate the analysis information (step S420).
  • Here, the machine learning model is one that the processor 320 has trained using a plurality of previously entered pieces of training information, namely: (i) data of a plurality of cervical photographed images; (ii) data on whether lesions of cervical cancer are present in each of those images; and (iii) where a lesion is present, data indicating in which part of the image the lesion is located.
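  • The three kinds of training information listed above can be represented, for illustration, as a simple record type; the field names below are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrainingRecord:
    """One labeled training example (illustrative field names)."""
    image: bytes                      # (i) cervical photographed image data
    has_lesion: bool                  # (ii) whether a cervical-cancer lesion is present
    # (iii) lesion location as a hypothetical (x, y, width, height) box, if any
    lesion_box: Optional[Tuple[int, int, int, int]] = None

# A record for an image with a lesion annotated at an invented location.
record = TrainingRecord(image=b"...", has_lesion=True, lesion_box=(120, 80, 40, 40))
```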
  • The machine learning model may be a CNN (convolutional neural network) model, or a combination of a CNN and a support vector machine (SVM).
  • Learning can be performed by applying gradient descent and the backpropagation algorithm to the images of the input training information.
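  • As a minimal illustration of gradient descent with backpropagation, a tiny one-hidden-layer network can be trained on synthetic data as follows; every number here is invented for the sketch and has nothing to do with cervical images.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))                      # 20 toy feature vectors
y = (X[:, 0] > 0).astype(float).reshape(-1, 1)    # synthetic binary labels

W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid output
    return h, p

losses, lr = [], 0.5
for _ in range(200):
    h, p = forward(X)
    losses.append(float(np.mean((p - y) ** 2)))   # squared-error objective
    # Backpropagation: apply the chain rule from the output back to each weight.
    dp = 2 * (p - y) / len(X)
    dz2 = dp * p * (1 - p)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # Gradient-descent update: step against the gradient of the objective.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

After the loop, the objective has decreased, which is all that "learning by minimizing the error of the objective function" means here.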
  • the analysis information may include classification information on negative, atypical, positive, and malignant characteristics of the cervical cancer.
  • The classification information may include probability information indicating how confident the classification is.
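  • A common way to obtain such probability information is to pass raw model scores through a softmax; the class names below follow the four categories above, while the scores are hypothetical.

```python
import math

CLASSES = ["negative", "atypical", "positive", "malignant"]

def softmax(scores):
    """Convert raw model scores into probabilities that sum to 1."""
    m = max(scores)                               # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.1, 0.3, -0.5, -1.2]                   # hypothetical model outputs
probs = softmax(scores)
label = CLASSES[probs.index(max(probs))]          # highest-probability class
```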
  • Alternatively, the analysis information may include negative/positive determination information, such as information on whether the subject is negative, or whether the cancer risk is low or high.
  • The analysis information may also include morphological feature information such as acetowhite epithelium, mosaic, erosion or ulceration, irregular surface contour, punctation, atypical vessels, and discoloration.
  • The analysis information may be cataloged so as to correspond to a plurality of photographed images 514, and may be provided together with the subject information 520 and the input time point information 522, as exemplarily shown in FIG. 5B.
  • A suspicion indication (denoted 'Suspicious') 524 may be provided according to the classification information and probability information calculated by the machine learning model, that is, according to whether the onset of cervical cancer is suspected.
  • FIG. 5B also illustrates 'Evaluation' buttons 526, each corresponding to a specific photographed image, displayed so that the user can continue with the steps following step S420 for the selected image.
  • Meanwhile, before step S420, the method for determining the incidence of cervical cancer may further include a step (S415) in which a preprocessing module (not shown) implemented by the processor 320 of the computing device 300 performs preprocessing on the cervical image, or supports the other device to perform the preprocessing.
  • As will be understood by those of ordinary skill in the art, the preprocessing may include at least one of image quality enhancement (for example, through histogram equalization), blurring, and noise processing, so as to achieve robustness to the illumination and noise of the photographed image.
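  • Histogram equalization, one of the preprocessing options mentioned, can be sketched as follows for an 8-bit grayscale image; the low-contrast toy image is synthetic.

```python
import numpy as np

def equalize_histogram(gray):
    """Spread pixel intensities over the full 0-255 range (8-bit grayscale)."""
    hist = np.bincount(gray.ravel(), minlength=256)   # intensity histogram
    cdf = hist.cumsum()                               # cumulative distribution
    cdf_min = cdf[cdf > 0][0]                         # first non-empty bin
    n = gray.size
    # Map each intensity through the normalized CDF to stretch contrast.
    lut = np.round((cdf - cdf_min) / (n - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

# A low-contrast toy image: values squeezed into [100, 120].
img = np.random.default_rng(1).integers(100, 121, size=(32, 32), dtype=np.uint8)
out = equalize_histogram(img)   # contrast is stretched across the full range
```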
  • Next, a reading support module (not shown) implemented by the processor 320 of the computing device 300 provides the generated analysis information, or provides it to the other device, thereby supporting the user of the computing device, or another user at a remote location, in reading whether cervical cancer corresponding to the analysis information has occurred (step S430).
  • Here, encryption and decryption may be applied to the transmission and reception of information between the computing device and the other device; for example, AES 128-bit encryption and decryption may be applied.
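  • The encrypt-before-transmit contract can be illustrated with a symmetric round trip. The XOR keystream below is a toy stand-in for illustration only, not AES; a real deployment would use AES-128 through a vetted cryptography library, as the text specifies, and the key and report contents here are invented.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a shared key (illustration only)."""
    out = b""
    for i in count():
        if len(out) >= length:
            break
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: the same call encrypts and decrypts."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared-16-byte-k"                                   # hypothetical shared key
report = b'{"subject": "anon-001", "result": "negative"}'   # invented payload
ciphertext = xor_cipher(key, report)       # sender "encrypts" before transmission
plaintext = xor_cipher(key, ciphertext)    # receiver decrypts with the same key
```

The point is the round-trip property: only a holder of the shared key recovers the report, mirroring the AES-128 transmission scheme in the text.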
  • In step S430, all or part of the photographed image may be provided on a user interface for image reading by the user or the remote user, as exemplarily shown in FIG. 5C, together with annotation tools (for example, a rectangle, an arrow, text input, and the like) for marking unusual areas during the reading.
  • The analysis information may be processed into a form that is easy for the user to understand and read, and provided through a predetermined display device.
  • For example, the lesion location or the classification information on the lesion included in the analysis information may be displayed in a predetermined format.
  • Using the analysis information provided through the display device, the user of the computing device 300, or the reader at the remote location, may determine through reading whether the onset determination and the classification information are correct, thereby generating evaluation information.
  • That is, the evaluation information includes information on whether the provided analysis information is correct: whether the onset determination included in the analysis information is correct, and whether the classification information included in the analysis information is correct.
  • the computing device 300 may acquire the evaluation information through the communication unit 310.
  • the evaluation information may include information on the quality of the photographed image, for example, information on a technical defect of the photographed image.
  • Examples of such technical defects include: the photographed image being difficult to assess accurately because of excessive mucus or blood; the incidence of cervical cancer being difficult to confirm because of the angle of the image or the position of the photographed region; an absent or insufficient acetic acid reaction; or, even where there is an acetic acid reaction, the image being out of focus, overexposed, or underexposed.
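  • One simple way to flag an out-of-focus image of the kind described is the variance of the Laplacian, a standard sharpness heuristic; this metric is an illustration, as the patent does not specify how technical defects are detected.

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of the Laplacian response: low values suggest an out-of-focus image."""
    # Discrete Laplacian via the 4-neighbour stencil, computed on the interior.
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

rng = np.random.default_rng(2)
sharp = rng.normal(size=(64, 64))      # lots of high-frequency detail
blurry = np.full((64, 64), 0.5)        # perfectly flat: no detail at all
```

`laplacian_variance(sharp)` is large while the flat image scores zero, so a threshold on this value could route defective images back for re-photographing.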
  • In step S430, as exemplarily shown in FIG. 5D, the following may be provided on the user interface: all or part of the photographed image 540; history information 541 of other images previously photographed of the same subject; a subject information exposure area 542 indicating the subject information as entered through the subject information input area 510; a negative/positive determination information input area 543; a morphological feature information input area 544; a technical defect information input area 545 for information on technical defects of the photographed image 540; an artificial intelligence analysis information output area 546 representing the analysis information; and a user opinion input area 547 through which the user (reader) can enter findings based on the photographed image. These elements make it easy for the user of the computing device or the remote user to read whether the onset of cervical cancer corresponding to the analysis information has occurred.
  • Next, the method for determining the incidence of cervical cancer includes a step (S440) in which the computing device (i) acquires and stores evaluation information on the analysis information as a result of performing step S430, or supports the other device to acquire and store it, and (ii) performs a process of outputting the evaluation information, or supports the other device in outputting it.
  • The evaluation information may be processed and provided in the form of a medical result report; an exemplary user interface for this purpose is shown in FIG. 5E, and the medical result report 550 may include information on the onset of cervical cancer and the like.
  • Because the method for determining the incidence of cervical cancer according to the present invention determines whether cervical cancer has occurred based on a previously learned machine learning model, the model can perform increasingly accurate reading as it continues to learn. To take advantage of this, the method may further include a step (S450) in which a re-learning module (not shown) implemented by the processor 320 of the computing device 300 re-learns the machine learning model based on the evaluation information.
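  • The re-learning step can be sketched as a loop that folds reader corrections back into training; the record fields and the `fit` callback below are assumptions made for illustration, not the patent's interfaces.

```python
def relearning_step(model, evaluation_records, fit):
    """Collect reader-corrected examples from evaluation info and refit the model (sketch)."""
    corrected = [(r["image"], r["corrected_label"])
                 for r in evaluation_records
                 if not r["analysis_was_correct"]]
    if corrected:
        fit(model, corrected)   # re-learn only when the reader disagreed somewhere
    return len(corrected)

# Toy usage: a dict stands in for a real model, and `fit` just counts updates.
model = {"updates": 0}
records = [
    {"image": "img-1", "analysis_was_correct": True,  "corrected_label": None},
    {"image": "img-2", "analysis_was_correct": False, "corrected_label": "positive"},
]
n = relearning_step(model, records,
                    fit=lambda m, data: m.update(updates=m["updates"] + 1))
```

Each pass through S440/S450 thus grows the effective training set with exactly the cases the model got wrong, which is why determination performance can improve continuously.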
  • As described above, compared with the conventional method in which medical staff directly examine a cervical image obtained through a colposcope and individually confirm the state of the cervix based on education and experience, the present invention has the effect of quickly and accurately determining the incidence of cervical cancer.
  • An advantage of the techniques described above is that they help prevent mistakes, that is, misdiagnoses, by medical personnel who must determine cervical cancer accurately despite handling a large number of diagnoses.
  • The use of machine learning technology allows the computing device itself to analyze and learn the characteristics and forms of cervical cancer lesions, which physicians otherwise come to know only through many years of education and experience, so that it can assist judgment in cases where the incidence of cervical cancer is difficult to determine.
  • The objects of the technical solution of the present invention, or the portions thereof contributing over the prior art, can be recorded on a machine-readable recording medium in the form of program instructions executable through various computer components.
  • The machine-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • The program instructions recorded on the machine-readable recording medium may be those specially designed and constructed for the present invention, or those known to and usable by persons of ordinary skill in the computer software arts.
  • Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.
  • The hardware device may be configured to operate as one or more software modules for performing the processing according to the present invention, and vice versa.
  • The hardware device may include a processor, such as a CPU or a GPU, coupled to a memory, such as ROM/RAM, for storing program instructions, the processor being configured to execute the instructions stored in the memory, and may further include a communication unit for exchanging signals with external devices.
  • The hardware device may also include a keyboard, a mouse, and other external input devices for receiving commands generated by developers.
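The re-learning flow of step S450 above, in which evaluation information confirmed by the reader is folded back into the training set, can be sketched abstractly as follows. This is a hypothetical illustration only; `StubModel`, `retrain`, and the data used are invented for the example and are not part of the disclosure.

```python
class StubModel:
    """Stand-in for the machine learning model (illustrative only)."""
    def __init__(self):
        self.trained_on = 0

    def fit(self, data):
        # In the real system this would re-run the training procedure;
        # here we only record how many samples were used.
        self.trained_on = len(data)

def retrain(model, train_set, evaluations):
    """Fold the reader's confirmed labels (S440) back into the
    training set and re-train the model (S450)."""
    train_set.extend(evaluations)
    model.fit(train_set)
    return model

model = StubModel()
train_set = [("image_1", "negative"), ("image_2", "positive")]
evaluations = [("image_3", "positive")]  # reader-confirmed result
retrain(model, train_set, evaluations)
```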

Abstract

The present invention relates to a method for determining whether a subject has developed cervical cancer, and a determination device using the same. Specifically, a determination device according to the present invention: acquires a captured image of the subject's cervix; generates, from the input of the acquired cervical image, analysis information on whether the subject has developed cervical cancer, on the basis of a machine learning model for cervical cancer; and provides the generated analysis information so as to support a user of the determination device, or another user at a remote location, in reading whether cervical cancer corresponding to the analysis information has developed, and in acquiring, storing, and outputting evaluation information on the analysis information as a result of that reading.

Description

Method for determining whether a subject has developed cervical cancer, and device using the same
The present invention relates to a method for determining whether a subject has developed cervical cancer, and a determination device using the same. Specifically, the determination device according to the present invention acquires a captured image of the subject's cervix; generates, from the input of the acquired cervical image, analysis information on whether the subject has developed cervical cancer on the basis of a machine learning model for cervical cancer; provides the generated analysis information so as to support a user of the determination device, or another user at a remote location, in reading whether cervical cancer corresponding to the analysis information has developed; acquires and stores evaluation information on the analysis information as a result of that reading; and outputs the evaluation information. Furthermore, the determination device according to the present invention can re-train the machine learning model on the basis of the evaluation information.
Cervical cancer ranks first among the cancers that Korean women fear most, because a hysterectomy can affect pregnancy and childbirth and may cause a sense of loss as a woman. According to 2013 statistics, the number of cervical cancer patients in Korea was 26,207, ranking fourth among cancers in women (Ministry of Health and Welfare data). It is also one of the seven cancers for which screening is recommended in Korea, and since its inclusion in the national cancer screening program in 1999, the rate of early diagnosis has been increasing. Recently, cervical carcinoma in situ (the precancerous stage), sometimes called "stage 0" cervical cancer, has also been on the rise, and women with sexual experience are advised to be screened annually.
Looking at the screening market, the proportion of cervical carcinoma in situ among young women is increasing, so from 2016 the screening age in Korea was lowered from 30 to 20. In particular, unlike other cancers, health insurance benefits are applied to 300% of the screening cost for cervical cytology. However, because the false-negative rate (i.e., misdiagnosis rate) of such screening reaches up to 55%, colposcopic examination of the cervix is recommended as a complementary test. As of 2013, the global cervical cancer screening market was worth about KRW 6.86 trillion, of which colposcopic examination accounted for 30%, or about KRW 2 trillion.
FIG. 1 is a conceptual diagram schematically showing the cervical cytology test and colposcopic examination conventionally performed to diagnose cervical cancer. Referring to the lower part of FIG. 1, when an image of the cervix is obtained through a certain imaging device inserted into the vagina of a female subject (for example, the colposcope shown in FIG. 1), the image can be analyzed and the result used to lower the misdiagnosis rate of cervical cancer screening.
However, when using a conventional colposcope, medical staff must determine, in light of their education and experience, whether cervical cancer has developed in the cervical image. Because this work is repetitive and often ambiguous, it can take a long time even for an experienced physician, and accuracy can suffer as well.
As a technology for assisting such repetitive diagnostic work, clinical decision support systems (CDSS) have been developed in various fields, multiplying the efficiency of physicians' decisions and preventing possible medical errors. The global CDSS market is expected to grow from about KRW 4.2 trillion in 2014 to about KRW 24 trillion by 2020 (an average annual growth of 25%).
The present invention proposes a method for determining the incidence of cervical cancer, and a determination device therefor, which serve as a CDSS specialized for cervical cancer: by means of a computing device, they increase the efficiency of cervical cancer screening, prevent misdiagnosis, and support medical staff in diagnosing cervical cancer more quickly and accurately.
An object of the present invention is to solve the above-mentioned problems and to make it possible to read the presence or absence of cervical cancer lesions quickly and accurately from high-resolution images of the cervix taken at a gynecology clinic or the like.
The characteristic configuration of the present invention for achieving the object described above and realizing the characteristic effects of the present invention described below is as follows.
According to one aspect of the present invention, there is provided a method for determining whether a subject has developed cervical cancer, the method comprising: (a) a computing device acquiring a captured image of the subject's cervix, or supporting another device interworking with the computing device to acquire it; (b) the computing device generating, from the input of the acquired cervical image, analysis information on whether the subject has developed cervical cancer on the basis of a machine learning model for cervical cancer, or supporting the other device to generate it; (c) the computing device providing the generated analysis information, or supporting the other device to provide it, thereby supporting a user of the computing device or another user at a remote location in reading whether cervical cancer corresponding to the analysis information has developed; and (d) the computing device performing (i) a process of acquiring and storing evaluation information on the analysis information as a result of performing step (c), or supporting the other device to acquire and store the evaluation information, and (ii) a process of outputting the evaluation information or supporting the other device to output it.
Preferably, the method further comprises: (e) the computing device re-training the machine learning model on the basis of the evaluation information, or supporting the other device to re-train the machine learning model.
According to another aspect of the present invention, there is also provided a computer program recorded on a machine-readable non-transitory recording medium, comprising instructions implemented to perform the method described above.
According to still another aspect of the present invention, there is provided a computing device for determining whether a subject has developed cervical cancer, the computing device comprising: a communication unit for acquiring a captured image of the subject's cervix; and a processor for generating, from the input of the acquired cervical image, analysis information on whether the subject has developed cervical cancer on the basis of a machine learning model for cervical cancer, or supporting another device interworking through the communication unit to generate it. The processor provides the generated analysis information, or supports the other device to provide it, thereby supporting a user of the computing device or another user at a remote location in reading whether cervical cancer corresponding to the analysis information has developed, and performs (i) a process of acquiring and storing evaluation information on the analysis information as a result of the reading, or supporting the other device to acquire and store the evaluation information, and (ii) a process of outputting the evaluation information or supporting the other device to output it.
Preferably, the processor of the computing device re-trains the machine learning model on the basis of the evaluation information.
According to the present invention, compared with the conventional approach in which medical staff directly view a cervical image obtained through a colposcope and individually confirm the state of the cervix based on education and experience, whether cervical cancer has developed can be determined more quickly and accurately.

Further, according to the present invention, the use of advanced artificial intelligence such as machine learning, and in particular deep learning, has the effect of preventing mistakes by medical staff, that is, misdiagnoses.

In addition, according to the present invention, reading for cervical cancer is made possible even at a remote location away from the site where the cervical image was taken, which promotes a division of labor in the medical field.

The present invention also has the effect that its determination performance can be continuously improved through the re-learning according to the method of the present invention.
The accompanying drawings below, which are attached for use in the description of embodiments of the present invention, are only some of the embodiments of the present invention, and a person of ordinary skill in the art (hereinafter "ordinary artisan") could obtain other drawings based on these drawings without inventive work.
FIG. 1 is a conceptual diagram schematically showing the cervical cytology test and colposcopic examination conventionally performed to diagnose cervical cancer.

FIG. 2 is a diagram showing the main concepts of a convolutional neural network (CNN), one of the machine learning models used in the present invention.

FIG. 3 is a conceptual diagram schematically showing an exemplary configuration of a computing device that performs the method for determining whether a subject has developed cervical cancer according to the present invention.

FIG. 4 is a flowchart exemplarily showing the method for determining the incidence of cervical cancer according to the present invention.

FIGS. 5A to 5E are diagrams exemplarily showing the user interfaces (UIs) provided at the respective steps of the method for determining the incidence of cervical cancer according to the present invention.
The following detailed description of the present invention refers to the accompanying drawings, which show by way of illustration specific embodiments in which the invention may be practiced, in order to clarify the objects, technical solutions, and advantages of the invention. These embodiments are described in sufficient detail to enable an ordinary artisan to practice the invention.

Throughout the detailed description and the claims of the present invention, "learning" is a term that refers to performing machine learning according to a procedure; the ordinary artisan will understand that it is not intended to refer to a mental activity such as human educational activity.

Also, throughout the detailed description and the claims of the present invention, the word "comprise" and its variations are not intended to exclude other technical features, additions, components, or steps. Other objects, advantages, and features of the present invention will become apparent to the ordinary artisan, in part from this description and in part from the practice of the invention. The following examples and drawings are provided by way of illustration and are not intended to limit the invention.

Moreover, the present invention encompasses all possible combinations of the embodiments shown herein. It should be understood that the various embodiments of the present invention, although different, need not be mutually exclusive. For example, a particular shape, structure, and characteristic described herein in connection with one embodiment may be implemented in another embodiment without departing from the spirit and scope of the invention. It should also be understood that the position or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the present invention. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention, if properly described, is limited only by the appended claims, together with the full scope of equivalents to which the claims are entitled. In the drawings, like reference numerals refer to the same or similar functions throughout the several views.

Unless otherwise indicated herein or clearly contradicted by context, items referred to in the singular encompass the plural unless the context otherwise requires. In describing the present invention, a detailed description of a known configuration or function will be omitted when it is judged that it could obscure the gist of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that the ordinary artisan can easily practice the invention.
FIG. 2 is a diagram showing the main concepts of a convolutional neural network (CNN), one of the machine learning models used in the present invention.
Referring to FIG. 2, a convolutional neural network (CNN) model can be briefly described as an artificial neural network stacked in multiple layers. Because it is a network with a deep structure, it is called a deep neural network; as shown in FIG. 2, by training on a large amount of data in this multilayer structure, the network automatically learns the features of each image and is trained in a way that minimizes the error of an objective function. This is sometimes likened to the connections between the neurons of the human brain, and such models have accordingly become representative of artificial intelligence. In particular, the CNN illustrated in FIG. 2 is a model well suited to the classification of two-dimensional images. By alternating convolution layers, which produce feature maps from each region of an image using a plurality of filters, and pooling (sub-sampling) layers, which reduce the size of the feature maps so that features invariant to changes in position or rotation can be extracted, the network can extract features at many levels, from low-level features such as points, lines, and planes up to complex and meaningful high-level features. Finally, if the features extracted through a fully-connected layer are used as the input values of an existing model, a classification model of higher accuracy can be constructed.
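The convolution and pooling operations that such a CNN alternates can be illustrated with a minimal NumPy sketch. This is purely illustrative; the patent provides no implementation, and the 8x8 "image" and edge-detecting kernel below are invented for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image and take
    the sum of elementwise products at each position (a feature map)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: keep the strongest response in each
    size x size block, shrinking the feature map."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# A toy 8x8 "image" containing a vertical edge, and a 2x2 filter
# that responds to left-dark / right-bright transitions.
image = np.zeros((8, 8))
image[:, 4:] = 1.0
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

fmap = conv2d(image, kernel)   # 7x7 feature map; peak response at the edge
pooled = max_pool(fmap)        # 3x3 after 2x2 pooling
```

Stacking such convolution and pooling stages, and ending with a fully-connected classifier, yields the layered structure described above.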
Next, FIG. 3 is a conceptual diagram schematically showing an exemplary configuration of a computing device that performs the method according to the present invention for determining whether a subject has developed cervical cancer (hereinafter the "method for determining the incidence of cervical cancer").

Referring to FIG. 3, the computing device 300 according to an embodiment of the present invention includes a communication unit 310 and a processor 320, and can communicate directly or indirectly with an external computing device (not shown) through the communication unit 310.

Specifically, the computing device 300 may achieve the desired system performance using a combination of typical computer hardware (e.g., devices that may include a computer processor, memory, storage, input and output devices, and other components of conventional computing devices; electronic communication devices such as routers and switches; electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) and computer software (i.e., instructions that cause the computing device to function in a particular manner).

The communication unit 310 of such a computing device can transmit and receive requests and responses to and from other interworking computing devices. As one example, such requests and responses may be carried over the same TCP session, but they are not limited thereto and may, for example, be transmitted and received as UDP datagrams. In addition, in a broad sense, the communication unit 310 may include a keyboard, a mouse, and other external input devices for receiving commands or instructions.

Further, the processor 320 of the computing device may include hardware components such as a micro processing unit (MPU) or central processing unit (CPU), a cache memory, and a data bus. It may further include software components such as an operating system and applications that perform specific purposes.
The method for determining the incidence of cervical cancer according to the present invention will now be described in detail with reference to FIG. 4, which is a flowchart exemplarily showing the method.

Referring to FIG. 4, the method for determining the incidence of cervical cancer according to the present invention first includes a step (S410) in which the communication unit 310 of the computing device 300 acquires a captured image of the subject's cervix, or supports another device interworking with the computing device to acquire it.

In this step (S410), the captured image may be acquired by a certain imaging module interworking with the computing device 300.

The captured image may also be taken and acquired by another device located far from the place where the method for determining the incidence of cervical cancer according to the present invention is performed by the computing device 300; in that case, the computing device 300 can of course acquire it through remote communication.
FIGS. 5A to 5E are diagrams exemplarily showing the user interfaces (UIs) provided at the respective steps of the method for determining the incidence of cervical cancer according to the present invention.

Referring to FIG. 5A, a captured image 514 acquired by another device in step S410 is shown by way of example. For instance, when it detects that the 'Request' button 512 is pressed, the other device may deliver the acquired captured image 514 to the computing device 300 so that the computing device 300 acquires it. As exemplarily shown in FIG. 5A, when the captured image 514 is acquired, subject information (that is, information about the patient corresponding to the captured image 514) may be entered through the subject information input area 510 and acquired together with the captured image 514, and information on the input time (the time of acquisition) may also be acquired; such subject information and input time information may be delivered to the computing device 300 together with the captured image 514.
Next, referring to FIG. 4, the method for determining the incidence of cervical cancer according to the present invention further includes a step (S420) in which an analysis module (not shown) implemented by the processor 320 of the computing device 300 generates, from the input of the acquired cervical image, analysis information on whether the subject has developed cervical cancer on the basis of the machine learning model for cervical cancer, or supports the other device to generate it.

For this purpose, the machine learning model is one that the processor 320 has trained using a large body of previously input training information, namely (i) data of many cervical images, (ii) data on whether a cervical cancer lesion is present in each of those images, and, if a lesion is present, (iii) data indicating in which part of the image the lesion is located. As described above, the machine learning model may be a convolutional neural network (CNN) model, or a model combining a CNN with a support vector machine (SVM).
For example, in a CNN model, training can be performed by applying gradient descent and the backpropagation algorithm on the basis of the images in the input training information.
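As a simplified stand-in for such training, the following sketch trains a single logistic unit by full-batch gradient descent, with the gradient obtained by backpropagating a cross-entropy loss. It is a toy illustration on four hand-made samples, not the patent's CNN or its data.

```python
import numpy as np

# Four 2-feature samples separable by the direction (1, 1):
# positives on one side, mirrored negatives on the other.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.5          # learning rate
for _ in range(200):
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))   # forward pass (sigmoid activation)
    grad_z = (p - y) / len(y)      # backpropagated dLoss/dz for cross-entropy
    w -= lr * (X.T @ grad_z)       # gradient descent step on the weights
    b -= lr * grad_z.sum()         # ... and on the bias

accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

A CNN is trained the same way in principle: the loss gradient is backpropagated through every convolution, pooling, and fully-connected layer, and each layer's parameters take a gradient descent step.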
To increase the accuracy and reliability of the machine learning model, a large amount of training data is required; the more training data there is, the more the accuracy and reliability of the model can improve.
In step S420, the analysis information may include classification information on whether the case is negative, atypical, positive, or malignant with respect to cervical cancer. As is well known to the ordinary artisan, such classification information may include probability information indicating how accurate the classification is. More specifically, the analysis information may include negative/positive determination information, that is, information on whether the case is negative and, if positive, whether the risk is low or high (low cancer risk vs. high cancer risk), and may also include morphological finding information such as acetowhite epithelium, mosaic, erosion or ulceration, irregular surface contour, punctation, atypical vessels, and discoloration.
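One purely hypothetical way to represent such analysis information as a record is sketched below; the patent names these items (classification, probability, risk level, morphological findings) but prescribes no concrete data format, so all field names here are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AnalysisInfo:
    """Hypothetical container for the analysis information of step S420."""
    classification: str                  # "negative" | "atypical" | "positive" | "malignant"
    probability: float                   # how confident the classification is
    risk: Optional[str] = None           # "low" / "high" cancer risk, when positive
    findings: List[str] = field(default_factory=list)  # morphological findings

info = AnalysisInfo(
    classification="positive",
    probability=0.87,
    risk="high",
    findings=["acetowhite epithelium", "punctation"],
)

# A listing such as the one in FIG. 5B might derive its "suspicious"
# marker from fields like these.
suspicious = info.classification in ("positive", "malignant")
```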
In step S420, the analysis information may be provided as a list corresponding to the plurality of captured images 514. As illustrated in FIG. 5B, an indication of whether the onset of cervical cancer is suspected (shown as 'suspicious'; 524) may be provided together with subject information 520 and input-time information 522, according to the classification information and probability information computed by the machine learning model. FIG. 5B also illustrates 'Evaluation' buttons 526, each corresponding to a particular captured image, by which the user can select that image and continue with the steps that follow step S420 for the selected image.
Meanwhile, in order to generate accurate analysis information in step S420, the method for determining the onset of cervical cancer according to the present invention may further include a step (S415), performed before step S420, in which a preprocessing module (not shown) implemented by the processor 320 of the computing device 300 performs preprocessing on the cervicography image, or supports the other device in performing the preprocessing.
Here, the preprocessing may include at least one of RGB-HSV conversion to make the analysis robust to the illumination and noise of the captured image, image-quality enhancement through histogram equalization and the like, blurring, and noise processing; those of ordinary skill in the art will understand that the preprocessing is not limited to these.
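Of the preprocessing steps named above, histogram equalization is simple enough to sketch in pure NumPy (OpenCV's `cv2.equalizeHist` applies essentially the same classic CDF mapping); the low-contrast test image below is synthetic.

```python
import numpy as np

def equalize_histogram(img):
    """Classic histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]             # CDF at the darkest occurring level
    scale = max(img.size - cdf_min, 1)
    # stretch the CDF over the full 0..255 range to build a lookup table
    lut = np.clip(np.round((cdf - cdf_min) / scale * 255.0), 0, 255).astype(np.uint8)
    return lut[img]                        # apply the lookup table per pixel

# Synthetic low-contrast image: intensities squeezed into the 100..130 band
rng = np.random.default_rng(2)
low_contrast = rng.integers(100, 131, size=(64, 64)).astype(np.uint8)
equalized = equalize_histogram(low_contrast)
```

After equalization the intensities span the full 0..255 range, improving contrast; RGB-HSV conversion and blurring, the other steps mentioned, are likewise standard single-call operations in image libraries.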
Referring again to FIG. 4, the method for determining the onset of cervical cancer according to the present invention next further includes a step (S430) in which a reading support module (not shown) implemented by the processor 320 of the computing device 300 provides the generated analysis information, or supports the other device in providing it, thereby supporting the user of the computing device or another user at a remote location in reading whether the cervical cancer corresponding to the analysis information has developed.
Here, for security in the communication with the remote location, when the computing device interworks with the other device, the transmission and reception of information between the computing device and the other device may be carried out with encryption and decryption; for example, AES 128-bit encryption and decryption may be applied.
As part of step S430, as illustrated in FIG. 5C, all or part of the captured image may be presented on a user interface for image reading by the user or by another user at a remote location; while judging whether there is a medically notable region, the user can make whatever annotations are needed (for example, rectangles, arrows, and text input).
Also, the analysis information may here be processed into a form that is easy for a user such as a reading physician to understand and presented through a display device. For example, the location of a lesion included in the analysis information, or the classification information for that lesion, may be displayed in a predetermined format.
As one embodiment of step S430 shown in FIG. 4, evaluation information on whether cervical cancer has developed and on the classification information may be generated by a user of the computing device 300, for example the reading physician at the imaging site, using the analysis information presented through the display device; alternatively, the onset determination and the evaluation of the classification information may be judged by a reading physician at a remote location. Here, the evaluation information may include information on whether the provided analysis information is accurate, that is, whether the onset determination included in the analysis information is right or wrong, and whether the classification information included in the analysis information is right or wrong and, if wrong, which classification would be correct. In the latter case, in the following step (S440), the computing device 300 may obtain that evaluation information through the communication unit 310.
In the above embodiment of step S430, the evaluation information may also include information on the quality of the captured image, for example information on its technical defects. Such a technical defect may be that excessive mucus or blood in the image makes an accurate determination difficult, or that the angle of the image or the position of the imaged region makes it difficult to confirm whether cervical cancer has developed; it may also be an image problem such as an insufficient acetic acid reaction where one should be present, an out-of-focus image, overexposure, or underexposure.
In step S430, as illustrated in FIG. 5D, the following may be provided on the user interface: all or part of the captured image 540; history information 541 on other images previously captured of the same subject; a subject-information display area 542 showing subject information such as that entered through the subject-information input area 510 illustrated in FIG. 5A; a negative/positive determination input area 543 in which the negative/positive determination information can be entered; a morphological-findings input area 544 in which morphological finding information can be entered; a technical-defect information input area 545 in which information on technical defects of the captured image 540 can be entered; an artificial-intelligence analysis output area 546 showing the analysis information derived by the machine learning model; and a user-findings input area 547 in which the user (the reading physician) can enter findings based on the captured image. This makes it easy for the user of the computing device, or another user at a remote location, to read whether the cervical cancer corresponding to the analysis information has developed.
Next, the method for determining the onset of cervical cancer according to the present invention further includes a step (S440) of performing, by the computing device 300, (i) a process of obtaining and storing evaluation information on the analysis information as a result of performing step S430, or supporting the other device in obtaining and storing the evaluation information, and (ii) a process of outputting the evaluation information, or supporting the other device in outputting it.
Here, the evaluation information may be processed into and provided in the form of a medical result report; a user interface provided for this purpose is illustrated in FIG. 5E, and the medical result report 550 may include whether cervical cancer has developed, the classification information, and the like.
Those of ordinary skill in the art will also understand that such a medical result report can be provided through another device located far from the place where the captured image was first obtained (that is, the imaging site) or far from the place where the reading of the captured image in step S430 was performed.
As described above, the method for determining the onset of cervical cancer according to the present invention can determine whether cervical cancer has developed on the basis of a previously trained machine learning model. If the evaluation information is fed back as material for retraining that machine learning model, the model can be made to perform more accurate readings. To take this advantage, one embodiment of the method according to the present invention further includes, as illustrated in FIG. 4, a step (S450) in which a retraining module (not shown) implemented by the processor 320 of the computing device 300 retrains the machine learning model on the basis of the evaluation information.
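The retraining step (S450) can be illustrated, again outside the patent's own disclosure, by the usual pattern of merging reader-evaluated cases into the training pool and refitting the model. A plain logistic-regression stand-in replaces the CNN/SVM here, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_logreg(X, y, epochs=500, lr=0.5):
    """Logistic regression by gradient descent; stands in for the real model."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * X.T @ g / len(X); b -= lr * g.mean()
    return w, b

def predict(w, b, X):
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)

# Initial training pool (features standing in for image-derived features)
X_train = rng.normal(0.0, 1.0, (100, 4))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
w, b = fit_logreg(X_train, y_train)

# Evaluation information from readers: new cases with confirmed labels
X_eval = rng.normal(0.0, 1.0, (30, 4))
y_eval = (X_eval[:, 0] + X_eval[:, 1] > 0).astype(int)

# Retraining step: merge the evaluated cases into the pool and refit
X_train = np.vstack([X_train, X_eval])
y_train = np.concatenate([y_train, y_eval])
w2, b2 = fit_logreg(X_train, y_train)

eval_acc = (predict(w2, b2, X_eval) == y_eval).mean()
```

The design choice this mirrors is the feedback loop in the text: every reader-confirmed or reader-corrected case becomes new labeled training material, so the model's accuracy can grow with clinical use.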
In this way, across all of the embodiments described above, the present invention can determine whether cervical cancer has developed more quickly and accurately than the conventional approach, in which medical staff directly inspect cervicography images obtained through a colposcope and assess the state of the cervix case by case on the basis of training and experience.
An advantage of the technique described here through the above embodiments is that it can prevent mistakes, that is, misdiagnoses, by medical staff who must determine cervical cancer accurately despite a large number of examinations. In particular, machine learning enables the computing device itself to analyze and learn the characteristics and forms of cervical cancer lesions that a physician comes to recognize only through many years of training and experience, and it can therefore assist the determination in cases that a human physician might overlook, or in cases where cervical cancer is difficult to determine.
Based on the description of the above embodiments, those of ordinary skill in the art can clearly understand that the present invention may be achieved through a combination of software and hardware, or by hardware alone. The subject matter of the technical solution of the present invention, or the portions that contribute over the prior art, may be implemented in the form of program instructions executable through various computer components and recorded on a machine-readable recording medium. The machine-readable recording medium may contain program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the present invention, or may be known to and usable by those of ordinary skill in the computer software field. Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
The hardware device may be configured to operate as one or more software modules in order to perform the processing according to the present invention, and vice versa. The hardware device may include a processor, such as a CPU or GPU, coupled with memory such as ROM/RAM for storing program instructions and configured to execute the instructions stored in that memory, and may include a communication unit capable of exchanging signals with an external device. In addition, the hardware device may include a keyboard, a mouse, or other external input devices for receiving commands written by developers.
Although the present invention has been described above with reference to specific details, such as concrete components, and with limited embodiments and drawings, these are provided only to assist a more general understanding of the invention; the invention is not limited to the above embodiments, and those of ordinary skill in the art to which the invention pertains may derive various modifications and variations from this description.
Therefore, the spirit of the present invention should not be confined to the embodiments described above; not only the claims that follow, but also everything modified equally or equivalently to those claims, falls within the scope of the spirit of the invention.
Such equal or equivalent modifications will include, for example, logically equivalent methods that can produce the same results as carrying out the method according to the present invention.

Claims (10)

  1. A method for determining whether a subject has developed cervical cancer, the method comprising:
    (a) acquiring, by a computing device, a captured image of the cervix of the subject, or supporting another device interworking with the computing device in acquiring it;
    (b) generating, by the computing device, from the input of the acquired cervicography image, analysis information on whether the subject has developed cervical cancer on the basis of a machine learning model for cervical cancer, or supporting the other device in generating it;
    (c) providing, by the computing device, the generated analysis information, or supporting the other device in providing it, thereby supporting a user of the computing device or another user at a remote location in reading whether the cervical cancer corresponding to the analysis information has developed; and
    (d) performing, by the computing device, (i) a process of obtaining and storing evaluation information on the analysis information as a result of performing step (c), or supporting the other device in obtaining and storing the evaluation information, and (ii) a process of outputting the evaluation information or supporting the other device in outputting it.
  2. The method of claim 1, further comprising:
    (e) retraining, by the computing device, the machine learning model on the basis of the evaluation information, or supporting the other device in retraining the machine learning model.
  3. The method of claim 1, wherein, in step (b), the analysis information includes classification information indicating whether the finding is negative, atypical, positive, or malignant with respect to the cervical cancer.
  4. The method of claim 1, further comprising, before step (b):
    (b0) performing, by the computing device, preprocessing on the cervicography image, or supporting the other device in performing the preprocessing.
  5. The method of claim 1, wherein the machine learning model includes a convolutional neural network (CNN) model and a support vector machine (SVM) model, and step (b) includes:
    (b1) extracting, by the computing device, features from the cervicography image on the basis of the convolutional neural network model, or supporting the other device in extracting them; and
    (b2) generating, by the computing device, classification information by classifying the cervicography image using the features on the basis of the support vector machine model, or supporting the other device in generating it.
  6. The method of claim 1, wherein, when the computing device interworks with the other device, the transmission and reception of information between the computing device and the other device is carried out with AES 128-bit encryption and decryption.
  7. A computer program comprising instructions implemented to cause a computing device to perform the method of any one of claims 1 to 6.
  8. A computing device for determining whether a subject has developed cervical cancer, comprising:
    a communication unit for acquiring a captured image of the cervix of the subject; and
    a processor for generating, from the input of the acquired cervicography image, analysis information on whether the subject has developed cervical cancer on the basis of a machine learning model for cervical cancer, or supporting another device interworking through the communication unit in generating it,
    wherein the processor:
    provides the generated analysis information, or supports the other device in providing it, thereby supporting a user of the computing device or another user at a remote location in reading whether the cervical cancer corresponding to the analysis information has developed; and
    performs (i) a process of obtaining and storing evaluation information on the analysis information as a result of the reading, or supporting the other device in obtaining and storing the evaluation information, and (ii) a process of outputting the evaluation information or supporting the other device in outputting it.
  9. The computing device of claim 8, wherein the processor retrains the machine learning model on the basis of the evaluation information, or causes the other device to retrain the machine learning model.
  10. The computing device of claim 8, wherein the machine learning model includes a convolutional neural network (CNN) model and a support vector machine (SVM) model, and the processor, in generating the analysis information:
    extracts features from the cervicography image on the basis of the convolutional neural network model, or supports the other device in extracting them; and
    generates classification information by classifying the cervicography image using the features on the basis of the support vector machine model, or supports the other device in generating it.
PCT/KR2017/013015 2017-11-16 2017-11-16 Method for determining whether subject has developed cervical cancer, and device using same WO2019098415A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201780004364.9A CN110352461A (en) 2017-11-16 2017-11-16 For determining the method and apparatus that cervical carcinoma whether occurs in subject
PCT/KR2017/013015 WO2019098415A1 (en) 2017-11-16 2017-11-16 Method for determining whether subject has developed cervical cancer, and device using same
KR1020177033520A KR20190087681A (en) 2017-11-16 2017-11-16 A method for determining whether a subject has an onset of cervical cancer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2017/013015 WO2019098415A1 (en) 2017-11-16 2017-11-16 Method for determining whether subject has developed cervical cancer, and device using same

Publications (1)

Publication Number Publication Date
WO2019098415A1

Family

ID=66539730

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/013015 WO2019098415A1 (en) 2017-11-16 2017-11-16 Method for determining whether subject has developed cervical cancer, and device using same

Country Status (3)

Country Link
KR (1) KR20190087681A (en)
CN (1) CN110352461A (en)
WO (1) WO2019098415A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102155381B1 (en) * 2019-09-19 2020-09-11 두에이아이(주) Method, apparatus and software program for cervical cancer decision using image analysis of artificial intelligence based technology
KR20230099995A (en) * 2021-12-28 2023-07-05 가천대학교 산학협력단 Method for providing information of diagnosing cervical cancer and device for providing information diagnosing cervical cancer using the same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7664300B2 (en) * 2005-02-03 2010-02-16 Sti Medical Systems, Llc Uterine cervical cancer computer-aided-diagnosis (CAD)
US20120283574A1 (en) * 2011-05-06 2012-11-08 Park Sun Young Diagnosis Support System Providing Guidance to a User by Automated Retrieval of Similar Cancer Images with User Feedback
US8503747B2 (en) * 2010-05-03 2013-08-06 Sti Medical Systems, Llc Image analysis for cervical neoplasia detection and diagnosis
WO2017083588A1 (en) * 2015-11-10 2017-05-18 Davey Neil Shivraj Apparatus and method for detecting cervical cancer and tuberculosis
US20170175169A1 (en) * 2015-12-18 2017-06-22 Min Lee Clinical decision support system utilizing deep neural networks for diagnosis of chronic diseases

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8483454B2 (en) * 2008-10-10 2013-07-09 Sti Medical Systems, Llc Methods for tissue classification in cervical imagery
US20120157767A1 (en) * 2010-12-20 2012-06-21 Milagen, Inc. Digital Cerviscopy Device and Applications
CN106780466A (en) * 2016-12-21 2017-05-31 广西师范大学 A kind of cervical cell image-recognizing method based on convolutional neural networks
KR101791029B1 (en) * 2017-05-31 2017-10-27 주식회사 뷰노 Method for diagnosis of protozoal disease and apparatus using the same
CN107220975B (en) * 2017-07-31 2018-03-09 合肥工业大学 Uterine neck image intelligent auxiliary judgment system and its processing method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200139606A (en) * 2019-06-04 2020-12-14 주식회사 아이도트 Cervical cancer diagnosis system
KR102316557B1 (en) * 2019-06-04 2021-10-25 주식회사 아이도트 Cervical cancer diagnosis system
EP4042928A4 (en) * 2019-10-28 2023-11-01 Aidot Inc. Uterine cervical image acquisition apparatus
CN112200253A (en) * 2020-10-16 2021-01-08 武汉呵尔医疗科技发展有限公司 Cervical cell image classification method based on senet

Also Published As

Publication number Publication date
KR20190087681A (en) 2019-07-25
CN110352461A (en) 2019-10-18


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 20177033520

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020177033520

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 02/02/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 17932015

Country of ref document: EP

Kind code of ref document: A1