CN110858396A - System for generating cervical learning data and method for classifying cervical learning data - Google Patents


Info

Publication number: CN110858396A
Application number: CN201910137502.1A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: image data, cervix, classification, captured image
Inventors: 郑载训, 崔成源
Current and original assignee: Buzzpole Corp (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation as to the accuracy of the list)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application filed by Buzzpole Corp
Publication of CN110858396A

Classifications

    • G06T 7/0012 — Biomedical image inspection (G Physics; G06 Computing; G06T Image data processing or generation; G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G16H 50/20 — ICT specially adapted for medical diagnosis, for computer-aided diagnosis, e.g. based on medical expert systems (G16H Healthcare informatics)
    • G06F 18/00 — Pattern recognition (G06F Electric digital data processing)
    • G06T 2207/10024 — Color image (image acquisition modality)
    • G06T 2207/10068 — Endoscopic image (image acquisition modality)
    • G06T 2207/20081 — Training; learning (special algorithmic details)
    • G06T 2207/20084 — Artificial neural networks [ANN] (special algorithmic details)
    • G06T 2207/30004 — Biomedical image processing (subject of image)


Abstract

The present invention relates to a cervical learning system and, more particularly, to a system for generating cervical learning data for deep learning of the cervix and a method of classifying cervical learning data in the system, wherein the method includes: receiving captured image data of an unclassified cervix from an external device; classifying the captured image data of the unclassified cervix based on a neural network algorithm according to a plurality of multi-level classification criteria; and generating and storing classification-criterion-specific learning data from the captured image data of the cervix classified according to the classification criteria, or learning the generated classification-criterion-specific learning data.

Description

System for generating cervical learning data and method for classifying cervical learning data
Technical Field
The present invention relates to a cervical learning system. More particularly, the present invention relates to a system for generating cervical learning data for deep learning of the cervix and a method of classifying cervical learning data in the system.
Background
Cervical cancer ranks among the most feared cancer types for Korean women, because a hysterectomy may affect pregnancy or childbirth, or cause a perceived loss of femininity.
According to 2013 statistics, there were 26,207 cervical cancer patients in Korea, ranking fourth among female cancers (Korean Ministry of Health and Welfare data). In addition, cervical cancer is one of the seven major cancers recommended for screening in Korea, and since cervical cancer was incorporated into the national cancer screening program in 1999, its early diagnosis rate has been increasing. In recent years, carcinoma in situ (CIS) of the cervix, a precancerous condition referred to as "stage 0" cervical cancer, has also steadily increased, and women with sexual experience are advised to be examined every year.
Trends in the cervical cancer screening market show that the target screening age has been lowered from 30 to 20 as the rate of CIS in young women has increased since 2016. In particular, unlike for other cancers, national health insurance covers 100% of the cost of cervical cytology-based screening tests. However, since the false negative (misdiagnosis) rate is as high as 55%, it is recommended to supplement screening tests with cervicography. The global cervical cancer screening market is valued at 6.86 trillion won, of which cervical imaging accounts for about 30%, or roughly 2 trillion won.
Fig. 1 is a diagram schematically illustrating a cervical cytological examination and cervical imaging, which are commonly used for diagnosing cervical cancer. Referring to the lower part of fig. 1, when a captured image of the cervix uteri is acquired from the outside of the vagina of a female subject by a predetermined photographing apparatus (e.g., a cervical colposcope shown in fig. 1), the result of analyzing the captured image may be used to reduce the misdiagnosis rate of cervical cancer.
However, with a conventional cervical colposcope, medical professionals must judge from cervical images, based on their training and experience, whether cervical cancer is present. This process is repetitive and subjective, time-consuming even for experienced physicians, and prone to reduced accuracy.
To overcome these drawbacks, devices for determining cervical cancer incidence have been introduced, which acquire captured images of the cervix, analyze the acquired cervical images with a machine-learned model of cervical cancer, and provide analysis information about whether the subject has cervical cancer.
A key factor in evaluating the performance of such a determination device is whether the images used for learning have been accurately classified and organized. If the data classification is not performed accurately and clearly, the accuracy of the cervical cancer incidence analysis decreases. Unlike general medical images, colposcopic images of cervical cancer may appear in different forms depending on the imaging environment and the operator. Therefore, a device for determining cervical cancer incidence must classify the images to be used for learning according to clear and strict criteria, and perform learning accordingly.
(Prior art document)
(patent document)
(patent document 1) Korean patent application No. 10-0850347
(patent document 2) Korean patent laid-open publication No. 10-2016-0047720
Disclosure of Invention
1. Technical problem
It is a technical object of the present invention to provide a method of classifying cervical learning data for deep learning of the cervix, which classifies the cervical data necessary for accurately diagnosing the presence or absence of cervical cancer lesions according to accurate criteria.
A further technical object of the present invention is to provide a system for generating cervical learning data and a method of classifying cervical learning data that enable accurate cervical diagnosis by preventing over-learning of cervical image data of a specific shape and the phenomenon in which a specific type of image is not properly learned.
2. Means for solving the problems
A method executable in a computer system for generating cervical learning data, the method comprising: receiving captured image data of an unclassified cervix from an external device; classifying the captured image data of the unclassified cervix based on a neural network algorithm according to a plurality of multi-level classification criteria; and generating and storing classification criterion-specific learning data from the captured image data of the cervix classified according to the classification criterion, or learning the generated classification criterion-specific learning data.
In the method according to the present invention, it is characterized in that the generating learning data and the learning further include generating additional learning data for controlling a numerical balance of the learning data specific to the classification criterion.
In the method according to the present invention, it is characterized in that the additional learning data is generated based on learning data specific to each of the classification criteria.
In a method according to the invention, it is characterized in that the plurality of multi-level classification criteria comprises at least two or more of a first level classification criterion based on color, a second level classification criterion based on the size of the cervix in the captured image data, and a third level classification criterion based on a combination of color and shape in the cervix image data.
In the method according to the invention, it is characterized in that the plurality of multilevel sorting criteria further comprises a fourth level sorting criterion based on exposure and focus.
In the method according to the present invention, it is characterized in that classifying the image data includes: first classifying the captured image data of the unclassified cervix according to a first color-based classification criterion; secondarily classifying the captured image data of the unclassified cervix according to a second-level classification criterion based on a size of the cervix in the primarily classified captured image data; and classifying the captured image data of the unclassified cervix three times according to a third-level classification criterion based on a combination of color and shape in the secondarily classified cervix image data.
In the method according to the invention, it is characterized in that the captured image data of the cervix, which has not been classified after the tertiary classification, is classified four times according to a fourth classification criterion based on exposure and focus.
In the method according to the present invention, it is characterized in that the first-stage classification criterion includes a color value as a classification reference value for identifying each of at least one of an acetic acid reaction image, a lugol solution reaction image, a green filter image, and a general image.
In the method according to the present invention, it is characterized in that the third-level classification criterion includes a combination of a color value and a shape as a classification reference value for identifying each of at least one of blood, mucus, a ring, a colposcope, a treatment trace, and a surgical instrument in the cervical image data.
A system for generating cervical learning data, the system comprising: an image receiving unit configured to receive captured image data of an unclassified cervix from an external device allowing transmission and reception of data; an image data classification unit configured to classify the captured image data of the unclassified cervix based on a neural network algorithm according to a plurality of multi-level classification criteria; a learning data generation unit configured to generate learning data specific to a classification criterion from the captured image data of the cervix uteri classified according to the classification criterion, and store the learning data or transmit the learning data to an artificial intelligence learning system for learning; and a data storage unit configured to store the multi-level classification criteria, the captured image data of the unclassified cervix, and the generated learning data.
In the system according to the present invention, it is characterized in that the learning data generation unit further generates additional learning data for controlling a numerical balance of the generated classification criterion-specific learning data.
In the system according to the present invention, it is characterized in that the image data classifying unit classifies the captured image data of the unclassified cervix using at least two or more of a first-level classification criterion based on color, a second-level classification criterion based on a size of the cervix in the captured image data, and a third-level classification criterion based on a combination of color and shape in the cervix image data.
In the system according to the present invention, it is characterized in that the image data classifying unit further includes a fourth-level classification criterion based on exposure and focus, and classifies the captured image data of the cervix that is not classified.
In the system according to the present invention, it is characterized in that the image data classifying unit primarily classifies the captured image data of the unclassified cervix according to a primary color-based classification criterion, secondarily classifies the captured image data of the unclassified cervix according to a secondary classification criterion based on a size of the cervix in the primarily classified captured image data, and thirdly classifies the captured image data of the unclassified cervix according to a tertiary classification criterion based on a combination of a color and a shape in the secondarily classified cervix image data.
In the system according to the present invention, it is characterized in that the image data classifying unit classifies the captured image data of the cervix that has not been classified yet after the tertiary classification four times, according to a fourth-order classification criterion based on exposure and focus.
In the system according to the present invention, it is characterized in that the image data classifying unit classifies the captured image data of the unclassified cervix according to a first-stage color-based classification standard, and the first-stage classification standard includes a color value as a classification reference value for identifying each of at least one of an acetic acid reaction image, a Lugol solution reaction image, a green filter image, and a general image.
In the system according to the present invention, it is characterized in that the image data classifying unit classifies the captured image data of the unclassified cervix three times according to a third-level classification criterion based on a combination of a color and a shape in the cervical image data in the captured image data of the unclassified cervix, and the third-level classification criterion includes a combination of a color value and a shape as a classification reference value for identifying at least one of blood, mucus, a ring, a colposcope, a treatment mark, and a surgical instrument in the cervical image data.
3. Advantageous effects
According to the present invention, the system for generating cervical learning data has the advantageous effect that, since learning data is generated by classifying captured image data of an unclassified cervix according to multi-level classification criteria, the presence or absence of cervical lesions can be diagnosed more accurately than with existing systems.
A further advantageous effect is that additional learning data for controlling the numerical balance of the classification-criterion-specific learning data is generated and used for learning, which prevents cervical image data of a specific shape from being over-learned and specific types of images from being under-learned, making it possible to accurately diagnose whether a cervical lesion exists.
Drawings
Fig. 1 is a diagram schematically illustrating a cervical cytology examination and cervical imaging commonly used for diagnosis of cervical cancer.
Fig. 2 is an exemplary diagram illustrating a configuration of a system for generating cervical learning data according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a method of classifying cervical learning data according to an embodiment of the present invention.
Fig. 4 is a diagram for describing in more detail the multi-level classification criteria for generating cervical learning data according to one embodiment of the present invention.
Detailed Description
Hereinafter, detailed embodiments of the present invention will be described with reference to the accompanying drawings. In the description of the present invention, a detailed description of related well-known functions or configurations determined to unnecessarily obscure the gist of the present invention will be omitted.
Fig. 2 is an exemplary diagram illustrating a configuration of a system 200 for generating cervical learning data according to an embodiment of the present invention. The illustrated system 200 for generating cervical learning data may be implemented as a set of code data executable in a computer system. In the following description, the system 200 and the Artificial Intelligence (AI) learning system 300 for generating cervical learning data are respectively illustrated, but the two systems may be integrated into one system according to a system implementation method.
For reference, the computer system is a system including a communication unit capable of data transmission/reception with an external device, a storage unit, and a control unit configured to control the overall operation of the system according to a set of control code data stored in the storage unit, and the storage unit may further include a software module for executing a specific-purpose application program.
Hereinafter, the configuration of the system 200 for generating cervical learning data will be described in detail with reference to fig. 2. A system 200 for generating cervical learning data according to one embodiment of the invention includes: an image receiving unit 210 configured to receive captured image data of an unclassified cervix from an external apparatus capable of transmitting/receiving data, for example, the apparatus 100 (photographing apparatus) capable of acquiring a captured image of a cervix or a storage apparatus storing a captured image of a cervix, and store the received captured image data of the unclassified cervix in the data storage unit 240; an image data classification unit 220 configured to classify the captured image data of the unclassified cervix based on a neural network algorithm (e.g., a Convolutional Neural Network (CNN) or a model containing the CNN and a Support Vector Machine (SVM)) according to a plurality of multi-level classification criteria; a learning data generation unit 230 configured to generate classification criterion-specific learning data from the captured image data of the cervix uteri classified according to each classification criterion, and transmit the learning data to the AI learning system 300 or store the learning data in the data storage unit 240; and a data storage unit 240 in which the multi-level classification criteria, the captured image data of the unclassified cervix uteri, and the generated learning data are stored.
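The four units described above can be sketched as a minimal in-memory pipeline. This is an illustrative skeleton, not the patent's implementation: the class name, `classify_fn`, and `augment_fn` are hypothetical stand-ins for the neural-network classifier and the balance-control step, whose concrete implementations the patent does not fix.

```python
class CervicalLearningDataSystem:
    """Hypothetical skeleton mirroring the four units in the text:
    image receiving, image data classification, learning data
    generation, and (in-memory) data storage."""

    def __init__(self, classify_fn, augment_fn=None):
        self.classify_fn = classify_fn  # image data classification unit (injected)
        self.augment_fn = augment_fn    # optional numerical-balance control step
        self.store = {}                 # data storage unit (a dict here)

    def receive(self, image_id, image):
        # Image receiving unit: store unclassified captured image data.
        self.store[image_id] = {"image": image, "label": None}

    def generate_learning_data(self):
        # Learning data generation unit: classify each stored image and
        # group samples by classification criterion.
        dataset = {}
        for record in self.store.values():
            label = self.classify_fn(record["image"])
            record["label"] = label
            samples = [record["image"]]
            if self.augment_fn is not None:
                samples += self.augment_fn(record["image"])
            dataset.setdefault(label, []).extend(samples)
        return dataset
```

In use, a trained classifier would be passed as `classify_fn`, and the returned per-criterion dataset would be handed to the AI learning system 300.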
In addition, the learning data generation unit 230 also generates additional learning data for controlling the numerical balance of the generated learning data specific to the plurality of classification criteria, thereby preventing the over-learning of the image data of the cervix uteri (or cervical cancer) of a specific shape or preventing a phenomenon that the image of a specific shape (or a specific type) is not normally learned.
Meanwhile, the image data classifying unit 220 may classify the captured image data of the unclassified cervix using at least two or more of a first-level classification criterion based on color, a second-level criterion based on the size of the cervix in the captured image data, a third-level criterion based on a combination of color and shape in the cervix image data, and a fourth-level criterion based on exposure and focus.
Specifically, the image data classifying unit 220 may primarily classify the captured image data of the unclassified cervix according to a primary color-based classification criterion, secondarily classify the captured image data of the unclassified cervix according to a secondary classification criterion based on a size of the cervix in the primary classified captured image data, and thirdly classify the captured image data of the unclassified cervix according to a tertiary classification criterion based on a combination of a color and a shape in the secondarily classified cervix image data.
In addition, the image data classifying unit 220 may classify captured image data of the cervix uteri, which has not been classified yet after the tertiary classification, four times according to a fourth-level classification criterion based on the exposure level and the focus.
In addition, the image data classifying unit 220 may first classify the captured image data of the unclassified cervix according to a first-level color-based classification criterion including a color value as a classification reference value for identifying each of at least one of an acetic acid reaction image, a Lugol solution reaction image, a green filter image, and a general image.
In addition, the image data classifying unit 220 may perform secondary classification according to the size of the cervix uteri in the captured image data primarily classified, for example, 150%, 100%, 80%, and 50% of the cervix uteri size, and whether or not a colposcope and other parts are included in the image.
Further, the image data classifying unit 220 may classify the captured image data of the cervix that has not been classified after the secondary classification three times according to a third-level classification criterion based on a combination of a color and a shape of the cervix image data in the captured image data of the unclassified cervix, wherein the third-level classification criterion includes a combination of a color value and a shape as a classification reference value for identifying at least one of blood, mucus, a ring, a colposcope, a treatment trace, and a surgical instrument in the cervix image data to classify the foreign substance affecting the cervix.
For example, blood typically appears in a reddish shape flowing down from the center of the cervix, mucus typically appears in a yellowish shape flowing down from the center of the cervix, and a ring is typically located in the middle of the cervix and clearly shows the line of the boomerang shape. Colposcopes and other surgical instruments appear in a color different from the color of the cervix (silver, blue, etc.), and thus, as described above, foreign objects affecting the cervix can be classified by using a combination of the color and shape of each foreign object.
Meanwhile, the image data classification unit 220 may additionally classify images that have not been classified by the above-described three classification criteria (i.e., images in which lesions are not recognized) according to the classification criteria based on the exposure and focus. For example, when underexposure or overexposure occurs, the histogram may show an extreme value on one side and thus may be classified using such a characteristic, and when the image is out of focus, an edge may not be detected or the color contrast may not be clear and thus may be classified using such a characteristic (four-time classification).
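The exposure and focus cues just described can be approximated with simple image statistics. The following is a minimal sketch assuming a grayscale uint8 array; the thresholds are illustrative assumptions, not reference values from the patent.

```python
import numpy as np

def classify_image_quality(gray, dark_thresh=0.6, bright_thresh=0.6,
                           sharpness_thresh=50.0):
    """Heuristic fourth-level check on a 2-D uint8 grayscale image:
    flag under/over-exposed or out-of-focus images. Thresholds are
    illustrative, not the patent's values."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    hist = hist / hist.sum()
    # Under/overexposure: histogram mass piled up at one extreme.
    if hist[:32].sum() > dark_thresh:
        return "underexposed"
    if hist[224:].sum() > bright_thresh:
        return "overexposed"
    # Out of focus: weak edges, measured as low variance of a
    # discrete Laplacian response.
    g = gray.astype(float)
    lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0)
           + np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)
    if lap.var() < sharpness_thresh:
        return "out_of_focus"
    return "acceptable"
```

A production system would likely learn these decision boundaries rather than hand-tune them, but the same histogram and edge features apply.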
In the first through fourth classification processes described above, each classification may be performed using a CNN, a deep learning technique. In the first, second, and fourth classifications, the features to be extracted are clear-cut, so classification with high accuracy can be achieved with a small number of layers; in the third classification, many features must be extracted, so accuracy can be improved by stacking deeper layers.
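That depth trade-off can be sketched by reusing one convolutional block at different depths. This is a hypothetical PyTorch module for illustration only; the patent does not specify a network architecture, and the channel counts and class counts here are assumptions.

```python
import torch
import torch.nn as nn

class StageClassifier(nn.Module):
    """Hypothetical CNN sketch: a stack of `depth` conv blocks.
    Few blocks suffice when features are clear-cut (stages 1, 2, 4);
    a deeper stack suits the feature-rich third stage."""

    def __init__(self, num_classes, depth):
        super().__init__()
        layers, ch = [], 3
        for i in range(depth):
            out = 16 * (2 ** min(i, 3))  # cap channel growth at 128
            layers += [nn.Conv2d(ch, out, 3, padding=1),
                       nn.ReLU(),
                       nn.MaxPool2d(2)]
            ch = out
        self.features = nn.Sequential(*layers)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                  nn.Flatten(),
                                  nn.Linear(ch, num_classes))

    def forward(self, x):
        return self.head(self.features(x))

# Shallow net for the four color classes (first classification);
# deeper net for the six foreign-object classes (third classification).
color_net = StageClassifier(num_classes=4, depth=2)
foreign_net = StageClassifier(num_classes=6, depth=5)
```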
Hereinafter, a learning data classification method of the system 200 for generating cervical learning data according to an embodiment of the present invention will be described in more detail with reference to fig. 2 to 4.
Fig. 3 is a flowchart illustrating a method of classifying cervical learning data according to an embodiment of the present invention. Fig. 4 is a diagram for describing in more detail the multi-level classification criteria for generating cervical learning data according to one embodiment of the present invention.
Referring to fig. 3, the image receiving unit 210 receives captured image data of an unclassified cervix from an external device, such as the photographing device 100, and stores the unclassified captured image data in the data storage unit 240 (S100).
The image data classifying unit 220 classifies one or more unclassified captured image data based on a neural network algorithm (e.g., CNN) according to a plurality of multi-level classification criteria, and stores the classified captured image data (S200).
For example, the image data classification unit 220 first classifies the captured image data of the unclassified cervix according to a first-level color-based classification criterion.
For the primary classification, the image data classification unit 220 may use color values as classification reference values to identify and separate four image types: the acetic acid reaction image, the Lugol solution reaction image, the green filter image, and the general image.
Specifically, an acetic acid reaction image can be distinguished because white (acetowhite) patches appear on the cervix, in contrast to the pink cervix and vagina. A Lugol solution reaction image appears brown or dark orange, and in a green filter image, green appears over the entire frame. Accordingly, color values capturing these characteristic appearances can be used as classification reference values to classify the captured image data of the unclassified cervix.
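A heuristic sketch of how such color reference values might separate the four image types follows. The channel thresholds are illustrative assumptions, not the patent's actual reference values, and a learned classifier would replace these hand-written rules.

```python
import numpy as np

def classify_color_type(img):
    """Heuristic first-level color classification sketch.
    `img` is an H x W x 3 float RGB array in [0, 1]."""
    r, g, b = img[..., 0].mean(), img[..., 1].mean(), img[..., 2].mean()
    # Green filter: green dominates the whole frame.
    if g > r * 1.3 and g > b * 1.3:
        return "green_filter"
    # Acetic acid reaction: a notable fraction of near-white pixels
    # (acetowhite patches) against the pink tissue.
    white_ratio = (img.min(axis=-1) > 0.8).mean()
    if white_ratio > 0.15:
        return "acetic_acid"
    # Lugol solution reaction: brown / dark orange staining.
    if r > 0.4 and g > 0.2 and b < g * 0.8 and r > g * 1.4:
        return "lugol"
    return "general"
```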
When the primary classification is completed, the image data classification unit 220 secondarily classifies the unclassified captured image data according to a second-level classification criterion based on the size of the cervix uteri in the primarily classified captured image data.
The cervix appears as a circle roughly the size of a 500-won coin, usually located in the middle of the image. Thus, based on the size of the cervix in the image (150%, 100%, 80%, etc.), the image may be secondarily classified, for example, as an image in which only the cervix is shown enlarged, an image showing the entire cervix, an image in which the cervix occupies about 80% of the frame, an image in which the cervix occupies about 50% of the frame, and an image including the cervix, the colposcope, and other parts.
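Given an estimated cervix region, the bucketing step can be sketched as follows. The thresholds and bucket names are illustrative assumptions; detecting the cervix region itself is outside this sketch, and the estimated cervix area may exceed the image area when the cervix extends beyond the frame (the "150%" case).

```python
def classify_by_cervix_size(cervix_area, image_area, contains_speculum=False):
    """Heuristic second-level classification sketch: bucket an image
    by how much of the frame the (already detected) cervix fills."""
    if contains_speculum:
        return "cervix_with_colposcope_and_surroundings"
    ratio = cervix_area / image_area
    if ratio >= 1.0:
        return "magnified_cervix_only"        # ~150%: cervix overflows frame
    if ratio >= 0.9:
        return "full_cervix"                  # ~100%: entire cervix shown
    if ratio >= 0.65:
        return "cervix_about_80_percent"
    return "cervix_about_50_percent_or_less"
```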
Then, the image data classifying unit 220 classifies the foreign substance affecting the cervix three times according to a third-level classification criterion based on a combination of colors and shapes in the secondarily classified cervix image data.
As described above, blood generally takes on a reddish shape flowing down from the center of the cervix, mucus generally takes on a yellowish shape flowing down from the center of the cervix, and a ring is generally located in the middle of the cervix and clearly shows the line of the boomerang shape. Colposcopes and other surgical instruments appear in a color different from the color of the cervix (silver, blue, etc.), and thus, as described above, foreign objects affecting the cervix can be classified by using a combination of the color and shape of each foreign object.
The image data classification unit 220 may then, as needed, perform a fourth classification based on exposure and focus on images that remain unclassified after the tertiary classification.
As described above, the captured image data specific to the plurality of classification criteria classified according to the multi-level classification criteria is stored in the data storage unit 240.
When the classification of the image data is completed, the learning data generating unit 230 generates the learning data specific to the classification criterion from the captured image data of the cervix classified according to each classification criterion, and stores the generated learning data (S300). In the process of generating the learning data, the learning data generating unit 230 may also generate additional learning data for controlling the numerical balance of the learning data specific to the classification criterion, wherein it is preferable to generate the additional learning data based on each of the learning data specific to the classification criterion.
As a method of generating the additional learning data, the left and right of the image may be reversed using vertical mirroring, the top and bottom of the image may be reversed using horizontal mirroring, or the image may be cropped to a size slightly smaller than the original, anchored at the top, bottom, left, or right. Furthermore, when mirroring and cropping are combined, up to 16 times as much additional learning data can be generated.
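The 16x figure follows from combining the four mirror states (identity, left-right flip, top-bottom flip, both) with four corner crops. A minimal sketch, assuming NumPy array images and a 90% crop size (the patent does not specify the crop ratio):

```python
import numpy as np

def augment_16x(img):
    """Augmentation sketch: 4 mirror states x 4 corner crops = 16
    additional samples per image. The text's "vertical mirroring"
    swaps left and right; "horizontal mirroring" swaps top and bottom."""
    h, w = img.shape[:2]
    ch, cw = int(h * 0.9), int(w * 0.9)  # crop slightly smaller than original
    mirrors = [img,
               img[:, ::-1],             # vertical mirroring (left-right)
               img[::-1, :],             # horizontal mirroring (top-bottom)
               img[::-1, ::-1]]          # both
    out = []
    for m in mirrors:
        for y0 in (0, h - ch):           # top / bottom anchored crop
            for x0 in (0, w - cw):       # left / right anchored crop
                out.append(m[y0:y0 + ch, x0:x0 + cw].copy())
    return out
```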
When the learning data has been generated by classifying the captured image data of the unclassified cervix according to the multi-level classification criteria as described above, the AI learning system 300 learns from and verifies the generated learning data (S400). When the AI learning system 300 is implemented within the system 200 for generating cervical learning data, the generated learning data can be learned and verified there, and the presence or absence of a lesion in the cervix can be diagnosed or determined based on that learning.
Meanwhile, when additional unclassified cervical image data is obtained, further learning data is generated as described above and can be used for relearning to improve performance.
As described above, the system 200 for generating cervical learning data according to one embodiment of the present invention generates learning data by classifying captured image data of an unclassified cervix according to multi-level classification criteria. Compared to existing systems, an AI diagnosis device (determination device, AI engine, etc.) trained on the learning data generated according to these various classification criteria can therefore diagnose more accurately whether a lesion exists in the cervix.
In addition, according to the present invention, additional learning data is generated to control the numerical balance of the classification-criterion-specific learning data and is used for learning. This prevents image data of a cervix (or cervical cancer) of a specific shape from being over-learned, or an image of a specific shape (or type) from not being learned at all, thereby making it possible to accurately diagnose whether a lesion exists in the cervix.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (17)

1. A method executable in a computer system for generating cervical learning data, the method comprising:
receiving captured image data of an unclassified cervix from an external device;
classifying the captured image data of the unclassified cervix based on a neural network algorithm according to a plurality of multi-level classification criteria; and
generating and storing classification-criterion-specific learning data from the captured image data of the cervix classified according to the classification criterion, or learning the generated classification-criterion-specific learning data.
2. The method of claim 1, wherein the generating the learning data and the learning further comprise generating additional learning data for controlling a numerical balance of the classification-criteria-specific learning data.
3. The method of claim 2, wherein the additional learning data is generated based on learning data specific to each of the classification criteria.
4. The method of any of claims 1 to 3, wherein the plurality of multi-level classification criteria include at least two of a first-level classification criterion based on color, a second-level classification criterion based on a size of the cervix in the captured image data, and a third-level classification criterion based on a combination of color and shape in the cervix image data.
5. The method of claim 4, wherein the plurality of multi-level classification criteria further comprise a fourth-level classification criterion based on exposure and focus.
6. The method of any of claims 1 to 4, wherein the classifying the image data comprises: primarily classifying the captured image data of the unclassified cervix according to a first-level color-based classification criterion; secondarily classifying the primarily classified captured image data according to a second-level classification criterion based on a size of the cervix in the primarily classified captured image data; and thirdly classifying the secondarily classified captured image data according to a third-level classification criterion based on a combination of color and shape in the secondarily classified cervix image data.
7. The method of claim 6, wherein the captured image data of the cervix that remains unclassified after the tertiary classification is classified a fourth time according to a fourth-level classification criterion based on exposure and focus.
8. The method of claim 4, wherein the first level of classification criteria comprises color values as classification references for identifying each of at least one of an acetic acid response image, a Lugol solution response image, a green filter image, and a general image.
9. The method of claim 4, wherein the third level of classification criteria includes a combination of color values and shapes as classification references for identifying each of at least one of blood, mucus, rings, colposcopes, treatment marks, and surgical instruments in the cervical image data.
10. A system for generating cervical learning data, the system comprising:
an image receiving unit configured to receive captured image data of an unclassified cervix from an external device allowing transmission and reception of data;
an image data classification unit configured to classify the captured image data of the unclassified cervix based on a neural network algorithm according to a plurality of multi-level classification criteria;
a learning data generation unit configured to generate learning data specific to a classification criterion from the captured image data of the cervix uteri classified according to the classification criterion, and store the learning data or transmit the learning data to an artificial intelligence learning system for learning; and
a data storage unit configured to store the multi-level classification criteria, the captured image data of the unclassified cervix, and the generated learning data.
11. The system according to claim 10, wherein the learning data generation unit further generates additional learning data for controlling a numerical balance of the generated classification criterion-specific learning data.
12. The system according to claim 10 or 11, wherein the image data classification unit classifies the captured image data of the unclassified cervix using at least two of a first-level classification criterion based on color, a second-level classification criterion based on a size of the cervix in the captured image data, and a third-level classification criterion based on a combination of color and shape in the cervix image data.
13. The system of claim 12, wherein the image data classification unit further classifies the captured image data of the unclassified cervix according to a fourth-level classification criterion based on exposure and focus.
14. The system according to claim 10 or 11, wherein the image data classification unit primarily classifies the captured image data of the unclassified cervix according to a first-level color-based classification criterion, secondarily classifies the primarily classified captured image data according to a second-level classification criterion based on a size of the cervix in the primarily classified captured image data, and thirdly classifies the secondarily classified captured image data according to a third-level classification criterion based on a combination of color and shape in the secondarily classified cervix image data.
15. The system according to claim 14, wherein the image data classification unit classifies the captured image data of the cervix that remains unclassified after the tertiary classification a fourth time according to a fourth-level classification criterion based on exposure and focus.
16. The system according to claim 10 or 11, wherein the image data classification unit classifies the captured image data of the unclassified cervix according to a first-level color-based classification criterion, and the first-level classification criterion includes a color value as a classification reference value for identifying each of at least one of an acetic acid reaction image, a Lugol solution reaction image, a green filter image, and a general image.
17. The system according to claim 10 or 11, wherein the image data classification unit thirdly classifies the captured image data of the unclassified cervix according to a third-level classification criterion based on a combination of color and shape in the cervix image data, and the third-level classification criterion includes a combination of a color value and a shape as a classification reference value for identifying at least one of blood, mucus, a ring, a colposcope, a treatment mark, and a surgical instrument in the cervix image data.
CN201910137502.1A 2018-08-09 2019-02-25 System for generating cervical learning data and method for classifying cervical learning data Pending CN110858396A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180093182A KR102041402B1 (en) 2018-08-09 2018-08-09 Cervical learning data generation system
KR10-2018-0093182 2018-08-09

Publications (1)

Publication Number Publication Date
CN110858396A true CN110858396A (en) 2020-03-03

Family

ID=68578817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910137502.1A Pending CN110858396A (en) 2018-08-09 2019-02-25 System for generating cervical learning data and method for classifying cervical learning data

Country Status (2)

Country Link
KR (1) KR102041402B1 (en)
CN (1) CN110858396A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112884707A (en) * 2021-01-15 2021-06-01 复旦大学附属妇产科医院 Cervical precancerous lesion detection system, equipment and medium based on colposcope

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
KR102316557B1 (en) * 2019-06-04 2021-10-25 주식회사 아이도트 Cervical cancer diagnosis system
BR112021024432A2 (en) 2019-06-04 2022-02-01 Aidot Inc Automatic cervical cancer diagnosis system
KR20230131687A (en) * 2022-03-07 2023-09-14 한양대학교 에리카산학협력단 System and method for colposcopy by using a deep learning model

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1037035A (en) * 1988-04-08 1989-11-08 神经医学系统公司 Automated cytological specimen classification system and method based on neural network
CN105719291A (en) * 2016-01-20 2016-06-29 江苏省沙钢钢铁研究院有限公司 Surface defect image classification system with selectable varieties
CN107563123A (en) * 2017-09-27 2018-01-09 百度在线网络技术(北京)有限公司 Method and apparatus for marking medical image
CN107689073A (en) * 2016-08-05 2018-02-13 阿里巴巴集团控股有限公司 The generation method of image set, device and image recognition model training method, system
CN108009589A (en) * 2017-12-12 2018-05-08 腾讯科技(深圳)有限公司 Sample data processing method, device and computer-readable recording medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
KR100850347B1 (en) 2007-10-30 2008-08-05 문정숙 Sysmtem and method for integration medical diagnosis
KR20130012297A (en) * 2011-07-25 2013-02-04 삼성전자주식회사 Apparatus for detecting lesion, method for detecting lesion and lesion diagnosis apparatus
KR20140104946A (en) * 2011-10-05 2014-08-29 시레카 테라노스틱스, 엘엘씨 Method and system for analyzing biological specimens by spectral imaging
KR101682604B1 (en) 2014-10-23 2016-12-05 전북대학교산학협력단 Automated cervical cancer diagnosis system



Also Published As

Publication number Publication date
KR102041402B1 (en) 2019-11-07

Similar Documents

Publication Publication Date Title
CN110858396A (en) System for generating cervical learning data and method for classifying cervical learning data
CN108573490B (en) Intelligent film reading system for tumor image data
CN110197493B (en) Fundus image blood vessel segmentation method
CN109102491B (en) Gastroscope image automatic acquisition system and method
KR102316557B1 (en) Cervical cancer diagnosis system
KR102155381B1 (en) Method, apparatus and software program for cervical cancer decision using image analysis of artificial intelligence based technology
JP2016516475A (en) Estimating bilirubin levels
US20210090248A1 (en) Cervical cancer diagnosis method and apparatus using artificial intelligence-based medical image analysis and software program therefor
US10832410B2 (en) Computer system, method, and program for diagnosing subject
CN113158821B (en) Method and device for processing eye detection data based on multiple modes and terminal equipment
JP2018054443A (en) Information processing device, information processing system and program
CN111242920A (en) Biological tissue image detection method, device, equipment and medium
KR102220573B1 (en) Method, apparatus and computer program for calculating quality score of fundus image data using artificial intelligence
JP7346600B2 (en) Cervical cancer automatic diagnosis system
Cordeiro et al. An automatic patch-based approach for her-2 scoring in immunohistochemical breast cancer images using color features
Lee et al. Real-time image analysis of capsule endoscopy for bleeding discrimination in embedded system platform
CN116563224B (en) Image histology placenta implantation prediction method and device based on depth semantic features
KR20210033902A (en) Method, apparatus and software program for cervical cancer diagnosis using image analysis of artificial intelligence based technology
KR20200018360A (en) Cervical learning data generation system
KR102220574B1 (en) Method, apparatus and computer program for calculating quality score threshold for filtering image data
CN111582434B (en) deep learning system
KR20220138069A (en) Method, apparatus and software program for cervical cancer diagnosis using image analysis of artificial intelligence based technology
CN115206494A (en) Film reading system and method based on fundus image classification
KR20210071173A (en) Apparatus and method for automatic calculation of bowel preparation
US20230386660A1 (en) System and method for detecting gastrointestinal disorders

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40023220

Country of ref document: HK

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200303

Assignee: Shenzhen baozhiplatinum Intelligent Medical Technology Co.,Ltd.

Assignor: Korea Treasure Platinum Co.,Ltd.

Contract record no.: X2021990000544

Denomination of invention: System for generating cervical learning data and method for classifying cervical learning data

License type: Common License

Record date: 20210903

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200303