WO2019039595A1 - Device and method for estimating cytodiagnostic category of cervix - Google Patents

Device and method for estimating cytodiagnostic category of cervix

Info

Publication number
WO2019039595A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
cell
category
stained
estimation
Prior art date
Application number
PCT/JP2018/031409
Other languages
French (fr)
Japanese (ja)
Inventor
雅彦 黒田
彰 齋藤
藤田 浩司
知義 堀澤
卓也 浜田
匡治 小林
憲治 原田
康志 沼田
Original Assignee
学校法人東京医科大学
株式会社カイ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 学校法人東京医科大学 (Tokyo Medical University) and 株式会社カイ
Publication of WO2019039595A1


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 - Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/48 - Biological material, e.g. blood, urine; Haemocytometers
    • G01N 33/50 - Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
    • G01N 33/53 - Immunoassay; Biospecific binding assay; Materials therefor
    • G01N 33/574 - Immunoassay; Biospecific binding assay; Materials therefor for cancer

Definitions

  • The present invention relates to an estimation apparatus and an estimation method for cervical cytology categories.
  • In cervical cancer screening, it is common to first perform cytology prior to diagnosis by a pathologist. This cytology is known as one of the few screening methods proven to reduce mortality.
  • In cytology, Papanicolaou (Pap.)-stained specimens are generally prepared from specimens obtained by scraping the cervix. A cytotechnologist then searches the stained specimen under a microscope for epithelial cells showing the characteristic morphology of human papillomavirus (HPV) infection and, from the cell morphology, classifies the degree of progression of precancerous lesions and the like according to the Bethesda system.
  • Based on the Bethesda classification result, the pathologist determines the patient's risk of cervical cancer and performs a detailed examination such as tissue diagnosis to make a final diagnosis. Early detection of precancerous lesions by cytology prior to such a doctor's diagnosis is considered useful for treating patients.
  • In recent years, as cervical cancer has come to affect younger women, screening is now recommended every two years from the age of 20; although the screening rate remains low at around 20%, the number of examinees has increased 2.5-fold compared with 10 years ago.
  • In particular, since the majority of cervical cancers are caused by sexually transmitted HPV infection, and sexual activity tends to begin at younger ages, cervical cancer screening is expected to increase further in the future.
  • However, cervical cancer screening by cytology relies on manual judgment by expert cytotechnologists, so it requires labor and cost, and the screening cost is a heavy burden on municipalities.
  • Furthermore, the epithelial cells targeted by cytology may exhibit various forms depending on age, presence or absence of inflammation, hormonal environment, and so on, which can make the determination difficult.
  • In addition, because the judgment is manual, it requires skill, and judgments may differ between cytotechnologists.
  • Therefore, the present invention aims to provide a new system and method that can easily estimate the cytological classification of cervical specimens, for example, prior to a pathologist's diagnosis of the possibility of cervical cancer.
  • To achieve the above object, the estimation apparatus of the present invention is an apparatus for estimating a cervical cytology category, including a sample image input unit, a cell image extraction unit, a cell morphology classification unit, and a cytology category estimation unit.
  • The sample image input unit inputs a sample image of a stained sample, the stained sample being a cervical sample stained with a MUC conjugate.
  • The cell image extraction unit extracts cell images of stained cells from the sample image.
  • The cell morphology classification unit classifies the stained cells in the cell images into corresponding morphology categories based on cell morphology classification information in which cell morphology information is associated with each of two or more morphology categories, and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image.
  • The cytology category estimation unit estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
  • The estimation method of the present invention is a method of estimating a cervical cytology category, including a sample image input step, a cell image extraction step, a cell morphology classification step, and a cytology category estimation step.
  • In the sample image input step, a sample image of a stained sample is input, the stained sample being a cervical sample stained with a MUC conjugate.
  • The cell image extraction step extracts cell images of stained cells from the sample image.
  • The cell morphology classification step classifies the stained cells in the cell images into corresponding morphology categories based on cell morphology classification information in which cell morphology information is associated with each of two or more morphology categories, and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image.
  • The cytology category estimation step estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
  • The program of the present invention causes a computer to execute the method of the present invention for estimating a cervical cytology category.
  • The recording medium of the present invention is a computer-readable recording medium on which the program of the present invention is recorded.
  • According to the present invention, a cytology category can be easily estimated for a cervical specimen prior to a pathologist's diagnosis of the possibility of cervical cancer.
  • FIG. 1 is a block diagram showing an example of an estimation apparatus of the present invention.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the estimation apparatus of the present invention.
  • FIG. 3 is a flow chart showing an example of a part of the estimation method of the present invention.
  • FIG. 4 is a flowchart showing an example of a part of the estimation method of the present invention.
  • FIG. 5 is a photograph showing an example of a sample image.
  • FIG. 6 is a flowchart showing an example of extracting stained cells from a sample image.
  • FIG. 7 is a conceptual view showing an example of labeling from a sample image.
  • FIG. 8 is a flowchart showing an example of selecting a region with many stained pixels from the label.
  • FIG. 9 is a photograph showing an example of classification of stained cells.
  • FIG. 10 is a conceptual diagram showing an example of the relationship between the cytological classification of the Bethesda system and the classification ratio of stained cells.
  • FIG. 11 is a flowchart showing an example of a method of generating a cell shape estimation model.
  • In the estimation apparatus of the present invention, for example, the cytology category is a category of the Bethesda system.
  • In the estimation apparatus of the present invention, for example, the cell morphology classification information is a cell morphology estimation model that estimates a corresponding morphology category for cells in a cell image, the cell morphology estimation model being a model generated by learning from learning stained cell images of the cervix corresponding to each of the morphology categories, and the cell morphology classification unit classifies the stained cells in the cell image into corresponding morphology categories using the cell morphology estimation model.
  • In the estimation apparatus of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates a corresponding cytology category for the cell composition pattern of a sample image, the cytology category estimation model being a model generated by learning from the composition patterns of the morphology categories of stained cells in the cervix corresponding to the respective cytology categories, and the cytology category estimation unit estimates the cytology category of the stained sample of the sample image using the cytology category estimation model.
  • In the estimation apparatus of the present invention, for example, the cell image extraction unit extracts cell images of stained cells from the sample image using a cell image extraction model, the cell image extraction model being a model generated by learning from stained cell images in cervical stained sample images for learning.
  • In the estimation apparatus of the present invention, for example, the number of morphology categories is 5 to 100.
  • The estimation apparatus of the present invention further includes, for example, an appropriateness determination unit for the sample image, wherein the appropriateness determination unit detects determination items for the sample image and, when the detection results satisfy the appropriate results, determines that the image is appropriate as the sample image for the cell image extraction unit.
  • The determination items are, for example, at least one selected from the group consisting of the presence or absence of image blurring, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image.
  • In the estimation method of the present invention, for example, the cytology category is a category of the Bethesda system.
  • In the estimation method of the present invention, for example, the cell morphology classification information is a cell morphology estimation model that estimates a corresponding morphology category for cells in a cell image, the cell morphology estimation model being a model generated by learning from learning stained cell images of the cervix corresponding to each of the morphology categories, and the cell morphology classification step classifies the stained cells in the cell image into corresponding morphology categories using the cell morphology estimation model.
  • In the estimation method of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates a corresponding cytology category for the cell composition pattern of a sample image, the cytology category estimation model being a model generated by learning from the composition patterns of the morphology categories of stained cells in the cervix corresponding to the respective cytology categories, and the cytology category estimation step estimates the cytology category of the stained sample of the sample image using the cytology category estimation model.
  • In the estimation method of the present invention, for example, the cell image extraction step extracts cell images of stained cells from the sample image using a cell image extraction model, the cell image extraction model being a model generated by learning from stained cell images in cervical stained sample images for learning.
  • In the estimation method of the present invention, for example, the number of morphology categories is 5 to 100.
  • The estimation method of the present invention further includes, for example, an appropriateness determination step for the sample image, wherein the appropriateness determination step detects determination items for the sample image and, when the detection results satisfy the appropriate results, determines that the image is appropriate as the sample image for the cell image extraction step.
  • The determination items are, for example, at least one selected from the group consisting of the presence or absence of image blurring, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image.
  • MUC is a core protein of mucin and is known as a component of mucus.
  • Since the cervical epithelium does not have mucus, it was common technical knowledge that MUC is not present in the cervical epithelium.
  • However, the present inventors found that MUC, whose expression is not confirmed in normal epithelium, is specifically expressed in cervical intraepithelial neoplasia, and that the possibility of cervical cancer can therefore be tested by detecting the presence of MUC.
  • From the relationship between MUC and cervical cancer described above, the present inventors further found that, by using a specimen image of a cervical specimen stained with a MUC conjugate, a cytology category such as that of the Bethesda system can be estimated from the composition pattern of the morphology categories of the stained cells in the specimen image.
  • That is, when a cervical specimen is stained with the MUC conjugate and the morphology of the stained cells is classified, it becomes clear that the composition pattern of the morphology differs depending on the condition of the cervix.
  • Therefore, according to the present invention, even a person who is not a doctor such as a pathologist and does not hold a medical license can easily estimate, indirectly from the composition pattern of the morphology of the stained cells, which cytology category the specimen is likely to fall under.
  • In the present invention, the type of cytology for which the category is estimated is not particularly limited; examples include the Bethesda system (also referred to as the Bethesda classification).
  • The cytology for which the category is estimated is not limited to cytology known at the time of filing of the present application or of its basic (priority) application, such as the Bethesda system; cytology established thereafter can also be used.
  • The cytology category means, for example, a category into which a given cytology classifies specimens (including, for example, the meanings of class, level, stage, and the like).
  • the category is, for example, a category (NILM, ASC-US, ASC-H, LSIL, HSIL, SCC) as shown in Table 1 described later.
  • In the present invention, the Bethesda system is described as an example of the cytology category. According to the Bethesda system, cells are classified into six categories, as shown in Table 1 below. According to the present invention, for example, as described later, the cytology category of the Bethesda system can be estimated from the composition pattern of the morphology categories of stained cells in the sample image.
  • MUC is a core protein family of mucins, and examples thereof include MUC1, MUC2, MUC3, MUC4, MUC5AC, MUC5B, MUC6, MUC7 and the like.
  • MUC may be, for example, any one type or two or more types, and among them, MUC1 is preferable.
  • The type of the MUC conjugate is not particularly limited as long as it is a binding substance having binding activity to MUC, preferably a binding substance exhibiting specific binding activity; examples include a MUC antibody.
  • the specimen image used in the present invention is, as described above, an image of a cervical specimen stained with a MUC conjugate.
  • The preparation method of the cervical sample is not particularly limited; for example, general slide preparation methods in cytology can be used.
  • The method for staining the cervical specimen with the MUC conjugate is not particularly limited; for example, general staining methods using the binding of a target (for example, an antigen) to its conjugate (for example, an antibody) can be used.
  • In the following, an image of a cervical sample stained with a MUC antibody is exemplified as the sample image, but the present invention is not limited thereto, and the image may be of a sample stained with any binding substance having binding activity to MUC.
  • In the present invention, a sample that is the target for which a cytology category is to be estimated is referred to as a "subject".
  • Embodiment 1: An example of the estimation apparatus and estimation method of the present invention will be described with reference to the drawings.
  • FIG. 1 is a block diagram showing an example of an estimation apparatus of the present embodiment.
  • As shown in FIG. 1, the estimation apparatus 1 includes a sample image input unit 11, a cell image extraction unit 12, a cell morphology classification unit 13, and a cytology category estimation unit 14.
  • the estimation device 1 may further include, for example, a storage unit 15 and an output unit 16.
  • The storage unit 15 includes, for example, a processing information storage unit 151, a cell image extraction information storage unit 152, a cell morphology classification information storage unit 153, a cytology category estimation information storage unit 154, and the like.
  • The processing information storage unit 151 stores, for example, processing information (for example, input information input to the estimation device 1, output information output from the estimation device 1, and information obtained by the estimation device 1); the cell image extraction information storage unit 152 stores the cell image extraction information; the cell morphology classification information storage unit 153 stores the cell morphology classification information; and the cytology category estimation information storage unit 154 stores the cytology category estimation information.
  • the estimation device 1 is also referred to, for example, as an estimation system.
  • The estimation apparatus 1 may be, for example, a single apparatus including the respective units, or the respective units may be distributed and connectable via a communication network.
  • The communication network is not particularly limited and may be a known communication network, wired or wireless; specific examples include the Internet, a telephone line, a LAN (Local Area Network), and WiFi (Wireless Fidelity).
  • the processing of each unit may be performed on the cloud.
  • FIG. 2 illustrates a block diagram of a hardware configuration of the estimation device 1.
  • the estimation device 1 includes, for example, a CPU (central processing unit) 101, a memory 102, a bus 103, an input device 104, a display 105, a communication device 106, a storage device 107, an imaging device 113, and the like.
  • the respective units of the estimation device 1 are mutually connected via a bus 103 by respective interfaces (I / F).
  • the CPU 101 is responsible for overall control of the estimation device 1.
  • The program of the present invention and other programs are executed by the CPU 101, which also reads and writes various information.
  • Specifically, the CPU 101 functions as the sample image input unit 11, the cell image extraction unit 12, the cell morphology classification unit 13, and the cytology category estimation unit 14.
  • the estimation apparatus 1 can be connected to a communication network by, for example, the communication device 106 connected to the bus 103, and can also be connected to an external device via the communication network.
  • the external device is not particularly limited, and examples thereof include an imaging device such as a camera, a terminal such as a personal computer (PC), a tablet, and a smartphone.
  • the connection method between the estimation device 1 and the external device is not particularly limited, and may be, for example, wired connection or wireless connection.
  • the wired connection may be, for example, a cord connection or a cable connection for using a communication network.
  • the wireless connection may be, for example, a connection using a communication network or a connection using wireless communication.
  • the communication network is not particularly limited, and for example, a known communication network can be used and is the same as described above.
  • the connection type between the estimation device 1 and the external device may be, for example, USB or the like.
  • the memory 102 includes, for example, a main memory, and the main memory is also referred to as a main storage device.
  • the main memory is, for example, a RAM (random access memory).
  • the memory 102 further includes, for example, a ROM (read only memory).
  • the storage device 107 is also called, for example, a so-called auxiliary storage device with respect to the main memory (main storage device).
  • the storage device 107 includes, for example, a storage medium and a drive that reads and writes to the storage medium.
  • The storage medium is not particularly limited and may be, for example, built-in or external; examples include an HD (hard disk), FD (floppy (registered trademark) disk), CD-ROM, CD-R, CD-RW, MO, DVD, flash memory, and memory card. The drive is not particularly limited.
  • The storage device 107 may also be, for example, a hard disk drive (HDD) in which a storage medium and a drive are integrated.
  • The operation program 108 is stored in the storage device 107; as described above, when the CPU 101 executes the operation program 108, it is read from the storage device 107 into the memory 102.
  • The storage device 107 may also store, for example, the above-mentioned processing information 109 (for example, the input information, the output information, and information obtained by the estimation device 1), the cell image extraction information (for example, the cell image extraction model 110), the cell morphology classification information (for example, the cell morphology estimation model 111), and the cytology category estimation information (for example, the cytology category estimation model 112).
  • the estimation device 1 may further include an imaging device 113.
  • the imaging device 113 is, for example, a camera.
  • the imaging device 113 can image the stained sample and input the image.
  • the estimation device 1 may further include, for example, an input device 104 and a display 105.
  • the input device 104 is, for example, a scanner that reads an image, a touch panel, or a keyboard.
  • The display 105 is, for example, an LED display, a liquid crystal display, or the like, and also serves as the output unit 16.
  • the cell image extraction information is information for extracting stained cells from the sample image.
  • Examples of the cell image extraction information include a cell image extraction model 110.
  • The cell image extraction model 110 is, for example, a model generated by learning from stained cell images (for example, cut-out stained cell images) in cervical stained sample images for learning accumulated in a database.
  • When a cervical specimen is stained with the MUC conjugate, the resulting stained specimen contains a mixture of cells that are stained because the MUC conjugate binds to them and cells that are not stained because the MUC conjugate does not bind to them.
  • The learning data may further include, for example, non-stained cell images from the stained sample images for learning; in that case, the stained cell images and the non-stained cell images are learned as learning data, and a model can be generated that discriminates stained cells from non-stained cells.
  • The learning may be, for example, any of AI, machine learning, deep learning, and the like.
  • The machine learning can use, for example, an SVM (Support Vector Machine), and the deep learning can use, for example, a CNN (Convolutional Neural Network).
  • The same applies to the other learning described in the present invention below.
  • the cell morphology classification information is, as described above, information in which each of two or more morphology categories is associated with cell morphology information.
  • the number of the morphology categories and the cell morphology information of each of the morphology categories are not particularly limited, and can be set arbitrarily.
  • The present invention is a technique based on the fact that, as described above, the composition pattern of the morphology of the stained cells differs depending on the condition of the cervix. For this reason, it can be said that, for example, the larger the number of morphology categories, the more accurately the cytology category can be estimated.
  • The lower limit of the number of morphology categories is 2 or more; for example, 5 or more is preferable, and 20 or more is more preferable. The upper limit is not particularly limited; for example, 100 or less is preferable, and 30 or less is more preferable.
  • the cell morphology classification information may be stored in a database, for example, or may be a cell morphology estimation model 111.
  • The cell morphology estimation model 111 is, for example, a model that estimates the corresponding morphology category for cells in a cell image, and can be generated, for example, by learning from learning stained cell images of the cervix corresponding to each of the morphology categories accumulated in a database.
  • As the stained cell images for learning, for example, stained cell images of forms characteristic of each cytology category and stained cell images of forms common to the cytology categories can be used.
  • The type of the cell morphology information is not particularly limited; examples include the items shown in Table 2 below, which, as shown in Table 3 below, can be linked to each of the morphology categories.
  • the cytology category estimation information is information in which each of two or more cytology categories is associated with a composition pattern of a cervical morphological category belonging to each of the cytology categories.
  • As described above, the composition pattern of the morphology differs depending on the condition of the cervix. Therefore, for example, by setting in advance, for each cytology category, the composition pattern of the morphology categories of the stained cells, the corresponding cytology category can be estimated from the composition pattern of the sample image of the subject.
  • the cytology category estimation information may be stored in, for example, a database, or may be a cytology category estimation model 112.
  • The cytology category estimation model 112 is, for example, a model that estimates the corresponding cytology category for the cell composition pattern of a sample image, and can be generated by learning from the composition patterns of the morphology categories of stained cells in the cervix corresponding to the respective cytology categories accumulated in a database.
  • the sample image input unit 11 inputs a sample image of the stained sample.
  • The sample image may be input, for example, by imaging a slide of the stained sample or by reading an image obtained by photographing the slide.
  • The magnification of the input sample image is not particularly limited and can be set or changed as appropriate according to, for example, the magnification of the microscope at the time of photographing the slide.
  • the cell image extraction unit 12 extracts a cell image of stained cells from the sample image.
  • The method for extracting the stained cells from the sample image is not particularly limited; for example, the extraction can be performed using the cell image extraction model 110 as described above.
  • The cell images may be extracted for a large number of the detectable stained cells contained in the sample image, preferably for all detectable stained cells. As described above, since on the order of 100,000 cells are usually present in a stained sample, it is impossible for a cytotechnologist to confirm all stained cells by human judgment. According to the estimation apparatus of the present invention, however, automatic image analysis is possible, so stained cell images can be extracted for all detectable stained cells.
  • The cell image extraction unit 12 may also serve, for example, as a counting unit that counts the number of stained cell images extracted from the sample image, or the estimation device 1 may further include such a counting unit.
  • The cell morphology classification unit 13 classifies the stained cells in the cell images into corresponding morphology categories based on the cell morphology classification information, and determines, for the specimen image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image.
  • The classification into morphology categories can be performed, for example, using the cell morphology estimation model 111 as described above.
  • The way of expressing the composition pattern is not particularly limited; it may be, for example, the ratio or frequency distribution of the cells belonging to each of the morphology categories (a small example follows this list).
  • The cell morphology classification unit 13 may also serve, for example, as a counting unit that counts the number of cell images classified into each of the morphology categories, or the estimation device 1 may further include such a counting unit.
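  • As a minimal illustration, a composition pattern expressed as cell ratios can be computed as follows (the category names A to D follow the example used later with FIG. 9 and FIG. 10):

```python
# Compute the composition pattern of one sample image as the ratio of
# stained cells classified into each morphology category.
from collections import Counter

def composition_pattern(cell_categories):
    """cell_categories: per-cell morphology category, e.g. ['A', 'B', 'B', ...]."""
    counts = Counter(cell_categories)
    total = sum(counts.values())
    return {cat: counts[cat] / total for cat in sorted(counts)}

print(composition_pattern(['A', 'B', 'B', 'C', 'D', 'B', 'A', 'D', 'D', 'B']))
# -> {'A': 0.2, 'B': 0.4, 'C': 0.1, 'D': 0.3}
```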
  • the cytology category estimation unit 14 estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image based on the cytology category estimation information.
  • the estimation of the cytology category can be performed using, for example, the cytology category estimation model 112 as described above.
  • the estimation method of the present embodiment can be performed, for example, by the estimation device 1 of the present embodiment.
  • the sample image input step is a step of inputting a sample image of the stained sample, and can be executed by the sample image input unit 11 of the estimation device 1.
  • the cell image extraction step is a step of extracting a cell image of a stained cell from the sample image, and can be executed by the cell image extraction unit 12 of the estimation device 1.
  • The cell morphology classification step classifies the stained cells in the cell images into corresponding morphology categories based on the cell morphology classification information, and determines, for the specimen image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image. This step can be performed by the cell morphology classification unit 13 of the estimation device 1.
  • the cytology category estimation step is a step of estimating a cytology category for the stained specimen of the specimen image from the composition pattern of the specimen image based on the cytology category estimation information. This process can be performed, for example, by the cytology category estimation unit 14 of the estimation device 1.
  • Embodiment 2: An example of a mode of the estimation apparatus and estimation method of the present invention that determines the appropriateness of the sample image will be described. Unless otherwise indicated, the description of Embodiment 1 also applies to Embodiment 2.
  • The estimation apparatus 1 of the present embodiment may further include, for example, an appropriateness determination unit for the sample image.
  • The estimation method of the present embodiment may further include, for example, a step of determining the appropriateness of the sample image; this step can be performed, for example, by the appropriateness determination unit.
  • The appropriateness determination unit and the appropriateness determination step detect, for example, determination items for the sample image and, when the detection results satisfy the appropriate results, determine that the image is appropriate as the sample image for the cell image extraction unit.
  • In the present invention, the sample image of the stained sample is used; if, for example, there is a problem with the stained sample itself or with the sample image, an erroneous estimation may result. Therefore, erroneous estimation can be suppressed by judging the appropriateness of the sample image before extracting the stained cells.
  • The determination item includes, for example, the presence or absence of blurring of the detection image as an item for determining the appropriateness of the detection image itself. For example, when blurring is detected in the detection image, the detection image is determined to be inappropriate and the cell image extraction process by the cell image extraction unit is stopped; when no blurring is detected in the detection image, the detection image is determined to be appropriate and the cell image extraction process by the cell image extraction unit is executed.
  • The determination items also include, for example, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image as items for determining the appropriateness of the stained sample itself. For example, when the detection image shows that the number of cells is below a threshold, the degree of blood contamination exceeds a threshold, or the degree of inflammation exceeds a threshold, the stained specimen and its specimen image are determined to be inappropriate and the cell image extraction step by the cell image extraction unit is stopped; on the other hand, when the number of cells is at or above the threshold, the degree of blood contamination is at or below the threshold, and the degree of inflammation is at or below the threshold, they are determined to be appropriate and the cell image extraction step is executed.
  • For the determination items, the sample image is determined to be inappropriate when, for example, any one item is found inappropriate.
  • The threshold of each determination item is not particularly limited, and the setting can be changed depending on, for example, how strict the determination should be.
  • The number of cells can be determined, for example, from the number of cells in a predetermined area; the degree of blood contamination can be determined, for example, from the ratio of blood in a predetermined area; and the degree of inflammation can be determined, for example, from the appearance rate of neutrophils.
  • The determination of appropriateness can also use an appropriateness determination model.
  • The appropriateness determination model can be generated, for example, by using, as learning data, inappropriate sample images and appropriate sample images accumulated in a database for each determination item.
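  • A hedged sketch of such rule-based appropriateness checks is shown below; the blur metric (variance of the Laplacian), the blood-pixel heuristic, and all thresholds are illustrative assumptions rather than values from the present invention:

```python
import cv2
import numpy as np

def is_appropriate(img_bgr, min_cells=1000, max_blood_ratio=0.3, blur_thresh=100.0):
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # (1) Image blur: a low variance of the Laplacian suggests a blurred image.
    if cv2.Laplacian(gray, cv2.CV_64F).var() < blur_thresh:
        return False
    # (2) Blood contamination: fraction of strongly red pixels in the image.
    img = img_bgr.astype(np.int32)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    if np.mean((r - g > 40) & (r - b > 40)) > max_blood_ratio:
        return False
    # (3) Cell count: connected components in a simple Otsu threshold mask.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    n_labels, _ = cv2.connectedComponents(mask)
    return n_labels - 1 >= min_cells  # background label excluded
```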
  • the sample image input step is a step of inputting a sample image of the stained sample.
  • The magnification of the sample image is not particularly limited; for example, when the actual size of the slide is 1×, an image of about 100× can be exemplified.
  • The appropriateness determination step includes, for example, steps (A2-1) to (A2-6).
  • First, the determination items are detected for the input sample image (A2-1).
  • The determination items are, for example, the presence or absence of image blurring, the number of cells, the degree of blood contamination, and the degree of inflammation, and these are detected from the sample image.
  • The order of detection of these determination items is not particularly limited.
  • Next, an appropriateness determination is performed for each determination item of the sample image.
  • As described above, the determination of appropriateness includes the determination of the appropriateness of the sample image itself and the determination of the appropriateness of the stained sample itself.
  • The order of these determinations is not particularly limited, and either may precede. However, when the sample image itself is inappropriate, it is difficult to carry out the subsequent processing at all, so it is preferable to determine the appropriateness of the sample image first and then the appropriateness of the stained sample itself.
  • First, the presence or absence of blurring is detected for the sample image (A2-2); when the image is blurred (YES), the sample image itself is determined to be inappropriate and the process ends (END). When no blurring is detected (NO), the process proceeds.
  • Next, the number of cells contained in the sample image is detected (A2-3); when the number of cells is too small (YES), the stained sample itself is determined to be inappropriate and the process ends (END).
  • When the number of cells is not small, that is, sufficient (NO), blood contamination is detected in the sample image (A2-4); if the blood contamination is significant (YES), the stained specimen itself is determined to be inappropriate and the process ends (END). The degree of inflammation is checked in the same way (A2-5), and a sample image that passes all determination items is determined to be appropriate (A2-6).
  • (A3) Cell Image Extraction Step
  • In the cell image extraction step, images (image areas) of stained cells stained with the MUC conjugate are extracted from the sample image determined to be appropriate in step (A2-6).
  • The extraction of an image includes, for example, identifying the image area of a stained cell in the sample image and cutting out the identified image area.
  • FIG. 5 is a schematic view of the extraction of stained cells from a sample image: FIG. 5(A) shows a part of the sample image, and FIG. 5(B) shows the images of the stained cells contained in FIG. 5(A). As shown in FIG. 5(A), for a sample image in which a plurality of stained regions exist, the image areas of the stained cells are identified (image areas 1, 2, and 3 in FIG. 5(A)), and, as shown in FIG. 5(B), the image areas 1, 2, and 3 are cut out as stained cell images.
  • The extraction of the images of the stained cells from the sample image may be performed, for example, by conventional image processing, or the cell image extraction model may be used.
  • The flowchart of FIG. 6 is shown as an example of the extraction of a stained cell image from the sample image.
  • First, the sample image determined to be appropriate in step (A2) is input (A3-1), and binarization processing is performed on the sample image so that stained areas become white and unstained areas become black (A3-2).
  • The binarization processing can use, for example, the cell image extraction model 110.
  • Next, regions of overlapping and/or adjacent white pixels are collectively labeled as one cell (A3-3).
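  • The following is an illustrative sketch of steps (A3-2) and (A3-3) using OpenCV; the HSV bounds for the staining color and the file name are assumed placeholders:

```python
import cv2
import numpy as np

img = cv2.imread("sample_image.png")             # hypothetical input file
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
# A3-2: binarize -> stained color becomes white (255), the rest black (0)
lower = np.array([100, 50, 50])                  # assumed HSV bounds for the stain
upper = np.array([140, 255, 255])
mask = cv2.inRange(hsv, lower, upper)
# A3-3: overlapping/adjacent white pixels are labeled together as one cell
n_labels, labels = cv2.connectedComponents(mask)
print(f"{n_labels - 1} candidate stained cells labeled")
```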
  • Then, from the labels (regions each labeled as one cell) in the sample image, the region having the largest number of stained pixels is selected (A3-4) and cut out as an image (A3-5).
  • Specifically, for example, as shown in the schematic diagram of FIG. 7, among the labeled regions, the region Y having the largest number of stained pixels is selected and cut out as an image.
  • The magnification of the cut-out image is not particularly limited; for example, when the actual size of the slide is 1×, an image of about 400× can be exemplified. When a plurality of labels are present in one sample image, the selection and cutting out of the region having the largest number of stained pixels are repeated for each label.
  • The selection and cutting out (A3-3 to A3-5) of the region having the largest number of stained pixels from a label can be performed, for example, by a conventional method.
  • An example of the selection and cutting out is shown in FIG. 8.
  • The right-hand figure of FIG. 8 is the same as FIG. 7 and shows the image area labeled as one cell enclosed by a bounding box X with height h in the Y-axis direction and width w in the X-axis direction.
  • Y' in the bounding box X is a search rectangle indicating the region in which the stained pixels are counted in the flowchart described below.
  • The size of the search rectangle Y' is arbitrary and can be represented by a height b (b ≤ h) in the Y-axis direction and a width a (a ≤ w) in the X-axis direction.
  • Here, h is the height of the bounding box X in the Y-axis direction, and w is the width of the bounding box X in the X-axis direction.
  • Initially, the variable j indicating the position in the Y-axis direction is 0, and the variable i indicating the position in the X-axis direction is 0.
  • The left-hand figure of FIG. 8 is a flowchart of the selection and cutting out. For the image area of the bounding box X in FIG. 8, for example, the following steps are performed as shown in the flowchart.
  • (a4) The stained pixels in the search rectangle Y' at the position given by the variables j and i are counted (cnt).
  • (a5) If the count exceeds the maximum value so far, the maximum value of the stained pixel count is set in cnt and the position of the search rectangle Y' giving the maximum is recorded as the coordinate position (x, y). The counting position is then moved to the next position (i+1, j) or (i, j+1), and (a4) and (a5) are repeated in the same way (a8).
  • When the search is complete, cnt holds the maximum stained pixel count and (x, y) indicates the position of the search rectangle Y' giving that maximum. Therefore, an image having width a and height b is cut out from the point (x, y), and the process ends (END).
  • By performing such selection and cutting out on the right-hand figure of FIG. 8, for example, the region Y in the figure of FIG. 7 is selected and cut out.
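  • The search-rectangle loop of FIG. 8 can be written, for example, as the following sketch (a direct, unoptimized re-implementation; an integral image could speed up the window sums):

```python
import numpy as np

def cut_max_stained(img, mask, box, a, b):
    """img: source image; mask: binary array (1 = stained pixel);
    box: (x0, y0, w, h) bounding box X of one label."""
    x0, y0, w, h = box
    best_cnt, best_xy = -1, (x0, y0)
    for j in range(h - b + 1):              # variable j: Y-axis position
        for i in range(w - a + 1):          # variable i: X-axis position
            cnt = mask[y0 + j:y0 + j + b, x0 + i:x0 + i + a].sum()   # (a4)
            if cnt > best_cnt:              # (a5) keep the running maximum
                best_cnt, best_xy = cnt, (x0 + i, y0 + j)
    x, y = best_xy
    return img[y:y + b, x:x + a]            # cut width a, height b at (x, y)
```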
  • (A4) Counting Step: the cell images of stained cells extracted in step (A3) may be counted.
  • As described above, the expression of MUC is not confirmed in normal epithelium but is specifically expressed in cervical intraepithelial neoplasia. Therefore, when the cervical specimen is normal, stained cells may not be detected even after staining with the MUC conjugate. Accordingly, for example, the cell images of the stained cells extracted from the sample image are counted (A4), and when the number of stained cells is small (YES), the sample of the sample image may be estimated to be normal and the process may end (END).
  • Specifically, a threshold for the number of stained cells per slide is set, and when the count is below the threshold, the sample can be estimated to be normal.
  • For example, since a cytology slide carries about 100,000 cells per slide, the threshold for normality can be set, for example, to a proportion of stained cells of about 1%, or to a stained cell count of about 1,000.
  • In this case, the sample can be estimated to be normal, that is, the cytology category "NILM".
  • When the number of stained cells is not small (NO), for example when it exceeds the threshold, normality cannot be concluded, and the process proceeds to the next cell morphology classification step.
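  • The screening in step (A4) can be summarized by the following sketch, using the illustrative per-slide threshold of 1,000 stained cells (about 1% of roughly 100,000 cells) mentioned above:

```python
# Normality screening: below the stained-cell threshold, estimate "NILM";
# otherwise continue to the cell morphology classification step (A5).
def screen_normal(n_stained_cells, threshold=1000):
    return "NILM" if n_stained_cells < threshold else "continue to (A5)"

print(screen_normal(120))   # -> NILM
print(screen_normal(5400))  # -> continue to (A5)
```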
  • (A5) Cell Morphology Classification Step: the cell morphology classification step classifies the stained cells in the extracted cell images into corresponding morphology categories based on the cell morphology classification information (A5) and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image.
  • The cell morphology classification step can be performed, for example, using the cell morphology estimation model 111 as described above.
  • FIG. 9 shows a schematic view of the classification of the stained cells. As shown in FIG. 9, each stained cell can be classified, based on the cell morphology classification information, into one of four morphology categories: category A, category B, category C, and category D. This classification can use, for example, the cell morphology estimation model 111 described above.
  • (A6) Cytology Category Estimation Step: the cytology category estimation step estimates the cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on the cytology category estimation information (A6). This estimation can use, for example, the cytology category estimation model 112 described above.
  • As described above, the stained specimen contains stained cells of various forms stained with the MUC conjugate.
  • The composition patterns of these various forms of stained cells differ depending on the condition of the cervix; that is, they differ significantly between the sample groups belonging to the respective cytology categories. Therefore, for example, the composition pattern of the subject is compared with the composition pattern of each cytology category, and the cytology category whose composition pattern is most similar to that of the subject can be estimated to be the cytology category of the subject.
  • Specifically, suppose, for example, that the stained cells in the sample image of the subject are classified into a composition pattern with ratios of 20% category A, 40% category B, 10% category C, and 30% category D.
  • On the other hand, the cytology categories NILM, LSIL, HSIL, and SCC each have a characteristic composition pattern, for example the ratios of categories A to D exemplified in FIG. 10.
  • Therefore, by using the composition pattern of each cytology category as the estimation criterion, the cytology category can be estimated from the composition pattern of the sample image; in this example, the composition pattern with ratios of 20% category A, 40% category B, 10% category C, and 30% category D can be estimated as NILM. A minimal sketch of this comparison appears below.
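  • The sketch below shows one simple realization of this comparison, returning the category whose reference composition pattern is nearest (Euclidean distance) to that of the subject; the reference ratios are made-up placeholders, not the actual patterns of FIG. 10:

```python
import numpy as np

REFERENCE = {                      # hypothetical per-category patterns (A, B, C, D)
    "NILM": [0.20, 0.40, 0.10, 0.30],
    "LSIL": [0.10, 0.20, 0.40, 0.30],
    "HSIL": [0.05, 0.10, 0.25, 0.60],
    "SCC":  [0.02, 0.03, 0.15, 0.80],
}

def estimate_category(pattern):
    """pattern: subject's composition pattern as ratios over categories A-D."""
    pattern = np.asarray(pattern)
    return min(REFERENCE, key=lambda c: np.linalg.norm(pattern - np.asarray(REFERENCE[c])))

print(estimate_category([0.20, 0.40, 0.10, 0.30]))  # -> NILM
```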
  • The present embodiment describes the generation of each of the models illustrated above.
  • Note that the present invention is not limited to these examples.
  • The cell image extraction model 110 can be generated by learning, for example, as described above. In the sample image, a stained region and a non-stained region can be discriminated, for example, by treating the target staining color as positive and other colors as negative. In the learning, for example, image data of cells stained in the target color linked to the label "positive example", and image data of cells stained in other colors linked to the label "negative example", may be used.
  • A specific example of the generation of the cell image extraction model 110 is as follows. First, slides of cervical specimens stained with a MUC conjugate are imaged, and images of stained cells are cut out from the imaged slide images and accumulated in a stained cell database. Then, the accumulated stained cell images are input as learning data to the model generation apparatus and learned, whereby a model can be generated that selects and extracts image areas containing stained cells.
  • Furthermore, using the generated cell image extraction model 110, the sample images accumulated in the stained cell database are processed to confirm whether the extraction of the stained cell images is correct; extraction errors can be corrected and further learning performed to generate a cell image extraction model 110 with further improved extraction accuracy.
  • The cell morphology estimation model 111 can be generated by learning, for example, as described above. In the learning, for example, image data of cells corresponding to each morphology category may be used.
  • First, a plurality of arbitrary morphology categories with mutually different forms are determined (B1-1), and each morphology category is associated with the stained cell images belonging to it and accumulated in morphology classification database 1. Then, the accumulated morphology categories and the stained cell images belonging to them are input as learning data to the model generation apparatus (B1-2), and by learning (B1-3), a model 111 can be generated that determines (estimates) which morphology category a stained cell image to be classified falls under (a sketch of such a classifier is shown after this description).
  • Furthermore, using the generated cell morphology estimation model 111, the stained cell images stored in morphology classification database 2 are classified (B1-4), and whether the classification is correct is confirmed. If incorrect, processing to correct the classification errors is performed (B1-5), and further learning is performed (B1-6) to generate a cell morphology estimation model 111 with further improved classification accuracy.
  • In FIG. 11, database 1 used in the initial learning and database 2 used in checking the classification errors are shown separately, but both may be the same database.
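  • As an illustration of such learning (B1-2/B1-3), the following sketch defines a small CNN classifier over cell-image patches; the patent only specifies learning such as deep learning (e.g., a CNN), so the architecture, patch size, and hyperparameters below are assumptions:

```python
import torch
import torch.nn as nn

n_categories = 4                      # e.g., morphology categories A-D

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, n_categories),   # assumes 64x64 input patches
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(patches, labels):
    """patches: (N, 3, 64, 64) float tensor; labels: (N,) long tensor."""
    optimizer.zero_grad()
    loss = loss_fn(model(patches), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```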
  • Cytology category estimation model 112: the cytology category estimation model 112 can be generated by learning, for example, as described above. In the learning, for example, the composition patterns of samples corresponding to each cytology category may be used.
  • a specific example of generation of the cytology category estimation model 112 is shown below.
  • First, slides of cervices belonging to each cytology category, stained with a MUC conjugate, are imaged, and from the imaged slide images, the composition pattern of the cells belonging to each of the above morphology categories is determined, for example using the cell morphology estimation model 111, linked to the cytology category, and accumulated in a composition pattern database.
  • Then, by inputting the accumulated composition patterns and cytology categories as learning data to the model generation apparatus, a model 112 can be generated that estimates which cytology category the composition pattern of a sample image to be classified corresponds to.
  • Furthermore, using the generated cytology category estimation model 112, the composition patterns stored in the composition pattern database are classified into cytology categories, and whether the classification is correct is confirmed; classification errors can be corrected and further learning performed to generate a cytology category estimation model 112 with further improved classification accuracy.
  • The cytology category estimation model 112 may be, for example, a cluster model obtained by cluster analysis of the composition patterns and the cytology categories.
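  • A hedged sketch of such a cluster model follows: the accumulated composition patterns are clustered with k-means and each cluster is labeled with the majority cytology category of its training samples; the data files and cluster count are hypothetical:

```python
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans

patterns = np.load("patterns.npy")        # hypothetical (n_samples, n_categories)
categories = np.load("categories.npy")    # per-sample cytology category strings

km = KMeans(n_clusters=6, random_state=0, n_init=10).fit(patterns)
# Label each cluster with the majority cytology category of its members.
cluster_to_cat = {
    c: Counter(categories[km.labels_ == c]).most_common(1)[0][0]
    for c in range(km.n_clusters)
}

def estimate(pattern):
    return cluster_to_cat[km.predict(np.asarray(pattern).reshape(1, -1))[0]]
```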
  • The program according to Embodiment 5 of the present invention is a program that causes a computer to execute the estimation method of the present invention.
  • The program of the present embodiment may be recorded on, for example, a computer-readable recording medium.
  • The recording medium is not particularly limited; examples include the storage media described above.
  • According to the present invention, a cytology category can be easily estimated for a cervical specimen, for example, before a pathologist diagnoses the possibility of cervical cancer.

Abstract

Provided is a novel system with which a cytology classification can be easily estimated for a cervical specimen prior to diagnosis by a pathologist. This device for estimating a cytology category includes: a specimen image input unit 11 that inputs a specimen image of a stained specimen, the stained specimen being a cervical specimen stained with a MUC conjugate; a cell image extraction unit 12 that extracts stained cell images from the specimen image; a cell morphology classification unit 13 that classifies the stained cells in the cell images into corresponding morphology categories on the basis of cell morphology classification information and determines, for the specimen image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image; and a cytology category estimation unit 14 that estimates a cytology category for the stained specimen of the specimen image from the composition pattern of the specimen image on the basis of cytology category estimation information.

Description

Device and method for estimating cervical cytology category
The present invention relates to an estimation apparatus and an estimation method for cervical cytology categories.
In cervical cancer screening, it is common to first perform cytology prior to diagnosis by a pathologist. This cytology is known as one of the few screening methods proven to reduce mortality. In cytology, Papanicolaou (Pap.)-stained specimens are generally prepared from specimens obtained by scraping the cervix. A cytotechnologist then searches the stained specimen under a microscope for epithelial cells showing the characteristic morphology of human papillomavirus (HPV) infection and, from the cell morphology, classifies the degree of progression of precancerous lesions and the like according to the Bethesda system. Based on the Bethesda classification result, the pathologist determines the patient's risk of cervical cancer and performs a detailed examination such as tissue diagnosis to make a final diagnosis. Early detection of precancerous lesions by cytology prior to such a doctor's diagnosis is considered useful for treating patients.
In recent years, as cervical cancer has come to affect younger women, screening is now recommended every two years from the age of 20; although the screening rate remains low at around 20%, the number of examinees has increased 2.5-fold compared with 10 years ago. In particular, since the majority of cervical cancers are caused by sexually transmitted HPV infection, and sexual activity tends to begin at younger ages, cervical cancer screening is expected to increase further in the future.
However, cervical cancer screening by cytology relies on manual judgment by expert cytotechnologists, so it requires labor and cost, and the screening cost is a heavy burden on municipalities. Furthermore, the epithelial cells targeted by cytology may exhibit various forms depending on age, presence or absence of inflammation, hormonal environment, and so on, which can make the determination difficult. In addition, because the judgment is manual, it requires skill, and judgments may differ between cytotechnologists.
Specifically, since 100,000 to 300,000 cells are usually present on one slide of the stained specimen, it is impossible for the cytotechnologist to confirm all the cells on the slide. For this reason, the cytotechnologist usually observes, through a microscope, only a region of interest in the stained specimen to judge the morphology, and the time available for this judgment is only about 120 seconds per slide. There is therefore a limit to the accuracy of manual cytology category judgment.
Therefore, the present invention aims to provide a new system and method that can easily estimate the cytological classification of cervical specimens, for example, prior to a pathologist's diagnosis of the possibility of cervical cancer.
In order to achieve the above object, the estimation apparatus of the present invention is an apparatus for estimating a cervical cytology category, including:
a sample image input unit;
a cell image extraction unit;
a cell morphology classification unit; and
a cytology category estimation unit,
wherein the sample image input unit inputs a sample image of a stained sample,
the stained sample being a cervical sample stained with a MUC conjugate;
the cell image extraction unit extracts cell images of stained cells from the sample image;
the cell morphology classification unit classifies the stained cells in the cell images into corresponding morphology categories based on cell morphology classification information in which cell morphology information is associated with each of two or more morphology categories, and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image; and
the cytology category estimation unit estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
The estimation method of the present invention is a method for estimating a cytology category of the cervix, including:
a specimen image input step,
a cell image extraction step,
a cell morphology classification step, and
a cytology category estimation step, wherein
the specimen image input step inputs a specimen image of a stained specimen,
the stained specimen being a cervical specimen stained with a MUC-binding substance;
the cell image extraction step extracts cell images of stained cells from the specimen image;
the cell morphology classification step classifies the stained cells in the cell images into applicable morphology categories based on cell morphology classification information in which each of two or more morphology categories is associated with cell morphology information, and determines, for the specimen image, a composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image; and
the cytology category estimation step estimates a cytology category for the stained specimen of the specimen image from the composition pattern of the specimen image, based on cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
The program of the present invention causes a computer to execute the method of the present invention for estimating a cytology category of the cervix.
The recording medium of the present invention is a computer-readable recording medium on which the program of the present invention is recorded.
According to the present invention, a cytology classification can be easily estimated for a cervical specimen, for example, prior to a pathologist's diagnosis of the possibility of cervical cancer.
FIG. 1 is a block diagram showing an example of the estimation apparatus of the present invention.
FIG. 2 is a block diagram showing an example of the hardware configuration of the estimation apparatus of the present invention.
FIG. 3 is a flowchart showing an example of a part of the estimation method of the present invention.
FIG. 4 is a flowchart showing an example of a part of the estimation method of the present invention.
FIG. 5 is a photograph showing an example of a specimen image.
FIG. 6 is a flowchart showing an example of extracting stained cells from a specimen image.
FIG. 7 is a conceptual diagram showing an example of labeling from a specimen image.
FIG. 8 is a flowchart showing an example of selecting a region with many stained pixels from a label.
FIG. 9 is a photograph showing an example of classification of stained cells.
FIG. 10 is a conceptual diagram showing an example of the relationship between Bethesda system cytology classifications and the classification ratios of stained cells.
FIG. 11 is a flowchart showing an example of a method of generating a cell morphology estimation model.
In the estimation apparatus of the present invention, for example, in the cytology category estimation information, the cytology categories are categories of the Bethesda system.
In the estimation apparatus of the present invention, for example, the cell morphology information is a cell morphology estimation model that estimates, for a cell in a cell image, the applicable morphology category; the cell morphology estimation model is a model generated by learning from stained cervical cell images for learning that correspond to each of the morphology categories; and the cell morphology classification unit classifies the stained cells in the cell images into the applicable morphology categories using the cell morphology estimation model.
In the estimation apparatus of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates, for the composition pattern of cells in a specimen image, the applicable cytology category; the cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of stained cells in cervixes corresponding to each of the cytology categories; and the cytology category estimation unit estimates a cytology category for the stained specimen of the specimen image using the cytology category estimation model.
In the estimation apparatus of the present invention, for example, the cell image extraction unit extracts cell images of stained cells from the specimen image using a cell image extraction model, and the cell image extraction model is a model generated by learning from stained cell images in stained cervical specimen images for learning.
In the estimation apparatus of the present invention, for example, the number of morphology categories is 5 to 100.
The estimation apparatus of the present invention further includes, for example, an adequacy determination unit for the specimen image; the adequacy determination unit detects determination items in the specimen image and, when the detection results satisfy the adequacy criteria, determines that the image is adequate as the specimen image for the cell image extraction unit.
In the estimation apparatus of the present invention, for example, the determination items are at least one selected from the group consisting of the presence or absence of image blur, the number of cells, the degree of blood contamination, and the degree of inflammation in the specimen image.
In the estimation method of the present invention, for example, in the cytology category estimation information, the cytology categories are categories of the Bethesda system.
In the estimation method of the present invention, for example, the cell morphology classification information is a cell morphology estimation model that estimates, for a cell in a cell image, the applicable morphology category; the cell morphology estimation model is a model generated by learning from stained cervical cell images for learning that correspond to each of the morphology categories; and the cell morphology classification step classifies the stained cells in the cell images into the applicable morphology categories using the cell morphology estimation model.
In the estimation method of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates, for the composition pattern of cells in a specimen image, the applicable cytology category; the cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of stained cells in cervixes corresponding to each of the cytology categories; and the cytology category estimation step estimates a cytology category for the stained specimen of the specimen image using the cytology category estimation model.
In the estimation method of the present invention, for example, the cell image extraction step extracts cell images of stained cells from the specimen image using a cell image extraction model, and the cell image extraction model is a model generated by learning from stained cell images in stained cervical specimen images for learning.
In the estimation method of the present invention, the number of morphology categories is 5 to 100.
The estimation method of the present invention further includes, for example, an adequacy determination step for the specimen image; the adequacy determination step detects determination items in the specimen image and, when the detection results satisfy the adequacy criteria, determines that the image is adequate as the specimen image for the cell image extraction step.
In the estimation method of the present invention, for example, the determination items are at least one selected from the group consisting of the presence or absence of image blur, the number of cells, the degree of blood contamination, and the degree of inflammation in the specimen image.
MUC is the core protein of mucin and is known as a component of mucus. Since the cervical epithelium has no mucus, it has been common technical knowledge that MUC is not present in the cervical epithelium. However, as a result of intensive research, the present inventors found that in cervical intraepithelial neoplasia, MUC, whose expression is not observed in normal epithelium, is specifically expressed, so that the possibility of cervical cancer can be tested by detecting the presence of MUC in a specimen isolated from the cervical epithelium.
Furthermore, based on this relationship between MUC and cervical cancer, the present inventors found that a cytology category such as a Bethesda system category can be estimated from the composition pattern of the morphology categories of the stained cells contained in an image of a cervical specimen stained with a MUC-binding substance. Specifically, when a cervical specimen is stained with the MUC-binding substance and the morphologies of the stained cells are classified, it became clear that the composition pattern of the morphologies differs depending on the condition of the cervix. Therefore, by classifying the morphologies of the stained cells contained in a specimen image stained with the MUC-binding substance and determining the composition pattern, it became possible to estimate which cytology category the stained specimen belongs to.
According to the present invention, for example, prior to diagnosis by a physician such as a pathologist, even a person without a medical license can simply and easily estimate, indirectly from the composition pattern of the morphologies of the stained cells, which cytology category the specimen is likely to fall under.
In the present invention, the type of cytology for which the category is estimated is not particularly limited, and examples include the Bethesda system (also called the Bethesda classification). The cytology whose category is estimated is not limited to cytologies known at the time of filing of the present application or its basic application, such as the Bethesda system; the invention can also be used for cytologies that become known thereafter. In the present invention, a cytology category means, for example, a category (including the meanings of class, level, stage, and the like) into which cells are classified in a given cytology. As a specific example, if the cytology is the Bethesda system, the categories are, for example, those shown in Table 1 below (NILM, ASC-US, ASC-H, LSIL, HSIL, SCC).
The Bethesda system is shown as an example of the cytology categories. According to the Bethesda system, cells are classified into the six categories shown in Table 1 below. According to the present invention, for example, as described later, the Bethesda system cytology category can be estimated from the composition pattern of the morphology categories of the stained cells in the specimen image.
[Table 1]
As described above, MUC is the core protein family of mucins, and examples include MUC1, MUC2, MUC3, MUC4, MUC5AC, MUC5B, MUC6, and MUC7. In the present invention, any one type of MUC may be used, or two or more types may be used; among them, MUC1 is preferred.
The type of MUC-binding substance is not particularly limited; it is a substance having binding ability to MUC, preferably a substance exhibiting specific binding, and examples include a MUC antibody.
As described above, the specimen image used in the present invention is an image of a cervical specimen stained with a MUC-binding substance. The method of preparing the cervical specimen is not particularly limited; for example, ordinary slide preparation in cytology can be used. The method of staining the cervical specimen with the MUC-binding substance is also not particularly limited; for example, an ordinary staining method utilizing the binding between a target (for example, an antigen) and a substance that binds to it (for example, an antibody) can be used.
Next, embodiments of the present invention will be described. The present invention is not limited to the following embodiments. In the following drawings, identical parts are denoted by identical reference numerals. Unless otherwise stated, the descriptions of the embodiments may be incorporated into one another, and the configurations of the embodiments may be combined.
In the following embodiments, an image of a cervical specimen stained with a MUC antibody is given as an example of the specimen image, but the present invention is not limited to this; any image of a specimen stained with a substance that binds to MUC may be used. Hereinafter, a specimen whose cytology category is to be estimated by the present invention is referred to as a "subject".
[Embodiment 1]
An example of the estimation apparatus and estimation method of the present invention will be described with reference to the drawings.
FIG. 1 is a block diagram showing an example of the estimation apparatus of this embodiment. The estimation apparatus 1 includes a specimen image input unit 11, a cell image extraction unit 12, a cell morphology classification unit 13, and a cytology category estimation unit 14. The estimation apparatus 1 may further include, for example, a storage unit 15 and an output unit 16. The storage unit 15 includes, for example, a processing information storage unit 151, a cell image extraction information storage unit 152, a cell morphology classification information storage unit 153, and a cytology category estimation information storage unit 154. The processing information storage unit 151 stores, for example, processing information (for example, input information input to the estimation apparatus 1, output information output from the estimation apparatus 1, and information obtained by the estimation apparatus 1); the cell image extraction information storage unit 152 stores cell image extraction information; the cell morphology classification information storage unit 153 stores cell morphology classification information; and the cytology category estimation information storage unit 154 stores cytology category estimation information.
The estimation apparatus 1 may also be referred to, for example, as an estimation system. The estimation apparatus 1 may be a single apparatus including all of the above units, or the units may be connectable via a communication network. The communication network is not particularly limited; a known network, wired or wireless, can be used, and specific examples include the Internet, telephone lines, a LAN (Local Area Network), and WiFi (Wireless Fidelity). In the estimation apparatus 1, the processing of each unit may also be performed on the cloud.
Next, FIG. 2 shows a block diagram of an example of the hardware configuration of the estimation apparatus 1. The estimation apparatus 1 includes, for example, a CPU (central processing unit) 101, a memory 102, a bus 103, an input device 104, a display 105, a communication device 106, a storage device 107, and an imaging device 113. The units of the estimation apparatus 1 are connected to one another via the bus 103 through their respective interfaces (I/F).
The CPU 101 is responsible for overall control of the estimation apparatus 1. In the estimation apparatus 1, the CPU 101 executes, for example, the program of the present invention and other programs, and reads and writes various kinds of information. Specifically, in the estimation apparatus 1, the CPU 101 functions, for example, as the specimen image input unit 11, the cell image extraction unit 12, the cell morphology classification unit 13, and the cytology category estimation unit 14.
The estimation apparatus 1 can be connected to a communication network by, for example, the communication device 106 connected to the bus 103, and can also be connected to external devices via the network. The external devices are not particularly limited, and examples include imaging devices such as cameras, and terminals such as personal computers (PCs), tablets, and smartphones. The connection between the estimation apparatus 1 and the external devices is not particularly limited and may be, for example, wired or wireless. A wired connection may be, for example, a cord connection or a cable connection for using a communication network; a wireless connection may be, for example, a connection using a communication network or a connection using wireless communication. The communication network is not particularly limited; for example, a known network can be used, as described above. The connection between the estimation apparatus 1 and the external devices may also be, for example, USB.
The memory 102 includes, for example, a main memory, which is also called a main storage device. When the CPU 101 performs processing, the memory 102 reads various operation programs 108, such as the program of the present invention, stored in the auxiliary storage device described later, and the CPU 101 receives the data from the memory 102 and executes the programs 108. The main memory is, for example, a RAM (random access memory). The memory 102 may further include, for example, a ROM (read-only memory).
The storage device 107 is also called, for example, an auxiliary storage device in contrast to the main memory (main storage device). The storage device 107 includes, for example, a storage medium and a drive that reads from and writes to the storage medium. The storage medium is not particularly limited and may be, for example, internal or external; examples include an HD (hard disk), an FD (floppy (registered trademark) disk), a CD-ROM, a CD-R, a CD-RW, an MO, a DVD, flash memory, and a memory card, and the drive is not particularly limited. The storage device 107 can also be, for example, a hard disk drive (HDD) in which the storage medium and the drive are integrated. As described above, the storage device 107 stores, for example, the operation programs 108, and the memory 102 reads the operation programs 108 from the storage device 107 when the CPU 101 executes them. The storage device 107 also stores, for example, the processing information 109 (for example, the input information, the output information, and the information obtained by the estimation apparatus 1), the cell image extraction information (for example, a cell image extraction model 110), the cell morphology classification information (for example, a cell morphology estimation model 111), and the cytology category estimation information (for example, a cytology category estimation model 112).
The estimation apparatus 1 may further include the imaging device 113. The imaging device 113 is, for example, a camera. When the estimation apparatus 1 includes the imaging device 113, the stained specimen can be imaged by the imaging device 113, for example, and the image can be input.
The estimation apparatus 1 may further include, for example, the input device 104 and the display 105. The input device 104 is, for example, a scanner that reads images, a touch panel, or a keyboard. The display 105 is, for example, an LED display or a liquid crystal display, and can also serve as the output unit 16.
The cell image extraction information is information for extracting stained cells from the specimen image, and an example is the cell image extraction model 110. The cell image extraction model 110 is, for example, a model generated by learning from stained cell images (for example, cut-out stained cell images) in stained cervical specimen images for learning accumulated in a database. When a cervical specimen is stained with the MUC-binding substance, the resulting stained specimen contains a mixture of cells that are stained because the MUC-binding substance binds to them and cells that are not stained because it does not bind. Therefore, by using images of such stained specimens as specimen images for learning, and the stained cell images contained in them as learning data, it is possible to generate a model that judges, for each of the many cells contained in a stained cervical specimen image, whether it is a stained cell, and extracts the regions containing stained cells as cell images of the stained cells. The learning data may further include, for example, unstained cell images from the specimen images for learning; in that case, the stained cell images and the unstained cell images can be used together as learning data to generate a model that discriminates between stained and unstained cells. The learning may be, for example, AI, machine learning, deep learning, or the like; for machine learning, for example, an SVM (Support Vector Machine) can be used, and for deep learning, for example, a CNN (Convolutional Neural Network) can be used. The same applies to learning hereinafter in the present invention.
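By way of illustration only, the following is a minimal sketch of a CNN-based patch classifier of the kind described above, written in Python with PyTorch. The patch size (64x64 RGB), the two-class convention (class 0 = stained, class 1 = unstained), and the architecture are assumptions made for this sketch; the patent does not specify a particular network.

```python
# Minimal sketch of a stained / unstained patch classifier (assumed architecture).
import torch
import torch.nn as nn

class StainedCellClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Two small conv blocks; sizes are illustrative, not from the patent.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # class 0 = stained, 1 = unstained

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = StainedCellClassifier()
patch = torch.rand(1, 3, 64, 64)              # one hypothetical 64x64 RGB patch
is_stained = model(patch).argmax(dim=1).item() == 0
```

In practice such a model would be trained on the cut-out stained (and optionally unstained) cell images described above; under the patent's wording, an SVM over hand-crafted features would be an equally valid choice.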
As described above, the cell morphology classification information is information in which each of two or more morphology categories is associated with cell morphology information. In the present invention, the number of morphology categories and the cell morphology information of each category are not particularly limited and can be set arbitrarily. As described above, the present invention is based on the fact that the composition pattern of the morphologies of the stained cells differs depending on the condition of the cervix; accordingly, the larger the number of morphology categories, the more accurately the cytology category can be estimated. The lower limit of the number of morphology categories is 2 or more, for example, 5 or more, and preferably 20 or more; the upper limit is not particularly limited and is, for example, 100 or less, and preferably 30 or less.
The cell morphology classification information may be stored, for example, in a database, or may be the cell morphology estimation model 111. The cell morphology estimation model 111 is, for example, a model that estimates, for a cell in a cell image, the applicable morphology category, and can be generated, for example, by learning from stained cervical cell images for learning that correspond to each of the morphology categories and are accumulated in a database. As the stained cell images for learning, for example, stained cell images of morphologies characteristic of each cytology category and stained cell images of morphologies common across cytology categories can be used.
The type of cell morphology information is not particularly limited; examples include the items shown in Table 2 below, and the information can be linked to each morphology category as shown in Table 3 below.
[Table 2]
[Table 3]
The cytology category estimation information is information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category. As described above, when the morphologies of the cells stained with the MUC-binding substance are classified for cervical specimens, it became clear that the composition pattern of the morphologies differs depending on the condition of the cervix. Therefore, for example, by setting in advance, for each cytology category, the composition pattern of the morphology categories of the stained cells, the applicable cytology category can be estimated from the composition pattern of the specimen image of the subject.
The cytology category estimation information may be stored, for example, in a database, or may be the cytology category estimation model 112. The cytology category estimation model 112 is, for example, a model that estimates, for the composition pattern of cells in a specimen image, the applicable cytology category, and can be generated by learning from the composition patterns of the morphology categories of stained cells in cervixes corresponding to each of the cytology categories, accumulated in a database.
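As a sketch of what such a category estimation model could look like, the following trains a scikit-learn SVM on composition-pattern vectors. The five-category layout, the Bethesda labels, and all numbers are fabricated for illustration; real training data would come from the database of labeled specimens described above.

```python
# Minimal sketch: composition pattern (frequencies over morphology categories)
# -> cytology category. All data below is fabricated for illustration.
from sklearn.svm import SVC

# Each row: fraction of stained cells in each of 5 assumed morphology categories.
X_train = [
    [0.70, 0.20, 0.05, 0.05, 0.00],
    [0.10, 0.15, 0.40, 0.25, 0.10],
    [0.05, 0.05, 0.10, 0.30, 0.50],
]
y_train = ["LSIL", "HSIL", "SCC"]  # Bethesda categories of the training specimens

clf = SVC(kernel="rbf").fit(X_train, y_train)
print(clf.predict([[0.60, 0.25, 0.10, 0.05, 0.00]]))  # estimated cytology category
```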
Next, each unit of the estimation apparatus 1 of this embodiment will be described specifically.
The specimen image input unit 11 inputs a specimen image of the stained specimen. The specimen image may be input, for example, by imaging a slide of the stained specimen or by reading a photographed image of the slide. The magnification of the input specimen image is not particularly limited and can be set or changed as appropriate according to, for example, the magnification of the microscope when the slide is photographed.
The cell image extraction unit 12 extracts cell images of stained cells from the specimen image. The method of extracting the stained cells from the specimen image is not particularly limited; for example, they can be extracted using the cell image extraction model 110 described above. In this embodiment, for example, cell images may be extracted for a large number of the detectable stained cells contained in the specimen image, preferably for all detectable stained cells. As described above, a stained specimen usually contains about 100,000 cells per slide, so it is impossible to check all stained cells by a cytotechnologist's manual judgment. With the estimation apparatus of the present invention, however, automatic image-based analysis is possible, so images can be extracted for all detectable stained cells.
The cell image extraction unit 12 may, for example, also serve as a counting unit that detects the number of cell images of stained cells extracted from the specimen image, or the estimation apparatus 1 may further include such a counting unit.
Based on the cell morphology classification information, the cell morphology classification unit 13 classifies the stained cells in the cell images into the applicable morphology categories and determines, for the specimen image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image. The classification into morphology categories can be performed, for example, using the cell morphology estimation model 111, as described above.
The way the composition pattern is expressed is not particularly limited; it may be, for example, the cell ratios or the frequency distribution of the cells belonging to each morphology category. The cell morphology classification unit 13 may, for example, also serve as a counting unit that detects the number of cell images classified into each morphology category, or the estimation apparatus 1 may further include such a counting unit.
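For instance, a frequency-distribution composition pattern can be computed as sketched below; the category names are hypothetical placeholders, not categories taken from the patent.

```python
# Minimal sketch: per-cell morphology labels -> normalized frequency distribution.
from collections import Counter

def composition_pattern(cell_labels, categories):
    counts = Counter(cell_labels)
    total = sum(counts.values()) or 1            # guard against an empty label list
    return [counts[c] / total for c in categories]

categories = ["cat1", "cat2", "cat3", "cat4", "cat5"]  # hypothetical names
labels = ["cat1", "cat5", "cat1", "cat2"]              # labels from unit 13
print(composition_pattern(labels, categories))         # [0.5, 0.25, 0.0, 0.0, 0.25]
```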
Based on the cytology category estimation information, the cytology category estimation unit 14 estimates, from the composition pattern of the specimen image, a cytology category for the stained specimen of the specimen image. The cytology category can be estimated, for example, using the cytology category estimation model 112, as described above.
Next, the estimation method of this embodiment will be described specifically. The estimation method of this embodiment can be performed, for example, by the estimation apparatus 1 of this embodiment.
The specimen image input step is a step of inputting a specimen image of the stained specimen and can be executed by the specimen image input unit 11 of the estimation apparatus 1.
The cell image extraction step is a step of extracting cell images of stained cells from the specimen image and can be executed by the cell image extraction unit 12 of the estimation apparatus 1.
The cell morphology classification step is a step of classifying the stained cells in the cell images into the applicable morphology categories based on the cell morphology classification information, and determining, for the specimen image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the specimen image. This step can be executed by the cell morphology classification unit 13 of the estimation apparatus 1.
The cytology category estimation step is a step of estimating, based on the cytology category estimation information, a cytology category for the stained specimen of the specimen image from the composition pattern of the specimen image. This step can be executed, for example, by the cytology category estimation unit 14 of the estimation apparatus 1.
[Embodiment 2]
An example of a mode of determining the adequacy of the specimen image in the estimation apparatus and estimation method of the present invention will be described. Unless otherwise indicated, the description of Embodiment 1 applies to this Embodiment 2.
The estimation apparatus 1 may further include, for example, an adequacy determination unit for the specimen image. The estimation method of this embodiment may likewise further include an adequacy determination step for the specimen image, which can be executed, for example, by the adequacy determination unit. The adequacy determination unit and the adequacy determination step detect, for example, determination items in the specimen image and, when the detection results satisfy the adequacy criteria, determine that the image is adequate as the specimen image for the cell image extraction unit.
In this embodiment, a specimen image of the stained specimen is used; if there is a problem with the stained specimen itself or with the specimen image, however, it may lead to an erroneous estimation. Therefore, by judging the adequacy of the specimen image before extracting the stained cells, erroneous estimations can be suppressed.
As a determination item for judging the adequacy of the specimen image itself, for example, the presence or absence of blur in the image can be used. For example, when blur is detected in the specimen image, the image is judged inadequate and the cell image extraction step by the cell image extraction unit is canceled; when no blur is detected, the image is judged adequate and the cell image extraction step by the cell image extraction unit is executed.
As determination items for judging the adequacy of the stained specimen itself, for example, the number of cells, the degree of blood contamination, and the degree of inflammation in the specimen image can be used. For example, when it is detected in the specimen image that the number of cells is below a threshold, the degree of blood contamination exceeds a threshold, or the degree of inflammation exceeds a threshold, the stained specimen and its specimen image are judged inadequate and the cell image extraction step by the cell image extraction unit is canceled; conversely, when the number of cells is at or above the threshold, the degree of blood contamination is at or below the threshold, and the degree of inflammation is at or below the threshold, the stained specimen and the specimen image are judged adequate and the cell image extraction step by the cell image extraction unit is executed. If any one of the determination items is found inadequate, for example, the specimen image is judged inadequate.
The thresholds for the determination items are not particularly limited, and the settings can be changed depending on, for example, how severe the judgment should be. The number of cells can be judged, for example, from the number of cells in a given area; the degree of blood contamination, for example, from the proportion of a given area occupied by blood; and the degree of inflammation, for example, from the appearance rate of neutrophils.
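The checks of this embodiment could be sketched as follows; the sharpness metric (variance of the Laplacian) and every threshold value are illustrative assumptions, since the patent deliberately leaves the thresholds open.

```python
# Minimal sketch of the adequacy determination; thresholds are illustrative only.
import cv2

def is_adequate(image_bgr, n_cells, blood_fraction, neutrophil_rate,
                sharpness_min=100.0, min_cells=1000,
                blood_max=0.3, inflammation_max=0.5):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # low variance suggests blur
    if sharpness < sharpness_min:
        return False   # image itself inadequate (blurred)
    if n_cells < min_cells:
        return False   # too few cells on the slide
    if blood_fraction > blood_max:
        return False   # significant blood contamination
    if neutrophil_rate > inflammation_max:
        return False   # severe inflammation
    return True
```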
An adequacy determination model can also be used for the adequacy judgment. The adequacy determination model can be generated, for example, by learning, for each determination item, from inadequate and adequate specimen images accumulated in a database as learning data.
[Embodiment 3]
An example of an estimation method using the estimation apparatus 1 of Embodiments 1 and 2 will be described with reference to the drawings. FIG. 3 and FIG. 4 are flowcharts of this embodiment. The estimation method of this embodiment is not limited to the use of the estimation apparatus 1 of FIG. 1.
(A1) Specimen image input step
The specimen image input step is a step of inputting a specimen image of the stained specimen. The magnification of the specimen image is not particularly limited; for example, when the actual size on the slide is taken as 1x, an image of about 100x can be exemplified.
(A2) Adequacy determination step
The adequacy determination step includes, for example, steps A2-1 to A2-6.
First, the determination items are detected in the input specimen image (A2-1). The determination items are, for example, the presence or absence of image blur, the number of cells, the degree of blood contamination, and the degree of inflammation, and these are detected from the specimen image. The order in which these determination items are detected is not particularly limited.
Then, adequacy is judged for each determination item of the specimen image. As described above, this judgment includes judging the adequacy of the specimen image itself and judging the adequacy of the stained specimen itself. The order of these judgments is not particularly limited, and either may come first. However, if the specimen image itself is inadequate, the subsequent steps are difficult to perform, so it is preferable to judge the adequacy of the specimen image first and then judge the adequacy of the stained specimen itself.
As a specific example, first, the presence or absence of blur is detected in the specimen image (A2-2); if the image is blurred (YES), the specimen image itself is judged inadequate and the process ends (END). If no blur is found (NO), the number of cells contained in the specimen image is detected (A2-3); if the number of cells is too small (YES), the stained specimen itself is judged inadequate and the process ends (END). If the number of cells is sufficient (NO), blood contamination in the specimen image is detected (A2-4); if blood contamination is significant (YES), the stained specimen itself is judged inadequate and the process ends (END). If blood contamination is not significant (NO), inflammation in the specimen image is detected (A2-5); if the inflammation is severe (YES), the stained specimen itself is judged inadequate and the process ends (END). If the inflammation is not severe (NO), the specimen image is judged adequate (A2-6) and the process proceeds to the next step (X).
(A3) Cell image extraction step
In the cell image extraction step, images (image regions) of cells stained with the MUC-binding substance are extracted from the specimen image judged adequate in step (A2-6) (A3). The extraction of the images includes, for example, identifying the image regions of the stained cells in the specimen image and cutting out the identified image regions.
FIG. 5 is a schematic view of the extraction of the stained cells from the specimen image; FIG. 5(A) shows a part of the specimen image, and FIG. 5(B) shows images of the stained cells contained in FIG. 5(A). As shown in FIG. 5(A), for a specimen image in which a plurality of stained regions exist, the image regions of the stained cells are identified (image regions 1, 2, and 3 in FIG. 5(A)), and as shown in FIG. 5(B), those image regions 1, 2, and 3 are cut out as stained images of the stained cells.
The extraction of the stained cell images from the specimen image may be performed, for example, by conventional image processing, and the cell image extraction model may also be used.
Here, the flowchart of FIG. 6 is shown as an example of the extraction of the cell images of the stained cells from the specimen image.
The specimen image judged adequate in step (A2) is input (A3-1), and a binarization process is applied to the specimen image so that stained regions become white and unstained regions become black (A3-2). The binarization can use, for example, the cell image extraction model 110. Next, in the specimen image, overlapping and/or adjacent regions containing white are labeled together as one cell (A3-3). From each label (region labeled as one cell) in the specimen image, the region containing the most stained pixels is then selected (A3-4) and cut out as an image (A3-5). Specifically, as shown in the schematic diagram of FIG. 7, when the image region labeled as one cell is enclosed in a bounding box X, the region Y containing the most stained pixels is selected, and the region Y is cut out as an image. The magnification of the cut-out image is not particularly limited; for example, when the actual size on the slide is taken as 1x, an image of about 400x can be exemplified. When a plurality of labels exist in one specimen image, the selection and cutting out of the region with the most stained pixels are repeated for each label. Specifically, when the cutting out for one label is completed, it is checked whether another labeling target exists in the same specimen image (A3-6); if YES, the steps from labeling to cutting out (A3-3 to A3-5) are repeated until all labels have been processed, and when (A3-6) becomes NO, the process ends (END).
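Steps A3-2 and A3-3 could be sketched with OpenCV as below; Otsu thresholding is used only as a stand-in for the model-based binarization, and the file name is hypothetical.

```python
# Minimal sketch of binarization (A3-2) and labeling (A3-3); Otsu thresholding
# stands in for the learned binarization of cell image extraction model 110.
import cv2

image = cv2.imread("specimen_image.png")           # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
# binary: stained pixels -> 255 (white), unstained pixels -> 0 (black)

n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
for label_id in range(1, n_labels):                # label 0 is the background
    x, y, w, h, area = stats[label_id]             # bounding box X of one cell
    # each (x, y, w, h) region then goes to the search of steps A3-4 / A3-5
```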
The selection and cutting out of the region with the most stained pixels from a label (A3-3 to A3-5) can be performed, for example, by a conventional method. An example of the selection and cutting out is shown in FIG. 8. The right-hand diagram of FIG. 8 is the same as FIG. 7: the image region labeled as one cell is enclosed in a bounding box X of height h in the Y-axis direction and width w in the X-axis direction. Y' in the bounding box X is a search rectangle Y' indicating the region in which the stained pixels are counted in the flowchart described below. The size of the search rectangle Y' is arbitrary and can be expressed by a height b (b < h) in the Y-axis direction and a width a (a < w) in the X-axis direction, where h is the height of the bounding box X in the Y-axis direction and w is its width in the X-axis direction. In the search rectangle Y' in the right-hand diagram of FIG. 8, for example, the variable j indicating the Y-axis position is 0 and the variable i indicating the X-axis position is 0. The left-hand diagram of FIG. 8 is a flowchart of the selection and cutting out. For the image region of the bounding box X in FIG. 8, for example, the following steps are performed as shown in the flowchart (a minimal code sketch of this search follows the steps).
(a1) A variable max indicating the stained-pixel count is set to the initial value 0 (max = 0).
(a2) The variable j indicating the Y-axis position is incremented by 1 from 0 to [height h − height b] (j = 0, h−b, 1).
(a3) The variable i indicating the X-axis position is incremented by 1 from 0 to [width w − width a] (i = 0, w−a, 1).
(a4) The stained pixels in the search rectangle Y' at the position given by the variables j and i are counted (cnt).
(a5) It is determined whether the count cnt is larger than the variable max (max < cnt).
・If max < cnt (YES), the count cnt is set to the variable max, and i and j are set to the variables x and y indicating the coordinate position that produced that count (max = cnt, x = i, y = j).
・If max ≥ cnt (NO), the variable max and the variables x and y are left unchanged.
(a6)(a7) As indicated by the arrows in the right-hand diagram of FIG. 8, the counting position is moved to the next position (i+1, j) or (i, j+1), and (a4) and (a5) are repeated in the same way.
(a8) When this iteration is complete, max holds the maximum stained-pixel count, and the coordinates x, y hold the position of the search rectangle Y' that gave the maximum. An image of width a and height b is therefore cut out from the point (x, y), and the process ends (END). When the selection and cutting out are performed on the right-hand diagram of FIG. 8, for example, the region Y in FIG. 7 is selected and cut out.
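 The following is a minimal Python sketch of the search of steps (a1) to (a8), assuming the binarized label box from the previous sketch; the naive double loop mirrors the flowchart directly (a summed-area table would be an obvious optimization), and the concrete values of a and b in the usage note are assumptions.

```python
import numpy as np

def crop_max_stained(box, a, b):
    """Steps (a1)-(a8) of FIG. 8: slide a search rectangle Y' of width a
    and height b over the bounding box X and return the offset (x, y)
    and the crop with the largest stained-pixel count.  `box` is a 2-D
    0/1 array (1 = stained) of height h and width w, with a < w, b < h."""
    h, w = box.shape
    max_cnt, x, y = 0, 0, 0                        # (a1) max = 0
    for j in range(h - b + 1):                     # (a2) j = 0 .. h-b
        for i in range(w - a + 1):                 # (a3) i = 0 .. w-a
            cnt = int(box[j:j + b, i:i + a].sum()) # (a4) count inside Y'
            if max_cnt < cnt:                      # (a5) max < cnt ?
                max_cnt, x, y = cnt, i, j          # YES: record position
            # NO: leave max, x, y unchanged; (a6)(a7) advance Y'
    return x, y, box[y:y + b, x:x + a]             # (a8) cut a×b at (x, y)

# Usage sketch (a = b = 32 chosen arbitrarily): region Y of FIG. 7 would
# be recovered as x, y, crop = crop_max_stained(binary_box, a=32, b=32)
```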
(A4) Cell counting step
 In the estimation method, for example, the cell images of the stained cells extracted in step (A3) may be counted. As described above, expression of the MUC is not observed in normal epithelium, while specific expression is observed in cervical intraepithelial neoplasia. Therefore, when the cervical specimen is normal, stained cells may not be detected even after staining with the MUC conjugate. For this reason, for example, the cell images of the stained cells extracted from the sample image are counted (A4), and when the number of stained cells is small (YES), the specimen of the sample image is estimated to be normal and the process may end (END). In this estimation, for example, a threshold for the number of stained cells per slide is set, and when the count is below the threshold, the specimen can be estimated to be normal. Since a cytology slide generally carries about 100,000 cells, the threshold for normality can be set, for example, at a proportion of stained cells of about 1%, or at about 1,000 stained cells. In the case of the Bethesda system, for example, the specimen can be estimated to be normal, that is, to fall in the cytology category "NILM". On the other hand, when the number of stained cells is not small (NO), for example, at or above the threshold, the specimen cannot be estimated to be normal, and the process proceeds to the next cell morphology classification step.
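 As a sketch of this gate, the decision of step (A4) reduces to a single threshold comparison; the default values below (a ratio of 1% of a 100,000-cell slide, about 1,000 cells) follow the numbers given above, but any concrete cut-off would be tuned in practice.

```python
def is_normal_by_count(n_stained_cells, cells_per_slide=100_000, ratio=0.01):
    """Step (A4): estimate 'normal' (Bethesda category NILM) when the
    number of stained-cell images extracted in step (A3) falls below
    the threshold; otherwise hand over to the morphology step (A5)."""
    threshold = cells_per_slide * ratio   # about 1,000 cells per slide
    return n_stained_cells < threshold

# Usage sketch: if is_normal_by_count(len(crops)): report "NILM" and stop.
```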
(A5) Cell morphology classification step
 The cell morphology classification step classifies the stained cells in the extracted cell images into the applicable morphology categories on the basis of the cell morphology classification information (A5), and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image. The cell morphology classification step can be performed, for example, using the cell morphology estimation model 111 as described above.
 FIG. 9 shows a schematic diagram of the classification of the stained cells. When four morphology categories are set on the basis of the morphology-item information shown in Table 3 above, each stained cell can be classified, as shown in FIG. 9, into one of the four categories A, B, C, and D on the basis of the cell morphology classification information. This classification can use, for example, the cell morphology estimation model 111 described above.
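 A minimal sketch of step (A5) is shown below: given some per-cell classifier standing in for the cell morphology estimation model 111, the composition pattern is simply the normalized histogram of the predicted categories A to D. The classifier interface is an assumption made for illustration.

```python
from collections import Counter

CATEGORIES = ("A", "B", "C", "D")

def composition_pattern(cell_images, classify_cell):
    """Step (A5): classify every extracted stained-cell image into a
    morphology category and return the composition pattern as the ratio
    of each category.  `classify_cell` stands in for the cell morphology
    estimation model 111 and is assumed to return one of "A".."D"."""
    counts = Counter(classify_cell(img) for img in cell_images)
    total = sum(counts.values()) or 1          # avoid division by zero
    return {c: counts[c] / total for c in CATEGORIES}

# Example: a pattern such as {"A": 0.2, "B": 0.4, "C": 0.1, "D": 0.3}
# is what the next step (A6) compares against the per-category references.
```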
(A6) Cytology category estimation step
 The cytology category estimation step estimates, on the basis of the cytology category estimation information, the cytology category of the stained specimen of the sample image from the composition pattern of the sample image (A6). This estimation can use, for example, the cytology category estimation model 112 described above.
 When a cervical specimen is stained with the MUC conjugate, the stained specimen contains stained cells of various morphologies stained by the MUC conjugate. The composition pattern of these various morphologies of stained cells differs depending on the condition of the cervix; that is, it differs significantly between the specimen groups belonging to the respective cytology categories. Therefore, for example, by comparing the composition pattern of the subject specimen with the composition pattern of each cytology category, the cytology category whose composition pattern resembles that of the subject specimen can be estimated to be the cytology category of the subject specimen. As a specific example, suppose that, after classification into the cell morphology categories A to D illustrated in FIG. 9, the stained cells in the sample image of the subject are classified into a composition pattern with the ratios category A 20%, category B 40%, category C 10%, and category D 30%. On the other hand, in the Bethesda system for cervical cancer, the categories NILM, LSIL, HSIL, and SCC each have a characteristic composition pattern of the ratios of categories A to D, as illustrated, for example, in FIG. 10. By using the composition pattern of each cytology category as the estimation criterion, the cytology category can be estimated from the composition pattern of the sample image. That is, for example, a composition pattern with the ratios category A 20%, category B 40%, category C 10%, and category D 30% can be estimated to be NILM.
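 One simple way to realize step (A6) without a learned model is a nearest-reference comparison: the subject's composition pattern is matched to per-category reference patterns by distance. This is a deliberate simplification standing in for the cytology category estimation model 112, and the reference ratios below are placeholders for the patterns of FIG. 10, not disclosed values.

```python
import math

# Placeholder reference patterns (ratios of categories A-D); the real
# values would come from FIG. 10 / the per-category specimen groups.
REFERENCE_PATTERNS = {
    "NILM": {"A": 0.20, "B": 0.40, "C": 0.10, "D": 0.30},
    "LSIL": {"A": 0.10, "B": 0.30, "C": 0.30, "D": 0.30},
    "HSIL": {"A": 0.05, "B": 0.15, "C": 0.45, "D": 0.35},
    "SCC":  {"A": 0.05, "B": 0.05, "C": 0.30, "D": 0.60},
}

def estimate_category(pattern, references=REFERENCE_PATTERNS):
    """Step (A6), simplified: return the cytology category whose
    reference composition pattern is closest (Euclidean distance)
    to the subject's composition pattern."""
    def dist(ref):
        return math.sqrt(sum((pattern[c] - ref[c]) ** 2 for c in ref))
    return min(references, key=lambda cat: dist(references[cat]))

# estimate_category({"A": 0.2, "B": 0.4, "C": 0.1, "D": 0.3}) -> "NILM"
```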
[Embodiment 4]
 This embodiment describes the generation of each of the models exemplified above. The present invention is not limited to this example.
(1) Cell image extraction model 110
 The cell image extraction model 110 can be generated by learning, for example, as described above. In the sample image, a stained region can be discriminated from an unstained region, for example, by setting the target staining color as positive and all other colors as negative. In the learning, for example, image data of cells stained in the target color together with data linking that image data to the positive class, and image data of cells stained in other colors together with data linking that image data to the negative class, may be used for training.
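 As an illustration of this positive/negative color scheme, the sketch below trains a per-pixel classifier on labeled example pixels; the use of scikit-learn logistic regression and the flattened-RGB feature layout are assumptions made for the sketch, not part of the disclosure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_stain_classifier(positive_rgb, negative_rgb):
    """Learn the target staining color: pixels of cells stained in the
    target color form the positive class, pixels of other colors the
    negative class.  Inputs are (N, 3) arrays of RGB values."""
    X = np.vstack([positive_rgb, negative_rgb]).astype(float) / 255.0
    y = np.concatenate([np.ones(len(positive_rgb)),
                        np.zeros(len(negative_rgb))])
    return LogisticRegression(max_iter=1000).fit(X, y)

def binarize_with_model(model, image_rgb):
    """Apply the learned discriminator to every pixel: stained -> 1
    (white), unstained -> 0 (black), as in step (A3-2)."""
    h, w, _ = image_rgb.shape
    flat = image_rgb.reshape(-1, 3).astype(float) / 255.0
    return model.predict(flat).reshape(h, w).astype(np.uint8)
```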
 A specific example of the generation of the cell image extraction model 110 is as follows. First, slides of cervical specimens stained with the MUC conjugate are imaged, images of stained cells are cut out from the captured slide images, and these are accumulated in a stained-cell database. Then, the accumulated stained-cell images are input to the model generation device as training data and learned, whereby a model that selects and extracts image regions containing stained cells can be generated.
 Then, using the cell image extraction model 110 constructed by learning, the sample images accumulated in the stained-cell database are processed, it is checked whether the extraction of stained-cell images is correct, extraction errors are corrected where present, and further learning is performed; in this way, a cell image extraction model 110 with further improved extraction accuracy can also be generated.
(2) Cell morphology estimation model 111
 The cell morphology estimation model 111 can be generated by learning, for example, as described above. In the learning, for example, image data of cells corresponding to each morphology category may be used for training.
 A specific example of the generation of the cell morphology estimation model 111 is described below, taking the flowchart of FIG. 11 as an example. First, for images of stained cells stained with the MUC conjugate, an arbitrary plurality of morphology categories of differing morphologies are determined (B1-1), and each morphology category is linked to the stained-cell images belonging to it and accumulated in morphology classification database 1. Then, the accumulated morphology categories and the stained-cell images belonging to them are input to the model generation device as training data (B1-2) and learned (B1-3), whereby a model 111 that determines (estimates) which morphology category a stained-cell image to be classified falls under can be generated.
 Then, using the cell morphology estimation model 111 constructed by learning, the cell images of the stained cells stored in morphology classification database 2 are classified (B1-4), it is checked whether the classification is correct, classification errors are corrected where present (B1-5), and further learning is performed (B1-6); in this way, a cell morphology estimation model 111 with further improved classification accuracy can also be generated. In FIG. 11, database 1 used for the initial learning and database 2 used for checking classification errors are shown separately for convenience, but the two may be the same database.
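 The correct-and-retrain cycle of (B1-4) to (B1-6) can be sketched generically as below; the `model.fit`/`model.predict` interface and the `reviewer` callback are assumptions chosen to keep the sketch library-agnostic.

```python
def refine_by_correction(model, db_images, reviewer, X_train, y_train):
    """Steps (B1-4) to (B1-6): classify the database images, let a
    reviewer correct wrong labels, fold the corrections back into the
    training data, and retrain.  `reviewer(img, pred)` returns the
    correct category (equal to `pred` when the prediction was right);
    this interface is an assumption made for the sketch."""
    for img in db_images:
        pred = model.predict(img)        # (B1-4) classify with current model
        true = reviewer(img, pred)       # (B1-5) human check / correction
        if true != pred:
            X_train.append(img)          # keep the corrected example
            y_train.append(true)
    model.fit(X_train, y_train)          # (B1-6) learn again
    return model
```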
(3) Cytology category estimation model 112
 The cytology category estimation model 112 can be generated by learning, for example, as described above. In the learning, for example, the composition patterns of specimens corresponding to each cytology category may be used for training.
 A specific example of the generation of the cytology category estimation model 112 is as follows. First, slides of cervical specimens belonging to each cytology category and stained with the MUC conjugate are imaged; from the captured slide images, the composition pattern of the cells belonging to each morphology category is detected, for example using the cell morphology estimation model 111, linked to the cytology category, and accumulated in a composition pattern database. Then, the accumulated composition patterns and cytology categories are input to the model generation device as training data and learned, whereby a model 112 that estimates which cytology category the composition pattern of a sample image to be classified falls under can be generated.
 Then, using the cytology category estimation model 112 constructed by learning, the composition patterns stored in the composition pattern database are classified into cytology categories, it is checked whether the classification is correct, classification errors are corrected where present, and further learning is performed; in this way, a cytology category estimation model 112 with further improved classification accuracy can also be generated.
 The cytology category estimation model 112 may be, for example, a cluster model obtained by cluster analysis of the composition patterns and the cytology categories.
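 If the cluster-model variant is taken literally, one plausible minimal form is a nearest-centroid classifier over composition-pattern vectors; scikit-learn's NearestCentroid is used here purely as an illustration of that idea, not as the disclosed implementation.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid

def fit_cluster_model(patterns, categories):
    """Cluster-style cytology category estimation model 112 (sketch):
    each training row is a composition pattern (ratios of categories
    A-D) labeled with its cytology category; prediction assigns a new
    pattern to the category with the nearest centroid."""
    X = np.asarray(patterns, dtype=float)   # shape (n_specimens, 4)
    y = np.asarray(categories)              # e.g. "NILM", "LSIL", ...
    return NearestCentroid().fit(X, y)

# Usage sketch:
# model = fit_cluster_model([[0.2, 0.4, 0.1, 0.3], ...], ["NILM", ...])
# model.predict([[0.2, 0.4, 0.1, 0.3]])  # -> array(["NILM"], ...)
```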
[Embodiment 5]
 A program according to Embodiment 5 of the present invention is a program capable of executing the estimation method of the present invention on a computer. The program of this embodiment may also be recorded, for example, on a computer-readable recording medium. The recording medium is not particularly limited, and examples include the storage media described above.
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above embodiments. Various modifications that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 This application claims priority based on Japanese Patent Application No. 2017-161944 filed on August 25, 2017, the entire disclosure of which is incorporated herein.
 As described above, according to the present invention, the cytology category of a cervical specimen can be estimated easily, for example, before a pathologist diagnoses the likelihood of cervical cancer.

Claims (18)

  1. A device for estimating a cervical cytology category, comprising:
    a sample image input unit,
    a cell image extraction unit,
    a cell morphology classification unit, and
    a cytology category estimation unit, wherein
    the sample image input unit inputs a sample image of a stained specimen,
    the stained specimen is a cervical specimen stained with a MUC conjugate,
    the cell image extraction unit extracts cell images of stained cells from the sample image,
    the cell morphology classification unit classifies the stained cells in the cell images into applicable morphology categories on the basis of cell morphology classification information in which each of two or more morphology categories is associated with cell morphology information, and determines, for the sample image, a composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image, and
    the cytology category estimation unit estimates a cytology category for the stained specimen of the sample image from the composition pattern of the sample image, on the basis of cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
  2. The estimation device according to claim 1, wherein, in the cytology category estimation information, the cytology category is a category of the Bethesda system.
  3. The estimation device according to claim 1 or 2, wherein
    the cell morphology information is a cell morphology estimation model that estimates, for a cell in a cell image, the applicable morphology category,
    the cell morphology estimation model is a model generated by learning from training stained-cell images of the cervix corresponding to each of the morphology categories, and
    the cell morphology classification unit classifies the stained cells in the cell images into the applicable morphology categories by means of the cell morphology estimation model.
  4. The estimation device according to any one of claims 1 to 3, wherein
    the cytology category estimation information is a cytology category estimation model that estimates, for a composition pattern of cells of a sample image, the applicable cytology category,
    the cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of stained cells in the cervix corresponding to each of the cytology categories, and
    the cytology category estimation unit estimates the cytology category of the stained specimen of the sample image by means of the cytology category estimation model.
  5. The estimation device according to any one of claims 1 to 4, wherein
    the cell image extraction unit extracts the cell images of stained cells from the sample image using a cell image extraction model, and
    the cell image extraction model is a model generated by learning from stained-cell images in training stained-specimen images of the cervix.
  6. The estimation device according to any one of claims 1 to 5, wherein the number of morphology categories is 5 to 100.
  7. The estimation device according to any one of claims 1 to 6, further comprising a suitability determination unit for the sample image, wherein the suitability determination unit detects determination items for the sample image and, when the detection results satisfy the criteria for suitability, determines that the sample image is suitable as the sample image for the cell image extraction unit.
  8. The estimation device according to claim 7, wherein the determination items are at least one selected from the group consisting of the presence or absence of image blur, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image.
  9. A method for estimating a cervical cytology category, comprising:
    a sample image input step,
    a cell image extraction step,
    a cell morphology classification step, and
    a cytology category estimation step, wherein
    in the sample image input step, a sample image of a stained specimen is input,
    the stained specimen is a cervical specimen stained with a MUC conjugate,
    in the cell image extraction step, cell images of stained cells are extracted from the sample image,
    in the cell morphology classification step, the stained cells in the cell images are classified into applicable morphology categories on the basis of cell morphology classification information in which each of two or more morphology categories is associated with cell morphology information, and, for the sample image, a composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image is determined, and
    in the cytology category estimation step, a cytology category is estimated for the stained specimen of the sample image from the composition pattern of the sample image, on the basis of cytology category estimation information in which each of two or more cytology categories is associated with a composition pattern of cervical morphology categories belonging to that cytology category.
  10. The estimation method according to claim 9, wherein, in the cytology category estimation information, the cytology category is a category of the Bethesda system.
  11. The estimation method according to claim 9 or 10, wherein
    the cell morphology classification information is a cell morphology estimation model that estimates, for a cell in a cell image, the applicable morphology category,
    the cell morphology estimation model is a model generated by learning from training stained-cell images of the cervix corresponding to each of the morphology categories, and
    in the cell morphology classification step, the stained cells in the cell images are classified into the applicable morphology categories by means of the cell morphology estimation model.
  12. The estimation method according to any one of claims 9 to 11, wherein
    the cytology category estimation information is a cytology category estimation model that estimates, for a composition pattern of cells of a sample image, the applicable cytology category,
    the cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of stained cells in the cervix corresponding to each of the cytology categories, and
    in the cytology category estimation step, the cytology category of the stained specimen of the sample image is estimated by means of the cytology category estimation model.
  13. The estimation method according to any one of claims 9 to 12, wherein
    in the cell image extraction step, the cell images of stained cells are extracted from the sample image using a cell image extraction model, and
    the cell image extraction model is a model generated by learning from stained-cell images in training stained-specimen images of the cervix.
  14. The estimation method according to any one of claims 9 to 13, wherein the number of morphology categories is 5 to 100.
  15. The estimation method according to any one of claims 9 to 14, further comprising a suitability determination step for the sample image, wherein, in the suitability determination step, determination items are detected for the sample image and, when the detection results satisfy the criteria for suitability, the sample image is determined to be suitable as the sample image for the cell image extraction step.
  16. The estimation method according to claim 15, wherein the determination items are at least one selected from the group consisting of the presence or absence of image blur, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image.
  17. A program causing a computer to execute the method for estimating a cervical cytology category according to any one of claims 9 to 16.
  18. A computer-readable recording medium on which the program according to claim 17 is recorded.