WO2016152242A1 - Cytodiagnosis assistance device, cytodiagnosis assistance method, remote diagnosis assistance system, service provision system, and image processing method - Google Patents

Cytodiagnosis assistance device, cytodiagnosis assistance method, remote diagnosis assistance system, service provision system, and image processing method Download PDF

Info

Publication number
WO2016152242A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
cell
processor
cytodiagnosis
target image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/052426
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Hideharu Hattori
Yasuki Kakishita
Noritaka Uchida
Sadamitsu Aso
Fumiaki Hamasato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi High Tech Corp
Original Assignee
Hitachi High Technologies Corp
Hitachi High Tech Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi High Technologies Corp and Hitachi High Tech Corp
Priority to EP16768145.1A (patent EP3276573A4)
Priority to CN201680014083.7A (patent CN107430757A)
Priority to US15/556,546 (patent US10453192B2)
Publication of WO2016152242A1
Anticipated expiration
Legal status: Ceased

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/483Physical analysis of biological material
    • G01N33/4833Physical analysis of biological material of solid biological material, e.g. tissue samples, cell cultures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10088Magnetic resonance imaging [MRI]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion

Definitions

  • The present invention relates to a cell diagnosis support apparatus, a cell diagnosis support method, a remote diagnosis support system, a service providing system, and an image processing method, and relates, for example, to an image processing technique for supporting cell diagnosis.
  • In Patent Document 1, a low-magnification image is generated from a high-magnification image, the image is coarsely classified using the low-magnification image, and the pathological tissue is then classified using the high-magnification image from which the low-magnification image was generated.
  • However, tissues and cells can take various shapes depending on the type of abnormal cell (e.g., cancer) and its degree of progression, so a single image may fall under several suspicious classifications; narrowing the result down to one classification can therefore lead to misdiagnosis. With an approach such as that of Patent Document 1, even when an image may correspond to several types of abnormal cells, it is forced into a single classification, which leads to erroneous detection.
  • The present invention has been made in view of this situation, and provides a technique for determining tissues and cells from a single image even when they take various shapes depending on the type and degree of progression of abnormal cells (e.g., cancer).
  • To solve the above problem, a feature amount of cell deformation is calculated for each type of abnormal cell (e.g., cancer). More specifically, the cytodiagnosis support apparatus according to the present invention executes a process of extracting feature amounts of a plurality of directional components from a processing target image, a process of determining from those feature amounts whether the image corresponds to one classification, and a process of determining whether the determination has been completed for all preset classifications.
  • According to the present invention, tissues and cells can be determined even when they take various shapes depending on the type and degree of progression of abnormal cells (e.g., cancer).
  • FIG. 1 is a block diagram showing the functions of the image processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is a diagram showing a hardware configuration example of the image processing apparatus 1.
  • FIG. 3 is a diagram for explaining an example of a filter of the feature extraction unit 11.
  • FIG. 4 is a diagram for explaining an example of the operation of the feature extraction unit 11 in one direction.
  • FIGS. 5 and 6 are diagrams for explaining an example of the operation of the one-classification determination unit 12.
  • FIG. 7 is a diagram for explaining an example of a GUI for cancer determination.
  • FIG. 8 is a diagram for explaining an example of the operation of the drawing unit 14.
  • FIG. 9 is a flowchart for explaining the overall operation of the image processing apparatus 1 according to the first embodiment.
  • FIG. 10 is a block diagram showing the functions of the image processing apparatus according to the second embodiment of the present invention.
  • FIG. 11 is a diagram for explaining an example of the operation of the one-classification determination unit 12 according to the second embodiment.
  • FIG. 12 is a flowchart for explaining the operation of the learning unit 16.
  • FIG. 13 is a flowchart for explaining the overall operation of the image processing apparatus according to the second embodiment.
  • The present invention relates to an image processing technique for detecting specific tissues or cells (for example, cancer) in an image obtained by photographing a tissue or cell section on a slide glass with an imaging device such as a camera mounted on a microscope.
  • Embodiments of the present invention capture the degree of deformation of cells, determine the presence or absence of abnormal cells (e.g., cancer) for each abnormal-cell type, and calculate an abnormal-cell (e.g., cancer) likelihood, thereby realizing image processing that suppresses both missed detections and false detections of abnormal cells (e.g., cancer).
  • The embodiments of the present invention may be implemented by software running on a general-purpose computer, by dedicated hardware, or by a combination of software and hardware.
  • In the following, each processing unit implemented as a program (for example, the feature extraction unit) may be described as the subject of an operation; since a program performs its defined processing by being executed by a processor (CPU or the like) using the memory and the communication port (communication control device), the description may equally be read with the processor as the subject.
  • FIG. 1 is a block diagram showing a functional configuration of an image processing apparatus according to an embodiment of the present invention.
  • The image processing apparatus 1 includes an input unit 10, a feature extraction unit 11, a one-classification determination unit 12, a multiple-classification determination unit 13, a drawing unit 14, a recording unit 15, a control unit 91, and a memory 90.
  • The image processing apparatus may be mounted in a tissue/cell image acquisition device such as a virtual slide scanner or, as described later (third and fourth embodiments), implemented in a server connected to the tissue/cell image acquisition device via a network.
  • The input unit 10, feature extraction unit 11, one-classification determination unit 12, multiple-classification determination unit 13, drawing unit 14, and recording unit 15 in the image processing apparatus 1 may be realized as programs or as modules.
  • the image data is input to the input unit 10.
  • The input unit 10 obtains still image data encoded in a format such as JPG, JPEG 2000, PNG, or BMP, captured at predetermined time intervals by imaging means such as a camera built into the microscope, and such an image may be used as the input image.
  • The input unit 10 may also extract still image frames at predetermined intervals from moving image data in a format such as Motion JPEG, MPEG, H.264, or HD/SDI, and use such a frame as the input image.
  • Further, the input unit 10 may use, as the input image, an image acquired by the imaging means via a bus, a network, or the like, or an image already stored on a removable recording medium.
  • The feature extraction unit 11 extracts feature quantities, such as a plurality of directional components related to cells, from the image.
  • The one-classification determination unit 12 calculates the degree of cell deformation from the extracted feature quantities and classifies the cells as normal or abnormal with respect to one classification.
  • The multiple-classification determination unit 13 classifies the tissue/cells using a plurality of preset one-classification results.
  • the drawing unit 14 draws a detection frame on the image so as to surround the abnormal cells classified by the multiple classification determination unit 13.
  • the recording unit 15 stores an image in which the detection frame is drawn on the original image by the drawing unit 14 in the memory 90.
  • the control unit 91 is realized by a processor and connected to each element in the image processing apparatus 1.
  • the operation of each element of the image processing apparatus 1 is an autonomous operation of each component described above or an operation performed according to an instruction from the control unit 91.
  • In this way, the one-classification determination unit 12 classifies each cell as a normal cell or an abnormal cell, and the multiple-classification determination unit 13 classifies the tissues and cells using a plurality of predetermined classification results.
  • FIG. 2 is a diagram illustrating a hardware configuration example of the image processing apparatus 1 according to the embodiment of the present invention.
  • The image processing apparatus 1 includes a CPU (processor) 201 that executes various programs, a memory 202 that stores the various programs, a storage device 203 (corresponding to the memory 90) that stores various data, an output device 204 that outputs images after detection, an input device 205 that receives user instructions, and a communication device 206.
  • the CPU 201 reads various programs from the memory 202 and executes them as necessary.
  • the memory 202 stores the input unit 10, the feature extraction unit 11, the one classification determination unit 12, the multiple classification determination unit 13, the drawing unit 14, and the recording unit 15 as programs.
  • the learning unit 16 is a configuration necessary for the second embodiment, and the image processing apparatus 1 according to the first embodiment does not include the learning unit 16.
  • The storage device 203 stores the processing target image, the one-classification results and likelihood values generated by the one-classification determination unit 12, the tissue/cell classification result generated by the multiple-classification determination unit 13, the position information used by the drawing unit 14 to draw the detection frame, and the like.
  • the output device 204 includes devices such as a display, a printer, and a speaker.
  • the output device 204 displays data generated by the drawing unit 14 on the display screen.
  • the input device 205 includes devices such as a keyboard, a mouse, and a microphone. An instruction (including determination of a processing target image) by the user is input to the image processing apparatus 1 by the input device 205.
  • The communication device 206 is not an essential component of the image processing apparatus 1; when a personal computer or the like connected to the tissue/cell image acquisition apparatus includes a communication device, the image processing apparatus 1 need not hold the communication device 206.
  • the communication device 206 performs an operation of receiving data (including an image) transmitted from another device (for example, a server) connected via a network and storing it in the storage device 203.
  • FIG. 3 shows a filter for obtaining a feature amount in the direction of 0 degrees.
  • The filter coefficients of region 1 (regions other than cells and cell nuclei), region 2 (cell region), and region 3 (cell nucleus region) in FIG. 3 are 0, 1, and -1, respectively.
  • In Equation 1, pj is a pixel value, kj is a filter coefficient, and m is the number of filter coefficients.
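The patent's equation images do not survive in this text. A reconstruction of Equation 1 that is consistent with the symbol definitions above (the feature amount for direction i as a filter-weighted sum of pixel values) would be:

```latex
f_i = \sum_{j=1}^{m} p_j \, k_j
```

This is an assumed rendering inferred from the prose, not the original figure.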
  • the feature quantity in each direction is calculated using a filter for obtaining the feature quantity in the direction from 0 degrees to 359 degrees.
  • the feature value distribution is calculated using the feature values (f0 to f359) from 0 degrees to 359 degrees.
  • Alternatively, instead of preparing a filter for each direction, the feature value fi in each direction may be obtained using a filter for a single direction and images obtained by rotating the target image one degree at a time.
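As a concrete illustration of the rotate-and-filter scheme just described, here is a minimal Python sketch, assuming NumPy and SciPy; the kernel contents, patch handling, and function names are illustrative choices, not the patent's actual coefficients or code.

```python
import numpy as np
from scipy.ndimage import rotate

def directional_features(image, kernel, t=360):
    """Feature amount f_i for each of t directions (Equation 1):
    rotate the target image one degree at a time and take the
    filter-weighted sum of pixel values under a single directional
    kernel, instead of preparing a separate kernel per direction."""
    h, w = kernel.shape
    features = np.empty(t)
    for i in range(t):
        rotated = rotate(image, angle=i, reshape=False, mode="nearest")
        # A single center crop the size of the kernel stands in for the
        # full patch scan of the target image.
        cy, cx = rotated.shape[0] // 2, rotated.shape[1] // 2
        patch = rotated[cy - h // 2:cy - h // 2 + h,
                        cx - w // 2:cx - w // 2 + w]
        features[i] = float(np.sum(patch * kernel))  # f_i = sum_j p_j * k_j
    return features
```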
  • The one-classification determination unit 12 calculates the variance var of the distribution of the feature quantities fi obtained by the feature extraction unit 11, using Equations 2 and 3.
  • fav represents an average value of fi
  • fsd represents a standard deviation of fi.
  • t represents the number of direction components to be obtained, for example, 360.
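Equations 2 and 3 are likewise missing from this text. Given the definitions of fav, fsd, and t above, a consistent reconstruction is:

```latex
f_{av} = \frac{1}{t} \sum_{i=0}^{t-1} f_i, \qquad
\mathrm{var} = f_{sd}^{2} = \frac{1}{t} \sum_{i=0}^{t-1} \bigl(f_i - f_{av}\bigr)^{2}
```

Again, an assumed rendering rather than the original figure.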
  • The calculated variance var represents the uniformity of the cells; the cells are classified as normal or abnormal from this value, and the variance is used as the abnormal-cell (e.g., cancer) likelihood value.
  • As shown in FIG. 6(a), when a tissue/cell image containing uniformly shaped cells is input, the feature distribution of FIG. 6(b) is obtained; since the variance var indicating the abnormal-cell (e.g., cancer) likelihood is below the threshold Th, the input target image is classified as normal cells.
  • As shown in FIG. 6(c), when a tissue/cell image containing non-uniformly shaped cells is input, the feature distribution of FIG. 6(d) is obtained; since the variance var indicating the cancer likelihood is at or above the threshold Th, the input target image is classified as abnormal cells.
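The one-classification decision thus reduces to a variance test. A minimal sketch, assuming the `directional_features` output above and an arbitrary threshold Th as the text states:

```python
def classify_one(features, threshold):
    """One-classification determination (Equations 2 and 3): the
    variance of the directional features measures how non-uniform the
    cell shapes are; at or above the threshold the image is classified
    as abnormal cells for this type, below it as normal cells."""
    fav = features.mean()
    var = float(((features - fav) ** 2).mean())  # abnormal-cell likelihood
    return var >= threshold, var
```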
  • FIG. 7 is a diagram showing an example of a GUI (graphical user interface) for cancer determination as an example of cell determination.
  • FIG. 7 shows an example for stomach cancer, with classification results for poorly differentiated tubular adenocarcinoma, moderately differentiated tubular adenocarcinoma, well differentiated tubular adenocarcinoma, papillary adenocarcinoma, and signet ring cell carcinoma.
  • In this example, the one-classification determination unit 12 classifies the input target image as containing poorly differentiated tubular adenocarcinoma, an abnormal cell type, with a cancer likelihood value of 0.89.
  • For moderately differentiated tubular adenocarcinoma, the unit classifies the image as normal cells (the type is not contained), with a cancer likelihood value of 0.31.
  • For well differentiated tubular adenocarcinoma, it likewise classifies the image as normal cells, with a likelihood value of 0.21.
  • For papillary adenocarcinoma, it classifies the image as normal cells, with a likelihood value of 0.11.
  • For signet ring cell carcinoma, it classifies the image as normal cells, with a likelihood value of 0.05.
  • The multiple-classification determination unit 13 compares the abnormal-cell (e.g., cancer) likelihood values obtained by the one-classification determination unit 12 for the plurality of predetermined classifications with an arbitrary threshold Th, and displays only the abnormal-cell types whose likelihood exceeds Th in the likelihood determination result. In the example of FIG. 7, poorly differentiated tubular adenocarcinoma is displayed. Depending on the progression or type of abnormal cells (e.g., cancer), an image may be judged to contain several abnormal-cell types; in that case the likelihood values of several types exceed Th, and all of those types are displayed in the determination result.
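A sketch of this thresholding step follows; the likelihood values reuse the gastric-cancer example of FIG. 7, and the threshold of 0.5 is an arbitrary illustrative choice:

```python
def determine_types(likelihoods, threshold):
    """Multiple-classification determination: keep every abnormal-cell
    type whose likelihood exceeds Th, since one image may legitimately
    match several types depending on cancer progression."""
    return {name: v for name, v in likelihoods.items() if v > threshold}

shown = determine_types({
    "poorly differentiated tubular adenocarcinoma": 0.89,
    "moderately differentiated tubular adenocarcinoma": 0.31,
    "well differentiated tubular adenocarcinoma": 0.21,
    "papillary adenocarcinoma": 0.11,
    "signet ring cell carcinoma": 0.05,
}, threshold=0.5)
# -> {'poorly differentiated tubular adenocarcinoma': 0.89}
```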
  • When the "Image" button in FIG. 7 is pressed for an item determined by the one-classification determination unit 12 to be abnormal cells (e.g., cancer), the drawing unit 14 draws a detection frame in the input target image to indicate where abnormal cells are suspected, as shown in FIG. 8.
  • When the "Image" button in FIG. 7 is pressed for an item determined to be normal cells, the input target image is displayed as it is, without anything drawn on it.
  • (v) Recording unit 15: the recording unit 15 stores, in the memory 90, the target image and the coordinate information used by the drawing unit 14 to draw the detection frame on it.
  • FIG. 9 is a flowchart for explaining the operation of the image processing apparatus 1 according to the embodiment of the present invention.
  • In the following, each processing unit (the input unit 10, the feature extraction unit 11, and so on) is described as the subject of an operation, but the description may also be read with the CPU 201 as the subject, executing each processing unit as a program.
  • Step 801 The input unit 10 receives the input image and outputs the input image to the feature extraction unit 11.
  • Step 802 The feature extraction unit 11 obtains feature quantities fi of a plurality of direction components using the above-described Expression 1.
  • Step 803 The one classification determination unit 12 calculates the variance value var indicating the distribution of the feature amount fi using the above-described Expressions 2 and 3 using the feature amount fi output from the feature extraction unit 11.
  • Step 804 The one-classification determination unit 12 compares the calculated variance var with the threshold Th. If var ≥ Th, the process proceeds to step 805; if var < Th, it proceeds to step 806.
  • Step 805 The one classification determination unit 12 sets abnormal cells (for example, 1) in the classification result res.
  • Step 806 The one classification determination unit 12 sets normal cells (for example, 0) in the classification result res.
  • Step 807 The multiple-classification determination unit 13 repeats steps 802 to 806 so that the one-classification determination is performed for all preset types; this determines, for every preset type, whether the cells are normal or abnormal. Since the coefficients of the filter (FIG. 3) used to obtain the feature quantity fi differ for each type, the filter coefficients are changed before returning to step 802 when another type is to be classified. When all types have been determined, the process proceeds to step 808.
  • Step 808 For the type determined as an abnormal cell, the drawing unit 14 draws and displays a detection frame indicating the abnormal cell on the image when the image button shown in FIG. 7 is pressed. For the type determined as a normal cell, the drawing unit 14 does not draw a detection frame on the image when the image button shown in FIG. 7 is pressed.
  • The recording unit 15 stores, in the memory 90 (corresponding to the storage device 203), the target image and the coordinate information used by the drawing unit 14 to draw the detection frame on it.
  • As described above, according to the first embodiment, the variance indicating the degree of cell deformation is obtained from the feature quantities of a plurality of directional components, so that for each single classification, normal and abnormal cells can be distinguished in one image while suppressing erroneous detection and overdetection.
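Condensing steps 801-807 into one sketch (reusing the hypothetical `directional_features` and `classify_one` helpers above; the per-type filter table is an assumed structure, not the patent's data layout):

```python
def process_image(image, type_filters, threshold):
    """First-embodiment flow: for every preset type, swap in that
    type's filter coefficients (step 807), extract directional
    features (step 802), and run the one-classification
    determination (steps 803-806)."""
    results = {}
    for type_name, kernel in type_filters.items():
        features = directional_features(image, kernel)
        is_abnormal, var = classify_one(features, threshold)
        results[type_name] = {"res": int(is_abnormal), "likelihood": var}
    return results
```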
  • FIG. 10 shows an image processing apparatus 1 according to the second embodiment.
  • It shares much of its configuration with FIG. 1 of the first embodiment, but the operations of the feature extraction unit 11 and the one-classification determination unit 12 differ from those of FIG. 1, and a learning unit 16 is added. Therefore, the differing and added components are described here with reference to FIG. 10, and the overall processing flow, which differs from FIG. 9, is described with reference to FIG. 13.
  • The learning unit 16 includes the same components as the feature extraction unit 11 and the one-classification determination unit 12 and uses them to learn how cells deform, for example by a conventional machine learning technique. Note that the images used for learning by the learning unit 16 differ from the images input as evaluation targets.
  • the feature quantity fi of the plurality of direction components obtained by the feature extraction unit 11 includes information indicating the shape of a part of the cell.
  • Using the feature quantities fi of the plurality of directional components obtained by the feature extraction unit 11, the one-classification determination unit 12 learns the degree of cell deformation according to Equations 4 and 5, for example with a conventional machine learning technique, so that a normal cell in the input tissue/cell image is determined to be normal, for example by logistic regression processing, and an abnormal cell is determined to be abnormal.
  • In Equation 4, w is a weight matrix, f is the vector of the directional feature components fi obtained from the input image, b is an offset value, g is a nonlinear function, and y is the calculation result; learning finds the weight w and the offset b.
  • In Equation 5, pj is a pixel value, wj is a filter coefficient, bi is an offset value, m is the number of filter coefficients, N is the number of processes, and h is a nonlinear function.
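Equations 4 and 5 are also absent from this text. A reconstruction consistent with the symbol definitions above is:

```latex
y = g(\mathbf{w} \mathbf{f} + b)                                        % Equation 4
f_i = h\!\Bigl(\sum_{j=1}^{m} p_j \, w_j + b_i\Bigr), \quad i = 1, \dots, N   % Equation 5
```

Both renderings are assumptions inferred from the prose.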
  • A convolutional neural network may be used as the machine learning technique.
  • The learning unit 16 repeatedly runs the feature extraction unit 11 and the one-classification determination unit 12 over a plurality of learning images to obtain the weight w, the filter coefficients wj, and the offsets b and bi, thereby creating a discriminator that determines whether cells are normal or abnormal, and stores the obtained weight w, filter coefficients wj, and offsets b and bi in memory.
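As a minimal sketch of this learning step, the following uses scikit-learn's logistic regression as a stand-in for the "conventional machine learning technique" named above; the helper names and the sigmoid threshold are assumptions, and the filter-coefficient learning of Equation 5 is omitted for brevity:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_discriminator(feature_matrix, labels):
    """Learn the weight w and offset b of Equation 4 from directional
    feature vectors so that normal cells (label 0) and abnormal cells
    (label 1) are separated by y = g(w.f + b), g being the sigmoid."""
    model = LogisticRegression().fit(feature_matrix, labels)
    return model.coef_[0], float(model.intercept_[0])

def discriminate(features, w, b, th2=0.5):
    """Second-embodiment determination (steps 1303-1306): compute y
    and compare it with the threshold Th2."""
    y = 1.0 / (1.0 + np.exp(-(np.dot(w, features) + b)))
    return 1 if y >= th2 else 0  # 1 = abnormal cells, 0 = normal cells
```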
  • (ii) Feature extraction unit 11: the feature extraction unit 11 reads the filter coefficients wj and the offsets bi from memory and, for the input image to be determined output from the input unit 10, calculates the feature quantity fi in each direction using Equation 5 and the aforementioned directional filters, as shown in FIG. 11.
  • (iii) One-classification determination unit 12: the one-classification determination unit 12 reads the weight w and the offset b from memory and, using Equation 4, determines from the feature quantities fi obtained by the feature extraction unit 11 whether the cells are normal or abnormal, as shown in FIG. 11.
  • A hardware configuration example of the image processing apparatus 1 according to this embodiment is the same as in FIG. 2, except that, unlike the first embodiment, the memory 202 also stores the learning unit 16; the rest of the hardware configuration is unchanged.
  • FIG. 12 is a flowchart for explaining the operation of the learning unit 16 of the image processing apparatus 1 according to the embodiment of the present invention.
  • In the following, the learning unit 16 is described as the subject of the operations, but the description may also be read with the CPU 201 as the subject, executing each processing unit as a program.
  • Step 1201 The input unit 10 receives an image input for learning and outputs the input image to the learning unit 16.
  • Step 1202 The learning unit 16 obtains feature quantities fi of a plurality of direction components using the above-described Expression 1.
  • Step 1203 The learning unit 16 learns how the cells are deformed using Equations 4 and 5, and calculates the weight w, the filter coefficient wj, and the offsets b and bi.
  • Step 1204 The learning unit 16 stores the calculated weight w, filter coefficient wj, and offsets b and bi in the memory 90. Note that the weight w, the filter coefficient wj, and the offsets b and bi are obtained for all types (for example, all types of cancer cells) set in advance by learning.
  • FIG. 13 is a flowchart for explaining the operation of the image processing apparatus 1 according to the embodiment of the present invention.
  • In the following, each processing unit (the input unit 10, the feature extraction unit 11, and so on) is described as the subject of an operation, but the description may also be read with the CPU 201 as the subject, executing each processing unit as a program.
  • Step 1301 The input unit 10 receives an input image to be determined and outputs the input image to the feature extraction unit 11.
  • Step 1302 The feature extraction unit 11 reads the filter coefficient wj and the offset bi from the memory 90, and obtains feature quantities fi of a plurality of direction components using the above-described Expression 5.
  • Step 1303 The one-classification determination unit 12 reads the weight w and the offset b from the memory 90 and calculates the calculation result y using Equation 4.
  • Step 1304 The one-classification determination unit 12 compares the calculation result y with the threshold Th2. If y ≥ Th2, the process proceeds to step 1305; if y < Th2, it proceeds to step 1306.
  • Step 1305 The one-classification determination unit 12 sets abnormal cells (for example, 1) in the classification result res.
  • Step 1306 The one classification determination unit 12 sets normal cells (for example, 0) in the classification result res.
  • Step 1307 The multiple-classification determination unit 13 repeats steps 1302 to 1306 so that the one-classification determination is performed for all preset types; this determines, for every preset type, whether the cells are normal or abnormal. When another type is to be determined, the filter coefficients wj and the offsets bi for that type are read from memory and the corresponding feature quantities fi are obtained. When all types have been determined, the process proceeds to step 1308.
  • Step 1308 For the type determined as an abnormal cell, the drawing unit 14 draws and displays a detection frame indicating the abnormal cell on the image when the image button shown in FIG. 7 is pressed. For the type determined as a normal cell, the drawing unit 14 does not draw a detection frame on the image when the image button shown in FIG. 7 is pressed.
  • The recording unit 15 stores, in the memory 90 (corresponding to the storage device 203), the target image and the coordinate information used by the drawing unit 14 to draw the detection frame on it.
  • As described above, according to the second embodiment, a discriminator that learns the degree of cell deformation from the feature quantities of a plurality of directional components, computing the weight, filter coefficients, and offsets, determines whether cells are normal or abnormal; thus, for each single classification, normal and abnormal cells can be distinguished in one image while suppressing erroneous detection and overdetection.
  • FIG. 14 is a functional block diagram showing a configuration of a remote diagnosis support system 1400 according to a third embodiment of the present invention.
  • the remote diagnosis support system 1400 includes a server 1403 and an image acquisition device 1405.
  • The image acquisition device 1405 is, for example, a virtual slide device or a camera-equipped personal computer, and includes an imaging unit 1401 that captures image data and a display unit 1404 that displays the determination results transmitted from the server or the like 1403. Although not shown, the image acquisition device 1405 also includes a communication device that transmits image data to the server or the like 1403 and receives data transmitted from it.
  • The server or the like 1403 includes the image processing apparatus 1, which performs image processing according to the first or second embodiment of the present invention on the image data transmitted from the image acquisition device 1405, and a storage unit 1402 that stores the determination result output from the image processing apparatus 1. Although not shown, the server or the like 1403 also has a communication device that receives the image data transmitted from the image acquisition device 1405 and transmits the determination-result data to the image acquisition device 1405.
  • The image processing apparatus 1 determines, for each abnormal-cell (e.g., cancer) type, whether the cells in the image data captured by the imaging unit 1401 are normal cells or abnormal cells.
  • Abnormal-cell (e.g., cancer) likelihood determination according to the degree of progression of the abnormal cells is performed using the classification results of a plurality of preset classifiers.
  • the display unit 1404 displays the determination result transmitted from the server or the like 1403 on the display screen of the image acquisition device 1405.
  • Instead of a virtual slide device, a regenerative-medicine device having an imaging unit, an iPS cell culture device, an MRI, an ultrasonic imaging device, or the like may be used.
  • According to the third embodiment, cells in images transmitted from facilities at different locations are determined to be normal or abnormal, the determination result is transmitted back to the facility, and the result is displayed on the display unit of the image acquisition device there, thereby providing a remote diagnosis support system.
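Reduced to plain functions, the third-embodiment exchange looks roughly like this; no particular transport or web framework is implied, and all names are hypothetical:

```python
def server_side(image_data, classify, storage):
    """Server 1403: run the per-type determination of the first or
    second embodiment on the received image, store the result
    (storage unit 1402), and return it to the sender."""
    result = classify(image_data)
    storage.append(result)
    return result

def facility_side(image_data, send_to_server, display):
    """Image acquisition device 1405: transmit the captured image,
    then show the returned determination on the display unit 1404."""
    display(send_to_server(image_data))
```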
  • FIG. 15 is a functional block diagram showing the configuration of a network contract service providing system 1500 according to a fourth embodiment of the present invention.
  • The network contract service providing system 1500 includes a server or the like 1503 and an image acquisition device 1505.
  • The image acquisition device 1505 is, for example, a virtual slide device or a camera-equipped personal computer, and includes an imaging unit 1501 that captures image data, a storage unit 1504 that stores the discriminator transmitted from the server or the like 1503, and an image processing apparatus 1 that reads the transmitted discriminator and determines, by image processing according to the second embodiment of the present invention, whether cells in an image newly captured by the imaging unit 1501 are normal cells or abnormal cells.
  • Although not shown, the image acquisition device 1505 also includes a communication device that transmits image data to the server or the like 1503 and receives data transmitted from it.
  • The server or the like 1503 includes the image processing apparatus 1, which performs image processing according to the second embodiment of the present invention on the image data transmitted from the image acquisition device 1505, and a storage unit 1502 that stores the discriminator output from the image processing apparatus 1. Although not shown, the server or the like 1503 also has a communication device that receives the image data transmitted from the image acquisition device 1505 and transmits the discriminator to the image acquisition device 1505.
  • The image processing apparatus 1 performs machine learning on the cells in the image data captured by the imaging unit 1501 so that normal cells are determined to be normal and abnormal cells to be abnormal, creating a discriminator suited to the images of the contracting facility.
  • The storage unit 1504 stores the discriminator and other data transmitted from the server or the like 1503.
  • The image processing apparatus 1 in the image acquisition device 1505 reads the discriminator from the storage unit 1504, uses it to determine whether cells in an image newly captured by the imaging unit 1501 of the image acquisition device 1505 are normal cells or abnormal cells, and displays the determination result on the display screen of its output device 204.
  • Instead of a virtual slide device, a regenerative-medicine device having an imaging unit, an iPS cell culture device, an MRI, an ultrasonic imaging device, or the like may be used.
  • According to the fourth embodiment, machine learning is performed on cells in images transmitted from facilities at different locations so that normal cells are determined to be normal and abnormal cells to be abnormal, and a discriminator is created; the discriminator is transmitted to the facility, read by the image acquisition device there, and used to determine whether cells in newly captured images are normal or abnormal, thereby providing a network contract service providing system.
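The discriminator hand-off of the fourth embodiment can be sketched as a save/load of the learned parameters; storing them as a NumPy archive is an assumption made for illustration only:

```python
import numpy as np

def export_discriminator(path, w, b):
    """Server 1503: persist the learned weight and offset so the
    discriminator can be transmitted to a facility (storage unit 1502).
    The path should end in .npz."""
    np.savez(path, w=w, b=b)

def import_discriminator(path):
    """Facility side: read the transmitted discriminator (storage unit
    1504) for judging cells in newly captured images."""
    data = np.load(path)
    return data["w"], float(data["b"])
```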
  • As described above, according to the first embodiment, the image processing apparatus performs a process of calculating the feature amounts of a plurality of directional components, a process of obtaining a variance value indicating the degree of cell deformation, and a process of determining the abnormal-cell (e.g., cancer) likelihood for each classification. More specifically, the feature quantities of the plurality of directional components are obtained as in Equation 1, and from those feature quantities fi the variance var indicating cell deformation is obtained as in Equations 2 and 3.
  • the feature quantity fi may be obtained by rotating the target image in a plurality of directions.
  • The calculated variance var represents the uniformity of the cells, and from this value it is possible to classify the cells as normal or abnormal.
  • Furthermore, tissues/cells are classified using a plurality of preset classification results, and the abnormal-cell (e.g., cancer) likelihood is judged according to the degree of progression of the abnormal cells, so a determination result corresponding to the degree of progression can be presented.
  • According to the second embodiment, the image processing apparatus obtains the feature quantities of a plurality of directional components, machine-learns cell deformation using those feature quantities, and classifies normal or abnormal cells from one image with the discriminator obtained by the learning; tissues and cells are then classified using a plurality of preset classification results.
  • Because tissues/cells are classified using a plurality of preset classification results and abnormal cells (e.g., cancer) are judged according to their degree of progression, the determination result corresponding to the degree of progression of the abnormal cells can be displayed.
  • Further, machine learning is performed so that normal cells are determined to be normal and abnormal cells to be abnormal, and a discriminator is created; the discriminator is read by an image acquisition device at a facility in a different location, and cells in newly captured images are determined to be normal or abnormal.
  • In the embodiments, cell deformation is machine-learned using logistic regression, but linear regression, Poisson regression, or the like may be used instead, with the same effect.
  • The one-classification determination unit 12 performs cell classification using either the variance of a plurality of directional components or machine learning, but the determination result based on the variance and the determination result based on machine learning may also be used together, with the same effect.
  • the present invention can also be realized by software program codes that implement the functions of the embodiments.
  • a storage medium recording the program code is provided to the system or apparatus, and the computer (or CPU or MPU) of the system or apparatus reads the program code stored in the storage medium.
  • the program code itself read from the storage medium realizes the functions of the above-described embodiments, and the program code itself and the storage medium storing the program code constitute the present invention.
  • As a storage medium for supplying such program code, for example, a flexible disk, CD-ROM, DVD-ROM, hard disk, optical disk, magneto-optical disk, CD-R, magnetic tape, nonvolatile memory card, or ROM is used.
  • Further, based on the instructions of the program code, an OS (operating system) running on the computer, or the CPU of the computer, may perform part or all of the actual processing, and the functions of the above-described embodiments may be realized by that processing.
  • The program code may also be stored in storage means such as a hard disk or memory of the system or apparatus, or on a storage medium such as a CD-RW or CD-R, and the computer (or CPU or MPU) of the system or apparatus may read and execute the stored program code when it is used.
  • Finally, the control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of the product are necessarily shown. All components may in fact be connected to one another.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Hematology (AREA)
  • Molecular Biology (AREA)
  • Urology & Nephrology (AREA)
  • Optics & Photonics (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Analytical Chemistry (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Investigating Or Analysing Biological Materials (AREA)
PCT/JP2016/052426 2015-03-25 2016-01-28 Cytodiagnosis assistance device, cytodiagnosis assistance method, remote diagnosis assistance system, service provision system, and image processing method Ceased WO2016152242A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP16768145.1A EP3276573A4 (en) 2015-03-25 2016-01-28 Cytodiagnosis assistance device, cytodiagnosis assistance method, remote diagnosis assistance system, service provision system, and image processing method
CN201680014083.7A CN107430757A (zh) 2015-03-25 2016-01-28 细胞诊断支援装置、细胞诊断支援方法、远程诊断支援系统、服务提供系统及图像处理方法
US15/556,546 US10453192B2 (en) 2015-03-25 2016-01-28 Cytologic diagnosis support apparatus, cytologic diagnosis support method, remote diagnosis support system, service providing system, and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015063103A JP6324338B2 (ja) 2015-03-25 2015-03-25 Cell diagnosis support apparatus, cell diagnosis support method, remote diagnosis support system, and service providing system
JP2015-063103 2015-03-25

Publications (1)

Publication Number Publication Date
WO2016152242A1 true WO2016152242A1 (ja) 2016-09-29

Family

ID=56978080

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052426 Ceased WO2016152242A1 (ja) 2015-03-25 2016-01-28 Cytodiagnosis assistance device, cytodiagnosis assistance method, remote diagnosis assistance system, service provision system, and image processing method

Country Status (5)

Country Link
US (1) US10453192B2 (en)
EP (1) EP3276573A4 (en)
JP (1) JP6324338B2 (ja)
CN (1) CN107430757A (zh)
WO (1) WO2016152242A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210015834A 2018-06-05 2021-02-10 Sumitomo Chemical Company, Limited Diagnosis assisting system, diagnosis assisting method, and diagnosis assisting program

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6632288B2 (ja) * 2014-12-12 2020-01-22 Canon Inc. Information processing apparatus, information processing method, and program
JP2016146174A (ja) * 2015-02-06 2016-08-12 Panasonic IP Management Co., Ltd. Determination method and program
JP6670222B2 (ja) 2016-11-01 2020-03-18 Hitachi High-Tech Corp Image diagnosis support apparatus and system, and image diagnosis support method
JP6979278B2 (ja) 2017-04-07 2021-12-08 Hitachi High-Tech Corp Image diagnosis support apparatus, image diagnosis support system, and image diagnosis support method
WO2019039595A1 (ja) * 2017-08-25 2019-02-28 Tokyo Medical University Apparatus and method for estimating cervical cytodiagnosis categories
CN109685058B (zh) * 2017-10-18 2021-07-09 Hangzhou Hikvision Digital Technology Co., Ltd. Image target recognition method, apparatus, and computer device
JP7135303B2 (ja) * 2017-11-15 2022-09-13 Dai Nippon Printing Co., Ltd. Computer program, detection apparatus, imaging apparatus, server, detector, detection method, and provision method
JP2019105703A (ja) * 2017-12-12 2019-06-27 有限会社 高度技術研究所 Phase-contrast image inspection apparatus and phase-contrast image inspection method
EP3970158A1 (en) * 2019-05-16 2022-03-23 PAIGE.AI, Inc. Systems and methods for processing images to classify the processed images for digital pathology
CN110263656B (zh) * 2019-05-24 2023-09-29 Southern University of Science and Technology Cancer cell identification method, apparatus, and system
CN111047577B (zh) * 2019-12-12 2021-02-26 Taiyuan University of Technology Method and system for classifying and counting dysmorphic urinary red blood cells
JP7412279B2 (ja) * 2020-06-03 2024-01-12 Hitachi High-Tech Corp Image diagnosis method, image diagnosis support apparatus, and computer system
JP7663341B2 (ja) * 2020-11-19 2025-04-16 Secom Co., Ltd. Data processing apparatus, data processing method, data processing program, and learning model generation method
JP7412321B2 (ja) * 2020-12-08 2024-01-12 Hitachi High-Tech Corp Object classification apparatus, object classification system, and object classification method
CN115294570B (zh) * 2022-08-11 2024-05-28 武汉希诺智能医学有限公司 Deep-learning-based cell image recognition method


Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6418238B1 (en) * 1997-09-22 2002-07-09 Olympus Optical Co., Ltd. Image detection apparatus and image detection method capable of detecting roundish shape
JP5321145B2 (ja) 2009-03-04 2013-10-23 NEC Corporation Image diagnosis support apparatus, image diagnosis support method, image diagnosis support program, and storage medium therefor
JP2011095182A (ja) * 2009-10-30 2011-05-12 Sysmex Corp Cell analyzer and cell analysis method
CN102682305B (zh) * 2012-04-25 2014-07-02 深圳市迈科龙医疗设备有限公司 Automatic screening method and system for cervical liquid-based cytology
WO2014088049A1 (en) 2012-12-07 2014-06-12 Canon Kabushiki Kaisha Image generating apparatus and image generating method
JP6362062B2 (ja) * 2012-12-07 2018-07-25 Canon Inc. Image generating apparatus and image generating method
RU2016114861A (ru) * 2013-09-20 2017-10-25 Transmural Biotech, S.L. Image analysis methods for diagnosing diseases
CN103839066A (zh) * 2014-03-13 2014-06-04 Institute of Optics and Electronics, Chinese Academy of Sciences Feature extraction method based on biological vision
CN104036235B (zh) * 2014-05-27 2017-07-07 Tongji University Plant species recognition method based on leaf HOG features and a smart terminal platform

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006517663A * 2003-02-11 2006-07-27 Qinetiq Limited Image analysis
JP2009180539A * 2008-01-29 2009-08-13 Nec Corp Pathological diagnosis support apparatus, pathological diagnosis support method, and program
WO2010041423A1 * 2008-10-09 2010-04-15 NEC Corporation Pathological tissue diagnosis support system, pathological tissue diagnosis support program, and pathological tissue diagnosis support method
WO2013076927A1 * 2011-11-24 2013-05-30 Panasonic Corporation Diagnosis support apparatus and diagnosis support method
JP2015114172A * 2013-12-10 2015-06-22 Olympus Software Technology Corp. Image processing apparatus, microscope system, image processing method, and image processing program
WO2015145643A1 * 2014-03-27 2015-10-01 Konica Minolta, Inc. Image processing apparatus and image processing program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3276573A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210015834A 2018-06-05 2021-02-10 Sumitomo Chemical Company, Limited Diagnosis assisting system, diagnosis assisting method, and diagnosis assisting program
JPWO2019235335A1 (ja) 2018-06-05 2021-07-08 Sumitomo Chemical Company, Limited Diagnosis assisting system, diagnosis assisting method, and diagnosis assisting program
JP7349425B2 (ja) 2018-06-05 2023-09-22 Sumitomo Chemical Company, Limited Diagnosis assisting system, diagnosis assisting method, and diagnosis assisting program
US12026872B2 (en) 2018-06-05 2024-07-02 Sumitomo Chemical Company, Limited Diagnosis assisting system, diagnosis assisting method and diagnosis assisting program

Also Published As

Publication number Publication date
US10453192B2 (en) 2019-10-22
EP3276573A4 (en) 2018-10-17
US20180053296A1 (en) 2018-02-22
JP2016184224A (ja) 2016-10-20
JP6324338B2 (ja) 2018-05-16
CN107430757A (zh) 2017-12-01
EP3276573A1 (en) 2018-01-31

Similar Documents

Publication Publication Date Title
JP6324338B2 (ja) Cell diagnosis support apparatus, cell diagnosis support method, remote diagnosis support system, and service providing system
US10872411B2 (en) Diagnostic imaging assistance apparatus and system, and diagnostic imaging assistance method
JP6979278B2 (ja) Image diagnosis support apparatus, image diagnosis support system, and image diagnosis support method
CN110428475B (zh) 一种医学图像的分类方法、模型训练方法和服务器
US12223645B2 (en) Method and device for identifying abnormal cell in to-be-detected sample, and storage medium
EP3992851A1 (en) Image classification method, apparatus and device, storage medium, and medical electronic device
JP6235921B2 (ja) Endoscopic image diagnosis support system
EP3507743A1 (en) System and method of otoscopy image analysis to diagnose ear pathology
WO2018070285A1 (ja) Image processing apparatus and image processing method
JP6665999B2 (ja) Data processing apparatus, decision tree generation method, discrimination apparatus, and program
WO2021065937A1 (ja) Machine learning apparatus
Mmileng et al. Application of ConvNeXt with transfer learning and data augmentation for malaria parasite detection in resource-limited settings using microscopic images
JP7046745B2 (ja) Machine learning apparatus, image diagnosis support apparatus, machine learning method, and image diagnosis support method
JP7529592B2 (ja) Image diagnosis support apparatus, image diagnosis support method, remote diagnosis support system, and network contract service system
JP2006346094A (ja) 検出情報の出力方法及び医用画像処理システム
Shunmuga Priya et al. Enhanced Skin Disease Image Analysis Using Hybrid CLAHE-Median Filter and Salient K-Means Cluster
JP7412279B2 (ja) Image diagnosis method, image diagnosis support apparatus, and computer system
Umamaheswari et al. Optimizing Cervical Cancer Classification with SVM and Improved Genetic Algorithm on Pap Smear Images.
Tabassum et al. Image segmentation for neuroscience: lymphatics
CN120708701A (zh) Radiogenomics prediction model construction and image annotation method
Sucharitha et al. Diabetic Retinopathy Detection Using Deep Learning
Xia et al. GastritisMIL: An interpretable deep learning model for the comprehensive histological assessment of chronic gastritis
GB2638787A (en) Image analysis system for otoscopy images
US20210192734A1 (en) Electronic method and device for aiding the determination, in an image of a sample, of at least one element of interest from among biological elements, associated computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16768145

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2016768145

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 15556546

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE