WO2016017533A1 - Identification Device and Identification Method - Google Patents
- Publication number: WO2016017533A1 (application PCT/JP2015/071023)
- Authority: WIPO (PCT)
- Prior art keywords: feature amount, image, optical thickness, thickness distribution, unit
Classifications
- G01N15/1429 — Electro-optical investigation, e.g. flow cytometers, using an analyser characterised by its signal processing
- G01B11/06 — Measuring arrangements using optical techniques for measuring thickness, e.g. of sheet material
- G01N15/0227 — Investigating particle size or size distribution by optical means using imaging, e.g. a projected image of suspension, or holography
- G01N15/10 — Investigating individual particles
- G01N15/1433
- G06T7/0014 — Biomedical image inspection using an image reference approach
- G06V20/698 — Microscopic objects, e.g. biological cells or cellular parts; Matching; Classification
- G01N2015/016
- G01N2015/025 — Methods for single or grouped particles
- G01N2015/1014
- G01N2015/1493 — Particle size
- G06T2207/30024 — Cell structures in vitro; Tissue sections in vitro
Definitions
- One aspect of the present invention relates to an identification device and an identification method for identifying an object using an image of an optical thickness distribution of the object.
- An object can generally be identified based on its size, shape, or color. However, objects such as leukocytes and cancer cells have three-dimensional shapes, show no significant difference in size or shape, and are both colorless and transparent, so they cannot be identified by these criteria.
- Phase-contrast and differential-interference microscopes are used to visualize colorless, transparent cells, but they are not quantitative with respect to optical thickness. Moreover, although cells have a three-dimensional structure, these microscopes can obtain only two-dimensional information, because the depth of focus is smaller than the thickness of the cells depending on the objective lens used. Such objects therefore cannot be identified.
- CTCs (circulating tumor cells)
- Patent Document 1 acquires a cell image with an optical system that obtains a bright-field image, extracts feature parameters of the image (size, color information, circularity, etc.), and identifies cells based on those feature parameters.
- Pattern recognition processing using a neural network is performed when identifying the cells.
- Patent Document 1 thus identifies a target object by performing pattern recognition processing on an image of the target object acquired with an optical system that obtains a bright-field image.
- A target object that is a phase object (colorless and transparent), however, cannot be identified in this way.
- A target object that, like a cancer cell, has a three-dimensional shape and shows no large difference in size, shape, or color cannot be identified.
- An object of the present invention is to provide an apparatus and a method capable of identifying such a target object.
- An identification apparatus includes (1) a feature amount extraction unit that extracts a feature amount of an image of an optical thickness distribution of an object, (2) a storage unit that stores a learning result obtained by machine learning based on the feature amounts extracted by the feature amount extraction unit for images of optical thickness distributions of objects whose type is known (hereinafter "known objects"), and (3) an identification unit that determines the type of an object whose type is unknown (hereinafter "unknown object") based on the feature amount extracted by the feature amount extraction unit for an image of the optical thickness distribution of the unknown object. The learning result stored in the storage unit is used when extracting the feature amount of the image of the optical thickness distribution of the unknown object or when determining its type. The feature amount extraction unit extracts, as the feature amount of the image, information about the spatial change of the optical thickness at positions in the image of the optical thickness distribution.
- An identification method includes (1) a first feature amount extraction step in which a feature amount extraction unit extracts a feature amount of an image of an optical thickness distribution of an object whose type is unknown (hereinafter "unknown object"), (2) a storage unit that stores a learning result obtained by machine learning based on the feature amounts extracted by the feature amount extraction unit for images of optical thickness distributions of objects whose type is known (hereinafter "known objects"), and (3) an identification step of determining the type of the unknown object based on the feature amount extracted in the first feature amount extraction step, using the learning result stored in the storage unit. The learning result stored in the storage unit is used when extracting the feature amount of the image of the optical thickness distribution of the unknown object or when determining its type, and the feature amount extraction unit extracts, as the feature amount of the image, information about the spatial change of the optical thickness at positions in the image.
- According to one aspect of the present invention, a target object that has a three-dimensional shape, has no distinctive size or shape, and is colorless and transparent can be identified.
- FIG. 1 is a diagram illustrating a configuration of the identification device 1 according to the present embodiment.
- FIG. 2 is a flowchart for explaining the identification method of the present embodiment.
- FIG. 3 is a diagram schematically showing the structure of a cell: (a) a cross section and (b) a plan view.
- FIG. 4 is a diagram showing an example of a quantitative phase image of cancer cells: (a) the image and (b) the optical thickness profile.
- FIG. 5 is a diagram showing an example of a quantitative phase image of white blood cells: (a) the image and (b) the optical thickness profile.
- FIG. 6 is a diagram for explaining fx (x, y), fy (x, y), m (x, y), and ⁇ (x, y) in the quantitative phase image.
- FIG. 7 is a diagram illustrating an example of a histogram of the gradient direction ⁇ (x, y) obtained by weighting with the gradient strength m (x, y).
- FIG. 8 is a diagram showing an ROC curve when machine learning is performed using white blood cells to which a hemolytic agent has not been added.
- FIG. 9 is a diagram showing an ROC curve when machine learning is performed using leukocytes to which a hemolytic agent is added.
- FIG. 10 is a flowchart for explaining another example of the identification method of the present embodiment.
- FIG. 11 is a diagram schematically showing (a) the structure of white blood cells and (b) information on the inclination of the optical thickness.
- FIG. 12 is a diagram schematically showing (a) cancer cell structure and (b) optical thickness inclination information.
- FIG. 13 is an ROC curve showing the relationship between the false positive rate and the true positive rate of leukocytes when discriminating between leukocytes and cancer cells.
- FIG. 1 is a diagram showing the configuration of the identification device 1 of the present embodiment.
- the identification device 1 includes a quantitative phase image acquisition unit 11, a feature amount extraction unit 12, a learning unit 13, a storage unit 14, and an identification unit 15.
- Quantitative phase image acquisition unit 11 acquires a quantitative phase image of an object (cell).
- a quantitative phase image is an image of an optical thickness distribution of a cell.
- the optical thickness is the product of the physical length along the light traveling direction and the refractive index. Therefore, if the physical length of the cell is spatially uniform, the optical thickness distribution of the cell is equivalent to the refractive index distribution. If the refractive index of the cell is spatially uniform, the optical thickness distribution of the cell is equivalent to the physical length distribution.
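Since the optical thickness is defined above as the product of physical length and refractive index, the relation can be checked with a few lines of arithmetic. A minimal sketch; the numerical values (cell length, refractive indices) are illustrative assumptions, not taken from the patent.

```python
def optical_thickness(refractive_index, physical_length_um):
    """Optical thickness = refractive index x physical length (in um).

    Sketch of the definition given in the text; the numbers used
    below are assumed, not from the patent.
    """
    return refractive_index * physical_length_um

# A cell ~10 um long along the light path with an assumed mean
# refractive index of 1.37, versus the same path through medium (n = 1.33):
ot_cell = optical_thickness(1.37, 10.0)
ot_medium = optical_thickness(1.33, 10.0)

# A quantitative phase image measures the resulting phase delay,
# i.e. the optical path difference between cell and surroundings:
opd = ot_cell - ot_medium
print(round(opd, 2))  # 0.4 (um)
```

This also illustrates the equivalences stated above: holding the length fixed, the difference comes entirely from the refractive index, and vice versa.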
- the quantitative phase image may be a one-dimensional image, a two-dimensional image, or a three-dimensional image.
- A three-dimensional quantitative phase image, in contrast to a one-dimensional or two-dimensional one, represents the spatial distribution of the three-dimensional refractive index of the cell; that is, it represents information in which the refractive index and the physical length that together characterize the optical thickness are separated.
- the feature quantity extraction unit 12 extracts the feature quantity of the quantitative phase image of the cell acquired by the quantitative phase image acquisition unit 11.
- The feature amount extraction unit 12 represents each cell by a quantitative phase image of fixed m × n pixels, performs smoothing as necessary, and then extracts the feature amount of the image.
- The feature amount may be, for example, the maximum value of the optical thickness, or information on the magnitude of the change in optical thickness with position (the gradient of the optical thickness).
- the feature amount extraction unit 12 extracts the feature amount of the quantitative phase image acquired by the quantitative phase image acquisition unit 11 using the learning result stored in the storage unit 14 to be described later.
- the learning unit 13 performs machine learning on the quantitative phase image of a cell whose type is known (known cell) based on the feature amount extracted by the feature amount extraction unit 12.
- Machine learning may be, for example, statistical machine learning, supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, transduction, multi-task learning, or deep learning.
- In supervised learning, the data of known cells serve as training data and the data of unknown cells serve as test data; a large number of training data are given to the computer in advance so that the correct output is produced for the input test data.
- the storage unit 14 stores a result of machine learning (for example, a function obtained by machine learning) by the learning unit 13.
- The identification unit 15 uses the learning result stored in the storage unit 14 to determine the type of a cell whose type is unknown (unknown cell), based on the feature quantity extracted by the feature quantity extraction unit 12 from the quantitative phase image of that cell.
- a quantitative phase microscope is used as the quantitative phase image acquisition unit 11.
- For the feature amount extraction unit 12, the learning unit 13, the storage unit 14, and the identification unit 15, a computer including a processor and a memory is used, for example.
- The processor implements the functions of the feature amount extraction unit 12, the learning unit 13, and the identification unit 15.
- The computer implements the function of the storage unit 14 using its memory or an external storage device. The computer therefore comprises the feature amount extraction unit 12, the learning unit 13, the storage unit 14, and the identification unit 15.
- HOG (Histograms of Oriented Gradients), LBP (Local Binary Patterns), GLAC (Gradient Local Auto-Correlation), HLAC (Higher-order Local Auto-Correlation), Haar-like features, and the like are used as feature amount extraction algorithms.
- As machine learning algorithms, for example, AdaBoost (Adaptive Boosting), Mahalanobis distance, K-means, naive Bayes classifiers, decision trees, boosting, random trees, expectation maximization (EM), K-nearest neighbors, neural networks, multilayer perceptrons (MLP), support vector machines (SVM), deep learning, and the like are used.
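As one concrete illustration of the listed algorithms, AdaBoost reduces to a few dozen lines when the weak learners are single-feature threshold stumps. The following is a toy sketch of the general technique on made-up data, not the OpenCV implementation referred to elsewhere in this document.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=5):
    """Minimal AdaBoost with single-feature threshold stumps.
    X: (n_samples, n_features), y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                  # sample weights, start uniform
    model = []
    for _ in range(n_rounds):
        best = None                          # exhaustive stump search
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))  # stump weight
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        w = w * np.exp(-alpha * y * pred)    # up-weight misclassified samples
        w = w / w.sum()
        model.append((alpha, j, thr, sign))
    return model

def predict_adaboost(model, X):
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, s in model)
    return np.where(score >= 0, 1, -1)

# Toy data: the single feature cleanly separates the two classes.
X = np.array([[0.1], [0.2], [0.3], [2.1], [2.2], [2.3]])
y = np.array([-1, -1, -1, 1, 1, 1])
model = train_adaboost(X, y)
print(predict_adaboost(model, X))  # [-1 -1 -1  1  1  1]
```

In practice the features fed to such a learner would be the HOG-style feature amounts described below, and a production implementation (e.g. OpenCV's) adds cascading and early stopping.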
- FIG. 2 is a flowchart for explaining the identification method of the present embodiment.
- the identification method of the present embodiment includes a first image acquisition step S11, a first feature amount extraction step S12, an identification step S13, a second image acquisition step S21, a second feature amount extraction step S22, and a learning step S23.
- the quantitative phase image acquisition unit 11 acquires quantitative phase images of many known cells.
- the feature amount of the quantitative phase image of these known cells is extracted by the feature amount extraction unit 12.
- In the learning step S23, the learning unit 13 performs machine learning based on the feature amounts extracted in the second feature amount extraction step S22, and stores the learning result in the storage unit 14.
- When white blood cells are to be identified in the identification step S13, it is preferable that the quantitative phase image acquisition unit 11 acquire quantitative phase images of a large number of known cells, both white blood cells and cells other than white blood cells.
- When cancer cells are to be identified in the identification step S13, it is preferable that the quantitative phase image acquisition unit 11 acquire quantitative phase images of a large number of known cells, both cancer cells and cells other than cancer cells.
- The leukocytes may be collected from cancer patients or from healthy individuals.
- The leukocytes may be ones to which a hemolytic agent has been added.
- The cancer cells may be circulating tumor cells collected from blood or cultured cancer cells.
- In the first image acquisition step S11, the quantitative phase image acquisition unit 11 acquires the quantitative phase image of the unknown cell.
- In the first feature amount extraction step S12, the feature quantity extraction unit 12 extracts the feature quantity of the quantitative phase image of the unknown cell.
- In the identification step S13, the identification unit 15 determines the type of the unknown cell based on the feature amount extracted in the first feature amount extraction step S12, using the learning result stored in the storage unit 14.
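The flow just described (learn from known cells, then identify an unknown cell using the stored learning result) can be sketched end to end. In this sketch the feature (the maximum optical thickness, mentioned earlier as one candidate feature amount) and the "learner" (a midpoint threshold between the two class means) are deliberately minimal stand-ins for the real extraction and machine-learning steps, and all data are synthetic.

```python
import numpy as np

def extract_feature(image):
    """Feature extraction step: one scalar feature, the maximum
    optical thickness (one of the candidate features in the text)."""
    return float(np.max(image))

def learn(known_images, known_labels):
    """Learning step: 'machine learning' reduced to a midpoint
    threshold between the two class means; a toy stand-in."""
    feats = np.array([extract_feature(im) for im in known_images])
    labels = np.asarray(known_labels)
    thr = (feats[labels == 0].mean() + feats[labels == 1].mean()) / 2.0
    return {"threshold": thr}            # the stored 'learning result'

def identify(model, image):
    """Identification step: classify an unknown cell using the
    stored learning result."""
    return int(extract_feature(image) > model["threshold"])

# Synthetic known cells: class 1 is 'optically thicker' than class 0.
rng = np.random.default_rng(0)
thin = [rng.uniform(0.0, 1.0, (8, 8)) for _ in range(10)]
thick = [rng.uniform(0.0, 1.0, (8, 8)) + 2.0 for _ in range(10)]
model = learn(thin + thick, [0] * 10 + [1] * 10)

# An unknown 'thick' cell is assigned to class 1:
print(identify(model, rng.uniform(0.0, 1.0, (8, 8)) + 2.0))  # 1
```

A real system would replace `extract_feature` with gradient-based features such as HOG and `learn` with an algorithm from the list above.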
- FIG. 3 is a diagram schematically showing the structure of a cell.
- FIG. 3 shows an xyz rectangular coordinate system for convenience of explanation.
- the cell 30 is placed on a preparation 40 disposed in parallel to the xy plane.
- FIG. 3(a) shows a cross section of the cell 30 parallel to the xz plane.
- FIG. 3B shows a plan view of the cell 30 viewed in the direction of the optical axis parallel to the z-axis.
- the cell 30 has a structure in which a cytoplasm 32 covers a cell nucleus 31 present in the central region, and the cytoplasm 32 is covered with a cell membrane 33.
- a general cell has a structure including a cell nucleus, a cytoplasm, and a cell membrane.
- the shape and refractive index of these cell nuclei, cytoplasm, and cell membrane vary depending on the cell type such as leukocytes and cancer cells.
- the size and shape of cell nuclei change when normal cells change into cancer cells.
- Here, general cancer cells are described rather than blood cancer cells.
- The phase delay of the light depends on the position in the xy plane, according to the refractive index and shape of each of the cell nucleus, cytoplasm, and cell membrane.
- The quantitative phase image acquired by the quantitative phase image acquisition unit 11 represents this phase delay distribution, i.e., the optical thickness distribution of the cell.
- Each pixel value of the quantitative phase image corresponds to the optical thickness at the xy position corresponding to the pixel.
- The quantitative phase image thus reflects the refractive index and shape of each of the cell nucleus, cytoplasm, and cell membrane, so the cell type can be determined based on the quantitative phase image of the cell.
- FIG. 4 is a diagram showing an example of a quantitative phase image of cancer cells (HepG2).
- FIG. 5 is a diagram showing an example of a quantitative phase image of white blood cells.
- FIG. 4A and FIG. 5A show quantitative phase images.
- FIGS. 4(b) and 5(b) show the distribution of the optical thickness of the cells along the broken line shown in (a) of the corresponding figure.
- The quantitative phase images of cancer cells and leukocytes differ in the maximum value of the optical thickness and also in the gradient of the optical thickness, so these may be extracted by the feature amount extraction unit 12 as feature amounts of the quantitative phase image.
- The gradient information of the optical thickness is extracted by the feature amount extraction unit 12 as the feature amount of the quantitative phase image, and white blood cells and cancer cells are identified by performing image recognition based on the gradient information.
- The gradient information corresponds to, for example, the slope of the graph obtained when the horizontal axis represents position and the vertical axis represents optical thickness, and can be expressed as a vector on the xy plane.
- This gradient information does not represent the inclination of the cell surface; it reflects the internal structure of the cell, such as the shape of the cell nucleus and the refractive indices of the cell's constituents.
- When the feature quantity extraction unit 12 uses HOG as the feature quantity extraction algorithm to extract this gradient information as the feature quantity of the quantitative phase image of a cell, for example, the following feature quantity extraction processing is performed.
- the pixel value I (x, y) of the pixel at the position (x, y) in the quantitative phase image corresponds to the optical thickness.
- The magnitude (gradient strength) of the vector (fx(x, y), fy(x, y)) on the xy plane is represented by m(x, y), obtained by the following equation (3).
- The direction (gradient direction) of the vector (fx(x, y), fy(x, y)) on the xy plane is represented by θ(x, y), obtained by the following equation (4).
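Equations (3) and (4) can be sketched directly in numpy. The central-difference form of fx and fy below is an assumption, since equations (1) and (2) defining them are not reproduced in this text, and `np.arctan2` is used so the direction covers all four quadrants.

```python
import numpy as np

def gradient_strength_direction(I):
    """m(x, y) and theta(x, y) of eqs. (3)-(4) for an image I whose
    pixel value I(x, y) represents optical thickness.
    fx, fy are assumed to be central differences; the source text
    does not reproduce eqs. (1)-(2) that define them."""
    I = np.asarray(I, dtype=float)
    fx = np.zeros_like(I)
    fy = np.zeros_like(I)
    fx[:, 1:-1] = I[:, 2:] - I[:, :-2]   # fx(x, y) = I(x+1, y) - I(x-1, y)
    fy[1:-1, :] = I[2:, :] - I[:-2, :]   # fy(x, y) = I(x, y+1) - I(x, y-1)
    m = np.sqrt(fx ** 2 + fy ** 2)       # eq. (3): gradient strength
    theta = np.arctan2(fy, fx)           # eq. (4): gradient direction
    return m, theta

# A linear ramp in x has constant strength and direction 0 in the interior:
I = np.tile(np.arange(5.0), (5, 1))
m, theta = gradient_strength_direction(I)
print(m[2, 2], theta[2, 2])  # 2.0 0.0
```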
- FIG. 6 is a diagram for explaining fx (x, y), fy (x, y), m (x, y), and ⁇ (x, y) in the quantitative phase image.
- FIG. 6 depicts the cell region in the quantitative phase image as a substantially circular shape and explains the relationship among fx(x, y), fy(x, y), m(x, y), and θ(x, y) at a certain point in that region.
- FIG. 7 is a diagram illustrating an example of a histogram of the gradient direction ⁇ (x, y) obtained by weighting with the gradient strength m (x, y).
- the shape of this histogram varies depending on the cell type. Therefore, cancer cells and white blood cells can be identified based on the shape of this histogram.
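The magnitude-weighted direction histogram of FIG. 7 can be formed with `np.histogram`. The 9-bin layout over [0, π) used here is a common HOG convention and an assumption, since the patent text does not state the bin count.

```python
import numpy as np

def weighted_direction_histogram(m, theta, n_bins=9):
    """Histogram of gradient direction theta weighted by gradient
    strength m, as in FIG. 7. The 9 bins over [0, pi) are an assumed
    HOG-style convention, not stated in the source text."""
    theta = np.mod(np.asarray(theta, dtype=float), np.pi)  # fold opposite directions
    hist, edges = np.histogram(theta, bins=n_bins, range=(0.0, np.pi),
                               weights=np.asarray(m, dtype=float))
    return hist, edges

# Four pixels all pointing in the x direction, each with strength 1,
# contribute a total weight of 4 to the first bin:
hist, _ = weighted_direction_histogram(np.ones(4), np.zeros(4))
print(hist[0], hist.sum())  # 4.0 4.0
```

Comparing such histograms per image region is what lets the classifier separate cell types whose optical thickness gradients differ.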
- The feature quantity extraction unit 12 can extract the feature quantity of the unknown cell in the first feature quantity extraction step S12 using the result of machine learning on known cells. Because extracting features from every pixel of the quantitative phase image of an unknown cell takes time, one or more regions (positions or pixels) from which feature values are extracted can be set based on the machine learning results for known cells, significantly reducing the time required for cell determination. The set region may be any range including at least one pixel of the quantitative phase image.
- white blood cells are identified from a cell population in which cancer cells and white blood cells are mixed by extracting the above-described feature amounts.
- 240 white blood cells collected from a healthy person are used as known cells (positive cells), and 71 cultured cancer cells are used as known cells (negative cells).
- The second feature amount extraction step S22 and the learning step S23 were then performed.
- The breakdown of the 71 cultured cancer cells was 18 cells of line HCT116, 21 of DLD1, 7 of HepG2, and 25 of Panc1.
- As leukocytes, both cells with a hemolytic agent added and cells without were used.
- The quantitative phase image of each cell, originally about 150 × 150 pixels, was converted into an 8-bit black-and-white image and reduced to 24 × 24, 48 × 48, or 72 × 72 pixels; feature extraction and machine learning were performed on the reduced images.
- the algorithms used were HOG and AdaBoost included in OpenCV (Version 2.4.8). Machine learning at each stage was stopped at a misdiagnosis rate of 0.4.
- FIG. 8 is a diagram showing an ROC curve when machine learning is performed using white blood cells to which a hemolytic agent has not been added.
- “WBC1” in FIG. 8 indicates that machine learning was performed using white blood cells to which no hemolytic agent was added.
- the ROC curve of “WBC1 24 ⁇ 24” is a curve obtained when the white blood cell image size is reduced to 24 ⁇ 24 pixels.
- An ROC curve of “WBC1 48 ⁇ 48” is a curve obtained when the white blood cell image size is reduced to 48 ⁇ 48 pixels.
- FIG. 9 is a diagram showing an ROC curve when machine learning is performed using leukocytes to which a hemolytic agent is added.
- “WBC2” in FIG. 9 indicates that machine learning was performed using leukocytes to which a hemolytic agent was added.
- the ROC curve of “WBC2 24 ⁇ 24” is a curve obtained when the white blood cell image size is reduced to 24 ⁇ 24 pixels.
- the ROC curve of “WBC2 48 ⁇ 48” is a curve obtained when the white blood cell image size is reduced to 48 ⁇ 48 pixels.
- An ROC curve of “WBC2 72 ⁇ 72” is a curve obtained when the white blood cell image size is reduced to 72 ⁇ 72 pixels.
- a ROC (Receiver Operating Characteristic) curve represents the performance of identification by the identification unit 15 using the result of machine learning by the learning unit 13.
- the horizontal axis represents the false positive rate (False Positive Fraction) representing the probability that an object that is not actually a white blood cell is erroneously determined to be a white blood cell.
- the vertical axis represents the true positive rate (True Positive Fraction) that represents the probability that an object that is actually a white blood cell is correctly determined to be a white blood cell.
- AUC (Area Under the Curve) represents the area of the region below the ROC curve.
- A large AUC (that is, an AUC close to 1) means that the ROC curve passes close to the upper left, indicating high identification accuracy.
- The larger the number of pixels in the cell image, the larger the AUC.
- The AUC is also larger when machine learning is performed using leukocytes to which a hemolytic agent has been added. Similar ROC curves were obtained when the cancer cells used for machine learning were combinations of lines different from the above. Therefore, to distinguish cancer cells from leukocytes with high accuracy, it is preferable that the cell image have a large number of pixels (for example, 48 × 48 or more) and that machine learning be performed using leukocytes to which a hemolytic agent has been added.
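The ROC curves and AUC values discussed above can be computed from raw classifier scores in a few lines. This sketch uses the standard threshold-sweep construction with a trapezoidal area and is not tied to any particular classifier; the example scores are made up.

```python
import numpy as np

def roc_auc(scores, labels):
    """ROC points and AUC from classifier scores.
    labels: 1 = positive (e.g. white blood cell), 0 = negative.
    Sweeping the decision threshold over the sorted scores traces the
    (false positive rate, true positive rate) curve of FIGS. 8-9."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    lab = np.asarray(labels)[order]
    tpr = np.concatenate(([0.0], np.cumsum(lab == 1) / max((lab == 1).sum(), 1)))
    fpr = np.concatenate(([0.0], np.cumsum(lab == 0) / max((lab == 0).sum(), 1)))
    # Trapezoidal area under the (fpr, tpr) polyline:
    auc = float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))
    return fpr, tpr, auc

# A perfect classifier (all positives scored above all negatives) gives AUC = 1:
fpr, tpr, auc = roc_auc([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
print(auc)  # 1.0
```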
- FIG. 11 is a diagram schematically showing (a) the structure of a white blood cell and (b) the gradient information of its optical thickness.
- The arrow 51 indicates the direction of the optical thickness gradient in the white blood cell 50.
- The arrow 52 indicates the gradient information in the optical thickness distribution.
- FIG. 12 is a diagram schematically showing (a) the structure of a cancer cell and (b) the gradient information of its optical thickness.
- The arrow 56 indicates the direction of the optical thickness gradient in the cancer cell 55.
- The arrow 57 indicates the gradient information in the optical thickness distribution.
- FIG. 13 is an ROC curve showing the relationship between the false positive rate and the true positive rate of leukocytes when discriminating leukocytes and cancer cells using the gradient information extracted as the feature quantity of the quantitative phase image.
- HOG is used as the feature amount extraction algorithm.
- The AUC is a very high value of about 0.98, showing that cancer cells and leukocytes can be discriminated with high accuracy.
- the identification device and the identification method according to one aspect of the present invention are not limited to the above-described embodiments and configuration examples, and various modifications are possible.
- The identification device includes (1) a feature amount extraction unit that extracts a feature amount of an image of an optical thickness distribution of an object, (2) a storage unit that stores a learning result obtained by performing machine learning based on the feature amounts extracted by the feature amount extraction unit for images of the optical thickness distributions of known objects, and (3) an identification unit that determines the type of an unknown object based on the feature amount extracted by the feature amount extraction unit for the image of the optical thickness distribution of the unknown object, using the learning result stored in the storage unit. The learning result stored in the storage unit is used when extracting the feature amount of the image of the optical thickness distribution of the unknown object or when determining the type of the unknown object.
- the identification device having the above configuration further includes (4) a learning unit that performs machine learning on the image of the optical thickness distribution of the known object based on the feature amount extracted by the feature amount extraction unit, and (5) the storage unit It is preferable to store a learning result of machine learning by the learning unit.
- the identification method having the above configuration includes (3) a second feature amount extraction step in which a feature amount of an image of an optical thickness distribution of a known object is extracted by a feature amount extraction unit, and (4) extraction in the second feature amount extraction step. It is preferable to further include a learning step of performing machine learning based on the feature amount thus determined and storing the learning result in the storage unit.
- the feature amount extraction unit may set at least one region from which the feature amount is extracted in the image of the optical thickness distribution of the unknown object using the learning result stored in the storage unit. good. Specifically, in the identification device, at least one region in which the feature amount extraction unit extracts the feature amount in the optical thickness distribution image of the unknown object using the learning result stored in the storage unit. It is good also as a structure to set. In the identification method, the first feature amount extraction step sets at least one region from which the feature amount is extracted in the image of the optical thickness distribution of the unknown object using the learning result stored in the storage unit. It is good also as composition to do.
- In both the identification device and the identification method, the feature amount extraction unit may extract, as the feature amount of the image, information on the spatial change amount of the optical thickness at each position in the image of the optical thickness distribution.
- The information on the spatial change amount of the optical thickness may be the gradient strength and/or the gradient direction of a vector at each position (pixel) in the image of the optical thickness distribution.
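As a concrete illustration of this per-pixel quantity, the sketch below (synthetic data and function names of our own choosing, not the patent's code) computes gradient strength and direction for a smooth dome-shaped optical-thickness profile resembling a rounded cell:

```python
import numpy as np

def gradient_info(phase_img):
    """Per-pixel spatial change of optical thickness:
    gradient strength (magnitude) and gradient direction (radians)."""
    gy, gx = np.gradient(phase_img)     # finite-difference spatial derivatives
    strength = np.hypot(gx, gy)         # gradient strength
    direction = np.arctan2(gy, gx)      # gradient direction in (-pi, pi]
    return strength, direction

# Toy optical-thickness image: a smooth dome, like a rounded cell.
y, x = np.mgrid[-16:16, -16:16]
dome = np.exp(-(x**2 + y**2) / 100.0)
strength, direction = gradient_info(dome)
# The dome's flat top has near-zero strength; the flanks are steepest.
```

A flat-topped profile (as the text describes for leukocytes) concentrates large gradient strength at the cell edge, while a mound-like profile spreads it over the interior, which is exactly the kind of difference such features can capture.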
- The objects may include white blood cells and cancer cells. The feature amount extraction unit may also extract the feature amount of an image of the optical thickness distribution of an object to which a hemolytic agent has been added.
- An identification device according to the above aspect includes a feature amount extraction unit that extracts a feature amount of an image of the optical thickness distribution of an object, and an identification unit that determines the type of the object based on the extracted feature amount; the feature amount extraction unit extracts, as the feature amount of the image, information on the spatial change amount of the optical thickness at a position in the image of the optical thickness distribution.
- An identification method according to the above aspect includes an extraction step of extracting a feature amount of an image of the optical thickness distribution of an object, and an identification step of determining the type of the object based on the extracted feature amount; in the extraction step, information on the spatial change amount of the optical thickness at a position in the image of the optical thickness distribution is extracted as the feature amount of the image.
- According to these aspects, the type of the object can be determined with high accuracy.
- One aspect of the present invention can be used as an identification device and an identification method capable of identifying an object even when the object has a three-dimensional shape, shows no distinctive features in size or shape, and is colorless and transparent.
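The device and method summarized above amount to a small train-then-identify pipeline. The sketch below is our own illustration of that structure: a nearest-centroid model stands in for the unspecified machine-learning algorithm, the toy feature stands in for the gradient-based feature amount, and all class, function, and label names are ours, not the patent's.

```python
import numpy as np

class Identifier:
    """Illustration of the claimed units: feature extraction, storage of a
    machine-learning result, and identification of unknown objects."""

    def __init__(self, extract_features):
        self.extract = extract_features   # feature amount extraction unit
        self.stored_result = None         # storage unit (learned result)

    def learn(self, known_images, known_types):
        """Learning unit: fit a nearest-centroid model on known objects."""
        feats = np.array([self.extract(im) for im in known_images])
        types = np.asarray(known_types)
        self.stored_result = {t: feats[types == t].mean(axis=0)
                              for t in np.unique(types)}

    def identify(self, unknown_image):
        """Identification unit: classify using the stored learning result."""
        f = self.extract(unknown_image)
        return min(self.stored_result,
                   key=lambda t: np.linalg.norm(f - self.stored_result[t]))

# Toy feature: mean and standard deviation of the optical thickness image.
extract = lambda im: np.array([im.mean(), im.std()])
rng = np.random.default_rng(0)
flat_cells = [rng.normal(1.0, 0.1, (16, 16)) for _ in range(5)]   # "leukocyte"-like
thick_cells = [rng.normal(3.0, 0.5, (16, 16)) for _ in range(5)]  # "cancer"-like
ident = Identifier(extract)
ident.learn(flat_cells + thick_cells, ["wbc"] * 5 + ["cancer"] * 5)
print(ident.identify(rng.normal(3.0, 0.5, (16, 16))))  # → cancer
```

Swapping in a stronger feature (such as the gradient-orientation histograms above) and a stronger classifier preserves the same unit structure: only `extract_features` and the model stored in `stored_result` change.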
Claims (10)
- 1. An identification device comprising: a feature amount extraction unit that extracts a feature amount of an image of an optical thickness distribution of an object; a storage unit that stores a learning result obtained by performing machine learning based on the feature amount extracted by the feature amount extraction unit for an image of the optical thickness distribution of a known object whose type is known; and an identification unit that determines, using the learning result stored in the storage unit, the type of an unknown object whose type is unknown based on the feature amount extracted by the feature amount extraction unit for an image of the optical thickness distribution of the unknown object, wherein the learning result stored in the storage unit is used when extracting the feature amount of the image of the optical thickness distribution of the unknown object or when determining the type of the unknown object, and the feature amount extraction unit extracts, as the feature amount of the image, information on a spatial change amount of the optical thickness at a position in the image of the optical thickness distribution.
- 2. The identification device according to claim 1, further comprising a learning unit that performs machine learning based on the feature amount extracted by the feature amount extraction unit for the image of the optical thickness distribution of the known object, wherein the storage unit stores a learning result of the machine learning by the learning unit.
- 3. The identification device according to claim 1 or 2, wherein the feature amount extraction unit sets, using the learning result stored in the storage unit, at least one region from which the feature amount is extracted in the image of the optical thickness distribution of the unknown object.
- 4. The identification device according to any one of claims 1 to 3, wherein the information on the spatial change amount of the optical thickness is a gradient strength and/or a gradient direction of a vector at a position in the image of the optical thickness distribution.
- 5. An identification method comprising: a first feature amount extraction step of extracting, by a feature amount extraction unit, a feature amount of an image of an optical thickness distribution of an unknown object whose type is unknown; and an identification step of determining the type of the unknown object based on the feature amount extracted in the first feature amount extraction step, using a learning result stored in a storage unit that was obtained by performing machine learning based on feature amounts extracted by the feature amount extraction unit for images of the optical thickness distribution of a known object whose type is known, wherein the learning result stored in the storage unit is used when extracting the feature amount of the image of the optical thickness distribution of the unknown object or when determining the type of the unknown object, and the feature amount extraction unit extracts, as the feature amount of the image, information on a spatial change amount of the optical thickness at a position in the image of the optical thickness distribution.
- 6. The identification method according to claim 5, further comprising: a second feature amount extraction step of extracting, by the feature amount extraction unit, a feature amount of an image of the optical thickness distribution of the known object; and a learning step of performing machine learning based on the feature amount extracted in the second feature amount extraction step and storing the learning result in the storage unit.
- 7. The identification method according to claim 5 or 6, wherein the first feature amount extraction step sets, using the learning result stored in the storage unit, at least one region from which the feature amount is extracted in the image of the optical thickness distribution of the unknown object.
- 8. The identification method according to any one of claims 5 to 7, wherein the information on the spatial change amount of the optical thickness is a gradient strength and/or a gradient direction of a vector at a position in the image of the optical thickness distribution.
- 9. The identification method according to any one of claims 5 to 8, wherein the objects include white blood cells and cancer cells.
- 10. The identification method according to claim 9, wherein the feature amount extraction unit extracts a feature amount of an image of the optical thickness distribution of an object to which a hemolytic agent has been added.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016538314A JP6692049B2 (ja) | 2014-07-29 | 2015-07-23 | Identification device and identification method |
US15/329,456 US10180387B2 (en) | 2014-07-29 | 2015-07-23 | Identification device and identification method |
EP15827812.7A EP3176563B1 (en) | 2014-07-29 | 2015-07-23 | Identification device and identification method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014153651 | 2014-07-29 | ||
JP2014-153651 | 2014-07-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016017533A1 (ja) | 2016-02-04 |
Family
ID=55217434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/071023 WO2016017533A1 (ja) | Identification device and identification method | 2014-07-29 | 2015-07-23 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10180387B2 (ja) |
EP (1) | EP3176563B1 (ja) |
JP (1) | JP6692049B2 (ja) |
WO (1) | WO2016017533A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018207524A1 (ja) * | 2017-05-07 | 2018-11-15 | Osaka University | Identification method, classification analysis method, identification device, classification analysis device, and storage medium |
EP3454042A1 (en) | 2017-09-06 | 2019-03-13 | Hamamatsu Photonics K.K. | Cell observation system and cell observation method |
WO2019097587A1 (ja) * | 2017-11-14 | 2019-05-23 | Nikon Corporation | Quantitative phase image generation method, quantitative phase image generation device, and program |
JP2019103412A (ja) * | 2017-12-11 | 2019-06-27 | 憲隆 福永 | Embryo selection system |
CN110178012A (zh) * | 2016-12-16 | 2019-08-27 | Osaka University | Classification analysis method, classification analysis device, and recording medium for classification analysis |
JP2019141090A (ja) * | 2019-05-07 | 2019-08-29 | 憲隆 福永 | Embryo selection system |
US20200193140A1 (en) * | 2017-08-24 | 2020-06-18 | Nano Global | Detection of Biological Cells or Biological Substances |
CN111784669A (zh) * | 2020-06-30 | 2020-10-16 | Changsha University of Science and Technology | Multi-lesion detection method for capsule endoscopy images |
WO2022024564A1 (ja) | 2020-07-30 | 2022-02-03 | Hamamatsu Photonics K.K. | Discrimination device, discrimination method, discrimination program, and recording medium |
EP3965070A1 (en) | 2020-08-28 | 2022-03-09 | Hamamatsu Photonics K.K. | Learning model generation method, identification method, learning model generation system, identification system, learning model generation program, identification program, and recording medium |
WO2022195754A1 (ja) * | 2021-03-17 | 2022-09-22 | Evident Corporation | Data processing method, data processing device, three-dimensional observation device, learning method, learning device, and recording medium |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3410416B1 (en) * | 2016-01-28 | 2021-08-04 | Ricoh Company, Ltd. | Image processing device, imaging device, mobile entity apparatus control system, image processing method, and program |
CN110392732B (zh) * | 2017-03-02 | 2023-07-28 | Shimadzu Corporation | Cell analysis method and cell analysis device |
EP3714459A4 (en) * | 2017-11-20 | 2021-12-22 | Nano Global Corp. | DATA COLLECTION AND ANALYSIS BASED ON THE DETECTION OF BIOLOGICAL CELLS OR BIOLOGICAL SUBSTANCES |
CN111105416B (zh) * | 2019-12-31 | 2022-09-09 | Chongqing Innovation Center of Beijing Institute of Technology | Automatic grading method and system for the degree of bone marrow cell proliferation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009210542A (ja) * | 2008-03-06 | 2009-09-17 | Hamamatsu Photonics Kk | Observation device |
JP2010203949A (ja) * | 2009-03-04 | 2010-09-16 | Nec Corp | Image diagnosis support device, image diagnosis support method, image diagnosis support program, and storage medium therefor |
JP2014039535A (ja) * | 2012-07-24 | 2014-03-06 | Univ Of Electro-Communications | Cell identification device, cell identification method, program for the cell identification method, and recording medium storing the program |
JP2014508922A (ja) * | 2011-01-25 | 2014-04-10 | Massachusetts Institute of Technology | Single-shot full-field reflection phase microscopy |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8131053B2 (en) * | 1999-01-25 | 2012-03-06 | Amnis Corporation | Detection of circulating tumor cells using imaging flow cytometry |
US20060258018A1 (en) * | 2003-09-23 | 2006-11-16 | Curl Claire L | Method and apparatus for determining the area or confluency of a sample |
US9322834B2 (en) * | 2007-05-30 | 2016-04-26 | Sysmex Corporation | Sample analyzer, blood analyzer and displaying method |
EP2290350B1 (en) | 2008-06-04 | 2018-11-21 | Hitachi High-Technologies Corporation | Particle image analysis method and device |
WO2010056859A1 (en) * | 2008-11-14 | 2010-05-20 | Beckman Coulter, Inc. | Monolithic optical flow cells and method of manufacture |
US8599383B2 (en) * | 2009-05-06 | 2013-12-03 | The Regents Of The University Of California | Optical cytometry |
WO2011132586A1 (ja) * | 2010-04-23 | 2011-10-27 | Hamamatsu Photonics K.K. | Cell observation device and cell observation method |
US11105686B2 (en) * | 2010-05-10 | 2021-08-31 | University of Pittshurgh-Of the Commonwealth System of Higher Education | Spatial-domain low-coherence quantitative phase microscopy |
EP2637015A4 (en) * | 2010-11-01 | 2016-09-07 | Kanagawa Kagaku Gijutsu Akad | CELL ANALYZER |
US9522396B2 (en) * | 2010-12-29 | 2016-12-20 | S.D. Sight Diagnostics Ltd. | Apparatus and method for automatic detection of pathogens |
CN103842769B (zh) * | 2011-08-02 | 2017-12-15 | The Regents of the University of California | Rapid, massively parallel single-cell drug response measurements via live-cell interferometry |
US20140193892A1 (en) | 2012-07-25 | 2014-07-10 | Theranos, Inc. | Image analysis and measurement of biological samples |
JP5464244B2 (ja) * | 2012-08-24 | 2014-04-09 | Fuji Xerox Co., Ltd. | Image processing device, program, and image processing system |
EP2972214B1 (en) * | 2013-03-15 | 2018-10-31 | Iris International, Inc. | Sheath fluid systems and methods for particle analysis in blood samples |
EP2997363A4 (en) * | 2013-05-13 | 2016-11-30 | Chiranjit Deka | APPARATUS AND METHODS FOR CELL ANALYSIS |
WO2015065909A1 (en) * | 2013-10-30 | 2015-05-07 | The General Hospital Corporation | System and method for inertial focusing cytometer with integrated optics for particle characterization |
WO2015085216A1 (en) * | 2013-12-06 | 2015-06-11 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Spatial-domain low-coherence quantitative phase microscopy |
JP6116502B2 (ja) * | 2014-02-28 | 2017-04-19 | Sysmex Corporation | Sample analyzer and sample analysis method |
US10036698B2 (en) * | 2015-06-19 | 2018-07-31 | Captl Llc | Time-sequential cytometry |
Non-Patent Citations (1)
Title |
---|
See also references of EP3176563A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20170212033A1 (en) | 2017-07-27 |
EP3176563A1 (en) | 2017-06-07 |
US10180387B2 (en) | 2019-01-15 |
EP3176563A4 (en) | 2018-04-25 |
JPWO2016017533A1 (ja) | 2017-05-25 |
EP3176563B1 (en) | 2021-06-30 |
JP6692049B2 (ja) | 2020-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6692049B2 (ja) | Identification device and identification method | |
Veta et al. | Assessment of algorithms for mitosis detection in breast cancer histopathology images | |
JP6710135B2 (ja) | Automatic analysis method and system for cell images | |
CN110889312B (zh) | Liveness detection method and apparatus, electronic device, and computer-readable storage medium | |
Sanchez-Morillo et al. | Classification of breast cancer histopathological images using KAZE features | |
Vargas et al. | Particle quality assessment and sorting for automatic and semiautomatic particle-picking techniques | |
Sadek et al. | Automatic discrimination of color retinal images using the bag of words approach | |
Aswathy et al. | An SVM approach towards breast cancer classification from H&E-stained histopathology images based on integrated features | |
Nanni et al. | A comparison of methods for extracting information from the co-occurrence matrix for subcellular classification | |
Cicconet et al. | Mirror symmetry histograms for capturing geometric properties in images | |
Elsalamony | Detection of anaemia disease in human red blood cells using cell signature, neural networks and SVM | |
US20240054639A1 (en) | Quantification of conditions on biomedical images across staining modalities using a multi-task deep learning framework | |
Kong et al. | Texture based image recognition in microscopy images of diffuse gliomas with multi-class gentle boosting mechanism | |
Iqbal et al. | A heteromorphous deep CNN framework for medical image segmentation using local binary pattern | |
Peikari et al. | Clustering analysis for semi-supervised learning improves classification performance of digital pathology | |
Majid et al. | Enhanced transfer learning strategies for effective kidney tumor classification with CT imaging | |
Rathore et al. | A novel approach for ensemble clustering of colon biopsy images | |
Di Ruberto et al. | On different colour spaces for medical colour image classification | |
Hirata et al. | Plankton image classification based on multiple segmentations | |
KR20230063147A (ko) | Efficient lightweight CNN and ensemble machine learning classification method and system for prostate tissue using multi-stage feature analysis | |
Yancey | Deep Feature Fusion for Mitosis Counting | |
Boonsiri et al. | 3D gray level co-occurrence matrix based classification of favor benign and borderline types in follicular neoplasm images | |
Naik et al. | Hybrid Feature Set based Mitotic Detection in Breast Histopathology Images | |
da Silva | Combining machine learning and deep learning approaches to detect cervical cancer in cytology images | |
Foran et al. | A cagrid-enabled, learning based image segmentation method for histopathology specimens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15827812; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2016538314; Country of ref document: JP; Kind code of ref document: A |
| | REEP | Request for entry into the european phase | Ref document number: 2015827812; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 15329456; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |