EP3176563B1 - Identification apparatus and identification method (Identifizierungsvorrichtung und Identifizierungsverfahren) - Google Patents
Identification apparatus and identification method
- Publication number
- EP3176563B1 (application EP15827812.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- feature quantity
- image
- optical thickness
- thickness distribution
- cell
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/02—Investigating particle size or size distribution
- G01N15/0205—Investigating particle size or size distribution by optical means, e.g. by light scattering, diffraction, holography or imaging
- G01N15/0227—Investigating particle size or size distribution by optical means, e.g. by light scattering, diffraction, holography or imaging using imaging, e.g. a projected image of suspension; using holography
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Electro-optical investigation, e.g. flow cytometers
- G01N15/1429—Electro-optical investigation, e.g. flow cytometers using an analyser being characterised by its signal processing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/06—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness ; e.g. of sheet material
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Electro-optical investigation, e.g. flow cytometers
-
- G01N15/1433—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
-
- G01N2015/016—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/02—Investigating particle size or size distribution
- G01N15/0205—Investigating particle size or size distribution by optical means, e.g. by light scattering, diffraction, holography or imaging
- G01N2015/025—Methods for single or grouped particles
-
- G01N2015/1014—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Electro-optical investigation, e.g. flow cytometers
- G01N2015/1493—Particle size
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- An aspect of the present invention relates to an identification apparatus and an identification method for identifying an object using an image of an optical thickness distribution of the object.
- An object can be identified based on its size, shape, or color.
- However, when objects have three-dimensional shapes, have sizes and shapes not significantly different from each other, and are colorless and transparent, they cannot be identified in an image obtained with a bright field microscope.
- Although a phase contrast microscope and a differential interference microscope are used for visualizing colorless and transparent cells, these microscopes lack quantitativity with respect to optical thickness.
- Moreover, depending on the objective lens used, these microscopes have a depth of focus less than the thickness of a cell; as a result, only two-dimensional information can be obtained despite the fact that the cell has a three-dimensional structure, and the object cannot be identified.
- Cells which are released from an original tumor tissue or a metastatic tumor tissue and infiltrate into the blood are called circulating tumor cells (CTCs).
- The CTCs are present in trace amounts in the peripheral blood of solid cancer patients, are presumed to be associated with metastasis, and have been actively studied in recent years. On the other hand, since almost all nucleated cells in the peripheral blood are white blood cells, it is important to be able to distinguish white blood cells from cancer cells.
- An image of a cell is acquired by an optical system for obtaining a bright field image, feature parameters (such as size, color information, and circularity) of the image are extracted, and the cell is identified based on the feature parameters.
- A pattern recognition process is performed using a neural network in that invention.
- Patent Document 1: Japanese Patent Publication No. 5470625. US2013/0130307 relates to a cell observation device and a cell observation method.
- US2010/284016 relates to interferometric systems, materials, and techniques that can be used to examine one or more cells.
- US2014/193892 relates to methods, devices, systems, and apparatuses for optical and image analysis or measurements of biological and other samples.
- WO2014/030378 relates to an image processing device, a program, an image processing method, a computer-readable medium, and an image processing system.
- US 2015/124082 relates to an image processing device for cells using imaging techniques other than quantitative phase microscopy (optical microscopy). Histograms of oriented gradients (HOG) are made based on extracted features.
- In Patent Document 1, a pattern recognition process is performed on an image of an object acquired by an optical system for obtaining a bright field image, and thereby the object is identified; it is therefore difficult to identify objects (phase objects) which have three-dimensional shapes and have no significant difference between them in size, shape, or color, as in the case of white blood cells and cancer cells.
- An aspect of the present invention has been made in order to solve the above problem, and an object thereof is to provide an apparatus and a method capable of identifying an object even when the object has a three-dimensional shape, has a size and a shape with no distinctive feature, and is colorless and transparent.
- According to an aspect of the present invention, even when an object has a three-dimensional shape, has a size and a shape with no distinctive feature, and is colorless and transparent, it is possible to identify the object.
- FIG. 1 is a diagram illustrating a configuration of an identification apparatus 1 of an embodiment.
- The identification apparatus 1 includes a quantitative phase image acquisition unit 11, a feature quantity extraction unit 12, a learning unit 13, a storage unit 14, and an identification unit 15.
- The quantitative phase image acquisition unit 11 acquires a quantitative phase image of an object (cell).
- The quantitative phase image is an image of the optical thickness distribution of the cell.
- The optical thickness is the product of the physical length along the travelling direction of light and the refractive index. Accordingly, if the physical length of the cell is spatially uniform, the optical thickness distribution of the cell is equivalent to a refractive index distribution. If the refractive index of the cell is spatially uniform, the optical thickness distribution of the cell is equivalent to a physical length distribution.
- The quantitative phase image may be a one-dimensional image, a two-dimensional image, or a three-dimensional image.
- The three-dimensional quantitative phase image is a special case of the one-dimensional or two-dimensional quantitative phase image, and indicates the three-dimensional spatial distribution of the refractive index of the cell. In other words, it indicates information in which the refractive index and the physical length, which characterize the optical thickness, are separated.
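The definition above (optical thickness = physical length × refractive index) can be sketched numerically. This is a minimal illustration, not from the patent: the wavelength, refractive indices, and cell thickness below are hypothetical values, and the conversion between optical path length and phase delay, OPL = φλ/(2π), is the standard quantitative phase relation rather than anything the source specifies.

```python
import numpy as np

# Sketch (hypothetical values): optical thickness as the product of
# refractive index and physical length, and its relation to the phase
# delay that a quantitative phase microscope measures.
wavelength = 0.633                   # assumed wavelength in micrometers
n_cell, n_medium = 1.38, 1.33        # assumed refractive indices
physical_length = 5.0                # assumed cell thickness in micrometers

# Optical thickness difference relative to the surrounding medium (~0.25 um).
optical_thickness = (n_cell - n_medium) * physical_length

# Corresponding phase delay, and the inverse conversion a quantitative
# phase image applies: OPL = phi * lambda / (2 * pi).
phase_delay = 2 * np.pi * optical_thickness / wavelength
recovered = phase_delay * wavelength / (2 * np.pi)
print(optical_thickness, recovered)
```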
- The feature quantity extraction unit 12 extracts a feature quantity of the quantitative phase image of the cell acquired by the quantitative phase image acquisition unit 11.
- The feature quantity extraction unit 12 expresses an individual cell by a quantitative phase image of fixed m × n pixels, performs a smoothing process if necessary, and then extracts the feature quantity of the image.
- The feature quantity may be, for example, the maximum value of the optical thickness, or information regarding the magnitude of the change in the optical thickness with respect to position (the inclination of the optical thickness).
- The feature quantity extraction unit 12 extracts the feature quantity of the quantitative phase image acquired by the quantitative phase image acquisition unit 11 using a learning result stored in the storage unit 14 described later.
- The learning unit 13 performs machine learning for a quantitative phase image of a cell of which the type is known (a known cell) based on the feature quantity extracted by the feature quantity extraction unit 12.
- The machine learning is, for example, statistical machine learning, such as supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, transduction, multi-task learning, deep learning, or the like.
- Data of known cells are employed as training data, and data of unknown cells are employed as test data. A plurality of training data items is given to a computer in advance, and a function which produces proper output in response to input test data is generated.
- The storage unit 14 stores the result of the machine learning (for example, the function obtained by the machine learning) by the learning unit 13.
- The identification unit 15 determines the type of an unknown cell using the learning result stored in the storage unit 14.
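The train-then-identify flow described above can be sketched minimally. This is an illustrative stand-in, not the patent's algorithm (which uses, for example, HOG features with AdaBoost): the feature vectors are made up, and a nearest-centroid rule plays the role of the generated function stored in the storage unit.

```python
import numpy as np

# Minimal sketch of the flow above: training data from known cells yields
# a "function" (here, one centroid per class) that is stored and later
# applied to test data from unknown cells. Feature values are hypothetical.
train_X = np.array([[0.2, 0.1], [0.3, 0.2],   # known cells of type 0
                    [0.8, 0.9], [0.7, 0.8]])  # known cells of type 1
train_y = np.array([0, 0, 1, 1])

# "Learning": the stored result is the centroid of each class.
centroids = {c: train_X[train_y == c].mean(axis=0) for c in (0, 1)}

def identify(x):
    """The generated function: map an unknown feature vector to a type."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(identify(np.array([0.25, 0.15])))  # closest to the type-0 centroid
```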
- As the quantitative phase image acquisition unit 11, for example, a quantitative phase microscope is used.
- As the feature quantity extraction unit 12, the learning unit 13, the storage unit 14, and the identification unit 15, for example, a computer including a processor and a memory is used.
- The computer executes the functions of the feature quantity extraction unit 12, the learning unit 13, and the identification unit 15 by the processor.
- The computer executes the function of the storage unit 14 by the memory or an external storage device. Accordingly, the computer includes the feature quantity extraction unit 12, the learning unit 13, the storage unit 14, and the identification unit 15.
- As the feature quantity extraction algorithm, for example, HOG (histograms of oriented gradients), LBP (local binary pattern), GLAC (gradient local auto-correlation), HLAC (higher-order local auto-correlation), Haar-like features, and the like are used.
- As the machine learning algorithm, for example, AdaBoost (adaptive boosting), Mahalanobis K-means, naive Bayes classifier, decision tree, boosting, random trees, expectation maximization, K-nearest neighbors, neural network, multi-layer perceptron (MLP), support vector machine, deep learning, and the like are used.
- FIG. 2 is a flowchart for explaining the identification method of the embodiment.
- The identification method of the embodiment includes a first image acquisition step S11, a first feature quantity extraction step S12, an identification step S13, a second image acquisition step S21, a second feature quantity extraction step S22, and a learning step S23.
- In the second image acquisition step S21, quantitative phase images of many known cells are acquired by the quantitative phase image acquisition unit 11.
- In the second feature quantity extraction step S22, feature quantities of the quantitative phase images of these known cells are extracted by the feature quantity extraction unit 12.
- In the learning step S23, machine learning is performed in the learning unit 13 based on the feature quantities extracted in the second feature quantity extraction step S22, and the learning result thereof is stored in the storage unit 14.
- When white blood cells are to be identified in the identification step S13, as for the many known cells, it is preferable to acquire quantitative phase images of white blood cells and cells other than white blood cells by the quantitative phase image acquisition unit 11.
- When cancer cells are to be identified in the identification step S13, as for the many known cells, it is preferable to acquire quantitative phase images of cancer cells and cells other than cancer cells by the quantitative phase image acquisition unit 11.
- The white blood cells may be those collected from a cancer patient, or may be those collected from a healthy person.
- The white blood cells may be those to which a hemolytic agent has been added.
- The cancer cells may be collected circulating tumor cells, or may be cultured cancer cells.
- In the first image acquisition step S11, a quantitative phase image of an unknown cell is acquired by the quantitative phase image acquisition unit 11.
- In the first feature quantity extraction step S12, a feature quantity of the quantitative phase image of the unknown cell is extracted by the feature quantity extraction unit 12.
- In the identification step S13, the type of the unknown cell is determined by the identification unit 15 based on the feature quantity extracted in the first feature quantity extraction step S12, using the learning result stored in the storage unit 14.
- FIG. 3 includes views schematically illustrating the structure of the cell.
- An xyz orthogonal coordinate system is illustrated for convenience of description.
- The cell 30 is placed on a preparation 40 disposed in parallel to the xy plane.
- (a) in FIG. 3 illustrates a sectional view of the cell 30 in parallel to the xz plane.
- (b) in FIG. 3 illustrates a plan view of the cell 30 viewed in the direction of an optical axis parallel to the z-axis.
- The cell 30 has a structure in which a cell nucleus 31 present in a central region is covered with cytoplasm 32, and the cytoplasm 32 is enveloped by a cell membrane 33.
- Common cells have a structure including a cell nucleus, cytoplasm, and a cell membrane.
- The shapes and the refractive indices of the cell nucleus, the cytoplasm, and the cell membrane vary depending on the type of cell, such as a white blood cell or a cancer cell.
- In general, the size and shape of the cell nucleus change when a normal cell turns into a cancer cell.
- Since the definite shape of cancer cells in the blood is unknown, the following description will be given not for cancer cells in the blood but for common cancer cells.
- The phase delay of the light varies depending on the position on the xy plane in accordance with the refractive index and the shape of each of the cell nucleus, the cytoplasm, and the cell membrane.
- A quantitative phase image acquired by the quantitative phase image acquisition unit 11 indicates this phase delay distribution, and thus indicates the optical thickness distribution of the cell.
- Each of the pixel values of the quantitative phase image corresponds to the optical thickness at the xy position corresponding to the pixel.
- The quantitative phase image is in accordance with the refractive index and the shape of each of the cell nucleus, the cytoplasm, and the cell membrane. Therefore, the type of the cell can be determined based on the quantitative phase image of the cell.
- FIG. 4 includes diagrams illustrating an example of a quantitative phase image of cancer cells (HepG2).
- FIG. 5 includes diagrams illustrating an example of a quantitative phase image of white blood cells.
- (a) in FIG. 4 and (a) in FIG. 5 each illustrate a quantitative phase image.
- (b) in FIG. 4 and (b) in FIG. 5 each illustrate an optical thickness distribution of the cell along the dashed line illustrated in (a) of the corresponding figure.
- Inclination information of the optical thickness may be extracted as the feature quantity of the quantitative phase images by the feature quantity extraction unit 12.
- The inclination information is information regarding the inclination of a graph obtained when the horizontal axis represents position and the vertical axis represents optical thickness, as illustrated in (b) in FIG. 4 and (b) in FIG. 5, a vector in the xy plane, or the like.
- The inclination information does not indicate the inclination of the surface of the cell, but reflects the structure in the cell, such as the shape and the refractive index of the cell nucleus and the like constituting the cell.
- When inclination information is extracted by the feature quantity extraction unit 12 as a feature quantity of a quantitative phase image of a cell using HOG as the feature quantity extraction algorithm, for example, the following feature quantity extraction process is performed.
- A pixel value I(x, y) of the pixel located at position (x, y) in the quantitative phase image corresponds to the optical thickness.
- A difference fx(x, y) between the pixel values I(x + 1, y) and I(x − 1, y) of the two vicinal pixels in the x-direction is obtained by the following formula (1): fx(x, y) = I(x + 1, y) − I(x − 1, y).
- A difference fy(x, y) between the pixel values I(x, y + 1) and I(x, y − 1) of the two vicinal pixels in the y-direction is obtained by the following formula (2): fy(x, y) = I(x, y + 1) − I(x, y − 1).
- The magnitude (gradient magnitude) of the vector (fx(x, y), fy(x, y)) in the xy plane is represented by m(x, y), obtained by the following formula (3): m(x, y) = √(fx(x, y)² + fy(x, y)²).
- The inclination (gradient direction) of the vector (fx(x, y), fy(x, y)) in the xy plane is represented by θ(x, y), obtained by the following formula (4): θ(x, y) = tan⁻¹(fy(x, y) / fx(x, y)).
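The gradient computation described by formulas (1) to (4) can be sketched on a toy image in Python (numpy). One implementation choice not stated in the source: the sketch uses arctan2 rather than a bare tan⁻¹(fy/fx) so that pixels with fx = 0 are handled.

```python
import numpy as np

# Sketch of formulas (1)-(4): central differences, gradient magnitude,
# and gradient direction of a hypothetical quantitative phase image I,
# where each pixel value stands for an optical thickness.
def gradients(I):
    fx = np.zeros_like(I, dtype=float)
    fy = np.zeros_like(I, dtype=float)
    fx[:, 1:-1] = I[:, 2:] - I[:, :-2]   # (1) fx(x,y) = I(x+1,y) - I(x-1,y)
    fy[1:-1, :] = I[2:, :] - I[:-2, :]   # (2) fy(x,y) = I(x,y+1) - I(x,y-1)
    m = np.sqrt(fx**2 + fy**2)           # (3) gradient magnitude
    theta = np.arctan2(fy, fx)           # (4) gradient direction
    return fx, fy, m, theta

I = np.array([[0, 1, 2],
              [0, 1, 2],
              [0, 1, 2]], dtype=float)   # thickness rising along x
fx, fy, m, theta = gradients(I)
print(fx[1, 1], fy[1, 1], m[1, 1], theta[1, 1])
```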
- FIG. 6 is a drawing for explaining fx(x, y), fy(x, y), m(x, y), and θ(x, y) in the quantitative phase image.
- The region of a cell in the quantitative phase image is represented as a substantially circular shape, and the relationship among fx(x, y), fy(x, y), m(x, y), and θ(x, y) at a certain point in the region is explained.
- The gradient magnitude m(x, y) and the gradient direction θ(x, y) are obtained for all pixels in the quantitative phase image, and a histogram of the gradient direction θ(x, y) is obtained. At this time, weighting is performed with the gradient magnitude m(x, y).
- FIG. 7 is a drawing illustrating an example of the histogram of the gradient direction θ(x, y) obtained by weighting with the gradient magnitude m(x, y).
- The shape of the histogram varies depending on the type of cell. Accordingly, it is possible to discriminate cancer cells and white blood cells based on the shape of the histogram.
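The magnitude-weighted direction histogram described above can be sketched as follows. The directions and magnitudes are random stand-ins, and the 9-bin layout is an assumption (a common HOG choice), not something the source states.

```python
import numpy as np

# Sketch of the weighted histogram above: each pixel's gradient direction
# theta votes into a direction bin with weight m (its gradient magnitude).
rng = np.random.default_rng(0)
theta = rng.uniform(-np.pi, np.pi, size=1000)   # hypothetical directions
m = rng.uniform(0.0, 1.0, size=1000)            # hypothetical magnitudes

hist, edges = np.histogram(theta, bins=9, range=(-np.pi, np.pi), weights=m)
print(hist.shape, float(hist.sum()))  # total equals the sum of all weights
```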
- A feature quantity of an unknown cell can be extracted by the feature quantity extraction unit 12 in the first feature quantity extraction step S12. Since it takes time to extract a feature quantity for all pixels in a quantitative phase image of the unknown cell, one or more regions among all pixels in the quantitative phase image are set as regions (positions or pixels) from which a feature quantity is extracted, based on the result of machine learning with the known cells; accordingly, it is possible to substantially reduce the time taken to identify the cell.
- The range of each set region may be one including at least one pixel constituting the quantitative phase image.
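The speed-up described above can be sketched as follows: instead of extracting features for every pixel, extraction is restricted to regions selected beforehand. The region coordinates below are hypothetical stand-ins for what a learning result might supply.

```python
import numpy as np

# Sketch: restrict feature extraction to a few regions of the image
# rather than processing every pixel. The window coordinates are
# hypothetical, not derived from an actual learning result.
image = np.random.default_rng(2).random((48, 48))   # stand-in phase image

# Hypothetical learned regions: (row, col, height, width) windows.
learned_regions = [(8, 8, 16, 16), (24, 24, 12, 12)]

patches = [image[r:r + h, c:c + w] for (r, c, h, w) in learned_regions]
total = sum(p.size for p in patches)
print(total, image.size)   # far fewer pixels to process than the full image
```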
- White blood cells were identified, from a cell population including cancer cells and white blood cells mixed with each other, by extracting the feature quantity as described above.
- 240 white blood cells collected from a healthy person were used as known cells (positive cells), 71 cultured cancer cells were used as known cells (negative cells), and the second image acquisition step S21, the second feature quantity extraction step S22, and the learning step S23 were performed.
- The details of the 71 cultured cancer cells are as follows: the numbers of cells of cell line HCT116, cell line DLD1, cell line HepG2, and cell line Panel are 18, 21, 7, and 25, respectively.
- As the white blood cells, those to which a hemolytic agent had been added and those to which no hemolytic agent had been added were used.
- The quantitative phase image of each cell, which was originally about 150 × 150 pixels in size, was converted into an 8-bit black-and-white image and reduced to an image of 24 × 24 pixels, 48 × 48 pixels, or 72 × 72 pixels in size, and feature quantity extraction and machine learning were performed using the reduced images.
- HOG and AdaBoost included in OpenCV were used as the algorithms.
- The machine learning at each stage was stopped at a misdiagnosis rate of 0.4.
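The preprocessing described above (8-bit conversion and size reduction before HOG and AdaBoost) can be sketched in pure numpy. Nearest-neighbour sampling is an assumption here; the source does not state which resampling method was actually used.

```python
import numpy as np

# Sketch of the preprocessing above: convert a hypothetical quantitative
# phase image to an 8-bit image, then reduce it to 48 x 48 pixels.
def to_8bit(img):
    lo, hi = img.min(), img.max()
    return np.uint8(np.round(255 * (img - lo) / (hi - lo)))

def reduce_to(img, size):
    # Nearest-neighbour reduction (an assumption, not from the source).
    idx = np.linspace(0, img.shape[0] - 1, size).round().astype(int)
    jdx = np.linspace(0, img.shape[1] - 1, size).round().astype(int)
    return img[np.ix_(idx, jdx)]

phase = np.random.default_rng(1).random((150, 150))  # stand-in image
small = reduce_to(to_8bit(phase), 48)
print(small.shape, small.dtype)
```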
- FIG. 8 is a view illustrating ROC curves obtained when the machine learning is performed using white blood cells to which no hemolytic agent is added.
- "WBC1” means that the machine learning was performed using white blood cells to which no hemolytic agent had been added.
- The ROC curve of "WBC1 24 × 24" is the curve obtained when the size of the white blood cell image was reduced to 24 × 24 pixels.
- The ROC curve of "WBC1 48 × 48" is the curve obtained when the size of the white blood cell image was reduced to 48 × 48 pixels.
- FIG. 9 is a view illustrating ROC curves obtained when the machine learning is performed using white blood cells to which a hemolytic agent is added.
- "WBC2" means that the machine learning was performed using white blood cells to which a hemolytic agent had been added.
- The ROC curve of "WBC2 24 × 24" is the curve obtained when the size of the white blood cell image was reduced to 24 × 24 pixels.
- The ROC curve of "WBC2 48 × 48" is the curve obtained when the size of the white blood cell image was reduced to 48 × 48 pixels.
- The ROC curve of "WBC2 72 × 72" is the curve obtained when the size of the white blood cell image was reduced to 72 × 72 pixels.
- The ROC (receiver operating characteristic) curves indicate the performance of identification by the identification unit 15 using the result of the machine learning by the learning unit 13.
- The horizontal axis represents the false positive fraction, which indicates the probability that an object which is not actually a white blood cell is erroneously determined to be a white blood cell.
- The vertical axis represents the true positive fraction, which indicates the probability that an object which is actually a white blood cell is properly determined to be a white blood cell.
- The AUC (area under the curve) is the area of the region under the ROC curve. A large AUC (in other words, an AUC close to the value 1) means that the ROC curve lies close to the upper left corner, which indicates that the accuracy of identification is high.
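The ROC curve and AUC described above can be computed directly from identification scores. The scores and labels below are made-up stand-ins, and the area is computed with the trapezoid rule; this is a generic sketch, not the evaluation code used for FIGS. 8 and 9.

```python
import numpy as np

# Sketch of an ROC/AUC computation. Label 1 means "actually a white
# blood cell" and a higher score means "more likely a white blood cell".
labels = np.array([1, 1, 1, 0, 1, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.1])

order = np.argsort(-scores)                 # sweep threshold high -> low
tp = np.cumsum(labels[order] == 1)          # true positives so far
fp = np.cumsum(labels[order] == 0)          # false positives so far
tpr = np.concatenate(([0.0], tp / labels.sum()))                  # true positive fraction
fpr = np.concatenate(([0.0], fp / (len(labels) - labels.sum())))  # false positive fraction

# Area under the ROC curve by the trapezoid rule; a value close to 1
# means the curve hugs the upper left corner (accurate identification).
auc = float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2))
print(auc)
```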
- The number of pixels of a cell image is preferably large (for example, 48 × 48 or more), and machine learning is preferably performed using white blood cells to which a hemolytic agent is added.
- Such machine learning is performed and a learning result thereof is stored in the storage unit 14.
- Since the identification or the extraction of the feature quantity may be performed using a learning result already stored in the storage unit 14, there is no need in that case to use the learning unit 13, or to perform the second image acquisition step S21, the second feature quantity extraction step S22, and the learning step S23.
- FIG. 11 includes views schematically illustrating (a) a structure of a white blood cell and (b) inclination information of an optical thickness.
- An arrow 51 indicates the direction of the optical thickness in the white blood cell 50.
- An arrow 52 indicates the inclination information in the optical thickness distribution.
- FIG. 12 includes views schematically illustrating (a) a structure of a cancer cell and (b) inclination information of an optical thickness.
- An arrow 56 indicates the direction of the optical thickness in the cancer cell 55.
- An arrow 57 indicates the inclination information in the optical thickness distribution.
- FIG. 13 is an ROC curve illustrating a relationship between a false positive fraction and a true positive fraction of white blood cells at discrimination between white blood cells and cancer cells using inclination information extracted as a feature quantity of a quantitative phase image.
- HOG is used as a feature quantity extraction algorithm.
- The AUC value is very high, about 0.98, which indicates that it is possible to discriminate between cancer cells and white blood cells with high accuracy.
- The identification apparatus and the identification method according to an aspect of the present invention are not limited to the embodiment and the configuration examples described above, and may be modified in various ways.
- The identification apparatus has a configuration which includes (1) a feature quantity extraction unit for extracting a feature quantity of an image of an optical thickness distribution of an object; (2) a storage unit for storing a learning result of machine learning performed based on the feature quantity extracted by the feature quantity extraction unit for the image of the optical thickness distribution of an object of which a type is known (a known object); and (3) an identification unit for determining, based on the feature quantity extracted by the feature quantity extraction unit for the image of the optical thickness distribution of an object of which a type is unknown (an unknown object), the type of the unknown object using the learning result stored by the storage unit, and the learning result stored by the storage unit is used when extracting the feature quantity of the image of the optical thickness distribution of the unknown object, or when determining the type of the unknown object.
- The identification apparatus having the above configuration preferably further includes (4) a learning unit for performing machine learning based on the feature quantity extracted by the feature quantity extraction unit for the image of the optical thickness distribution of the known object, and (5) the storage unit preferably stores the learning result of the machine learning by the learning unit.
- The identification method has a configuration which includes (1) a first feature quantity extraction step of extracting, by a feature quantity extraction unit, a feature quantity of an image of an optical thickness distribution of an object of which a type is unknown (an unknown object); and (2) an identification step of determining the type of the unknown object based on the feature quantity extracted in the first feature quantity extraction step, using a learning result stored by a storage unit and obtained by performing machine learning based on a feature quantity extracted by the feature quantity extraction unit for an image of an optical thickness distribution of an object of which a type is known (a known object), and the learning result stored by the storage unit is used when extracting the feature quantity of the image of the optical thickness distribution of the unknown object, or when determining the type of the unknown object.
- The identification method having the above configuration preferably further includes (3) a second feature quantity extraction step of extracting, by the feature quantity extraction unit, the feature quantity of the image of the optical thickness distribution of the known object, and (4) a learning step of performing machine learning based on the feature quantity extracted in the second feature quantity extraction step and causing the storage unit to store the learning result thereof.
- a configuration may be employed in which the feature quantity extraction unit sets at least one region, from which the feature quantity is extracted, in the image of the optical thickness distribution of the unknown object using the learning result stored by the storage unit.
- the identification apparatus may have a configuration in which the feature quantity extraction unit sets at least one region, from which the feature quantity is extracted, in the image of the optical thickness distribution of the unknown object using the learning result stored by the storage unit.
- the identification method may have a configuration in which, in the first feature quantity extraction step, at least one region, from which the feature quantity is extracted, is set in the image of the optical thickness distribution of the unknown object using the learning result stored by the storage unit.
- a configuration may be employed in which information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution is extracted as the feature quantity of the image.
- the identification apparatus may have a configuration in which the feature quantity extraction unit extracts information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution as the feature quantity of the image.
- the identification method may have a configuration in which, in the feature quantity extraction step, information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution is extracted as the feature quantity of the image.
- a configuration is employed in which, in particular, the information regarding the spatial change amount of the optical thickness at a position in the image of the optical thickness distribution is both a gradient magnitude and a gradient direction of a vector at the position (pixel) in the image of the optical thickness distribution.
- a configuration may be employed in which a white blood cell and a cancer cell are included as the object. Further, a configuration may be employed in which the feature quantity extraction unit extracts the feature quantity of the image of the optical thickness distribution of the object to which a hemolytic agent is added.
- the identification apparatus includes a feature quantity extraction unit for extracting a feature quantity of an image of an optical thickness distribution of an object, and an identification unit for determining the type of the object based on the extracted feature quantity, and the feature quantity extraction unit extracts information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution as the feature quantity of the image.
- the identification method includes an extraction step of extracting a feature quantity of an image of an optical thickness distribution of an object, and an identification step of determining the type of the object based on the extracted feature quantity, and in the extraction step, information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution is extracted as the feature quantity of the image.
- An aspect of the present invention can be used as an identification apparatus and an identification method capable of identifying an object, even when the object has a three-dimensional shape, has a size and a shape with no distinctive feature, and is colorless and transparent.
- 1 - identification apparatus, 11 - quantitative phase image acquisition unit, 12 - feature quantity extraction unit, 13 - learning unit, 14 - storage unit, 15 - identification unit.
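The gradient-based feature described above can be sketched in code. The following is a minimal illustration, not the patent's actual implementation: it assumes the optical thickness distribution is available as a 2-D array of per-pixel thickness values, and the summary of the per-pixel gradient magnitudes and directions into a magnitude-weighted orientation histogram (a HOG-like vector) is an illustrative choice, since the patent does not fix a particular aggregation.

```python
import numpy as np

def gradient_features(thickness, n_bins=8):
    """Compute the gradient magnitude and direction at each pixel of an
    optical thickness image, then summarize them as an orientation
    histogram weighted by magnitude (illustrative aggregation)."""
    gy, gx = np.gradient(thickness.astype(float))  # per-axis derivatives
    magnitude = np.hypot(gx, gy)                   # spatial change amount
    direction = np.arctan2(gy, gx)                 # radians in [-pi, pi]
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    hist, _ = np.histogram(direction, bins=bins, weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist     # normalized feature vector

# toy example: a smooth cell-like phase bump
y, x = np.mgrid[0:32, 0:32]
cell = np.exp(-((x - 16.0) ** 2 + (y - 16.0) ** 2) / 50.0)
features = gradient_features(cell)
print(features.shape)  # (8,)
```

Using both magnitude and direction, rather than magnitude alone, preserves the shape of the thickness profile even for colorless, transparent objects whose size alone is not distinctive.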
Landscapes
- Chemical & Material Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Dispersion Chemistry (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Theoretical Computer Science (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Claims (8)
- Identification apparatus (1), comprising: a feature quantity extraction unit (12) configured to extract a feature quantity of an image of an optical thickness distribution of an object; a storage unit (14) configured to store a learning result of machine learning performed based on the feature quantity extracted by the feature quantity extraction unit (12) for the image of the optical thickness distribution of a known object whose type is known; and an identification unit (15) configured to determine, based on the feature quantity extracted by the feature quantity extraction unit (12) for the image of the optical thickness distribution of an unknown object whose type is unknown, the type of the unknown object using the learning result stored by the storage unit (14), wherein the learning result stored by the storage unit (14) is used when extracting the feature quantity of the image of the optical thickness distribution of the unknown object or when determining the type of the unknown object, and the feature quantity extraction unit (12) is configured to extract information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution as the feature quantity of the image, characterized in that the image is a quantitative phase image and the information regarding the spatial change amount of the optical thickness is both a gradient magnitude and a gradient direction of a vector at the position in the image of the optical thickness distribution.
- Identification apparatus (1) according to claim 1, further comprising a learning unit (13) configured to perform the machine learning based on the feature quantity extracted by the feature quantity extraction unit for the image of the optical thickness distribution of the known object, wherein
the storage unit (14) is configured to store the learning result of the machine learning by the learning unit. - Identification apparatus (1) according to claim 1 or 2, wherein the feature quantity extraction unit (12) sets, using the learning result stored by the storage unit (14), at least one region from which the feature quantity is extracted in the image of the optical thickness distribution of the unknown object.
- Identification method, comprising: a first feature quantity extraction step of extracting, by a feature quantity extraction unit (12), a feature quantity of an image of an optical thickness distribution of an unknown object whose type is unknown; and an identification step of determining the type of the unknown object based on the feature quantity extracted in the first feature quantity extraction step, using a learning result stored by a storage unit (14) which is obtained by performing machine learning based on a feature quantity extracted by the feature quantity extraction unit (12) for an image of an optical thickness distribution of a known object whose type is known, wherein the learning result stored by the storage unit (14) is used when extracting the feature quantity of the image of the optical thickness distribution of the unknown object or when determining the type of the unknown object, and the feature quantity extraction unit (12) extracts information regarding a spatial change amount of an optical thickness at a position in the image of the optical thickness distribution as the feature quantity of the image, characterized in that the image is a quantitative phase image and the information regarding the spatial change amount of the optical thickness is both a gradient magnitude and a gradient direction of a vector at the position in the image of the optical thickness distribution.
- Identification method according to claim 4, further comprising: a second feature quantity extraction step of extracting, by the feature quantity extraction unit (12), the feature quantity of the image of the optical thickness distribution of the known object; and a learning step of performing the machine learning based on the feature quantity extracted in the second feature quantity extraction step and causing the storage unit (14) to store the learning result.
- Identification method according to claim 4 or 5, wherein, in the first feature quantity extraction step, at least one region from which the feature quantity is extracted is set in the image of the optical thickness distribution of the unknown object using the learning result stored by the storage unit (14).
- Identification method according to any one of claims 4 to 6, wherein a white blood cell and a cancer cell are included as the object.
- Identification method according to claim 7, wherein the feature quantity extraction unit (12) extracts the feature quantity of the image of the optical thickness distribution of the object to which a hemolytic agent has been added.
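The learn-then-identify flow of claims 4 and 5 can be sketched as follows. This is a toy illustration only: it assumes feature vectors have already been extracted, the class names and vectors are fabricated, and a nearest-centroid classifier stands in for the machine-learning model, which the claims deliberately leave unspecified.

```python
import numpy as np

class IdentificationSketch:
    """Toy stand-in for the learning unit, storage unit, and
    identification unit: learn per-class mean feature vectors from
    known objects, store them, and identify an unknown object by the
    nearest centroid."""

    def __init__(self):
        self.centroids = {}  # plays the role of the storage unit

    def learn(self, features, labels):
        # learning step: derive a learning result from known objects
        for label in set(labels):
            rows = np.array([f for f, l in zip(features, labels) if l == label])
            self.centroids[label] = rows.mean(axis=0)

    def identify(self, feature):
        # identification step: use the stored learning result
        return min(self.centroids,
                   key=lambda l: np.linalg.norm(feature - self.centroids[l]))

# known objects: fabricated feature vectors for two classes
known = [np.array([1.0, 0.0]), np.array([0.9, 0.1]),
         np.array([0.0, 1.0]), np.array([0.1, 0.9])]
labels = ["white blood cell", "white blood cell",
          "cancer cell", "cancer cell"]

model = IdentificationSketch()
model.learn(known, labels)
print(model.identify(np.array([0.05, 0.95])))  # cancer cell
```

The separation between `learn` and `identify` mirrors the claims' split between the learning step (known objects) and the identification step (unknown objects), with the stored centroids acting as the learning result.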
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014153651 | 2014-07-29 | ||
PCT/JP2015/071023 WO2016017533A1 (ja) | 2014-07-29 | 2015-07-23 | 識別装置および識別方法 |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3176563A1 EP3176563A1 (de) | 2017-06-07 |
EP3176563A4 EP3176563A4 (de) | 2018-04-25 |
EP3176563B1 true EP3176563B1 (de) | 2021-06-30 |
Family
ID=55217434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15827812.7A Active EP3176563B1 (de) | 2014-07-29 | 2015-07-23 | Identifizierungsvorrichtung und identifizierungsverfahren |
Country Status (4)
Country | Link |
---|---|
US (1) | US10180387B2 (de) |
EP (1) | EP3176563B1 (de) |
JP (1) | JP6692049B2 (de) |
WO (1) | WO2016017533A1 (de) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3410416B1 (de) * | 2016-01-28 | 2021-08-04 | Ricoh Company, Ltd. | Bildverarbeitungsvorrichtung, bildgebungsvorrichtung, steuerungssystem für mobilitätsvorrichtung, bildverarbeitungsverfahren und programm |
US20200251184A1 (en) * | 2016-12-16 | 2020-08-06 | Osaka University | Classification analysis method, classification analysis device, and storage medium for classification analysis |
CN110392732B (zh) * | 2017-03-02 | 2023-07-28 | 株式会社岛津制作所 | 细胞分析方法和细胞分析装置 |
CN110720034B (zh) * | 2017-05-07 | 2022-10-18 | 艾珀尔有限公司 | 识别方法、分类分析方法、识别装置、分类分析装置及记录介质 |
US20200193140A1 (en) * | 2017-08-24 | 2020-06-18 | Nano Global | Detection of Biological Cells or Biological Substances |
JP6999129B2 (ja) | 2017-09-06 | 2022-01-18 | 浜松ホトニクス株式会社 | 細胞観察システムおよび細胞観察方法 |
JP6798625B2 (ja) * | 2017-11-14 | 2020-12-09 | 株式会社ニコン | 定量位相画像生成方法、定量位相画像生成装置およびプログラム |
EP3714459A4 (de) * | 2017-11-20 | 2021-12-22 | Nano Global Corp. | Datenerfassung und -analyse auf der grundlage der detektion von biologischen zellen oder biologischen substanzen |
JP6732722B2 (ja) * | 2017-12-11 | 2020-08-05 | 憲隆 福永 | 胚選抜システム |
JP7000379B2 (ja) * | 2019-05-07 | 2022-01-19 | 憲隆 福永 | 胚選抜システム |
CN111105416B (zh) * | 2019-12-31 | 2022-09-09 | 北京理工大学重庆创新中心 | 一种骨髓细胞增生程度自动分级方法及系统 |
CN111784669B (zh) * | 2020-06-30 | 2024-04-02 | 长沙理工大学 | 一种胶囊内镜图像多病灶检测方法 |
EP4123577A4 (de) * | 2020-07-30 | 2024-04-03 | Hamamatsu Photonics Kk | Beurteilungsvorrichtung, beurteilungsverfahren, beurteilungsprogramm und aufzeichnungsmedium |
JP2022039514A (ja) | 2020-08-28 | 2022-03-10 | 浜松ホトニクス株式会社 | 学習モデル生成方法、識別方法、学習モデル生成システム、識別システム、学習モデル生成プログラム、識別プログラム及び記録媒体 |
WO2022195754A1 (ja) * | 2021-03-17 | 2022-09-22 | 株式会社エビデント | データ処理方法、データ処理装置、三次元観察装置、学習方法、学習装置及び記録媒体 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120169863A1 (en) * | 2010-12-29 | 2012-07-05 | Ido Bachelet | Apparatus and method for automatic detection of pathogens |
US20150124082A1 (en) * | 2012-08-24 | 2015-05-07 | Fuji Xerox Co., Ltd. | Image processing device, program, image processing method, computer-readable medium, and image processing system |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8131053B2 (en) * | 1999-01-25 | 2012-03-06 | Amnis Corporation | Detection of circulating tumor cells using imaging flow cytometry |
US20060258018A1 (en) * | 2003-09-23 | 2006-11-16 | Curl Claire L | Method and apparatus for determining the area or confluency of a sample |
US9322834B2 (en) * | 2007-05-30 | 2016-04-26 | Sysmex Corporation | Sample analyzer, blood analyzer and displaying method |
JP5022274B2 (ja) * | 2008-03-06 | 2012-09-12 | 浜松ホトニクス株式会社 | 観察装置 |
EP2290350B1 (de) | 2008-06-04 | 2018-11-21 | Hitachi High-Technologies Corporation | Verfahren und vorrichtung zur partikelbildanalyse |
WO2010056859A1 (en) * | 2008-11-14 | 2010-05-20 | Beckman Coulter, Inc. | Monolithic optical flow cells and method of manufacture |
JP5321145B2 (ja) * | 2009-03-04 | 2013-10-23 | 日本電気株式会社 | 画像診断支援装置、画像診断支援方法、画像診断支援プログラム、及びその記憶媒体 |
US8599383B2 (en) * | 2009-05-06 | 2013-12-03 | The Regents Of The University Of California | Optical cytometry |
WO2011132586A1 (ja) * | 2010-04-23 | 2011-10-27 | 浜松ホトニクス株式会社 | 細胞観察装置および細胞観察方法 |
US11105686B2 (en) * | 2010-05-10 | 2021-08-31 | University of Pittshurgh-Of the Commonwealth System of Higher Education | Spatial-domain low-coherence quantitative phase microscopy |
EP2637015A4 (de) * | 2010-11-01 | 2016-09-07 | Kanagawa Kagaku Gijutsu Akad | Zellanalysator |
CN103328921B (zh) * | 2011-01-25 | 2017-11-14 | 麻省理工学院 | 单镜头全视场反射相显微镜 |
CN103842769B (zh) * | 2011-08-02 | 2017-12-15 | 加利福尼亚大学董事会 | 通过活细胞干涉测量法的快速、大量平行单细胞药物响应测量 |
JP6265364B2 (ja) * | 2012-07-24 | 2018-01-24 | 国立大学法人電気通信大学 | 細胞識別装置及び細胞識別方法、並びに、細胞識別方法のプログラム及びそのプログラムを記録した記録媒体 |
US20140193892A1 (en) | 2012-07-25 | 2014-07-10 | Theranos, Inc. | Image analysis and measurement of biological samples |
EP2972214B1 (de) * | 2013-03-15 | 2018-10-31 | Iris International, Inc. | Hüllflüssigkeitssysteme und verfahren zur partikelanalyse in blutproben |
EP2997363A4 (de) * | 2013-05-13 | 2016-11-30 | Chiranjit Deka | Zellanalysevorrichtungen und -verfahren |
WO2015065909A1 (en) * | 2013-10-30 | 2015-05-07 | The General Hospital Corporation | System and method for inertial focusing cytometer with integrated optics for particle characterization |
WO2015085216A1 (en) * | 2013-12-06 | 2015-06-11 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Spatial-domain low-coherence quantitative phase microscopy |
JP6116502B2 (ja) * | 2014-02-28 | 2017-04-19 | シスメックス株式会社 | 検体分析装置および検体分析方法 |
US10036698B2 (en) * | 2015-06-19 | 2018-07-31 | Captl Llc | Time-sequential cytometry |
- 2015
- 2015-07-23 US US15/329,456 patent/US10180387B2/en active Active
- 2015-07-23 EP EP15827812.7A patent/EP3176563B1/de active Active
- 2015-07-23 WO PCT/JP2015/071023 patent/WO2016017533A1/ja active Application Filing
- 2015-07-23 JP JP2016538314A patent/JP6692049B2/ja active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120169863A1 (en) * | 2010-12-29 | 2012-07-05 | Ido Bachelet | Apparatus and method for automatic detection of pathogens |
US20150124082A1 (en) * | 2012-08-24 | 2015-05-07 | Fuji Xerox Co., Ltd. | Image processing device, program, image processing method, computer-readable medium, and image processing system |
Also Published As
Publication number | Publication date |
---|---|
US20170212033A1 (en) | 2017-07-27 |
EP3176563A1 (de) | 2017-06-07 |
US10180387B2 (en) | 2019-01-15 |
EP3176563A4 (de) | 2018-04-25 |
WO2016017533A1 (ja) | 2016-02-04 |
JPWO2016017533A1 (ja) | 2017-05-25 |
JP6692049B2 (ja) | 2020-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3176563B1 (de) | Identifizierungsvorrichtung und identifizierungsverfahren | |
Sanchez-Morillo et al. | Classification of breast cancer histopathological images using KAZE features | |
WO2017132674A1 (en) | Automated image analysis to assess reproductive potential of human oocytes and pronuclear embryos | |
US20220215548A1 (en) | Method and device for identifying abnormal cell in to-be-detected sample, and storage medium | |
EP2751730A2 (de) | Systeme und verfahren zur gewebeklassifizierung | |
Li et al. | Automatic localization and identification of mitochondria in cellular electron cryo-tomography using faster-RCNN | |
Sadek et al. | Automatic discrimination of color retinal images using the bag of words approach | |
US10915729B2 (en) | Three-dimensional cell and tissue image analysis for cellular and sub-cellular morphological modeling and classification | |
Kowal et al. | Breast cancer nuclei segmentation and classification based on a deep learning approach | |
CN110751069A (zh) | 一种人脸活体检测方法及装置 | |
KR102500220B1 (ko) | 3차원 굴절률 영상과 인공지능을 이용한 세포의 세부 분류 구분 방법 및 장치 | |
Ribeiro et al. | Analysis of the influence of color normalization in the classification of non-hodgkin lymphoma images | |
US20240054639A1 (en) | Quantification of conditions on biomedical images across staining modalities using a multi-task deep learning framework | |
WO2016189469A1 (en) | A method for medical screening and a system therefor | |
Dubosclard et al. | Automated visual grading of grain kernels by machine vision | |
Krappe et al. | Automated classification of bone marrow cells in microscopic images for diagnosis of leukemia: a comparison of two classification schemes with respect to the segmentation quality | |
KR20230063147A (ko) | 다단계 특징 분석을 사용한 전립선 조직의 효율적인 경량 cnn과 앙상블 머신 러닝 분류 방법 및 시스템 | |
KR20210113573A (ko) | 인공지능을 이용하여 정렬된 염색체 이미지의 분석을 통한 염색체 이상 판단 방법, 장치 및 컴퓨터프로그램 | |
López-Tapia et al. | A fast pyramidal bayesian model for mitosis detection in whole-slide images | |
Kwak et al. | Nucleus detection using gradient orientation information and linear least squares regression | |
Romo-Bucheli et al. | Identifying histological concepts on basal cell carcinoma images using nuclei based sampling and multi-scale descriptors | |
Christian Ramarolahy et al. | Classification and generation of microscopy images with plasmodium falciparum via artificial neural networks | |
Singh et al. | Accurate Cervical Tumor Cell Segmentation and Classification from Overlapping Clumps in Pap Smear Images | |
Foran et al. | A cagrid-enabled, learning based image segmentation method for histopathology specimens | |
Lu et al. | Analysis of deep ultraviolet fluorescence images for intraoperative breast tumor margin assessment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
17P | Request for examination filed |
Effective date: 20170125 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20180322 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G01N 15/02 20060101ALI20180316BHEP Ipc: G06K 9/00 20060101ALI20180316BHEP Ipc: G06T 7/00 20170101ALI20180316BHEP Ipc: G01N 15/10 20060101ALN20180316BHEP Ipc: G01N 15/00 20060101ALN20180316BHEP Ipc: G01N 21/41 20060101ALI20180316BHEP Ipc: G01N 15/14 20060101AFI20180316BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20181211 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G01N 15/10 20060101ALN20210127BHEP Ipc: G01N 15/02 20060101ALI20210127BHEP Ipc: G01N 21/41 20060101ALI20210127BHEP Ipc: G01N 15/00 20060101ALN20210127BHEP Ipc: G06T 7/00 20170101ALI20210127BHEP Ipc: G01N 15/14 20060101AFI20210127BHEP Ipc: G06K 9/00 20060101ALI20210127BHEP |
|
INTG | Intention to grant announced |
Effective date: 20210302 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: OZAKI, YUSUKE Inventor name: KIKUCHI, HIROTOSHI Inventor name: IWAI, HIDENAO Inventor name: KONNO, HIROYUKI Inventor name: YAMAUCHI, TOYOHIKO |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: YAMAUCHI, TOYOHIKO Inventor name: KIKUCHI, HIROTOSHI Inventor name: KONNO, HIROYUKI Inventor name: IWAI, HIDENAO Inventor name: OZAKI, YUSUKE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1406772 Country of ref document: AT Kind code of ref document: T Effective date: 20210715 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602015070928 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210930 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1406772 Country of ref document: AT Kind code of ref document: T Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210930 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211001 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211102 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602015070928 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210723 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
26N | No opposition filed |
Effective date: 20220331 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210723 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210830 Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20150723 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230510 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20230614 Year of fee payment: 9 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20230601 Year of fee payment: 9 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20230531 Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210630 |