WO2019128971A1 - Control method for automated microscope system, microscope system, and computer-readable storage medium - Google Patents

Control method for automated microscope system, microscope system, and computer-readable storage medium

Info

Publication number
WO2019128971A1
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
microscope system
control method
automated microscope
magnification image
Prior art date
Application number
PCT/CN2018/123401
Other languages
English (en)
French (fr)
Inventor
叶肇元
Original Assignee
云象科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 云象科技股份有限公司 filed Critical 云象科技股份有限公司
Priority to EP18894641.2A priority Critical patent/EP3734515A4/en
Priority to US16/957,467 priority patent/US11287634B2/en
Priority to JP2020536747A priority patent/JP7277886B2/ja
Publication of WO2019128971A1 publication Critical patent/WO2019128971A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/10Investigating individual particles
    • G01N15/14Optical investigation techniques, e.g. flow cytometry
    • G01N15/1429Signal processing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/10Investigating individual particles
    • G01N15/14Optical investigation techniques, e.g. flow cytometry
    • G01N15/1429Signal processing
    • G01N15/1433Signal processing using image recognition
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/10Investigating individual particles
    • G01N2015/1006Investigating individual particles for cytology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Definitions

  • The present application relates to the field of automated microscope systems, and more particularly to a control method for an automated microscope system, a microscope system, and a computer-readable storage medium.
  • The analysis of biological samples is an important part of disease diagnosis. For example, blood samples, sections of target tissue samples, tissue fluid samples, and the like are analyzed to determine whether disease-related features are present.
  • To improve the efficiency of biological sample analysis, automated microscope systems have been used in the related art to reduce the time spent on manual operation. Specifically, such automated microscope systems can provide an autofocus function to help the inspector find a field of view suitable for analysis.
  • The purpose of the present application is to provide a control method for an automated microscope system and a computer-readable storage medium that can improve the efficiency of biological sample analysis.
  • Another object of the present application is to provide an automated microscope system that avoids the errors and mistakes caused by human operation.
  • To this end, the present application provides a method for controlling an automated microscope system, comprising the steps of: acquiring a low-magnification image by a device; inputting the low-magnification image into a first neural network to select a region of interest, wherein the first neural network is trained by reinforcement learning; magnifying the region of interest to generate a high-magnification image; inputting the high-magnification image into a second neural network to analyze whether the high-magnification image contains a target feature and to generate a statistical result related to the target feature; and generating a feedback signal based on the statistical result and sending the feedback signal to the first neural network, so as to train the first neural network by the reinforcement learning.
  • Preferably, the first neural network is a first convolutional neural network or a fully connected neural network.
  • Preferably, the step of inputting the low-magnification image into the first neural network to select the region of interest further comprises cutting the low-magnification image into a plurality of regions.
  • Preferably, the first neural network is the first convolutional neural network, wherein the plurality of regions are input into the first convolutional neural network to generate a probability distribution model, and the probability distribution model represents the probability that any one of the plurality of regions is the region of interest.
  • Preferably, when the first neural network finds any one of the plurality of regions to be the region of interest, a positive feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  • Preferably, when the first neural network selects any one of the plurality of regions as the region of interest, a negative feedback signal is generated and sent to the first neural network after every specific interval of time, so as to train the first neural network by the reinforcement learning.
  • Preferably, the first neural network further comprises a supervised learning algorithm, an unsupervised learning algorithm, an imitation learning algorithm, or a combination thereof.
  • Preferably, the method further comprises: determining whether the statistical result meets an overall goal.
  • Preferably, when the statistical result does not meet the overall goal, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  • Preferably, when the high-magnification image does not contain the target feature, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  • Preferably, the second neural network is a second convolutional neural network or a fully connected neural network.
  • Preferably, the second neural network is configured as an instance semantic segmentation model or an image classification model to analyze whether the high-magnification image contains the target feature.
  • The present application further provides a microscope system comprising a processor that executes the control method of the automated microscope system.
  • The present application further provides a computer-readable storage medium storing a program; when a computer loads the program, the control method of the automated microscope system can be executed.
  • FIG. 1 is a flow chart of a method of controlling an automated microscope system in accordance with a particular embodiment of the present application.
  • FIG. 2 is a block diagram of a microscope system in accordance with an embodiment of the present application.
  • The present application relates to a control method for an automated microscope system, a microscope system, and a computer-readable storage medium, characterized by reducing human operation and enabling automated learning and analysis of biological samples.
  • The term "device" as used in this application refers to an optical microscope, in particular an automated microscope system.
  • In a feasible embodiment, the automated microscope system includes an optical unit, and the optical unit includes a plurality of sets of objective lenses of different magnifications (e.g., 5x, 10x, 20x, 40x, 100x).
  • The customary way of operating a microscope is to search the field of view for a region of interest with a low-magnification objective lens and then switch to a high-magnification objective lens to observe further details.
  • Accordingly, the term "low-magnification image" as used in the present application refers to an image obtained with an objective lens of relatively low magnification, while the term "high-magnification image" as used herein refers to an image obtained with an objective lens of relatively high magnification.
  • In a specific embodiment, the low-magnification image refers to an image acquired with a 5x or 10x objective lens.
  • In a specific embodiment, the high-magnification image refers to an image acquired with a 20x, 40x, 100x or higher-magnification objective lens.
  • In a feasible embodiment, the low-magnification image is defined relative to the high-magnification image; that is, the magnification of the low-magnification image is lower than that of the high-magnification image.
  • The term "target feature" refers to a feature related to the analysis target.
  • In a specific embodiment, the target feature is a particular cell type, such as a bone marrow cell or a cancer cell.
  • In a preferred embodiment, the target feature refers to a feature associated with a particular disease or its symptoms.
  • The term "region of interest" refers to a region of the biological sample to be analyzed that, within the field of view of the device, is judged to be relevant to the analysis target.
  • The factors for determining the region of interest include, but are not limited to, image quality (e.g., focus quality and/or staining quality of the biological sample), the presence and distribution of the target feature, or a combination thereof.
  • The aforementioned focus quality can be determined in various ways, for example by computing the average intensity over a characteristic band of the image's fast Fourier transform, or by computing Laplacian of Gaussian (LoG) values.
  • In a feasible embodiment, the region of interest can be viewed as a frame having good image quality and the target feature.
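The Laplacian-based focus measure mentioned above can be sketched as follows. This is an illustrative reconstruction, not part of the patent: it applies a plain discrete Laplacian kernel (the Gaussian pre-smoothing of a full LoG is omitted for brevity) and scores sharpness as the variance of the responses; a well-focused image has stronger edge responses and therefore a higher score. Pure Python is used so the sketch is self-contained; a real system would use an imaging library.

```python
def laplacian_response(image):
    """Convolve a 2D grayscale image (list of lists) with the
    4-neighbour Laplacian kernel, skipping the 1-pixel border."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            row.append(4 * image[y][x]
                       - image[y - 1][x] - image[y + 1][x]
                       - image[y][x - 1] - image[y][x + 1])
        out.append(row)
    return out

def focus_score(image):
    """Variance of the Laplacian responses: higher means sharper."""
    vals = [v for row in laplacian_response(image) for v in row]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# A high-contrast (in-focus) patch scores higher than a flat (defocused) one.
sharp = [[(x + y) % 2 * 255 for x in range(8)] for y in range(8)]
flat = [[128 for x in range(8)] for y in range(8)]
assert focus_score(sharp) > focus_score(flat)
```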
  • The overall goal is determined by the purpose of the analysis.
  • In the bone marrow smear embodiment, the purpose of the analysis is to calculate the distribution of bone marrow cell types in the smear. A quantitative overall goal can therefore be set; for example, counting up to 500 successfully classifiable bone marrow cells is the overall goal.
  • In the lymph node section embodiment, the purpose of the analysis is to identify whether cancer cells are present in the tissue of the section. A qualitative overall goal can therefore be set; for example, the overall goal is achieved when cancer cells are identified in the section.
  • FIG. 1 is a flow chart of a method of controlling an automated microscope system in accordance with an embodiment of the present application.
  • Although the figure shows the steps in sequence, persons of ordinary skill in the art to which the present disclosure pertains will appreciate that certain steps may be interchanged or performed concurrently.
  • In step S102, a low-magnification image is acquired by the device.
  • The low-magnification image is an image of a biological sample captured by the optical assembly of the automated microscope system.
  • In one embodiment, the biological sample is a bone marrow smear.
  • The purpose of analysis in the bone marrow smear embodiment is to detect the distribution of bone marrow cell types in the sample. The target feature is therefore bone marrow cells, and the overall goal can be set to successfully classifying 500 bone marrow cells.
  • In another embodiment, the biological sample is a lymph node section.
  • The purpose of analysis in the lymph node section embodiment is to detect whether cancer cells are present in the lymph node. The target feature is therefore cancer cells, and the overall goal is the presence or absence of cancer cells.
  • In step S104, the low-magnification image is input into the first neural network to select a region of interest.
  • The first neural network may be a Convolutional Neural Network (CNN) or a Fully Connected Neural Network (also called a multi-layer perceptron).
  • In this embodiment, the first neural network is a first convolutional neural network trained by a reinforcement learning scheme.
  • The first neural network can therefore be viewed as comprising two parts: a first convolutional neural network and a reinforcement learning algorithm.
  • In this embodiment, the low-magnification image is first cut into a plurality of regions, each of which is a candidate region of interest. On this basis, the plurality of regions are input into the first convolutional neural network, which is trained by reinforcement learning to output a probability distribution model.
  • The probability distribution model represents the probability of selecting any one of the plurality of regions as the region of interest.
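The tiling and region-selection step just described can be sketched as follows, under the assumption that the first convolutional neural network is abstracted as a per-region scoring function (`score_region` below is a stand-in, not part of the patent): the low-magnification image is cut into a grid of candidate regions, each region is scored, and a softmax turns the scores into the probability distribution from which a region of interest is drawn.

```python
import math
import random

def cut_into_regions(image, tile):
    """Cut a 2D image (list of lists) into tile x tile sub-images."""
    h, w = len(image), len(image[0])
    regions = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            regions.append([row[x:x + tile] for row in image[y:y + tile]])
    return regions

def softmax(scores):
    """Turn raw region scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def select_region(image, tile, score_region):
    """Score every candidate region and sample one as the region of interest."""
    regions = cut_into_regions(image, tile)
    probs = softmax([score_region(r) for r in regions])
    idx = random.choices(range(len(regions)), weights=probs)[0]
    return idx, probs
```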
  • The reinforcement learning algorithm automatically finds latent rules in the training data and, preferably, trains the first neural network through a feedback mechanism. For example, when the first neural network obtains an image with good focus quality or an image with the target feature, a positive feedback signal is generated, whereby the capability of the first neural network is optimized through reinforcement learning. In addition, when the first neural network selects any one of the plurality of regions as the region of interest, a negative feedback signal is generated and sent to the first neural network after every specific interval of time. In this way, reinforcement learning teaches the first neural network that spending too long searching for the region of interest is discouraged.
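The feedback mechanism above can be sketched as a reward function. The numeric magnitudes here are assumptions for illustration (the patent does not fix values): a positive signal when the chosen region has good focus quality or contains the target feature, and a negative signal accruing for every elapsed time interval spent searching.

```python
def feedback_signal(good_focus, has_target_feature, elapsed_intervals,
                    positive=1.0, time_penalty=0.1):
    """Combine the positive and periodic negative feedback signals."""
    reward = 0.0
    if good_focus or has_target_feature:
        reward += positive  # positive (forward) feedback signal
    reward -= time_penalty * elapsed_intervals  # periodic negative feedback
    return reward
```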
  • The first neural network may further comprise a supervised learning algorithm, an unsupervised learning algorithm, an imitation learning algorithm, or a combination thereof.
  • Pre-collected bone marrow smear images are used as the training data.
  • Features extracted from the training data help the system judge the target, and the answer (label) corresponding to each sample is then supplied to the first neural network.
  • An overly dense distribution of bone marrow cells is detrimental to subsequent identification, while an overly sparse distribution reduces processing efficiency. Preferably, the region of interest should therefore have good image quality and a moderate density of bone marrow cells.
  • A human expert labels suitable regions as 1 and ordinary samples as 0. On this basis, as training data accumulate, the first neural network learns to identify regions containing the target feature (regions of interest). Thereafter, when new image data are input, the first neural network can estimate the probability that any of the plurality of regions in the image belongs to the region of interest.
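The supervised labelling above can be illustrated with a toy learner. This stands in for the neural-network training, which the patent does not detail: each candidate region is assumed to be summarised by a single image-quality score, an expert labels suitable regions 1 and ordinary ones 0, and a simple classifier recovers a decision threshold from the accumulated examples.

```python
def fit_threshold(samples):
    """samples: list of (feature_value, label) pairs with labels 0/1.
    Returns the midpoint between the two class means as a decision
    threshold (a deliberately minimal stand-in for network training)."""
    pos = [v for v, y in samples if y == 1]
    neg = [v for v, y in samples if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def predict(threshold, value):
    """Label a new region 1 (suitable) or 0 (ordinary)."""
    return 1 if value >= threshold else 0
```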
  • Once the region of interest has been acquired, in step S106 the region is enlarged to obtain a high-magnification image.
  • In a feasible embodiment, the automated microscope system automatically moves the region of interest to the center of the microscope's field of view and obtains a high-magnification image of the region of interest with a high-magnification objective lens, thereby revealing more detail within the region of interest.
  • The high-magnification image is then input into the second neural network to analyze whether the high-magnification image contains the target feature and to generate a statistical result related to the target feature.
  • The second neural network can be a convolutional neural network or a fully connected neural network.
  • In a feasible embodiment, the second neural network includes a second convolutional neural network configured as an instance semantic segmentation model or an image classification model, which can identify individual objects in the image together with their contours and categories; it is therefore used to analyze whether the high-magnification image contains the target feature.
  • In the bone marrow smear embodiment, the second neural network is configured to identify the bone marrow cells in the high-magnification image of the region of interest, count them, and produce a statistical result. If the second neural network does count bone marrow cells in the high-magnification image, the frame comprised by the region of interest is defined as an expected frame and the statistical result is stored. If, on the other hand, the second neural network fails to count bone marrow cells in the high-magnification image, the frame comprised by the region of interest is not defined as an expected frame, and the process proceeds to step S114 to generate a feedback signal to the first neural network, so as to train the first neural network by reinforcement learning.
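The branch just described can be sketched as follows, with `count_cells` standing in for the second neural network (the patent's instance segmentation or classification model): if cells are counted, the frame is marked as an expected frame and the statistic is stored; otherwise a feedback value is returned for training the first network (step S114).

```python
def analyze_high_mag(image, count_cells, stats):
    """Analyze one high-magnification image; `stats` accumulates the
    stored statistical results of expected frames."""
    n = count_cells(image)
    if n > 0:
        stats.append(n)  # expected frame: store the statistical result
        return {"expected_frame": True, "feedback": None}
    # No target feature counted: do not store; emit training feedback.
    return {"expected_frame": False, "feedback": -1.0}
```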
  • The feedback signal is generated based on the statistical result and sent to the first neural network to train the first neural network through reinforcement learning.
  • For example, suppose the overall goal is for the high-magnification images to contain 500 bone marrow cells. When the goal is met, the first neural network receives a positive reward; when the high-magnification image magnified from the selected region of interest contains fewer than 500 bone marrow cells, the first neural network receives a lower reward.
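A reward rule following the bone marrow example above might look like this. The linear scaling toward the 500-cell goal is an assumption for illustration; the patent only states that fewer cells yield "a lower reward".

```python
def region_reward(classified_cells, goal=500):
    """Full reward when the overall goal of classified cells is
    reached; proportionally lower reward below the goal."""
    return min(classified_cells, goal) / goal
```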
  • In the lymph node section embodiment, the second neural network is configured to identify follicular structures and cancer cells in the high-magnification image of the region of interest. If the second neural network does recognize follicular structures in the high-magnification image, the frame comprised by the region of interest is defined as an expected frame and the statistical result is stored. If, on the other hand, the second neural network fails to recognize follicular structures in the high-magnification image, the frame comprised by the region of interest is not defined as an expected frame, and the process proceeds to step S114 to generate a feedback signal to the first neural network, so as to train the first neural network by reinforcement learning.
  • In step S112, it is determined whether the statistical result meets the overall goal, where the overall goal depends on the purpose of the analysis. If the statistical result meets the overall goal, the process proceeds to step S116 to output the statistical result. Otherwise, the process proceeds to step S114, where the feedback signal is used to make the first neural network learn how to achieve the overall goal.
  • In one embodiment, the microscope system is an automated microscope system 200 comprising: an optical unit 202, a stage 204, an electronic control unit 206, a storage unit 208, and a processor 210.
  • The optical unit 202 is composed of objective lenses, relay optics, a trinocular, and a digital camera; the objective lenses are used to magnify the image of a biological sample.
  • In a feasible embodiment, the optical unit has a plurality of objective lenses of different magnifications (e.g., 5x, 10x, 20x, 40x, 100x) mounted on a motorized nosepiece.
  • The magnified image reaches the trinocular through the relay optics module; the trinocular splits the incident light into three paths, two for the human eyes and one for the digital camera.
  • The digital camera is used to capture an image of the sample.
  • The stage 204 is used to hold a slide of the biological sample.
  • The stage 204 is movable in the x, y, and z directions. Movement in the x and y directions changes the field of view over the biological sample, while movement in the z direction performs focusing on the biological sample.
  • The electronic control unit 206 is configured to control the movement of the stage 204 or the actions of the optical unit 202 according to the output of the processor 210 (e.g., rotating the motorized nosepiece to switch between objective lenses of different magnifications).
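A hypothetical control-layer sketch of the unit just described (the class and method names are illustrative, not from the patent): the processor issues commands, and the electronic control unit translates them into x/y stage motion (changing the field of view), z motion (focusing), and rotation of the motorized nosepiece to swap objectives.

```python
class ElectronicControlUnit:
    """Illustrative stand-in for electronic control unit 206."""

    def __init__(self):
        self.position = [0.0, 0.0, 0.0]  # stage x, y, z
        self.objective = 5               # current objective magnification

    def move_stage(self, dx=0.0, dy=0.0, dz=0.0):
        """x/y motion changes the field of view; z motion focuses."""
        self.position[0] += dx
        self.position[1] += dy
        self.position[2] += dz

    def set_objective(self, magnification):
        """Rotate the motorized nosepiece to the requested objective."""
        assert magnification in (5, 10, 20, 40, 100)
        self.objective = magnification
```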
  • The storage unit 208 is configured to store the images acquired by the optical unit 202 as well as one or more algorithms and/or predetermined rules. The one or more algorithms and predetermined rules may be used to cause the processor 210 to perform the control method of the automated microscope system described in FIG. 1. To carry out that control method smoothly, the storage unit 208 also stores the first neural network and the second neural network (not shown).
  • The processor 210 is configured to perform automated steps based on the one or more algorithms and predetermined rules.
  • The flow of image processing can include acquiring images, analyzing image content, and/or generating related statistical results.
  • Through the processor 210, the one or more algorithms and predetermined rules can assist the automated microscope system in identifying and acquiring images for diagnosis.
  • The reinforcement learning algorithm referred to in this application can be configured to determine an optimizing action; for example, to obtain the highest feedback, the algorithm can set a strategy of stage movement or objective lens switching so as to reach the best focal plane.
  • In a specific embodiment, the processor 210 controls the action of the optical unit 202 via the electronic control unit 206 to capture a low-magnification image of the biological sample with a low-magnification objective lens. After receiving the low-magnification image generated by the optical unit 202, the processor 210 inputs it into the first neural network to select the region of interest. Once the region of interest is found, the processor 210 controls the stage 204 via the electronic control unit 206, moves the region of interest to the center of the field of view, and acquires a magnified image of the region of interest through the high-magnification objective lens. This high-magnification image shows more details of the biological sample to be analyzed.
  • The processor 210 then inputs the high-magnification image into the second neural network to analyze whether it contains the target feature and to generate a statistical result related to the target feature. Finally, the processor 210 generates a feedback signal based on the statistical result and sends it to the first neural network, enabling the first neural network to learn how to achieve the overall goal.
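The end-to-end loop this passage describes can be sketched as follows, with every component (image capture, region selection, analysis) reduced to a stand-in callable; only the control flow of FIG. 1 is shown: low-magnification capture (S102), region-of-interest selection (S104), magnification (S106), analysis by the second network, goal check (S112), and feedback (S114) until the overall goal is reached.

```python
def run_session(capture_low, select_roi, capture_high, analyze,
                overall_goal, max_rounds=100):
    """Drive the capture/select/analyze/feedback loop of FIG. 1.
    All callables are stand-ins for the hardware and the two networks."""
    total, feedback_log = 0, []
    for _ in range(max_rounds):
        low = capture_low()             # S102: low-magnification image
        roi = select_roi(low)           # S104: first network picks an ROI
        high = capture_high(roi)        # S106: magnify the ROI
        total += analyze(high)          # second network counts target features
        if total >= overall_goal:       # S112: overall goal reached
            return total, feedback_log  # S116 would output the statistics
        feedback_log.append(-1)         # S114: feedback to the first network
    return total, feedback_log
```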
  • The control method of the automated microscope system described above in FIG. 1 may be embodied in software in various forms and can be understood as a product, usually taking the form of executable program code and/or related data embodied on a computer-readable storage medium.
  • Computer-readable media include any or all types of memory, any or all other storage devices used with a computer, processor, or similar device, and modules associated with such devices, such as various semiconductor memories, tape storage devices, hard drives, and other similar devices that can store software at any time.
  • All or part of the program may be transmitted over a network at any time, such as the Internet or various other telecommunication networks.
  • Such transmission may cause the program to be loaded from one computer or processor to another computer or processor, for example into a hardware platform of a computing environment, or into other systems implementing a computing environment or similar functions associated with distributed machine learning techniques.
  • Another type of medium that can carry the software includes light waves, electric waves, and electromagnetic waves, used across physical interfaces between different local devices through wired or fiber-optic fixed networks and various forms of over-the-air transmission.
  • A physical component carrying the aforementioned waves, such as a wired or wireless network or a fiber-optic network, may also be considered a medium carrying the program.
  • Conventional common forms of computer-readable storage media include the following: a disk, a floppy disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, a DVD or DVD-ROM, any other optical medium, punched paper tape, any other physical storage medium with hole patterns, RAM, PROM and EPROM, FLASH-EPROM, any other memory chip or cartridge, a carrier wave carrying data or instructions, a cable or network carrying such a carrier wave, or any other medium from which a computer can read program code and/or data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Analytical Chemistry (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Dispersion Chemistry (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

A control method for an automated microscope system, a microscope system, and a computer-readable storage medium. The control method uses neural networks trained by reinforcement learning so that the microscope system analyzes biological samples automatically, improving diagnostic efficiency.

Description

Control Method for Automated Microscope System, Microscope System, and Computer-Readable Storage Medium
Technical Field
The present application relates to the field of automated microscope systems, and in particular to a control method for an automated microscope system, a microscope system, and a computer-readable storage medium.
Background Art
The analysis of biological samples is an important part of disease diagnosis. For example, blood samples, sections of target tissue samples, and tissue-fluid samples are analyzed to determine whether disease-related features are present. To improve the efficiency of biological sample analysis, automated microscope systems have come into use in the related art to reduce the time spent on manual operation. Specifically, such automated microscope systems can mostly provide an autofocus function to help inspectors find fields of view suitable for analysis.
However, as disease testing becomes more widespread and more complex, manually inspecting and analyzing large numbers of images may increase the rate of detection errors and mistakes. Conventional automated microscope systems that provide only autofocus can no longer satisfy the needs of the field. The related art therefore requires an automated microscope system that can find suitable fields of view, analyze them actively, and learn on its own.
Summary of the Invention
An object of the present application is to provide a control method for an automated microscope system and a computer-readable storage medium that can improve the efficiency of biological sample analysis.
Another object of the present application is to provide an automated microscope system that avoids the errors and mistakes caused by human operation.
To achieve the above objects, the present application provides a control method for an automated microscope system, comprising the following steps: acquiring a low-magnification image by a device; inputting the low-magnification image into a first neural network to select a region of interest, wherein the first neural network is trained by reinforcement learning; magnifying the region of interest to produce a high-magnification image; inputting the high-magnification image into a second neural network to analyze whether the high-magnification image contains a target feature and to produce a statistical result related to the target feature; and generating a feedback signal based on the statistical result and sending the feedback signal to the first neural network, so as to train the first neural network by said reinforcement learning.
Preferably, the first neural network is a first convolutional neural network or a fully connected neural network.
Preferably, the step of inputting the low-magnification image into the first neural network to select the region of interest further comprises cutting the low-magnification image into a plurality of regions.
Preferably, the first neural network is the first convolutional neural network, wherein the plurality of regions are input into the first convolutional neural network to produce a probability distribution model, the probability distribution model representing the probability that any one of the plurality of regions is the region of interest.
Preferably, when the first neural network finds any one of the plurality of regions to be the region of interest, a positive feedback signal is generated and sent to the first neural network, so as to train the first neural network by said reinforcement learning.
Preferably, when the first neural network selects any one of the plurality of regions as the region of interest, a negative feedback signal is generated and sent to the first neural network after every specific interval of time, so as to train the first neural network by said reinforcement learning.
Preferably, the first neural network further comprises a supervised learning algorithm, an unsupervised learning algorithm, an imitation learning algorithm, or a combination thereof.
Preferably, the method further comprises: determining whether the statistical result meets an overall goal. Preferably, when the statistical result does not meet the overall goal, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by said reinforcement learning.
Preferably, when the high-magnification image does not contain the target feature, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by said reinforcement learning.
Preferably, the second neural network is a second convolutional neural network or a fully connected neural network.
Preferably, the second neural network is configured as an instance semantic segmentation model or an image classification model to analyze whether the high-magnification image contains the target feature.
The present application further provides a microscope system, characterized by comprising a processor that executes the control method of the automated microscope system.
The present application further provides a computer-readable storage medium storing a program; when a computer loads the program, the control method of the automated microscope system can be executed.
Brief Description of the Drawings
The present application can be further understood with reference to the following drawings and description. Non-limiting and non-exhaustive examples are described with reference to the following drawings. The components in the drawings are not necessarily to actual scale; the emphasis is on illustrating structures and principles.
Fig. 1 is a flowchart of a control method of an automated microscope system according to a specific embodiment of the present application.
Fig. 2 is a block diagram of a microscope system according to an embodiment of the present application.
DETAILED DESCRIPTION

The present application relates to a control method for an automated microscope system, a microscope system, and a computer-readable storage medium, characterized by reducing manual operation and enabling automated learning and analysis of biological samples.

The term "device" in this application refers to an optical microscope, especially an automated microscope system. In a feasible embodiment, the automated microscope system comprises an optical unit, and the optical unit contains multiple sets of objective lenses with different magnifications (e.g., 5×, 10×, 20×, 40×, 100×).

The customary way to operate a microscope is to search the field of view for a region of interest with a low-magnification objective, and then switch to a high-magnification objective to observe the details more closely. Accordingly, the term "low-magnification image" in this application refers to an image acquired with an objective of relatively low magnification, while the term "high-magnification image" refers to an image acquired with an objective of relatively high magnification. In a specific embodiment, the low-magnification image is an image acquired with a 5× or 10× objective. In a specific embodiment, the high-magnification image is an image acquired with a 20×, 40×, 100×, or higher-magnification objective. In a feasible embodiment, the low-magnification image is defined relative to the high-magnification image; that is, the magnification of the low-magnification image is lower than that of the high-magnification image.

The term "target feature" in this application refers to a feature related to the analysis target. In a specific embodiment, the target feature is a particular cell type, such as bone marrow cells or cancer cells. In a preferred embodiment, the target feature is a feature associated with a particular disease or its symptoms.
The term "region of interest" in this application refers to a region of the biological sample under analysis that, within the field of view of the device, is judged to be relevant to the analysis target. Factors for determining the region of interest include, but are not limited to: image quality (e.g., focus quality and/or staining quality of the biological sample), the presence and distribution of target features, or a combination thereof. The aforementioned focus quality can be determined in several ways, for example by computing the mean intensity of a characteristic band of the image after a fast Fourier transform, or by computing Laplacian of Gaussian (LoG) values. In a feasible embodiment, the region of interest can be regarded as a frame that has good image quality and contains target features.
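The two focus measures mentioned above (mean intensity of a characteristic FFT band, and Laplacian of Gaussian values) can be sketched as follows. This is an illustrative Python sketch, not code from the application: the function names, the low-frequency cutoff, and the use of the LoG response variance as the scalar score are our assumptions.

```python
import numpy as np
from scipy import ndimage

def log_focus_score(gray_image, sigma=2.0):
    """Focus measure: variance of the Laplacian-of-Gaussian response.
    Sharper images have stronger high-frequency edge content, so the
    LoG response spreads more widely and its variance is larger."""
    response = ndimage.gaussian_laplace(np.asarray(gray_image, dtype=float), sigma=sigma)
    return response.var()

def fft_focus_score(gray_image, low_freq_cutoff=10):
    """Focus measure: mean FFT magnitude outside a central low-frequency band.
    Defocus blur suppresses high spatial frequencies, lowering this score."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(np.asarray(gray_image, dtype=float))))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    # Zero out the DC term and low-frequency band so only the
    # characteristic high-frequency region contributes to the mean.
    spectrum[cy - low_freq_cutoff:cy + low_freq_cutoff,
             cx - low_freq_cutoff:cx + low_freq_cutoff] = 0.0
    return spectrum.mean()
```

Either score increases with sharpness, so comparing scores across z positions is one way an autofocus routine could pick the best focal plane.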
The term "overall goal" in this application is defined by the purpose of the analysis. For example, in the bone marrow smear embodiment, the purpose is to compute the distribution of bone marrow cell classes in the smear, so a quantitative overall goal can be set, e.g., counting 500 successfully classified bone marrow cells. In the lymph node section embodiment, the purpose is to identify whether cancer cells are present in the tissue of the section, so a qualitative overall goal can be set, e.g., the overall goal is met once cancer cells are identified in the section.

Embodiments will be described in detail below with reference to the drawings of the present application, in which identical and/or corresponding components are denoted by the same reference symbols.

FIG. 1 is a flowchart of a control method for an automated microscope system according to an embodiment of the present application. Although the figure shows these steps in sequence, those of ordinary skill in the art will understand that in other embodiments some steps may be swapped or performed simultaneously.

In step S102, a low-magnification image is acquired through the device. The low-magnification image is an image of a biological sample captured by the optical components of the automated microscope system. In one embodiment, the biological sample is a bone marrow smear. The purpose of the bone marrow smear embodiment is to measure the distribution of bone marrow cell classes in the sample; the target feature is therefore bone marrow cells, and the overall goal can be set to successfully classifying 500 bone marrow cells. In another embodiment, the biological sample is a lymph node section. The purpose of the lymph node section embodiment is to detect whether cancer cells are present in the lymph node; the target feature is therefore cancer cells, and the overall goal is the presence or absence of cancer cells.
In step S104, the low-magnification image is input into a first neural network to select a region of interest. The first neural network may be a convolutional neural network (CNN) or a fully connected neural network (also called a multi-layer perceptron). In this embodiment, the first neural network is a first convolutional neural network trained by a reinforcement learning scheme, so the first neural network can be regarded as comprising two parts: the first convolutional neural network and the reinforcement learning algorithm. In this embodiment, the low-magnification image is first divided into a plurality of regions, each of which is a candidate region of interest. The plurality of regions are then input into the first convolutional neural network, which is trained by reinforcement learning to output a probability distribution model. The probability distribution model represents the probability of selecting any one of the plurality of regions as the region of interest.
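The tiling and probability-distribution steps of S104 can be sketched as follows. This is a minimal illustrative sketch: `tile_image` and `region_policy` are hypothetical names, and the per-tile scores would in practice come from the first convolutional neural network rather than being passed in directly.

```python
import numpy as np

def tile_image(image, tile_size):
    """Divide a 2-D image into non-overlapping tiles (candidate regions).
    Returns the tiles and their top-left (y, x) coordinates."""
    h, w = image.shape
    tiles, coords = [], []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tiles.append(image[y:y + tile_size, x:x + tile_size])
            coords.append((y, x))
    return tiles, coords

def region_policy(scores):
    """Softmax over per-tile scores: the probability that each tile
    is selected as the region of interest (the probability
    distribution model over candidate regions)."""
    z = np.asarray(scores, dtype=float)
    z -= z.max()               # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()
```

With a trained scorer, a region could then be drawn with `np.random.choice(len(tiles), p=region_policy(scores))`.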
The reinforcement learning algorithm automatically discovers latent rules in the training data and, preferably, trains the first neural network through a feedback mechanism. For example, when the first neural network obtains a well-focused image or an image containing target features, a positive feedback signal is generated, and the capability of the first neural network is thereby optimized through reinforcement learning. In addition, while the first neural network is selecting any one of the plurality of regions as the region of interest, a negative feedback signal is generated and sent to the first neural network every specific period of time. Through reinforcement learning, this lets the first neural network learn that taking too long to find a region of interest is discouraged.
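The feedback scheme just described (a positive signal for a usable image, a negative signal for each interval spent searching) can be captured by a shaped reward along these lines. This is an illustrative sketch; the function name and the penalty and reward magnitudes are our assumptions, not values from the application.

```python
def search_reward(found_roi, elapsed_intervals, roi_reward=1.0, time_penalty=0.01):
    """Shaped reinforcement-learning reward for the ROI search:
    a positive term when a usable region of interest is found,
    minus a penalty for every elapsed search interval, so the
    policy learns that slow searching is discouraged."""
    return (roi_reward if found_roi else 0.0) - time_penalty * elapsed_intervals
```

Maximizing this reward pushes the policy both toward regions that yield well-focused, feature-bearing images and toward finding them quickly.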
In a feasible embodiment, the first neural network further incorporates a supervised learning algorithm, an unsupervised learning algorithm, an imitation learning algorithm, or a combination thereof. Taking supervised learning as an example, pre-collected bone marrow smear images are used as training data. Features are extracted from the training data to help the system identify the target, and the first neural network is then told the answer corresponding to each sample. For example, in the bone marrow smear embodiment, bone marrow cells distributed too densely hinder subsequent identification, while cells distributed too sparsely reduce processing efficiency; preferably, a region of interest should therefore have good image quality and a moderate density of bone marrow cells. A human expert labels suitable regions as 1 and ordinary samples as 0. As training data accumulate, the first neural network learns to identify regions containing target features (regions of interest). Thereafter, when a new image is input, the first neural network can identify the probability that any of its regions is a region of interest.

In step S106, after the region of interest is obtained, it is magnified to obtain a high-magnification image. In this embodiment, once the first neural network has identified the region of interest, the automated microscope system automatically moves the region of interest to the center of the microscope's field of view and acquires a high-magnification image of it with a high-magnification objective, thereby revealing more detail within the region of interest.

In step S108, the high-magnification image is input into a second neural network to analyze whether it contains the target feature, and a statistical result related to the target feature is generated. The second neural network may be a convolutional neural network or a fully connected neural network. In this embodiment, the second neural network comprises a second convolutional neural network configured as an instance semantic segmentation model or an image classification model. Such models can identify individual objects in an image along with their contours and classes, and are therefore used to analyze whether the high-magnification image contains the target feature.

In step S110, in the bone marrow smear embodiment, the second neural network is configured to identify bone marrow cells in the high-magnification image of the region of interest, count them, and produce a statistical result. If the second neural network does count bone marrow cells in the high-magnification image, the frame contained in this region of interest is defined as an expected frame and the statistical result is stored. Otherwise, if the second neural network fails to count any bone marrow cells in the high-magnification image, the frame contained in this region of interest is defined as not an expected frame, and the method proceeds to step S114 to generate a feedback signal to the first neural network, so as to train the first neural network by reinforcement learning.
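Counting cells in step S110 from the second network's segmentation output can be sketched as follows, assuming the network yields a binary foreground mask. The connected-component approach is our illustrative choice, not a detail from the application.

```python
import numpy as np
from scipy import ndimage

def count_cells(binary_mask):
    """Count connected foreground components in a segmentation mask.
    With one component per segmented cell, this yields the cell count
    that feeds the statistical result and the feedback signal."""
    _, num_cells = ndimage.label(binary_mask)
    return num_cells
```

A zero count would correspond to the "not an expected frame" branch that triggers step S114.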
In a preferred embodiment, a feedback signal is generated according to the statistical result and sent to the first neural network, so as to train the first neural network by reinforcement learning. For example, in the bone marrow smear embodiment, the overall goal is set to a high-magnification image containing 500 bone marrow cells. When 500 or more bone marrow cells are counted in the high-magnification image, the first neural network receives a positive reward; conversely, if the magnified high-magnification image of the selected region of interest contains fewer than 500 bone marrow cells, the first neural network receives a lower reward. More specifically, in a feasible embodiment, the reward function used in the reinforcement learning algorithm may take the form f(n) = min(n/500, 1), where n is the number of bone marrow cells in the high-magnification image. The reward is then determined by the number of bone marrow cells in the image: a maximum of 1 means the high-magnification image contains at least 500 bone marrow cells, and a minimum of 0 means it contains none. By maximizing the reward (equivalent to selecting regions containing at least 500 bone marrow cells), the first neural network is trained to learn to achieve the overall goal.
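The reward function f(n) = min(n/500, 1) given above translates directly to code (an illustrative sketch; the function and parameter names are ours):

```python
def cell_count_reward(n, goal=500):
    """The application's example reward: f(n) = min(n / goal, 1).
    0 when no bone marrow cells are counted, rising linearly to 1
    once the high-magnification image contains at least `goal` cells."""
    return min(n / goal, 1.0)
```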
In the lymph node section embodiment, the second neural network is configured to identify follicular structures and cancer cells in the high-magnification image of the region of interest. If the second neural network does identify follicular structures in the high-magnification image, the frame contained in this region of interest is defined as an expected frame and the statistical result is stored. Otherwise, if the second neural network fails to identify follicular structures in the high-magnification image, the frame contained in this region of interest is defined as not an expected frame, and the method proceeds to step S114 to generate a feedback signal to the first neural network, so as to train the first neural network by reinforcement learning.

In step S112, whether the statistical result meets the overall goal is determined, where the overall goal depends on the purpose of the analysis. When the statistical result meets the overall goal, the method proceeds to step S116. In the bone marrow smear embodiment, when the bone marrow cell counts from the high-magnification images of the regions of interest accumulate to the overall goal (e.g., 500 bone marrow cells), the method proceeds to step S116 to output the statistical result. In the lymph node section embodiment, once cancer cells are found in a high-magnification image, the overall goal is met and the method proceeds to step S116 to output the statistical result. Otherwise, when the statistical result does not meet the overall goal, the method proceeds to step S114, where a feedback signal lets the first neural network learn how to achieve the overall goal.
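The overall S102-S116 flow for the quantitative (bone marrow smear) case can be sketched as a loop like the following. All function parameters are illustrative stand-ins for the hardware and the two networks; in the real system the collected rewards would also be used to update the first network.

```python
def run_session(acquire_low_mag, select_roi, acquire_high_mag,
                count_target_features, goal=500):
    """Minimal sketch of the S102-S116 loop: keep selecting and
    analyzing regions of interest, accumulating the per-region reward
    (the application's f(n) = min(n/goal, 1)), until the overall goal
    is met, then return the statistics."""
    total, rewards = 0, []
    while total < goal:                      # S112: overall-goal check
        low = acquire_low_mag()              # S102: low-magnification image
        roi = select_roi(low)                # S104: first neural network
        high = acquire_high_mag(roi)         # S106: magnify the ROI
        n = count_target_features(high)      # S108/S110: second neural network
        total += n
        rewards.append(min(n / goal, 1.0))   # S114: feedback to first network
    return total, rewards                    # S116: output statistical result
```

For the qualitative (lymph node) case, the loop condition would instead stop as soon as a cancer cell is detected.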
FIG. 2 is a block diagram of a microscope system according to an embodiment of the present application. In this embodiment, the microscope system is an automated microscope system 200 comprising an optical unit 202, a stage 204, an electronic control unit 206, a storage unit 208, and a processor 210.

The optical unit 202 consists of objective lenses, relay optics, a trinocular, and a digital camera. The objective lenses magnify the image of the biological sample. In a specific example, the optical unit has multiple objectives of different magnifications (e.g., 5×, 10×, 20×, 40×, 100×) mounted on a motorized nosepiece. The magnified image passes through the relay optics to the trinocular, which splits the incident light into three paths: two for the human eyes and one for the digital camera. The digital camera acquires images of the sample.

The stage 204 holds the glass slide carrying the biological sample. The stage 204 can move in the x, y, and z directions: movement along x and y changes the field of view over the biological sample, while movement along z performs focusing on the biological sample.

The electronic control unit 206 controls the movement of the stage 204 or the actions of the optical unit 202 (for example, rotating the motorized nosepiece to switch between objectives of different magnifications) according to the output of the processor 210. The storage unit 208 stores the images acquired by the optical unit 202 as well as one or more algorithms and/or predetermined rules. These algorithms and predetermined rules enable the processor 210 to execute the control method for the automated microscope system described in FIG. 1. To carry out the control method described in FIG. 1, the storage unit 208 also stores the first neural network and the second neural network (not shown).

The processor 210 is configured to execute automated steps based on one or more algorithms and predetermined rules. For example, the image processing flow may include acquiring images, analyzing image content, and/or generating related statistical results accordingly. Through the processor 210, the one or more algorithms and predetermined rules assist the automated microscope system in identifying and acquiring images for diagnosis. For example, the reinforcement learning algorithm mentioned in this application may be configured to decide an optimized action: to obtain the highest feedback, the algorithm may set a strategy of stage movement or objective switching to obtain the best focal plane.

In a specific embodiment, the processor 210 controls the actions of the optical unit 202 via the electronic control unit 206 to capture a low-magnification image of the biological sample with a low-magnification objective. After receiving the low-magnification image produced by the optical unit 202, the processor 210 inputs the low-magnification image into the first neural network to select a region of interest. Once a region of interest is found, the processor 210 controls the stage 204 via the electronic control unit 206 to move the region of interest to the center of the field of view, and acquires a magnified high-magnification image of the region of interest through a high-magnification objective. This high-magnification image shows more detail of the biological sample under analysis.

Next, the processor 210 inputs the high-magnification image into the second neural network to analyze whether the high-magnification image contains the target feature, and generates a statistical result related to the target feature. Finally, the processor 210 generates a feedback signal according to the statistical result and sends it to the first neural network, so that the first neural network learns how to achieve the overall goal.
In another aspect of the present application, various aspects of the control method for the automated microscope system described in FIG. 1 may be embodied in software and may be understood as a product, typically in the form of executable program code and/or related data carried on or embodied in a computer-readable storage medium. Computer-readable media include any or all types of memory, any or all other storage devices used by a computer, processor, or similar device, or modules associated with such devices, such as various semiconductor memories, tape storage devices, hard disks, and other similar devices that can store software at any time.

All or part of the program may at any time be communicated over a network, such as the Internet or other telecommunication networks. Such communication may, for example, allow the program to be loaded from one computer or processor into another, e.g., from a device into the hardware platform of a computing environment, or into another system implementing a computing environment or similar functions related to distributed machine learning. Accordingly, another type of medium that can carry the software, including light waves, electric waves, and electromagnetic waves, may be used between different local devices across physical interfaces via wired or optical fixed networks and various over-the-air transmissions. Physical components carrying such electronic waves, such as wired or wireless networks, optical networks, or the like, may also be regarded as media carrying the program. As used in this application, terms such as computer-readable storage medium, unless restricted to tangible storage media, refer to any medium involved in providing instructions to a processor for execution.

In addition, common traditional forms of computer-readable storage media include: a magnetic disk, floppy disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punched paper tape, any other physical storage medium with patterns of holes, RAM, PROM and EPROM, FLASH-EPROM, any other memory chip or cartridge, a carrier wave carrying data or instructions, a cable or network carrying such a carrier wave, or any other medium from which a computer can read program code and/or data.

Those skilled in the art will recognize that the present disclosure is flexible and amenable to various modifications and/or optimizations. For example, although the arrangement of the various system components described above may be implemented in a hardware device, it may also be implemented solely in software or installed on an existing server. In addition, the control method for the automated microscope system disclosed in this application may be implemented as firmware, a firmware/software combination, a firmware/hardware combination, or a hardware/firmware/software combination.
[Reference Numerals]
S102           step                 S104           step
S106           step                 S108           step
S110           step                 S112           step
S114           step                 S116           step
200            automated microscope system
202            optical unit         204            stage
206            electronic control unit             208            storage unit
210            processor

Claims (15)

  1. A control method for an automated microscope system, characterized in that the control method comprises:
    acquiring a low-magnification image through a device;
    inputting the low-magnification image into a first neural network to select a region of interest, wherein the first neural network is trained by reinforcement learning;
    magnifying the region of interest to generate a high-magnification image;
    inputting the high-magnification image into a second neural network to analyze whether the high-magnification image contains a target feature, and generating a statistical result related to the target feature; and
    generating a feedback signal according to the statistical result, and sending the feedback signal to the first neural network, so as to train the first neural network by the reinforcement learning.
  2. The control method for an automated microscope system according to claim 1, characterized in that the first neural network is a first convolutional neural network or a fully connected neural network.
  3. The control method for an automated microscope system according to claim 2, characterized in that the step of inputting the low-magnification image into the first neural network to select the region of interest further comprises:
    dividing the low-magnification image into a plurality of regions.
  4. The control method for an automated microscope system according to claim 3, characterized in that the first neural network is the first convolutional neural network; wherein the plurality of regions are input into the first convolutional neural network to generate a probability distribution model, and the probability distribution model represents the probability that any one of the plurality of regions is the region of interest.
  5. The control method for an automated microscope system according to claim 3, characterized in that, when the first neural network finds any one of the plurality of regions to be the region of interest, a positive feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  6. The control method for an automated microscope system according to claim 3, characterized in that, while the first neural network is selecting any one of the plurality of regions as the region of interest, a negative feedback signal is generated and sent to the first neural network every specific period of time, so as to train the first neural network by the reinforcement learning.
  7. The control method for an automated microscope system according to claim 1, characterized in that the first neural network is further trained by a supervised learning algorithm, an unsupervised learning algorithm, an imitation learning algorithm, or a combination thereof.
  8. The control method for an automated microscope system according to claim 1, characterized in that the control method comprises: determining whether the statistical result meets an overall goal.
  9. The control method for an automated microscope system according to claim 8, characterized in that, when the statistical result does not meet the overall goal, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  10. The control method for an automated microscope system according to claim 1, characterized in that, when the high-magnification image does not contain the target feature, a negative feedback signal is generated and sent to the first neural network, so as to train the first neural network by the reinforcement learning.
  11. The control method for an automated microscope system according to claim 1, characterized in that the second neural network is a second convolutional neural network or a fully connected neural network.
  12. The control method for an automated microscope system according to claim 1, characterized in that the second neural network is configured as an instance semantic segmentation model or an image classification model to analyze whether the high-magnification image contains the target feature.
  13. A microscope system, characterized by comprising a processor that executes the control method for an automated microscope system according to any one of claims 1 to 10.
  14. The microscope system according to claim 13, characterized by further comprising an optical unit, a stage, an electronic control unit, a storage unit, or a combination thereof.
  15. A computer-readable storage medium storing a program, characterized in that, when a computer loads the program, the control method for an automated microscope system according to any one of claims 1 to 12 can be executed.
PCT/CN2018/123401 2017-12-26 2018-12-25 Control method for automated microscope system, microscope system and computer-readable storage medium WO2019128971A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP18894641.2A EP3734515A4 (en) 2017-12-26 2018-12-25 ORDERING PROCEDURE FOR AUTOMATIC MICROSCOPE SYSTEM, MICROSCOPE SYSTEM AND COMPUTER READABLE STORAGE MEDIA
US16/957,467 US11287634B2 (en) 2017-12-26 2018-12-25 Control method for automated microscope system, microscope system and computer-readable storage medium
JP2020536747A JP7277886B2 (ja) 2017-12-26 2018-12-25 Control method for autonomous microscope system, microscope system, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762610491P 2017-12-26 2017-12-26
US62/610,491 2017-12-26

Publications (1)

Publication Number Publication Date
WO2019128971A1 true WO2019128971A1 (zh) 2019-07-04

Family

ID=67063178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/123401 WO2019128971A1 (zh) 2017-12-26 2018-12-25 Control method for automated microscope system, microscope system and computer-readable storage medium

Country Status (5)

Country Link
US (1) US11287634B2 (zh)
EP (1) EP3734515A4 (zh)
JP (1) JP7277886B2 (zh)
TW (1) TWI699816B (zh)
WO (1) WO2019128971A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022523661A (ja) * 2019-01-22 2022-04-26 Acquisition and storage of magnified images

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108776834B (zh) * 2018-05-07 2021-08-06 Shanghai SenseTime Intelligent Technology Co., Ltd. System reinforcement learning method and apparatus, electronic device, and computer storage medium
JP7181001B2 (ja) * 2018-05-24 2022-11-30 JEOL Ltd. Biological tissue image processing system and machine learning method
US11614511B2 (en) * 2020-09-17 2023-03-28 Infineon Technologies Ag Radar interference mitigation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5287272A (en) * 1988-04-08 1994-02-15 Neuromedical Systems, Inc. Automated cytological specimen classification system and method
CN102216827A (zh) * 2008-09-13 2011-10-12 Japan Science and Technology Agency Microscope device and fluorescence observation method using the same
CN104846054A (zh) * 2015-05-22 2015-08-19 University of Electronic Science and Technology of China Automatic detection method for molds in leucorrhea based on morphological features
CN106030608A (zh) * 2013-11-06 2016-10-12 Lehigh University Biological tissue analysis and diagnosis system and method
US20170249548A1 (en) * 2016-02-26 2017-08-31 Google Inc. Processing cell images using neural networks
CN107369160A (zh) * 2017-06-28 2017-11-21 苏州比格威医疗科技有限公司 Choroidal neovascularization segmentation algorithm for OCT images

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19616997A1 (de) 1996-04-27 1997-10-30 Boehringer Mannheim Gmbh Method for the automated, microscope-assisted examination of tissue samples or body fluid samples
TW200538734A (en) * 2004-03-12 2005-12-01 Aureon Biosciences Corp Systems and methods for treating, diagnosing and predicting the occurrence of a medical condition
US7761240B2 (en) * 2004-08-11 2010-07-20 Aureon Laboratories, Inc. Systems and methods for automated diagnosis and grading of tissue images
JP6373864B2 (ja) * 2012-12-14 2018-08-15 ザ ジェイ. デヴィッド グラッドストーン インスティテューツ 自動ロボット顕微鏡検査システム
AU2014265382B2 (en) * 2013-05-15 2017-04-13 The Administrators Of The Tulane Educational Fund Microscopy of a tissue sample using structured illumination
EP3552389A4 (en) * 2016-11-11 2021-07-28 University of South Florida AUTOMATED STEREOLOGY FOR DETERMINING FABRIC CHARACTERISTICS
WO2018106691A1 (en) * 2016-12-06 2018-06-14 Abbott Laboratories Automated slide assessments and tracking in digital microscopy
DE102017111718A1 (de) * 2017-05-30 2018-12-06 Carl Zeiss Microscopy Gmbh Method for generating and analyzing an overview contrast image

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5287272A (en) * 1988-04-08 1994-02-15 Neuromedical Systems, Inc. Automated cytological specimen classification system and method
US5287272B1 (en) * 1988-04-08 1996-08-27 Neuromedical Systems Inc Automated cytological specimen classification system and method
CN102216827A (zh) * 2008-09-13 2011-10-12 Japan Science and Technology Agency Microscope device and fluorescence observation method using the same
CN106030608A (zh) * 2013-11-06 2016-10-12 Lehigh University Biological tissue analysis and diagnosis system and method
CN104846054A (zh) * 2015-05-22 2015-08-19 University of Electronic Science and Technology of China Automatic detection method for molds in leucorrhea based on morphological features
US20170249548A1 (en) * 2016-02-26 2017-08-31 Google Inc. Processing cell images using neural networks
CN107369160A (zh) * 2017-06-28 2017-11-21 苏州比格威医疗科技有限公司 Choroidal neovascularization segmentation algorithm for OCT images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3734515A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022523661A (ja) 2022-04-26 Acquisition and storage of magnified images
US11694331B2 (en) 2019-01-22 2023-07-04 Applied Materials, Inc. Capture and storage of magnified images
JP7337937B2 (ja) 2023-09-04 Acquisition and storage of magnified images

Also Published As

Publication number Publication date
US20200326526A1 (en) 2020-10-15
TWI699816B (zh) 2020-07-21
TW201929026A (zh) 2019-07-16
JP7277886B2 (ja) 2023-05-19
JP2021508127A (ja) 2021-02-25
EP3734515A1 (en) 2020-11-04
EP3734515A4 (en) 2021-10-13
US11287634B2 (en) 2022-03-29

Similar Documents

Publication Publication Date Title
WO2019128971A1 (zh) Control method for automated microscope system, microscope system and computer-readable storage medium
US20230127698A1 (en) Automated stereology for determining tissue characteristics
Imran Razzak et al. Microscopic blood smear segmentation and classification using deep contour aware CNN and extreme machine learning
Wang et al. Classification of white blood cells with patternnet-fused ensemble of convolutional neural networks (pecnn)
TW202004513A (zh) 用於缺陷分類器訓練之主動學習
US9690976B2 (en) Imaging blood cells
CN111051955A (zh) 通过使用卷积神经网络来标识利用数字全息显微镜所获取的细胞图像的品质
KR102122068B1 (ko) 이미지 분석 시스템 및 분석 방법
CN111462075B (zh) 一种全切片数字病理图像模糊区域的快速重聚焦方法及系统
US11694327B2 (en) Cross layer common-unique analysis for nuisance filtering
Shah et al. Identification of robust focus measure functions for the automated capturing of focused images from Ziehl–Neelsen stained sputum smear microscopy slide
KR102313215B1 (ko) 특징 생성 기술을 이용한 머신러닝 기반 결함 분류 장치 및 방법
WO2023283321A1 (en) Stain-free detection of embryo polarization using deep learning
Dave et al. MIMO U-Net: efficient cell segmentation and counting in microscopy image sequences
US20230194407A1 (en) Method and system for label-free imaging and classification of malaria parasites
US20220012531A1 (en) Method for configuring an image evaluation device and also image evaluation method and image evaluation device
US20240037967A1 (en) Blood analyser with out-of-focus image plane analysis and related methods
Dave A Multiple Input Multiple Output Framework for the Automatic OpticalFractionator-Based Cell Counting in Z-Stacks Using Deep Learning
Muhammad et al. Classification of Red Blood Cell Abnormality in Thin Blood Smear Images using Convolutional Neural Networks
US20230055377A1 (en) Automated training of a machine-learned algorithm on the basis of the monitoring of a microscopy measurement
WO2024083692A1 (en) Toxicity prediction of compounds in cellular structures
JP2024500933A (ja) 画像平面分析を伴う血液分析器および関連する方法
WO2024083693A1 (en) Quality control of in-vitro analysis sample output
Christian et al. Classification and Generation of Microscopy Images with Plasmodium Falciparum via Artificial Neural Networks Using Low Cost Settings
Vandal On the Feasibility of Machine Learning Algorithms Towards Low-Cost Flow Cytometry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18894641

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020536747

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018894641

Country of ref document: EP

Effective date: 20200727