CN116406468A - Computational model for analyzing images of biological specimens
- Publication number: CN116406468A
- Application number: CN202180071617.0A
- Authority: CN (China)
- Prior art keywords: biological specimen, image, computational model, cell, output data
- Legal status: Pending
Classifications
- G06V10/143: Image or video recognition or understanding; image acquisition; optical characteristics of the acquisition or illumination arrangements; sensing or illuminating at different wavelengths
- G06V10/16: Image acquisition using multiple overlapping images; image stitching
- G06V10/82: Image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V20/693: Scenes; scene-specific elements; microscopic objects, e.g. biological cells or cellular parts; acquisition
- G06V20/698: Scenes; scene-specific elements; microscopic objects, e.g. biological cells or cellular parts; matching; classification
Abstract
A method of analyzing an image of a biological specimen using a computational model is described, the method comprising processing a cell image of the biological specimen and a phase contrast image of the biological specimen using the computational model to generate output data. The cell image is a synthesis of a first bright field image of the biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane. The method further includes performing a comparison of the output data and reference data, and refining the computational model based on the comparison of the output data and the reference data. The method further includes thereafter processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
Description
Technical Field
This application is an international application claiming priority from U.S. Application Ser. No. 16/950,368, filed November 17, 2020, which is incorporated herein by reference. Also incorporated by reference are U.S. Application Ser. No. 16/265,910, filed February 1, 2019, and U.S. Application Ser. No. 17/099,983, filed November 17, 2020.
Background
Deep Artificial Neural Networks (ANNs), typically Convolutional Neural Networks (CNNs), may be used to analyze labeled or unlabeled images of biological specimens. Fluorescent labels are often used to provide detailed insight into biology, such as for labeling specific proteins, subcellular compartments, or cell types. However, due to the long exposure times required for fluorescence imaging, such labels can also disrupt biology and cause phototoxic effects. In label-free assays, the use of an ANN typically involves analyzing a single microscopic image of a given biological specimen.
Disclosure of Invention
In an example, the present disclosure includes a method of analyzing a biological specimen image using a computational model, the method comprising: processing a cell image of the biological specimen and a phase contrast image of the biological specimen using a computational model to generate output data, wherein the cell image is a synthesis of a first bright field image of the biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane; performing a comparison of the output data and reference data; refining the computational model based on the comparison of the output data and the reference data; and thereafter processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
In another example, the invention includes a non-transitory data store storing instructions that, when executed by a computing device, cause the computing device to perform functions for analyzing an image of a biological specimen using a computational model, the functions comprising: processing a cell image of the biological specimen and a phase contrast image of the biological specimen using a computational model to generate output data, wherein the cell image is a synthesis of a first bright field image of the biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane; performing a comparison of the output data and reference data; refining the computational model based on the comparison of the output data and the reference data; and thereafter processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
In yet another example, the invention includes a system for analyzing a biological specimen, the system comprising: an optical microscope; one or more processors; and a non-transitory data store storing instructions that, when executed by the one or more processors, cause the system to perform functions comprising: capturing a first bright field image of the biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane via the optical microscope; generating a cell image of the biological specimen by performing a pixel-level mathematical operation on the first bright field image and the second bright field image; processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model to generate output data; performing a comparison of the output data and reference data; refining the computational model based on the comparison of the output data and the reference data; and thereafter processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
The features, functions, and advantages that have been discussed can be achieved independently in various examples or may be combined in yet other examples, further details of which can be seen with reference to the following description and drawings.
Drawings
FIG. 1 is a functional block diagram of an environment in accordance with an exemplary embodiment;
FIG. 2 depicts a block diagram of a computing device and a computer network in accordance with an illustrative embodiment;
FIG. 3 shows a flow chart of a method according to an exemplary embodiment;
FIG. 4 illustrates an image of a biological specimen according to an exemplary embodiment;
FIG. 5 illustrates an image of another biological specimen according to an exemplary embodiment;
FIG. 6A shows experimental results of a cell-by-cell segmentation mask of the cell image response at 24 hours of a time course of HT1080 fibrosarcoma apoptosis following camptothecin (CPT, cytotoxic) treatment, generated according to an exemplary embodiment;
FIG. 6B shows cell subsets classified based on Red (NucRed) and green fluorescence (Caspase 3/7, apoptosis indicators) according to the embodiment of FIG. 6A;
FIG. 6C shows a decrease in the red population (indicating loss of living cells), an increase in red and green fluorescence (indicating early apoptosis), and an increase in green fluorescence (indicating late apoptosis) after 24 hours following CPT treatment according to the embodiment of FIG. 6A;
FIG. 6D shows the concentration response time course (percentage of total cells showing red and green fluorescence) of early apoptotic populations according to the embodiment of FIG. 6A;
FIG. 6E shows experimental results of a cell-by-cell segmentation mask of the cell image response at 24 hours of a time course of HT1080 fibrosarcoma apoptosis following cycloheximide (CHX, cytostatic) treatment, generated according to an exemplary embodiment;
FIG. 6F shows cell subsets classified based on Red (NucRed) and green fluorescence (Caspase 3/7, apoptosis indicators) according to the embodiment of FIG. 6E;
FIG. 6G shows lack of apoptosis but reduced cell count following CHX treatment according to the embodiment of FIG. 6E;
FIG. 6H shows the concentration response time course (percentage of total cells showing red and green fluorescence) of early apoptotic populations according to the embodiment of FIG. 6E;
FIG. 7A illustrates a cell-by-cell segmentation mask applied over a phase contrast image for label-free cell counting of adherent cells using a cell-by-cell segmentation analysis generated according to an example embodiment. A549 cells at different densities labeled with NucLight Red reagent were analyzed using both a label-free cell-by-cell analysis and a red cell nucleus count analysis to verify label-free counts over time;
FIG. 7B shows a cell-by-cell segmentation mask according to FIG. 7A without a phase contrast image in the background;
FIG. 7C shows the time course of phase count and NucRed count data at different densities according to the embodiment of FIG. 7A;
FIG. 7D shows correlation of count data over 48 hours and shows an R² value of 1 with a slope of 1, according to the embodiment of FIG. 7A;
FIG. 8 is a schematic view of three focal planes;
FIG. 9 is a schematic diagram of an environmental operation;
FIG. 10 is a schematic diagram of output data and reference data;
FIG. 11 is a schematic diagram of output data and reference data;
FIG. 12 is a block diagram of a method;
FIG. 13 is a block diagram of a method;
FIG. 14 illustrates images associated with image classification; and
FIG. 15 shows results of the computational model.
The drawings are for purposes of illustration, but it is to be understood that the invention is not limited to the arrangements and instrumentality shown in the drawings.
Detailed Description
I. Summary of the invention
Embodiments of the methods described herein may be used to segment phase contrast images of one or more cells of a biological specimen using out-of-focus bright field images, allowing for single cell and subpopulation analysis at rapid processing times. The disclosed exemplary methods also advantageously enable real-time label-free (i.e., non-fluorescent) cell counting and avoid the impact of fluorescent labels that would impair viability and function of living cells. Another advantage of the disclosed example methods is detecting boundaries of individual cells, regardless of the complexity of the cell morphology, including flat cells such as HUVECs.
II. Exemplary architecture
FIG. 1 is a block diagram illustrating an operating environment 100 that includes or involves, for example, an optical microscope 105 and a biological specimen 110 having one or more cells. The method 300 of FIGS. 3-5, described below, illustrates an embodiment of a method that may be implemented within the operating environment 100.
FIG. 2 is a block diagram illustrating an example of a computing device 200 configured to interact directly or indirectly with the operating environment 100 according to an example embodiment. The computing device 200 may be used to perform the functions of the methods shown in fig. 3-5 and described below. In particular, computing device 200 may be configured to perform one or more functions, including, for example, image generation functions based in part on images obtained by optical microscope 105. The computing device 200 has a processor 202 and also has a communication interface 204, a data memory 206, an output interface 208, and a display 210, each coupled to a communication bus 212. Computing device 200 may also include hardware to enable communications within computing device 200 and between computing device 200 and other devices (e.g., devices not shown). For example, the hardware may include a transmitter, a receiver, and an antenna.
The communication interface 204 may be a wireless interface and/or one or more wired interfaces that allow for short-range and long-range communications to one or more networks 214 or one or more remote computing devices 216 (e.g., tablet computer 216a, personal computer 216b, laptop computer 216c, and mobile computing device 216d). Such wireless interfaces may provide communication under one or more wireless communication protocols, such as Bluetooth, WiFi (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 protocol), Long Term Evolution (LTE), cellular communication, Near Field Communication (NFC), and/or other wireless communication protocols. Such wired interfaces may include an Ethernet interface, a Universal Serial Bus (USB) interface, or the like to communicate via wires, twisted pair, coaxial cable, optical link, fiber optic link, or other physical connection to a wired network. Thus, the communication interface 204 may be configured to receive input data from one or more devices and may also be configured to send output data to other devices.
The communication interface 204 may also include a user input device such as a keyboard, keypad, touch screen, touchpad, computer mouse, trackball, and/or other similar device.
The data storage 206 may include or take the form of one or more computer-readable storage media readable or accessible by the processor 202. The computer-readable storage medium may include volatile and/or nonvolatile storage components, such as optical, magnetic, organic, or other memory or disk storage devices, which may be integrated in whole or in part with the processor 202. The data store 206 is considered to be a non-transitory computer readable medium. In some examples, data store 206 may be implemented using a single physical device (e.g., one optical, magnetic, organic, or other memory or disk storage unit), while in other examples, data store 206 may be implemented using two or more physical devices.
The data storage 206 is thus a non-transitory computer-readable storage medium, and the executable instructions 218 are stored on the non-transitory computer-readable storage medium. The instructions 218 include computer executable code. The instructions 218, when executed by the processor 202, cause the processor 202 to perform functions. Such functions include, but are not limited to, receiving bright field images from the optical microscope 105 and generating a phase contrast image, a confluence mask, a cell image, a seed mask, a cell-by-cell segmentation mask, and a fluorescence image.
The processor 202 may be a general purpose processor or a special purpose processor (e.g., a digital signal processor, an application specific integrated circuit, etc.). The processor 202 may receive input from the communication interface 204 and process the input to generate output that is stored in the data memory 206 and output to the display 210. The processor 202 may be configured to execute executable instructions 218 (e.g., computer readable program instructions) stored in the data memory 206 and executable to provide the functionality of the computing device 200 described herein.
The computing device 200 shown in FIG. 2 may also represent a local computing device 200a in the operating environment 100, for example, in communication with the optical microscope 105. The local computing device 200a may perform one or more steps of the method 300 described below, may receive input from a user, and/or may send image data and user input to the computing device 200 to perform all or some of the steps of the method 300. In addition, in an alternative exemplary embodiment, a single platform may be used to perform the method 300 and may include the combined functions of the computing device 200 and the optical microscope 105.
FIG. 3 illustrates a flowchart of an exemplary method 300 of achieving cell-by-cell segmentation of one or more cells of a biological specimen 110, according to an exemplary embodiment. For example, the method 300 shown in FIG. 3 presents an example of a method that may be used with the computing device 200 of FIG. 2. Further, the device or system may be used or configured to perform the logical functions illustrated in FIG. 3. In some cases, components of the devices and/or systems may be configured to perform functions such that the components are configured and constructed with hardware and/or software to achieve such performance. The components of the device and/or system may be arranged to be adapted to, capable of, or operable to perform a function, such as when operated in a particular manner. Method 300 may include one or more operations, functions, or actions as illustrated by one or more of blocks 305 through 330. Although the blocks are shown in a sequential order, some of the blocks may also be performed in parallel and/or in a different order than described herein. Furthermore, different blocks may be combined into fewer blocks, split into additional blocks, and/or deleted based on the desired implementation.
It should be understood that for this and other processes and methods disclosed herein, the flow diagrams illustrate the function and operation of one possible implementation of the present examples. In this regard, each block may represent a module, segment, or portion of program code, which comprises one or more instructions executable by the processor for implementing the specified logical function(s) or step(s) in the process. The program code may be stored on any type of computer readable medium or data storage, such as a storage device including a diskette or hard drive. Furthermore, the program code may be encoded in a machine readable format on a computer readable storage medium or on other non-transitory media or articles of manufacture. The computer readable medium may include a non-transitory computer readable medium or memory, such as a computer readable medium, that stores data for a short period of time, such as register memory, processor cache, and Random Access Memory (RAM). The computer-readable medium may also include non-transitory media such as secondary or permanent long-term storage, e.g., read-only memory (ROM), optical or magnetic disks, compact disk read-only memory (CD-ROM). The computer readable medium may also be any other volatile or non-volatile memory system. A computer-readable medium may be considered to be, for example, a tangible computer-readable storage medium.
Furthermore, each block in fig. 3, as well as each block within other processes and methods disclosed herein, may represent circuitry wired to perform specific logic functions in the process. Alternative embodiments are within the scope of examples of the present disclosure, wherein functions may be performed out of the order shown or discussed, including substantially simultaneous orders or in reverse order, depending upon the function involved, as would be reasonably understood by a person of ordinary skill in the art.
III. Exemplary methods
As used herein, "bright field image" refers to an image obtained via a microscope based on a biological sample irradiated from below (such that light waves pass through a transparent portion of the biological sample). Thus, a varying brightness level is captured in the bright field image.
As used herein, "phase contrast image" refers to an image obtained directly or indirectly via a microscope based on a biological sample illuminated from below that captures the phase shift of light passing through the biological sample due to the difference in refractive index of different portions of the biological sample. For example, as a light wave passes through a biological specimen, the amplitude (i.e., brightness) and phase of the light wave change in a manner that depends on the characteristics of the biological specimen. As a result, the phase contrast image has luminance intensity values associated with pixels that vary such that denser areas with a high refractive index are rendered darker in the resulting image, while less dense areas with a lower refractive index are rendered lighter in the resulting image. Phase contrast images may be generated via a variety of techniques, including from the Z stack of bright field images (Z-stack).
As used herein, a "Z stack" or "Z scan" of bright field images refers to a digital image processing method that combines multiple images taken at different focal lengths to provide a composite image having a greater depth of field (i.e., thickness of the focal plane) than any single source bright field image.
As used herein, "focal plane" refers to a plane perpendicular to the lens axis of an optical microscope, on which a biological specimen can be observed at an optimal focus.
As used herein, "defocus" refers to the distance above or below the focal plane that enables a biological specimen to be observed in the event of defocus.
As used herein, "confluence mask" refers to a binary image in which pixels are identified as belonging to one or more cells in a biological specimen such that pixels corresponding to one or more cells are assigned a value of 1 and the remaining pixels corresponding to the background are assigned a value of 0, or vice versa.
As used herein, "cell image" refers to an image generated based on at least two bright field images obtained at different planes to enhance the contrast of the cell against the background.
As used herein, "seed mask" refers to an image having binary pixelation generated based on a set pixel intensity threshold.
As used herein, a "cell-by-cell segmentation mask" refers to an image with binary pixelation (i.e., the processor assigns a value of 0 or 1 to each pixel) such that the cells of the biological specimen 110 each display a different region of interest. The cell-by-cell segmentation mask may advantageously allow for label-free counting of cells displayed therein, for determining the entire area of individual adherent cells, for analysis based on cell texture metrics and cell shape descriptors, and/or for detecting boundaries of individual cells, including for adherent cells that tend to form sheets, where each cell may contact many other adjacent cells in biological specimen 110.
As used herein, "region growing iteration" refers to a single step in an iterative image segmentation method by which a region of interest ("ROI") is defined by iteratively expanding one or more initially identified single or multiple sets of pixels (i.e., "seeds") by acquiring the seeds and adding adjacent pixels to the set of pixels. The processor uses the similarity metric to determine which pixels to add to the growing region and defines stopping criteria for the processor to determine when the region growing is complete.
Referring now to FIGS. 3-5, a method 300 is illustrated using the computing device of FIGS. 1-2. The method 300 includes, at block 305, the processor 202 generating at least one phase contrast image 400 of a biological specimen 110 including one or more cells centered on a focal plane of the biological specimen 110. Then, at block 310, processor 202 generates a confluence mask 410 in the form of a binary image based on the at least one phase contrast image 400. Next, at block 315, the processor 202 receives a first bright field image 415 of one or more cells in the biological specimen 110 above the focal plane at a defocus amount and a second bright field image 420 of one or more cells in the biological specimen 110 below the focal plane at the defocus amount. Then, at block 320, the processor 202 generates a cell image 425 of one or more cells in the biological specimen based on the first bright field image 415 and the second bright field image 420. At block 325, processor 202 generates seed mask 430 based on cell image 425 and the at least one phase contrast image 400. And at block 330, processor 202 generates an image of one or more cells in the biological specimen based on seed mask 430 and confluence mask 410, the image showing cell-by-cell segmentation mask 435.
As shown in fig. 3, at block 305, the processor 202 generating at least one phase contrast image 400 of the biological specimen 110 (including one or more cells of the biological specimen 110 centered at a focal plane) includes: the processor 202 receives the Z-scan of the bright field image and then generates at least one phase contrast image 400 based on the Z-scan of the bright field image. In various embodiments, the biological specimen 110 may be dispersed within a plurality of wells in a well plate representative of the experimental group.
In an alternative embodiment, method 300 includes processor 202 receiving at least one fluorescence image and then calculating the fluorescence intensity of one or more cells in biological specimen 110 within the cell-by-cell segmentation mask 435. In this example, the fluorescence intensity corresponds to the level of a protein of interest, such as an antibody that labels a cell surface marker (e.g., CD20) or an annexin-V reagent that induces fluorescence corresponding to cell death. In addition, determining fluorescence intensity within individual cell boundaries can improve subpopulation identification and allow calculation of subpopulation-specific metrics (e.g., average area and eccentricity of all dead cells, as defined by the presence of annexin-V).
In another embodiment, at block 310, processor 202 generating a confluence mask 410 in the form of a binary image based on the at least one phase contrast image 400 includes: the processor 202 applying one or more local texture filters or brightness filters to enable identification of pixels belonging to one or more cells in the biological specimen 110. Example filters may include, but are not limited to, local range filters, local entropy filters, local standard deviation filters, local brightness filters, and Gabor wavelet filters. FIGS. 4 and 5 illustrate an example confluence mask 410.
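By way of illustration, a minimal sketch of such a texture-based confluence mask, assuming an 8-bit phase contrast image and the scikit-image library, might look as follows; the filter radius, threshold, and minimum object size are illustrative assumptions rather than values specified by this disclosure.

```python
import numpy as np
from skimage.filters.rank import entropy
from skimage.morphology import disk, remove_small_objects

def confluence_mask(phase_contrast, radius=5, threshold=3.0, min_size=50):
    """Binary confluence mask: 1 where local texture suggests cells, 0 for background."""
    # Local entropy responds to the high texture of cell-covered regions.
    texture = entropy(phase_contrast.astype(np.uint8), disk(radius))
    mask = texture > threshold                   # threshold the texture response
    mask = remove_small_objects(mask, min_size)  # drop isolated noisy pixels
    return mask.astype(np.uint8)
```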
In yet another alternative embodiment, the optical microscope 105 determines the focal plane of the biological specimen 110. Further, in various embodiments, the amount of defocus can range from 20 μm to 60 μm. The optimum defocus amount is determined based on the optical characteristics of the objective lens used, including the magnification and working distance of the objective lens.
In another embodiment shown in fig. 5, at block 320, processor 202 generating cell image 425 based on first bright field image 415 and second bright field image 420 comprises: the processor 202 enhances the first bright field image 415 and the second bright field image 420 with at least one of a plurality of pixel level mathematical operations or feature detection based on the third bright field image 405 centered at the focal plane. One example of a pixel level mathematical operation includes addition, subtraction, multiplication, division, or any combination of these operations. The processor 202 then calculates transformation parameters to align the first bright field image 415 and the second bright field image 420 with the at least one phase contrast image 400. Next, the processor 202 combines the brightness level of each pixel of the aligned second bright field image 420 with the brightness level of the corresponding pixel in the aligned first bright field image 415 to form the cell image 425. The combination of the brightness levels of each pixel may be achieved by any of the mathematical operations described above. The technical effect of generating the cell image 425 is to remove bright field artifacts (e.g., shadows) and enhance the image contrast to increase cell detection by the seed mask 430.
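As a sketch of one possible pixel-level combination, assuming the first and second bright field images have already been aligned to the phase contrast image, subtraction of the two defocused planes followed by rescaling could be implemented as follows; the choice of subtraction and the 8-bit rescaling are illustrative assumptions, not the particular operation required by this disclosure.

```python
import numpy as np

def cell_image(bf_above, bf_below):
    """Combine two defocused bright field images into a contrast-enhanced cell image."""
    above = bf_above.astype(np.float32)
    below = bf_below.astype(np.float32)
    # One possible pixel-level operation: the difference between the two defocused
    # planes suppresses shared bright field artifacts (e.g., shadows) and leaves
    # cell-dependent contrast.
    combined = below - above
    # Rescale to the full 8-bit range for downstream thresholding.
    combined -= combined.min()
    if combined.max() > 0:
        combined /= combined.max()
    return (combined * 255).astype(np.uint8)
```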
In yet another alternative embodiment, at block 320, processor 202 generating cell image 425 of one or more cells in biological specimen 110 based on first bright field image 415 and second bright field image 420 includes: processor 202 receives one or more user-defined parameters that determine one or more threshold levels and one or more filter sizes. Processor 202 then applies one or more smoothing filters to cell image 425 based on the one or more user-defined parameters. The technical effect of the smoothing filter is to further increase the accuracy of cell detection in the seed mask 430 and increase the likelihood that each cell will be assigned a seed. Smoothing filter parameters are selected to accommodate different adherent cell morphologies, e.g., flat and round, protruding cells, clustered cells, etc.
In another alternative embodiment, at block 325, processor 202 generating seed mask 430 based on cell image 425 and the at least one phase contrast image 400 includes: processor 202 modifying cell image 425 such that each pixel at or above a threshold pixel intensity is identified as a cell seed pixel, thereby producing seed mask 430 with binary pixelation. The technical effect of binary pixelation of the seed mask is to allow comparison with the corresponding binary pixelation of the confluence mask. Binary pixelation of the seed mask is also used as a starting point for the region growing iterations discussed below. For example, in yet another alternative embodiment, the seed mask 430 may have a plurality of seeds, each seed corresponding to a single cell in the biological specimen 110. In this embodiment, the method 300 further includes, prior to the processor 202 generating the image showing the cell-by-cell segmentation mask 435 of one or more cells in the biological specimen, the processor 202 comparing the seed mask 430 and the confluence mask 410, eliminating from the seed mask 430 one or more regions not disposed within regions of the confluence mask 410, and eliminating from the confluence mask 410 one or more regions that do not include one of the plurality of seeds of the seed mask 430. The technical effect of these eliminations is to exclude small bright objects (e.g., cell debris) that create seeds and to improve the identification of seeds used in the region growing iterations described below.
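A minimal sketch of the seed thresholding and the elimination of seeds falling outside the confluence mask, assuming 8-bit inputs and scikit-image, is shown below; the intensity threshold is an illustrative assumption, and the complementary elimination of confluence regions lacking a seed would follow the same pattern.

```python
import numpy as np
from skimage.measure import label

def seed_mask(cell_img, confluence, intensity_threshold=200):
    """Binary seed mask; seeds outside the confluence mask are discarded."""
    seeds = cell_img >= intensity_threshold  # bright spots become candidate seeds
    labeled = label(seeds)
    for region_id in range(1, labeled.max() + 1):
        region = labeled == region_id
        # Drop seeds that do not fall inside the confluence mask (e.g., debris).
        if not np.any(region & (confluence > 0)):
            seeds[region] = False
    return seeds.astype(np.uint8)
```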
In another alternative embodiment, at block 330, processor 202 generating an image of one or more cells in biological specimen 110 showing cell-by-cell segmentation mask 435 based on seed mask 430 and confluence mask 410 includes: the processor 202 performing a region growing iteration on each seed of an active seed set. Processor 202 then repeats the region growing iteration for each seed in the active seed set until the growing region of a given seed reaches one or more boundaries of the confluence mask 410 or overlaps with the growing region of another seed. Processor 202 selects the subset of active seeds for each iteration based on the properties of the corresponding pixel values in the cell image. Furthermore, a technical effect of using the at least one phase contrast image 400 and the bright field images 415, 420, 405 is that the seeds correspond to bright spots in the cell image 425 and high-texture areas in the phase contrast image 400 (i.e., the overlap of the confluence mask 410 and the seed mask 430, described in more detail below). Another technical effect produced by using the confluence mask 410, the at least one phase contrast image, and the bright field images 415, 420, 405 is increased accuracy in identifying individual cell locations and cell boundaries in the cell-by-cell segmentation mask 435, which advantageously allows for quantification of features such as cell surface protein expression, as one example.
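For illustration only, a seeded, marker-based watershed is used in the sketch below as a stand-in for the iterative region growing described above: each labeled seed grows over the cell image until it meets the confluence mask boundary or another growing region. This is an assumption-laden sketch using scikit-image, not the specific similarity metric or stopping criteria of this disclosure.

```python
import numpy as np
from skimage.measure import label
from skimage.segmentation import watershed

def cell_by_cell_mask(cell_img, seeds, confluence):
    """Grow each seed over the cell image until it meets the confluence boundary
    or the growing region of another seed."""
    markers = label(seeds)  # one integer label per seed
    # Negating the cell image makes bright seed pixels the lowest basins, so the
    # flooding starts at the seeds; the confluence mask caps the growth.
    segmentation = watershed(-cell_img.astype(np.float32),
                             markers=markers,
                             mask=confluence > 0)
    return segmentation     # 0 = background, 1..N = individual cells
```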
In yet another alternative embodiment, method 300 may include processor 202 applying one or more filters to remove objects based on one or more cell texture metrics and cell shape descriptors in response to user input. Processor 202 then modifies the image of the biological specimen showing the cell-by-cell segmentation mask in response to the application of the one or more filters. Exemplary cell texture metrics and cell shape descriptors include, but are not limited to, cell size, perimeter, eccentricity, fluorescence intensity, aspect ratio, solidity, Feret's diameter, phase contrast entropy, and phase contrast standard deviation.
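A sketch of such descriptor-based filtering over a labeled cell-by-cell segmentation mask, assuming scikit-image region properties, might look as follows; the area and eccentricity limits are illustrative user-defined parameters rather than values from this disclosure.

```python
from skimage.measure import regionprops

def filter_objects(segmentation, min_area=50, max_eccentricity=0.98):
    """Remove labeled objects whose shape descriptors fall outside user-defined limits."""
    filtered = segmentation.copy()
    for region in regionprops(segmentation):
        if region.area < min_area or region.eccentricity > max_eccentricity:
            filtered[filtered == region.label] = 0  # drop the object from the mask
    return filtered
```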
In another alternative embodiment, the method 300 may include the processor 202 determining a cell count of the biological specimen 110 based on the image of one or more cells in the biological specimen 110 showing the cell-by-cell segmentation mask 435. The foregoing cell count is advantageously enabled by the defined cell boundaries shown in the cell-by-cell segmentation mask 435, such as shown in FIG. 4. In an alternative embodiment, the one or more cells in biological specimen 110 are one or more of adherent cells and non-adherent cells. In another embodiment, the adherent cells may comprise: one or more different cancer cell lines, including human lung cancer cells, fibrosarcoma cells, breast cancer cells, and ovarian cancer cells; or a human microvascular cell line, including human umbilical vein cells. In an alternative embodiment, processor 202 performs the region growing iterations such that a different smoothing filter is applied to non-adherent cells (including human immune cells such as PBMCs and Jurkat cells) than to adherent cells to improve the approximation of cell boundaries.
As described above, the non-transitory computer readable medium has stored thereon program instructions that, when executed by the processor 202, may perform any of the functions of the aforementioned methods.
As one example, a non-transitory computer readable medium has stored thereon program instructions that, when executed by the processor 202, perform a set of actions that includes the processor 202 generating at least one phase contrast image 400 of a biological specimen 110 that includes one or more cells based on at least one bright field image 405 centered at a focal plane of the biological specimen 110. Processor 202 then generates a confluence mask 410 in the form of a binary image based on the at least one phase contrast image 400. Next, the processor 202 receives a first bright field image 415 of one or more cells in the biological specimen 110 at a defocus amount above the focal plane and a second bright field image 420 of one or more cells in the biological specimen 110 at the defocus amount below the focal plane. Processor 202 then generates a cell image 425 of the one or more cells based on the first bright field image 415 and the second bright field image 420. Processor 202 also generates seed mask 430 based on cell image 425 and the at least one phase contrast image 400. And, processor 202 generates an image of one or more cells in biological specimen 110 based on seed mask 430 and confluence mask 410, the image showing cell-by-cell segmentation mask 435.
In an alternative embodiment, the non-transitory computer readable medium further includes causing processor 202 to receive the at least one fluorescence image, and causing processor 202 to calculate the fluorescence intensity of one or more cells in the biological specimen within the cell-by-cell segmentation mask.
In another alternative embodiment, the non-transitory computer readable medium further includes causing processor 202 to generate seed mask 430 based on cell image 425 and the at least one phase contrast image 400. And the non-transitory computer-readable medium further comprises causing the processor 202 to modify the cell image 425 such that each pixel at or above a threshold pixel intensity is identified as a cell seed pixel, thereby causing the seed mask 430 to have binary pixelation.
In yet another alternative embodiment, the seed mask 430 has a plurality of seeds, each seed corresponding to a single cell. And the non-transitory computer-readable medium further comprises, prior to the processor 202 generating the image showing the cell-by-cell segmentation mask 435 of one or more cells in the biological specimen 110, causing the processor 202 to compare the seed mask 430 and the confluence mask 410, to eliminate from the seed mask 430 one or more regions not disposed within regions of the confluence mask 410, and to eliminate from the confluence mask 410 one or more regions that do not include one of the plurality of seeds of the seed mask 430.
In another alternative embodiment, the program instructions causing processor 202 to generate an image of one or more cells in biological specimen 110 showing cell-by-cell segmentation mask 435 based on seed mask 430 and confluence mask 410 include: the processor 202 performing a region growing iteration on each seed of an active seed set. The non-transitory computer-readable medium then further includes causing the processor 202 to repeat the region growing iteration for each seed in the active seed set until the growing region of a given seed reaches one or more boundaries of the confluence mask 410 or overlaps with the growing region of another seed.
The non-transitory computer-readable medium further includes causing the processor 202 to apply one or more filters to remove the object based on the one or more cell texture metrics and the cell shape descriptor in response to the user input. And processor 202 modifies the image of biological specimen 110 showing cell-by-cell segmentation mask 435 in response to application of the one or more filters.
IV. Experimental results
Exemplary embodiments allow tracking of cell health over time in subpopulations. For example, FIG. 6A shows experimental results of a cell-by-cell segmentation mask of the phase contrast image response at 24 hours of a time course of HT1080 fibrosarcoma apoptosis following camptothecin (CPT, cytotoxic) treatment, generated according to an exemplary embodiment. Cell health was determined using multiplexed readout of NucLight Red (a nuclear activity marker) and a non-interfering Caspase 3/7 green reagent (an apoptosis indicator). FIG. 6B shows cell subpopulations classified based on red and green fluorescence using a cell-by-cell analysis software tool. FIG. 6C shows, following CPT treatment according to the embodiment of FIG. 6A, a decrease in the red population (indicating loss of viable cells), an increase in red and green fluorescence (indicating early apoptosis), and an increase in green fluorescence after 24 hours (indicating late apoptosis). FIG. 6D shows the concentration response time course (percentage of total cells showing red and green fluorescence) of the early apoptotic population according to the embodiment of FIG. 6A. The values shown are mean ± SEM of 3 wells.
In another example, FIG. 6E shows experimental results of a cell-by-cell segmentation mask of the cell image response at 24 hours of a time course of HT1080 fibrosarcoma apoptosis following cycloheximide (CHX, cytostatic) treatment, generated according to an exemplary embodiment. Cell health was determined using multiplexed readout of NucLight Red (a nuclear activity marker) and a non-interfering Caspase 3/7 green reagent (an apoptosis indicator). FIG. 6F shows cell subpopulations classified based on red and green fluorescence using a cell-by-cell analysis software tool. FIG. 6G shows a lack of apoptosis but a reduced cell count following CHX treatment, according to the embodiment of FIG. 6E (data not shown). FIG. 6H shows the concentration response time course (percentage of total cells showing red and green fluorescence) of the early apoptotic population according to the embodiment of FIG. 6E. The values shown are mean ± SEM of 3 wells.
FIG. 7A illustrates a cell-by-cell segmentation mask applied over phase contrast images for label-free cell counting of adherent cells, using a cell-by-cell segmentation analysis generated with a software tool according to an exemplary embodiment. A549 cells at different densities labeled with NucLight Red reagent were analyzed using both a label-free cell-by-cell analysis and a red nuclear count analysis to verify label-free counts over time. FIG. 7B shows the cell-by-cell segmentation mask according to FIG. 7A without a phase contrast image in the background. FIG. 7C shows the time course of phase count and red count data at different densities according to the embodiment of FIG. 7A. FIG. 7D shows the correlation of count data over 48 hours and demonstrates an R² value of 1 with a slope of 1, according to the embodiment of FIG. 7A. This has been repeated in a range of cell types. The values shown are mean ± SEM of 4 wells.
V. Other examples and experimental data
For example, the following functions may be performed by the environment 100. Referring to FIG. 4, the optical microscope 105 captures a first bright field image 415 of the biological specimen 110 at a first focal plane and captures a second bright field image 420 of the biological specimen 110 at a second focal plane. Next, the environment 100 generates a cell image 425 by performing a pixel-level mathematical operation on the first bright field image 415 and the second bright field image 420.
FIG. 8 is a schematic view of three focal planes. The first focal plane 611 is located at a defocus amount 617 above a third focal plane 615 at which the biological specimen can be observed at an improved focus relative to the first and second focal planes 611, 613. The second focal plane 613 is located at the defocus amount 617 below the third focal plane 615. In some examples, the defocus amount is in the range of 20 μm to 60 μm.
Fig. 9 is a schematic diagram of the operations of the environment 100 performed by the processor 202 executing instructions stored on the data store 206. For example, processor 202 executes a computational model 601, which may take the form of, for example, an Artificial Neural Network (ANN) or a Convolutional Neural Network (CNN).
An ANN or CNN includes artificial neurons called nodes. Each node may transmit data to other nodes. A node receiving data processes the data and may then send data onward to the nodes to which it is connected. The data typically comprises numbers, and the output of each node is computed by some (e.g., non-linear) function of the sum of its inputs. The connections between nodes are called edges. Nodes and edges typically have weights that are adjusted as learning proceeds. A weight increases or decreases the strength of the signal carried by a connection. A node may have a threshold such that data is only sent onward when the aggregated signal exceeds the threshold. Typically, nodes are aggregated into layers. Different layers may perform different transformations on their inputs. Data travels from the first layer (the input layer) to the last layer (the output layer), possibly after traversing multiple intermediate layers.
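As a minimal numerical sketch of the weighted-sum-and-nonlinearity computation described above, the following illustrates a fully connected forward pass; the layer structure and the ReLU nonlinearity are illustrative assumptions, not a description of the computational model 601 itself.

```python
import numpy as np

def forward(x, weights, biases):
    """Forward pass through a small fully connected network: each layer computes a
    nonlinear function of the weighted sum of its inputs."""
    activation = x
    for w, b in zip(weights, biases):
        z = activation @ w + b           # weighted sum over incoming edges
        activation = np.maximum(z, 0.0)  # ReLU nonlinearity at each node
    return activation
```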
The environment 100 processes the cell image 425 of the biological specimen 110 and the phase contrast image 400 of the biological specimen 110 using the computational model 601 to generate output data 609. The cell image 425 is a composite of the first bright field image 415 of the biological specimen 110 at the first focal plane 611 and the second bright field image 420 of the biological specimen 110 at the second focal plane 613. For example, environment 100 processes cell image 425 and phase contrast image 400 according to nodes, connections, and weights defined by computing model 601.
In some examples, the environment 100 provides the cell image 425 and the phase contrast image 400 to the computational model 601 together, as two respective channels of image information overlaid on each other, and processes the two channels at the same time.
In other examples, environment 100 processes cell image 425 via a first channel (e.g., an input channel) of computing model 601 and phase-contrast image 400 via a second channel (e.g., a different input channel) of computing model 601. As such, in this example, the environment 100 processes the first output of the first channel and the second output of the second channel to generate output data 609 or to generate intermediate data for obtaining the output data 609.
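A sketch of these two input arrangements, assuming PyTorch and a hypothetical TwoChannelModel with illustrative layer sizes, might look as follows; neither option is asserted to be the architecture of the computational model 601.

```python
import torch
import torch.nn as nn

class TwoChannelModel(nn.Module):
    """Accepts the cell image and phase contrast image either stacked as two
    channels of one tensor or fed through two separate input branches."""
    def __init__(self):
        super().__init__()
        # Option 1: a single convolutional stem over a 2-channel input.
        self.stacked_stem = nn.Conv2d(2, 16, kernel_size=3, padding=1)
        # Option 2: one stem per image, merged afterwards.
        self.cell_stem = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.phase_stem = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.head = nn.Conv2d(16, 1, kernel_size=1)  # e.g., per-pixel cell score

    def forward(self, cell_img, phase_img, stacked=True):
        if stacked:
            x = torch.relu(self.stacked_stem(torch.cat([cell_img, phase_img], dim=1)))
        else:
            x = torch.cat([torch.relu(self.cell_stem(cell_img)),
                           torch.relu(self.phase_stem(phase_img))], dim=1)
        return self.head(x)
```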
The output data 609 typically includes information about the biological specimen 110. For example, the output data 609 may estimate a location and/or a range (e.g., boundary, area, or volume) of one or more cells of the biological specimen 110. Other examples of output data 609 are described below.
Next, the environment 100 compares the output data 609 with the reference data 615. The reference data 615 is typically ground truth data, generated by a person, that represents information about the biological specimen 110. For example, the reference data 615 may include human-generated labels indicative of the location and/or extent of cells of the biological specimen 110. In some examples, environment 100 generates computer-implemented transformations 619 of the human-generated data, which may also be included as part of the reference data 615. Examples of computer-implemented transformations include rotation, magnification, translation, and/or resolution changes, among others. Thus, the environment 100 can refine the computational model 601 in a self-supervised manner or a semi-supervised manner.
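A sketch of such computer-implemented transformations, applied identically to an input image and its human-generated labels so that the transformed pairs can serve as additional reference data, might be as follows; the rotation angles and the mirror operation are illustrative choices.

```python
import numpy as np

def augment(image, reference):
    """Computer-implemented transformations applied identically to an input image
    and its human-generated reference labels (rotation and mirroring here)."""
    pairs = [(image, reference)]
    for k in (1, 2, 3):  # 90, 180, and 270 degree rotations
        pairs.append((np.rot90(image, k), np.rot90(reference, k)))
    pairs.append((np.fliplr(image), np.fliplr(reference)))  # horizontal mirror
    return pairs
```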
The environment 100 then refines the computing model 601 based on a comparison of the output data 609 and the reference data 615. For example, the environment 100 may calculate pixel-by-pixel luminance and/or color differences between the output data 609 and the reference data 615, and adjust the nodes, connections, and/or weights of the computing model 601 so that the output data 609 generated by the computing model 601 better matches the reference data 615.
Thereafter, environment 100 processes the additional image pairs according to computing model 601 to further refine computing model 601 based on a comparison of additional output data 609 generated by the computing model with additional reference data 615. Additional image pairs include cell images 425 and phase contrast images 400, which correspond to other biological specimens 110 or other views of the same biological specimen 110 as described above. The additional reference data 615 corresponds to the additional biological specimen 110 or other views of the same biological specimen 110 as described above.
More specifically, environment 100 may modify computing model 601 (e.g., adjust nodes, connections, and/or weights of computing model 601) to reduce the sum of the respective differences between additional output data 609 and additional reference data 615. Thus, environment 100 may adjust the nodes, connections, and/or weights of computing model 601 such that collective output data 609 as a whole best matches collective reference data 615.
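A sketch of the comparison and refinement loop, assuming PyTorch, the hypothetical TwoChannelModel above, and reference data stored as per-pixel float masks, is shown below; the optimizer, learning rate, and pixel-wise loss are illustrative assumptions rather than the specific refinement procedure of this disclosure.

```python
import torch
import torch.nn as nn

def refine(model, image_pairs, references, epochs=10, lr=1e-4):
    """Adjust model weights so that output data better matches reference data,
    accumulated over all additional image pairs."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()  # pixel-wise difference measure
    for _ in range(epochs):
        for (cell_img, phase_img), reference in zip(image_pairs, references):
            output = model(cell_img, phase_img)  # output data for this pair
            loss = loss_fn(output, reference)    # comparison with reference data
            optimizer.zero_grad()
            loss.backward()                      # refine: adjust weights
            optimizer.step()
    return model
```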
As shown in FIG. 10, the output data 609 may represent an estimate of the location and/or extent of cells 621 within the biological specimen 110. In this case, the reference data 615 correctly defines the location and/or extent of the cells 621. Although FIG. 10 shows the output data 609 and the reference data 615 as identical, this is typically not the case in practice.
In some examples, if the biological specimen has a fluorescent marker, the output data 609 represents an estimate of the appearance of the biological specimen 110. In this case, the reference data 615 is generated from or includes an actual image of the biological specimen 110 with the fluorescent marker.
In some examples, the output data 609 represents an estimate of the location and/or extent of one or more nuclei within the biological specimen 110. In this case, the reference data 615 correctly defines the location and/or extent of one or more nuclei within the biological specimen 110. Additionally or alternatively, the reference data 615 represents processed fluorescence data corresponding to the biological specimen 110. Such fluorescence data may be processed to identify nuclei.
Referring to fig. 11, the output data 609 may represent an estimate of the classification of the first portion 631 of the biological specimen 110 into a first category and the classification of the second portion 633 of the biological specimen 110 into a second category (e.g., living versus dead cells, stem versus lineage specific cells, undifferentiated versus differentiated cells, epithelial versus mesenchymal cells, wild-type versus mutant cells, cells expressing a particular protein of interest versus cells not expressing the particular protein of interest, etc.). Other categories are also possible. In this case, the reference data 615 correctly defines a classification of the first portion 631 of the biological specimen 110 into a first category and a classification of the second portion 633 of the biological specimen 110 into a second category.
In other examples, the output data 609 represents classifying the entirety of the biological specimen 110 into a single category. In this case, the reference data 615 correctly classifies the entirety of the biological specimen 110 into a single category of two or more categories (e.g., healthy and unhealthy, malignant and benign, wild-type and mutant).
FIGS. 12 and 13 are block diagrams of methods 501 and 701, respectively. For example, methods 501 and 701 and related functionality may be performed by the environment 100. As shown in FIGS. 12 and 13, methods 501 and 701 include one or more operations, functions, or actions as illustrated by blocks 503, 505, 507, 509, 702, 704, 706, 708, 710, and 712. Although the blocks are shown in a sequential order, the blocks may also be performed in parallel and/or in a different order than described herein. In addition, multiple blocks may be combined into fewer blocks, split into additional blocks, and/or deleted based on the desired implementation.
At block 503, the method 501 includes processing the cell image 425 of the biological specimen 110 and the phase contrast image 400 of the biological specimen 110 using the computational model 601 to generate output data 609. The cell image 425 is a composite of the first bright field image 415 of the biological specimen 110 at the first focal plane 611 and the second bright field image 420 of the biological specimen 110 at the second focal plane 613.
At block 505, the method 501 includes performing a comparison of the output data 609 and the reference data 615.
At block 507, the method 501 includes refining the computational model 601 based on a comparison of the output data 609 and the reference data 615.
At block 509, the method 501 includes thereafter processing the additional image pairs in accordance with the computational model 601 to further refine the computational model 601 based on a comparison of additional output data 609 generated by the computational model 601 with additional reference data 615.
At block 702, the method 701 includes capturing, via the optical microscope 105, a first bright field image 415 of the biological specimen 110 at a first focal plane 611 and a second bright field image 420 of the biological specimen 110 at a second focal plane 613.
At block 704, the method 701 includes generating a cell image 425 of the biological specimen 110 by performing a pixel-level mathematical operation on the first bright field image 415 and the second bright field image 420.
At block 706, the method 701 includes processing the cell image 425 of the biological specimen 110 and the phase contrast image 400 of the biological specimen 110 using the computational model 601 to generate output data 609.
At block 708, the method 701 includes performing a comparison of the output data 609 and the reference data 615.
At block 710, the method 701 includes refining the computational model 601 based on a comparison of the output data 609 and the reference data 615.
At block 712, the method 701 includes thereafter processing the additional image pairs according to the computational model 601 to further refine the computational model 601 based on a comparison of additional output data 609 generated by the computational model 601 with additional reference data 615.
FIG. 14 illustrates images associated with generating a cell-by-cell segmentation mask using a computational model. Inclusion of the cell image 425 (e.g., information from the first bright field image 415 and the second bright field image 420) improves the performance of the computational model 601. The computational model 601 is better at separating clusters (e.g., tightly packed groups of cells). Models that include only phase contrast images as inputs are more likely to produce false positive cell identifications due to plate texture.
FIG. 15 shows a comparison of the results of the computational model 601 (e.g., phase + cells) and a model that includes only phase contrast images as inputs (e.g., phase only). With reference to both the box mAP (mean average precision) metric and the mask mAP metric, the computational model 601 produces a higher score than the "phase only" model, which indicates that the identification or classification of cells by the computational model 601 is improved. The computational model 601 is generally more robust than other models because the addition of cell image data provides more robust training information.
The mean average precision (mAP) score is used to compare the cell-by-cell segmentation mask of the computational model to a manually annotated reference cell-by-cell segmentation mask. The box mAP refers to the score calculated on cell-by-cell bounding boxes, while the mask mAP refers to the score calculated on cell-by-cell masks.
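As an illustrative aside on how such box scores are grounded, the intersection-over-union (IoU) between a predicted and a reference bounding box is the overlap measure that typically underlies a box mAP computation; a minimal sketch follows, with boxes given as (x1, y1, x2, y2) corner coordinates, which is an assumed convention rather than one stated in this disclosure.

```python
def box_iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```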
The description of the different advantageous arrangements has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the examples in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Furthermore, different advantageous examples may describe different advantages compared to other advantageous examples. The example or examples were chosen and described in order to best explain the principles of the examples, the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various examples and with various modifications as are suited to the particular use contemplated.
Claims (27)
1. A method of analyzing an image of a biological specimen using a computational model, the method comprising:
processing a cell image of the biological specimen and a phase contrast image of the biological specimen using the computational model to generate output data, wherein the cell image is a synthesis of a first bright field image of the biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane;
performing a comparison of the output data and reference data;
refining the computational model based on the comparison of the output data and the reference data; and
thereafter, processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
2. The method of claim 1, further comprising generating the cell image by performing a pixel-level mathematical operation on the first bright-field image and the second bright-field image.
3. The method of any of claims 1-2, wherein the first focal plane is located above, by a defocus amount, a third focal plane at which the biological specimen can be observed at an improved focus relative to the first and second focal planes, and wherein the second focal plane is located below the third focal plane by the defocus amount.
4. The method of claim 3, wherein the defocus amount is in the range of 20 μm to 60 μm.
5. The method of any of claims 1-4, wherein the reference data comprises a computer-implemented transformation of human-generated data.
6. The method of any of claims 1-5, further comprising generating a computer-implemented transformation of human-generated data, wherein the reference data comprises the computer-implemented transformation.
7. The method of any of claims 1-6, wherein refining the computational model comprises modifying the computational model in a self-supervised or semi-supervised manner.
8. The method of any one of claims 1 to 7, wherein the output data represents an estimate of the location and extent of cells within the biological specimen.
9. The method of claim 8, wherein the reference data correctly defines the location and the extent of the cells within the biological specimen.
10. The method of any one of claims 1 to 9, wherein the output data represents an estimate of the appearance the biological specimen would have if the biological specimen contained a fluorescent marker.
11. The method of claim 10, wherein the reference data is generated from an actual image of the biological specimen with a fluorescent marker.
12. The method of any one of claims 1 to 11, wherein the output data represents an estimate of the location and extent of nuclei in the biological specimen.
13. The method of claim 12, wherein the reference data correctly defines the location and the extent of the nuclei within the biological specimen.
14. The method of claim 13, wherein the reference data represents processed fluorescence data corresponding to the biological specimen.
15. The method of any of claims 1-14, wherein the output data represents an estimate of classifying a first portion of the biological specimen into a first category and a second portion of the biological specimen into a second category.
16. The method of claim 15, wherein the reference data correctly defines classifying the first portion of the biological specimen as the first category and classifying the second portion of the biological specimen as the second category.
17. The method of any one of claims 1 to 16, wherein the output data represents a classification of the entirety of the biological specimen into a category.
18. The method of claim 17, wherein the reference data correctly classifies the entirety of the biological specimen into the category.
19. The method of any of claims 1-18, wherein processing the additional image pairs according to the computational model to further refine the computational model comprises modifying the computational model to reduce the sum of the respective differences between the additional output data and the additional reference data.
20. The method of any one of claims 1 to 19, wherein processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model comprises processing the cell image of the biological specimen and the phase contrast image of the biological specimen using an artificial neural network.
21. The method of any one of claims 1 to 20, wherein processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model comprises processing the cell image of the biological specimen and the phase contrast image of the biological specimen using a convolutional neural network.
22. The method of any one of claims 1 to 21, wherein processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model comprises processing a synthesis of the cell image and the phase contrast image.
23. The method of any one of claims 1 to 22, wherein processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model comprises processing the cell image via a first channel of the computational model and processing the phase contrast image via a second channel of the computational model.
24. The method of claim 23, wherein processing the cell image of the biological specimen and the phase contrast image of the biological specimen using the computational model further comprises processing a first output of the first channel and a second output of the second channel to generate the output data.
25. A non-transitory data store storing instructions that, when executed by a computing device, cause the computing device to perform the method of any of claims 1-24.
26. A system for analyzing a biological specimen, the system comprising:
an optical microscope;
one or more processors; and
a non-transitory data store storing instructions which, when executed by the one or more processors, cause the system to perform the method of any one of claims 1 to 24.
27. A system for analyzing a biological specimen, the system comprising:
an optical microscope;
one or more processors; and
a non-transitory data store storing instructions that, when executed by the one or more processors, cause the system to perform functions comprising:
capturing a first bright field image of a biological specimen at a first focal plane and a second bright field image of the biological specimen at a second focal plane via the optical microscope;
generating a cellular image of the biological specimen by performing a pixel-level mathematical operation on the first bright field image and the second bright field image;
processing the cell image of the biological specimen and a phase contrast image of the biological specimen using a computational model to generate output data;
performing a comparison of the output data and reference data;
refining the computational model based on the comparison of the output data and the reference data; and
thereafter, processing additional image pairs according to the computational model to further refine the computational model based on a comparison of additional output data generated by the computational model with additional reference data.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/950,368 US11803963B2 (en) | 2019-02-01 | 2020-11-17 | Computational model for analyzing images of a biological specimen |
US16/950,368 | 2020-11-17 | ||
PCT/US2021/059417 WO2022108884A1 (en) | 2020-11-17 | 2021-11-15 | Computational model for analyzing images of a biological specimen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116406468A true CN116406468A (en) | 2023-07-07 |
Family
ID=78844844
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202180071617.0A Pending CN116406468A (en) | 2020-11-17 | 2021-11-15 | Computational model for analyzing images of biological specimens |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4248423A1 (en) |
JP (1) | JP2023549631A (en) |
KR (1) | KR20230106629A (en) |
CN (1) | CN116406468A (en) |
WO (1) | WO2022108884A1 (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8744164B2 (en) * | 2010-04-06 | 2014-06-03 | Institute For Systems Biology | Automated analysis of images using bright field microscopy |
HUP1200018A2 (en) * | 2012-01-11 | 2013-07-29 | 77 Elektronika Mueszeripari Kft | Method of training a neural network, as well as a neural network |
CN110709749B (en) * | 2017-06-09 | 2021-10-26 | 电子慕泽雷帕里公司 | Combined bright field and phase contrast microscope system and image processing apparatus equipped therewith |
US10885631B2 (en) * | 2019-02-01 | 2021-01-05 | Essen Instruments, Inc. | Label-free cell segmentation using phase contrast and brightfield imaging |
2021
- 2021-11-15 EP EP21824176.8A patent/EP4248423A1/en active Pending
- 2021-11-15 JP JP2023518959A patent/JP2023549631A/en active Pending
- 2021-11-15 CN CN202180071617.0A patent/CN116406468A/en active Pending
- 2021-11-15 KR KR1020237017433A patent/KR20230106629A/en active Search and Examination
- 2021-11-15 WO PCT/US2021/059417 patent/WO2022108884A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022108884A1 (en) | 2022-05-27 |
KR20230106629A (en) | 2023-07-13 |
JP2023549631A (en) | 2023-11-29 |
EP4248423A1 (en) | 2023-09-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||