EP3397951A1 - Systems and methods for real time assay monitoring - Google Patents

Systems and methods for real time assay monitoring

Info

Publication number
EP3397951A1
Authority
EP
European Patent Office
Prior art keywords
assay
sample
specimen
image
fluid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP16826047.9A
Other languages
English (en)
French (fr)
Other versions
EP3397951B1 (de)
Inventor
Javier A. Perez SEPULVEDA
Yu-Heng Cheng
Setareh Duquette
Lisa A. JONES
Chih-Ching Lin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ventana Medical Systems Inc
Original Assignee
Ventana Medical Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ventana Medical Systems Inc
Priority to EP23171353.8A (EP4235598A3)
Publication of EP3397951A1
Application granted
Publication of EP3397951B1
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/0012 Biomedical image inspection
    • G01N 1/312 Apparatus therefor, for samples mounted on planar substrates
    • G01N 21/78 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator producing a change of colour
    • G06T 7/11 Region-based segmentation
    • G06V 10/147 Details of sensors, e.g. sensor lenses
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 10/56 Extraction of image or video features relating to colour
    • G06V 20/695 Microscopic objects, e.g. biological cells or cellular parts; Preprocessing, e.g. image segmentation
    • G01N 1/30 Staining; Impregnating; Fixation; Dehydration; Multistep processes for preparing samples of tissue, cell or nucleic acid material and the like for analysis
    • G06T 2207/10024 Color image
    • G06T 2207/30024 Cell structures in vitro; Tissue sections in vitro
    • G06T 2207/30072 Microarray; Biochip, DNA array; Well plate
    • G06V 2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • The present disclosure generally relates to systems and methods for real time assay monitoring. More particularly, the present disclosure relates to systems and methods for utilizing real time assay monitoring for quality control, repeat testing and reflex testing before sample preparation is completed.
  • An assay is an analytical procedure that can be performed to measure one or more properties associated with a biological specimen, for example, an array of molecules, a tissue section or a preparation of cells.
  • Specimens may be processed for analysis by applying one or more fluids to the specimens.
  • Microscope slides bearing biological specimens may be treated with one or more dyes or reagents to add color and contrast to otherwise transparent or invisible cells or cell components.
  • Immunohistochemical (IHC) and in situ hybridization (ISH) assay staining procedures can be used to process tissue specimens and provide information regarding the presence, location and/or amount of particular molecules in a sample.
  • Assay and platform development as well as commercial assay testing can be costly in terms of time and resources, particularly when tests fail and must be repeated.
  • Tissue staining quality of a specimen undergoing an assay is evaluated by a pathologist only after the assay is completed, and the pathologist does not have any access to the slide before the specimen leaves the assay processing platform. This process can take up to 13 hours for ISH assays.
  • The same experimental conditions can be repeatedly performed to produce results, which are then evaluated by a pathologist, again only after the assay is completed, to ensure consistent outcomes for the assays. Information about where and when any failures in the staining process occurred is unknown to the pathologist, and platform developers are left to run entire batteries of assays to find the root cause of failures that need to be fixed.
  • A digital pathology tool can include a real time monitoring system with automated scoring that can score the slides from an assay.
  • The real time monitoring system can provide a "saturation index," which is a score that correlates to a signal intensity score. By providing the saturation index in real time, the real time monitoring system can be used to evaluate assay quality in real time while the assay is occurring.
  • The generation of the saturation index can be automated in the real time monitoring system and used for various assay monitoring applications, such as the monitoring of assays with various protocols, while the assays are occurring.
  • The results of an assay can thus be obtained in real time, before the assay is complete and before the slide is "coverslipped" and examined by a pathologist.
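The text above does not spell out how the saturation index is computed. As a minimal sketch, assuming the index is simply the mean HSV saturation over a region of interest of the captured image (the function name and 8-bit RGB input are illustrative assumptions, not the disclosed algorithm):

```python
import colorsys

def saturation_index(rgb_pixels):
    """Mean HSV saturation over a region of interest.

    rgb_pixels: iterable of (r, g, b) tuples with 8-bit channel values.
    Returns a score in [0, 1]; a more intensely stained region scores higher.
    """
    total = 0.0
    n = 0
    for r, g, b in rgb_pixels:
        _, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        total += s
        n += 1
    return total / n if n else 0.0

# A strongly stained (brownish, DAB-like) patch scores above a pale one.
stained = [(120, 70, 30)] * 4
pale = [(230, 225, 220)] * 4
print(saturation_index(stained) > saturation_index(pale))  # True
```

Tracking such a score frame by frame while the stain develops is what allows the quality call to be made before the assay finishes.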
  • FIG. 1 schematically shows an embodiment of a real time assay monitoring system.
  • FIG. 2 shows an embodiment of an imaging system and a sample processing system for the real time assay monitoring system depicted by FIG. 1.
  • FIG. 3 schematically shows an embodiment of a controller for the real time assay monitoring system depicted by FIG. 1 .
  • FIG. 4 shows an exemplary image used for boundary detection.
  • FIG. 5 shows another exemplary image used for boundary detection.
  • FIG. 6 shows an embodiment of a process for determining an amount of adjustment fluid for an assay.
  • FIG. 7A shows a first position for the fluid in the system of FIG. 2.
  • FIG. 7B shows a second position for the fluid in the system of FIG. 2.
  • FIG. 7C shows a third position for the fluid in the system of FIG. 2.
  • FIG. 7D shows a fourth position for the fluid in the system of FIG. 2.
  • FIG. 8 shows an exemplary screenshot of a graphical user interface (GUI) displayed by a real time adjustment system.
  • FIG. 9 shows an embodiment of a process for monitoring an assay.
  • FIG. 10 shows an embodiment of an HSV color model.
  • FIG. 11 shows an embodiment of a captured image with a region of interest selected.
  • FIG. 12 shows an embodiment of matrix values corresponding to the region of interest in FIG. 11.
  • FIG. 13 shows an exemplary graph of a correlation between signal intensity scores and saturation indexes.
  • FIG. 14 shows an exemplary graph of a correlation between signal intensity scores and value indexes.
  • FIG. 15 shows an exemplary graph of the saturation index of a specimen over time.
  • FIG. 16 shows another exemplary graph of saturation index of a specimen over time.
  • FIG. 17 shows an embodiment of an image analysis process for monitoring staining in a system.
  • FIG. 18 shows an example of the results for several different schemes of color space conversion followed by conversion to grayscale.
  • FIG. 19 shows a comparison of several schemes of color space conversion followed by conversion to grayscale.
  • FIG. 20 shows a comparison of results obtained using a disclosed automated real-time method for stain intensity scoring based on saturation with intensity scoring through visual pathological scoring.
  • FIG. 21 shows a comparison of results obtained using a disclosed automated real-time method for stain intensity scoring based on color space conversion and grayscale conversion with intensity scoring through visual pathological scoring.
  • FIG. 22 shows a disclosed grayscale intensity index vs. antibody incubation time.
  • FIG. 23 shows a disclosed automated method for real-time calculation of percent positive cells for a CD20 IHC assay.
  • FIG. 24 shows a disclosed automated method for real-time separation of different stain colors in a multiplexed assay.
  • The present application generally pertains to a real time assay monitoring system (RTAMS) that can monitor fluid volume in assays for volume adjustment control, monitor stain process quality in real time, and/or output test results in real time.
  • The disclosed system includes a real time imaging system to obtain images of a sample undergoing a processing step (such as staining, de-staining, bluing or differentiation) to calculate a saturation index that correlates to a signal intensity score.
  • The RTAMS can use the calculated saturation index to monitor the signal intensity of assays in real time and predict assay outcomes before they are complete.
  • The imaging system in the RTAMS can be used to measure the on-site fluid volume with the specimen to control the system to overcome any fluid evaporation issues that may occur in an assay process.
  • The imaging system in the RTAMS can also be applied, for example, to monitor an assay by tracking changes in tissue color(s) and other image-based characteristics to predict assay outcomes or results.
  • The RTAMS can be a developmental tool to develop new reagents, assays, or platforms. Tissue, slide and stain quality can also be tracked in real time for quality assurance, and users alerted early in the process such that remedial measures can be taken.
  • The RTAMS can function as a diagnostic tool, enabling and supporting early digital reporting of patient results before an assay is complete, and even ordering repeat or reflex tests based on the results as they develop.
  • The RTAMS may also serve as a digital pathology tool to support early electronic reporting of assay results and, in some embodiments, could replace the use of scanners for analysis of completed assay results.
  • One aspect of certain embodiments of the disclosed system and method is ensuring stain quality by monitoring and controlling assay outcomes.
  • Another aspect of certain embodiments of the disclosed system and method is the ability to provide a faster result of stain quality in real time, before the assay is complete, and to permit remediation of any quality issues by alerting a user to a possible quality issue, or even by automatically ordering a second test, so that ordering such a test does not have to wait for a pathologist to read the finished assay and request a repeat due to quality issues evident in it.
  • Another aspect of certain embodiments of the disclosed system and method is the ability to optimize newly developed reagents, assays, and platforms to provide assay protocols that take less time or that can be automatically stopped when sufficiently developed, thereby shortening assay time "on-the-fly.”
  • FIG. 1 shows an embodiment of a real time assay monitoring system 10.
  • The system 10 includes a controller 12 that can be used to control an imaging system 15 and a sample processing system 21.
  • The sample processing system 21 can use a thin-film technology with low fluid volumes; however, in other embodiments, the sample processing system 21 can use "puddle" technology wherein reagents are applied directly onto substrates, such as slides, on which a tissue or cell sample is placed.
  • One controller 12 can be used to control all of the components of the imaging system 15 and the sample processing system 21.
  • The controller 12 can include more than one controller controlling one or more components of the imaging system 15 and/or the sample processing system 21.
  • The controller 12 (and other distributed controllers) can be connected to the imaging system 15 (which can include a camera 14, and one or more of a front light source 16 and a back light source 18) and the sample processing system 21 (which can include, for example, one or more of a fluid motion mechanism 20, a fluid exchange system 22 and a fluid dispense system 24) by a network.
  • The network connecting the controller 12, the imaging system 15 and the sample processing system 21 can be a local area network (LAN) that uses an Ethernet protocol to communicate over the network.
  • The network may be the Internet, an Intranet, a wide area network (WAN), or any other type of communication network using one or more communication protocols, such as the transmission control protocol/Internet protocol (TCP/IP) when using the Internet.
  • The camera 14 can be connected to the controller 12 using a GigE Vision interface, but the camera 14 can, in other embodiments, be connected to the controller 12 using other types of interfaces, such as a USB3 Vision or Camera Link interface.
  • The controller 12 can connect with other controllers and workflow control software system solutions, for example, a user alert system 26 or an automated test ordering system 26.
  • The controller 12 can further connect and interface with other Internet applications and imaging applications.
  • FIG. 2 shows a particular embodiment of an imaging system 15 and a sample processing system 21 of the real time assay monitoring system 10 of FIG. 1.
  • The imaging system 15 can include a camera 14, a front light source 16 and a back light source 18, as shown in FIG. 2. However, in other embodiments, the imaging system 15 can include more than one camera 14, more than one front light source 16 and more than one back light source 18. In one embodiment, some or all of the components of the imaging system 15 can be mounted on the sample processing system 21.
  • The imaging system 15 can be used to illuminate and capture images of one or more samples in the sample processing system 21.
  • The sample processing system 21 can include a fluid motion mechanism 20 to move fluid in the sample and a fluid exchange system 22 that has a fluid dispenser 24 (see FIG.
  • The fluid motion mechanism 20 can include a roller.
  • In some embodiments, the sample processing system may not include a fluid motion mechanism 20.
  • The fluid motion mechanism 20 (schematically shown in FIG. 2) can include one or more staining cassettes (not shown) having one or more samples 50 undergoing an assay.
  • The sample processing system 21 can include more than one fluid motion mechanism 20 and more than one fluid exchange system 22. Examples of sample processing systems that can be used with the present application are described in commonly-assigned U.S. Patent Application Publication No.
  • Each of the samples 50 held by cassettes in the sample processing system 21 can include a slide 52 holding one or more specimens 54 to be analyzed by the assay.
  • The sample 50 shown in FIG. 2 is a schematic representation of an assay sample used to show the components in the sample and is not intended to provide any details on the relative sizes of the components.
  • One or more fluids 56, such as reagents and/or stains, can be applied to and/or removed from the specimen 54 with the fluid exchange system 22.
  • The reagents and/or stains 56 can include, but are not limited to, antibody diluent, protease 3, reaction buffer, system fluid, HRP (horseradish peroxidase) inhibitors, antibodies, HQ linker, HRP multimers, H2O2, DAB (3,3'-diaminobenzidine), copper reagent, hematoxylin (HTX), probe reagent and bluing reagent.
  • A cover 58 can then be placed over the specimen 54 and the reagent and/or stain 56.
  • The cover 58 can be a clear or translucent solid plastic or acrylic, but may have different color tints, e.g., a yellow tint, in other embodiments.
  • The cover 58 can also be a clear fluid.
  • The camera 14 can be placed a predetermined distance (d) above the sample 50 such that the sample 50 is within the field of view (FOV) of the camera 14.
  • The camera 14 can be an area scan camera with a global shutter to prevent distortion of the moving object, i.e., the reagent and/or stain 56.
  • However, other types of cameras can be used in other embodiments.
  • The camera 14 can be a 1600 x 1200 pixel (2 megapixel, 2MP) camera with a 35 mm fixed focal length lens that has a field of view of 98.8 x 74.0 mm with about 61.25 µm/pixel resolution.
  • In other embodiments, the camera 14 can have greater than or less than 2 megapixels, a fixed focal length lens that is greater than or less than 35 mm, a field of view that is greater than or less than 98.8 x 74.0 mm, and/or a resolution that is greater than or less than about 61.25 µm/pixel.
  • The camera 14 can have a pixel scale (or resolution) of 0.16 mm or lower.
  • The camera 14 can use a 50 mm fixed focal length lens with a smaller FOV but a higher resolution.
  • The predetermined distance for placement of the camera 14 above the sample 50 can be based on the resolution of the camera 14 and the number of samples 50 to be captured in the field of view of the camera 14. In one embodiment, the predetermined distance can be 19.5 inches to capture three samples 50. However, other predetermined distances can be used in other embodiments. In another embodiment, if more than three samples 50 are to be captured, the camera 14 can use a pixel array with an increased size and a lens with a decreased focal length to maintain the same image quality.
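Reading the field of view above as 98.8 x 74.0 mm (the decimal points appear to have been lost in extraction), the per-pixel scale follows directly from the sensor dimensions; a quick check with a hypothetical helper:

```python
def pixel_scale_um(fov_mm, pixels):
    """Physical size of one pixel (micrometres) along one axis."""
    return fov_mm / pixels * 1000.0

# 1600 x 1200 sensor imaging an assumed 98.8 x 74.0 mm field of view:
print(round(pixel_scale_um(98.8, 1600), 2))  # 61.75
print(round(pixel_scale_um(74.0, 1200), 2))  # 61.67
```

Both axes land close to the "about 61.25 µm/pixel" figure stated above, which supports this reading of the FOV numbers.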
  • The front light source 16 and the back light source 18 can each generate white light that is used to illuminate the sample 50. In some embodiments, the front light source 16 and/or the back light source 18 can be assembled into a lamp for use with a lighting fixture.
  • Each light source may be implemented by an incandescent bulb, a light emitting diode (LED), or a fluorescent light.
  • The front light source 16 can be positioned in the field of view of the camera 14 and direct light (L1) toward one side of the sample 50.
  • The back light source 18 can be positioned outside of the field of view of the camera 14 and direct light (L2) toward the opposite side of the sample 50.
  • In other embodiments, one or both of the front light source 16 and the back light source 18 can be either within or outside of the field of view of the camera 14.
  • FIG. 3 shows an embodiment of the controller 12.
  • The controller 12 can include logic 31, referred to herein as "controller logic," for generally controlling the operation of the controller 12, including communicating with the imaging system 15 and the sample processing system 21.
  • The controller 12 also includes a volume estimator 37 to determine the amount of fluid, e.g., reagent and/or stain 56, being used with a sample 50, an image analyzer 33 to analyze the images from the imaging system 15, and a dispenser volume calculator 35 to determine how much reagent and/or stain 56 to apply to the sample 50 with the fluid exchange system 22 based on information from the volume estimator 37.
  • The controller logic 31, the image analyzer 33, the dispenser volume calculator 35 and the volume estimator 37 can be implemented in software, hardware, firmware or any combination thereof.
  • In one embodiment, the controller logic 31, the image analyzer 33, the dispenser volume calculator 35 and the volume estimator 37 are implemented in software and stored in memory 38 of the controller 12.
  • When implemented in software, the controller logic 31, the image analyzer 33, the dispenser volume calculator 35 and the volume estimator 37 can be stored and transported on any non-transitory computer-readable medium for use by or in connection with an instruction execution apparatus that can fetch and execute instructions.
  • The controller 12 can include at least one conventional processing element 40, which has processing hardware for executing instructions stored in memory 38.
  • The processing element 40 may include a central processing unit (CPU) or a digital signal processor (DSP).
  • The processing element 40 communicates with and drives the other elements within the controller 12 via a local interface 42, which can include at least one bus.
  • An input interface 44, for example a keypad, keyboard or mouse, can be used to input data from a user of the controller 12.
  • An output interface 46, for example a printer, monitor, liquid crystal display (LCD), or other display apparatus, can be used to output data to the user.
  • A communication interface 48 may be used to exchange data over one or more networks with, for example, the front light source 16, the back light source 18, the camera 14, the fluid motion mechanism 20 and the fluid exchange system 22.
  • The imaging system 15 can be used to obtain quality images of the sample 50 for image analysis, volume calculation, and assay sensing.
  • The camera 14 can have sufficient resolution (or distance per pixel) and contrast to capture the fluid edge and the specimen 54 in the sample 50.
  • In some embodiments, the camera 14 can have a higher resolution, i.e., a lower distance per pixel, and a lens with a smaller field of view to capture images of the sample 50 in more detail.
  • The imaging system 15 can be used for fluid volume sensing.
  • The imaging system 15 can use the front light source 16 and the back light source 18 to make the fluid boundaries bright so that the controller 12 can differentiate the fluid, e.g., reagent and/or stain 56, from the specimen 54 in the background, even when the specimen 54 has a color associated with it.
  • The back light source 18 can be placed outside of the field of view of the camera 14 to provide dark field imaging to make the fluid boundary or edge in the sample 50 bright, so the fluid edge or boundary has strong contrast to the dark and normal background.
  • With dark field imaging, several other issues, such as interference from shadows or a pipette blocking a light source, can also be resolved.
  • The front light source 16 and the back light source 18 can be positioned about the sample 50 to provide uniform illumination of the sample 50 so that any determinations by the controller 12 using images from the imaging system 15 are not biased or skewed by lighting.
  • Bright field imaging can also be used by the imaging system 15 by placing the front light source 16 in the field of view of the camera 14.
  • The real time assay monitoring system 10 can be used as a real time fluid adjustment system (RTFAS) to track the fluid (e.g., reagent and/or stain) volume in the sample 50 and determine an amount of fluid to be added to or removed from the sample 50, if any, by the fluid exchange system 22.
  • The RTFAS can use the imaging system 15, the image analyzer 33, the volume estimator 37, the dispenser volume calculator 35, the fluid exchange system 22 and a position signal from the fluid motion mechanism 20.
  • The controller 12 would perform frame checking on the image(s) from the imaging system 15 and suggest an adjustment amount from the dispenser volume calculator 35 to the fluid exchange system 22, forming a feedback control loop.
  • Alternatively, the adjustment amount from the dispenser volume calculator 35 can be provided to a user interface, and a user can then control the fluid exchange system 22 to provide the reagent and/or stain 56 to the sample 50.
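The dispenser volume calculation itself is not detailed in this passage. One minimal sketch of the feedback step, assuming the calculator simply dispenses the shortfall between a target fluid volume and the image-derived estimate (the function name, units and minimum-dispense threshold are hypothetical):

```python
def adjustment_volume_ul(estimated_ul, target_ul, min_dispense_ul=5.0):
    """Volume to dispense so the sample returns to its target fluid volume.

    Returns 0.0 when the shortfall is below the dispenser's minimum step,
    avoiding needless dispense cycles for tiny evaporation losses.
    """
    shortfall = target_ul - estimated_ul
    return shortfall if shortfall >= min_dispense_ul else 0.0

print(adjustment_volume_ul(82.0, 100.0))  # 18.0 -> top up evaporated fluid
print(adjustment_volume_ul(98.0, 100.0))  # 0.0  -> within tolerance, no action
```

Run once per measurement frame, this closes the evaporation-compensation loop described above.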
  • Motion-based foreground detection is used to detect the boundary of a clear fluid, and color-thresholding foreground detection is used to detect the boundary of a stain or colored reagent, e.g., hematoxylin.
  • The boundary detection methodologies used by the RTFAS can use a distinct feature of the fluid (target) for boundary detection and work under various conditions, such as changing specimen color or the existence of random tissue patterns in the specimen 54.
  • A Gaussian mixture model foreground detection algorithm can be used by the RTFAS for boundary detection of a clear fluid.
  • FIG. 4 shows an exemplary image generated by the Gaussian mixture model foreground detection algorithm used for boundary detection.
  • Two boundaries of the fluid droplet (56 of FIG. 2), located on the right and left of the droplet, can be extracted to calculate the fluid volume.
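A Gaussian mixture model is normally fitted per pixel over many frames (OpenCV's `createBackgroundSubtractorMOG2` is a standard implementation). As a dependency-free stand-in for that motion-based idea, the toy detector below flags pixels that deviate from a learned static background, which is where the bright moving fluid edge appears:

```python
def foreground_mask(background, frame, threshold=20):
    """Mark pixels that differ from the learned background by more than
    `threshold` grey levels -- a crude foreground (moving fluid) mask."""
    return [[abs(f - b) > threshold for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

background = [[200, 200, 200, 200]]   # learned static scene (grayscale row)
frame = [[200, 245, 243, 201]]        # bright fluid edge entering the FOV
print(foreground_mask(background, frame))  # [[False, True, True, False]]
```

The left and right extents of the `True` runs play the role of the two droplet boundaries used for the volume calculation.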
  • For a stain or colored reagent, a color-thresholding foreground detection algorithm can be used because of the distinctive color feature of the fluid.
  • The color-thresholding foreground detection algorithm can be used for boundary detection even if the specimen 54 takes on a color similar to the reagent and/or stain 56 during the staining process, because the color intensity of the reagent and/or stain 56 is still much stronger than that of the specimen 54, allowing the algorithm to differentiate the reagent and/or stain 56 from the stained tissue of the specimen 54.
  • The color-thresholding foreground detection algorithm can transfer the captured images from the imaging system 15 to an HSV (hue, saturation, and value) color map (see FIG. 12) for the selection of the proper hue range to extract the region of reagent and/or stain 56.
  • FIG. 5 shows an exemplary image generated by the color-thresholding foreground detection algorithm used for boundary detection of the fluid 56 (of FIG. 2), even when the fluid 56 and tissue sample 54 (of FIG. 2) are of similar colors.
  • Using the color-thresholding foreground detection algorithm, the area of the reagent can be extracted and the fluid volume can be calculated from the extracted area.
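A sketch of that color-thresholding step on a tiny synthetic frame, assuming the stain is isolated by a hue window plus a saturation floor in HSV space; the hue window, saturation floor, pixel scale and fluid-layer thickness below are all illustrative assumptions, not disclosed values:

```python
import colorsys

def stain_pixels(image, hue_range, min_sat):
    """Count pixels whose HSV hue lies in hue_range (fractions of a turn)
    and whose saturation exceeds min_sat -- the extracted stain region."""
    lo, hi = hue_range
    count = 0
    for row in image:
        for r, g, b in row:
            h, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            if lo <= h <= hi and s >= min_sat:
                count += 1
    return count

# 2x3 synthetic frame: two blue-violet "hematoxylin" pixels on a pale background.
frame = [
    [(235, 233, 231), (90, 60, 160), (235, 233, 231)],
    [(235, 233, 231), (80, 50, 150), (236, 234, 232)],
]
n = stain_pixels(frame, hue_range=(0.6, 0.85), min_sat=0.3)  # -> 2
area_mm2 = n * 0.06175 ** 2   # pixel scale assumed from the optics above
volume_ul = area_mm2 * 0.2    # assumed 0.2 mm fluid layer; 1 mm^3 == 1 uL
```

The saturation floor is what keeps pale stained tissue out of the mask even when its hue matches the reagent, mirroring the intensity argument made above.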
  • The controller 12 can be connected to the camera 14 to receive acquired or captured images from the camera 14.
  • The controller 12 can also be connected to a digital I/O device associated with the fluid motion mechanism 20 to receive a digital signal indicative of the sample position of the sample 50 in the staining cassette.
  • The RTFAS can perform image analysis, error checking, and volume calculation to suggest a proper adjustment volume.
  • The images from the camera 14 can be captured at the same sample position and then analyzed for consistent results.
  • The image analysis can be performed on either color or grayscale images.
  • FIG. 6 shows an embodiment of a process for determining an amount of adjustment fluid with an RTFAS.
  • The process begins by configuring the camera 14 (step 102) by setting parameters such as exposure, brightness, and gain.
  • The I/O device can then be configured (step 104).
  • Next, an image is acquired (step 106) from the camera 14.
  • A foreground detection algorithm can be applied to the captured image (step 108) by the image analyzer 33 to identify fluid boundaries.
  • The image analyzer 33 can be continuously provided with images or video in order to identify the image background by machine learning.
  • The image analyzer 33 (see FIG. 3) can remove any noise from the processed image (step 110).
  • An I/O signal check is then made to determine if a signal indicating the proper sample position to make a fluid measurement has been received (step 112). If the proper I/O signal has not been received, the process returns to step 106 to acquire another image.
  • The sample position can be acquired each time an image or frame is acquired in step 106 to identify the position of the reagent and/or stain 56 in the sample 50 (see FIG. 2).
  • The sample position can be determined by the step motor positions in the fluid motion mechanism 20 (of FIG. 1) that move the staining cassette and samples 50 and thereby move the reagent and/or stain 56 in the sample 50.
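The acquire-then-gate loop described here can be sketched with the camera and digital I/O replaced by simulated frame and position sequences; the -300 measurement point matches the right-to-left example in this passage, while the tolerance value and all names are illustrative assumptions:

```python
def monitor(frames, positions, measure_at=-300, tolerance=50):
    """Yield (frame_index, frame) only for frames captured while the
    sample position signal is within tolerance of the measurement point;
    all other frames are skipped, as in the FIG. 6 loop."""
    for i, (frame, pos) in enumerate(zip(frames, positions)):
        if abs(pos - measure_at) <= tolerance:
            yield i, frame

# Simulated position signal sweeping right-to-left past the measurement point.
positions = [4500, 2000, 0, -290, -310, -2000]
frames = ["f0", "f1", "f2", "f3", "f4", "f5"]
print(list(monitor(frames, positions)))  # [(3, 'f3'), (4, 'f4')]
```

Gating the measurement on position is what makes successive volume estimates comparable frame to frame.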
  • the step motor positions and corresponding sample positions can be around +4500, which indicates one end position corresponding to the reagent and/or stain 56 at the right end of the slide 50 (see FIG. 7A).
  • FIG. 7B shows the reagent and/or stain 56 at sample position 0, which corresponds to the center position. When the reagent and/or stain 56 is moving from right to left, as in FIG. 7C, sample position 0 does not correspond to the reagent and/or stain 56 being in the center of the slide 52.
  • the proper sample position would be at a predetermined location relative to the center of the slide (which corresponds to sample position 0) depending on the direction of travel and the viscosity of the reagent and/or stain 56.
  • the reagent and/or stain 56 is at the measurement point, i.e., the reagent and/or stain 56 is in the center of the slide, at sample position -300, when the reagent and/or stain 56 is moving from right to left in the sample 50.
  • FIGS. 7A-7D are schematic representations used to show the position of the reagent and/or stain 56 relative to sample position and are not intended to provide any details on the relative sizes of the components.
  • the reagent and/or stain 56 can be dragging behind the center of the sample position, so the measurement should be taken slightly away from the central point of the sample position.
  • the majority of the reagent and/or stain 56 can be on the left-hand side of the central point when the reagent and/or stain 56 is travelling to the right and the majority of the reagent and/or stain 56 can be on the right-hand side of the central point when the reagent and/or stain 56 is travelling to the left.
  • the RTFAS can be used to characterize the relationship between the motion of the reagent and/or stain 56 and fluid motion mechanism 20, to understand how reagent and/or stain 56 rolls at different rolling speed and rolling volume, and to investigate how different reagents with different viscosities behave during the rolling operation since the RTFAS can acquire images at certain sample positions.
  • the RTFAS can check sample position periodically.
  • a detection mechanism in the fluid motion mechanism 20, which generates the I/O signal, can determine if the sample position passes sample position -300 when moving from sample position +4500.
  • the detection mechanism can adjust the I/O signal to a "1" if the sample position is between -300 and +4500 and adjust the I/O signal to a "0" in other positions.
  • the RTFAS can record or store the I/O signal, and if the previous I/O signal equals 1 and the current I/O signal changes to 0, then the RTFAS knows the reagent and/or stain 56 is moving from a sample position of +4500 and just crossed a sample position of -300, which corresponds to the reagent and/or stain 56 being in the proper position for a measurement.
  • the detection mechanism can send a signal that corresponds to the sample position and the RTFAS can evaluate the signal from the detection mechanism to determine whether the corresponding sample position from the signal is within a predetermined range of the predetermined location of the sample position. For example, the RTFAS can indicate a positive I/O signal if the sample position is between about -200 and -400 when the reagent and/or stain 56 is moving from right to left in the sample 50.
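The I/O edge-detection logic in the preceding bullets can be sketched as follows; the function names are hypothetical, and the sample-position bounds come from the example values above (+4500 and -300):

```python
def io_signal(sample_position):
    """Detection-mechanism model: the I/O signal is 1 between sample
    positions -300 and +4500 and 0 elsewhere."""
    return 1 if -300 <= sample_position <= 4500 else 0

def crossed_measurement_point(prev_signal, curr_signal):
    """The reagent and/or stain has just crossed sample position -300
    (moving from +4500) when the I/O signal falls from 1 to 0."""
    return prev_signal == 1 and curr_signal == 0

# Simulate the stain rolling from the right end (+4500) toward the left
positions = [4500, 2000, 0, -299, -301, -400]
signals = [io_signal(p) for p in positions]
triggers = [crossed_measurement_point(a, b) for a, b in zip(signals, signals[1:])]
```

Only the transition from -299 to -301 raises a trigger, which is when the RTFAS would accept a frame for the volume measurement.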
  • a fluid volume is calculated by the volume estimator 37 (step 114).
  • the volume of the reagent and/or stain 56 can be calculated based on the system (or "ARC") geometry and the measured fluid bandwidth or length, i.e., the distance between the detected fluid boundaries.
  • the calculated volume may have to be calibrated to account for assumptions used in the volume calculation and/or other possible matters that may affect the accuracy of the calculation.
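A minimal sketch of the volume calculation in step 114, assuming the fluid forms a thin rectangular band across the slide. The slide width, film height, and calibration factor are illustrative placeholders, not the actual system ("ARC") geometry:

```python
def estimate_volume_ul(bandwidth_mm, slide_width_mm=25.0,
                       film_height_mm=0.2, calibration=1.0):
    """Rough volume estimate: treat the rolling fluid as a rectangular
    band (measured bandwidth x slide width x film height), scaled by a
    calibration factor to absorb modeling assumptions.
    1 mm^3 equals 1 microliter."""
    return bandwidth_mm * slide_width_mm * film_height_mm * calibration

# A 30 mm fluid band on a 25 mm wide slide with a 0.2 mm film
vol = estimate_volume_ul(bandwidth_mm=30.0)
```

The calibration factor is where empirically measured corrections (per reagent viscosity, rolling speed, etc.) would be folded in.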
  • a frame check is then performed (step 116) to determine if the frame and corresponding volume calculation are acceptable.
  • the frame check can check for errors such as an excessive volume change and check for other abnormal frame conditions such as a pipette blocking the field of view. If the frame or volume calculation is not acceptable, i.e., there is an error or abnormality associated with the frame or the volume calculation, the process returns to step 106 to acquire another image. If the frame and volume calculation are acceptable, an adjustment amount is calculated (step 118) by the dispenser volume calculator 35 and the process returns to step 106 to acquire another image.
  • an adjustment amount should only be determined when the volume calculation is done from a satisfactory image or frame with clear fluid boundaries, as can be judged by the image analyzer 33 of FIG. 3.
  • several different types of events can occur that can affect the accuracy of the volume estimation and thereby affect the calculation of the adjustment amount.
  • a frame with a pipette arm travelling through the field of view may yield an excessive calculated volume.
  • the ratio of the bright pixels in a frame is calculated as part of the frame check in step 116 to ensure that an adjustment amount is not calculated when bright pixels represent more than 50% of the frame. In other words, an acceptable frame has less than 50% bright pixels.
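The bright-pixel frame check can be sketched as follows; the 50% cutoff is from the text, while the brightness threshold of 200 is an assumed value:

```python
def bright_pixel_ratio(pixels, bright_threshold=200):
    """Fraction of pixels at or above the brightness threshold."""
    return sum(1 for p in pixels if p >= bright_threshold) / len(pixels)

def frame_acceptable(pixels, bright_threshold=200, max_ratio=0.5):
    """An acceptable frame has less than 50% bright pixels; more than
    that suggests, e.g., a pipette arm crossing the field of view."""
    return bright_pixel_ratio(pixels, bright_threshold) < max_ratio

dark_frame = [50] * 80 + [220] * 20      # 20% bright: acceptable
pipette_frame = [50] * 30 + [220] * 70   # 70% bright: rejected
```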
  • an accurate volume calculation cannot occur when one part of the fluid boundary is not in the field of view.
  • the fluid boundary may be out of range, i.e., not in the field of view, when the reagent and/or stain 56 has a large volume, such as 200 μL or more, and is moving at a high speed, such as more than 100 mm/s.
  • an accurate volume calculation cannot occur when the foreground analysis of step 108 cannot provide a correct fluid boundary.
  • the RTFAS can compare the previous volume to the current volume. If there is a large difference between the two volumes, the RTFAS can wait until the next measurement point to determine the current volume. In other words, when there is a large difference between two calculated volumes, the frame check in step 116 can reject the volume measurement and return the process to step 106 to acquire a new image.
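The volume-consistency check can be sketched as follows; the 20 μL jump tolerance is an illustrative assumption, not a value from the patent:

```python
def volume_plausible(prev_volume_ul, curr_volume_ul, max_jump_ul=20.0):
    """Accept a new measurement only when it is within `max_jump_ul` of
    the previous one; the first measurement (no prior value) is always
    accepted."""
    if prev_volume_ul is None:
        return True
    return abs(curr_volume_ul - prev_volume_ul) <= max_jump_ul
```

A rejected measurement simply returns the process to step 106; the stored previous volume is kept for the next comparison.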
  • the RTFAS can provide a user interface for a user to monitor the process of FIG. 6.
  • FIG. 8 shows an exemplary screenshot of a user interface displayed by the RTFAS.
  • the user interface 140 displayed by the RTFAS can include four panels to provide information to the user on the process of FIG. 6.
  • a first panel 142 shows the current image acquired by camera 14.
  • a second panel 144 shows the foreground detected using the Gaussian Mixture Model or color-thresholding method.
  • a third panel 146 shows the calculated current volume (over time) based on the detected foreground.
  • a fourth panel 148 shows the calculated adjustment amount (over time) based on a user-input target volume, an offset volume, and the measured volume.
  • a decline of measured volume of about 8 μL can be observed due to the evaporation of the reagent and/or stain 56 during 120 seconds of rolling.
  • the RTFAS can detect the formation of bubbles in the reagent and/or stain 56 and can compensate for the presence of the bubbles in the volume calculation in step 114. If the volume calculation does not compensate for the presence of bubbles, the volume may be overestimated because the bubbles formed in the reagent and/or stain 56 would increase the measured fluid bandwidth. In one embodiment, bubbles may form in the reagent and/or stain 56 when antibody diluent is being used in the sample 50.
  • the circular shape of the bubbles inside the fluid can be used to detect for the presence of the bubbles and then perform compensation for the bubbles.
  • a circle detection scheme can be used to identify any bubbles in the detected foreground of the acquired image. By calculating the numbers of bubbles in the image and giving proper volume compensations for the bubbles, the volume of the reagent and/or stain 56 can be measured more accurately in the presence of bubbles in the reagent and/or stain 56.
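A sketch of the bubble compensation in step 114, assuming a circle-detection step (e.g., a Hough transform) has already reported bubble radii; the thin-film geometry and film height are illustrative assumptions:

```python
import math

def compensate_for_bubbles(measured_volume_ul, bubble_radii_mm,
                           film_height_mm=0.2):
    """Subtract the apparent volume contributed by detected bubbles
    (modeled as circles of the given radii through the full film
    height) from the raw volume measurement."""
    bubble_area_mm2 = sum(math.pi * r * r for r in bubble_radii_mm)
    return measured_volume_ul - bubble_area_mm2 * film_height_mm
```

With no detected bubbles the measurement passes through unchanged; each detected bubble lowers the estimate in proportion to its cross-sectional area.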
  • the RTFAS can perform image acquisition, sample position acquisition, and image analysis in about 0.06 seconds and would have a frame rate of about 16 frames per second.
  • the processing time can be based on the programming language used to perform the image analysis and the performance of the computer used to execute the image analysis. Improvements in processing time may be obtained by using more efficient programming languages or better performing computers.
  • the real time assay monitoring system 10 can also be used to calculate a saturation index for an assay that corresponds to a signal intensity score given by a pathologist analyzing the results of the assay with a microscope at the completion of the assay.
  • the calculated indices can be obtained from changes in colors on the tissue specimen. The changes in color are captured as chromogen colors are deposited on the specimen during a reaction (e.g., during DAB deposition) and during other color uptakes (e.g., dyes and fluorophores used, for example, in multiplexing assays).
  • the system 10 can monitor and measure an index of a reaction in real time.
  • the calculated saturation index can be used to monitor, in real time, the staining process for the samples 50.
  • An example of a staining process that can be used with the present application is described in commonly-assigned U.S. Patent Application Publication No. 2013/0302852, entitled "Hematoxylin Staining Method" and published on November 14, 2013, which is incorporated herein by reference.
  • FIG. 9 shows an embodiment of a process for monitoring the staining process of an assay.
  • the process begins by configuring the camera 14 (step 182) by setting parameters such as exposure, brightness, and gain.
  • an image is acquired (step 184) from the camera 14.
  • Each acquired image can be composed of a matrix with values representing the color for each pixel.
  • the HSV (hue, saturation, value) color model can be used.
  • different color models such as RGB (red, green, blue), L*A*B*, or YCbCr, can be used.
  • the hue index provides information, in the form of numbers, about the color of the specimen 54.
  • the saturation index provides information on the lightness or darkness of the staining.
  • the value index, sometimes called the brightness index, also provides light/dark information on the stain.
  • FIG. 10 shows an embodiment of an HSV color model.
  • the hue index (or value) represents the color.
  • a saturation index (or value) close to zero refers to a very light color close to white.
  • a value index (or value) close to zero refers to a very dark color close to black.
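The HSV decomposition described above can be illustrated with the standard-library `colorsys` module; the pixel values below are invented examples of a DAB-stained (brown) pixel and a near-white background pixel:

```python
import colorsys

def pixel_hsv(r, g, b):
    """Convert an RGB pixel (components in 0-1) to HSV: hue encodes the
    color, saturation the color purity, value the brightness."""
    return colorsys.rgb_to_hsv(r, g, b)

# A DAB-like brown pixel vs. a near-white background pixel (illustrative)
dab_h, dab_s, dab_v = pixel_hsv(0.55, 0.35, 0.15)
bg_h, bg_s, bg_v = pixel_hsv(0.95, 0.95, 0.93)
```

The stained pixel has a much higher saturation index and lower value index than the background, which is what lets the saturation index track stain development.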
  • a region of interest can be selected (step 186) in the captured image.
  • a box 202 can be positioned to correspond to the selected region of interest (ROI) in the acquired image.
  • an ROI can be selected in a region of the tissue being stained either by a user or automatically by the system 10.
  • the same or a different ROI can be selected for each acquired image from one sample 50.
  • the image has a number or index representing the local intensity for each pixel as shown in FIG. 11. The array of different intensities corresponding to the pixels in the ROI can be analyzed and compared to each other.
  • the ROI can be established as the same location of a tissue biopsy that has been placed on different slides.
  • the arrays of the ROIs from the different slides can be compared to each other, either prior to or during the assay process, to provide a baseline. Once the baseline is established, any differences between the arrays of the ROIs of processed samples and the baseline can be attributed to the result of the assay process.
  • a saturation index and a signal intensity score for the selected ROI can be calculated (step 188).
  • the calculated saturation index can be converted to a signal intensity score using a predefined correlation.
  • FIG. 13 shows a graph of the correlation between signal intensity scores and saturation indexes.
  • the correlation between signal intensity scores and saturation indexes can be made experimentally by performing staining procedures with different antibody and DAB incubation times and recording the saturation index for each of the staining procedures just before the end of the staining procedure. The results of each of the staining procedures can then be provided to a pathologist for a signal intensity score which is then correlated to the recorded saturation index.
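The conversion from a measured saturation index to a pathologist-style signal intensity score via a predefined correlation can be sketched as linear interpolation over experimentally recorded pairs; the calibration points below are invented for illustration, not data from the patent:

```python
def saturation_to_score(saturation, calibration):
    """Map a saturation index to a signal intensity score by linear
    interpolation over recorded (saturation, score) pairs; clamp to the
    end scores outside the calibrated range."""
    pts = sorted(calibration)
    if saturation <= pts[0][0]:
        return pts[0][1]
    if saturation >= pts[-1][0]:
        return pts[-1][1]
    for (s0, y0), (s1, y1) in zip(pts, pts[1:]):
        if s0 <= saturation <= s1:
            return y0 + (y1 - y0) * (saturation - s0) / (s1 - s0)

# Hypothetical calibration recorded just before the end of staining runs
calib = [(0.50, 1.0), (0.60, 2.0), (0.68, 3.0), (0.72, 4.0)]
score = saturation_to_score(0.64, calib)
```

The same mapping can be built against the value index instead of (or in addition to) the saturation index, as described below.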
  • the value index can be used instead of the saturation index to generate the signal intensity score.
  • FIG. 14 shows a graph of the correlation between signal intensity scores and value indexes.
  • the correlation between signal intensity scores and value indexes can be made experimentally by performing staining procedures with different antibody and DAB incubation times and recording the value index for each of the staining procedures just before the end of the staining procedure. The results of each of the staining procedures can then be provided to a pathologist for a signal intensity score which is then correlated to the recorded value index.
  • both the saturation index and the value index can both be used to generate a corresponding signal intensity score.
  • the hue index can be used for color detection when multiple colors are used to distinguish multiple assay targets in the same specimen through multiplexing staining procedures since similar colors are encoded close to each other in numeric values.
  • the calculated signal intensity score can be used to evaluate the staining of the specimen (step 189).
  • the calculated signal intensity score can be used to determine if the staining process is proceeding as expected while the staining process is still ongoing. A determination can then be made as to whether the assay had been completed or should be stopped or modified (step 190). If the assay has been completed because the specified incubation time has elapsed or if the assay should be stopped or modified because the signal intensity score indicates a problem with the staining process, the process ends, otherwise the process returns to step 184 to acquire another image.
  • the real time assay monitoring system 10 can be used to ensure tissue staining uniformity.
  • the system 10 can segment the specimen areas into different ROIs and compare their saturation indexes. If there is a declining or increasing trend of saturation indexes, there can be a gradient of the staining signal intensity, which occurs in the case of a non-uniform stained sample.
  • the saturation index value can be normalized to the slide background to ensure that saturation index differences are not obtained from differences in local lighting conditions.
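The uniformity check in the two preceding bullets can be sketched as a trend test over background-normalized ROI saturation indexes; the 0.05 trend tolerance is an illustrative assumption:

```python
def staining_gradient(roi_saturations, tolerance=0.05):
    """Flag a non-uniform stain when the ROI saturation indexes show a
    consistent rising or falling trend larger than `tolerance`."""
    diffs = [b - a for a, b in zip(roi_saturations, roi_saturations[1:])]
    if all(d > 0 for d in diffs) and sum(diffs) > tolerance:
        return "increasing"
    if all(d < 0 for d in diffs) and -sum(diffs) > tolerance:
        return "decreasing"
    return "uniform"
```

Small, non-monotonic fluctuations are treated as uniform; only a sustained gradient across the segmented ROIs flags a non-uniform stain.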
  • the real time assay monitoring system 10 can be used to optimize assay protocols.
  • the real time assay monitoring system 10 can monitor the saturation index in real time at about a frame per minute or less for antibody incubation time optimization while maintaining the DAB incubation time the same for each sample. As shown in FIG. 15, 16 minutes of antibody incubation time results in the saturation index being above 0.7 after 2 minutes during the DAB color reaction, which indicates that the 16 minute antibody incubation time results in the desired signal intensity for stain quality measurement optimization. If the antibody incubation time is shortened to 8 minutes, the saturation index during the DAB color reaction can only saturate at around 0.68.
  • the saturation index can only reach around 0.66.
  • the real time assay monitoring system 10 can also monitor the saturation index in real time at about a frame per minute for DAB/H₂O₂ incubation time optimization while maintaining the antibody incubation time the same for each sample. As shown in FIG. 16, DAB incubation for only 1 minute shows that the saturation index just stops while the saturation index is in a sharply increasing region, as evidenced by the other samples. For 15 minutes of DAB incubation, the saturation index rises above 0.7 after 6 minutes, which may indicate that the extra minutes of DAB incubation time are not necessary.
  • a DAB incubation time of 6-8 minutes may provide better results because the signal is allowed to saturate with time and there is also a time margin of about 2-4 minutes to ensure the signal saturation.
  • the difference in the saturation index from the different assay protocols shows that the real time assay monitoring system 10 can be used to optimize assay protocols, such as antibody incubation time and/or DAB incubation time.
  • the system 10 can discern and measure changes in color during an assay chromogen reaction.
  • the system 10 can discern the presence or absence of color, determine the type of color and distinguish intensity and brightness. By measuring the changes in color during the assay chromogen reaction, the system 10 can be used for assay and platform development and extended to quality control monitoring and workflow monitoring.
  • the system 10 can be equipped to provide a scoring assessment of the stain quality in real time.
  • the stain quality scores provide insight into the assay performance and staining results before the assay is complete.
  • preliminary scores can be stored and/or reported electronically for various purposes.
  • the preliminary scores may aid pathologists and technicians by providing an assessment of the stain quality, initial results of the assay, and preliminary diagnostic assessment of the test case.
  • the system 10 can be used as a digital pathology tool enabling and supporting early digital reporting of patient results to pathologists before the assay procedure is complete.
  • data collected throughout the assay procedure can also be stored as part of the slide's barcode as part of a workflow solution.
  • the system 10 can be used to maintain record keeping of the assay workflow accessible on cloud based workflow software outside of the staining platform.
  • the system 10 can be used as an assay and reagent development tool.
  • the system 10 can measure and profile measurement parameters linked to color change based on experimental testing for chromogens, reagents and antibody development. The measured results can help determine the optimal reagent, antibody, chromogen and counterstain incubation times based on pathologist scoring criteria.
  • the measured results provided by the system 10 enables determining which experimental conditions of antibody, chromogen detection and counterstain reagents incubations are sufficient and necessary in real time for optimal assay performance in the development and validation of the assays.
  • the system 10 can be applied to both fluorescent and non- fluorescent chromogens contingent on having filters that permit visual inspection at appropriate wavelengths.
  • because the system 10 permits color separation, the system 10 can separate multiple different fluorophores and bright field chromogen colors at the same time during multiplexing IHC (immunohistochemistry).
  • multiplexing characterization of staining and validation can be enabled and readily optimized by quantitative parameters obtained with the system 10.
  • the system 10 can be used to implement any experimental manipulation including assessment of bulk reagents and test their impact on stain quality with the scoring algorithm.
  • the scoring algorithm used by the system 10 also enables quality monitoring and evaluation of platform performance.
  • implementation of real time assay monitoring could permit assessment of staining quality linking the potential platform design changes or platform related testing to the potential impact on stain quality for both primary and advanced staining platforms.
  • the system 10 can be used with marketed platforms to monitor consistency in desired stain quality in situations where global customers have varying preferences in stain intensity and hue.
  • the system 10 can enable customers to program stain preference and hue based on a quantitative scale such as through a touch screen.
  • the quantitative scale could serve as a metric for real time monitoring, and evaluating stain preference in a quality controlled approach.
  • the monitoring system 10 can provide an unbiased quantitative parameter to distinguish those settings that could be validated by pathologists.
  • In FIG. 17, an embodiment of a process 300 for image analysis of DAB signal intensity is shown. The process can be used for monitoring stain process progression (for example, for quality control or assay development), which could trigger a user alert, or for providing early results of an assay (such as a threshold % positivity of cells having a particular biomarker), which could trigger the automatic ordering of a reflex test to investigate a correlated biomarker that could aid in a patient diagnosis.
  • the controller of the system triggers 302 image acquisition 304
  • the image analysis system identifies the tissue through a process of basic registration 306, edge detection 308, filtering of noise 310, and formation of a binary mask 312 (a process that can include dilation, filling and image erosion, as is shown in FIG. 18).
  • Panel A of FIG. 18 shows images visually illustrating the process of tissue identification as described with regard to the process of FIG. 17.
  • Panel B of FIG. 18 visually demonstrates the processes of color segmentation and scoring as described in FIG. 17, while further illustrating several embodiments of color space conversions that are possible alternatives.
  • FIG. 19 illustrates additional types of average grayscale and saturation scores that can be generated according to additional embodiments of the disclosed system and method and that can be used to assess stain process progression and quality.
  • Grayscale images contain multiple shades of gray in between black and white. Grayscale index was chosen because each pixel only carries intensity information after colorimetric conversion from RGB color space or another color space. 8-bit grayscale index format converted from RGB color space was applied. This index varies from black as absolute absence of intensity (0 out of 255) to white as absolute presence of intensity (255 out of 255), and thus is inversely proportional to an intensity score provided by pathologist, since a darker signal will receive higher intensity score from pathologists but a lower index value from grayscale. As described and shown in FIGS. 17 and 18, edge detection was used to create a binary mask that separated the section containing tissue from the entire acquired image including some image dilation and erosion.
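The 8-bit grayscale conversion described above can be sketched with the standard ITU-R BT.601 luma weights (the patent does not specify which RGB-to-grayscale weighting was applied, so the weights here are an assumption):

```python
def grayscale_index(r, g, b):
    """8-bit grayscale index from RGB using BT.601 luma weights.
    0 is absolute absence of intensity (black), 255 absolute presence
    (white); a darker stain therefore yields a lower index even though
    a pathologist would assign it a higher intensity score."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)
```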
  • color segmentation was used to separate the stains by colors, which are positive signals, counterstain signal and background.
  • Different color segmentation strategies have been tested, but k-means clustering for setting thresholds in RGB color space works well for the CD20 assay in the puddle environment.
  • FIG. 20 shows the correlation between pathologists' scores and the saturation value in the puddle environment.
  • the R² value is lower than what was seen in the thin film environment.
  • these tests demonstrated that the system can provide interpretive results before the assay ends, since RTAMS calculates the scores at the moment before hematoxylin is dispensed onto the tissue samples, whereas the pathologists' scores were made after the slides had undergone the complete assay protocol.
  • RTAMS can also be used to calculate a percentage of positive cells in the CD20 assay according to the embodiment of FIG. 23.
  • three colors are evident in the images: brown for the DAB signal, light blue for the counterstaining signal and a white background, as is shown in Panel A of FIG. 23.
  • K-means clustering in HSV color space can be used to separate brown from the other colors.
  • threshold setting in RGB color space is used to divide blue from white background. This particular example calculates an index based on pixels instead of cells, wherein the % positive cells is calculated by dividing the number of brown pixels by the sum of brown and blue pixels and multiplying by 100.
  • a possible alternative to this method is to utilize a machine learning method to build a classifier to separate stained cells from non-stained cells and arrive at a percentage of positive cells. Since such measures of % positive cells can be obtained during the assay, it is possible to provide a logic module as part of the automated test ordering system 28 of FIG. 1, wherein if, for a given test, a predetermined number of cells in a sample are positive for a particular marker, a second (and possibly third, fourth or more) test is automatically ordered before the first test is finished.
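The pixel-based % positivity and the reflex-test trigger described here can be sketched as follows; the formula (brown over brown plus blue, times 100) is from the text, while the 30% trigger threshold is an illustrative assumption:

```python
def percent_positive(brown_pixels, blue_pixels):
    """Pixel-based %-positivity: DAB (brown) pixels over all stained
    pixels (brown signal plus blue counterstain), background excluded."""
    total = brown_pixels + blue_pixels
    return 100.0 * brown_pixels / total if total else 0.0

def should_order_reflex_test(pct_positive, threshold=30.0):
    """Logic-module sketch: trigger automatic ordering of a reflex test
    once positivity reaches a predetermined threshold."""
    return pct_positive >= threshold
```

Because the positivity measure is available during the assay, the trigger can fire before the first test finishes, as the bullet above describes.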
  • a test result (for example, a result upon which a particular therapy decision could be made) could be output from controller 12 immediately upon the number of positive cells reaching a predetermined value.
  • Another embodiment of the disclosed real time assay system and method includes a system and method for separating the portion of a sample image that is stained by DAB (brown) and a Red chromogen.
  • Setting a threshold in an RGB channel is no longer a proper method for color detection since both brown and red have their main intensity in the R channel for a DAB/Red assay. Therefore, k-means clustering in various color spaces, including RGB, HSV and L*a*b*, was tested. As a result, k-means clustering in RGB color space was found to be the optimal solution for color detection in the DAB/Red assay.
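A minimal pure-Python k-means sketch for the DAB/Red color segmentation; the pixel values are invented, and seeding one centroid per expected color keeps the example deterministic, whereas a production implementation would use random initialization with restarts:

```python
def kmeans_rgb(pixels, init_idx, iters=10):
    """Tiny k-means in RGB space: assign each pixel to its nearest
    centroid (squared Euclidean distance), then move each centroid to
    the mean of its cluster."""
    centroids = [list(map(float, pixels[i])) for i in init_idx]
    labels = [0] * len(pixels)
    for _ in range(iters):
        for n, p in enumerate(pixels):
            labels[n] = min(
                range(len(centroids)),
                key=lambda j: sum((p[c] - centroids[j][c]) ** 2 for c in range(3)),
            )
        for j in range(len(centroids)):
            members = [p for p, lab in zip(pixels, labels) if lab == j]
            if members:
                centroids[j] = [sum(p[c] for p in members) / len(members)
                                for c in range(3)]
    return labels

# Illustrative pixels: DAB brown, Red chromogen, white background
pixels = ([(110, 70, 40)] * 5 +   # brown
          [(200, 40, 50)] * 5 +   # red
          [(245, 245, 240)] * 5)  # white
labels = kmeans_rgb(pixels, init_idx=(0, 5, 10))
```

The three stain classes separate cleanly in RGB space here; with real images, the cluster whose centroid is darkest and most red-brown would be taken as the DAB signal.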
  • the overall scheme of this embodiment is shown in FIG. 24.
  • a method involving steps a, b, and c means that the method includes at least steps a, b, and c.
  • while steps and processes may be outlined herein in a particular order, the skilled artisan will recognize that the ordering of steps and processes may vary unless a particular order is clearly indicated by the context.

EP16826047.9A 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle Active EP3397951B1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP23171353.8A EP4235598A3 (de) 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562273232P 2015-12-30 2015-12-30
US201662430826P 2016-12-06 2016-12-06
PCT/EP2016/082377 WO2017114749A1 (en) 2015-12-30 2016-12-22 System and method for real time assay monitoring

Related Child Applications (1)

Application Number Title Priority Date Filing Date
EP23171353.8A Division EP4235598A3 (de) 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle

Publications (2)

Publication Number Publication Date
EP3397951A1 true EP3397951A1 (de) 2018-11-07
EP3397951B1 EP3397951B1 (de) 2023-05-24

Family

ID=57796310

Family Applications (2)

Application Number Title Priority Date Filing Date
EP16826047.9A Active EP3397951B1 (de) 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle
EP23171353.8A Pending EP4235598A3 (de) 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP23171353.8A Pending EP4235598A3 (de) 2015-12-30 2016-12-22 System und verfahren zur echtzeit-assaykontrolle

Country Status (7)

Country Link
US (3) US11320348B2 (de)
EP (2) EP3397951B1 (de)
JP (1) JP6843867B2 (de)
CN (1) CN108475328B (de)
AU (1) AU2016382837B2 (de)
CA (2) CA3161884A1 (de)
WO (1) WO2017114749A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6908624B2 (ja) * 2016-04-27 2021-07-28 ベンタナ メディカル システムズ, インコーポレイテッド リアルタイム体積制御するためのシステムおよび方法
WO2018026725A1 (en) * 2016-08-01 2018-02-08 Genprime, Inc. System and method to interpret tests that change color to indicate the presence or non-presence of a compound
CN117058415A (zh) 2016-10-28 2023-11-14 贝克曼库尔特有限公司 物质准备评估系统
EP3644044B1 (de) * 2018-10-24 2020-12-23 Leica Biosystems Imaging, Inc. Kamerabelichtungssteuerung bei der erfassung von fluoreszenz-in-situ-hybridisierungs-bildern
CN112666047B (zh) * 2021-01-14 2022-04-29 新疆大学 一种液体粘度检测方法
CN114397929B (zh) * 2022-01-18 2023-03-31 中山东菱威力电器有限公司 一种可以改善冲洗水初始温度的智能马桶盖控制系统

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5625706A (en) * 1995-05-31 1997-04-29 Neopath, Inc. Method and apparatus for continously monitoring and forecasting slide and specimen preparation for a biological specimen population
US6195451B1 (en) 1999-05-13 2001-02-27 Advanced Pathology Ststems, Inc. Transformation of digital images
US20020155587A1 (en) * 2001-04-20 2002-10-24 Sequenom, Inc. System and method for testing a biological sample
AU2002322033A1 (en) * 2001-06-04 2002-12-16 Ikonisys Inc. Method for detecting infectious agents using computer controlled automated image analysis
US6589180B2 (en) * 2001-06-20 2003-07-08 Bae Systems Information And Electronic Systems Integration, Inc Acoustical array with multilayer substrate integrated circuits
US20040033163A1 (en) * 2001-11-26 2004-02-19 Lab Vision Corporation Automated tissue staining system and reagent container
CN100370247C (zh) * 2002-05-13 2008-02-20 松下电器产业株式会社 生物样本的活动信号测量装置和测量方法
US20040009518A1 (en) 2002-05-14 2004-01-15 The Chinese University Of Hong Kong Methods for evaluating a disease condition by nucleic acid detection and fractionation
US7193775B2 (en) * 2002-05-30 2007-03-20 Dmetrix, Inc. EPI-illumination system for an array microscope
AU2003245499A1 (en) * 2002-06-14 2003-12-31 Chromavision Medical Systems, Inc. Automated slide staining apparatus
US20060073074A1 (en) * 2004-10-06 2006-04-06 Lars Winther Enhanced sample processing system and methods of biological slide processing
US20050136549A1 (en) * 2003-10-30 2005-06-23 Bioimagene, Inc. Method and system for automatically determining diagnostic saliency of digital images
US20080225072A1 (en) * 2007-03-15 2008-09-18 Jena Marie Klees Calibration of drop detector and acquisition of drop detect data for nozzles of fluid-ejection mechanisms
EP2362228B1 (de) 2007-07-10 2013-09-18 Ventana Medical Systems, Inc. Vorrichtung und Verfahren zur Verarbeitung biologischer Proben
CA2604317C (en) * 2007-08-06 2017-02-28 Historx, Inc. Methods and system for validating sample images for quantitative immunoassays
EP2217132B1 (de) * 2007-11-02 2013-05-15 The Trustees of Columbia University in the City of New York Einführbares chirurgisches bildgebungsgerät
US20100144055A1 (en) * 2008-11-07 2010-06-10 Nanosphere, Inc. Assays For Clinical Assessments of Disease-Associated Autoantibodies
US8150668B2 (en) * 2009-02-11 2012-04-03 Livermore Software Technology Corporation Thermal fluid-structure interaction simulation in finite element analysis
WO2010117025A1 (ja) * 2009-04-10 2010-10-14 Hitachi Medical Corporation Ultrasonic diagnostic device and method for constructing a blood flow dynamics distribution image
TWI390970B (zh) * 2009-07-22 2013-03-21 Altek Corp Using motion detection to adjust a digital camera's shooting settings
JP5378271B2 (ja) * 2010-03-11 2013-12-25 Sysmex Corporation Smear specimen staining apparatus, smear specimen preparation apparatus, smear specimen processing system, and method for determining staining conditions
US8462388B2 (en) * 2010-06-08 2013-06-11 Xerox Corporation Identifying a color separation wherein a banding defect originates
US8673643B2 (en) * 2010-11-30 2014-03-18 General Electric Company Closed loop monitoring of automated molecular pathology system
US8521495B2 (en) * 2010-12-10 2013-08-27 The Boeing Company Calculating liquid levels in arbitrarily shaped containment vessels using solid modeling
US20130302852A1 (en) 2011-01-10 2013-11-14 Ventana Medical Systems, Inc. Hematoxylin Staining Method
WO2013167139A1 (en) * 2012-05-11 2013-11-14 Dako Denmark A/S Method and apparatus for image scoring and analysis
DE102012216336B4 (de) 2012-09-13 2018-12-13 Leica Biosystems Nussloch Gmbh Method for staining a histological sample, and automated staining machine
US9989448B2 (en) 2012-12-26 2018-06-05 Ventana Medical Systems, Inc. Specimen processing systems and methods for holding slides
US9366493B2 (en) * 2014-01-08 2016-06-14 Trackingpoint, Inc. Precision guided handgun and method
JP6246030B2 (ja) * 2014-03-12 2017-12-13 Toshiba Medical Systems Corporation Pathological staining apparatus and pathological staining method
JP5968944B2 (ja) * 2014-03-31 2016-08-10 Fujifilm Corporation Endoscope system, processor device, light source device, and methods for operating the endoscope system, the processor device, and the light source device
JP2015198866A (ja) * 2014-04-10 2015-11-12 Seiko Epson Corporation Fluid ejection device

Also Published As

Publication number Publication date
CN108475328A (zh) 2018-08-31
CA3007159A1 (en) 2017-07-06
JP6843867B2 (ja) 2021-03-17
WO2017114749A1 (en) 2017-07-06
US11854196B2 (en) 2023-12-26
AU2016382837B2 (en) 2019-08-08
US20220090995A1 (en) 2022-03-24
US20220148175A1 (en) 2022-05-12
US20190094116A1 (en) 2019-03-28
CA3007159C (en) 2022-08-23
JP2019507329A (ja) 2019-03-14
AU2016382837A1 (en) 2018-06-14
CN108475328B (zh) 2022-05-03
EP3397951B1 (de) 2023-05-24
US11320348B2 (en) 2022-05-03
EP4235598A2 (de) 2023-08-30
EP4235598A3 (de) 2023-09-13
CA3161884A1 (en) 2017-07-06

Similar Documents

Publication Publication Date Title
US11854196B2 (en) System and method for real time assay monitoring
US11016006B2 (en) System and method for real-time volume control
US8160348B2 (en) Methods and system for validating sample images for quantitative immunoassays
JP6594294B2 (ja) 顕微鏡画像の画像品質評価
US11531032B2 (en) Methods for measuring analyte and/or protein in biological samples
RU2504777C2 (ru) Способ клеточного анализа пробы при помощи виртуальной аналитической пластинки
US8744827B2 (en) Method for preparing a processed virtual analysis plate
US20220078340A1 (en) Microscopy System and Method for Generating an Overview Image

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180724

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200508

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06V 20/69 20220101ALI20221114BHEP

Ipc: G06V 10/56 20220101ALI20221114BHEP

Ipc: G06V 10/25 20220101ALI20221114BHEP

Ipc: G06V 10/147 20220101ALI20221114BHEP

Ipc: G06T 7/00 20170101ALI20221114BHEP

Ipc: G01N 1/30 20060101ALI20221114BHEP

Ipc: G01N 21/78 20060101AFI20221114BHEP

INTG Intention to grant announced

Effective date: 20221206

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

RIN1 Information on inventor provided before grant (corrected)

Inventor name: LIN, CHIH-CHING

Inventor name: JONES, LISA A.

Inventor name: DUQUETTE, SETAREH

Inventor name: CHENG, YU-HENG

Inventor name: SEPULVEDA, JAVIER A. PEREZ

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602016079617

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1569798

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230615

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20230524

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1569798

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230524

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230925

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230824

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230924

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230825

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231121

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231122

Year of fee payment: 8

Ref country code: DE

Payment date: 20231121

Year of fee payment: 8

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602016079617

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: CH

Payment date: 20240101

Year of fee payment: 8

26N No opposition filed

Effective date: 20240227

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230524