WO2015172025A1 - Systems and methods for the detection, analysis, isolation and/or harvesting of biological objects - Google Patents

Systems and methods for the detection, analysis, isolation and/or harvesting of biological objects

Info

Publication number
WO2015172025A1
Authority
WO
WIPO (PCT)
Prior art keywords
tip
stage
tool
location
interaction
Prior art date
Application number
PCT/US2015/029892
Other languages
English (en)
Inventor
George F. Muschler
James MONNICH
Edward KWEE
Kimberly A. Powell
Edward E. Herderick
Cynthia A. BOEHM
Thomas R. Adams
Robert GERMANOSKI
Frank KRAKOSH, III.
James Dunn
Daniel Bantz
Original Assignee
The Cleveland Clinic Foundation
Parker Hannifin Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Cleveland Clinic Foundation and Parker Hannifin Corporation
Priority to EP15727490.3A (EP3140662B1)
Priority to US15/309,712 (US10564172B2)
Publication of WO2015172025A1
Priority to US16/741,864 (US11579160B2)
Priority to US18/108,738 (US20230184804A1)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/10 - Devices for transferring samples or any liquids to, in, or from, the analysis apparatus, e.g. suction devices, injection devices
    • G01N35/1009 - Characterised by arrangements for controlling the aspiration or dispense of liquids
    • G01N35/1011 - Control of the position or alignment of the transfer device
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B01 - PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L - CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L3/00 - Containers or dishes for laboratory use, e.g. laboratory glassware; Droppers
    • B01L3/02 - Burettes; Pipettes
    • B01L3/0241 - Drop counters; Drop formers
    • B01L3/0262 - Drop counters; Drop formers using touch-off at substrate or container
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B01 - PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L - CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L9/00 - Supporting devices; Holding devices
    • B01L9/56 - Means for indicating position of a recipient or sample in an array
    • C - CHEMISTRY; METALLURGY
    • C12 - BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M - APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M41/00 - Means for regulation, monitoring, measurement or control, e.g. flow regulation
    • C12M41/48 - Automatic or computerized control
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness, e.g. of sheet material
    • G01B11/0608 - Height gauges
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/14 - Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B01 - PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L - CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L2300/00 - Additional constructional details
    • B01L2300/06 - Auxiliary integrated devices, integrated components
    • B01L2300/0627 - Sensor or part of a sensor is integrated
    • B01L2300/0654 - Lenses; Optical fibres
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/10 - Devices for transferring samples or any liquids to, in, or from, the analysis apparatus, e.g. suction devices, injection devices
    • G01N35/1009 - Characterised by arrangements for controlling the aspiration or dispense of liquids
    • G01N35/1011 - Control of the position or alignment of the transfer device
    • G01N2035/1013 - Confirming presence of tip

Definitions

  • Patent Application No. 61/990387, filed May 8, 2014, and entitled AUTOMATED SYSTEM AND METHOD FOR DETECTION, ISOLATION AND HARVESTING OF BIOLOGICAL OBJECTS, which is incorporated herein by reference in its entirety.
  • FIG. 1 depicts an example of a block diagram of a system to facilitate detection, isolation and harvesting of biological objects.
  • FIG. 2 depicts a plan view of an example of part of a system showing spatial relationships between a movable tip and a stage.
  • FIG. 3 depicts the system of FIG. 2 with the stage in a position for part of a calibration process.
  • FIG. 4 depicts the system of FIG. 2 showing the stage in another location associated with the calibration process.
  • FIG. 5 depicts the system of FIG. 2 with the tool arm moved to a location for receiving a new tip.
  • FIG. 6 depicts an example of the system of FIG. 2 demonstrating part of a tip sensing process.
  • FIG. 7 depicts an example of the system of FIG. 2 demonstrating another part of the tip sensing process of FIG. 6.
  • FIG. 8 depicts an example of a non-contact tip sensing system that can be utilized for detecting the tip location with respect to axes of the stage.
  • FIGS. 9 and 10 depict examples of a Z-axis sensor to detect tip height along the Z axis.
  • FIG. 11 depicts an example of a topographical map of a surface that can be detected using the tip height sensing of FIGS. 9 and 10.
  • FIG. 12 is a flow diagram depicting an example of a calibration method.
  • FIG. 13 depicts an example of a method for interacting with an object identified in an image.
  • FIG. 14 depicts an example of a method to determine the tip height in association with identifying a location in an image to enable interaction with the object.
  • FIG. 15 depicts examples of images of objects on a stage associated with a sequence of interactions for transferring identified biological objects.
  • FIGS. 16A and 16B depict an example of another harvest sequence of interactions that can be implemented.
  • FIG. 17 depicts an example of part of an interaction protocol for harvesting biological objects that is updated dynamically based on post-interaction image data.
  • the system can include an imaging subsystem, a tool subsystem, a stage subsystem and a control system.
  • the control system can integrate controls for each of the other subsystems, which controls can be interdependent to implement desired functions over a variety of process parameters.
  • a method can include identifying at least one object of interest based on image data representing the object of interest residing in medium, such as acquired by the imaging system.
  • An image analysis function which can be part of the control system or include separate analytics, can be configured to analyze the acquired image data to determine a distribution of pixels or voxels of an image that corresponds to the object of interest.
  • the control system can control one or more forms of interaction with the object of interest based on the distribution of pixels or voxels.
  • the control can further be updated dynamically during the interaction based on image data acquired during such interaction to enable corresponding adjustments to one or more of the controls.
  • the tool system may include tools that are used to interact with the medium or interact directly with the biological object.
  • the interaction with the medium may include addition or removal of medium elements resulting in a change in composition of the medium.
  • the interaction with the medium may also include mechanical or biophysical modification of the medium conditions such as through movement, agitation, heating, cooling or application of external biophysical stimuli (e.g. light energy or electromagnetic fields).
  • Interaction can also include use of a tool having an interior space through which fluid or medium may flow, and the use of this tool to locate and aspirate medium using a fluidic system.
  • Interaction, including but not limited to aspiration, may be used to sample, harvest, move, remove or kill biological objects (e.g., cells or colonies of cells) or other material.
  • the interaction can include acquiring one or more images, which can be analyzed to determine information and process parameters. Images taken at two points in time may also be used to detect and measure changes that have taken place in the biological object or objects as a result of the interaction.
  • FIG. 1 depicts a block diagram of an example of an automated system 10 for detection, sequential analysis, isolation and harvesting of objects.
  • the object being harvested can correspond to a cell, a group of cells including a colony or a collection of cells corresponding to a population of cells, or cells forming a tissue. While many examples herein describe the objects as including cells or colonies of cells, it is to be understood and appreciated that the system and methods are not limited in application to cells as images of other types of objects can also be processed according to an aspect of the present invention.
  • the objects can also include microorganisms (e.g., bacteria, protozoa), cell nuclei, cell organelles, viruses, and non-biological structures (e.g., proteins, peptides or the like).
  • the system 10 includes a control system 40 that is programmed to control the various subsystems of the system including a stage motion system 16, imaging system 20, and the automated tool system 26 (e.g., further including tool function system 30 and tool motion system 28).
  • the control system 40 includes an imaging control 42, stage motion control 44, tool motion control 46 and tool function control 48.
  • the control system 40 provides an integrated control method to coordinate control of the various system components 20, 26 and 16 to perform a variety of functions, as disclosed herein. While in the example of FIG. 1, for purposes of explanation, each of the respective control blocks 42, 44, 46 and 48 is demonstrated as a separate module (e.g., program code blocks executable by a processor), it is to be understood and appreciated that such controls could be combined or further distributed.
  • the system 10 includes a stage 12 that supports the objects of interest 14.
  • the position and movement of the stage 12 in a defined stage coordinate system can be controlled by a stage motion system 16.
  • the stage motion system 16 can include an arrangement of motors connected to drive the stage 12 to a desired position in accordance with one or more process parameters.
  • the stage motion system 16 can include an arrangement of linear motors configured to adjust the position of the stage 12 along two or more orthogonal axes (e.g., X and Y axes).
  • the objects 14 that are positioned on the stage can correspond to plated cells residing in a known medium, such as can be a liquid medium or a viscous or solid medium (e.g., alginate, methylcellulose, hyaluronan, or other hydrogel composed of natural or synthetic polymeric gel materials or the like).
  • marker criteria can be utilized to differentiate or optically label structures and/or chemical features of objects in a given image. Examples of some marker criteria include staining (e.g., with one or more dyes), employing an immunochemical marker, a histochemical marker, an in situ hybridization marker, or another marker that can be used to optically identify objects and/or chemical features of objects in an image.
  • AP staining can be employed as a mechanism to analyze undifferentiated progenitor cells, for example.
  • AP and DAPI stains can be utilized together on a given sample, such as for locating cells (e.g., via DAPI or other staining) as well as analyzing performance characteristics of certain types of cells (e.g., via AP or other staining).
  • other stains or dyes can also be utilized in addition to or as an alternative to AP and/or DAPI staining.
  • the particular types of staining employed can vary according to the types of objects being selectively stained. For example, specific types of cells, different components within cells (e.g., organelles, proteins or other molecules) as well as other objects can be selectively stained and provided as the one or more objects of interest.
  • Markers can also be employed to identify (e.g., via image analysis 24) chemical features or morphologic features in the matrix materials around and near the cells, which further can be used to characterize and assess biological identity or performance of the adjacent cells.
  • chemical features may include proteins or other chemical compounds that may be secreted or deposited by cells.
  • Morphologic features near the cells can include supercellular features (e.g., collections of cells into geometric structures, such as tubular and rosette shapes), minerals (e.g., calcium-based compounds) formed near the cells, fibrous proteins formed near cells, as well as the size and configuration of junction points between cells, to name a few.
  • morphological features can include physical features of cells or groups of cells, such as size, shape, optical density, auto-fluorescence, presence or absence of cell surface markers, presence or absence of specific extracellular matrix components and/or presence of specific enzymatic activity.
  • groups of cells can be identified as functionally related to one another as a biologically relevant group (e.g., a group of cells likely sharing a common ancestral cell (colonies), a group of cells responding (or not) to a specific signal, or cells meeting (or not meeting) metrics defining a desired range or constellation of features).
  • the medium can be a standard medium or it can be user selected and information about the medium (e.g., material properties, optical properties and the like) can be entered into the system 10 via a corresponding user interface 18.
  • the user interface 18 can be programmed to provide a human machine interface through which one or more users can interact with the system 10. Such human-machine interactions by the user can include setting operating parameters and thresholds utilized by the system. The human-machine interactions can also include remotely controlling the positioning of the stage along one or more of its axes.
  • the user interface 18 can also be employed to define properties, interaction protocols, and/or process parameters associated with the objects, the medium for the objects or other parameters or criteria that can be utilized in conjunction with the detection, isolation and/or harvesting of the objects from the stage 12.
  • the imaging system 20 is configured to acquire image data 22 that includes one or more images collected from the stage.
  • the images can be static images captured at an instantaneous time and/or video images captured over a time interval.
  • the imaging system 20 can include a digital camera 60, such as can include an arrangement of one or more digital sensors, analog sensors, charge-coupled device (CCD) sensors, complementary metal oxide semiconductor (CMOS) sensors, or charge injection device (CID) sensors.
  • the arrangement of optical sensors may be implemented as a two-dimensional array.
  • the one or more sensors for instance, can be implemented in a digital camera 60 or the sensors could be a special purpose imaging device.
  • the imaging system 20 provides an output signal corresponding to image data in response to detecting light (e.g., visible and/or non-visible light).
  • the imaging system 20 can be configured to operate in an automated manner to acquire images of the stage and store the corresponding image data 22 in memory.
  • the imaging system 20 is demonstrated as also including optics 58 that can control the field of view and resolution.
  • the imaging system can also include one or more lights and/or filters 62 that can be utilized to change the type of illumination and/or the filtering applied for removing selected wavelengths of light.
  • the optics 58 can be selected to refine an object field of view, which in turn can be passed to the camera 60 for capturing the corresponding digital image.
  • the camera 60 can be implemented as a digital camera that can be attached to a microscope containing the optics 58 for capturing at least a portion of the field of view as pixels having values that collectively form a corresponding image.
  • control system 40 can include metadata with the image data specifying parameters of the imaging system 20, such as image resolution, time and date information, the associated optics setting(s) as well as an indication of the light source and/or filtering that is utilized for the captured image.
  • control system 40 can provide location information in the metadata 23 that represents other state parameters for the system, including a spatial location (e.g., in stage coordinates in two or more dimensions) for each captured image.
  • location metadata for each image can represent the spatial coordinates of the stage at a time when each respective image is acquired, such as corresponding to or derived from linear position encoders that provide absolute position for the stage along its respective axes.
  • the location coordinates can be stored in the metadata 23 to identify the spatial position in stage coordinates that has been converted to an image coordinate for one or more predetermined pixels (e.g., at a center, along a perimeter or other locations) in the captured image, such as by registering the spatial coordinates of the stage to a predetermined location or other marker on the stage. Since the spatial location of the stage is known for each image (e.g., from absolute encoder data provided by respective encoders in response to position of the stage) and an offset between the tool and at least one pixel in the field of view is also known with respect to stage coordinates, as disclosed herein, the stage can be moved relative to the tool, to accurately position the tool in alignment for interaction with an object in each respective image.
  • the spatial offset between the tip of a tool and optical system can be employed to move the stage relative to the tip to position the tip in axial alignment with the target object.
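As a sketch of the mapping described above, from a pixel in a captured image to stage coordinates for tool alignment, the stored stage position for a reference pixel, the image scale, and the calibrated tip-to-optical offset combine as follows (all names, keys and units here are illustrative assumptions, not from the patent):

```python
def pixel_to_stage(px, py, meta, um_per_px, tip_offset):
    """Map an image pixel (px, py) to stage coordinates for tool alignment.

    meta: dict with the stage (x, y) position (micrometers, from the
          encoders) recorded for the image's reference pixel, plus the
          reference pixel itself (e.g., the image center).
    um_per_px: image scale implied by the optics/camera configuration.
    tip_offset: calibrated tip-to-optical spatial offset (dx, dy).
    """
    ref_px, ref_py = meta["ref_pixel"]
    # Stage position of the pixel under the optics.
    sx = meta["stage_x"] + (px - ref_px) * um_per_px
    sy = meta["stage_y"] + (py - ref_py) * um_per_px
    # Shift by the tip-to-optical offset so the tip, not the optics,
    # lands over the target when the stage moves to (tx, ty).
    tx, ty = sx + tip_offset[0], sy + tip_offset[1]
    return tx, ty
```

For example, a pixel 100 px right of center in an image taken at stage position (1000, 2000) with a 1 um/px scale and a (50, -30) offset yields a stage target of (1150, 1970).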
  • Other machine state information that can be part of the metadata 23 can include time, current position of all monitor components (e.g., from respective position encoders).
  • the imaging system 20 can be configured to acquire a temporal sequence of images for a localized set of one or more objects.
  • serially acquired images can be used to identify changes that occur between two or more serial images.
  • the identified changes can be interpreted as biological events of functional significance (e.g., proliferation, migration, change in physical, chemical, anabolic, catabolic or secretory properties) at the level of an individual cell or at the colony level, for example.
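One simple way to flag changes between two serially acquired images is a thresholded per-pixel difference; this is a hypothetical sketch (the patent does not specify the change-detection algorithm), operating on small grayscale images given as 2-D lists of intensities:

```python
def changed_regions(img_a, img_b, threshold=10):
    """Boolean mask of pixels whose intensity changed by more than
    `threshold` between two equally sized serial grayscale images."""
    mask = []
    for row_a, row_b in zip(img_a, img_b):
        mask.append([abs(a - b) > threshold for a, b in zip(row_a, row_b)])
    return mask

def fraction_changed(mask):
    """Fraction of pixels flagged as changed, e.g., as a crude score
    for migration or proliferation between two time points."""
    flat = [p for row in mask for p in row]
    return sum(flat) / len(flat)
```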
  • the image data 22 can include values for pixel or voxels in each image as well as process related data and/or metadata, such as timestamps, temperature, pressure, medium conditions, and the like.
  • An image analysis module 24 can be programmed to apply preprogrammed analytics to analyze and evaluate the imaging data that is acquired to enable detection and isolation of the biological objects of interest.
  • An example of an image analysis that can be utilized is shown and described in U.S. Patent No. 8,068,670, which is incorporated herein by reference.
  • the '670 patent also discloses an approach for stitching a spatially contiguous sequence of images together to create a corresponding montage image that includes a plurality of adjacent field of views. Metadata, including spatial coordinates, can be stored with each of the individual images that have been stitched together to form the corresponding montage image to facilitate interaction with objects that may be identified at different parts of the montage image.
  • the pixels or voxels for a given image can represent imaging data acquired by the imaging system 20 from one or a plurality of imaging domains or imaging modalities (e.g. bright field microscopy, phase contrast, Nomarski imaging, fluorescence microscopy, or the like).
  • multimodal imaging parameters can enable changes in individual and collective objects to be measured, for example, with respect to transmitted light, phase contrast, fluorescence, and other spectral features.
  • the image analysis 24 can employ a mathematical algorithm to integrate imaging data from one or more domains for characterization and interpretation of values in the pixels or voxels (e.g. a ratio of brightness or product of brightness).
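The ratio-of-brightness integration mentioned above can be sketched as follows (an illustrative assumption about how two imaging domains might be combined per pixel; `eps` is added here to keep dark pixels from dividing by zero):

```python
def brightness_ratio(img_a, img_b, eps=1e-6):
    """Combine two imaging domains (e.g., two fluorescence channels,
    given as equally sized 2-D lists of floats) into a per-pixel
    brightness ratio image."""
    return [[a / (b + eps) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]
```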
  • the image analysis 24 can also be configured to provide summary analysis reports itemizing features of individual cells, groups of cells and/or the relationship of individual cells to defined groups within each of the acquired images.
  • the resulting analysis data can also be stored as part of the image data 22 to facilitate further isolation and harvesting of biological objects.
  • the image analysis 24 can be programmed to quantify changes at the level of individual biological objects including, for example, with respect to attachment, migration, survival, metabolic activity, protein secretion, expression of surface markers, and/or morphological changes. Additionally or alternatively, the image analysis 24 can quantify changes at the level of the entire population of biological objects with respect to attachment, migration, survival, metabolic activity, protein secretion, expression of surface markers, and/or morphological changes.
  • the system 10 can also include the automated tool system 26 that is configured to interact with one or more selected objects 14 based at least in part on the distribution of pixels or voxels that is part of the image data 22.
  • the automated tool system 26 can be configured to position one or more tools in a location for interacting with medium or one or more of the biological objects 14 disposed on or in such medium.
  • the automated tool system 26 further can include a tool motion system 28 that is configured to adjust the position of a tool or more than one tool relative to the stage 12.
  • the tool motion system 28 can include an arrangement of linear motors, rotary motors and/or other types of actuators
  • an arrangement of linear motors can be provided to enable precise positioning of the tool along three orthogonal axes.
  • the tool motion system 28 can move the tool, including a tip of such tool, along a z-axis that is orthogonal to the surface of the stage to enable interaction between the tip of the tool and the media and/or biological objects disposed thereon. Additionally or alternatively, the tool motion system 28 can move the stage along X and/or Y axes such as to provide for positioning the tool in three-dimensional space in response to control instructions from the control system to the motion system 28.
  • control system 40 can provide instructions, corresponding to motion data 36, to the motion system 28 to move the tool to a predefined location (e.g., tool reference coordinates) having a predefined position in the coordinate system of the stage 12 (or in some other common coordinate system), which can be utilized for accurately positioning the one or more tools with respect to a target location identified in one or more images.
  • Calibration disclosed herein between the tool system 26 and the image system 20 enables precise positioning of the tool with respect to locations identified in captured images.
  • the control system 40 can implement a calibration function 54 for the system 10.
  • the system calibration function 54 can determine a spatial offset between a tip of a tool of the automated tool system 26 and an optical location within the field of view of the imaging system 20.
  • the spatial offset can be stored in memory and utilized by the control system 40 to position the stage 12 accurately and reproducibly relative to the tool tip.
  • the images can be stored in the image data 22 associated with corresponding metadata 23, including position metadata.
  • since the metadata 23 associated with each respective image can include a corresponding spatial position of the stage associated with one or more pixels thereof (e.g., a center pixel or a predetermined edge pixel), accurate coordinates of an object identified with respect to one or more pixels in the image can be readily determined with respect to the stage, thereby enabling the control system 40 to adjust the position of the stage into alignment with a desired tip of the automated tool system 26.
  • the system calibration function 54 is programmed to control the stage motion system 16 to move a plate or other tray under the tool tip that has been positioned at its predefined reference location.
  • An ink or other transferable marker material can be applied to the tip to provide corresponding indicia.
  • the indicia can be applied to the stage or an object disposed thereon by moving the tip carrying a marker material (e.g., ink or other marker material) into contact with the surface of the stage, for example.
  • the position of the stage (e.g., in two- or three-dimensional space) is recorded when the indicia is applied. The stage carrying the applied indicia is then moved under the microscope until the applied marker is located, and a new stage position is recorded.
  • This tip-to-optical offset information is used to calibrate the system to enable precise and reproducible positioning of the tool with respect to targets on a corresponding set of one or more well plates identified in the captured images.
  • the tip-to-optical offset calibration can be performed for each set of plates or other apparatuses that are positioned on the stage.
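The ink-mark procedure above amounts to differencing two recorded stage positions: one when the tip deposits the mark, one when the mark sits at the optical location in the field of view. A minimal sketch (function and argument names are assumptions for illustration):

```python
def tip_to_optical_offset(stage_at_mark_under_tip, stage_at_mark_under_optics):
    """Tip-to-optical spatial offset from the two stage (x, y)
    positions recorded during the ink-mark calibration: first when
    the tip applies the mark, then when the mark is centered in the
    optical field of view."""
    (x1, y1) = stage_at_mark_under_tip
    (x2, y2) = stage_at_mark_under_optics
    return (x2 - x1, y2 - y1)
```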
  • such system calibration 54 can also ascertain a tip-to-tip spatial offset between a reference tip and one or more other tips that can replace the reference tip on the tool (e.g., a mandrel).
  • the reference tip can correspond to a given tip that is utilized initially to determine the tip to optical offset mentioned above and each other tip that is attached to the tool (e.g., a mandrel, such as a hollow body of a syringe tool apparatus) can be calibrated to determine a corresponding tip-to-tip spatial offset with respect to the same given reference tip.
  • the system 10 can include one or more sensors integrated into the stage 12 to provide tip position data representing a location of the tip with respect to each of the orthogonal (e.g., X, Y) axes of the stage 12.
  • the tool can be positioned at its predefined reference position (e.g., based on predetermined tool motion data 36 stored in memory), and the stage can move relative to the tip, while the tool is located at its predefined reference position, to identify the coordinates of the tip with respect to the axes of the stage based on the tip sensor data.
  • the reference position of the tool can be any predefined repeatable position to which the tool can be accurately positioned.
  • the system calibration 54 of the control system can in turn aggregate the tip-to-tip offset with the tip to optical spatial offset determined for the reference tip to enable accurate positioning of the stage 12 via the stage motion system 16 with respect to the new tip. Examples of such tip sensing and calibration functions are disclosed herein with respect to FIGS. 6-8.
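Aggregating the tip-to-tip offset with the reference tip's tip-to-optical offset, as described above, is simple vector addition; this is a sketch under that reading, not code from the patent:

```python
def offset_for_new_tip(ref_tip_to_optical, tip_to_tip):
    """Tip-to-optical offset for a replacement tip, obtained by adding
    the measured tip-to-tip offset (new tip relative to the reference
    tip) to the reference tip's calibrated tip-to-optical offset."""
    return tuple(r + t for r, t in zip(ref_tip_to_optical, tip_to_tip))
```

With this aggregation, only the reference tip ever needs the full ink-mark calibration; each replacement tip only needs its tip-to-tip measurement from the stage sensors.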
  • the automated tool system 26 can also include a tool function system 30 that is configured to activate an associated function of the tool for interacting with an identified target (e.g., one or more cells or cell colonies) based on tool action data 38.
  • the tool comprises a syringe apparatus that includes a hollow body mandrel to which a tip is removably attached and that provides fluid communication to a fluid source.
  • the tool function system 30 can activate the syringe for aspirating by controlling the flow rate and volume of material that can be drawn into a tip of a tool containing an inner channel through which a controlled flow of fluid or medium can be implemented to aspirate or dispense.
  • Examples of variables that can be selectively controlled based on the image analysis 24 can include volume, flow rate, force, time and/or rate of change of pressure (e.g., positive or negative pressure), height of the tip relative to the object and/or stage 12 as well as pattern and movement of the tip relative to the stage 12, as well as combinations thereof.
  • the relative motion between the tool tip and the stage can be controlled, for example, based on the tool motion system 28 along two or more of orthogonal axes. Alternatively or additionally, relative motion between the tool tip and the stage can be controlled based on the stage motion system 16 controlling position of the stage along two or more orthogonal axes.
  • the tool tip can be moved to a computed centroid of the target object and the tip can be positioned at a computed distance (e.g., along the Z axis) from the stage for aspiration or other interaction.
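Computing the centroid of a target object from its segmented pixel distribution, as the move-to-centroid step above requires, might look like this (an illustrative sketch; the patent does not give the computation):

```python
def object_centroid(pixels):
    """Centroid of a target object given the list of (x, y) image
    coordinates of the pixels segmented as belonging to the object.
    The result is the point over which the tool tip is positioned."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    return cx, cy
```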
  • the dimensions for a selected tip can be utilized as interaction protocol and process parameters to enable control of the tool subsystem and the fluidics system, for example.
  • the fluid flow parameters can be stored in memory, such as part of tool action data 38 that includes a library of interaction protocols.
  • Each respective interaction protocol can include a series of "process steps” or actions, each having a defined set of "parameters” that characterize each step that is to be performed in a sequence or concurrently.
  • the library of interaction protocols can include predetermined schemes applicable based on the sensed data and image analysis.
  • the interaction protocols may include one or more user configurable schemes that can be generated in response to inputs via the user interface 18.
  • a given aspiration scheme can be selected as part of a given interaction protocol based on the image data 22 (including derived from image analysis 24) and other data (e.g., sensor data 52, tool motion data 36, user input data and the like).
  • a given interaction protocol may be selected from a library of one or more pre-programmed interaction protocols, which can vary with respect to variables associated with tool selection and tool aspiration controls.
  • the schemes can be programmed to vary based on, including but not limited to: tool selection, aspiration tool design and rotational orientation, pattern of movement, height during aspiration, flow rate, volume aspirated, pattern of directionality of flow or tool movement, as well as stage motion.
  • the use of chelating agents (e.g., ethylene glycol tetraacetic acid (EGTA) or ethylenediaminetetraacetic acid (EDTA)) or changes in plate or fluid temperature to reduce cell-cell adhesion or cell surface adhesion may also be used.
  • a given interaction protocol can be selected or modified based on object cell type (e.g., connective tissue progenitor cells, mesenchymal stem cells, endothelial cell, keratinocyte, neuron, induced pluripotent stem cells, or the like).
  • the selected protocol may be updated or modified based on object features (e.g., size/area, size/cell number, thickness, density, distribution (e.g., uniform or dense center), substrate/surface features, extracellular matrix features or the like).
  • Data associated with the tool motion can be stored as part of the tool motion data 36.
  • data associated with the aspiration including aspiration control parameters and aspiration scheme, can be stored in memory as part of the tool action data 38.
  • the tool motion data 36 and the action data 38 further can provide feedback information during the process which can further be utilized to refine the variables and process parameters in a selected interaction protocol.
  • the feedback can be derived from multiple acquired images (e.g., over a time interval) of areas that include the site of tool interaction or fluid flow intervention or aspiration.
  • the control system 40 activates corresponding process controls 42, 44, 46 and 48 to interact with the colony according to each sequence of steps in the selected interaction protocol.
  • the sample containing the biological object is assessed using the automated imaging processing (e.g., image analysis 24) of acquired image data 22. Analysis of images before and after interaction can then be used to detect and measure the effect of the interaction and/or the combined effect of a sequence of interactions. These data can then be used, based on predetermined criteria, to proceed to a next phase in a series of steps in an automated interaction protocol.
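One way the before-and-after image comparison described above might be implemented is sketched below: segment each acquired image into a binary object mask, measure the fraction of object pixels the interaction removed, and use a predetermined criterion to decide whether to proceed to the next phase. The binary-mask representation and the threshold value are assumptions for illustration.

```python
import numpy as np

def interaction_effect(before, after, threshold=0.5):
    """Compare segmented (binary) images acquired before and after an
    interaction; report the fraction of object pixels removed and whether
    that satisfies the criterion to proceed to the next protocol phase."""
    before_px = int(np.count_nonzero(before))
    after_px = int(np.count_nonzero(after))
    removed = (before_px - after_px) / before_px if before_px else 0.0
    return {"removed_fraction": removed, "proceed": removed >= threshold}
```

The same measurement could also be accumulated across a sequence of interactions to estimate their combined effect.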
  • direct visual feedback information can be provided, derived from a comparison between image data before and after an interaction or series of interactions. From this visual feedback information, compensations can be manually entered into the motion algorithm for a subsequent interaction with the objects on the stage.
  • image data 22 can be acquired continuously or intermittently. For example, actions associated with the movement of the syringe can occur and corresponding image data can be acquired.
  • the control system 40 can employ the acquired image data to dynamically update and adjust the position parameters for the tool based on the analysis of the image data 22.
  • the distribution of pixels or voxels corresponding to the image data can be updated dynamically during the process of tool motion and/or aspiration based on the acquired image data to enable corresponding adjustments (e.g., in substantially real time) by the tool motion system and the tool function system 30.
  • the control system 40 further can access the data that is provided by one or more of the respective subsystems, including the image data 22, stage data 17, tool motion data 36 and tool action data 38.
  • the system 10 can also include one or more environmental sensors 50 that can be configured to provide corresponding sensor data 52.
  • Environmental sensors can include sensors for providing sensor data indicative of temperature, environmental pressure in the handling chamber, humidity and the like.
  • the control system 40 thus can also receive the sensor data 52 for implementing corresponding control of the system 10.
  • the imaging control 42 can control various parameters of the imaging system 20, including to control which optics are employed as well as activation of a light source and optical filters utilized during image acquisition.
  • the imaging control 42 also controls activation of the camera to acquire images (e.g., at automatic intervals and/or in response to user input via user interface 18).
  • the imaging control 42 further enables the image analysis 24 to analyze values of the pixels or voxels in the corresponding distribution of pixels or voxels provided by the image data 22.
  • the image analysis 24 can determine one or more features of an object of interest based on the values of pixels or voxels. Then based on the determined features, the tool motion control 46, stage motion control 44 and/or function control 48 can provide corresponding control signals to the automated fluidic tool system 26 to control positioning of the tool then implement desired interaction with identified objects.
  • the tool function control 48 can be configured to control one or more of the volumes of fluid to be aspirated as well as the flow rate during aspiration.
  • the function control 48 can control the positive and negative pressure applied across the tip with respect to flow rate and direction, thereby controlling the flow rate and volume of material passing through the tip of an automated syringe apparatus that is implemented as the automated tool system 26.
  • the function control 48 can control the flow rate up to about 500 µl/second or greater.
  • the tool motion control 46 and/or the stage motion control 44 can be configured to adjust a height of a tool tip relative to the surface containing the object of interest and the medium and/or the articulating pattern of the needle during aspiration, such as based on one or more of the acquired data 17, 22, 36, 38 and 52.
  • the interoperation between the subsystems is driven mainly by the optical information obtained from image data 22 and the image metadata 23.
  • once the location and size of the cells are computed (e.g., in system coordinates), such as disclosed herein, corresponding motion and aspiration processes are executed using one of the interaction protocols.
  • the tip (e.g., needle of the syringe)
  • the tool system can include a replaceable tip, which can be automatically selected from a plurality of different available tip designs.
  • a selected tip can be configured to attach to a distal end of the tool via a mating interface.
  • Various types of interfaces (e.g., friction fitting, threaded interface or the like) may be utilized.
  • the interface can be a wedge shaped press fit on the tool that may be applied by pressing the fitting at the free distal end of the tool (e.g., a mandrel) into the tip.
  • the tool system 26 controls the pressure during tip attachment, such as between two levels - high enough to make a solid interface but not so high that it damages the tip.
  • the tool of the tool system 26 can include an integrated force sensor in the bracket that holds the tip.
  • the force sensor can provide force information (e.g., as part of tool action data 38), which is monitored by the control system 40.
  • the control system 40 can provide the force information back to the tool motion system 28 to control the force in the Z-direction while the tip is being applied to the tool holder.
  • the tool system 26 can also include a spring loaded holder that helps ensure proper force is applied during loading.
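The force-monitored attachment described above could be sketched as a simple control loop: lower the tool in Z while reading the bracket force sensor, stop once the press-fit force enters the target window, and abort if the damage limit is exceeded. The function and parameter names and the specific force/travel values are hypothetical.

```python
def attach_tip(read_force_n, step_down_um, min_force_n=2.0, max_force_n=8.0,
               max_travel_um=5000):
    """Lower the tool in Z while monitoring the force sensor in the tip
    bracket. Returns the seating depth (um) once the press-fit force is
    within [min_force_n, max_force_n); raises if the force exceeds the
    damage limit or the travel budget is exhausted."""
    z = 0
    while z < max_travel_um:
        z += step_down_um
        force = read_force_n(z)          # sensor reading at this depth
        if force > max_force_n:
            raise RuntimeError("force limit exceeded; tip may be damaged")
        if force >= min_force_n:
            return z                     # solid interface achieved
    raise RuntimeError("tip never seated within travel limit")
```

In a real system the `read_force_n` callback would query the integrated force sensor; here it is any caller-supplied function of depth.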
  • Different tip designs can be selected (e.g., by control system or in response to a user input) depending on the interaction that is needed and the medium from which the cells and/or colonies and nearby media are to be extracted.
  • Different media can be categorized differently, such as a fluid, viscous fluid, semisolid, gel, hydrogel or the like, and a tip can be selected according to the category of medium.
  • the category of medium can further be a process parameter used to control aspiration for a given interaction protocol.
  • the tools can be designed for single use and be disposed of in a receptacle after use.
  • removal of a tip may be implemented by a "shucking" device.
  • the shucking device can be implemented as a keyhole structure (e.g., a large diameter hole intersected by a smaller diameter slot) located at predefined tool coordinates.
  • the loaded tool is lowered into the large hole and, once inside the key hole, the tool is moved laterally into the smaller slot area.
  • the tool is then moved up (e.g., in the Z direction) such that the tip catches the edge of the key hole slot and is removed from the tool.
  • the key hole can be located such that the tip will drop into a waste receptacle.
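The shucking sequence above can be expressed as four moves against a generic motion interface; the coordinates and the `move_to` callback are hypothetical stand-ins for the actual tool motion system.

```python
def shuck_tip(move_to, keyhole_xy, slot_xy, below_z, clear_z):
    """Emit the motion sequence for removing a tip with a keyhole shucker:
    descend through the large hole, slide laterally into the narrow slot,
    then pull up so the slot edge strips the tip off the tool.
    move_to(x, y, z) is a caller-supplied motion command."""
    move_to(*keyhole_xy, clear_z)    # position over the large hole
    move_to(*keyhole_xy, below_z)    # lower the loaded tool into the hole
    move_to(*slot_xy, below_z)       # move laterally into the slot
    move_to(*slot_xy, clear_z)       # raise; slot edge catches the tip
```

With the keyhole placed over a waste receptacle, the stripped tip simply drops in after the final move.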
  • another means (e.g., a gripping device, such as a clamp or the like) may be used to remove the tip.
  • the tip may be reusable.
  • the control system 40 can be configured to implement automated washing and/or sterilization of the tool tip and reservoir between successive interactions with a given tool tip that may be integral with the tool (e.g., fixed as not intended to be replaceable).
  • Examples of tools that may be fixed or otherwise not replaced may include a cutting tool, a scraping tool, a stamping tool or a stirring tool.
  • the tool system 26 can include a plurality of different tools that can be utilized sequentially or concurrently to interact with selected objects, including aspiration of objects into respective tips. Some tips further can be multi-purpose tips to perform more than one of the interaction functions disclosed herein.
  • an interaction can include pre-treatment applied via one or more tools, such as to prepare one or more target objects for subsequent interaction.
  • pre-treatment can include: removal of non-adherent cells or material (e.g. change of medium overlying the object prior to interaction); removal of cells or material from the object prior to interaction (e.g. blow off cells or material loosely adherent to the object); removal of adjacent cells or material that may complicate or hinder the effectiveness of the interaction with the object; or application of pre-treatment chemical or biophysical methods to change the interaction of the object or object components (cells) with each other or the underlying surface.
  • pre-treatment chemical or biophysical methods may include medium agitation (e.g. shaking or stirring), temperature change within the chamber, media change (e.g., removing Ca or Mg) and/or enzymatic digestion and quenching (e.g., with associated time controls).
  • control system 40 can be programmed to determine the distance of the object of interest that is located in the medium on the stage 12 relative to at least one other object located in the same medium based on the distribution of pixels or voxels provided by the image data 22.
  • the distance between objects can be based on the image analysis identifying objects and computing centroids for each object and then computing a corresponding distance between the respective centroids.
  • the distance can be an absolute distance or it can be a distance computed as a number of pixels or voxels along a line between respective centroids, for example.
  • the position of the object of interest can be computed in a desired spatial coordinate system to facilitate interaction.
  • the control system 40 can be further programmed to control the type of interaction or exclude interaction with the given object based on the determined distance and other conditions determined based on the image data and other process parameters. For instance, if an object is too close to another object the determination can be made to exclude aspirating the cells at such area.
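The centroid-distance computation and the proximity-exclusion decision described in the bullets above could look like the following sketch, operating on binary object masks from the image analysis. The exclusion-distance parameter is an illustrative assumption.

```python
import numpy as np

def centroid(mask):
    """Centroid (row, col) of a binary object mask, in pixel coordinates."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def too_close(mask_a, mask_b, min_dist_px):
    """True when two objects' centroids are nearer than the exclusion
    distance, in which case aspiration at that site would be skipped."""
    (ra, ca), (rb, cb) = centroid(mask_a), centroid(mask_b)
    dist = ((ra - rb) ** 2 + (ca - cb) ** 2) ** 0.5
    return dist < min_dist_px
```

The pixel distance can be converted to an absolute distance by scaling with the image resolution carried in the metadata.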
  • means can be employed to separate the object of interest from objects of non-interest prior to direct interaction with the object of interest, by means of physical interaction using an automated aspirating or non-aspirating tool to dislodge or remove adjacent or adherent objects or materials.
  • the tool action control can be programmed to direct a controlled flow of fluid over the top of the object to displace less adherent cells or materials.
  • an aspirating or non- aspirating tool may be manipulated across the surface to dislodge and remove objects of non-interest, leaving an object of interest effectively isolated for use or making it more accessible for precise controlled interaction at a subsequent step in the current interaction protocol.
  • the imaging analysis 24 can also be programmed to divide the distribution of pixels detected from an image into discrete spatial regions and/or form a montage of a plurality of discrete images, such as disclosed in Appendix A.
  • Each spatial region is assigned its own respective distribution of pixels or voxels therein.
  • the distribution of pixels or voxels for each respective spatial region can be utilized by the control system 40 to control the automated tool (e.g., syringe) 26 for performing selective aspiration based on the distribution of pixels or voxels within the given respective spatial region.
  • the tool function control 48 can be further programmed to determine an interaction protocol that is selected for each discrete spatial region based on the corresponding distribution of pixels or voxels for each such region. In this way the tool function control 48 can set a corresponding interaction protocol (e.g., for aspiration and/or another form of interaction), such as for setting the processing parameters and controlling aspiration of each object of interest that is located in each respective spatial region.
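A minimal sketch of the spatial-region division described above: split a 2-D pixel array into a grid, keeping each region's pixel-offset origin so detections inside a region can be mapped back to full-image (and then stage) coordinates. The grid shape is an illustrative parameter.

```python
import numpy as np

def split_into_regions(image, rows, cols):
    """Divide a 2-D pixel array into a rows x cols grid of spatial regions.
    Returns a list of ((row_origin, col_origin), region_pixels) pairs so
    each region carries its own distribution of pixels plus the offset
    needed to map back to full-image coordinates."""
    h, w = image.shape
    regions = []
    for r in range(rows):
        for c in range(cols):
            r0, r1 = r * h // rows, (r + 1) * h // rows
            c0, c1 = c * w // cols, (c + 1) * w // cols
            regions.append(((r0, c0), image[r0:r1, c0:c1]))
    return regions
```

Each region's pixel distribution could then be analyzed independently to select an interaction protocol for the objects it contains.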
  • the imaging control 42 can be programmed to continue to acquire images either during interaction/aspiration or in between discrete phases of the interaction process.
  • the imaging system 20 can update the image data to reflect changes in the distribution of pixels or voxels for each of the respective spatial regions within an image field of view in response to corresponding interactions (e.g., aspirations) with the objects of interest.
  • the imaging system can update the spatial region in which the object resides and the immediate adjacent neighboring regions to the region containing the object of interest.
  • the image analysis function 24 can further divide each of the images into corresponding discrete spatial regions based on location metadata that is embedded in the image data 22. Such spatial regions may be the same or vary throughout the interaction process.
  • the image analysis 24 further can continue to analyze the distribution of pixels or voxels within each spatial region to provide updated dynamically varying imaging data reflecting changes in the distribution of pixels during such interactions or in between each of a sequence of interactions.
  • the tool function control 48 and tool motion control 46 can dynamically adjust the process parameters associated with the automated tool 26 for each spatial region containing an object of interest based on the changing imaging data that has been provided by the image analysis 24.
  • the image data 22 that is acquired by the imaging system 20 and the analyzed image data from the image analysis 24 can be converted into process parameters to control interaction with one or more selected object of interest.
  • the control system 40 including the tool motion control 46 and the tool function control 48, can be programmed to compute processing parameters for an interaction protocol based on the distribution of pixels or voxels for an object of interest or a feature of an object of interest determined from the image analysis 24 for the pixels or voxels.
  • Other information such as sensor data and the category of medium, can also be used in determining the interaction protocol's processing parameters.
  • each object of interest such as a cell or a group of cells (e.g., cell colony) can include one or more morphological features that can be readily determined by the imaging analysis 24 according to the values of pixels or voxels for each object of interest and the distribution of pixels or voxels for the object of interest. That is, in addition to the distribution of pixels or voxels, corresponding morphological features corresponding to the values of the respective pixels or voxels can be combined for further characterization.
  • the morphological features can include, for example, cell morphology or morphology of a colony of cells, such as disclosed in the above-incorporated U.S. Patent No. 8,068,670.
  • the tool motion control 46 further can be programmed to control the motion of a tool relative to the stage or other substrate in which the object of interest 14 is located.
  • the tool motion control 46 can control motion and position of the tool in three-dimensional space (e.g., via respective linear motors or actuators) based on the distribution of pixels or voxels for a given object of interest provided in the image data 22.
  • the tool function control 48 can in turn control the flow of the object of interest into an aspiration tool, again, based on the distribution of pixels and based on the analysis of the values of pixels or voxels for the given object of interest to be aspirated.
  • the aspiration scheme may be selected among the options in the protocol library, such as based on the imaging features of the object of interest (e.g., size, density, shape, morphology, expression of selected surface markers, or the like).
  • the tool function control 48 can be programmed to select an interaction protocol from the library of interaction protocols to implement an aspiration scheme based on the analysis of the image data acquired for the objects located on the stage and in response to a user input, such as can be provided via the user interface 18.
  • the user input can be utilized to specify a desired type of interaction or type of aspiration as well as one or more parameters associated with such operations. Additionally or alternatively, the user input can establish a minimum colony size and define a type and material properties of the media in which the objects are being cultured.
  • the selected aspiration scheme and the analysis of the image data can be stored into a knowledge base to further document the application of the aspiration scheme for harvesting the object of interest.
  • the set of process parameters associated with each interaction can also be stored in memory for subsequent analysis and evaluation. In this way, process parameters can be further adjusted for subsequent harvesting based on the evaluation of the effectiveness of the scheme and the parameters utilized to harvest the object of interest.
  • one or more parameters such as tip distance from the object, flow rates, pressures and positioning of the tool relative to the object of interest can be evaluated relative to the results of the aspiration and harvesting process to determine the effectiveness of such parameters during aspiration.
  • the user input can further set a type of interaction, such as sample, move, remove, or kill an object of interest, depending upon whether it is determined that the object of interest is to be retained, transferred or removed from the medium.
  • a given colony or group of cells may require multiple interactions or aspiration phases to displace unwanted cells or objects and/or to retain desired cells in the medium or capture the desired cells into the inner chamber of an aspiration tool for transfer.
  • calibration updates can be implemented dynamically to help optimize a next aspiration phase for the selected interaction protocol. For example, flow rates and distances can be adjusted, position of a needle can be readjusted and different flow rates can be utilized based on the distribution of pixels and voxels detected between each aspiration phase. As an example, the distribution of pixels before and/or after an aspiration sequence can be compared or correlated to ascertain process parameters for the next phase of aspiration.
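The phase-to-phase adjustment described above might be sketched as a simple rule: if the previous aspiration phase removed too little material (as measured from the pixel distributions before and after), increase the flow rate and lower the tip for the next phase, subject to safe limits. The parameter names, scale factors, and limits are illustrative assumptions, not values from the disclosure.

```python
def next_phase_params(params, removed_fraction):
    """Adjust aspiration parameters between phases based on the measured
    effect of the previous phase; returns an updated copy of the params."""
    p = dict(params)
    if removed_fraction < 0.5:                       # previous phase ineffective
        p["flow_rate_ul_s"] = min(p["flow_rate_ul_s"] * 1.5, 500.0)
        p["z_height_um"] = max(p["z_height_um"] - 50, 50)
    return p
```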
  • the image analysis 24 further can be programmed to determine an object profile for each object of interest based on the analysis of the distribution of pixels or voxels for the object of interest.
  • an object profile may be compiled from an integrated series of algorithm steps, each applying a specific operation or range of specific variables including specific steps of imaging, image processing (e.g., thresholds, filters, maps, masks, segmentation, correction) steps. These may develop an object list that can be further refined by specific inclusion and exclusion testing criteria based on size, morphology, density, brightness, texture, gradients, proximity, shape, pattern and the like.
  • Individual (discrete) profiles for detection and classification of specific object types may differ by one or more steps or parameter range or threshold settings.
  • the process parameters for controlling interaction such as aspiration with each object of interest thus can be selectively adjusted based on the object profile that has been determined.
  • the object profile can be assigned to each discrete spatial region occupied by corresponding objects of interest and the object profile further can be updated dynamically based on the distribution of pixels or voxels being acquired for each phase of the process.
  • the stage motion control 44 and tool motion system 28 can cooperate to move the selected objects from the needle to a subsequent destination, which can reside on the stage 12 or another location within the system 10. For example, if the objects or cells have been aspirated into the needle for deletion, the needle can be moved so that they can be discarded to an appropriate disposal site. In other examples, the cells can be transferred to a new site, such as being re-plated into a medium for additional growth and subsequent harvesting.
  • the location of a tip of a tool, which may be integral with the tool or be replaceable, is to be known in three directions, X, Y and Z. To enable accurate control for interactions with objects on the stage, these locations are calibrated to relate each motion system that is involved in the interaction. Additional examples of calibration and interaction between a tip of a tool and one or more objects on a stage will be better appreciated with respect to the example of the system configuration depicted in FIGS. 2 - 10.
  • identical reference numbers refer to the same components and additional reference can be made to the system 10 of FIG. 1 for integrated controls and operations.
  • the system 100 includes a stage 110 that is movable along at least two orthogonal axes, identified as X and Y.
  • the stage 110 can be movable relative to one or more tools 112 that can include a tip that extends axially from the tool to terminate in a corresponding distal end of such tip.
  • the tool 112 can be controlled to move the tip axially in a direction that is orthogonal to a plane defined by the X and Y axes of the stage, in the Z direction.
  • the tip includes a longitudinal central axis, identified at 114.
  • the tool 112 can also move along the Y axis via operation of the corresponding motors to move tool holder arm 116 with respect to mounting arm 118.
  • the tool can also be moveable in the Z direction via control of a linear actuator.
  • the Z-axis motor of the tool 112 can be implemented as a mechanical actuator (e.g., a ball-screw servo motor, a lead-screw motor, etc.), a linear motor, a segmented spindle motor, a moving coil motor (e.g., rotary, linear or rotary-linear), or the like.
  • the tool holder arm 116 can be movable in the Y-direction via actuation of a corresponding linear motor or other actuator.
  • the tool 112 can also be moveable in the Z-direction with respect to the stage 110 via actuation of the corresponding motor (e.g., a linear actuator).
  • the stage 110 can be movable in the X direction in response to actuation of the corresponding linear motor that is attached to the stage 110 and corresponding base (e.g., frame or housing) of the system 100, and in the Y direction in response to an actuation of another linear motor fixed to the stage 110 and to the base of the system 100.
  • the stage 110 includes a surface 122 on which various components can be located.
  • the tool 112 can be implemented to include a hollow body that is fluidly connected to a source of fluid which can be activated to add or remove fluid through the hollow body and a corresponding tip that is attached to and extends from the hollow body.
  • a tip can be removably attached to the hollow body, as mentioned above such as by a friction fitting between an interior sidewall of the tip and the outer sidewall of the mandrel of the tool body.
  • Other types of attachments, such as threaded fasteners and fittings, may also be utilized to secure the tip with respect to the tool 112.
  • each tip that is attached to the tool can result in spatial variation of the distal end of the tip in three-dimensional space, including in X and Y coordinates of the stage 110 as well as in a Z direction that is orthogonal to the surface of the stage. Accordingly, the system 100 employs calibration to resolve the position of the tip spatially.
  • two well plates 124 and 126 are positioned spaced apart from each other on the surface 122 of the stage. While the example of FIG. 2 demonstrates two well plates 124 and 126, it is understood that the stage can accommodate any number of one or more than one plates according to application requirements.
  • Each of the plates 124 and 126 can include one or more wells into which cells can be received and removed via the one or more tools (e.g., syringe tool apparatuses) 112 implemented within the system 100.
  • the tools (e.g., a cutting tool, a scraping tool, a stamping tool or a stirring tool)
  • the enclosure 115 is designed to maintain environmental conditions within the enclosure to facilitate growth and cleanliness of cells and other objects that may be placed onto the well plates 124 and 126.
  • the system 100 can also include a tip sensing system 140, which may be integrated into the stage 110 and configured to interact with the tip of one or more tools.
  • the tip sensing system 140 can include hardware that includes sensors 142 and 144 for detecting a position of a tip of the tool 112 in each of the X and Y axes of the stage 110.
  • the sensor 142 can be configured to detect the Y position of the tip of the tool 112 and the sensor 144 can be configured to detect the X position of the tip of the tool.
  • the tip sensor 140 can be a non-contact sensor, such as using optical interrupter devices oriented 90 degrees from each other in the directions of the X and Y axes.
  • the tip sensor 140 can be positioned at a predefined X and Y position with respect to the arms 116 and 118 relative to the housing 115 of the system 100.
  • This predefined location (e.g., referred to herein as the tip reference position)
  • the tip sensor 140 could be implemented using an X-Y laser gauge fixed with respect to the stage, which outputs not only the position of the tip with respect to the stage coordinates, but also the diameter of the tip.
  • the imaging system can include a camera that provides an object field of view 146 at a corresponding location that can remain fixed with respect to the housing 115 of the system 100.
  • the field of view 146 can include an optical axis 148, demonstrated at the center of the field of view 146.
  • the stage motion system (e.g., stage motion system 16 of FIG. 1) can selectively adjust the position of the stage 110 in the X and/or Y axes so that the field of view 146 contains the desired object or other target of interest, such as located in a corresponding well 128 or 130.
  • the size and resolution associated with the field of view 146 can be selected by configuring optics 58 of the corresponding imaging system 20, as disclosed herein, automatically or in response to a user input.
  • FIG. 3 depicts an example where the stage 110 has been moved to a first position in which the tip 114 can be moved into contact with a location at the surface of the stage, such as in a corresponding well.
  • the particular location at which a reference tip contacts the stage can be arbitrary or predefined.
  • the tool 112 can be moved along the Z axis 114 in the direction of the stage 110 until the tip contacts the stage or an object located on the stage, such as the well or medium residing in the well.
  • a fiducial marker can be transferred onto the contact site.
  • the marker can include ink or any other marker material that can be visually differentiated from the surroundings on the surface as to be detected by the imaging system.
  • the ink can be applied to the distal end of the tip prior to moving the tip into contact with the stage 110.
  • the ink can be applied to the end of the reference tip.
  • the distal end of the tip can be moved to contact an empty sample tray at a known empty location.
  • the tool motion system can control the tip to contact the tray at a plurality of different X-Y locations by moving the tip along its Z axis to stamp the tray with the ink that is on the end of the tip, thereby creating a fiducial mark (e.g., a circle) on the sample tray.
  • the stage motion system is activated to adjust the position of the stage 110 so that the fiducial marker is within the field of view 146 of the imaging device (e.g., microscope), such as demonstrated in FIG. 4.
  • the stage may be moved in response to user inputs (e.g., in the X and Y directions to position the fiducial marker within the optical field of view 146).
  • an automated method can be employed to adjust the position of the stage 110 with respect to the object field of view so that the fiducial marker resides within the object field of view.
  • the stage position can be adjusted to place the fiducial marker in alignment with the optical axis 148.
  • the coordinates of the stage 110 can be stored in memory.
  • the system calibration function (e.g., function 54) of the control system can in turn calculate a difference between the coordinates of the stage (e.g., X-Y coordinates) when the fiduciary marker was applied by the tip of the tool and the coordinates of the stage when the image containing the fiduciary marker was captured.
  • the positions of the stage can be determined from outputs of X and Y linear encoders that are associated with the respective linear actuators utilized to move the stage along each of its X and Y axes.
  • the fiduciary marker can correspond to an ink dot having a diameter that is approximately 700 micrometers or less and the coordinates of such fiduciary marker can be determined according to a pixel or to a centroid of a group of pixels that contain the fiduciary marker from the image that was captured from the field of view containing the fiduciary marker. If the centroid or individual pixel for the fiduciary marker is not aligned with the optical axis, the image metadata (metadata 23 of FIG. 1) that specifies the resolution of the image can be employed to ascertain the spatial position of the pixel or centroid of pixels.
  • the spatial offset that is determined can be utilized to reproducibly move the tip relative to a desired target that is identified within a field of view of a given image, such as to perform a desired interaction at the target object location.
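The offset computation described above — the difference between stage coordinates at the touch-off and at imaging, plus a resolution-based correction when the marker centroid is not on the optical axis — can be sketched as follows. This is a minimal illustration; the function and parameter names are assumptions, not from the patent.

```python
def optical_offset(stage_at_touchoff, stage_at_image,
                   marker_px=None, axis_px=None, um_per_px=1.0):
    """Return the (x, y) spatial offset between the tip and the optical
    axis, from stage coordinates recorded at the two calibration steps."""
    dx = stage_at_image[0] - stage_at_touchoff[0]
    dy = stage_at_image[1] - stage_at_touchoff[1]
    # If the marker centroid is off the optical axis, correct using the
    # image resolution (micrometers per pixel) from the image metadata.
    if marker_px is not None and axis_px is not None:
        dx += (marker_px[0] - axis_px[0]) * um_per_px
        dy += (marker_px[1] - axis_px[1]) * um_per_px
    return (dx, dy)
```

The pixel correction term is only needed when the marker is not moved exactly onto the optical axis before the stage coordinates are recorded.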
  • examples of some interactions that can be performed with respect to objects on the well plates 124 or 126 can include cutting material on the tray, scraping material on the tray, stamping material on the tray, stirring a medium on the tray or aspirating (e.g., picking) objects from the tray and/or transferring objects to one or more destinations.
  • the control system can perform additional calibration to ascertain a tip-to-tip spatial offset for each new tip that is used.
  • the set of images stored as image data correspond to the same sets of plates 124 and 126 that are positioned at the same locations on the stage 110. In this way the calibration and any interactions are performed with respect to a common set of images.
  • a new tip can be placed onto the tool (e.g., mandrel) from tip storage 1202, such as after the previous tip (e.g., the reference tip or another tip) has been removed from the tool, such as disclosed herein.
  • the tip sensor 140 is activated to ascertain the location of the tip in X and Y coordinates for the stage. Such tip coordinates are determined for each tip used, including the reference tip.
  • the tip-to-tip offset can be calculated as the difference between the tip coordinates of the reference tip and each other tip.
  • the tool 112 can be repositioned back at the reference tip position, and the stage 110 can be moved in the X direction so that the Y coordinate for the distal end of the tip can be identified by the sensor 142, as indicated by axis 152.
  • the stage can be moved in the Y direction so that an X coordinate for the distal end of the tip is identified by the sensor 144, as demonstrated by axis 154.
  • the intersection between the axes 152 and 154 corresponds to the tip position (X, Y coordinates) for a given tip. Based upon the tip coordinates for the reference tip and each other tip, a corresponding tip offset can be determined as the difference between the tip coordinates of the reference tip and each respective other tip.
  • the corresponding tip offset for a given tip can in turn be applied to the spatial offset determined for the reference tip (see, e.g., FIGS. 3 and 4) to provide the new optical offset.
  • the X optical offset can be computed as the sum of the X coordinate for the optical offset and the X value of the tip offset
  • the Y optical offset for such given tip can be the sum of the Y coordinate for the reference optical offset and the tip offset in the Y direction.
  • the new optical offset is utilized by the control system 40 for controlling the stage motion system for positioning the distal end of the new given tip at a desired target location identified in a corresponding captured image.
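The bookkeeping just described — the tip-to-tip offset as a coordinate difference between the reference tip and a newly mounted tip, added component-wise to the reference optical offset — amounts to the following sketch (illustrative names; not the patent's implementation):

```python
def tip_offset(ref_tip_xy, new_tip_xy):
    # Difference between the sensed X,Y coordinates of the reference tip
    # and those of the newly mounted tip.
    return (ref_tip_xy[0] - new_tip_xy[0], ref_tip_xy[1] - new_tip_xy[1])

def updated_optical_offset(ref_optical_xy, tip_off_xy):
    # New optical offset = reference optical offset + tip-to-tip offset,
    # summed separately for the X and Y directions as described above.
    return (ref_optical_xy[0] + tip_off_xy[0],
            ref_optical_xy[1] + tip_off_xy[1])
```

Recomputing only this small correction per tip avoids repeating the full fiducial-marker calibration each time a tip is exchanged.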
  • the precision of the X-Y tip sensor 140 can be about +/-5 microns or better.
  • FIG. 8 demonstrates a further example of identifying the tip location with respect to one of the X and Y axes.
  • the tip sensor 150 includes an optical transmitter 152 and optical receiver 154.
  • the optical transmitter can transmit a beam of light (e.g., laser) 155 above the surface 122 of the stage 1 10 that is detected at the receiver 154.
  • the stage can be moved in a direction orthogonal to the beam (along one of the X or Y axes) so that a distal end 156 of the tip 158 disrupts the beam.
  • the disruption of the beam generates a tip sensor signal 160 to trigger recording respective coordinates for each beam for the axis that extends orthogonal to the beam.
  • the coordinates of each beam may be known with respect to the X and Y coordinates of the stage or the beam coordinates may be relative to predetermined reference coordinates of the stage. Either way, the position information provided by each of the sensors 142 and 144 can be utilized to provide the position of each tip, which can be utilized to ascertain a precise tip offset between any two tips based on stage coordinates recorded when each beam is interrupted during the tip sensing phase.
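One way to picture the beam-interrupt measurement: the stage steps orthogonally to a beam until the sensor reports an interruption, and the stage coordinate at that instant is recorded for the corresponding axis; doing this for both beams gives the tip's X and Y. A hedged sketch — the step size, travel limit, and callback interfaces are assumptions:

```python
def locate_tip(move_stage, beam_broken, start, step=0.005, max_travel=50.0):
    """Step the stage along one axis from `start` in `step` increments
    until `beam_broken()` reports an interruption; return the stage
    coordinate at which the tip first broke the beam."""
    pos = start
    while not beam_broken(pos):
        if pos - start > max_travel:
            raise RuntimeError("tip not detected within travel range")
        pos += step
        move_stage(pos)  # advance the stage orthogonally to the beam
    return pos
```

Running this once against each of the two orthogonal beams yields the (X, Y) intersection that the text identifies as the tip position.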
  • FIG. 8 also demonstrates a block diagram that can be utilized to determine the tip-to-tip offset as well as the corresponding spatial offset between the tip location and the optical axis of the imaging system.
  • a tip sensor processor 170 receives the tip sensor signal 160 to ascertain the corresponding tip position along a given one of the X or Y axes. It is understood that each of the sensors 142 and 144 can implement similar processing to ascertain the tip location along the X and Y axes. In some examples, the tip sensor processing can also determine the distance 162, which can be combined with the known location of the tip sensor with respect to the X and Y coordinates of the stage 110 to provide absolute position data 172 for the X and Y position, depending on which sensor 142 or 144 has detected the positioning of the tip 158. The tip position data 172 can store the tip position for one or more tips that are utilized, such as including the reference tip position and the position of each subsequent tip that is sensed by the system 150.
  • a tip-to-tip offset calculator 174 thus can compute the tip-to-tip offset based on the tip position data for the reference tip and another tip.
  • the tip-to-tip offset can in turn be provided to a spatial offset update function 176.
  • the spatial offset update can aggregate the reference spatial offset that is stored in memory with the tip-to-tip offset that has been calculated to in turn provide the updated spatial offset data, which can be stored in memory and in turn utilized to position the tip 158 with respect to a desired target object that has been identified in the field of view of a captured image.
  • systems and methods disclosed herein can be implemented to calibrate the tip position with respect to the Z axis, which is orthogonal to the X and Y axes of the stage.
  • the Z axis calibration thus can relate a height of the distal end of the tip with respect to the surface of the stage to further facilitate controlling interaction with one or more objects that may reside in a medium positioned on a surface of the stage.
  • plates that may be positioned on the surface of the stage may have non-uniform thicknesses as well as other topographical variations across the surface thereof that may need to be accounted for.
  • each of the different tips can provide spatial variations in the Z direction as well as the X and Y directions, even when the tips are of the same size and design. These and other variations, if not accounted for, can result in errors when interacting with objects at various locations on the stage.
  • FIGS. 9 and 10 depict an example of an approach that can be utilized to determine the height of a tip 158 with respect to a local region of the stage or other object that may be positioned on the surface 122 of the stage.
  • the surface of the stage may or may not be a planar surface.
  • the calibration along the Z axis can be utilized to reconcile small (e.g., micron-level) differences between a plane of the tip and a surface plane of the stage to enable precise control of the height of the tip with respect to the surface of the stage.
  • the tip 158 is positioned at a distance "h" above the surface 122 of the stage or an object positioned on the stage.
  • a sensor 180 is implemented to detect force applied to the tip in the Z direction (e.g., along the Z axis of the tip 158).
  • the force sensor 180 can be implemented as part of the tool, such as integrated into the tip holder via a mount system, to ensure adequate sensitivity along the Z axis.
  • the force sensor 180 might be a strain gauge that is part of the tool or integrated into the tip itself.
  • the sensor could be positioned on the stage or be implemented partially on the tip or tool and partially on the stage to provide information to determine Z axis height in reference to the bottom of the sample plate.
  • the force sensor 180 operates as a highly sensitive scale measuring the weight of the smart syringe, such that any pressure applied either up or down provides a corresponding output indicative of the sensed force.
  • the force sensor signal can be provided to the control system via a communications link.
  • the communications link may be a wireless link (e.g., Bluetooth or other short range wireless communications technology) or a physical link (e.g., optical or electrically conductive cable).
  • the force sensor output can be stored in memory as force sensor output data 182 for processing to ascertain the z-axis position of the tip 158.
  • the control system 40 is programmed to determine the height of the tip based on the force sensor data and Z-position data 184 obtained during a Z axis calibration mode. For example, tip 158 begins at a start position where Z-axis position data 184 is recorded based on an output of a Z-axis encoder associated with the tool.
  • the control system 40 provides tool motion control instructions to move the tip in the Z direction toward the surface 122. The distal end of the tip traverses a distance from its start position up to a distance corresponding to the height of the tip from the stage, producing corresponding Z-position data during such movement.
  • the force sensor 180 can provide a corresponding signal representing the sensed force, which indicates contact between the tip and the solid surface.
  • a height calculator 186 (e.g., of the control system 40) computes the distance of travel, corresponding to tip height data 188, based on a difference between the Z position data 184 at the start position (FIG. 9) and the Z position data when contact occurs (FIG. 10). For example, the contact can be ascertained from the force data 182 indicating sufficient force to indicate a solid object, such as the surface 122, and not a fluid medium disposed thereon. That is, the control system 40 can discriminate between contact with a solid surface and a fluid medium.
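The height computation above — lower the tip, watch the force signal, and take the encoder travel from the start position to first solid contact — can be sketched as follows. The force threshold used to discriminate a solid surface from the drag of a fluid medium is an assumed parameter, not a value from the patent.

```python
def tip_height(z_positions, forces, solid_threshold=0.02):
    """Given synchronized Z-encoder readings and force-sensor readings
    sampled while the tip descends, return the travel from the start
    position to first solid contact (same units as the encoder)."""
    z_start = z_positions[0]
    for z, f in zip(z_positions, forces):
        # A force above the threshold indicates a solid surface rather
        # than the small resistance of a fluid medium.
        if f >= solid_threshold:
            return abs(z_start - z)
    return None  # no solid contact detected within the sampled travel
```

Returning `None` when no sample crosses the threshold mirrors the text's distinction between touching the surface and merely entering the medium.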
  • the tip height data can also specify the X and Y coordinates (e.g., stage coordinates) where the tip height is measured (e.g., a tip height measurement location).
  • the tip height data including the height in the Z direction and associated X, Y coordinates for such location, can be stored in memory for a plurality of X, Y coordinates across the surface of the stage.
  • the tip height data 188 can be utilized to provide a corresponding topographical map of the surface of the stage 122 or other object disposed thereon for one or more portions of the surface on the stage, such as shown in FIG. 11.
  • the tip height can be utilized to create a topographical surface map based on measuring height for one or more locations across the surface of the stage, which tip height measurements and associated measurement locations (e.g., in stage X,Y coordinates) can be stored in memory.
  • FIG. 11 demonstrates an example of a topographical map for a surface, such as an unoccupied surface of a well plate that is positioned on the stage 110. It is understood that such a detailed map is not required for the entire surface, since the tip height is only relevant or needed for locations adjacent to each target site.
  • the control system 40 can employ the stored tip height measurements to control the tip height at or near one or more desired locations on the surface 122 such as for interacting with objects that have been identified in the corresponding image.
  • the Z axis tip height calculator 186 of calibration function 54 can be utilized to determine the height of the tip (e.g., corresponding to the distance of the Z-direction between the tip and the surface 122) at a location that is adjacent to but spaced apart from the target site.
  • the corresponding tip height ascertained at the adjacent location or a series of adjacent locations can be utilized to specify the tip height so as to control the interaction with objects at each respective target site.
  • the known tip height can be utilized to control the interaction at each target site that is within such predetermined distance.
  • the height of the tip can be controlled precisely during interaction at the target site without potentially damaging the object at the target site.
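The reuse of adjacent measurements can be pictured as a lookup over stored (x, y, height) samples: if a target site lies within a predetermined distance of a stored measurement, that height is used instead of a fresh touch-off. A minimal sketch, with an assumed distance threshold:

```python
import math

def height_at(target_xy, measurements, max_dist=100.0):
    """measurements: list of (x, y, height) tuples in stage coordinates.
    Return the height of the nearest stored measurement within
    `max_dist` (e.g., micrometers), or None if none is close enough
    and a fresh measurement is needed."""
    best = None
    for mx, my, h in measurements:
        d = math.hypot(target_xy[0] - mx, target_xy[1] - my)
        if d <= max_dist and (best is None or d < best[0]):
            best = (d, h)
    return best[1] if best else None
```

This keeps the tip from ever touching down on the target object itself, while still giving each interaction a locally valid Z reference.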
  • In view of the foregoing structural and functional features described above, methods that can be implemented will be better appreciated with reference to FIGS. 12-14. While, for purposes of simplicity of explanation, the methods of FIGS. 12-14 are shown and described as executing serially, it is to be understood and appreciated that such methods may not be limited by the illustrated order, as some aspects could occur in different orders and/or concurrently with other aspects from that shown and described herein. Moreover, not all illustrated features may be required to implement a method. The methods or portions thereof can be implemented as instructions stored in a non-transitory storage medium as well as be executed by a processor of a computer device, for example. Additionally, in some implementations, two or more of the methods of FIGS. 12-14 (or portions of two or more such methods) may be combined together as part of a workflow for calibrating and/or implementing an interaction protocol.
  • FIG. 12 depicts an example of a calibration method 200 to facilitate positioning a stage to provide for interaction between a tip and one or more objects identified in an image.
  • the method 200 begins at 202 in which a marker is provided at a touchoff location.
  • a corresponding tool can be moved to a predefined reference position and moved in the Z-direction to contact the surface of the stage and place a marker at such touchoff location in response to such contact.
  • the marker can be ink or any other substance that may be visible via the imaging system (e.g., system 20 of FIG. 1 ).
  • a reference location for the tip can be identified.
  • the reference location for the tip can correspond to the coordinates of the movable stage (e.g., stage 12 or 110) in two or more dimensions, such as according to positions provided by encoders for the X,Y directions for linear actuators that are employed for adjusting the position of the stage.
  • the marker is moved into the field of view of the imaging device.
  • the stage position can be adjusted to align the marker within a field of view from the imaging system (e.g., camera) that is positioned orthogonally and spaced apart from the surface of the stage onto which the marker is provided at 202.
  • the movement of the marker to a prescribed position within the field of view can be implemented manually (in response to user inputs), automatically or can employ a combination of manual and automatic motion controls.
  • the location of the marker within the field of view can be identified.
  • the location of the marker within the field of view can be identified based on the coordinates (e.g., X,Y position) of the stage after the marker has been moved into the field of view at 206.
  • the identified location can itself be the X,Y position of the stage or it can be the X,Y position of the stage in combination with distance between one or more pixels that represent the marker in an image captured by the image device relative to an optical reference in the image.
  • the optical reference in the image can correspond to a center of the field of view for the captured image or to one or more pixels at the periphery of such captured image.
  • if the marker is aligned with the optical reference, the identified location at 208 can represent the position of the stage. Otherwise the pixel distance can be applied to the position of the stage to represent the location of the marker within the field of view.
  • the spatial offset between the reference location for the tip and the field of view is determined.
  • the spatial offset can be determined, for example, based on a difference between the identified location for the tip at 204 and the identified location for the marker at 208.
  • the spatial offset for the tip and the field of view can be stored in memory as an optical offset that is utilized at 212 to position the stage relative to the tip for any number of one or more subsequent interactions between the tip and objects on the stage.
  • the control system can apply the spatial offset to move the stage so that any object identified in any captured image can be aligned with respect to the tip to enable interactions between the tip and such identified objects.
  • the interactions can include one or more of cutting material from the stage (e.g., from a well plate), scraping material, stamping material on the stage, stirring or agitating a medium on the tray or aspirating (e.g., picking) objects from the tray and/or transferring objects to one or more destinations on the stage or elsewhere.
  • FIG. 13 depicts an example of a method 250 to control interaction with objects.
  • the method 250 begins at 252 with capturing one or more images via the imaging device (e.g., imaging system 20).
  • the captured image can be for a field of view on the surface of the stage (e.g., stage 12 or 1 10) such as including various objects, including biological objects or the like that may be disposed within well plates as is known in the art.
  • metadata is stored with the image at 254 (e.g., metadata 23 embedded in or otherwise associated with image data 22).
  • the metadata includes position of the stage at the time of each image as well as the indication of image resolution for each captured image.
  • the image resolution and stage position thus can be utilized to ascertain an absolute position in the X,Y coordinates of the stage for each pixel in each captured image.
  • one or more reference pixels in a given image (e.g., a center location and/or a location on the periphery of the image) can serve as the reference position.
  • each pixel can be assigned a dimension such that the number of pixels between a given pixel and the reference position can be utilized to ascertain a relative or absolute position with respect to the X,Y coordinates of the stage. While the foregoing has been described for a two-dimensional image, similar metadata can be provided for a three-dimensional image.
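The metadata-based mapping above — stage position at capture plus a per-pixel dimension — lets any pixel in a captured image be converted to absolute stage coordinates. A minimal sketch, assuming the optical reference is the image center and illustrative metadata field names:

```python
def pixel_to_stage(px, py, meta):
    """meta: dict holding the stage position at capture, image size in
    pixels, and resolution in micrometers per pixel (names here are
    illustrative, not from the patent)."""
    cx, cy = meta["width"] / 2.0, meta["height"] / 2.0  # optical reference
    sx = meta["stage_x"] + (px - cx) * meta["um_per_px"]
    sy = meta["stage_y"] + (py - cy) * meta["um_per_px"]
    return (sx, sy)
```

With this mapping, any target object detected by image analysis can be handed to the stage motion control as a pair of absolute coordinates.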
  • the captured image can be analyzed to determine one or more target objects in the image.
  • the image analysis at 256 can be implemented according to the above-incorporated U.S. Patent No. 8,068,670. Those skilled in the art may understand and appreciate other image analysis techniques that may be utilized to detect target objects on the stage.
  • the location of the target object is determined based on the stored metadata for the captured image that was analyzed to identify the one or more objects.
  • the stage can be moved relative to the tool to align the location of the target object or objects with the tip of the tool.
  • the control system (e.g., stage motion control 44) can control the motion of the stage along each of the X,Y positions to align the tip position with the target object.
  • a corresponding interaction protocol such as disclosed herein can be utilized to control the stage to move to a sequence of locations for tip alignment and corresponding interaction at each of the sequential locations distributed across the surface area of the identified object.
  • the tip can be controlled to interact with the object such as by controlling motion of the tip along the Z axis to a location that is spaced apart from the contact surface of the stage or plate that may be positioned thereon and the corresponding interaction may then be activated according to the selected interaction protocol.
  • the hollow tip can be inserted into a medium and aspiration activated to draw in a volume of media and corresponding biological objects growing therein. This may be repeated at the plurality of sequential locations as disclosed herein. Other forms of interaction disclosed herein can be performed at the identified locations.
  • the method may return to 256 to further analyze one or more of the images to determine additional target objects for which interaction may be desired and the method may proceed accordingly.
  • the method can proceed from 262 to 260 to in turn move the stage to a subsequent sequential location to align the tip of the tool with another part of the target object to enable the interaction at 262 for each of the subsequent sequential locations that have been specified.
  • the metadata of the image is utilized for such positioning. The method can repeat until the procedure has been completed according to the selected interaction protocol(s).
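The loop through steps 256-262 can be pictured as: analyze an image, convert each target to stage coordinates, apply the calibrated optical offset, move the stage, and interact. A schematic sketch with callback placeholders standing in for the stage motion and tool function controls (all names are illustrative):

```python
def run_protocol(targets, optical_offset, move_stage, interact):
    """targets: list of (x, y) stage coordinates for identified objects.
    For each target, align the tip by applying the calibrated optical
    offset to the stage position, then perform the selected
    interaction (e.g., aspirate, scrape, stamp)."""
    for tx, ty in targets:
        # Align the tip with the target by moving the stage.
        move_stage(tx + optical_offset[0], ty + optical_offset[1])
        interact(tx, ty)
```

In practice the `targets` list would itself come from image analysis plus the pixel-to-stage mapping described earlier in the text.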
  • FIG. 14 depicts an example of another method 300 that can be utilized to facilitate positioning a tip with respect to the stage along the Z axis.
  • the method 300 can be considered a Z axis calibration method, for example.
  • the method 300 begins at 302 in which a target location is determined.
  • the target location can be determined by specifying coordinates, such as may be determined from image analysis (e.g., analysis 24) performed for one or more captured images or may otherwise be identified.
  • the target location, for example, may include biological or other objects of interest, such as are to be picked from a media residing on the stage.
  • target objects can be considered objects that are to be destroyed and removed from the stage to allow further growth and culture of desired biological objects.
  • the stage is moved relative to the tool to position the tip adjacent to but spaced from the target location determined at 302.
  • the adjacent location can be free from the desired target object, such as in situations where it is desired not to contact the objects during calibration.
  • an adjacent portion of the stage can be defined as the adjacent area (e.g., positioned greater than 10 microns, such as about 50 - 100 microns from the identified target location) to which the tip can be aligned by movement of the stage at 304.
  • the tip can be controlled to contact the stage at such adjacent location.
  • a corresponding linear actuator can be controlled to move the tool and the tip that is attached thereto in the direction of the stage until contact between the tip and stage is detected (e.g., by Z-axis force sensor 180).
  • the distance traveled in the Z-direction from the starting position of the tip (e.g., as measured by a Z-axis encoder) can be employed to determine the height of the tip at the adjacent measurement location.
  • a threshold for the adjacent measurement location can be preset or user defined according to application requirements, such as in response to a user input.
  • the tip height determined at 308 can be utilized to control the tip to interact with the object and/or medium at the target location in a desired manner according to a selected interaction protocol. For example, since the height of the tip is known at the adjacent measurement location, the interaction protocol can use that height to ensure that the tip is not moved in the Z-direction, unless that is required by the interaction protocol.
  • the method 300 can return from 312 to 302 to repeat 304 - 312 for each subsequent target location as part of the interaction protocol. In some cases, where the next target location is considered sufficiently close to the adjacent measurement location where the height was determined at 308, from 312 the method can instead return to 310 to move the tip of the tool to the next target location and employ the same tip height over a plurality of interactions.
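The decision just described — re-measure the tip height only when the next target lies too far from the last measurement location — can be sketched as a loop over targets with a reuse threshold. A minimal illustration; the threshold value and `measure_height` callback are assumptions:

```python
import math

def z_heights_for_targets(targets, measure_height, reuse_dist=100.0):
    """Return a Z height for each (x, y) target, invoking the
    `measure_height` touch-off only when the target is farther than
    `reuse_dist` from the last measurement location."""
    heights = []
    last_xy, last_h = None, None
    for t in targets:
        if last_xy is None or math.hypot(t[0] - last_xy[0],
                                         t[1] - last_xy[1]) > reuse_dist:
            last_h = measure_height(t)  # touch off at an adjacent spot
            last_xy = t
        heights.append(last_h)
    return heights
```

Reusing heights this way trades a small topographical error for far fewer contact measurements, which matters when touch-offs risk disturbing nearby objects.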
  • FIG. 15 depicts an example of captured images 402, 404, 406 and 408 demonstrating different steps of an interaction protocol associated with monolayer picking.
  • the captured image 402 corresponds to a source location in which a plurality of objects of interest have been identified, such as corresponding to a colony area.
  • the object of interest can be identified by image analysis (e.g., analysis 24) such as can include the colony area, colony density, cell number, cell marker area expression or a combination thereof.
  • the captured image 404 demonstrates the source location for the object of interest following biopsy of the cells, such as using a tip and aspirating from the source location containing the desired target objects.
  • inner and outer diameters of the harvest tool are demonstrated as concentric rings 410 and 412, respectively.
  • the captured image 406 demonstrates the cells after being transferred to a different location for which the captured image 406 was obtained for a corresponding field of view.
  • the image, for example, demonstrates growth of the transferred cells over a period of about 24 hours.
  • the image 408 demonstrates the transferred cells at a later time period (e.g., about six days).
  • FIGS. 16A and 16B depict a schematic illustration of first and second aspirations performed for an object, demonstrated at 450 and 452.
  • the syringe motion and/or stage motion systems can cooperate to position a tip 458 of a hollow needle for each aspiration.
  • an initial centroid 456 for the object can be computed (e.g., by image analysis 24).
  • Corresponding process parameters for a selected interaction protocol can be applied based on the relevant data (e.g., image data, sensor data, syringe motion data, aspiration data) to control the first aspiration, which is performed by positioning the tip 458 into the medium at the centroid 456.
  • process parameters can be recomputed and stored at least in part as updated aspiration data for the next aspiration. For instance, this can include computing (e.g., by the image analysis 24) a new centroid 456' of the remaining target object based on the updated image data that is captured by moving the target object 454 into the field of view of the image capture device (e.g., camera 60).
  • the tool function control of the control system 40 can employ the updated process parameters to perform the second aspiration according to the selected protocol. This process can be repeated until harvesting has been completed for the target object.
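The repeat-until-harvested loop in FIGS. 16A and 16B — re-image the object, recompute the centroid of what remains, aspirate there — can be sketched as follows. The stopping criterion (remaining occupied pixels) and the callback interfaces are assumptions for illustration:

```python
def harvest(colony_mask_fn, aspirate_at, max_passes=10, min_pixels=50):
    """Repeat: image the object, compute the centroid of the remaining
    occupied pixels, aspirate there; stop once the remaining area falls
    below `min_pixels`. `colony_mask_fn()` returns the list of (x, y)
    pixels still occupied in the latest image."""
    for _ in range(max_passes):
        pixels = colony_mask_fn()
        if len(pixels) < min_pixels:
            break  # harvesting complete for this object
        cx = sum(p[0] for p in pixels) / len(pixels)
        cy = sum(p[1] for p in pixels) / len(pixels)
        aspirate_at((cx, cy))
```

Recomputing the centroid after every pass is what lets the second aspiration land on the remaining material rather than the original centroid 456.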
  • FIG. 17 demonstrates another approach that can be utilized for harvesting cells from a colony according to a selected interaction protocol.
  • an original colony object can be identified based on the image analysis, as demonstrated at 500.
  • the aspiration tool and fluidic controls can compute an initial (e.g., original) harvest strategy 502.
  • the harvest strategy 502 can include a plurality of target sites 504 at which aspiration is to be performed sequentially as well other process parameters for controlling aspiration at each of the sites.
  • the colony object can be moved to within the field of view of the imaging system to acquire image data (e.g., one or more images) representing the current state of the colony object, shown at 506.
  • the corresponding image data can be utilized to compute a second, updated harvest strategy 508 to facilitate harvesting remaining cells in the colony object at new target sites, such as shown at 510. While in this example, the second harvest strategy is computed after completion of the first strategy, it will be appreciated that the harvest strategy can be variable and dynamically updated based on image data acquired during any number of sequences of aspiration specified by the interaction protocol.
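A harvest strategy of the kind shown at 502 — a plurality of target sites covering the colony — can be pictured as laying a grid over the occupied pixels, with the grid pitch set by the tool's effective diameter. A hedged sketch, not the patent's planner:

```python
def plan_sites(pixels, spacing):
    """Generate aspiration target sites covering the occupied `pixels`
    on a grid whose pitch is the tool's effective diameter (`spacing`).
    Returns one site (cell-center coordinates) per occupied grid cell."""
    cells = {(int(x // spacing), int(y // spacing)) for x, y in pixels}
    return sorted((cx * spacing + spacing / 2.0,
                   cy * spacing + spacing / 2.0)
                  for cx, cy in cells)
```

Re-running such a planner on the post-harvest image is what produces the updated strategy 508 with new target sites 510.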
  • portions of the invention may be embodied as a method, data processing system, or computer program product. Accordingly, these portions of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware.
  • portions of the invention may be a computer program product on a computer-usable storage medium having computer readable program code on the medium.
  • Any suitable computer-readable medium may be utilized including, but not limited to, static and dynamic storage devices, hard disks, optical storage devices, and magnetic storage devices.
  • These computer-executable instructions may also be stored in computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory result in an article of manufacture including instructions which implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.

Abstract

This disclosure relates to systems and methods for detecting one or more objects and for controlled interaction with them. The system can include an imaging subsystem (20), a tool subsystem (26) containing one or more tools, a stage subsystem (16) and a control system (40). The control system (40) can integrate controls for each of the other subsystems, which controls can implement desired functions over a large number of process parameters to perform the controlled interaction.
PCT/US2015/029892 2014-05-08 2015-05-08 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects WO2015172025A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP15727490.3A EP3140662B1 (fr) 2014-05-08 2015-05-08 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
US15/309,712 US10564172B2 (en) 2014-05-08 2015-05-08 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
US16/741,864 US11579160B2 (en) 2014-05-08 2020-01-14 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
US18/108,738 US20230184804A1 (en) 2014-05-08 2023-02-13 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461990387P 2014-05-08 2014-05-08
US61/990,387 2014-05-08

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/309,712 A-371-Of-International US10564172B2 (en) 2014-05-08 2015-05-08 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
US16/741,864 Continuation US11579160B2 (en) 2014-05-08 2020-01-14 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects

Publications (1)

Publication Number Publication Date
WO2015172025A1 true WO2015172025A1 (fr) 2015-11-12

Family

ID=53298588

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/029892 WO2015172025A1 (fr) 2014-05-08 2015-05-08 Systems and methods for detection, analysis, isolation and/or harvesting of biological objects

Country Status (3)

Country Link
US (3) US10564172B2 (fr)
EP (1) EP3140662B1 (fr)
WO (1) WO2015172025A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109416372A (zh) * 2016-07-21 2019-03-01 Automated alignment of a test system
CN110520738A (zh) * 2017-04-12 2019-11-29 General Automation Lab Technologies Apparatus and method for picking biological samples

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3140662B1 (fr) * 2014-05-08 2024-03-20 The Cleveland Clinic Foundation Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
IL251134B (en) * 2016-05-17 2018-03-29 Sheena Haim A system and method for monitoring and managing laboratory procedures
US10559378B2 (en) * 2017-02-17 2020-02-11 Agfa Healthcare Nv Systems and methods for processing large medical image data
US10671659B2 (en) 2017-02-17 2020-06-02 Agfa Healthcare Nv Systems and methods for collecting large medical image data
EP3739341A4 (fr) * 2018-01-10 2022-01-12 Hitachi High-Tech Corporation Automatic analysis device
KR102258937B1 (ko) * 2019-07-02 2021-06-01 Genolution Inc. Nucleic acid extraction apparatus and method of controlling operation of the nucleic acid extraction apparatus
EP4184176A1 (fr) * 2021-11-17 2023-05-24 Roche Diagnostics GmbH Method for detecting a bottom of at least one well

Citations (6)

Publication number Priority date Publication date Assignee Title
EP1548448A1 * 2002-09-27 2005-06-29 Shimadzu Corporation Liquid portioning method and device
WO2005121746A1 * 2004-06-09 2005-12-22 The University Of British Columbia Reagent delivery apparatus and methods
US8068670B2 (en) 2005-06-10 2011-11-29 The Cleveland Clinic Foundation Image analysis of biological objects
WO2012129105A1 * 2011-03-18 2012-09-27 Siemens Healthcare Diagnostics Inc. Methods and systems for calibration of a positional orientation between a sample container and nozzle tip
US20130129538A1 (en) 2010-06-04 2013-05-23 Daniel L. Bantz Miniaturized syringe pump system and modules
US20130205920A1 (en) * 2012-02-10 2013-08-15 Adam Perry Tow Automated visual pipetting

Family Cites Families (27)

Publication number Priority date Publication date Assignee Title
AU3586195A (en) 1994-09-20 1996-04-09 Neopath, Inc. Apparatus for automated identification of cell groupings on a biological specimen
US6252979B1 (en) 1995-06-07 2001-06-26 Tripath Imaging, Inc. Interactive method and apparatus for sorting biological specimens
US6404906B2 (en) 1997-03-03 2002-06-11 Bacus Research Laboratories,Inc. Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
US6396941B1 (en) 1996-08-23 2002-05-28 Bacus Research Laboratories, Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
US6031930A (en) 1996-08-23 2000-02-29 Bacus Research Laboratories, Inc. Method and apparatus for testing a progression of neoplasia including cancer chemoprevention testing
US6272235B1 (en) 1997-03-03 2001-08-07 Bacus Research Laboratories, Inc. Method and apparatus for creating a virtual microscope slide
DE19742163C2 (de) 1997-09-24 2001-12-13 Steffen Hering Device and method for treating objects in a liquid bath
DE19835833A1 (de) 1998-08-07 2000-02-17 Max Planck Gesellschaft Dosing head for the parallel processing of a plurality of fluid samples
WO2001040454A1 (fr) 1999-11-30 2001-06-07 Oncosis Method and apparatus for selectively targeting specific cells within a cell population
US6804385B2 (en) 2000-10-24 2004-10-12 Oncosis Method and device for selectively targeting cells within a three-dimensional specimen
US7218764B2 (en) 2000-12-04 2007-05-15 Cytokinetics, Inc. Ploidy classification method
US6466690C1 (en) 2000-12-19 2008-11-18 Bacus Res Lab Inc Method and apparatus for processing an image of a tissue sample microarray
WO2003067904A2 (fr) 2002-02-06 2003-08-14 University Of North Carolina At Chapel Hill High-throughput method and apparatus for cell identification and isolation
JP4391527B2 (ja) 2003-06-12 2009-12-24 Cytyc Corporation System for organizing multiple objects of interest within a field of interest
US7776584B2 (en) 2003-08-01 2010-08-17 Genetix Limited Animal cell colony picking apparatus and method
US7310147B2 (en) 2005-01-27 2007-12-18 Genetix Limited Robotic apparatus for picking of cells and other applications with integrated spectroscopic capability
US7738730B2 (en) 2006-01-25 2010-06-15 Atalasoft, Inc. Method of image analysis using sparse hough transform
JP5507247B2 (ja) 2006-08-28 2014-05-28 Thermo Electron Scientific Instruments LLC Spectroscopic microscopy with image-driven analysis
CA2664135C (fr) 2006-09-22 2014-09-02 Rafael Backhaus Method and device for the automatic removal of cells and/or cell colonies
GB2461425B (en) 2007-03-02 2010-12-15 Life Technologies Corp Methods for selecting cells with enhanced growth and production properties
US8064678B2 (en) 2007-10-22 2011-11-22 Genetix Corporation Automated detection of cell colonies and coverslip detection using hough transforms
US8417011B2 (en) 2008-09-18 2013-04-09 Molecular Devices (New Milton) Ltd. Colony detection
EP2376964A1 (fr) 2008-12-19 2011-10-19 Abbott Laboratories Procédé et appareil de détection de lamelles couvre-objets pour lames de microscope
US8723154B2 (en) 2010-09-29 2014-05-13 Crossbar, Inc. Integration of an amorphous silicon resistive switching device
JP6021909B2 (ja) * 2011-07-21 2016-11-09 Brooks Automation, Inc. Method and apparatus for compensation of dimensional changes in cryogenic sample group holders
US8944001B2 (en) * 2013-02-18 2015-02-03 Nordson Corporation Automated position locator for a height sensor in a dispensing system
EP3140662B1 (fr) * 2014-05-08 2024-03-20 The Cleveland Clinic Foundation Systems and methods for detection, analysis, isolation and/or harvesting of biological objects

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN109416372A (zh) * 2016-07-21 2019-03-01 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
JP2019521358A (ja) * 2016-07-21 2019-07-25 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
EP3488250A4 (fr) * 2016-07-21 2019-11-13 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
US11162964B2 (en) 2016-07-21 2021-11-02 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
EP3922998A1 (fr) * 2016-07-21 2021-12-15 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
CN109416372B (zh) * 2016-07-21 2023-11-03 Siemens Healthcare Diagnostics Inc. Automated alignment of a testing system
CN110520738A (zh) * 2017-04-12 2019-11-29 General Automation Lab Technologies Inc. Apparatus and method for picking a biological sample
JP2020516290A (ja) * 2017-04-12 2020-06-11 General Automation Lab Technologies Inc. Apparatus and method for picking a biological sample
EP3610271A4 (fr) * 2017-04-12 2021-04-14 General Automation LAB Technologies Inc. Apparatus and method for picking a biological sample
JP7283846B2 (ja) 2017-04-12 2023-05-30 Isolation Bio Inc. Apparatus and method for picking a biological sample

Also Published As

Publication number Publication date
US20170227564A1 (en) 2017-08-10
EP3140662B1 (fr) 2024-03-20
US20200150143A1 (en) 2020-05-14
US11579160B2 (en) 2023-02-14
US10564172B2 (en) 2020-02-18
EP3140662A1 (fr) 2017-03-15
US20230184804A1 (en) 2023-06-15

Similar Documents

Publication Publication Date Title
US11579160B2 (en) Systems and methods for detection, analysis, isolation and/or harvesting of biological objects
KR101150444B1 (ko) Method and apparatus for the automated removal of cells and/or cell colonies
EP1873232B1 (fr) Micro-injection apparatus and automatic focusing method
EP2955502B1 (fr) Cell suction support system
JP2022116260A (ja) Automated collection of a specific number of cells
US20190258046A1 (en) System and method for performing automated analysis of air samples
JP6139403B2 (ja) Culture apparatus, culture apparatus system, culture operation management method, and program
US20130205920A1 (en) Automated visual pipetting
US8846379B2 (en) Vision based method for micromanipulating biological samples
JP6102166B2 (ja) Cardiomyocyte motion detection method, cardiomyocyte culture method, drug evaluation method, image processing program, and image processing device
JP7335168B2 (ja) Cell colony picking system
EP2967424B1 (fr) System and methods for processing a biopsy sample
EP2476746B1 (fr) Device for classifying various bacterial species
JP3241147U (ja) Apparatus for picking a biological sample
JP2015166829A (ja) Cell imaging control device, method, and program
JP5762764B2 (ja) Cell image analysis system
US20190195777A1 (en) Captured image evaluation apparatus, captured image evaluation method, and captured image evaluation program
WO2017130613A1 (fr) Observation device
JP7006833B2 (ja) Cell analysis device
US20160370283A1 (en) Method and system for characterizing a state of adhesion of particles such as cells
JP2017099405A (ja) Cardiomyocyte motion detection method, cardiomyocyte culture method, drug evaluation method, image processing program, and image processing device
CA3163902A1 (fr) Taxis analysis method, cancer evaluation method, taxis analysis system, and program
US20180285623A1 (en) Analysis-result browsing device
Liu et al. Automated microrobotic characterization of cell-cell communication
CN108270965A (zh) Imaging device for experiments, experiment system, control method therefor, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15727490

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015727490

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015727490

Country of ref document: EP