WO2018044683A1 - System and method for template-based image analysis - Google Patents

System and method for template-based image analysis

Info

Publication number
WO2018044683A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
template
images
binary
slide
Prior art date
Application number
PCT/US2017/048426
Other languages
French (fr)
Inventor
Paula Karen GEDRAITIS
Original Assignee
Molecular Devices, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Molecular Devices, Llc filed Critical Molecular Devices, Llc
Priority to JP2019531545A priority Critical patent/JP7068307B2/en
Priority to US16/328,358 priority patent/US11409094B2/en
Priority to CN201780052795.2A priority patent/CN109643453B/en
Priority to EP17847247.8A priority patent/EP3507768B1/en
Publication of WO2018044683A1 publication Critical patent/WO2018044683A1/en
Priority to US17/863,804 priority patent/US11960073B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/34Microscope slides, e.g. mounting specimens on microscope slides
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/32Fiducial marks and measuring scales within the optical system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B01PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01JCHEMICAL OR PHYSICAL PROCESSES, e.g. CATALYSIS OR COLLOID CHEMISTRY; THEIR RELEVANT APPARATUS
    • B01J2219/00Chemical, physical or physico-chemical processes in general; Their relevant apparatus
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N2021/6417Spectrofluorimetric devices
    • G01N2021/6419Excitation at two or more wavelengths
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/6428Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G01N2021/6439Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes" with indicators, stains, dyes, tags, labels, marks
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/251Colorimeters; Construction thereof
    • G01N21/253Colorimeters; Construction thereof for batch operation, i.e. multisample apparatus
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/645Specially adapted constructive features of fluorimeters
    • G01N21/6452Individual samples arranged in a regular 2D-array, e.g. multiwell plates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/645Specially adapted constructive features of fluorimeters
    • G01N21/6456Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458Fluorescence microscopy
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/6486Measuring fluorescence of biological material, e.g. DNA, RNA, cells
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30072Microarray; Biochip, DNA array; Well plate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Definitions

  • the present subject matter relates to high-content imaging systems and more particularly, to a system and method to use a template to align images captured by such systems.
  • a high-content imaging system may be used to obtain a microscopy image of a biological sample such as DNA, proteins, cells, and the like.
  • a plurality of such biological samples may be disposed on a slide.
  • Such biological samples may be disposed on the slide in a two-dimensional array pattern, wherein the distance between adjacent locations where biological samples may be deposited is predetermined. However, the location and rotation of the aggregate array may vary from slide to slide.
  • the slide on which the array of biological samples is disposed may be prepared by printing indicia thereon that serve as fiducial or reference marks.
  • the array of biological samples is then deposited on the slide relative to such indicia.
  • the position of each biological sample may be located relative to such fiducial or reference marks.
  • a template-based system to analyze an image includes a high-content imaging system (HCIS) and a slide loaded in the HCIS.
  • the slide includes a plurality of reference marks and a plurality of sample locations.
  • the template-based system also includes an acquired images data store having previously acquired images stored therein, an image acquisition module, a feature identification module, a template generation module, a template alignment module, an offset calculation module, and an image segmentation module.
  • the image acquisition module acquires an image of the slide from the high-content imaging system
  • the feature identification module develops a binary image of the acquired image, wherein the binary image identifies areas associated with reference marks.
  • the template generation module generates a template in accordance with the previously acquired images, the template alignment module aligns the generated template with the binary image, and the offset calculation module determines an offset between the template and the binary image.
  • the image segmentation module determines the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
  • a template-based method for analyzing an image includes disposing reference marks and a plurality of samples on a slide and loading the slide in a high-content imaging system (HCIS). The position of each of the plurality of samples is in accordance with the positions of the reference marks.
  • the method also includes storing previously acquired images in an acquired images data store, acquiring with the HCIS an image of the slide loaded in the HCIS, and developing a binary image of the acquired image.
  • the binary image identifies areas associated with reference marks.
  • the method includes the additional steps of generating a template in accordance with the previously acquired images, aligning the template with the binary image, and calculating an offset between the template and the binary image.
  • the method includes the further step of determining the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
  • FIG. 1 is a block diagram of a high-content imaging system (HCIS) in accordance with the present invention
  • FIG. 2 is a top plan view of a slide that may be imaged using the HCIS of FIG. 1;
  • FIGS. 3A-3C illustrate images of slides developed using the HCIS of FIG. 1;
  • FIG. 4 illustrates a template developed from the images of FIGS. 3A-3C;
  • FIG. 5 is a block diagram of a template-based analysis system that may be used to analyze images captured by the HCIS of FIG. 1;
  • FIG. 6 is a flowchart of processing undertaken by the template-based analysis system of FIG. 5 to analyze an image
  • FIG. 7 is a flowchart of processing undertaken by the template-based analysis system of FIG. 5 to develop a template.
  • an HCIS 100 may include an X-Y stage 102, one or more objective lenses 104, one or more illumination sources 106, one or more filters 108, an image capture device 110, and a controller 112.
  • the HCIS 100 may also include one or more mirrors 114 that direct light from the illumination source 106 to a slide 116 that may be disposed on the X-Y stage 102, and from such slide 116 to the image capture device 110.
  • the slide 116 includes evenly spaced locations 118, and samples (for example, biological cells) to be imaged by the HCIS 100 may be disposed at each such location 118.
  • FIG. 1 shows the light from the illumination source 106 reflected from slide 116 reaching the image capture device 110
  • additional mirrors may be used so that light from the illumination source 106 is transmitted through the slide 116 and directed toward the image capture device 110.
  • no illumination from the illumination source 106 may be necessary to image the samples in the slide 116 (for example, if the samples emit light or if the samples include radioactive components).
  • light from the illumination source may be transmitted through the samples in the slide 116, and the samples refract and/or absorb the transmitted light to produce light that is imaged.
  • the slide 116 may be placed, either manually or robotically, on the X-Y stage 102.
  • the controller 112 may configure the HCIS 100 to use a combination of a particular objective lens 104, illumination generated by the illumination source 106, and/or filter 108.
  • the controller 112 may operate a positioning device 120 to place a selected objective lens 104 and, optionally, a selected filter 108 in the light path between the slide 116 and the image capture device 110.
  • Such positioning device 120 may include one or more motors coupled to the objective lens 104, the selected filter 108, and/or the image capture device 110.
  • the controller 112 may also direct the illumination source 106 to illuminate the slide 116 with particular wavelengths of light.
  • the samples in the slide 116 may contain molecules that fluoresce, either naturally occurring molecules, or molecules produced or present within the samples due to treatment.
  • the wavelengths illuminating the sample may be the excitation wavelengths associated with such fluorescent molecules, and the image capture device will capture only the emission spectrum of such fluorescent materials.
  • One or more wavelengths may be used serially or simultaneously to illuminate the same samples and produce images.
  • a slide 116 includes evenly spaced locations 118 at which samples may be disposed.
  • the locations 118 are arranged in a two-dimensional array 150, having a predetermined number of horizontal and vertical locations 118, with the distance between adjacent pairs of the locations 118 being substantially identical.
  • the slide 116 also includes reference marks 152 and 154. Some reference marks 152 are disposed on the slide to be coincident with a particular location 118 of the array 150.
  • Other reference marks 154 are placed outside the bounds of the array 150 at predetermined horizontal and vertical distances 156 and 158, respectively, from a particular location 160 associated with the array 150. Although in FIG. 2 the horizontal and vertical distances 156 and 158, respectively, are measured from the centers of the reference marks 154, it should be apparent that such distances 156 and 158 may be measured from any other portion of the reference mark 154. Further, the horizontal and vertical distances 156 and 158 associated with a particular reference mark 154 need not be identical to the horizontal and vertical distances associated with a different reference mark.
  • the horizontal distance 156a associated with the reference mark 154a need not be identical to one or both of the horizontal distances 156b and 156c associated with the reference marks 154b and 154c, respectively.
  • the vertical distance 158a associated with the reference mark 154a need not be identical to one or both of the vertical distances 158b and 158c associated with the reference marks 154b and 154c, respectively.
  • the array 150 may be rotated relative to the slide 116. In such cases, the horizontal and vertical distances 156 and 158, respectively, are measured relative to the horizontal and vertical axes of the array 150.
  • Although the reference marks 152 and 154 are depicted as circular, it should be apparent that these reference marks 152 and 154 may have any predetermined shape. Further, all of the reference marks 152 and 154 disposed on the slide 116 need not have identical shapes and/or sizes.
  • the reference marks 152 and 154 may be disposed on slide 116 by being imprinted on, attached to, raised from, or etched into the slide 116. Further, all of the reference marks 152 and 154 disposed on the slide 116 may be disposed using different techniques. For example, some reference marks 152 and 154 may be printed and others may be etched.
  • the coordinates of the pixels in the image of the slide 116 associated with all of the locations 118 may also be determined.
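The determination described above reduces to simple grid arithmetic: given the pixel coordinates of a single reference point (such as location 160) and the predetermined spacing between adjacent locations 118, every sample location follows. A minimal sketch in Python; the function name and parameters are illustrative, not taken from the specification:

```python
def sample_location_coords(origin_xy, spacing, n_cols, n_rows):
    """Return the pixel coordinates of every sample location in the array,
    given the coordinates of the reference point and the predetermined
    spacing between adjacent locations."""
    ox, oy = origin_xy
    return [(ox + i * spacing, oy + j * spacing)
            for j in range(n_rows) for i in range(n_cols)]
```

For example, a 3-by-2 array anchored at pixel (10, 20) with a 5-pixel pitch yields six coordinates starting at (10, 20).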
  • FIGS. 3A-3C depict images 200a, 200b, and 200c of three different slides; each slide includes an array of sample locations and reference marks positioned relative to one another identically to the slide 116 shown in FIG. 2. As should be apparent, the images 200a, 200b, and 200c show biological samples having been deposited in only some of the sample locations of each slide.
  • the image 200a includes pixels 202a that identify all of the reference marks 152, and pixels 204a, 206a, and 208a that identify the reference marks 154a, 154b, and 154c, respectively, of the slide from which this image was created.
  • the image 200b includes pixels 202b that identify the reference marks 152, and pixels 204b, 206b, and 208b that identify the reference marks 154a, 154b, and 154c, respectively, of the slide from which this image was created.
  • the image 200c includes pixels 202c that identify only three of the reference marks 152 (with no pixels that show the reference mark 152 at the top-right of the array 150), and only pixels 206c and 208c that identify the reference marks 154b and 154c of the slide from which this image was made.
  • Each of the images 200a, 200b, and 200c of FIGS. 3A-3C includes pixels 210a, 210b, and 210c associated with the location of point 160 in the slides from which the images 200a, 200b, and 200c, respectively, were created. Note, the pixels 210a, 210b, and 210c may be indistinguishable from the backgrounds of the images 200a, 200b, and 200c, respectively. However, once the coordinates of the pixels 210a, 210b, and 210c are determined, the coordinates of the pixels in the images 200a, 200b, and 200c, respectively, associated with the sample locations 118 in the slides associated with these images can also be determined.
  • a template image 250 is created from the images 200a, 200b, and 200c.
  • the template image 250 includes the features common to all of the images 200a, 200b, and 200c that were used to create it.
  • Such common features may include portions of the image associated with reference marks 152 on the slide 116 used to create the images 200, but also any other image elements.
  • image elements may include, for example, samples deposited in one or more locations 118 of the slide 116 that appear in all of the images.
  • the pixels at coordinates corresponding to the locations of those reference marks, for example, the top-right reference mark 152 and the reference mark 154a, that were not shown in all of the images 200 are set to the background intensity value.
  • reference marks are identified in each image 200a, 200b, and 200c using image segmentation and image feature identification techniques.
  • a threshold is applied to each image 200 to set the intensity value of any pixel of the image that has an intensity below that expected for a reference mark to a background intensity value.
  • the intensity value of any pixel of the image that has an intensity value above that expected for a reference mark is set to a background intensity value.
  • each adjacent group of pixels in the image 200 that have non-background intensity values is analyzed. If the number of pixels that comprise the group, or the dimensions and/or shape of the group, is different from what would be expected for a reference mark, then the intensity values of the pixels that comprise the group are set to the background intensity. Otherwise, the intensity value of the group of pixels is set to the non-background intensity.
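The band threshold and group-size test described in the bullets above can be sketched as follows. This is an illustrative implementation, not the patented one; the intensity band, the area bounds, and the 4-connectivity rule are assumptions (images are lists of rows of intensities):

```python
def binarize_reference_marks(image, lo, hi, min_area, max_area):
    """Band-threshold the image, then keep only 4-connected groups of
    candidate pixels whose pixel count matches that expected for a
    reference mark; kept groups become non-background (1), all other
    pixels become background (0)."""
    h, w = len(image), len(image[0])
    candidate = [[lo <= image[y][x] <= hi for x in range(w)] for y in range(h)]
    binary = [[0] * w for _ in range(h)]
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if candidate[y][x] and not seen[y][x]:
                # flood-fill one adjacent group of candidate pixels
                group, stack = [], [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    group.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and candidate[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if min_area <= len(group) <= max_area:  # plausible mark size
                    for gy, gx in group:
                        binary[gy][gx] = 1
    return binary
```

A group that is too small (e.g. a single noise pixel) or too large is rejected, exactly as the size/shape test above prescribes.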
  • each image 200a, 200b, and 200c includes pixels that have non-background intensity values and that are associated with candidate reference marks.
  • the pixels associated with candidate reference marks in the binary images created from the images 200a, 200b, and 200c are analyzed across the binary images to create the template image 250.
  • edge detection techniques apparent to those who have skill in the art may be used to identify substantially contiguous boundaries of image features in the binary image.
  • the pixels bounded by such contiguous boundaries may be set to the non-background intensity, and the shape and/or dimensions of such pixels may be analyzed as described in the foregoing to determine whether such pixels are associated with reference marks.
  • a template image 250 is created from the binary images developed from the different images 200. All of the pixels of the template image 250 are initially set to a background intensity value. Thereafter, the binary images are registered with one another, and each corresponding group of pixels that has a non-background intensity value, is present in all of the binary images, and is coincident after registration is identified. The intensity values of the pixels in the template image 250 corresponding to the pixels of each such identified group are set to the non-background intensity value.
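Once the binary images are registered, the template construction described above amounts to a pixel-wise intersection. A minimal sketch, under the assumption that registration has already been performed:

```python
def build_template(binary_images):
    """Keep only pixels that are non-background in every registered binary
    image; all remaining template pixels stay at the background value (0)."""
    h, w = len(binary_images[0]), len(binary_images[0][0])
    return [[1 if all(b[y][x] for b in binary_images) else 0
             for x in range(w)] for y in range(h)]
```

Reference marks missing from any one binary image thus drop out of the template, matching the behavior described for the top-right reference mark 152.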
  • the template image 250 developed as described above is then used to identify the reference marks in the images 200, as well as in other images captured by the HCIS 100 from additional slides 116 that have an identical relationship between one or more of the reference marks 152 and the array 150 of sample locations 118.
  • the template image 250 is aligned with the binary image created from the image 200a so that the non-background pixels 252, 254, and 256 of the template image are in register with corresponding non-background pixels of the binary image.
  • the number of pixels (x_offset) that the template image 250 must be shifted horizontally and the number of pixels (y_offset) that the template image 250 must be shifted vertically for the two images to be aligned may then be determined.
  • Such offsets (x_offset, y_offset) may be added to the coordinates (x_t, y_t) of the pixel 258 (FIG. 4) to determine the coordinates (x_a, y_a) of the pixel 210a (FIG. 3A) relative to the left and bottom boundaries 212a and 214a, respectively, of the image 200a.
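One plausible way to obtain the offsets described above is an exhaustive search over small shifts, scoring each shift by the overlap of non-background pixels. This brute-force sketch is illustrative only; the specification does not mandate a particular alignment algorithm:

```python
def find_offset(template, binary, max_shift=3):
    """Try small horizontal/vertical shifts of the template and return the
    (x_offset, y_offset) giving the greatest overlap between its
    non-background pixels and those of the binary image."""
    h, w = len(binary), len(binary[0])
    best, best_score = (0, 0), -1
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # count pixels where the shifted template and the binary image
            # are both non-background
            score = sum(
                1
                for y in range(h) for x in range(w)
                if 0 <= y - dy < h and 0 <= x - dx < w
                and binary[y][x] and template[y - dy][x - dx]
            )
            if score > best_score:
                best, best_score = (dx, dy), score
    return best
```

The returned (x_offset, y_offset) would then be added to (x_t, y_t) to obtain (x_a, y_a), as described above.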
  • the position of the array 150, and thereby the position of each of the locations 118 where samples may be deposited, may be determined from the location 160. It should be apparent that once the coordinates of the pixel 210a of the image 200a are determined as described above, the locations of the pixels in the image 200a that are associated with each of the locations 118 may also be determined. Sub-images of the image 200a that are associated with each location 118 may be analyzed to determine the features of biological samples deposited in such location 118. Such sub-images may also be displayed to a user for visual analysis or transmitted to another system (not shown) for further analysis.
  • the image 200a may be acquired at a low resolution.
  • the offsets (x_offset, y_offset) of the pixel 210a in the image 200a may be calculated as described above and provided to the controller 112.
  • the controller 112 scales the offsets in accordance with a target, higher resolution at which the slide will be reimaged, calculates the positions on the slide of the sample locations 118, and operates the position device 120 in accordance with the calculated positions to scan at the higher resolution each location 118 of the slide.
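The scaling performed by the controller 112 is straightforward arithmetic; a sketch under the assumption that the ratio of the target resolution to the preview resolution is a single scalar `scale` (the function name and parameters are illustrative):

```python
def scale_for_rescan(offset_xy, spacing, scale, n_cols, n_rows):
    """Scale the low-resolution offset and grid spacing by the resolution
    ratio, then list the position at which each sample location should be
    scanned at the higher resolution."""
    ox, oy = offset_xy[0] * scale, offset_xy[1] * scale
    step = spacing * scale
    return [(ox + i * step, oy + j * step)
            for j in range(n_rows) for i in range(n_cols)]
```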
  • the coordinates (x_b, y_b) of the pixel 210b and (x_c, y_c) of the pixel 210c may be calculated in a manner similar to that described above to calculate the coordinates (x_a, y_a).
  • one or more of the images 200a, 200b, and 200c may include an image of an array 150 that is rotated relative to the others.
  • an angle of rotation in addition to a positional offset may be determined.
  • One or more image analysis techniques apparent to those who have skill in the art may be used to determine such angle of rotation, including cross-correlation, frequency analysis, and searching through various angles of rotation.
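As a toy illustration of the rotation search mentioned above, the sketch below tries quarter-turn rotations only; a real implementation would search finer angles (for example via cross-correlation), which the specification leaves open:

```python
def rotate90(image, times):
    """Rotate a binary image (list of rows) counter-clockwise by 90 degrees,
    `times` times."""
    for _ in range(times % 4):
        image = [list(row) for row in zip(*image)][::-1]
    return image

def find_rotation(template, binary, quarter_turns=(0, 1, 2, 3)):
    """Search candidate rotations and return the quarter-turn count whose
    rotated template best overlaps the binary image."""
    best_turns, best_score = 0, -1
    for t in quarter_turns:
        r = rotate90(template, t)
        # score: count of positions that are non-background in both images
        score = sum(rv and bv for rrow, brow in zip(r, binary)
                    for rv, bv in zip(rrow, brow))
        if score > best_score:
            best_turns, best_score = t, score
    return best_turns
```

The same score-and-search pattern generalizes to arbitrary angles once an interpolating rotation is substituted for `rotate90`.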
  • a template-based image analysis system 300 that uses templates to analyze images from an HCIS includes an image acquisition module 302 coupled to the HCIS 100.
  • the image acquisition module 302 uses the HCIS 100 to obtain images of a plurality of slides 116 loaded in the HCIS 100 and stores such images in an acquired images data store 304.
  • Each of the plurality of slides has an array 150 of sample locations 118 distributed in a two-dimensional array pattern thereon.
  • a feature identification module 306 analyzes each acquired image to generate a binary image.
  • the feature identification module 306 sets the intensity values of pixels of the binary image that are expected to be associated with reference marks on the slide 116 to a non-background intensity value as described above.
  • the feature identification module 306 stores the binary image in the acquired images data store 304 and provides the binary image to a template generation module 308.
  • a template generation module 308 analyzes the binary images generated from the images of the plurality of slides and identifies pixels associated with reference marks that are common to all of the binary images. The template generation module 308 thereafter generates a template image in which only those pixels associated with such common reference marks have a non-background intensity value. The remaining pixels have a background intensity value. The template generation module 308 stores the template in a template data store 310.
  • a template alignment module 312 retrieves from the image data store 304 an image to be analyzed and the binary image generated therefrom, and from the template data store 310 a template associated with the image to be analyzed.
  • the template alignment module 312 then aligns the pixels of the template image with the retrieved binary image so that corresponding pixels associated with reference marks in each image overlap.
  • An offset calculation module 316 develops horizontal and vertical offsets and a rotation of the array 150 represented in the image relative to the boundaries of the image.
  • An image segmentation module 318 uses these offsets to segment the image to be analyzed into sub-images, wherein each sub-image is associated with a location 118 on the slide 116 where a sample may be deposited.
  • a user may operate a user computer 320 to select and further analyze these sub-images.
  • such sub-images may be transmitted to a further system for analysis.
  • the analyzed image, and the pixel coordinate information about such sub-images may be transmitted to a further system, and the further system may use such pixel coordinate information to extract a sub-image from the analyzed image.
  • Such pixel coordinate information for each sub-image may comprise, for example, the pixel coordinate of opposite corners of a rectangular region of the analyzed image where the sub-image is located.
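Extracting a sub-image from the opposite-corner pixel coordinates described above can be sketched as follows (the (x, y) corner convention and tolerance of either corner ordering are assumptions):

```python
def extract_subimage(image, corner_a, corner_b):
    """Extract the rectangular sub-image defined by the (x, y) pixel
    coordinates of two opposite corners, in whichever order the corners
    are given."""
    (x0, y0), (x1, y1) = corner_a, corner_b
    left, right = sorted((x0, x1))
    low, high = sorted((y0, y1))
    # rows are indexed by y, columns by x; bounds are inclusive
    return [row[left:right + 1] for row in image[low:high + 1]]
```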
  • FIG. 6 shows a flowchart 400 of the steps undertaken by an embodiment of a template-based image analysis system to analyze images.
  • the image acquisition module 302 acquires an image of a slide 116 using the HCIS 100 and stores such image in the acquired images data store 304.
  • the template- based image analysis system 300 determines if a template already exists to analyze the acquired image. For example, the template-based image analysis system 300 may query the user via the user computer 320 to identify a previously created template stored in the template data store 310, for example, by presenting a list of such stored templates and asking the user to select a template from the list.
  • the template-based image analysis system 300 may query the user to select or enter an identifier associated with the scanned slide for which a template has already been created and stored in the template data store 310. The template-based image analysis system 300 may then identify a template from the template data store 310 in accordance with such identifier. If, at step 404, the user indicates that the template does not exist, processing proceeds to step 406, otherwise processing proceeds to step 408.
  • the template-based image analysis system 300 develops a template that may be used to analyze the image acquired at step 402 and stores the developed template in the template data store 310. Thereafter, processing proceeds to step 408. At step 408, the template-based image analysis system 300 retrieves the template from the template data store 310.
  • the feature identification module 306 analyzes the acquired image to develop a binary image that identifies pixels of the acquired image that are expected to be associated with reference marks.
  • the feature identification module 306 stores such binary image in the acquired images data store 304.
  • the template alignment module 312 aligns the pixels associated with reference marks in the selected template with pixels associated with corresponding reference marks in the binary image created by the feature identification module 306.
  • the offset calculation module 316 determines the offset and rotation that were necessary to align the template with the binary image at step 412.
  • the template-based image analysis system 300 queries the user via the user computer 320 whether the sample locations 118 of the slide 116 should be rescanned at high resolution. If the user requests rescanning, the image acquisition module 302, at step 418, uses the calculated offsets to determine position information associated with the sample locations 118 on the slide 116 and, at step 420, directs the controller 112 to position the positioning device 120 in accordance with such position information and operate the image capture device 110 to acquire a high-resolution sub-image of the slide 116 in accordance with each such location 118. The image acquisition module 302, also at step 420, stores these sub-images in the acquired images data store 304.
  • the image segmentation module 318 uses the calculated offsets to determine the coordinates in the acquired image of pixels associated with each sample location 118 of the slide 116.
  • the template-based image analysis system 300 supplies the position information or scanned sub-images to the user computer or to another system for further analysis.
  • FIG. 7 shows a flowchart 450 of the steps undertaken by an embodiment of the template-based image analysis system 300 at step 406 (FIG. 6) to generate a template.
  • the template-based image analysis system 300 asks the user via the user computer 320 to identify previously acquired images in the acquired images data store 304 that should be used to develop the template. The user may be asked to select one or more acquired images from a list of the images available. Alternately, the user may enter or select an identifier associated with the slide 116 from which the image was acquired at step 402, and the system 300 selects all images associated with such identifier from the acquired images data store 304.
  • the feature identification module 306 applies a threshold to each selected image to generate a binary image corresponding thereto.
  • the feature identification module 306 analyzes each binary image to identify pixels in such image that correspond to reference marks.
  • the feature identification module 306, also at step 456, sets such identified pixels to a non-background intensity value and sets all other pixels to a background intensity value.
  • the template generation module 308 registers all of the binary images as described above and, at step 460, identifies pixels that are associated with features that are common to all of the binary images.
  • the template generation module creates a template image in which pixels corresponding to pixels of the binary images associated with features common to all of the binary images are set to a non-background intensity value and the remaining pixels are set to the background intensity value.
  • the template generation module 308 stores the template image in the template data store 310.
  • any combination of hardware and/or software may be used to implement the template-based image analysis described herein. It will be understood and appreciated that one or more of the processes, sub-processes, and process steps described in connection with FIGS. 1-7 may be performed by hardware, software, or a combination of hardware and software on one or more electronic or digitally-controlled devices.
  • the software may reside in a software memory (not shown) in a suitable electronic processing component or system such as, for example, one or more of the functional systems, controllers, devices, components, modules, or sub-modules schematically depicted in FIGS. 1-7.
  • the software memory may include an ordered listing of executable instructions for implementing logical functions (that is, "logic" that may be implemented in digital form such as digital circuitry or source code, or in analog form such as an analog source such as an analog electrical, sound, or video signal).
  • the instructions may be executed within a processing module or controller (e.g., the image acquisition module 302, the feature identification module 306, the template generation module 308, the template alignment module 314, offset calculation module 316, and the image segmentation module 318 of FIG. 5), which includes, for example, one or more microprocessors, general purpose processors, combinations of processors, digital signal processors (DSPs), field programmable gate arrays (FPGAs), or application-specific integrated circuits (ASICs).
  • schematic diagrams describe a logical division of functions having physical (hardware and/or software) implementations that are not limited by architecture or the physical layout of the functions.
  • the example systems described in this application may be implemented in a variety of configurations and operate as hardware/software components in a single hardware/software unit, or in separate hardware/software units.
  • the executable instructions may be implemented as a computer program product having instructions stored therein which, when executed by a processing module of an electronic system, direct the electronic system to carry out the instructions.
  • the computer program product may be selectively embodied in any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as an electronic computer-based system, processor-containing system, or other system that may selectively fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • computer-readable storage medium is any non-transitory means that may store the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the non-transitory computer-readable storage medium may selectively be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device.
  • a non-exhaustive list of more specific examples of non-transitory computer-readable media includes: an electrical connection having one or more wires (electronic); a portable computer diskette (magnetic); a random access, i.e., volatile, memory (electronic); a read-only memory (electronic); an erasable programmable read only memory such as, for example, Flash memory (electronic); a compact disc memory such as, for example, CD-ROM, CD-R, CD-RW (optical); and digital versatile disc memory, i.e., DVD (optical).
  • receiving and transmitting of signals or data means that two or more systems, devices, components, modules, or sub-modules are capable of communicating with each other via signals that travel over some type of signal path.
  • the signals may be communication, power, data, or energy signals, which may communicate information, power, or energy from a first system, device, component, module, or sub-module to a second system, device, component, module, or sub-module along a signal path between the first and second system, device, component, module, or sub-module.
  • the signal paths may include physical, electrical, magnetic, electromagnetic, electrochemical, optical, wired, or wireless connections.
  • the signal paths may also include additional systems, devices, components, modules, or sub-modules between the first and second system, device, component, module, or sub-module.


Abstract

A template-based image analysis system and method are disclosed. A slide is loaded in a high-content imaging system (HCIS), wherein the slide includes a plurality of reference marks and a plurality of sample locations. An acquired images data store has previously acquired images stored therein. An image acquisition module acquires an image of the slide from the high-content imaging system. A feature identification module develops a binary image of the acquired image, wherein the binary image identifies areas associated with reference marks. A template generation module generates a template in accordance with the previously acquired images, and a template alignment module aligns the generated template with the binary image. An offset calculation module determines an offset between the template and the binary image, and an image segmentation module determines the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.

Description

SYSTEM AND METHOD FOR TEMPLATE-BASED IMAGE ANALYSIS
RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No. 62/381974 filed on August 31, 2016, the content of which is incorporated herein by reference in its entirety.
FIELD OF DISCLOSURE
[0002] The present subject matter relates to high-content imaging systems and more particularly, to a system and method to use a template to align images captured by such systems.
BACKGROUND
[0001] A high-content imaging system (HCIS) may be used to obtain a microscopy image of a biological sample such as DNA, proteins, cells, and the like. A plurality of such biological samples may be disposed on a slide. Such biological samples may be disposed on the slide in a two-dimensional array pattern, wherein the distance between adjacent locations where biological samples may be deposited is predetermined. However, the location and rotation of the aggregate array may vary from slide to slide.
[0002] The slide on which the array of biological samples is disposed may be prepared by printing indicia thereon that serve as fiducial or reference marks. The array of biological samples is then deposited on the slide relative to such indicia. The position of each biological sample may be located relative to such fiducial or reference marks. However, it may be difficult to identify one or more reference mark(s) in an image of the slide on which the mark is disposed. Such difficulty may arise because of, for example, a lack of contrast between the reference mark and the slide material, defects in the printing process that disposed the reference mark on the slide, and/or too large a distance between the reference mark and the array to fit in the field of view of the HCIS.
SUMMARY
[0003] A template-based system to analyze an image includes a high-content imaging system (HCIS) and a slide loaded in the HCIS. The slide includes a plurality of reference marks and a plurality of sample locations. The template-based system also includes an acquired images data store having previously acquired images stored therein, an image acquisition module, a feature identification module, a template generation module, a template alignment module, an offset calculation module, and an image segmentation module. The image acquisition module acquires an image of the slide from the high-content imaging system, and the feature identification module develops a binary image of the acquired image, wherein the binary image identifies areas associated with reference marks. The template generation module generates a template in accordance with the previously acquired images, the template alignment module aligns the generated template with the binary image, and the offset calculation module determines an offset between the template and the binary image. The image segmentation module determines the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
[0004] A template-based method for analyzing an image includes disposing reference marks and a plurality of samples on a slide and loading the slide in a high-content imaging system (HCIS). The position of each of the plurality of samples is in accordance with the positions of the reference marks. The method also includes storing previously acquired images in an acquired images data store, acquiring with the HCIS an image of the slide loaded in the HCIS, and developing a binary image of the acquired image. The binary image identifies areas associated with reference marks. The method includes the additional steps of generating a template in accordance with the previously acquired images, aligning the template with the binary image, and calculating an offset between the template and the binary image. The method includes the further step of determining the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
[0005] Other aspects and advantages will become apparent upon consideration of the following detailed description and the attached drawings wherein like numerals designate like structures throughout the specification.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1 is a block diagram of a high-content imaging system (HCIS) in accordance with the present invention;
[0007] FIG. 2 is a top plan view of a slide that may be imaged using the HCIS of FIG. 1;
[0008] FIGS. 3A-3C illustrate images of slides developed using the HCIS of FIG. 1;
[0009] FIG. 4 illustrates a template developed from the images of FIGS. 3A-3C;
[0010] FIG. 5 is a block diagram of a template-based analysis system that may be used to analyze images captured by the HCIS of FIG. 1;
[0011] FIG. 6 is a flowchart of processing undertaken by the template-based analysis system of FIG. 5 to analyze an image; and
[0012] FIG. 7 is a flowchart of processing undertaken by the template-based analysis system of FIG. 5 to develop a template.
DETAILED DESCRIPTION
[0013] Referring to FIG. 1, as will be apparent to those who have skill in the art, an HCIS 100 may include an X-Y stage 102, one or more objective lenses 104, one or more illumination sources 106, one or more filters 108, an image capture device 110, and a controller 112. The HCIS 100 may also include one or more mirrors 114 that direct light from the illumination source 106 to a slide 116 that may be disposed on the X-Y stage 102, and from such slide 116 to the image capture device 110. Typically, the slide 116 includes evenly spaced locations 118, and samples (for example, biological cells) to be imaged by the HCIS 100 may be disposed at each such location 118.
[0014] Although FIG. 1 shows the light from the illumination source 106 reflected from the slide 116 reaching the image capture device 110, it should be apparent that additional mirrors (not shown) may be used so that light from the illumination source 106 is transmitted through the slide 116 and directed toward the image capture device 110. Further, it should be apparent that in some cases no illumination from the illumination source 106 may be necessary to image the samples in the slide 116 (for example, if the samples emit light or if the samples include radioactive components). In some embodiments, light from the illumination source may be transmitted through the samples in the slide 116, and the samples refract and/or absorb the transmitted light to produce light that is imaged.
[0015] During operation, the slide 116 may be placed, either manually or robotically, on the X-Y stage 102. In addition, the controller 112 may configure the HCIS 100 to use a combination of a particular objective lens 104, illumination generated by the illumination source 106, and/or filter 108. For example, the controller 112 may operate a positioning device 120 to place a selected objective lens 104 and, optionally, a selected filter 108 in the light path between the slide 116 and the image capture device 110. Such positioning device 120 may include one or more motors coupled to the objective lens 104, the selected filter 108, and/or the image capture device 110. The controller 112 may also direct the illumination source 106 to illuminate the slide 116 with particular wavelengths of light. The samples in the slide 116 may contain molecules that fluoresce, either naturally occurring molecules or molecules produced or present within the samples due to treatment. The wavelengths illuminating the sample may be the excitation wavelengths associated with such fluorescent molecules, and the image capture device will capture only the emission spectrum of such fluorescent materials. One or more wavelengths may be used serially or simultaneously to illuminate the samples and produce images.
[0016] Referring to FIG. 2, a slide 116 includes evenly spaced locations 118 at which samples may be disposed. The locations 118 are arranged in a two-dimensional array 150 having a predetermined number of horizontal and vertical locations 118, with the distance between adjacent pairs of the locations 118 being substantially identical. The slide 116 also includes reference marks 152 and 154. Some reference marks 152 are disposed on the slide to be coincident with a particular location 118 of the array 150.
[0017] Other reference marks 154 are placed outside the bounds of the array 150 at predetermined horizontal and vertical distances 156 and 158, respectively, from a particular location 160 associated with the array 150. Although in FIG. 2 the horizontal and vertical distances 156 and 158, respectively, are measured from the centers of the reference marks 154, it should be apparent that such distances 156 and 158 may be measured from any other portion of the reference mark 154. Further, the horizontal and vertical distances 156 and 158 associated with a particular reference mark 154 need not be identical to the horizontal and vertical distances associated with a different reference mark. For example, the horizontal distance 156a associated with the reference mark 154a need not be identical to one or both of the horizontal distances 156b and 156c associated with the reference marks 154b and 154c, respectively. Similarly, the vertical distance 158a associated with the reference mark 154a need not be identical to one or both of the vertical distances 158b and 158c associated with the reference marks 154b and 154c, respectively.
[0018] The array 150 may be rotated relative to the slide 116. In such cases, the horizontal and vertical distances 156 and 158, respectively, are measured relative to the horizontal and vertical axes of the array 150.
[0019] Although the reference marks 152 and 154 are depicted as circular, it should be apparent that these reference marks 152 and 154 may have any predetermined shape. Further, all of the reference marks 152 and 154 disposed on the slide 116 do not have to have identical shapes and/or sizes. The reference marks 152 and 154 may be disposed on slide 116 by being imprinted on, attached to, raised from, or etched into the slide 116. Further, all of the reference marks 152 and 154 disposed on the slide 116 may be disposed using different techniques. For example, some reference marks 152 and 154 may be printed and others may be etched.
[0020] It should be apparent that, in an image of the slide 116, if the coordinates of a pixel associated with one of the reference marks 152 and 154 can be determined, then the coordinates of the pixel of the location 160 associated with the array 150 may be calculated therefrom. The pixel coordinates of additional reference marks 152 and 154 may improve the accuracy with which the pixel coordinates of the location 160 are calculated. The coordinates of the pixel associated with a reference mark 152, 154 may be the coordinates of a pixel associated with the center of such reference mark or the coordinates of any other predetermined feature of such reference mark.
[0021] Further, after the coordinates of the pixel associated with the location 160 are determined, the coordinates of the pixels in the image of the slide 116 associated with all of the locations 118 may also be determined.
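By way of illustration only, the relationship just described — once the pixel coordinates of the location 160 are known, the pixel coordinates of every evenly spaced location 118 follow from the array geometry — can be sketched as below. The function name, grid dimensions, and spacing value are illustrative assumptions, not part of the disclosed embodiment.

```python
# Illustrative sketch: enumerate the pixel coordinates of every location 118
# of an evenly spaced two-dimensional array, given the pixel coordinates of
# the array origin (location 160) and the fixed spacing between locations.

def sample_location_pixels(origin_xy, n_cols, n_rows, spacing_px):
    """Return (x, y) pixel coordinates for each location, row by row."""
    ox, oy = origin_xy
    return [
        (ox + col * spacing_px, oy + row * spacing_px)
        for row in range(n_rows)
        for col in range(n_cols)
    ]

# A 4x3 array whose origin pixel is (120, 80), spaced 50 pixels apart.
locations = sample_location_pixels((120, 80), n_cols=4, n_rows=3, spacing_px=50)
print(locations[0], locations[-1])  # first and last grid positions
```

Because the spacing is predetermined, only the origin (and, where applicable, a rotation) must be recovered from the image; everything else is arithmetic.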
[0022] FIGS. 3A-3C depict images 200a, 200b, and 200c of three different slides, each of which includes an array of sample locations and reference marks positioned relative to one another identically to those of the slide 116 shown in FIG. 2. As should be apparent, the images 200a, 200b, and 200c show biological samples having been deposited in only some of the sample locations of each slide.
[0023] The image 200a includes pixels 202a that identify all of the reference marks 152, and pixels 204a, 206a, and 208a that identify the reference marks 154a, 154b, and 154c, respectively, of the slide from which this image was created. Similarly, the image 200b includes pixels 202b that identify the reference marks 152, and pixels 204b, 206b, and 208b that identify the reference marks 154a, 154b, and 154c, respectively, of the slide from which this image was created. However, the image 200c includes pixels 202c that identify only three of the reference marks 152 (with no pixels that show the reference mark 152 at the top-right of the array 150), and only pixels 206c and 208c that identify the reference marks 154b and 154c of the slide from which this image was made.
[0024] Each of the images 200a, 200b, and 200c includes pixels 210a, 210b, and 210c, respectively, associated with the location of the point 160 in the slides from which the images were created. Note that the pixels 210a, 210b, and 210c may be indistinguishable from the backgrounds of the images 200a, 200b, and 200c, respectively. However, once the coordinates of the pixels 210a, 210b, and 210c are determined, the coordinates of the pixels in the images 200a, 200b, and 200c, respectively, associated with the sample locations 118 in the slides associated with these images can also be determined.
[0025] Referring to FIG. 4, to facilitate accurate identification of the coordinates of the pixels 210a, 210b, and 210c, a template image 250 is created from the images 200a, 200b, and 200c. The template image 250 includes the features common to all of the images 200 that were used to create it. Such common features may include portions of the image associated with reference marks 152 on the slide 116 used to create the images 200, but also any other image elements common to all of the images. Such image elements may include, for example, samples deposited in one or more locations 118 of the slide 116 that appear in all of the images.
[0026] The pixels 252 of the template image 250 that are associated with reference marks 152 that appeared in all of the images 200a, 200b, and 200c are set to a predetermined non-background intensity value. In addition, pixels 254 and pixels 256 associated with the reference marks 154b and 154c, respectively, are also set to the non-background intensity value because these reference marks appeared in all of the images 200a, 200b, and 200c. The pixels at coordinates corresponding to the locations of those reference marks that did not appear in all of the images 200a, 200b, and 200c, for example, the top-right reference mark 152 and the reference mark 154a, are set to the background intensity value.
[0027] To create the template image, reference marks are identified in each image 200a, 200b, and 200c using image segmentation and image feature identification techniques. A threshold is applied to each image 200 to set the intensity value of any pixel of the image that has an intensity below that expected for a reference mark to a background intensity value. In some embodiments, the intensity value of any pixel of the image that has an intensity value above that expected for a reference mark is set to a background intensity value. Thereafter, each adjacent group of pixels in the image 200 that has non-background intensity values is analyzed. If the number of pixels that comprise the group, or the dimensions and/or shape of the group, is different from what would be expected for a reference mark, then the intensity values of the pixels that comprise the group are set to the background intensity. Otherwise, the intensity value of the group of pixels is set to the non-background intensity. The result is a binary image associated with each image 200a, 200b, and 200c, and each such binary image includes pixels that have non-background intensity values and that are associated with candidate reference marks. The pixels associated with candidate reference marks in the binary images created from the images 200a, 200b, and 200c are analyzed across the binary images to create the template image 250.
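The thresholding and group-size screening described above might be sketched as follows. The threshold, the size bounds used to accept a candidate reference mark, and 4-connected grouping are illustrative assumptions, not the disclosed embodiment.

```python
# Sketch of the binary-image step: threshold a grayscale image, then keep
# only connected pixel groups whose size is plausible for a reference mark.
from collections import deque

def binarize(image, threshold, min_size, max_size):
    """Return a 0/1 image retaining only candidate reference-mark groups."""
    h, w = len(image), len(image[0])
    fg = [[1 if image[y][x] >= threshold else 0 for x in range(w)] for y in range(h)]
    out = [[0] * w for _ in range(h)]
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if fg[y][x] and not seen[y][x]:
                # Flood-fill one 4-connected group of above-threshold pixels.
                group, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    group.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and fg[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Keep the group only if its size matches a reference mark.
                if min_size <= len(group) <= max_size:
                    for gy, gx in group:
                        out[gy][gx] = 1
    return out

image = [[0, 0, 0, 0, 0],
         [0, 9, 9, 0, 0],
         [0, 9, 9, 0, 0],
         [0, 0, 0, 0, 9],
         [0, 0, 0, 0, 0]]
result = binarize(image, threshold=5, min_size=3, max_size=8)
# The 2x2 group survives; the lone bright pixel is rejected as noise.
```

An implementation would also test the shape and dimensions of each group, as the paragraph above notes; size alone is used here to keep the sketch short.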
[0028] In some embodiments, edge detection techniques apparent to those who have skill in the art may be used to identify substantially contiguous boundaries of image features in the binary image. The pixels bounded by such contiguous boundaries may be set to the non-background intensity, and the shape and/or dimensions of such pixels may be analyzed as described in the foregoing to determine if such pixels are associated with reference marks.
[0029] A template image 250 is created from the binary images developed from the different images 200. All of the pixels of the template image 250 are initially set to a background intensity value. Thereafter, the binary images are registered with one another, and each corresponding group of pixels that has non-background intensity values, is present in all of the binary images, and is coincident after registration is identified. The intensity values of the pixels in the template image 250 corresponding to the pixels of such identified groups of pixels are set to the non-background intensity value.
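Once the binary images are registered, the template step above reduces to an intersection: a template pixel is non-background only if the corresponding pixel is non-background in every binary image. A minimal sketch (assuming already-registered 0/1 images of equal size):

```python
# Sketch of template creation: keep only pixels set in ALL registered
# binary images; everything else becomes background (0).
def make_template(binary_images):
    h, w = len(binary_images[0]), len(binary_images[0][0])
    return [[1 if all(img[y][x] for img in binary_images) else 0
             for x in range(w)]
            for y in range(h)]

a = [[1, 1, 0],
     [0, 1, 0]]
b = [[1, 0, 0],
     [0, 1, 1]]
print(make_template([a, b]))  # [[1, 0, 0], [0, 1, 0]]
```

Marks absent from any one image (like the top-right mark 152 missing from image 200c) drop out of the template automatically.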
[0030] The template image 250 developed as described above is then used to identify the reference marks in the images 200, as well as in other images captured by the HCIS 100 from additional slides 116 that have an identical relationship between one or more of the reference marks 152 and the array 150 of sample locations 118.
[0031] In particular, the template image 250 is aligned with the binary image created from the image 200a so that the non-background pixels 252, 254, and 256 of the template image are in register with corresponding non-background pixels of the binary image. The number of pixels (xoffset) that the template image 250 must be shifted horizontally and the number of pixels (yoffset) that the template image 250 must be shifted vertically for the two images to be aligned may then be determined. Such offsets (xoffset, yoffset) may be added to the coordinates (xt, yt) of the pixel 258 (FIG. 4) to determine the coordinates (xa, ya) of the pixel 210a (FIG. 3A) relative to the left and bottom boundaries 212a and 214a, respectively, of the image 200a.
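One simple way to recover (xoffset, yoffset) is a brute-force search over candidate shifts, scoring each shift by how many set template pixels land on set pixels of the binary image; a real implementation might instead use cross-correlation, as discussed later. This sketch and its names are illustrative assumptions.

```python
# Sketch of offset recovery: try every shift within +/- max_shift and keep
# the one that maximizes overlap between the template's set pixels and the
# binary image's set pixels.
def best_offset(template, image, max_shift):
    h, w = len(template), len(template[0])
    best, best_score = (0, 0), -1
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = sum(
                1
                for y in range(h) for x in range(w)
                if template[y][x]
                and 0 <= y + dy < h and 0 <= x + dx < w
                and image[y + dy][x + dx]
            )
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best  # (xoffset, yoffset)

template = [[0] * 5 for _ in range(5)]
template[1][1] = 1          # a mark at (x=1, y=1) in the template
image = [[0] * 5 for _ in range(5)]
image[2][3] = 1             # the same mark at (x=3, y=2) in the image
print(best_offset(template, image, 3))  # (2, 1)
```

Adding the recovered (xoffset, yoffset) to the template coordinates (xt, yt) then yields (xa, ya), as the paragraph above describes.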
[0032] Referring to FIGS. 1, 2 and 3A, as noted above, the position of the array 150, and thereby the position of each of the locations 118 where samples may be deposited, may be determined from the location 160. It should be apparent that once the coordinates of the pixel 210a of the image 200a are determined as described above, the locations of the pixels in the image 200a that are associated with each of the locations 118 may also be determined. Sub-images of the image 200a that are associated with each location 118 may be analyzed to determine the features of biological samples deposited in such location 118. Such sub-images may also be displayed to a user for visual analysis or transmitted to another system (not shown) for further analysis.
[0033] In some embodiments, the image 200a may be acquired at a low resolution. The offsets (xoffset, yoffset) of the pixel 210a in the image 200a may be calculated as described above and provided to the controller 112. The controller 112 scales the offsets in accordance with a target, higher resolution at which the slide will be reimaged, calculates the positions on the slide of the sample locations 118, and operates the positioning device 120 in accordance with the calculated positions to scan each location 118 of the slide at the higher resolution.
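The scaling step in the paragraph above is simple arithmetic: an offset measured in low-resolution pixels is multiplied by the ratio of the two pixel sizes before the high-resolution positions are computed. The function name and the micron-per-pixel values below are illustrative assumptions.

```python
# Sketch of rescan arithmetic: convert an offset measured on a
# low-resolution image into the equivalent offset at a higher resolution.
def scale_offsets(offset_xy, low_res_px_um, high_res_px_um):
    """Scale a (x, y) pixel offset by the ratio of pixel sizes."""
    scale = low_res_px_um / high_res_px_um  # e.g. 4 um/px -> 1 um/px gives 4x
    return (offset_xy[0] * scale, offset_xy[1] * scale)

# A (10, 6)-pixel offset at 4 um/px corresponds to (40, 24) pixels at 1 um/px.
print(scale_offsets((10, 6), low_res_px_um=4.0, high_res_px_um=1.0))  # (40.0, 24.0)
```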
[0034] Referring to FIGS. 3A-3C and 4, the coordinates (xb, yb) of the pixel 210b and (xc, yc) of the pixel 210c may be calculated in a manner similar to that described above to calculate the coordinates (xa, ya).
[0035] It should be apparent that one or more of the images 200a, 200b, and 200c may include an image of an array 150 that is rotated relative to the others. In these situations, when the template image 250 is brought into register with the image 200a, 200b, or 200c, an angle of rotation in addition to a positional offset may be determined. One or more image analysis techniques apparent to those who have skill in the art may be used to determine such angle of rotation, including cross-correlation, frequency analysis, and searching through various angles of rotation.
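The "searching through various angles of rotation" option can be sketched as below: rotate the template's set pixels by each candidate angle, score the overlap with the image's set pixels, and keep the best angle. Rotation about a given center with nearest-integer rounding is an illustrative assumption, not the disclosed method.

```python
# Sketch of rotation search: score the template against the binary image
# over a range of candidate angles and return the best-scoring angle.
import math

def overlap_at_angle(template_pts, image_pts, angle_deg, center):
    """Count template pixels that land on image pixels after rotation."""
    cx, cy = center
    c, s = math.cos(math.radians(angle_deg)), math.sin(math.radians(angle_deg))
    rotated = {
        (round(cx + (x - cx) * c - (y - cy) * s),
         round(cy + (x - cx) * s + (y - cy) * c))
        for x, y in template_pts
    }
    return len(rotated & image_pts)

def best_rotation(template_pts, image_pts, angles, center):
    return max(angles, key=lambda a: overlap_at_angle(template_pts, image_pts, a, center))

# The image marks are the template marks rotated 90 degrees about the origin.
template_pts = {(2, 0), (0, 2), (3, 0)}
image_pts = {(0, 2), (-2, 0), (0, 3)}
print(best_rotation(template_pts, image_pts, [0, 90, 180, 270], (0, 0)))  # 90
```

A finer angular step (and sub-pixel interpolation) would be needed in practice; the coarse grid here only illustrates the search structure.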
[0036] Referring to FIG. 5, a template-based image analysis system 300 that uses templates to analyze images from an HCIS includes an image acquisition module 302 coupled to the HCIS 100. The image acquisition module 302 uses the HCIS 100 to obtain images of a plurality of slides 116 loaded in the HCIS 100 and stores such images in an acquired images data store 304. Each of the plurality of slides has an array 150 of sample locations 118 distributed in a two-dimensional array pattern thereon. A feature identification module 306 analyzes each acquired image to generate a binary image. The feature identification module 306 sets the intensity value of pixels of the binary image that are expected to be associated with reference marks on the slide 116 to a non-background intensity value as described above. The feature identification module 306 stores the binary image in the acquired images data store 304 and provides the binary image to a template generation module 308.
[0037] The template generation module 308 analyzes the binary images generated from the images of the plurality of slides and identifies pixels associated with reference marks that are common to all of the binary images. The template generation module 308 thereafter generates a template image in which only those pixels associated with such common reference marks have a non-background intensity value. The remaining pixels have a background intensity value. The template generation module 308 stores the template in a template data store 310.
[0038] A template alignment module 314 retrieves from the acquired images data store 304 an image to be analyzed and the binary image generated therefrom, and from the template data store 310 a template associated with the image to be analyzed. The template alignment module 314 then aligns the pixels of the template image with the retrieved binary image so that corresponding pixels associated with reference marks in each image overlap. An offset calculation module 316 determines the horizontal and vertical offsets and the rotation of the array 150 represented in the image relative to the boundaries of the image. An image segmentation module 318 uses these offsets to segment the image to be analyzed into sub-images, wherein each sub-image is associated with a location 118 on the slide 116 where a sample may be deposited. A user may operate a user computer 320 to select and further analyze these sub-images. In addition, such sub-images may be transmitted to a further system for analysis. In some embodiments, the analyzed image and the pixel coordinate information about such sub-images may be transmitted to a further system, and the further system may use such pixel coordinate information to extract a sub-image from the analyzed image. Such pixel coordinate information for each sub-image may comprise, for example, the pixel coordinates of opposite corners of a rectangular region of the analyzed image where the sub-image is located.
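Given the corner-coordinate convention described above, extracting a sub-image is a simple slice of the analyzed image. A minimal sketch (row-major lists stand in for whatever image representation an implementation actually uses):

```python
# Sketch of sub-image extraction: cut a rectangular region out of an image
# using the pixel coordinates of its opposite corners.
def extract_subimage(image, top_left, bottom_right):
    """Return the rectangle [x0, x1) x [y0, y1) of a row-major image."""
    (x0, y0), (x1, y1) = top_left, bottom_right
    return [row[x0:x1] for row in image[y0:y1]]

# A 5x6 test image whose pixel value encodes its (row, column) position.
image = [[10 * r + c for c in range(6)] for r in range(5)]
print(extract_subimage(image, (1, 1), (4, 3)))  # rows 1-2, columns 1-3
```

The image segmentation module 318 would produce one such rectangle per sample location 118, and a further system holding only the corner coordinates could perform the same slice on the transmitted image.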
[0039] FIG. 6 shows a flowchart 400 of the steps undertaken by an embodiment of a template-based image analysis system to analyze images. Referring to FIGS. 1, 5 and 6, at step 402, the image acquisition module 302 acquires an image of a slide 116 using the HCIS 100 and stores such image in the acquired images data store 304. Thereafter, at step 404 the template-based image analysis system 300 determines if a template already exists to analyze the acquired image. For example, the template-based image analysis system 300 may query the user via the user computer 320 to identify a previously created template stored in the template data store 310, for example, by presenting a list of such stored templates and asking the user to select a template from the list. Alternately, the template-based image analysis system 300 may query the user to select or enter an identifier associated with the scanned slide for which a template has already been created and stored in the template data store 310. The template-based image analysis system 300 may then identify a template from the template data store 310 in accordance with such identifier. If, at step 404, the user indicates that the template does not exist, processing proceeds to step 406, otherwise processing proceeds to step 408.
[0040] At step 406, the template-based image analysis system 300 develops a template that may be used to analyze the image acquired at step 402 and stores the developed template in the template data store 310. Thereafter, processing proceeds to step 408.

[0041] At step 408, the template-based image analysis system 300 retrieves the template from the template data store 310.
[0042] At step 410, the feature identification module 306 analyzes the acquired image to develop a binary image that identifies pixels of the acquired image that are expected to be associated with reference marks. The feature identification module 306 stores such binary image in the acquired images data store 304.
[0043] At step 412, the template alignment module 314 aligns the pixels associated with reference marks in the selected template with pixels associated with corresponding reference marks in the binary image created by the feature identification module 306.
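The disclosure does not prescribe a particular registration technique for this alignment step. One common approach is FFT-based cross-correlation, which finds the integer translation that maximizes the overlap between the template's reference-mark pixels and those of the binary image. A sketch under that assumption (translation only; rotation is not handled here):

```python
import numpy as np

def best_translation(template, binary):
    """Find the integer (dy, dx) shift of the template that maximizes
    the count of overlapping foreground pixels with the binary image,
    using FFT-based circular cross-correlation."""
    f_t = np.fft.rfft2(template, binary.shape)
    f_b = np.fft.rfft2(binary)
    corr = np.fft.irfft2(f_b * np.conj(f_t), binary.shape)
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shifts past half the image size wrap around to negative offsets.
    if dy > binary.shape[0] // 2:
        dy -= binary.shape[0]
    if dx > binary.shape[1] // 2:
        dx -= binary.shape[1]
    return dy, dx

# Toy example: a 2x2 reference mark shifted down 2 rows, right 1 column.
t = np.zeros((8, 8)); t[1:3, 1:3] = 1
b = np.zeros((8, 8)); b[3:5, 2:4] = 1
dy, dx = best_translation(t, b)
```

Handling rotation as well would typically require searching over candidate angles or using a log-polar method; only the translational case is sketched here.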
[0044] At step 414, the offset calculation module 316 determines the offset and rotation that were necessary to align the template with the binary image at step 412.
[0045] At step 416, the template-based image analysis system 300 queries the user via the user computer 320 whether the sample locations 118 of the slide 116 should be rescanned at high resolution. If the user requests rescanning, the image acquisition module 302, at step 418, uses the calculated offsets to determine position information associated with the sample locations 118 on the slide 116 and, at step 420, directs the controller 112 to position the positioning device 120 in accordance with such position information and operate the camera 104 to acquire a high-resolution sub-image of the slide 116 in accordance with each such location 118. The image acquisition module 302, also at step 420, stores these sub-images in the acquired images data store 304.
[0046] If, at step 416, the user indicates that the slide 116 does not need to be rescanned, the image segmentation module 318, at step 422, uses the calculated offsets to determine the coordinates in the acquired image of pixels associated with each sample location 118 of the slide 116. At step 424, the template-based image analysis system 300 supplies the position information or scanned sub-images to the user computer or to another system for further analysis.
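Mapping a nominal sample-location coordinate from the template frame into the acquired image, given the calculated offsets and rotation, can be sketched as a rotation followed by a translation. The conventions below (rotation in radians about the image origin, offsets in pixels) are assumptions for illustration; the disclosure does not specify them:

```python
import math

def map_location(x, y, tx, ty, theta):
    """Map a nominal sample-location coordinate (x, y) from the
    template frame into the acquired-image frame: rotate by theta
    (radians, about the origin), then translate by (tx, ty)."""
    xr = x * math.cos(theta) - y * math.sin(theta)
    yr = x * math.sin(theta) + y * math.cos(theta)
    return xr + tx, yr + ty

shifted = map_location(10.0, 0.0, 5.0, 5.0, 0.0)        # pure translation
rotated = map_location(10.0, 0.0, 0.0, 0.0, math.pi / 2)  # pure 90-degree rotation
```

Applying this mapping to each sample location 118 defined in the template yields either stage-positioning information (step 418) or pixel coordinates for segmentation (step 422).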
[0047] FIG. 7 shows a flowchart 450 of the steps undertaken by an embodiment of the template-based image analysis system 300 at step 406 (FIG. 6) to generate a template. Referring to FIGS. 5-7, at step 452, the template-based image analysis system 300 asks the user via the user computer 320 to identify previously acquired images in the acquired images data store 304 that should be used to develop the template. The user may be asked to select one or more acquired images from a list of the images available. Alternately, the user may enter or select an identifier associated with the slide 116 from which the image was acquired at step 402, and the system 300 selects all images associated with such identifier from the acquired images data store 304.
[0048] At step 454, the feature identification module 306 applies a threshold to each selected image to generate a binary image corresponding thereto. At step 456, the feature identification module 306 analyzes each binary image to identify pixels in such image that correspond to reference marks. The feature identification module, also at step 456, sets such identified pixels to a non-background intensity value and sets all other pixels to a background intensity value.
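The thresholding in step 454 can be sketched as follows. A fixed global threshold is assumed here for illustration; the disclosure does not fix a particular threshold value or selection method:

```python
import numpy as np

def make_binary_image(image, threshold=128):
    """Apply a global threshold: pixels at or above the threshold
    become foreground (candidate reference-mark pixels, value 1);
    all others become background (value 0)."""
    return (np.asarray(image) >= threshold).astype(np.uint8)

# Toy 4x4 grayscale image containing one bright 2x2 reference mark.
img = np.array([[10,  10,  10, 10],
                [10, 200, 210, 10],
                [10, 190, 220, 10],
                [10,  10,  10, 10]])
binary = make_binary_image(img)
```

The subsequent analysis in step 456 would then operate on connected groups of these foreground pixels to decide which correspond to actual reference marks.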
[0049] At step 458, the template generation module 308 registers all of the binary images as described above and, at step 460, identifies pixels that are associated with features that are common to all of the binary images. At step 462, the template generation module creates a template image in which pixels corresponding to pixels of the binary images associated with features common to all of the binary images are set to a non-background intensity value and the remaining pixels are set to the background intensity value. Also at step 462, the template generation module 308 stores the template image in the template data store 310.
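Once the binary images are registered, identifying the pixels common to all of them (steps 460-462) amounts to a logical AND across the image stack. A minimal sketch, assuming the registered binary images all share the same dimensions:

```python
import numpy as np

def build_template(registered_binaries):
    """Build a template keeping only the pixels set to foreground in
    every registered binary image (logical AND across the stack);
    these pixels correspond to features common to all images."""
    stack = np.stack(registered_binaries)
    return np.logical_and.reduce(stack).astype(np.uint8)

# Two toy registered binary images sharing one common 2-pixel feature.
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
template = build_template([a, b])
```

Pixels that appear in only some of the images (e.g., sample material or debris) drop out of the AND, leaving only the reference-mark pattern.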
[0050] It should be apparent to those who have skill in the art that any combination of hardware and/or software may be used to implement the template-based image analysis described herein. It will be understood and appreciated that one or more of the processes, sub-processes, and process steps described in connection with FIGS. 1-7 may be performed by hardware, software, or a combination of hardware and software on one or more electronic or digitally-controlled devices. The software may reside in a software memory (not shown) in a suitable electronic processing component or system such as, for example, one or more of the functional systems, controllers, devices, components, modules, or sub-modules schematically depicted in FIGS. 1-7. The software memory may include an ordered listing of executable instructions for implementing logical functions (that is, "logic" that may be implemented in digital form, such as digital circuitry or source code, or in analog form, such as an analog electrical, sound, or video signal). The instructions may be executed within a processing module or controller (e.g., the image acquisition module 302, the feature identification module 306, the template generation module 308, the template alignment module 314, the offset calculation module 316, and the image segmentation module 318 of FIG. 5), which includes, for example, one or more microprocessors, general purpose processors, combinations of processors, digital signal processors (DSPs), field programmable gate arrays (FPGAs), or application-specific integrated circuits (ASICs). Further, the schematic diagrams describe a logical division of functions having physical (hardware and/or software) implementations that are not limited by architecture or the physical layout of the functions.
The example systems described in this application may be implemented in a variety of configurations and operate as hardware/software components in a single hardware/software unit, or in separate hardware/software units.
[0051] The executable instructions may be implemented as a computer program product having instructions stored therein which, when executed by a processing module of an electronic system, direct the electronic system to carry out the instructions. The computer program product may be selectively embodied in any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as an electronic computer-based system, processor-containing system, or other system that may selectively fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a computer-readable storage medium is any non-transitory means that may store the program for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer-readable storage medium may selectively be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. A non-exhaustive list of more specific examples of non-transitory computer-readable media includes: an electrical connection having one or more wires (electronic); a portable computer diskette (magnetic); a random access, i.e., volatile, memory (electronic); a read-only memory (electronic); an erasable programmable read-only memory such as, for example, Flash memory (electronic); a compact disc memory such as, for example, CD-ROM, CD-R, or CD-RW (optical); and digital versatile disc memory, i.e., DVD (optical).
[0052] It will also be understood that receiving and transmitting of signals or data as used in this document means that two or more systems, devices, components, modules, or sub-modules are capable of communicating with each other via signals that travel over some type of signal path. The signals may be communication, power, data, or energy signals, which may communicate information, power, or energy from a first system, device, component, module, or sub-module to a second system, device, component, module, or sub-module along a signal path between the first and second system, device, component, module, or sub-module. The signal paths may include physical, electrical, magnetic, electromagnetic, electrochemical, optical, wired, or wireless connections. The signal paths may also include additional systems, devices, components, modules, or sub-modules between the first and second system, device, component, module, or sub-module.
INDUSTRIAL APPLICABILITY
[0053] All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
[0054] The use of the terms "a" and "an" and "the" and similar references in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., "such as"), provided herein is intended merely to better illuminate the disclosure and does not pose a limitation on the scope of the disclosure unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the disclosure.
[0055] Numerous modifications to the present disclosure will be apparent to those skilled in the art in view of the foregoing description. It should be understood that the illustrated embodiments are exemplary only, and should not be taken as limiting the scope of the disclosure.

Claims

WHAT IS CLAIMED IS:
1. A template-based image analysis system, comprising:
a high-content imaging system (HCIS);
a slide loaded in the HCIS, wherein the slide includes a plurality of reference marks and a plurality of sample locations;
an acquired images data store having previously acquired images stored therein;
an image acquisition module that acquires an image of the slide from the high-content imaging system;
a feature identification module that develops a binary image of the acquired image, wherein the binary image identifies areas associated with reference marks;
a template generation module that generates a template in accordance with the previously acquired images;
a template alignment module that aligns the generated template with the binary image;
an offset calculation module that determines an offset between the template and the binary image; and
an image segmentation module that determines the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
2. The system of claim 1, wherein the HCIS includes a positioning device, and the image acquisition module directs the HCIS to position the positioning device in accordance with the coordinates, and directs the HCIS to acquire an image of the sample location associated with the coordinates.
3. The system of claim 1, wherein the image segmentation module develops sub-images of the acquired image in accordance with the coordinates.
4. The system of claim 1, wherein the generated template is a binary image.
5. The system of claim 1, wherein the template generation module also uses the acquired image to generate the template.
6. The system of claim 1, wherein the feature identification module develops a binary image associated with each of the previously acquired images.
7. The system of claim 6, wherein the feature identification module identifies a connected group of pixels to associate with a feature, and sets the intensity values of pixels bounded by the identified pixels to a non-background intensity value.
8. The system of claim 6, wherein the template generation module registers the binary images of the previously acquired images.
9. The system of claim 8, wherein the template generation module identifies features common to the binary images of the previously acquired images to generate the template.
10. A template-based method for analyzing an image, comprising:
disposing reference marks and a plurality of samples on a slide, wherein the position of each of the plurality of samples is in accordance with the positions of the reference marks;
loading the slide in a high-content imaging system (HCIS);
storing previously acquired images in an acquired images data store;
acquiring with the HCIS an image of the slide loaded in the HCIS;
developing a binary image of the acquired image, wherein the binary image identifies areas associated with reference marks;
generating a template in accordance with the previously acquired images;
aligning the template with the binary image;
calculating an offset between the template and the binary image; and
determining the coordinates of pixels of the acquired image that are associated with the sample locations in accordance with the calculated offset.
11. The method of claim 10, further including operating a position device of the HCIS in accordance with the coordinates, and directing the HCIS to acquire an image of the sample location associated with the coordinates.
12. The method of claim 10, further including developing sub-images of the acquired image in accordance with coordinates.
13. The method of claim 10, wherein the generated template is a binary image.
14. The method of claim 10, wherein generating the template includes analyzing the acquired image.
15. The method of claim 10, further including developing a binary image associated with each of the previously acquired images.
16. The method of claim 15, wherein developing the binary image includes identifying a connected group of pixels to associate with a feature, and setting the intensity values of pixels bounded by the identified pixels to a non-background intensity value.
17. The method of claim 15, wherein generating the template includes registering the binary images associated with previously acquired images.
18. The method of claim 17, wherein generating the template includes identifying features common to the binary images of the previously acquired images.
PCT/US2017/048426 2016-08-31 2017-08-24 System and method for template-based image analysis WO2018044683A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2019531545A JP7068307B2 (en) 2016-08-31 2017-08-24 Systems and methods for template-based image analysis
US16/328,358 US11409094B2 (en) 2016-08-31 2017-08-24 System and method for template-based image analysis
CN201780052795.2A CN109643453B (en) 2016-08-31 2017-08-24 System and method for template-based image analysis
EP17847247.8A EP3507768B1 (en) 2016-08-31 2017-08-24 System and method for template-based image analysis
US17/863,804 US11960073B2 (en) 2016-08-31 2022-07-13 System and method for template-based image analysis

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662381974P 2016-08-31 2016-08-31
US62/381,974 2016-08-31

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/328,358 A-371-Of-International US11409094B2 (en) 2016-08-31 2017-08-24 System and method for template-based image analysis
US17/863,804 Continuation US11960073B2 (en) 2016-08-31 2022-07-13 System and method for template-based image analysis

Publications (1)

Publication Number Publication Date
WO2018044683A1 true WO2018044683A1 (en) 2018-03-08

Family

ID=61301547

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/048426 WO2018044683A1 (en) 2016-08-31 2017-08-24 System and method for template-based image analysis

Country Status (5)

Country Link
US (2) US11409094B2 (en)
EP (1) EP3507768B1 (en)
JP (1) JP7068307B2 (en)
CN (1) CN109643453B (en)
WO (1) WO2018044683A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021163702A1 (en) * 2020-02-15 2021-08-19 Heska Corporation Sample cartridges
WO2022061150A2 (en) * 2020-09-18 2022-03-24 10X Geonomics, Inc. Sample handling apparatus and image registration methods
US20230400781A1 (en) * 2022-06-14 2023-12-14 Bruker Nano, Inc. Method of Analyzing Metrology Data
WO2024000268A1 (en) * 2022-06-29 2024-01-04 深圳华大生命科学研究院 Image processing method and apparatus, and device and medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030103662A1 (en) * 2001-12-05 2003-06-05 Finkbeiner Steven M. Robotic microscopy systems
US20070031993A1 (en) * 2002-05-17 2007-02-08 Gsi Lumonics Corporation Method and system for machine vision-based feature detection and mark verification in a workpiece or wafer marking system
US20080144899A1 (en) * 2006-11-30 2008-06-19 Manoj Varma Process for extracting periodic features from images by template matching
WO2009126495A2 (en) 2008-04-09 2009-10-15 Vidar Systems Corporation Method and system for processing microarray images
EP2796917A1 (en) 2013-04-26 2014-10-29 Baden-Württemberg Stiftung gGmbH A method for automated platform and/or reference object independent acquisition of positional information and localization of objects of interest in a microscope
US20150278625A1 (en) * 2012-12-14 2015-10-01 The J. David Gladstone Institutes Automated robotic microscopy systems
WO2016123300A1 (en) * 2015-01-30 2016-08-04 Molecular Devices, Llc A high content imaging system and a method of operating the high content imaging system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8644581B2 (en) * 2008-11-04 2014-02-04 Beckman Coulter, Inc. Systems and methods for cellular analysis data pattern global positioning
US9523680B2 (en) * 2010-06-30 2016-12-20 Ambergen, Inc. Global Proteomic screening of random bead arrays using mass spectrometry imaging
EP3800495A1 (en) * 2011-01-18 2021-04-07 Roche Diagnostics Hematology, Inc. Microscope slide coordinate system registration
US9390327B2 (en) * 2013-09-16 2016-07-12 Eyeverify, Llc Feature extraction and matching for biometric authentication
EP3943915A3 (en) * 2014-05-12 2022-05-25 Cellomics, Inc Automated imaging of chromophore labeled samples
EP3882851B1 (en) * 2014-12-30 2023-01-18 Ventana Medical Systems, Inc. Method for co-expression analysis
CN104820439B (en) * 2015-04-16 2017-10-20 华南理工大学 A kind of visual apparatus as sensor parallel connection platform follow-up control apparatus and method
US10061972B2 (en) * 2015-05-28 2018-08-28 Tokitae Llc Image analysis systems and related methods
US10706261B2 (en) * 2015-08-12 2020-07-07 Molecular Devices, Llc System and method for automatically analyzing phenotypical responses of cells
WO2017153848A1 (en) * 2016-03-10 2017-09-14 Genomic Vision Method of curvilinear signal detection and analysis and associated platform


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3507768A4

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112689756A (en) * 2018-09-12 2021-04-20 分子装置有限公司 System and method for label-free identification and classification of biological samples
WO2021081056A1 (en) * 2019-10-24 2021-04-29 Molecular Devices, Llc A high-content imaging system to generate enhanced images and method of operating the same
US11669946B2 (en) 2019-10-24 2023-06-06 Moleculr Devices, Llc High-content imaging system to generate enhanced images and method of operating the same
US20210343030A1 (en) * 2020-04-29 2021-11-04 Onfido Ltd Scalable, flexible and robust template-based data extraction pipeline
US11657631B2 (en) * 2020-04-29 2023-05-23 Onfido Ltd. Scalable, flexible and robust template-based data extraction pipeline
CN112884797A (en) * 2021-02-02 2021-06-01 武汉钢铁有限公司 Image background removing method and device and electronic equipment
CN112884797B (en) * 2021-02-02 2023-12-08 武汉钢铁有限公司 Image background removing method and device and electronic equipment

Also Published As

Publication number Publication date
US11960073B2 (en) 2024-04-16
US20220350130A1 (en) 2022-11-03
EP3507768A1 (en) 2019-07-10
CN109643453A (en) 2019-04-16
JP7068307B2 (en) 2022-05-16
JP2019526870A (en) 2019-09-19
US11409094B2 (en) 2022-08-09
EP3507768A4 (en) 2020-02-26
US20210278654A1 (en) 2021-09-09
CN109643453B (en) 2023-06-27
EP3507768B1 (en) 2020-09-30

Similar Documents

Publication Publication Date Title
US11960073B2 (en) System and method for template-based image analysis
US10083522B2 (en) Image based measurement system
US8675992B2 (en) Digital microscope slide scanning system and methods
CN105701492B (en) A kind of machine vision recognition system and its implementation
KR20140006891A (en) Microscope slide coordinate system registration
US10852290B2 (en) Analysis accuracy improvement in automated testing apparatus
US9253449B2 (en) Mosaic picture generation
CN110140129B (en) Low resolution slide imaging and slide label imaging and high resolution slide imaging using dual optical paths and single imaging sensor
US7680316B2 (en) Imaging device and methods to derive an image on a solid phase
CN113139894A (en) Microscope and method for determining a measuring position of a microscope
JP6601785B2 (en) Well address acquisition system, well address acquisition method, and program
US20110267448A1 (en) Microscopy
JP5343762B2 (en) Control device and microscope system using the control device
CN109863536B (en) Image processing method and image processing apparatus
US20220155578A1 (en) Physical calibration slide
US10852237B2 (en) Microarray, imaging system and method for microarray imaging
JP2003156311A (en) Method and apparatus for detection and registration of alignment mark
JP3399697B2 (en) Measurement point mapping apparatus and semiconductor wafer measurement apparatus using the same
CN110998330A (en) Method and system for analyzing fluorescent immunospot assays
CN109283679B (en) Large-view-field optical microscopic image imaging device and method
KR200285409Y1 (en) Bio-chip having markers for efficient image processing
JP2004362143A (en) Edge detection device, component recognition device, edge detection method, and component recognition method
JPS60167071A (en) Stamp shape reader

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17847247

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019531545

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017847247

Country of ref document: EP

Effective date: 20190401