US11614363B2 - Digital pathology color calibration and validation - Google Patents

Digital pathology color calibration and validation

Info

Publication number
US11614363B2
US11614363B2
Authority
US
United States
Prior art keywords
image
xyz
pixels
digital image
rgb
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/086,099
Other versions
US20210048344A1
Inventor
Allen Olson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leica Biosystems Imaging Inc
Original Assignee
Leica Biosystems Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leica Biosystems Imaging Inc
Priority to US17/086,099
Assigned to LEICA BIOSYSTEMS IMAGING, INC. (Assignors: OLSON, ALLEN)
Publication of US20210048344A1
Application granted
Publication of US11614363B2
Legal status: Active
Expiration: Adjusted

Classifications

    • G06T 7/337: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G01J 3/462: Computing operations in or between colour spaces; Colour management systems
    • G01J 1/08: Arrangements of light sources specially adapted for photometry standard sources, also using luminescent or radioactive material
    • G01J 3/10: Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J 3/28: Investigating the spectrum
    • G01J 3/46: Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J 3/50: Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01N 1/30: Staining; Impregnating; Fixation; Dehydration; Multistep processes for preparing samples of tissue, cell or nucleic acid material and the like for analysis
    • G01N 21/27: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection; circuits for computing concentration
    • G02B 21/32: Micromanipulators structurally combined with microscopes
    • G02B 21/367: Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G06T 1/20: Processor architectures; Processor configuration, e.g. pipelining
    • G06T 1/60: Memory management
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • G06T 3/0068: Geometric image transformation in the plane of the image for image registration, e.g. elastic snapping
    • G06T 3/14
    • G06T 3/40: Scaling the whole image or part thereof
    • G06T 7/0012: Biomedical image inspection
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/90: Determination of colour characteristics
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G16H 70/60: ICT specially adapted for the handling or processing of medical references relating to pathologies
    • G01J 2003/2826: Multispectral imaging, e.g. filter imaging
    • G06T 2207/10024: Color image
    • G06T 2207/10056: Microscopic image
    • G06T 2207/20021: Dividing image into blocks, subimages or windows
    • G06T 2207/20076: Probabilistic image processing
    • G06T 2207/30024: Cell structures in vitro; Tissue sections in vitro
    • G06T 2207/30168: Image quality inspection
    • G06V 10/751: Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • H04N 1/6033: Colour correction or control controlled by characteristics of the picture signal generator or the picture reproducer using test pattern analysis

Definitions

  • the present invention generally relates to digital pathology and more specifically relates to systems and methods for calibrating color management in a digital pathology system and validating digital color data in connection with corresponding physical color spectrum.
  • the inventor has recognized that because the spectral properties of the staining agent change when bound to tissue, a test target that is constructed to faithfully approximate the characteristics of tissue with or without stains will always suffer significant drawbacks. This is because a proper solution requires that the spectral properties of the test targets match those of the objects being imaged. Accordingly, in the present description the inventor provides a solution that employs one or more standard slides as test targets that are used to successfully calibrate an imaging system.
  • a standard slide is prepared with a specimen.
  • a single stain or a combination of stains or no stains can be applied to the specimen.
  • a grid is overlaid or superimposed on the slide to divide the specimen into discrete sections.
  • a digital image of the specimen on the slide is then obtained using a hyperspectral imaging system (referred to herein as the “hyperspectral image” or the “H image” or the “HYP image” or the “XYZ image”).
  • Hyperspectral imaging systems are typically image tiling systems and, for example, a single field of view of the hyperspectral imaging system can be captured for each cell in the grid overlay.
  • the resulting hyperspectral image for each cell includes a stack of images ranging between 400 nm and 750 nm (the visible spectrum).
  • the number of images in the stack can vary, for example one image per 10 nm separation.
  • the hyperspectral image stack is then processed to result in an XYZ color image having a plurality of individual picture elements (“pixels”) where each pixel has an XYZ color value.
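  • As a concrete illustration of the stack-to-XYZ conversion just described, the following is a minimal numpy sketch (not taken from the patent): it integrates each pixel's measured spectrum against the CIE 1931 color matching functions. The function name stack_to_xyz is hypothetical, and the color matching function table is assumed to be supplied by the caller from the published CIE data.

```python
import numpy as np

def stack_to_xyz(stack, wavelengths_nm, cmf):
    """Collapse a hyperspectral stack (bands, H, W) into an XYZ image (H, W, 3).

    `cmf` is a (bands, 3) array holding the CIE 1931 color matching functions
    x-bar, y-bar, z-bar sampled at the same wavelengths as the stack (e.g.,
    400-750 nm in 10 nm steps).  The CMF values are assumed to be loaded from
    the published CIE tables; this sketch does not ship them.
    """
    stack = np.asarray(stack, dtype=np.float64)                  # (bands, H, W)
    cmf = np.asarray(cmf, dtype=np.float64)                      # (bands, 3)
    d_lambda = np.gradient(np.asarray(wavelengths_nm, dtype=np.float64))
    # X = sum_i S(lambda_i) * xbar(lambda_i) * d_lambda_i, likewise for Y and Z.
    weighted = stack[..., np.newaxis] * cmf[:, np.newaxis, np.newaxis, :]
    xyz = np.sum(weighted * d_lambda[:, np.newaxis, np.newaxis, np.newaxis], axis=0)
    # Normalize so that a flat, all-ones spectrum maps to Y = 1 (white).
    return xyz / np.sum(cmf[:, 1] * d_lambda)
```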
  • the hyperspectral image is then registered to the grid by mapping the top left corner pixel to the top left corner of the grid.
  • the individual pixels in an XYZ color image can be combined to create super pixels, which advantageously reduces pixel location errors when subsequently associating XYZ pixels created by the hyperspectral imaging system to RGB pixels created by the digital pathology system.
  • the same slide with the grid overlay is scanned using a digital pathology system having a color imaging capability.
  • the resulting digital image (referred to herein as the “pathology image” or the “P image” or the “PATH image” or the “RGB image”) has a red, green and blue (“RGB”) value for each pixel.
  • the pathology image is then registered to the grid by mapping the top left corner pixel to the top left corner of the grid.
  • the individual pixels in the RGB image can also be combined to create super pixels and to match the pixel size of the hyperspectral image to allow for direct color comparison between the XYZ values and the RGB values.
  • the pixel sizes of both the hyperspectral image and the pathology image can be downsized or upsized to optimize the pixel size matching.
  • a lookup table (“LUT”) associating the XYZ color information for a single pixel in the hyperspectral image and the RGB color information for the same pixel in the pathology image is generated.
  • the LUT associates the XYZ and RGB color information for all pixels in the pathology image.
  • the LUT can be used by a display module so that the colors of a scanned specimen in a digital image file having RGB color data can be presented on a display using the corresponding hyperspectral XYZ color to result in the displayed color being substantially the same as the colors of the specimen on the physical slide.
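  • The following is one possible sketch of how a display module might consult such a lookup table; the nearest-neighbor lookup, the function name rgb_to_xyz_via_lut, and the paired-array layout of the LUT are assumptions, since the text does not prescribe a storage format or an interpolation scheme.

```python
import numpy as np

def rgb_to_xyz_via_lut(rgb_pixels, lut_rgb, lut_xyz):
    """Map RGB pixels to XYZ using the calibration lookup table.

    `lut_rgb` (N, 3) and `lut_xyz` (N, 3) are the paired rows of the LUT built
    during calibration.  Each input pixel is matched to its nearest RGB entry
    and the associated XYZ value is returned; a real implementation might
    interpolate between neighboring entries instead.
    """
    rgb = np.asarray(rgb_pixels, dtype=np.float64).reshape(-1, 3)
    lut_rgb = np.asarray(lut_rgb, dtype=np.float64)
    lut_xyz = np.asarray(lut_xyz, dtype=np.float64)
    # Squared distance from every pixel to every LUT entry: shape (P, N).
    d2 = ((rgb[:, np.newaxis, :] - lut_rgb[np.newaxis, :, :]) ** 2).sum(axis=2)
    nearest = np.argmin(d2, axis=1)
    return lut_xyz[nearest].reshape(np.shape(rgb_pixels))
```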
  • the displayed color can be measured by a color measurement device such as a colorimeter or a spectrophotometer.
  • a standard slide having a specimen is scanned using the hyperspectral imaging system as described above to generate the XYZ image.
  • the same slide is also scanned using the digital pathology system to generate the RGB image.
  • the pixels of the XYZ image and RGB image are registered to each other to align the respective images. If the size of the individual pixels of the imaging sensors in the hyperspectral imaging system and the digital pathology imaging system differ, then pixel combining or downsizing may be employed to facilitate proper alignment of the XYZ and RGB images and proper image pixel size matching.
  • one of the XYZ or RGB images is indexed to identify a small number of colors into which every pixel in the image being indexed can be assigned while also minimizing the error values for each association of an image pixel to a color.
  • the indexing process may advantageously reduce the number of colors in the XYZ image to ten and each pixel in the XYZ image is assigned to one of the ten colors. In one embodiment, during indexing all pixels that are close to the same color are averaged into a single color of the index color palette.
  • This process is done iteratively until all pixels have been assigned to an averaged single color value where the error value of the original pixel color compared to the averaged single color is minimized across all pixels in the XYZ image.
  • the error value can be minimized using a root mean square analysis.
  • the result is a set of N pixel groups, where N is the index value (ten in the example above) and combining the pixels in the N pixel groups results in the complete XYZ image.
  • a pixel grouping is also referred to herein as an “index.”
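  • The iterative "average similar colors, minimize the RMS error" indexing described above behaves like a k-means quantization in color space. The sketch below shows one way it could be realized for the N = 10 example; plain k-means, the function name index_image, and the iteration count are illustrative assumptions rather than the patent's prescribed algorithm.

```python
import numpy as np

def index_image(xyz_pixels, n_indices=10, n_iter=50, seed=0):
    """Assign every pixel to one of `n_indices` averaged colors (k-means style).

    `xyz_pixels` is an (H, W, 3) XYZ image.  Returns an (H, W) label map and
    the (n_indices, 3) index color palette.
    """
    rng = np.random.default_rng(seed)
    pixels = np.asarray(xyz_pixels, dtype=np.float64).reshape(-1, 3)
    palette = pixels[rng.choice(len(pixels), n_indices, replace=False)]
    for _ in range(n_iter):
        # Assign each pixel to the nearest palette color.
        d2 = ((pixels[:, np.newaxis, :] - palette[np.newaxis, :, :]) ** 2).sum(axis=2)
        labels = np.argmin(d2, axis=1)
        # Recompute each palette color as the average of its assigned pixels.
        for k in range(n_indices):
            members = pixels[labels == k]
            if len(members):
                palette[k] = members.mean(axis=0)
    rms = np.sqrt(((pixels - palette[labels]) ** 2).mean())
    print(f"RMS indexing error: {rms:.4f}")
    return labels.reshape(np.asarray(xyz_pixels).shape[:2]), palette
```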
  • the pixels in the RGB image can be likewise associated into the same N pixel groups in accordance with the pixel registration between the XYZ and RGB images.
  • Each of the N pixel groups from the RGB image are analyzed to calculate an average color value for each of the N pixel groups. The result is that each index in the XYZ image has an average color value and each index in the RGB image has an average color value and these average color values are used to generate the LUT.
  • the LUT can be used by a display module so that the colors of a scanned specimen in a digital image file having RGB color data can be presented on a display using the corresponding hyperspectral XYZ color to result in the displayed color (e.g., as measured by a colorimeter or spectrophotometer) being substantially the same as the colors of the specimen on the physical slide.
  • FIG. 1 is a plan view diagram illustrating an example digital pathology slide having a barcode and a grid overlay according to an embodiment of the invention
  • FIG. 2 is a block diagram illustrating an example prior art hyperspectral imaging system according to an embodiment of the invention
  • FIG. 3 A is a block diagram illustrating an example hyperspectral imaging stack as generated by a hyperspectral imaging system according to an embodiment of the invention
  • FIG. 3 B is a graph diagram illustrating an example color matching for use with a hyperspectral imaging stack according to an embodiment of the invention
  • FIG. 3 C is a block diagram illustrating an example hyperspectral image in XYZ color according to an embodiment of the invention.
  • FIG. 3 D is a flow diagram illustrating an example process for converting a hyperspectral image stack to a single XYZ color image according to an embodiment of the invention
  • FIG. 4 A is a block diagram illustrating an example image processor device according to an embodiment of the invention.
  • FIG. 4 B is a block diagram illustrating example superpixels of a hyperspectral image in XYZ color according to an embodiment of the invention
  • FIG. 5 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention
  • FIG. 6 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention
  • FIGS. 7 A, 7 B and 7 C are graph diagrams illustrating example comparisons of color values of a specimen scanned by a hyperspectral imaging system versus superpixeling color values of the same specimen scanned by a digital pathology imaging system and presented on a display and measured by a color measurement device according to an embodiment of the invention;
  • FIG. 8 is a block diagram illustrating an example set of pixel groups that form a composite XYZ image according to an embodiment of the invention.
  • FIG. 9 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention.
  • FIG. 10 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention
  • FIGS. 11 A, 11 B and 11 C are graph diagrams illustrating example comparisons of color values of a specimen scanned by a hyperspectral imaging system versus indexing color values of the same specimen scanned by a digital pathology imaging system and presented on a display and measured by a color measurement device according to an embodiment of the invention;
  • FIG. 12 A is a block diagram illustrating an example processor enabled device 550 that may be used in connection with various embodiments described herein;
  • FIG. 12 B is a block diagram illustrating an example line scan camera having a single linear array
  • FIG. 12 C is a block diagram illustrating an example line scan camera having three linear arrays.
  • FIG. 12 D is a block diagram illustrating an example line scan camera having a plurality of linear arrays.
  • Certain embodiments disclosed herein provide for color calibration of a digital pathology system.
  • a pathology slide is scanned by a hyperspectral imaging system, and the colors of the resulting digital hyperspectral image are compared to the colors of a digital image of the same slide scanned by the digital pathology system.
  • the comparison results in a lookup table that translates RGB values into XYZ values so that when the digital slide image scanned by the digital pathology system is presented on a display, the presented colors match the XYZ values corresponding to the true colors of the physical specimen on the slide.
  • FIG. 1 is a plan view diagram illustrating an example digital pathology slide 10 having a barcode 20 and a grid overlay 30 according to an embodiment of the invention.
  • the grid overlay 30 is positioned over the sample that is on the slide 10 .
  • the grid overlay 30 is used to facilitate registration of the hyperspectral digital image to the digital pathology digital image.
  • An alternative way to register the images is to use image pattern matching of the two digital images in order to align the hyperspectral digital image and the digital pathology digital image. For successful image pattern matching it is helpful to have digital images of the same magnification.
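  • As one illustration of pattern-matching based X-Y alignment, the sketch below uses phase correlation (scikit-image) on grayscale renderings of the two images. It assumes the images are already at the same magnification and pixel size, that only a translation is needed, and the helper name align_xy is hypothetical.

```python
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def align_xy(hyperspectral_gray, pathology_gray):
    """Estimate and apply the X-Y translation that aligns two grayscale images.

    The hyperspectral rendering is treated as the reference; the pathology
    rendering is shifted onto it.  Rotation, scale, and local distortions are
    ignored in this minimal sketch.
    """
    offset, error, _ = phase_cross_correlation(hyperspectral_gray, pathology_gray)
    aligned = nd_shift(pathology_gray, shift=offset, order=1, mode="nearest")
    return aligned, offset, error
```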
  • FIG. 2 is a block diagram illustrating an example prior art hyperspectral imaging system 50 according to an embodiment of the invention.
  • the hyperspectral imaging system 50 includes a 2D pixel array monochrome camera 60 , a microscope 70 , a slide 80 that supports a specimen and a narrow band filter wheel 90 .
  • the monochrome camera 60 of the hyperspectral imaging system 50 is a monochrome line scan camera.
  • the monochrome line scan camera 60 in the hyperspectral imaging system 50 has the same characteristics as the color line scan camera used in the digital pathology imaging system.
  • FIG. 3 A is a block diagram illustrating an example hyperspectral imaging stack 100 as generated by a hyperspectral imaging system according to an embodiment of the invention.
  • a hyperspectral imaging system generates a set of individual images that are each captured using a different wavelength of light, for example by using a narrow band filter wheel and capturing an image of the same region using each filter of the filter wheel. This set of images is referred to herein as a hyperspectral stack or a spectral stack.
  • a spectral stack can be generated for an individual region of a sample or for an entire sample/entire slide.
  • a hyperspectral imaging system using a line scan camera can capture a whole slide image using each filter on the filter wheel to generate a whole slide image spectral stack.
  • FIG. 3 B is a graph diagram illustrating an example color matching for use with a hyperspectral imaging stack according to an embodiment of the invention.
  • color matching functions can be applied to each digital image in the whole slide image spectral stack at a variety of wavelengths of light ( 110 , 120 and 130 ) to generate a whole slide hyperspectral image 140 in XYZ color.
  • FIG. 3 C is a block diagram illustrating an example whole slide hyperspectral image 140 in XYZ color according to an embodiment of the invention.
  • FIG. 3 D is a flow diagram illustrating an example process for converting a whole slide hyperspectral image stack 100 to a single hyperspectral image 140 in XYZ color according to an embodiment of the invention.
  • whole slide images for each different wavelength of light are captured by the hyperspectral imaging system to generate a hyperspectral image stack 100 .
  • color matching functions are applied to each digital image in the spectral stack 100 at a variety of wavelengths of light ( 110 , 120 , 130 ) to generate a whole slide hyperspectral digital image 140 of the entire specimen in XYZ color.
  • FIG. 4 A is a block diagram illustrating an example image processor device 260 according to an embodiment of the invention.
  • the image processor device 260 is a processor-enabled device having a processor 267 and a non-transitory data storage area 265 for storing information and instructions that can be executed by the processor 267.
  • the data storage area 265 may store a plurality of hyperspectral XYZ images and a plurality of digital pathology RGB images and a plurality of instructions for processing such images.
  • the image processor device 260 includes a register module 270 , a superpixel module 280 , an index module 290 and a LUT module 295 .
  • the superpixel module 280 and the index module 290 may be combined into a color module 285 .
  • the image processor device 260 may also be communicatively coupled with an integral or external display device 576 .
  • a color measurement device 577 may be configured to read color information from the display device 576 and translate the color information into one or more XYZ values.
  • the register module 270 is configured to register two digital images to each other.
  • the register module 270 is configured to register a hyperspectral XYZ image to a digital pathology RGB image.
  • the register module 270 registers two digital images by aligning the image data of the two images in X-Y to bring the two images into X-Y alignment, for example by pattern matching of features in the image data.
  • the register module 270 also registers two digital images by adjusting the digital images so that they have common characteristics.
  • the register module 270 may evaluate and adjust image characteristics including image pixel size and spatial alignment of translation, rotation, and magnification. Additionally, the register module 270 may also account for optical distortions between the two separate systems, which can be detected at the pixel level.
  • the superpixel module 280 is configured to identify contiguous image pixels that have the same or similar color values and combine those pixels into a single superpixel.
  • the superpixel module 280 is also configured to determine a color value for the superpixel by averaging the color values of all of the individual image pixels in the superpixel. Averaging is important to reduce noise due to possible measurement and registration errors.
  • the average color value can be determined for a superpixel, for example, by summing the color values of the image pixels in the superpixel and dividing the sum by the number of pixels for that superpixel.
  • the superpixel module 280 can advantageously identify a plurality of superpixels in a digital image and determine a color value for each of the plurality of superpixels.
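  • A minimal sketch of this superpixel grouping and averaging is shown below. SLIC (scikit-image) is used only as one readily available way to group contiguous, similarly colored pixels; the text does not name a particular superpixel algorithm, and the helper name superpixel_average_colors is hypothetical.

```python
import numpy as np
from skimage.segmentation import slic

def superpixel_average_colors(image, n_segments=500):
    """Group contiguous, similarly colored pixels and average each group's color.

    `image` is assumed to be an (H, W, 3) float image scaled to [0, 1].
    Returns the (H, W) superpixel label map and an (n, 3) array holding the
    average color of each superpixel.
    """
    img = np.asarray(image, dtype=np.float64)
    labels = slic(img, n_segments=n_segments, start_label=0)
    n = labels.max() + 1
    averages = np.zeros((n, img.shape[-1]))
    for k in range(n):
        # Average color = sum of member pixel colors / number of member pixels.
        averages[k] = img[labels == k].mean(axis=0)
    return labels, averages
```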
  • the index module 290 is configured to identify individual image pixels that have the same or similar color values and assign these individual image pixels to one of a plurality of color indices.
  • the index module 290 is also configured to determine a color value for each color index by averaging the color values of all of the individual image pixels in the respective color index. Averaging is important to reduce noise due to possible measurement and registration errors.
  • the average color value can be determined for an index, for example, by summing the color values of the image pixels in the index and dividing the sum by the number of pixels for that index.
  • the LUT module 295 is configured to generate one or more lookup tables that correlate XYZ color values to RGB color values.
  • FIG. 4 B is a block diagram illustrating an example registered whole slide image 170 according to an embodiment of the invention.
  • whole slide image 170 may be a hyperspectral image or a digital pathology image.
  • image data from the hyperspectral digital image and the pathology digital image is analyzed to achieve X-Y alignment.
  • the image data may be analyzed to identify features in the image data that can be matched and aligned in order to register the hyperspectral digital image to the pathology digital image in X-Y.
  • pattern matching can be used to associate common features between the hyperspectral digital image to the pathology digital image to facilitate X-Y alignment.
  • the image data for the hyperspectral digital image and the pathology digital image is converted to common characteristics. This is because the imaging hardware of the hyperspectral scanning system and the pathology scanning system is unlikely to produce identical digital image data with respect to, for example, the magnification and image pixel size in the digital image data. Accordingly, during the registration process, the image data for the hyperspectral digital image and the pathology digital image is adjusted to have common characteristics. For example, a magnification adjustment may be needed and an adjustment to a common image pixel size is nearly always needed.
  • image pixel 180 is an image pixel having the common image pixel size.
  • FIG. 5 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4 A .
  • a test slide is any type of slide that will be scanned by both the hyperspectral imaging system and the digital pathology imaging system and used for calibration/validation purposes. Accordingly, no special type of slide preparation is required to be a test slide and there are no special characteristics of a test slide. Any slide having any stains can be used as a test slide to calibrate a digital pathology scanning apparatus.
  • a registration grid can be overlaid on the slide—preferably over the portion of the slide that includes the sample.
  • the registration grid, if present, can later be used as a marker in the hyperspectral digital image and the digital pathology digital image to register the two images to each other by aligning the image data in X-Y.
  • a hyperspectral image is scanned and stored.
  • the hyperspectral image may be scanned as separate image tiles using tiling system hardware or the hyperspectral image may be scanned as a whole slide image using line scanning system hardware.
  • the present color calibration and validation systems and methods are hardware agnostic.
  • the native hyperspectral image includes one or more spectral stacks having a plurality of individual images that are each processed using color matching functions to generate a single digital image in XYZ color.
  • a color digital pathology image is scanned and stored.
  • the scanned digital pathology image is in RGB color.
  • the digital pathology image may also be scanned as separate image tiles using a tiling system or the digital pathology image may be scanned as a whole slide image using a line scanning system. While there are advantages to using cameras with the same or very similar characteristics such as pixel size and pixel number in the hyperspectral imaging system and the digital pathology imaging system, these advantages primarily serve to simplify the image registration process and make the registration process robust.
  • the hyperspectral image and the digital pathology image are registered to each other.
  • Image registration includes X-Y alignment, for example by pattern matching, and conversion to common characteristics, for example, magnification and image pixel size.
  • the registration process includes generation of resampled image pixels (larger or smaller) to make the individual image pixel size from the separate scanning systems substantially the same.
  • the registration process may also include localized variations in translation to account for optical distortions.
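  • The resampling to a common image pixel size might look like the following sketch, which assumes the native pixel pitch of each scanner (in micrometers per pixel) is known, e.g. from scanner metadata; the helper name and its parameters are illustrative.

```python
from skimage.transform import rescale

def resample_to_common_pixel_size(image, native_um_per_px, target_um_per_px):
    """Resample an (H, W, 3) image so its pixel pitch matches a common target.

    `native_um_per_px` is the source system's pixel size and `target_um_per_px`
    is the common size both images are converted to.  A factor greater than 1
    produces smaller pixels (upsampling); less than 1 produces larger pixels
    (downsampling).
    """
    factor = native_um_per_px / target_um_per_px
    return rescale(image, factor, channel_axis=-1, order=1, anti_aliasing=True)
```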
  • In step 340, the color groups in the hyperspectral image are determined.
  • each individual image pixel in the hyperspectral image is its own color group.
  • contiguous individual image pixels having the same or very similar colors are combined into larger superpixels such as superpixel 190 shown in FIG. 4 B .
  • a disadvantage of including more pixels in a superpixel is that it reduces the range of colors across all superpixels.
  • the color of the superpixel determines a color group.
  • the color values of all image pixels included in the superpixel are averaged together to determine an average color value and that average color value is determined to be the color for that color group, as shown in step 350 .
  • the color groups in the hyperspectral image cover every image pixel in the hyperspectral image and each color group has an X-Y perimeter. Accordingly, because the hyperspectral image and the digital pathology image have been registered to each other, the X-Y perimeters of the color groups from the hyperspectral image can be applied to the digital pathology image as shown in step 360 to associate the individual image pixels of the digital pathology image into the same color groups as the hyperspectral image. Accordingly, the color values of the individual image pixels of each color group in the digital pathology image can be similarly averaged in step 370 to determine an average color value for each color group in the digital pathology image.
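  • Steps 360 and 370 can be summarized by the sketch below, which applies the color-group label map derived from the hyperspectral image to the registered pathology image and averages each group's color in both images; the helper name average_colors_per_group and the label-map representation of the group perimeters are assumptions.

```python
import numpy as np

def average_colors_per_group(labels, xyz_image, rgb_image):
    """Average the XYZ and RGB colors inside each hyperspectral color group.

    `labels` is the (H, W) color-group map (superpixels or indices) derived
    from the hyperspectral image; because the two images are registered, the
    same map is applied directly to the RGB pathology image.  The returned
    pairs of averages are the rows from which the lookup table is built.
    """
    xyz = np.asarray(xyz_image, dtype=np.float64)
    rgb = np.asarray(rgb_image, dtype=np.float64)
    groups = np.unique(labels)
    lut_xyz = np.array([xyz[labels == g].mean(axis=0) for g in groups])
    lut_rgb = np.array([rgb[labels == g].mean(axis=0) for g in groups])
    return lut_rgb, lut_xyz
```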
  • the lookup table can be embedded in a data structure that houses the digital pathology image.
  • the correlation of the XYZ color data to the RGB color data can be included in the digital pathology image data structure as part of an International Color Consortium (ICC) profile.
  • the lookup table can be embedded in the digital pathology image data structure or alternatively, the information in the lookup table can be converted into a mathematical model or formula or a set of executable instructions and the model or formula or set of instructions can be embedded in the digital pathology image data structure.
  • An advantage of embedding the model or formula or set of instructions is that the data size of the model or formula or set of instructions is smaller and thereby reduces the size of the digital pathology image data structure.
  • the model or formula or set of instructions functions to average out minor differences and discrepancies in the correlation of the XYZ color data to the RGB color data, which may be introduced, for example, due to metamerism. Metamerism is when two colors that are not actually the same (i.e., they reflect different wavelengths of light) appear to be the same under certain lighting conditions.
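  • As one simple example of converting the lookup table into a mathematical model, the sketch below fits an affine RGB-to-XYZ matrix to the LUT rows by least squares. The text does not specify the form of the model, so the linear form, the helper names, and the use of numpy are assumptions; the fit also averages out small row-to-row discrepancies in the way the paragraph above describes.

```python
import numpy as np

def fit_rgb_to_xyz_model(lut_rgb, lut_xyz):
    """Fit a compact affine model XYZ ~ M @ [R, G, B, 1] from the LUT rows."""
    rgb = np.asarray(lut_rgb, dtype=np.float64)
    xyz = np.asarray(lut_xyz, dtype=np.float64)
    design = np.hstack([rgb, np.ones((len(rgb), 1))])       # add affine term
    coeffs, *_ = np.linalg.lstsq(design, xyz, rcond=None)   # (4, 3)
    return coeffs.T                                          # model matrix (3, 4)

def apply_rgb_to_xyz_model(model, rgb_pixels):
    """Apply the fitted model to RGB pixels of any shape (..., 3)."""
    rgb = np.asarray(rgb_pixels, dtype=np.float64).reshape(-1, 3)
    design = np.hstack([rgb, np.ones((len(rgb), 1))])
    return (design @ model.T).reshape(np.shape(rgb_pixels))
```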
  • a single combined lookup table is generated over time from a plurality of slides having a plurality of different stains.
  • a single combined lookup table can be generated and optimized over time such that the single combined lookup table can be used for any type of digital pathology slide with any type of staining profile.
  • FIG. 6 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4 A .
  • a test slide is initially prepared in step 400 .
  • the test slide can be any slide prepared in the normal fashion using a specimen and zero or more stains.
  • a lookup table is generated, for example using the process previously described with respect to FIG. 5 .
  • the lookup table may contain color values such as those shown in the Hyperspectral XYZ column and the associated Digital Pathology RGB column of Table 1 below, where each row represents the same color group (e.g., a superpixel) in the hyperspectral digital image and the digital pathology digital image.
  • Table 1 illustrates a correlation of hyperspectral image data in XYZ color values with digital pathology image data in RGB color values and with digital pathology image data in XYZ color values.
  • In step 420, the XYZ values of the digital pathology image are determined. This can be done in at least two ways. A first way is that the XYZ values for the digital pathology image can be calculated from the RGB values of the digital pathology image—for example using a lookup table or a formula. A second way is that the digital pathology image can be presented on a display and the colors emitted from the display can be measured using a color measurement device (e.g., colorimeter or spectrophotometer) that measures color in XYZ values.
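  • As one well-known example of the "formula" option, if the digital pathology RGB data were encoded as sRGB, the standard sRGB-to-XYZ (D65) conversion below could be used. Whether sRGB is actually the scanner's output color space is an assumption, so this block is illustrative only.

```python
import numpy as np

# Standard sRGB (D65) linear-RGB to XYZ matrix; applicable only if the
# pathology image is actually encoded as sRGB (an assumption here).
SRGB_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                        [0.2126729, 0.7151522, 0.0721750],
                        [0.0193339, 0.1191920, 0.9503041]])

def srgb_to_xyz(rgb8):
    """Convert 8-bit sRGB pixels of shape (..., 3) to XYZ (white has Y = 1)."""
    c = np.asarray(rgb8, dtype=np.float64) / 255.0
    # Undo the sRGB transfer function (gamma) to obtain linear RGB.
    linear = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    return linear @ SRGB_TO_XYZ.T
```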
  • In step 430, the calculated or measured XYZ value for a particular region (e.g., a superpixel) of the digital pathology image is compared to the hyperspectral image XYZ value for the same region. In this fashion, the color information generated by the digital pathology apparatus and presented on a display screen can be validated.
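  • The comparison in step 430 can be quantified with a color-difference metric. The sketch below converts XYZ to CIELAB, whose axes are the lightness, green/red, and blue/yellow quantities plotted in FIGS. 7A-7C, and computes the CIE76 delta E. The text refers to a "Just Noticeable Difference" without naming a formula, so the choice of CIE76 and the D65 white point here are assumptions.

```python
import numpy as np

def xyz_to_lab(xyz, white=(0.95047, 1.0, 1.08883)):
    """Convert XYZ to CIELAB (L* lightness, a* green/red, b* blue/yellow)."""
    t = np.asarray(xyz, dtype=np.float64) / np.asarray(white)
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e(xyz_measured, xyz_reference):
    """CIE76 color difference between measured and reference XYZ values."""
    return np.linalg.norm(xyz_to_lab(xyz_measured) - xyz_to_lab(xyz_reference), axis=-1)
```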
  • FIGS. 7 A, 7 B and 7 C are graph diagrams illustrating example comparisons of color values of a specimen scanned by a hyperspectral imaging system versus superpixeling color values of the same specimen scanned by a digital pathology imaging system according to an embodiment of the invention.
  • the XYZ color values from the digital pathology imaging system were obtained by presenting the digital slide image on a display and measuring a superpixel region with a color measurement device.
  • the color value for the superpixel may be used to fill the entire display with a single color, after which a region of the display is measured with a color measurement device.
  • the color values measured as coming off of the display screen were extremely close to the true color values as measured by the hyperspectral imaging system, with an average difference of 2.18, which is less than one Just Noticeable Difference.
  • graph 200 shows a comparison of the system hyperspectral value for lightness to the digital pathology display value for lightness.
  • graph 220 shows a comparison of the system hyperspectral value for green/red to the digital pathology display value for green/red.
  • graph 240 shows a comparison of the system hyperspectral value for blue/yellow to the digital pathology display value for blue/yellow.
  • FIG. 8 is a block diagram illustrating an example set of pixel groups that form a composite XYZ image 650 according to an embodiment of the invention.
  • there are ten indices, namely index 1 (600), index 2 (605), index 3 (610), index 4 (615), index 5 (620), index 6 (625), index 7 (630), index 8 (635), index 9 (640) and index 10 (645).
  • Each of the indices represents a separate color group for the underlying digital image.
  • the index color palette 660 represents each of the color values corresponding to each individual index.
  • each index image represents a scattering of all image pixels in the underlying digital image that have the same color value within a certain threshold.
  • the indexing process can be applied to either the XYZ digital image or the RGB digital image.
  • the indexing process analyzes the color values for each pixel in the digital image and identifies all pixels, regardless of X-Y location, that belong to a single color value.
  • an entire digital image, whether created by a hyperspectral imaging system or a pathology imaging system, can be indexed into about ten (10) color values.
  • a significant advantage of indexing all image pixels in a digital image into a relatively small number of color values is that the sample size for each color value is significantly increased, which significantly reduces noise.
  • Another advantage of indexing all image pixels in a digital image into a relatively small number of color values is that it provides the widest range of average color values from the smallest number of indices.
  • FIG. 9 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4 A .
  • a slide is obtained. As previously discussed, any type of slide with a specimen having zero or more stains is suitable.
  • a hyperspectral image is scanned and stored as an XYZ image. As previously discussed, the hyperspectral image may be scanned as separate image tiles using any type of scanning system hardware.
  • In step 720, a color digital pathology image is scanned and stored as an RGB image.
  • the hyperspectral image and the digital pathology image are registered to each other as previously described.
  • Image registration includes X-Y alignment and conversion to common characteristics such as image pixel size.
  • the hyperspectral image is indexed to identify a set of colors into which every image pixel in the hyperspectral image can be allocated.
  • the indexing process receives an index value (i.e., the total number of indices) and then sorts the individual pixels into that number of color groups in a fashion that minimizes the error values associated with allocating each pixel into an index that is defined by a color value that is not identical to the color value of the respective pixel being allocated to the index.
  • the index module 290 may be configured to use an index value of ten or fifteen or twenty for any digital image.
  • the index module 290 may be configured to analyze the digital image data to determine an optimal index value that allocates each pixel into the smallest number of indices that minimizes the error value over the entire digital image.
  • the color value of the respective index is determined by averaging the color values of all individual image pixels in the respective index to determine an average color value and that average color value is determined to be the color for that respective index of the hyperspectral image, as shown in step 750 .
  • the combined indices in the hyperspectral image include every image pixel in the hyperspectral image. Accordingly, because the hyperspectral image and the digital pathology image have been registered to each other, each index from the hyperspectral image can be applied to the digital pathology image in step 760 in order to group together the same individual image pixels included in an index for the hyperspectral image in a corresponding index for the digital pathology image. This can be accomplished because the hyperspectral image and the digital pathology image were previously registered to each other and their respective image pixel sizes were adjusted to be the same.
  • the color value of each respective index of the digital pathology image is determined by averaging the color values of all individual image pixels in the respective index to determine an average color value and that average color value is determined to be the color for that respective index, as shown in step 770 .
  • the lookup table can be embedded in a data structure that houses the digital pathology image.
  • the XYZ color data can be included in the digital pathology image data structure as part of an International Color Consortium (ICC) profile.
  • the lookup table or mathematical model or formula or set of instructions can be embedded in the digital pathology image data structure.
  • a single combined lookup table can advantageously be generated over time from a plurality of slides having a plurality of different stains.
  • a single combined lookup table can be generated and optimized over time such that the single combined lookup table can be used for any type of digital pathology slide with any type of staining profile.
  • FIG. 10 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4 A .
  • a test slide is prepared in step 800 .
  • the test slide can be any slide prepared in the normal fashion using a specimen and zero or more stains.
  • a lookup table is generated, for example using the process previously described with respect to FIG. 9 .
  • the lookup table may contain color values such as those shown in the Hyperspectral XYZ column and the associated Digital Pathology RGB column of Table 2 below, where each row represents a single color value (i.e., index) in the hyperspectral digital image and the digital pathology digital image.
  • Table 2 illustrates a correlation of hyperspectral image data in XYZ color values with digital pathology image data in RGB color values and with digital pathology image data in XYZ color values.
  • In step 820, the XYZ values of the digital pathology image are determined. As previously discussed, this can be done by calculating the XYZ values for the digital pathology image based on the RGB values of the digital pathology image or by presenting the color value on the entire display and measuring the color emitted from the display using a color measurement device that measures color in XYZ values.
  • In step 830, the calculated or measured XYZ value for a particular color value (e.g., an index) of the digital pathology image is compared to the hyperspectral image XYZ value for the same index.
  • the color information generated by the digital pathology apparatus and presented on a display screen can be validated against the true color as measured by the hyperspectral imaging system.
  • FIGS. 11 A, 11 B and 11 C are graph diagrams illustrating example comparisons of indexed color values of a specimen scanned by a hyperspectral imaging system versus indexed color values of the same specimen scanned by a digital pathology imaging system according to an embodiment of the invention.
  • the XYZ color values from the digital pathology imaging system were obtained by presenting each indexed color value on the entire display and measuring a portion of the display with a color measurement device.
  • the color values measured as coming off of the display screen were extremely close to the true color values as measured by the hyperspectral imaging system, with an average difference that is less than one Just Noticeable Difference.
  • graph 210 shows a comparison of the system hyperspectral value for lightness to the digital pathology display value for lightness.
  • graph 230 shows a comparison of the system hyperspectral value for green/red to the digital pathology display value for green/red.
  • graph 250 shows a comparison of the system hyperspectral value for blue/yellow to the digital pathology display value for blue/yellow.
  • When FIGS. 11 A, 11 B and 11 C are compared to FIGS. 7 A, 7 B and 7 C, there are fewer measurements, but the scatter due to noise is much less. This demonstrates the advantage of having a very large number of image pixels in each index that form the basis for determining the average color.
  • FIG. 12 A is a block diagram illustrating an example processor enabled device 550 that may be used in connection with various embodiments described herein. Alternative forms of the device 550 may also be used as will be understood by the skilled artisan.
  • the device 550 is presented as a digital imaging device (also referred to herein as a scanner system or a scanning system) that comprises one or more processors 555, one or more memories 565, one or more motion controllers 570, one or more interface systems 575, one or more movable stages 580 that each support one or more glass slides 585 with one or more samples 590, one or more illumination systems 595 that illuminate the sample, one or more objective lenses 600 that each define an optical path 605 that travels along an optical axis, one or more objective lens positioners 630, one or more optional epi-illumination systems 635 (e.g., included in a fluorescence scanner system), one or more focusing optics 610, one or more line scan cameras 615 and/or one or more area scan cameras 620.
  • the various elements of the scanner system 550 are communicatively coupled via one or more communication busses 560 . Although there may be one or more of each of the various elements of the scanner system 550 , for simplicity in the description that follows, these elements will be described in the singular except when needed to be described in the plural to convey the appropriate information.
  • the one or more processors 555 may include, for example, a central processing unit (“CPU”) and a separate graphics processing unit (“GPU”) capable of processing instructions in parallel or the one or more processors 555 may include a multicore processor capable of processing instructions in parallel. Additional separate processors may also be provided to control particular components or perform particular functions such as image processing.
  • additional processors may include an auxiliary processor to manage data input, an auxiliary processor to perform floating point mathematical operations, a special-purpose processor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processor (e.g., back-end processor), an additional processor for controlling the line scan camera 615 , the stage 580 , the objective lens 225 , and/or a display (not shown).
  • Such additional processors may be separate discrete processors or may be integrated with the processor 555 .
  • the memory 565 provides storage of data and instructions for programs that can be executed by the processor 555 .
  • the memory 565 may include one or more volatile and persistent computer-readable storage mediums that store the data and instructions, for example, a random access memory, a read only memory, a hard disk drive, removable storage drive, and the like.
  • the processor 555 is configured to execute instructions that are stored in memory 565 and communicate via communication bus 560 with the various elements of the scanner system 550 to carry out the overall function of the scanner system 550 .
  • the one or more communication busses 560 may include a communication bus 560 that is configured to convey analog electrical signals and may include a communication bus 560 that is configured to convey digital data. Accordingly, communications from the processor 555 , the motion controller 570 , and/or the interface system 575 via the one or more communication busses 560 may include both electrical signals and digital data.
  • the processor 555 , the motion controller 570 , and/or the interface system 575 may also be configured to communicate with one or more of the various elements of the scanning system 550 via a wireless communication link.
  • the motion control system 570 is configured to precisely control and coordinate XYZ movement of the stage 580 and the objective lens 600 (e.g., via the objective lens positioner 630 ).
  • the motion control system 570 is also configured to control movement of any other moving part in the scanner system 550 .
  • the motion control system 570 is configured to coordinate movement of optical filters and the like in the epi-illumination system 635 .
  • the interface system 575 allows the scanner system 550 to interface with other systems and human operators.
  • the interface system 575 may include a user interface to provide information directly to an operator and/or to allow direct input from an operator.
  • the interface system 575 is also configured to facilitate communication and data transfer between the scanning system 550 and one or more external devices that are directly connected (e.g., a printer, removable storage medium) or external devices such as an image server system, an operator station, a user station, and an administrative server system that are connected to the scanner system 550 via a network (not shown).
  • a color measurement device 577 may be configured to read color information from the user interface 575 and translate the color information into one or more XYZ values.
  • the illumination system 595 is configured to illuminate a portion of the sample 590 .
  • the illumination system may include, for example, a light source and illumination optics.
  • the light source could be a variable intensity halogen light source with a concave reflective mirror to maximize light output and a KG-1 filter to suppress heat.
  • the light source could also be any type of arc-lamp, laser, or other source of light.
  • the illumination system 595 illuminates the sample 590 in transmission mode such that the line scan camera 615 and/or area scan camera 620 sense optical energy that is transmitted through the sample 590 .
  • the illumination system 595 may also be configured to illuminate the sample 590 in reflection mode such that the line scan camera 615 and/or area scan camera 620 sense optical energy that is reflected from the sample 590 .
  • the illumination system 595 is configured to be suitable for interrogation of the microscopic sample 590 in any known mode of optical microscopy.
  • the scanner system 550 optionally includes an epi-illumination system 635 to optimize the scanner system 550 for fluorescence scanning.
  • Fluorescence scanning is the scanning of samples 590 that include fluorescence molecules, which are photon sensitive molecules that can absorb light at a specific wavelength (excitation). These photon sensitive molecules also emit light at a higher wavelength (emission). Because the efficiency of this photoluminescence phenomenon is very low, the amount of emitted light is often very low. This low amount of emitted light typically frustrates conventional techniques for scanning and digitizing the sample 590 (e.g., transmission mode microscopy).
  • a line scan camera 615 that includes multiple linear sensor arrays (e.g., a time delay integration (“TDI”) line scan camera) increases the sensitivity to light of the line scan camera by exposing the same area of the sample 590 to each of the multiple linear sensor arrays of the line scan camera 615 . This is particularly useful when scanning faint fluorescence samples with low emitted light.
  • the line scan camera 615 is preferably a monochrome TDI line scan camera.
  • monochrome images are ideal in fluorescence microscopy because they provide a more accurate representation of the actual signals from the various channels present on the sample.
  • a fluorescence sample 590 can be labeled with multiple fluorescence dyes that emit light at different wavelengths, which are also referred to as “channels.”
  • a line scan camera 615 used in the fluorescence scanning system 550 is a monochrome 10 bit 64 linear array TDI line scan camera. It should be noted that a variety of bit depths for the line scan camera 615 can be employed for use with a fluorescence scanner embodiment of the scanning system 550 .
  • the movable stage 580 is configured for precise XY movement under control of the processor 555 or the motion controller 570 .
  • the movable stage may also be configured for movement in Z under control of the processor 555 or the motion controller 570 .
  • the moveable stage is configured to position the sample in a desired location during image data capture by the line scan camera 615 and/or the area scan camera.
  • the moveable stage is also configured to accelerate the sample 590 in a scanning direction to a substantially constant velocity and then maintain the substantially constant velocity during image data capture by the line scan camera 615 .
  • the scanner system 550 may employ a high precision and tightly coordinated XY grid to aid in the location of the sample 590 on the movable stage 580 .
  • the movable stage 580 is a linear motor based XY stage with high precision encoders employed on both the X and the Y axis.
  • high precision encoders can be used on the axis in the scanning direction and on the axis that is in the direction perpendicular to the scanning direction and on the same plane as the scanning direction.
  • the stage is also configured to support the glass slide 585 upon which the sample 590 is disposed.
  • the sample 590 can be anything that may be interrogated by optical microscopy.
  • a glass microscope slide 585 is frequently used as a viewing substrate for specimens that include tissues and cells, chromosomes, DNA, protein, blood, bone marrow, urine, bacteria, beads, biopsy materials, or any other type of biological material or substance that is either dead or alive, stained or unstained, labeled or unlabeled.
  • the sample 590 may also be an array of any type of DNA or DNA-related material such as cDNA or RNA or protein that is deposited on any type of slide or other substrate, including any and all samples commonly known as microarrays.
  • the sample 590 may be a microtiter plate, for example a 96-well plate.
  • Other examples of the sample 590 include integrated circuit boards, electrophoresis records, petri dishes, film, semiconductor materials, forensic materials, or machined parts.
  • Objective lens 600 is mounted on the objective positioner 630 which, in one embodiment, may employ a very precise linear motor to move the objective lens 600 along the optical axis defined by the objective lens 600 .
  • the linear motor of the objective lens positioner 630 may include a 50 nanometer encoder.
  • the relative positions of the stage 580 and the objective lens 600 in XYZ axes are coordinated and controlled in a closed loop manner using motion controller 570 under the control of the processor 555 that employs memory 565 for storing information and instructions, including the computer-executable programmed steps for overall scanning system 550 operation.
  • the objective lens 600 is a plan apochromatic (“APO”) infinity corrected objective with a numerical aperture corresponding to the highest spatial resolution desirable, where the objective lens 600 is suitable for transmission mode illumination microscopy, reflection mode illumination microscopy, and/or epi-illumination mode fluorescence microscopy (e.g., an Olympus 40X, 0.75NA or 20X, 0.75 NA).
  • objective lens 600 is capable of correcting for chromatic and spherical aberrations. Because objective lens 600 is infinity corrected, focusing optics 610 can be placed in the optical path 605 above the objective lens 600 where the light beam passing through the objective lens becomes a collimated light beam.
  • the focusing optics 610 focus the optical signal captured by the objective lens 600 onto the light-responsive elements of the line scan camera 615 and/or the area scan camera 620 and may include optical components such as filters, magnification changer lenses, etc.
  • the objective lens 600 combined with focusing optics 610 provides the total magnification for the scanning system 550 .
  • the focusing optics 610 may contain a tube lens and an optional 2× magnification changer.
  • the 2× magnification changer allows a native 20X objective lens 600 to scan the sample 590 at 40× magnification.
  • the line scan camera 615 comprises at least one linear array of picture elements (“pixels”).
  • the line scan camera may be monochrome or color.
  • Color line scan cameras typically have at least three linear arrays, while monochrome line scan cameras may have a single linear array or plural linear arrays.
  • A 3 linear array (“red-green-blue” or “RGB”) color line scan camera or a 96 linear array monochrome TDI line scan camera may also be used.
  • TDI line scan cameras typically provide a substantially better signal-to-noise ratio (“SNR”) in the output signal by summing intensity data from previously imaged regions of a specimen, yielding an increase in the SNR that is in proportion to the square-root of the number of integration stages.
  • TDI line scan cameras comprise multiple linear arrays, for example, TDI line scan cameras are available with 24, 32, 48, 64, 96, or even more linear arrays.
  • the scanner system 550 also supports linear arrays that are manufactured in a variety of formats including some with 512 pixels, some with 1024 pixels, and others having as many as 4096 pixels. Similarly, linear arrays with a variety of pixel sizes can also be used in the scanner system 550 .
  • the salient requirement for the selection of any type of line scan camera 615 is that the motion of the stage 580 can be synchronized with the line rate of the line scan camera 615 so that the stage 580 can be in motion with respect to the line scan camera 615 during the digital image capture of the sample 590 .
  • the image data generated by the line scan camera 615 is stored in a portion of the memory 565 and processed by the processor 555 to generate a contiguous digital image of at least a portion of the sample 590 .
  • the contiguous digital image can be further processed by the processor 555 and the revised contiguous digital image can also be stored in the memory 565 .
  • At least one of the line scan cameras 615 can be configured to function as a focusing sensor that operates in combination with at least one of the line scan cameras that is configured to function as an imaging sensor.
  • the focusing sensor can be logically positioned on the same optical path as the imaging sensor or the focusing sensor may be logically positioned before or after the imaging sensor with respect to the scanning direction of the scanner system 550 .
  • the image data generated by the focusing sensor is stored in a portion of the memory 565 and processed by the one or more processors 555 to generate focus information to allow the scanner system 550 to adjust the relative distance between the sample 590 and the objective lens 600 to maintain focus on the sample during scanning.
  • the various components of the scanner system 550 and the programmed modules stored in memory 565 enable automatic scanning and digitizing of the sample 590 , which is disposed on a glass slide 585 .
  • the glass slide 585 is securely placed on the movable stage 580 of the scanner system 550 for scanning the sample 590 .
  • the movable stage 580 accelerates the sample 590 to a substantially constant velocity for sensing by the line scan camera 615 , where the speed of the stage is synchronized with the line rate of the line scan camera 615 .
  • the movable stage 580 decelerates and brings the sample 590 to a substantially complete stop.
  • the movable stage 580 then moves orthogonal to the scanning direction to position the sample 590 for scanning of a subsequent stripe of image data, e.g., an adjacent stripe. Additional stripes are subsequently scanned until an entire portion of the sample 590 or the entire sample 590 is scanned.
  • a contiguous digital image of the sample 590 is acquired as a plurality of contiguous fields of view that are combined together to form an image strip.
  • a plurality of adjacent image strips are similarly combined together to form a contiguous digital image of a portion or the entire sample 590 .
  • the scanning of the sample 590 may include acquiring vertical image strips or horizontal image strips.
  • the scanning of the sample 590 may be either top-to-bottom, bottom-to-top, or both (bi-directional) and may start at any point on the sample.
  • the scanning of the sample 590 may be either left-to-right, right-to-left, or both (bi-directional) and may start at any point on the sample.
  • the resulting image of the sample 590 may be an image of the entire sample 590 or only a portion of the sample 590 .
  • computer-executable instructions are stored in the memory 565 and, when executed, enable the scanning system 550 to perform the various functions described herein.
  • computer-readable storage medium is used to refer to any media used to store and provide computer executable instructions to the scanning system 550 for execution by the processor 555 . Examples of these media include memory 565 and any removable or external storage medium (not shown) communicatively coupled with the scanning system 550 either directly or indirectly, for example via a network (not shown).
  • FIG. 12 B illustrates a line scan camera having a single linear array 640 , which may be implemented as a charge coupled device (“CCD”) array.
  • the single linear array 640 comprises a plurality of individual pixels 645 .
  • the single linear array 640 has 4096 pixels.
  • linear array 640 may have more or fewer pixels.
  • common formats of linear arrays include 512, 1024, and 4096 pixels.
  • the pixels 645 are arranged in a linear fashion to define a field of view 625 for the linear array 640 .
  • the size of the field of view varies in accordance with the magnification of the scanner system 550 .
  • FIG. 12 C illustrates a line scan camera having three linear arrays, each of which may be implemented as a CCD array.
  • the three linear arrays combine to form a color array 650 .
  • each individual linear array in the color array 650 detects a different color intensity, for example red, green, or blue.
  • the color image data from each individual linear array in the color array 650 is combined to form a single field of view 625 of color image data.
  • FIG. 12 D illustrates a line scan camera having a plurality of linear arrays, each of which may be implemented as a CCD array.
  • the plurality of linear arrays combine to form a TDI array 655 .
  • a TDI line scan camera may provide a substantially better SNR in its output signal by summing intensity data from previously imaged regions of a specimen, yielding an increase in the SNR that is in proportion to the square-root of the number of linear arrays (also referred to as integration stages).
  • a TDI line scan camera may comprise a larger variety of numbers of linear arrays, for example common formats of TDI line scan cameras include 24, 32, 48, 64, 96, 120 and even more linear arrays.
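The square-root relationship between SNR and the number of integration stages described above can be made concrete with a short sketch. The baseline single-stage SNR of 10 below is an arbitrary illustrative value and the shot-noise-limited assumption is mine, not a figure from this disclosure.

```python
# Minimal sketch (illustrative values only): under a shot-noise-limited assumption,
# the SNR gain from TDI integration scales with the square root of the stage count.
import math

def tdi_snr(single_stage_snr: float, stages: int) -> float:
    """Approximate SNR after summing `stages` exposures of the same specimen region."""
    return single_stage_snr * math.sqrt(stages)

for stages in (24, 32, 48, 64, 96):
    print(stages, round(tdi_snr(10.0, stages), 1))  # 96 stages: ~98 vs. 10 for one array
```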
  • the disclosure of the present application may be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, convert the first digital image and the second digital image to a common image pixel size, and align the converted image pixels of the first digital image with corresponding converted image pixels of the second digital image.
  • a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D
  • the disclosure of the present application may also be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to: obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, align image data of the first digital image with corresponding image data of the second digital image, and convert the first digital image and the second digital image to a common image pixel size, wherein the converted image pixels of the first digital image are aligned with corresponding converted image pixels of the second digital image; a look up table module stored in the non-transitory computer
  • the disclosure of the present application may be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to: obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, align image data of the first digital image with corresponding image data of the second digital image, and convert the first digital image and the second digital image to a common image pixel size, wherein the converted image pixels of the first digital image are aligned with corresponding converted image pixels of the second digital image; a color module stored in the non-transitory computer readable medium and
  • any of the three system embodiments described above may further embody wherein each of the image pixels in the first set of image pixels in the first digital image is contiguous with at least one other image pixel in the first set of image pixels in the first digital image.
  • any of the three system embodiments described above may further embody wherein at least some of the image pixels in the first set of image pixels in the first digital image are not contiguous, and furthermore, wherein the color module is further configured to identify a plurality of first sets of image pixels in the first digital image, wherein each image pixel in a first set of image pixels in the first digital image has substantially a same XYZ color value.
  • the disclosure of the present application may also be embodied in a technical system comprising a non-transitory computer readable medium configured to store executable programmed modules and at least one processor communicatively coupled with the non-transitory computer readable medium configured to execute instructions to perform steps comprising: scanning a specimen using a hyperspectral imaging system to generate a first digital image of the specimen in XYZ color; scanning the same specimen using a digital pathology imaging system to generate a second digital image of the specimen in RGB color; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; generating a lookup table that associates the XYZ color of the first digital image and the RGB color of the second digital image.
  • a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D .
  • This system embodiment may further include providing XYZ color data to a display module for presentation of the second digital image on a display.
  • This system embodiment may further include storing the XYZ color data as part of the second digital image.
  • This system embodiment may further include using pattern matching to register the first digital image to the second digital image.
  • This system embodiment may further include overlaying a grid on the specimen prior to creating the first and second digital images and using the grid in the first and second digital images to register the first digital image to the second digital image.
  • This system embodiment may further include combining pixels in one or more of the first digital image and the second digital image to cause the pixel size in the first digital image to be substantially the same as the pixel size in the second digital image.
  • This system embodiment may further include generating a single lookup table for a single stain.
  • This system embodiment may further include generating a single lookup table for a plurality of stains.
  • the disclosure of the present application may also be embodied in a technical system comprising a non-transitory computer readable medium configured to store executable programmed modules and at least one processor communicatively coupled with the non-transitory computer readable medium configured to execute instructions to perform steps comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; presenting the second digital image on a display; using a color measurement device to measure the XYZ values of the color presented on the display in a first region; and comparing the measured XYZ values of the first region to the XYZ values of the first digital image for the first region to validate the digital pathology system.
  • the disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, generating a lookup table to associate the XYZ color of the first digital image and the RGB color of the second digital image.
  • a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D .
  • This method embodiment may further include aligning image data in the first digital image with image data in the second digital image; and generating a lookup table to associate the XYZ color of the first digital image with the corresponding RGB color of the second digital image in accordance with said alignment.
  • This method embodiment may further include, wherein the first digital image comprises a plurality of image pixels having a first image pixel size and the second digital image comprises a plurality of image pixels having a second image pixel size, converting the image pixels of the first digital image and the image pixels of the second digital image to a common image pixel size, and aligning the image pixels of the converted first digital image with corresponding image pixels of the converted second digital image.
  • This method embodiment may further include identifying a first set of image pixels in the first digital image, wherein each image pixel in the first set of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for the first set of image pixels in the converted first digital image; identifying a second set of image pixels in the converted second digital image, wherein each image pixel in the second set of image pixels in the converted second digital image corresponds to an image pixel in the first set of image pixels in the converted first digital image; determining an average RGB color value for the second set of image pixels in the converted second digital image, and generating a lookup table to associate the average XYZ color value for the first set of image pixels in the converted first digital image with the corresponding average RGB color value of the second set of image pixels in the converted second digital image.
  • the disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, aligning image data in the first digital image with image data in the second digital image; and generating a lookup table to associate the XYZ color of the first digital image with the corresponding RGB color of the second digital image in accordance with said alignment.
  • Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D .
  • This method embodiment may further include, wherein the first digital image comprises a plurality of image pixels having a first image pixel size and the second digital image comprises a plurality of image pixels having a second image pixel size, converting the first digital image and the second digital image to a common image pixel size, and aligning the converted image pixels of the first digital image with corresponding converted image pixels of the second digital image.
  • This method embodiment may further include identifying a plurality of first sets of image pixels in the converted first digital image, wherein each image pixel in each set of image pixels in the plurality of first sets of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for each set of image pixels in the plurality of first sets of image pixels in the converted first digital image; identifying a corresponding plurality of second sets of image pixels in the converted second digital image, wherein each image pixel in each set of image pixels in the plurality of second sets of image pixels in the converted second digital image corresponds to an image pixel in the converted first digital image; determining an average RGB color value for each set of image pixels in the plurality of second sets of image pixels in the converted second digital image, and generating a lookup table to associate the average XYZ color value for each of the first sets of image pixels in the converted first digital image with the corresponding average RGB color value of the corresponding second set of image pixels in the converted second digital image.
  • the disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; and generating a lookup table that associates the XYZ color of the first digital image and the RGB color of the second digital image.
  • Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D .
  • This method embodiment may further include providing XYZ color data to a display module for presentation of the second digital image on a display.
  • This method embodiment may further include storing the XYZ color data as part of the second digital image file.
  • This method embodiment may further include using pattern matching to register the first digital image to the second digital image.
  • This method embodiment may further include overlaying a grid on the specimen prior to creating the first and second digital images and using the grid in the first and second digital images to register the first digital image to the second digital image.
  • This method embodiment may further include combining pixels in one or more of the first digital image and the second digital image to cause the first image pixel size in the first digital image to be substantially the same as the second image pixel size in the second digital image.
  • This method embodiment may further include generating a single lookup table for a single stain.
  • This method embodiment may further include generating a single lookup table for a plurality of stains.
  • the disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; presenting the second digital image on a display; using a color measurement device to measure the XYZ values of the color presented on the display in a first region; and comparing the measured XYZ values of the first region to the XYZ values of the first digital image for the first region to validate the digital pathology system.
  • a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4 A and 12 A- 12 D .
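The comparison step in the validation embodiment above, measured display XYZ versus the reference XYZ from the hyperspectral image, can be expressed numerically in several ways. One plausible choice (an assumption, not a requirement of this disclosure) is a CIE76 color difference computed after converting both XYZ values to L*a*b* with an assumed D65 white point, as sketched below.

```python
# Hedged sketch: per-region validation by CIE76 delta-E between the XYZ value measured on the
# display and the reference XYZ value from the hyperspectral image. The D65 white point and
# the pass/fail tolerance are illustrative assumptions.
import math

D65_WHITE = (95.047, 100.0, 108.883)

def xyz_to_lab(xyz, white=D65_WHITE):
    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    x, y, z = (component / reference for component, reference in zip(xyz, white))
    return 116 * f(y) - 16, 500 * (f(x) - f(y)), 200 * (f(y) - f(z))

def delta_e76(xyz_measured, xyz_reference):
    l1, a1, b1 = xyz_to_lab(xyz_measured)
    l2, a2, b2 = xyz_to_lab(xyz_reference)
    return math.sqrt((l1 - l2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)

# Example: a region passes validation if the color difference is below a chosen tolerance.
print(delta_e76((41.2, 21.3, 1.9), (42.0, 22.0, 2.1)) < 2.0)
```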
  • the disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; converting the first digital image and the second digital image to a common image pixel size; identifying a plurality of first sets of image pixels in the converted first digital image, wherein each image pixel in each set of image pixels in the plurality of first sets of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for each set of image pixels in the plurality of first sets of image pixels in the converted first digital image; identifying a corresponding plurality of second sets of image pixels in the converted second digital image, wherein each image pixel in each set of image pixels in the plurality of second sets of image pixels in the converted second digital image corresponds to an image pixel in the converted first digital image; determining an average RGB color value for each set of image pixels in the plurality of second sets of image pixels in the converted second digital image; and generating a lookup table to associate the average XYZ color value for each of the first sets of image pixels in the converted first digital image with the corresponding average RGB color value of the corresponding second set of image pixels in the converted second digital image.

Abstract

Color calibration for digital pathology is provided. A standard glass slide is prepared with a specimen having zero or more stains. The specimen is scanned a first time using a hyperspectral imaging system to produce a first digital image having XYZ color values. The specimen is scanned a second time using a digital pathology imaging system to produce a second digital image having RGB color values. The first and second digital images are then registered against each other to align the digital image data. Individual pixels of the first and second images may be combined in the registration process so that the first and second digital images have substantially similar pixel sizes. A lookup table is generated to associate XYZ color values to RGB color values. Once the lookup table has been generated, it can be used to present RGB color on a display using the corresponding XYZ color.

Description

RELATED APPLICATION
The present application is a continuation of U.S. patent application Ser. No. 16/095,267, filed on Oct. 19, 2018, which is a national stage entry of International Patent App. No. PCT/US2017/028532, filed on Apr. 20, 2017, which claims priority to U.S. Provisional Patent App. No. 62/325,330, filed on Apr. 20, 2016, which are all hereby incorporated herein by reference as if set forth in full.
BACKGROUND Field of the Invention
The present invention generally relates to digital pathology and more specifically relates to systems and methods for calibrating color management in a digital pathology system and validating digital color data in connection with corresponding physical color spectrum.
Related Art
Accurate communication of color information is a continuing challenge in many industries. The most common approach to tackling this challenging problem is to generate specially constructed test targets that represent a particular color. In the photography industry, for example, these specially constructed test targets include an array of rows and columns, with each cell having a constant color. For reflected light applications, the specially constructed test target comes in the form of paper and for transmitted light applications, the specially constructed test target comes in the form of film. For example, for nature photography, the color patches are spectrally matched to blue sky, green foliage, brown skin, and so forth.
In the digital pathology industry, a significant problem exists in that the color being presented on a display is not the same as the color of the stained specimen. Exacerbating this problem is the lack of availability of any specially constructed test targets. This is generally because paper and/or film is an inadequate substitute for tissue as a medium for carrying the stains that contain the color information. Attempts to create such test targets using materials to approximate tissue have been largely unsuccessful. For example, special biopolymer strips have been hand crafted and stained with standard pathology stains. However, hand crafted fabrication of these targets is complex and expensive. Moreover, the specially constructed biopolymer strips, when stained, typically fail to provide an accurate color match to that of the same stain as applied to tissue. Accordingly, the unsolved problem is how to construct a test target that approximates tissue and provides an accurate color match when stains are applied to the test target.
Therefore, what is needed is a system and method that overcomes these significant problems found in the conventional systems as described above.
SUMMARY
To solve the problems discussed above, described herein are systems and methods for calibrating a digital pathology slide scanning system so that the colors of a scanned specimen presented on a display are substantially the same as the colors of the stained or unstained specimen on the physical slide.
The inventor has recognized that because the spectral properties of the staining agent change when bound to tissue, a test target that is constructed to faithfully approximate the characteristics of tissue with or without stains will always suffer significant drawbacks. This is because a proper solution requires that the spectral properties of the test targets match those of the objects being imaged. Accordingly, in the present description the inventor provides a solution that employs one or more standard slides as test targets that are used to successfully calibrate an imaging system.
In one aspect, a standard slide is prepared with a specimen. A single stain or a combination of stains or no stains can be applied to the specimen. A grid is overlaid or superimposed on the slide to divide the specimen into discrete sections. A digital image of the specimen on the slide is then obtained using a hyperspectral imaging system (referred to herein as the “hyperspectral image” or the “H image” or the “HYP image” or the “XYZ image”). Hyperspectral imaging systems are typically image tiling systems and, for example, a single field of view of the hyperspectral imaging system can be captured for each cell in the grid overlay. The resulting hyperspectral image for each cell includes a stack of images ranging between 400 nm and 750 nm (the visual spectrum). The number of images in the stack can vary, for example one image per 10 nm separation. The hyperspectral image stack is then processed to result in an XYZ color image having a plurality of individual picture elements (“pixels”) where each pixel has an XYZ color value. The hyperspectral image is then registered to the grid by mapping the top left corner pixel to the top left corner of the grid. The individual pixels in an XYZ color image can be combined to create super pixels, which advantageously reduces pixel location errors when subsequently associating XYZ pixels created by the hyperspectral imaging system to RGB pixels created by the digital pathology system.
Next, the same slide with the grid overlay is scanned using a digital pathology system having a color imaging capability. The resulting digital image (referred to herein as the “pathology image” or the “P image” or the “PATH image” or the “RGB image”) has a red, green and blue (“RGB”) value for each pixel. The pathology image is then registered to the grid by mapping the top left corner pixel to the top left corner of the grid. The individual pixels in the RGB image can also be combined to create super pixels and to match the pixel size of the hyperspectral image to allow for direct color comparison between the XYZ values and the RGB values. Also, the pixel sizes of both the hyperspectral image and the pathology image can be downsized or upsized to optimize the pixel size matching.
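As a minimal sketch of the pixel-combining step, the snippet below block-averages (bins) pixels so that two registered images reach a common pixel size. It assumes the two pixel pitches differ by an integer factor, which is a simplification of the more general resampling described here.

```python
# Hedged sketch: combining pixels into superpixels by block averaging so the hyperspectral
# and pathology images share a common pixel size (assumes an integer size ratio).
import numpy as np

def bin_pixels(image: np.ndarray, factor: int) -> np.ndarray:
    """Average each `factor` x `factor` block of an (H, W, C) image into one pixel."""
    h, w, c = image.shape
    h, w = h - h % factor, w - w % factor  # drop edge rows/columns that do not fill a block
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

rgb_image = np.random.rand(1024, 1024, 3)   # placeholder for the finer-pitch RGB image
rgb_superpixels = bin_pixels(rgb_image, 4)  # 4x4 binning
print(rgb_superpixels.shape)                # (256, 256, 3)
```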
Next, a lookup table (“LUT”) associating the XYZ color information for a single pixel in the hyperspectral image and the RGB color information for the same pixel in the pathology image is generated. The LUT associates the XYZ and RGB color information for all pixels in the pathology image. Advantageously, registration of the hyperspectral image to the pathology image, including image pixel size mapping, results in a one-to-one pixel association in the LUT. However, it is also possible to include in the LUT an average RGB value from a combined number of pixels of the pathology image in association with the XYZ value from a single pixel in the hyperspectral image or vice versa.
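A minimal sketch of the LUT step is shown below, assuming the one-to-one pixel association just described. Keying the table by integer RGB triplets and averaging the XYZ values that map to the same triplet are illustrative choices, not requirements of this disclosure.

```python
# Hedged sketch: building an RGB -> XYZ lookup table from registered, size-matched images.
import numpy as np
from collections import defaultdict

def build_lut(rgb_image: np.ndarray, xyz_image: np.ndarray) -> dict:
    """rgb_image: (H, W, 3) uint8; xyz_image: (H, W, 3) float, registered pixel-for-pixel.
    Returns a dict mapping each observed (R, G, B) triplet to the average XYZ value."""
    xyz_sums = defaultdict(lambda: np.zeros(3))
    counts = defaultdict(int)
    for rgb, xyz in zip(rgb_image.reshape(-1, 3), xyz_image.reshape(-1, 3)):
        key = tuple(int(v) for v in rgb)
        xyz_sums[key] += xyz
        counts[key] += 1
    return {key: xyz_sums[key] / counts[key] for key in xyz_sums}

# Illustrative data only; in practice the registered scanner images would be used.
rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
xyz = np.random.rand(64, 64, 3) * 100.0
lut = build_lut(rgb, xyz)   # lut[(r, g, b)] -> XYZ tristimulus values for that RGB color
```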
Once the LUT has been generated, it can be used by a display module so that the colors of a scanned specimen in a digital image file having RGB color data can be presented on a display using the corresponding hyperspectral XYZ color to result in the displayed color being substantially the same as the colors of the specimen on the physical slide. The displayed color can be measured by a color measurement device such as a colorimeter or a spectrophotometer.
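When the display module presents the image, the XYZ value retrieved from the LUT still has to be mapped into the display's own color space. The disclosure does not name a particular display space; the sketch below assumes a standard sRGB display and uses the usual D65 XYZ-to-sRGB matrix and gamma encoding.

```python
# Hedged sketch: converting calibrated XYZ values to sRGB for presentation on a display
# (the choice of sRGB and the D65 matrix are assumptions, not part of this disclosure).
import numpy as np

M_XYZ_TO_LINEAR_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                                 [-0.9689,  1.8758,  0.0415],
                                 [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb(xyz: np.ndarray) -> np.ndarray:
    """xyz: (..., 3) with luminance Y scaled to 0..1. Returns gamma-encoded sRGB in 0..1."""
    linear = np.clip(xyz @ M_XYZ_TO_LINEAR_SRGB.T, 0.0, 1.0)
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.power(linear, 1 / 2.4) - 0.055)

print(xyz_to_srgb(np.array([0.9505, 1.0, 1.089])))  # D65 white maps to roughly [1, 1, 1]
```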
In an alternative embodiment, a standard slide having a specimen is scanned using the hyperspectral imaging system as described above to generate the XYZ image. The same slide is also scanned using the digital pathology system to generate the RGB image. The pixels of the XYZ image and RGB image are registered to each other to align the respective images. If the sizes of the individual pixels of the imaging sensors in the hyperspectral imaging system and the digital pathology imaging system differ, then pixel combining or downsizing may be employed to facilitate proper alignment of the XYZ and RGB images and proper image pixel size matching.
After scanning, one of the XYZ or RGB images is indexed to identify a small number of colors into which every pixel in the image being indexed can be assigned while also minimizing the error values for each association of an image pixel to a color. For example, while the camera sensor may be capable of sensing millions of colors, the indexing process may advantageously reduce the number of colors in the XYZ image to ten and each pixel in the XYZ image is assigned to one of the ten colors. In one embodiment, during indexing all pixels that are close to the same color are averaged into a single color of the index color palette. This process is done iteratively until all pixels have been assigned to an averaged single color value where the error value of the original pixel color compared to the averaged single color is minimized across all pixels in the XYZ image. In one embodiment, the error value can be minimized using a root mean square analysis.
Once the XYZ image has been indexed, the result is a set of N pixel groups, where N is the index value (ten in the example above) and combining the pixels in the N pixel groups results in the complete XYZ image. A pixel grouping is also referred to herein as an “index.” After the XYZ and RGB images have been registered, the pixels in the RGB image can be likewise associated into the same N pixel groups in accordance with the pixel registration between the XYZ and RGB images. Each of the N pixel groups from the RGB image are analyzed to calculate an average color value for each of the N pixel groups. The result is that each index in the XYZ image has an average color value and each index in the RGB image has an average color value and these average color values are used to generate the LUT.
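The indexing loop described above behaves much like k-means clustering on the XYZ values. The sketch below takes that interpretation as an assumption, along with N = 10, a fixed iteration count, and random initialization.

```python
# Hedged sketch: indexing the XYZ image into N averaged colors with a k-means-style loop,
# then averaging the registered RGB pixels that fall into each index group.
# N = 10, the iteration count, and the random initialization are illustrative assumptions.
import numpy as np

def index_colors(pixels: np.ndarray, n_colors: int = 10, iterations: int = 20, seed: int = 0):
    """pixels: (num_pixels, 3) color values. Returns (labels, index_color_values)."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), n_colors, replace=False)].copy()
    for _ in range(iterations):
        distances = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        for k in range(n_colors):
            members = pixels[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)  # average color of this index group
    return labels, centers

xyz_pixels = np.random.rand(5000, 3) * 100.0   # flattened XYZ image (placeholder data)
rgb_pixels = np.random.rand(5000, 3) * 255.0   # registered RGB pixels in the same order
labels, xyz_index = index_colors(xyz_pixels)
rgb_index = np.array([rgb_pixels[labels == k].mean(axis=0) for k in range(len(xyz_index))])
# Each pair (xyz_index[k], rgb_index[k]) becomes one entry of the lookup table.
```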
Once the LUT has been generated, it can be used by a display module so that the colors of a scanned specimen in a digital image file having RGB color data can be presented on a display using the corresponding hyperspectral XYZ color to result in the displayed color (e.g., as measured by a colorimeter or spectrophotometer) being substantially the same as the colors of the specimen on the physical slide.
Other features and advantages of the present invention will become more readily apparent to those of ordinary skill in the art after reviewing the following detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The structure and operation of the present invention will be understood from a review of the following detailed description and the accompanying drawings in which like reference numerals refer to like parts and in which:
FIG. 1 is a plan view diagram illustrating an example digital pathology slide having a barcode and a grid overlay according to an embodiment of the invention;
FIG. 2 is a block diagram illustrating an example prior art hyperspectral imaging system according to an embodiment of the invention;
FIG. 3A is a block diagram illustrating an example hyperspectral imaging stack as generated by a hyperspectral imaging system according to an embodiment of the invention;
FIG. 3B is a graph diagram illustrating an example color matching for use with a hyperspectral imaging stack according to an embodiment of the invention;
FIG. 3C is a block diagram illustrating an example hyperspectral image in XYZ color according to an embodiment of the invention;
FIG. 3D is a flow diagram illustrating an example process for converting a hyperspectral image stack to a single XYZ color image according to an embodiment of the invention;
FIG. 4A is a block diagram illustrating an example image processor device according to an embodiment of the invention;
FIG. 4B is a block diagram illustrating example superpixels of a hyperspectral image in XYZ color according to an embodiment of the invention;
FIG. 5 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention;
FIG. 6 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention;
FIGS. 7A, 7B and 7C are graph diagrams illustrating example comparisons of color values of a specimen scanned by a hyperspectral imaging system versus superpixeling color values of the same specimen scanned by a digital pathology imaging system and presented on a display and measured by a color measurement device according to an embodiment of the invention;
FIG. 8 is a block diagram illustrating an example set of pixel groups that form a composite XYZ image according to an embodiment of the invention;
FIG. 9 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention;
FIG. 10 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention;
FIGS. 11A, 11B and 11C are graph diagrams illustrating example comparisons of color values of a specimen scanned by a hyperspectral imaging system versus indexing color values of the same specimen scanned by a digital pathology imaging system and presented on a display and measured by a color measurement device according to an embodiment of the invention;
FIG. 12A is a block diagram illustrating an example processor enabled device 550 that may be used in connection with various embodiments described herein;
FIG. 12B is a block diagram illustrating an example line scan camera having a single linear array;
FIG. 12C is a block diagram illustrating an example line scan camera having three linear arrays; and
FIG. 12D is a block diagram illustrating an example line scan camera having a plurality of linear arrays.
DETAILED DESCRIPTION
Certain embodiments disclosed herein provide for color calibration of a digital pathology system. For example, one embodiment disclosed herein allows for a pathology slide to be scanned by a hyperspectral imaging system and the colors of the resulting digital hyperspectral image compared to the colors of a digital image of the same slide scanned by the digital pathology system. The comparison results in a lookup table that translates RGB values into XYZ values so that when the digital slide image scanned by the digital pathology system is presented on a display, the presented colors match the XYZ values corresponding to the true colors of the physical specimen on the slide. After reading this description it will become apparent to one skilled in the art how to implement the invention in various alternative embodiments and alternative applications. However, although various embodiments of the present invention will be described herein, it is understood that these embodiments are presented by way of example only, and not limitation. As such, this detailed description of various alternative embodiments should not be construed to limit the scope or breadth of the present invention as set forth in the appended claims.
FIG. 1 is a plan view diagram illustrating an example digital pathology slide 10 having a barcode 20 and a grid overlay 30 according to an embodiment of the invention. In the illustrated embodiment, the grid overlay 30 is positioned over the sample that is on the slide 10. The grid overlay 30 is used to facilitate registration of the hyperspectral digital image to the digital pathology digital image. An alternative way to register the images is to use image pattern matching of the two digital images in order to align the hyperspectral digital image and the digital pathology digital image. For successful image pattern matching it is helpful to have digital images of the same magnification.
FIG. 2 is a block diagram illustrating an example prior art hyperspectral imaging system 50 according to an embodiment of the invention. In the illustrated embodiment, the hyperspectral imaging system 50 includes a 2D pixel array monochrome camera 60, a microscope 70, a slide 80 that supports a specimen and a narrow band filter wheel 90. In one embodiment, the monochrome camera 60 of the hyperspectral imaging system 50 is a monochrome line scan camera. Preferably, the monochrome line scan camera 60 in the hyperspectral imaging system 50 has the same characteristics as the color line scan camera used in the digital pathology imaging system. Employing line scan cameras having the same characteristics (e.g., pixel size) in both the hyperspectral and digital pathology imaging systems advantageously simplifies image registration by reducing or eliminating the need for pixel matching—or by allowing the pixels to be easily resampled, e.g., by downsampling the pixels into larger superpixels to simplify registration of the XYZ and RGB images.
FIG. 3A is a block diagram illustrating an example hyperspectral imaging stack 100 as generated by a hyperspectral imaging system according to an embodiment of the invention. As will be understood by the skilled artisan, a hyperspectral imaging system generates a set of individual images that are each captured using a different wavelength of light, for example by using a narrow band filter wheel and capturing an image of the same region using each filter of the filter wheel. This set of images is referred to herein as a hyperspectral stack or a spectral stack. A spectral stack can be generated for an individual region of a sample or for an entire sample/entire slide. For example, in one embodiment, a hyperspectral imaging system using a line scan camera can capture a whole slide image using each filter on the filter wheel to generate a whole slide image spectral stack.
FIG. 3B is a graph diagram illustrating an example color matching for use with a hyperspectral imaging stack according to an embodiment of the invention. In the illustrated embodiment, color matching functions can be applied to each digital image in the whole slide image spectral stack at a variety of wavelengths of light (110, 120 and 130) to generate a whole slide hyperspectral image 140 in XYZ color. FIG. 3C is a block diagram illustrating an example whole slide hyperspectral image 140 in XYZ color according to an embodiment of the invention.
FIG. 3D is a flow diagram illustrating an example process for converting a whole slide hyperspectral image stack 100 to a single hyperspectral image 140 in XYZ color according to an embodiment of the invention. Initially, whole slide images for each different wavelength of light are captured by the hyperspectral imaging system to generate a hyperspectral image stack 100. Then color matching functions are applied to each digital image in the spectral stack 100 at a variety of wavelengths of light (110, 120, 130) to generate a whole slide hyperspectral digital image 140 of the entire specimen in XYZ color.
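Numerically, this step amounts to weighting each wavelength plane of the stack by the x-bar, y-bar, and z-bar color matching functions and summing over wavelength. In the sketch below, Gaussian curves stand in for the real tabulated CIE 1931 functions purely to keep the example self-contained; they are placeholders, not the functions used by this disclosure.

```python
# Hedged sketch: collapsing a hyperspectral stack into an XYZ image by weighting each
# wavelength plane with color matching functions and summing over wavelength. The Gaussians
# are stand-ins for the tabulated CIE 1931 curves (an assumption for self-containment).
import numpy as np

wavelengths = np.arange(400, 751, 10)                   # nm, one plane per 10 nm
stack = np.random.rand(len(wavelengths), 256, 256)      # (planes, H, W) illustrative data

def gaussian(lam, mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

x_bar = gaussian(wavelengths, 600, 40) + 0.35 * gaussian(wavelengths, 445, 20)
y_bar = gaussian(wavelengths, 555, 45)
z_bar = 1.8 * gaussian(wavelengths, 450, 25)

# Each output channel is a weighted sum of the wavelength planes.
xyz_image = np.stack([np.tensordot(cmf, stack, axes=1) for cmf in (x_bar, y_bar, z_bar)],
                     axis=-1)                           # (H, W, 3)
print(xyz_image.shape)
```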
FIG. 4A is a block diagram illustrating an example image processor device 260 according to an embodiment of the invention. In the illustrated embodiment, the image processor device 260 is a processor-enabled device having a processor 267 and a non-transitory data storage area 265 for storing information and instructions that can be executed by the processor 267. For example, the data storage area 265 may store a plurality of hyperspectral XYZ images and a plurality of digital pathology RGB images and a plurality of instructions for processing such images. As shown in the illustrated embodiment, the image processor device 260 includes a register module 270, a superpixel module 280, an index module 290 and a LUT module 295. In one embodiment, the superpixel module 280 and the index module 290 may be combined into a color module 285. The image processor device 260 may also be communicatively coupled with an integral or external display device 576. In one embodiment, a color measurement device 577 may be configured to read color information from the display device 576 and translate the color information into one or more XYZ values.
The register module 270 is configured to register two digital images to each other. For example, the register module 270 is configured to register a hyperspectral XYZ image to a digital pathology RGB image. The register module 270 registers two digital images by aligning the image data of the two images in X-Y to bring the two images into X-Y alignment, for example by pattern matching of features in the image data. The register module 270 also registers two digital images by adjusting the digital images so that they have common characteristics. For example, the register module 270 may evaluate and adjust image characteristics including image pixel size and spatial alignment of translation, rotation, and magnification. Additionally, the register module 270 may also account for optical distortions between the two separate systems, which can be detected at the pixel level.
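X-Y alignment by pattern matching can be implemented in many ways. The sketch below uses FFT-based phase correlation to estimate a whole-pixel translation between two grayscale renderings of the images; this is one plausible approach, not the specific method of this disclosure, and rotation, magnification, and distortion corrections are omitted.

```python
# Hedged sketch: estimating the X-Y translation between two same-size grayscale images by
# phase correlation (one possible pattern-matching approach; not prescribed by the patent).
import numpy as np

def estimate_shift(reference: np.ndarray, moving: np.ndarray):
    """Both inputs: 2-D float arrays of the same shape. Returns the (dy, dx) displacement
    of `moving` relative to `reference` in whole pixels."""
    F_ref, F_mov = np.fft.fft2(reference), np.fft.fft2(moving)
    cross_power = np.conj(F_ref) * F_mov
    cross_power /= np.abs(cross_power) + 1e-12
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    h, w = reference.shape
    if dy > h // 2:
        dy -= h   # wrap to signed offsets
    if dx > w // 2:
        dx -= w
    return dy, dx

reference = np.random.rand(256, 256)
moving = np.roll(reference, shift=(7, -12), axis=(0, 1))  # simulate a known offset
print(estimate_shift(reference, moving))                  # expected: (7, -12)
```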
The superpixel module 280 is configured to identify contiguous image pixels that have the same or similar color values and combine those pixels into a single superpixel. The superpixel module 280 is also configured to determine a color value for the superpixel by averaging the color values of all of the individual image pixels in the superpixel. Averaging is important to reduce noise due to possible measurement and registration errors. The average color value can be determined for a superpixel, for example, by summing the color values of the image pixels in the superpixel and dividing the sum by the number of pixels for that superpixel. The superpixel module 280 can advantageously identify a plurality of superpixels in a digital image and determine a color value for each of the plurality of superpixels.
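A minimal sketch of the contiguous-grouping idea follows: a flood fill grows a superpixel from a seed pixel over 4-connected neighbors whose color stays within a tolerance of the seed, then the member colors are averaged. The tolerance value, the connectivity, and the seed-relative comparison are illustrative assumptions.

```python
# Hedged sketch: grouping contiguous, similar-colored pixels into superpixels and averaging
# each group's color. The tolerance and 4-connectivity are illustrative choices.
import numpy as np
from collections import deque

def superpixels(image: np.ndarray, tolerance: float):
    """image: (H, W, 3) color values. Returns (labels, mean_colors)."""
    h, w, _ = image.shape
    labels = np.full((h, w), -1, dtype=int)
    mean_colors = []
    for seed_y in range(h):
        for seed_x in range(w):
            if labels[seed_y, seed_x] != -1:
                continue
            label, seed_color = len(mean_colors), image[seed_y, seed_x]
            labels[seed_y, seed_x] = label
            queue, members = deque([(seed_y, seed_x)]), []
            while queue:
                y, x = queue.popleft()
                members.append(image[y, x])
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and np.linalg.norm(image[ny, nx] - seed_color) < tolerance):
                        labels[ny, nx] = label
                        queue.append((ny, nx))
            mean_colors.append(np.mean(members, axis=0))  # average color of this superpixel
    return labels, np.array(mean_colors)

labels, colors = superpixels(np.random.rand(32, 32, 3) * 100.0, tolerance=15.0)
print(labels.max() + 1, "superpixels found")
```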
The index module 290 is configured to identify individual image pixels that have the same or similar color values and assign these individual image pixels to one of a plurality of color indices. The index module 290 is also configured to determine a color value for each color index by averaging the color values of all of the individual image pixels in the respective color index. Averaging is important to reduce noise due to possible measurement and registration errors. The average color value can be determined for an index, for example, by summing the color values of the image pixels in the index and dividing the sum by the number of pixels for that index.
The LUT module 295 is configured to generate one or more lookup tables that correlate XYZ color values to RGB color values.
FIG. 4B is a block diagram illustrating an example registered whole slide image 170 according to an embodiment of the invention. In the illustrated embodiment, whole slide image 170 may be a hyperspectral image or a digital pathology image. During the registration process, image data from the hyperspectral digital image and the pathology digital image is analyzed to achieve X-Y alignment. For example, the image data may be analyzed to identify features in the image data that can be matched and aligned in order to register the hyperspectral digital image to the pathology digital image in X-Y.
Advantageously, pattern matching can be used to associate common features between the hyperspectral digital image and the pathology digital image to facilitate X-Y alignment.
Additionally during the registration process, the image data for the hyperspectral digital image and the pathology digital image is converted to common characteristics. This is because the imaging hardware of the hyperspectral scanning system and the pathology scanning system is unlikely to produce identical digital image data with respect to, for example, the magnification and image pixel size in the digital image data. Accordingly, during the registration process, the image data for the hyperspectral digital image and the pathology digital image is adjusted to have common characteristics. For example, a magnification adjustment may be needed and an adjustment to a common image pixel size is nearly always needed. In the illustrated embodiment, image pixel 180 is an image pixel having the common image pixel size.
FIG. 5 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4A. Initially, in step 300 one or more test slides are prepared. A test slide is any type of slide that will be scanned by both the hyperspectral imaging system and the digital pathology imaging system and used for calibration/validation purposes. Accordingly, no special type of slide preparation is required to be a test slide and there are no special characteristics of a test slide. Any slide having any stains can be used as a test slide to calibrate a digital pathology scanning apparatus. This is a substantial deviation from all prior color calibration attempts and eliminates the problems of attempting to create a test slide or a test color sample. Using a slide prepared in the normal way with a specimen and stains also allows the scanning apparatus to be calibrated using actual stains as modified by their application to tissue that will be encountered during production scanning. This provides a significant advantage. Moreover, using multiple test slides can result in a combined LUT that includes color values for multiple stains.
Additionally, in one embodiment, a registration grid can be overlaid on the slide—preferably over the portion of the slide that includes the sample. The registration grid, if present, can be later used as a marker in the hyperspectral digital image and the digital pathology digital image to register the hyperspectral digital image to the digital pathology digital image by aligning the image data in X-Y.
Next, in step 310 a hyperspectral image is scanned and stored. The hyperspectral image may be scanned as separate image tiles using tiling system hardware or the hyperspectral image may be scanned as a whole slide image using line scanning system hardware. Advantageously, the present color calibration and validation systems and methods are hardware agnostic.
After scanning, the native hyperspectral image includes one or more spectral stacks having a plurality of individual images that are each processed using color matching functions to generate a single digital image in XYZ color. Next, in step 320 a color digital pathology image is scanned and stored. The scanned digital pathology image is in RGB color. The digital pathology image may also be scanned as separate image tiles using a tiling system or the digital pathology image may be scanned as a whole slide image using a line scanning system. While there are advantages to using cameras with the same or very similar characteristics such as pixel size and pixel number in the hyperspectral imaging system and the digital pathology imaging system, these advantages primarily serve to simplify the image registration process and make the registration process robust.
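The collapse of a spectral stack into a single XYZ image using color matching functions, described above, can be sketched as follows. This is a minimal numpy illustration that assumes the spectral planes and the CIE 1931 color matching functions are sampled at the same, uniformly spaced wavelengths; the array names and the normalization convention are assumptions made for the example.

```python
import numpy as np

def spectral_stack_to_xyz(stack, cmf, delta_lambda=10.0):
    """Collapse a hyperspectral stack into one XYZ image.

    stack        : ndarray (H, W, B), intensity per wavelength band
    cmf          : ndarray (B, 3), CIE x-bar, y-bar, z-bar at the same bands
    delta_lambda : wavelength spacing in nm (uniform sampling assumed)
    """
    # A weighted sum over the wavelength axis approximates the CIE integrals
    xyz = np.tensordot(stack, cmf, axes=([2], [0])) * delta_lambda   # (H, W, 3)

    # Scale so that a flat reference spectrum maps to Y = 1 (relative colorimetry)
    k = 1.0 / (cmf[:, 1].sum() * delta_lambda)
    return xyz * k
```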
Next in step 330, the hyperspectral image and the digital pathology image are registered to each other. Image registration includes X-Y alignment, for example by pattern matching, and conversion to common characteristics, for example, magnification and image pixel size. In some embodiments, the registration process includes generation of resampled image pixels (larger or smaller) to make the individual image pixel size from the separate scanning systems substantially the same. In some embodiments, the registration process may also include localized variations in translation to account for optical distortions.
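A minimal sketch of two of these registration sub-steps, resampling to a common image pixel size and estimating the X-Y translation by pattern matching, is shown below. The block-averaging resampler (which assumes an integer pixel-size ratio) and the FFT phase-correlation matcher are illustrative choices, not the only way these steps could be performed.

```python
import numpy as np

def resample_to_common_grid(image, factor):
    """Block-average an (H, W, C) image by an integer factor so both scans
    end up with substantially the same image pixel size (integer ratio assumed)."""
    h, w, c = image.shape
    h, w = h - h % factor, w - w % factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

def estimate_xy_shift(ref_gray, mov_gray):
    """Estimate the integer X-Y translation between two grayscale images of the
    same size using FFT phase correlation, one common pattern-matching approach."""
    cross_power = np.fft.fft2(ref_gray) * np.conj(np.fft.fft2(mov_gray))
    corr = np.fft.ifft2(cross_power / (np.abs(cross_power) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shifts beyond half the image wrap around to negative offsets
    if dy > ref_gray.shape[0] // 2:
        dy -= ref_gray.shape[0]
    if dx > ref_gray.shape[1] // 2:
        dx -= ref_gray.shape[1]
    return dy, dx
```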
Once the hyperspectral image and the digital pathology images are registered to each other, in step 340 the color groups in the hyperspectral image are determined. In a simple embodiment, each individual image pixel in the hyperspectral image is its own color group. However, this would result in significant noise because an individual image pixel is very small and consequently the sample size for each color is also very small. To reduce the noise, in step 340 contiguous individual image pixels having the same or very similar colors are combined into larger superpixels such as superpixel 190 shown in FIG. 4B. The larger the superpixel, the greater the sample size, which has the advantage of reduced noise. However, a disadvantage of including more pixels in a superpixel is that it reduces the range of colors across all superpixels. Once a superpixel has been identified, the color of the superpixel determines a color group. In one embodiment, the color values of all image pixels included in the superpixel are averaged together to determine an average color value and that average color value is determined to be the color for that color group, as shown in step 350.
Advantageously, the color groups in the hyperspectral image cover every image pixel in the hyperspectral image and each color group has an X-Y perimeter. Accordingly, because the hyperspectral image and the digital pathology image have been registered to each other, the X-Y perimeters of the color groups from the hyperspectral image can be applied to the digital pathology image as shown in step 360 to associate the individual image pixels of the digital pathology image into the same color groups as the hyperspectral image. Accordingly, the color values of the individual image pixels of each color group in the digital pathology image can be similarly averaged in step 370 to determine an average color value for each color group in the digital pathology image.
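Because the two registered images share a pixel-for-pixel correspondence and a common pixel size, a single label map of color groups can be used to average both of them. A minimal sketch follows; the helper name and the use of np.bincount are illustrative assumptions.

```python
import numpy as np

def group_means(labels, image):
    """Average the color of every pixel group defined by a registered label map.

    labels : ndarray (H, W), group id per pixel (superpixel or index)
    image  : ndarray (H, W, 3), XYZ or RGB values on the same registered grid
    Returns an (n_groups, 3) array of average colors.
    """
    flat = labels.ravel()
    n = int(flat.max()) + 1
    counts = np.bincount(flat, minlength=n).astype(float)
    sums = np.stack([np.bincount(flat, weights=image[..., ch].ravel(), minlength=n)
                     for ch in range(3)], axis=1)
    return sums / counts[:, None]

# The same label map serves both registered images (hypothetical variable names):
#   xyz_means = group_means(labels, xyz_image)   # step 350, hyperspectral XYZ
#   rgb_means = group_means(labels, rgb_image)   # step 370, digital pathology RGB
```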
Once the average color values for the various color groups in the hyperspectral image and the average color values for the same color groups in the digital pathology image have been established, these color values can be correlated to each other in a lookup table having XYZ color values associated with their correlated RGB color values, as shown in step 380. In one embodiment, the lookup table can be embedded in a data structure that houses the digital pathology image. In one embodiment, the correlation of the XYZ color data to the RGB color data can be included in the digital pathology image data structure as part of an International Color Consortium (ICC) profile. For example, as previously mentioned, the lookup table can be embedded in the digital pathology image data structure or alternatively, the information in the lookup table can be converted into a mathematical model or formula or a set of executable instructions and the model or formula or set of instructions can be embedded in the digital pathology image data structure. An advantage of embedding the model or formula or set of instructions is that the data size of the model or formula or set of instructions is smaller and thereby reduces the size of the digital pathology image data structure. Another advantage is that the model or formula or set of instructions functions to average out minor differences and discrepancies in the correlation of the XYZ color data to the RGB color data, which may be introduced, for example, due to metamerism. Metamerism occurs when two colors that are not actually the same (i.e., they have different spectral compositions) appear to be the same under certain lighting conditions.
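As one illustration of the mathematical model or formula alternative mentioned above, the correlated lookup-table entries could be fit with a simple affine RGB-to-XYZ model by least squares. The affine form, the 0-255 RGB scaling, and the function names are assumptions made for this sketch; a richer model, or the full lookup table itself, may be preferable in practice.

```python
import numpy as np

def fit_rgb_to_xyz(rgb_means, xyz_means):
    """Fit an affine RGB -> XYZ model to the correlated table entries.

    rgb_means : (N, 3) average RGB per color group, 0-255
    xyz_means : (N, 3) average XYZ per color group
    Returns a (4, 3) matrix M such that [r, g, b, 1] @ M approximates [X, Y, Z].
    """
    A = np.hstack([rgb_means / 255.0, np.ones((len(rgb_means), 1))])
    M, *_ = np.linalg.lstsq(A, xyz_means, rcond=None)
    return M

def apply_rgb_to_xyz(rgb, M):
    """Apply the fitted model to any (..., 3) array of RGB values."""
    flat = rgb.reshape(-1, 3) / 255.0
    xyz = np.hstack([flat, np.ones((len(flat), 1))]) @ M
    return xyz.reshape(rgb.shape)
```

A least-squares fit of this kind also smooths out small inconsistencies among the individual table entries, which is the averaging effect described above.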
In one embodiment, a single combined lookup table is generated over time from a plurality of slides having a plurality of different stains. Advantageously, a single combined lookup table can be generated and optimized over time such that the single combined lookup table can be used for any type of digital pathology slide with any type of staining profile.
FIG. 6 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using a superpixel process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4A. In the illustrated embodiment, a test slide is initially prepared in step 400. As previously discussed, the test slide can be any slide prepared in the normal fashion using a specimen and zero or more stains. Next, in step 410 a lookup table is generated, for example using the process previously described with respect to FIG. 5 . The lookup table may contain color values such as those shown in the Hyperspectral XYZ column and the associated Digital Pathology RGB column of Table 1 below, where each row represents the same color group (e.g., a superpixel) in the hyperspectral digital image and the digital pathology digital image.
TABLE 1
SUPERPIXEL   HYPERSPECTRAL XYZ      DIGITAL PATHOLOGY RGB   DIGITAL PATHOLOGY XYZ
1            0.777, 0.631, 0.551    225, 147, 169           0.770, 0.625, 0.556
2            0.743, 0.574, 0.500    219, 130, 154           0.750, 0.570, 0.510
3            0.712, 0.426, 0.454    213, 117, 140           0.719, 0.431, 0.450
4            0.683, 0.485, 0.413    208, 105, 128           0.678, 0.481, 0.418
Table 1 illustrates, for each superpixel, a correlation between hyperspectral image data in XYZ color values, digital pathology image data in RGB color values, and digital pathology image data in XYZ color values.
Next, in step 420 the XYZ values of the digital pathology image are determined. This can be done in at least two ways. A first way is that the XYZ values for the digital pathology image can be calculated from the RGB values of the digital pathology image—for example using a lookup table or a formula. A second way is that the digital pathology image can be presented on a display and the colors emitted from the display can be measured using a color measurement device (e.g., colorimeter or spectrophotometer) that measures color in XYZ value.
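As one concrete example of the first way, if the digital pathology image were encoded as standard sRGB, its XYZ values could be computed with the well-known sRGB linearization and 3x3 matrix for a D65 white point. Whether the digital pathology image actually uses an sRGB encoding, rather than the calibrated lookup table or formula described above, is an assumption made only for this sketch.

```python
import numpy as np

# Standard matrix from linear sRGB to CIE XYZ (D65 white point)
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

def srgb_to_xyz(rgb8):
    """Convert 8-bit sRGB values of shape (..., 3) to CIE XYZ in [0, 1]."""
    c = rgb8 / 255.0
    # Undo the sRGB transfer function (gamma) to obtain linear RGB
    linear = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    return linear @ SRGB_TO_XYZ.T
```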
Finally, in step 430 the calculated or measured XYZ value for a particular region (e.g., a superpixel) of the digital pathology image is compared to the hyperspectral image XYZ value for the same region. In this fashion, the color information generated by the digital pathology apparatus and presented on a display screen can be validated.
FIGS. 7A, 7B and 7C are graph diagrams illustrating example comparisons of superpixel color values of a specimen scanned by a hyperspectral imaging system versus superpixel color values of the same specimen scanned by a digital pathology imaging system according to an embodiment of the invention. The XYZ color values from the digital pathology imaging system were obtained by presenting the digital slide image on a display and measuring a superpixel region with a color measurement device. Alternatively, because a superpixel region can be very small and therefore difficult to measure with a color measurement device, the color value for the superpixel may be used to fill the entire display with a single color, and a region of the display may then be measured with the color measurement device. As demonstrated by the graphs, the color values measured as coming off of the display screen were extremely close to the true color values as measured by the hyperspectral imaging system, with an average difference of 2.18, which is less than one Just Noticeable Difference.
In FIG. 7A, graph 200 shows a comparison of the system hyperspectral value for lightness to the digital pathology display value for lightness. Similarly, in FIG. 7B, graph 220 shows a comparison of the system hyperspectral value for green/red to the digital pathology display value for green/red. Similarly, in FIG. 7C, graph 240 shows a comparison of the system hyperspectral value for blue/yellow to the digital pathology display value for blue/yellow. Additionally, in each of the graphs 200, 220 and 240, it is evident that there are a very large number of individual comparisons. Notably, each individual comparison corresponds to a separate superpixel. After the registration process, the number of image pixels in the hyperspectral image and the digital pathology image remains very large, but the consequence of having a large number of superpixels is that the sample size (i.e., number of image pixels) for each superpixel is small, and therefore noise is increased in the data set.
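The lightness, green/red, and blue/yellow axes plotted in FIGS. 7A-7C correspond to the CIELAB quantities L*, a*, and b*, and differences of the kind summarized above are conventionally expressed as a CIELAB color difference (delta E). A minimal sketch of such a comparison is shown below; the D65 reference white and the CIE76 difference formula are assumptions, and other choices could equally be used.

```python
import numpy as np

D65_WHITE = np.array([0.95047, 1.0, 1.08883])   # assumed reference white (Xn, Yn, Zn)

def xyz_to_lab(xyz, white=D65_WHITE):
    """Convert CIE XYZ values of shape (..., 3) to CIELAB (L*, a*, b*)."""
    t = xyz / white
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e76(xyz_reference, xyz_measured):
    """CIE76 color difference between reference (hyperspectral) and measured values."""
    return np.linalg.norm(xyz_to_lab(xyz_reference) - xyz_to_lab(xyz_measured), axis=-1)
```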
FIG. 8 is a block diagram illustrating an example set of pixel groups that form a composite XYZ image 650 according to an embodiment of the invention. In the illustrated embodiment, there are ten indices, namely index 1 600, index 2 605, index 3 610, index 4 615, index 5 620, index 6 625, index 7 630, index 8 635, index 9 640, and index 10 645. Each of the indices represents a separate color group for the underlying digital image. When combined together, the ten indices result in the composite image 650. The index color palette 660 represents each of the color values corresponding to each individual index.
As can be seen in the individual index images, each index image represents a scattering of all image pixels in the underlying digital image that have the same color value within a certain threshold. The indexing process can be applied to either the XYZ digital image or the RGB digital image. Advantageously, the indexing process analyzes the color values for each pixel in the digital image and identifies all pixels, regardless of X-Y location, that belong to a single color value. In the illustrated embodiment, an entire digital pathology digital image, whether created by a hyperspectral imaging system or a pathology imaging system, can be indexed into about ten (10) color values. A significant advantage of indexing all image pixels in a digital image into a relatively small number of color values is that the sample size for each color value is significantly increased, which significantly reduces noise. Another advantage of indexing all image pixels in a digital image into a relatively small number of color values is that it provides the widest range of average color values from the smallest number of indices.
FIG. 9 is a flow diagram illustrating an example process for calibrating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4A. Initially, in step 700 a slide is obtained. As previously discussed, any type of slide with a specimen having zero or more stains is suitable. Next, in step 710 a hyperspectral image is scanned and stored as an XYZ image. As previously discussed, the hyperspectral image may be scanned as separate image tiles or as a whole slide image using any type of scanning system hardware. Next, in step 720 a color digital pathology image is scanned and stored as an RGB image.
Next in step 730, the hyperspectral image and the digital pathology image are registered to each other as previously described. Image registration includes X-Y alignment and conversion to common characteristics such as image pixel size. Once the hyperspectral image and the digital pathology images are registered to each other, in step 740 the hyperspectral image is indexed to identify a set of colors into which every image pixel in the hyperspectral image can be allocated. In a simple embodiment, the indexing process receives an index value (i.e., the total number of indices) and then sorts the individual pixels into that number of color groups in a fashion that minimizes the error values associated with allocating each pixel into an index that is defined by a color value that is not identical to the color value of the respective pixel being allocated to the index. For example, the index module 290 may be configured to use an index value of ten or fifteen or twenty for any digital image. In a more complex embodiment, the index module 290 may be configured to analyze the digital image data to determine an optimal index value that allocates each pixel into the smallest number of indices that minimizes the error value over the entire digital image.
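One way to allocate every pixel to a fixed number of color indices while keeping the per-pixel allocation error small is a k-means style color quantization, sketched minimally below. The index count, iteration count, random initialization, and function name are assumptions for this example, and a whole slide image would typically be subsampled or processed in chunks rather than passed in whole.

```python
import numpy as np

def index_colors(image, n_indices=10, n_iters=20, seed=0):
    """Allocate every pixel of an (H, W, 3) image to one of n_indices color groups.

    Returns (labels of shape (H, W), index colors of shape (n_indices, 3))."""
    pixels = image.reshape(-1, 3).astype(float)
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), n_indices, replace=False)]
    for _ in range(n_iters):
        # Assign each pixel to the nearest index color
        dist = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # Recompute each index color as the mean of its member pixels
        for k in range(n_indices):
            members = pixels[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return labels.reshape(image.shape[:2]), centers
```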
Once all of the individual image pixels have been allocated to an index of the hyperspectral image, the color value of the respective index is determined by averaging the color values of all individual image pixels in the respective index to determine an average color value and that average color value is determined to be the color for that respective index of the hyperspectral image, as shown in step 750.
Advantageously, the combined indices in the hyperspectral image include every image pixel in the hyperspectral image. Accordingly, because the hyperspectral image and the digital pathology image have been registered to each other, each index from the hyperspectral image can be applied to the digital pathology image in step 760 in order to group together the same individual image pixels included in an index for the hyperspectral image in a corresponding index for the digital pathology image. This can be accomplished because the hyperspectral image and the digital pathology image were previously registered to each other and their respective image pixel sizes were adjusted to be the same.
Once all of the individual image pixels of the digital pathology image have been allocated to an index, the color value of each respective index of the digital pathology image is determined by averaging the color values of all individual image pixels in the respective index to determine an average color value and that average color value is determined to be the color for that respective index, as shown in step 770.
Once the average color values for the various indices in the hyperspectral image and the average color values for the same indices in the digital pathology image have been established, these color values can be correlated to each other in a lookup table having XYZ color values associated with their correlated RGB color values, as shown in step 780. In one embodiment, the lookup table can be embedded in a data structure that houses the digital pathology image. In one embodiment, the XYZ color data can be included in the digital pathology image data structure as part of an International Color Consortium (ICC) profile. As previously mentioned, the lookup table or mathematical model or formula or set of instructions can be embedded in the digital pathology image data structure.
As previously described, a single combined lookup table can advantageously be generated over time from a plurality of slides having a plurality of different stains. Advantageously, a single combined lookup table can be generated and optimized over time such that the single combined lookup table can be used for any type of digital pathology slide with any type of staining profile.
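For illustration, such a combined table could be accumulated by merging the per-slide correlations and averaging entries that recur across slides. The dictionary layout and function names below are assumptions made for the sketch, not a prescribed storage format.

```python
def merge_lookup_tables(combined, slide_entries):
    """Accumulate one slide's (RGB tuple, XYZ tuple) correlations into a running table.

    combined : dict mapping an RGB tuple to (running XYZ sum, observation count)
    """
    for rgb, xyz in slide_entries:
        xyz_sum, count = combined.get(rgb, ((0.0, 0.0, 0.0), 0))
        combined[rgb] = (tuple(s + v for s, v in zip(xyz_sum, xyz)), count + 1)
    return combined

def finalize_lookup_table(combined):
    """Average repeated observations to produce the final combined RGB -> XYZ table."""
    return {rgb: tuple(s / count for s in xyz_sum)
            for rgb, (xyz_sum, count) in combined.items()}
```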
FIG. 10 is a flow diagram illustrating an example process for validating color values generated by a digital pathology scanning apparatus using an indexing process according to an embodiment of the invention. Certain steps of the illustrated process may be carried out by an image processor device such as previously described with respect to FIG. 4A. Initially, a test slide is prepared in step 800. As previously discussed, the test slide can be any slide prepared in the normal fashion using a specimen and zero or more stains. Next, in step 810 a lookup table is generated, for example using the process previously described with respect to FIG. 9 . The lookup table may contain color values such as those shown in the Hyperspectral XYZ column and the associated Digital Pathology RGB column of Table 2 below, where each row represents a single color value (i.e., index) in the hyperspectral digital image and the digital pathology digital image.
TABLE 2
INDEX   HYPERSPECTRAL XYZ      DIGITAL PATHOLOGY RGB   DIGITAL PATHOLOGY XYZ
1       0.702, 0.656, 0.646    225, 208, 225           0.700, 0.652, 0.649
2       0.654, 0.594, 0.619    218, 197, 219           0.648, 0.583, 0.618
3       0.745, 0.716, 0.677    230, 217, 230           0.744, 0.713, 0.678
4       0.891, 0.921, 0.764    248, 247, 248           0.900, 0.931, 0.778
5       0.564, 0.494, 0.556    206, 180, 208           0.566, 0.489, 0.560
6       0.603, 0.558, 0.591    211, 192, 215           0.606, 0.553, 0.596
7       0.787, 0.780, 0.707    235, 226, 236           0.789, 0.780, 0.710
8       0.832, 0.844, 0.735    240, 235, 241           0.837, 0.845, 0.741
9       0.457, 0.424, 0.506    184, 168, 197           0.460, 0.419, 0.509
10      0.382, 0.324, 0.450    174, 147, 185           0.389, 0.319, 0.450
Table 2 illustrates, for each index, a correlation between hyperspectral image data in XYZ color values, digital pathology image data in RGB color values, and digital pathology image data in XYZ color values.
Next, in step 820 the XYZ values of the digital pathology image are determined. As previously discussed, this can be done by calculating the XYZ values for the digital pathology image based on the RGB values of the digital pathology image or by presenting the color value on the entire display and measuring the color emitted from the display using a color measurement device that measures color in XYZ value.
Finally, in step 830 the calculated or measured XYZ value for a particular color value (e.g., an index) of the digital pathology image is compared to the hyperspectral image XYZ value for the same index. In this fashion, the color information generated by the digital pathology apparatus and presented on a display screen can be validated against the true color as measured by the hyperspectral imaging system.
FIGS. 11A, 11B and 11C are graph diagrams illustrating example comparisons of indexed color values of a specimen scanned by a hyperspectral imaging system versus indexed color values of the same specimen scanned by a digital pathology imaging system according to an embodiment of the invention. The XYZ color values from the digital pathology imaging system were obtained by presenting each indexed color value on the entire display and measuring a portion of the display with a color measurement device. As demonstrated by the graphs, the color values measured as coming off of the display screen were extremely close to the true color values as measured by the hyperspectral imaging system, with an average difference that is less than one Just Noticeable Difference.
In FIG. 11A, graph 210 shows a comparison of the system hyperspectral value for lightness to the digital pathology display value for lightness. Similarly, in FIG. 11B, graph 230 shows a comparison of the system hyperspectral value for green/red to the digital pathology display value for green/red. Similarly, in FIG. 11C, graph 250 shows a comparison of the system hyperspectral value for blue/yellow to the digital pathology display value for blue/yellow. Additionally, in each of the graphs 210, 230 and 250, it is evident that there are a very small number of individual comparisons. Notably, each individual comparison corresponds to a separate index. Advantageously, having a small number of indices results in a large number of pixels in each index, which consequently reduces the noise in the data set. When FIGS. 11A, 11B and 11C are compared to FIGS. 7A, 7B and 7C, there are fewer measurements, but the scatter due to noise is much less. This demonstrates the advantage of having a very large number of image pixels in each index that form the basis for determining the average color.
FIG. 12A is a block diagram illustrating an example processor enabled device 550 that may be used in connection with various embodiments described herein. Alternative forms of the device 550 may also be used as will be understood by the skilled artisan. In the illustrated embodiment, the device 550 is presented as a digital imaging device (also referred to herein as a scanner system or a scanning system) that comprises one or more processors 555, one or more memories 565, one or more motion controllers 570, one or more interface systems 575, one or more movable stages 580 that each support one or more glass slides 585 with one or more samples 590, one or more illumination systems 595 that illuminate the sample, one or more objective lenses 600 that each define an optical path 605 that travels along an optical axis, one or more objective lens positioners 630, one or more optional epi-illumination systems 635 (e.g., included in a fluorescence scanner system), one or more focusing optics 610, one or more line scan cameras 615 and/or one or more area scan cameras 620, each of which define a separate field of view 625 on the sample 590 and/or glass slide 585. The various elements of the scanner system 550 are communicatively coupled via one or more communication busses 560. Although there may be one or more of each of the various elements of the scanner system 550, for simplicity in the description that follows, these elements will be described in the singular except when needed to be described in the plural to convey the appropriate information.
The one or more processors 555 may include, for example, a central processing unit (“CPU”) and a separate graphics processing unit (“GPU”) capable of processing instructions in parallel or the one or more processors 555 may include a multicore processor capable of processing instructions in parallel. Additional separate processors may also be provided to control particular components or perform particular functions such as image processing. For example, additional processors may include an auxiliary processor to manage data input, an auxiliary processor to perform floating point mathematical operations, a special-purpose processor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processor (e.g., back-end processor), an additional processor for controlling the line scan camera 615, the stage 580, the objective lens 600, and/or a display (not shown). Such additional processors may be separate discrete processors or may be integrated with the processor 555.
The memory 565 provides storage of data and instructions for programs that can be executed by the processor 555. The memory 565 may include one or more volatile and persistent computer-readable storage mediums that store the data and instructions, for example, a random access memory, a read only memory, a hard disk drive, removable storage drive, and the like. The processor 555 is configured to execute instructions that are stored in memory 565 and communicate via communication bus 560 with the various elements of the scanner system 550 to carry out the overall function of the scanner system 550.
The one or more communication busses 560 may include a communication bus 560 that is configured to convey analog electrical signals and may include a communication bus 560 that is configured to convey digital data. Accordingly, communications from the processor 555, the motion controller 570, and/or the interface system 575 via the one or more communication busses 560 may include both electrical signals and digital data. The processor 555, the motion controller 570, and/or the interface system 575 may also be configured to communicate with one or more of the various elements of the scanning system 550 via a wireless communication link.
The motion control system 570 is configured to precisely control and coordinate XYZ movement of the stage 580 and the objective lens 600 (e.g., via the objective lens positioner 630). The motion control system 570 is also configured to control movement of any other moving part in the scanner system 550. For example, in a fluorescence scanner embodiment, the motion control system 570 is configured to coordinate movement of optical filters and the like in the epi-illumination system 635.
The interface system 575 allows the scanner system 550 to interface with other systems and human operators. For example, the interface system 575 may include a user interface to provide information directly to an operator and/or to allow direct input from an operator. The interface system 575 is also configured to facilitate communication and data transfer between the scanning system 550 and one or more external devices that are directly connected (e.g., a printer, removable storage medium) or external devices such as an image server system, an operator station, a user station, and an administrative server system that are connected to the scanner system 550 via a network (not shown). In one embodiment, a color measurement device 577 may be configured to read color information from the user interface 575 and translate the color information into one or more XYZ values.
The illumination system 595 is configured to illuminate a portion of the sample 590. The illumination system may include, for example, a light source and illumination optics. The light source could be a variable intensity halogen light source with a concave reflective mirror to maximize light output and a KG-1 filter to suppress heat. The light source could also be any type of arc-lamp, laser, or other source of light. In one embodiment, the illumination system 595 illuminates the sample 590 in transmission mode such that the line scan camera 615 and/or area scan camera 620 sense optical energy that is transmitted through the sample 590. Alternatively, or in combination, the illumination system 595 may also be configured to illuminate the sample 590 in reflection mode such that the line scan camera 615 and/or area scan camera 620 sense optical energy that is reflected from the sample 590. Overall, the illumination system 595 is configured to be suitable for interrogation of the microscopic sample 590 in any known mode of optical microscopy.
In one embodiment, the scanner system 550 optionally includes an epi-illumination system 635 to optimize the scanner system 550 for fluorescence scanning. Fluorescence scanning is the scanning of samples 590 that include fluorescence molecules, which are photon sensitive molecules that can absorb light at a specific wavelength (excitation). These photon sensitive molecules also emit light at a longer wavelength (emission). Because the efficiency of this photoluminescence phenomenon is very low, the amount of emitted light is often very low. This low amount of emitted light typically frustrates conventional techniques for scanning and digitizing the sample 590 (e.g., transmission mode microscopy). Advantageously, in an optional fluorescence scanner system embodiment of the scanner system 550, use of a line scan camera 615 that includes multiple linear sensor arrays (e.g., a time delay integration (“TDI”) line scan camera) increases the sensitivity to light of the line scan camera by exposing the same area of the sample 590 to each of the multiple linear sensor arrays of the line scan camera 615. This is particularly useful when scanning faint fluorescence samples with low emitted light.
Accordingly, in a fluorescence scanner system embodiment, the line scan camera 615 is preferably a monochrome TDI line scan camera. Advantageously, monochrome images are ideal in fluorescence microscopy because they provide a more accurate representation of the actual signals from the various channels present on the sample. As will be understood by those skilled in the art, a fluorescence sample 590 can be labeled with multiple fluorescence dyes that emit light at different wavelengths, which are also referred to as “channels.”
Furthermore, because the low end and high end signal levels of various fluorescence samples present a wide dynamic range for the line scan camera 615 to sense, it is desirable for the range of signal levels that the line scan camera 615 can sense to be similarly wide. Accordingly, in a fluorescence scanner embodiment, a line scan camera 615 used in the fluorescence scanning system 550 is a monochrome 10 bit 64 linear array TDI line scan camera. It should be noted that a variety of bit depths for the line scan camera 615 can be employed for use with a fluorescence scanner embodiment of the scanning system 550.
The movable stage 580 is configured for precise XY movement under control of the processor 555 or the motion controller 570. The movable stage may also be configured for movement in Z under control of the processor 555 or the motion controller 570. The moveable stage is configured to position the sample in a desired location during image data capture by the line scan camera 615 and/or the area scan camera. The moveable stage is also configured to accelerate the sample 590 in a scanning direction to a substantially constant velocity and then maintain the substantially constant velocity during image data capture by the line scan camera 615. In one embodiment, the scanner system 550 may employ a high precision and tightly coordinated XY grid to aid in the location of the sample 590 on the movable stage 580. In one embodiment, the movable stage 580 is a linear motor based XY stage with high precision encoders employed on both the X and the Y axis. For example, very precise nanometer encoders can be used on the axis in the scanning direction and on the axis that is in the direction perpendicular to the scanning direction and on the same plane as the scanning direction. The stage is also configured to support the glass slide 585 upon which the sample 590 is disposed.
The sample 590 can be anything that may be interrogated by optical microscopy. For example, a glass microscope slide 585 is frequently used as a viewing substrate for specimens that include tissues and cells, chromosomes, DNA, protein, blood, bone marrow, urine, bacteria, beads, biopsy materials, or any other type of biological material or substance that is either dead or alive, stained or unstained, labeled or unlabeled. The sample 590 may also be an array of any type of DNA or DNA-related material such as cDNA or RNA or protein that is deposited on any type of slide or other substrate, including any and all samples commonly known as microarrays. The sample 590 may be a microtiter plate, for example a 96-well plate. Other examples of the sample 590 include integrated circuit boards, electrophoresis records, petri dishes, film, semiconductor materials, forensic materials, or machined parts.
Objective lens 600 is mounted on the objective positioner 630 which, in one embodiment, may employ a very precise linear motor to move the objective lens 600 along the optical axis defined by the objective lens 600. For example, the linear motor of the objective lens positioner 630 may include a 50 nanometer encoder. The relative positions of the stage 580 and the objective lens 600 in XYZ axes are coordinated and controlled in a closed loop manner using motion controller 570 under the control of the processor 555 that employs memory 565 for storing information and instructions, including the computer-executable programmed steps for overall scanning system 550 operation.
In one embodiment, the objective lens 600 is a plan apochromatic (“APO”) infinity corrected objective with a numerical aperture corresponding to the highest spatial resolution desirable, where the objective lens 600 is suitable for transmission mode illumination microscopy, reflection mode illumination microscopy, and/or epi-illumination mode fluorescence microscopy (e.g., an Olympus 40X, 0.75NA or 20X, 0.75 NA). Advantageously, objective lens 600 is capable of correcting for chromatic and spherical aberrations. Because objective lens 600 is infinity corrected, focusing optics 610 can be placed in the optical path 605 above the objective lens 600 where the light beam passing through the objective lens becomes a collimated light beam. The focusing optics 610 focus the optical signal captured by the objective lens 600 onto the light-responsive elements of the line scan camera 615 and/or the area scan camera 620 and may include optical components such as filters, magnification changer lenses, etc. The objective lens 600 combined with focusing optics 610 provides the total magnification for the scanning system 550. In one embodiment, the focusing optics 610 may contain a tube lens and an optional 2× magnification changer. Advantageously, the 2× magnification changer allows a native 20X objective lens 600 to scan the sample 590 at 40× magnification.
The line scan camera 615 comprises at least one linear array of picture elements (“pixels”). The line scan camera may be monochrome or color. Color line scan cameras typically have at least three linear arrays, while monochrome line scan cameras may have a single linear array or plural linear arrays. Any type of singular or plural linear array, whether packaged as part of a camera or custom-integrated into an imaging electronic module, can also be used. For example, a 3 linear array (“red-green-blue” or “RGB”) color line scan camera or a 96 linear array monochrome TDI line scan camera may also be used. TDI line scan cameras typically provide a substantially better signal-to-noise ratio (“SNR”) in the output signal by summing intensity data from previously imaged regions of a specimen, yielding an increase in the SNR that is in proportion to the square-root of the number of integration stages. TDI line scan cameras comprise multiple linear arrays; for example, TDI line scan cameras are available with 24, 32, 48, 64, 96, or even more linear arrays. The scanner system 550 also supports linear arrays that are manufactured in a variety of formats including some with 512 pixels, some with 1024 pixels, and others having as many as 4096 pixels. Similarly, linear arrays with a variety of pixel sizes can also be used in the scanner system 550. The salient requirement for the selection of any type of line scan camera 615 is that the motion of the stage 580 can be synchronized with the line rate of the line scan camera 615 so that the stage 580 can be in motion with respect to the line scan camera 615 during the digital image capture of the sample 590.
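As a quick illustration of the square-root relationship noted above, assuming ideal integration:

```python
import math

# SNR improvement of a TDI line scan camera scales with the square root of the
# number of integration stages (idealized; real sensors fall somewhat short).
for stages in (24, 32, 48, 64, 96):
    print(f"{stages} stages -> ~{math.sqrt(stages):.1f}x SNR improvement")
```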
The image data generated by the line scan camera 615 is stored in a portion of the memory 565 and processed by the processor 555 to generate a contiguous digital image of at least a portion of the sample 590. The contiguous digital image can be further processed by the processor 555 and the revised contiguous digital image can also be stored in the memory 565.
In an embodiment with two or more line scan cameras 615, at least one of the line scan cameras 615 can be configured to function as a focusing sensor that operates in combination with at least one of the line scan cameras that is configured to function as an imaging sensor. The focusing sensor can be logically positioned on the same optical path as the imaging sensor or the focusing sensor may be logically positioned before or after the imaging sensor with respect to the scanning direction of the scanner system 550. In such an embodiment with at least one line scan camera 615 functioning as a focusing sensor, the image data generated by the focusing sensor is stored in a portion of the memory 565 and processed by the one or more processors 555 to generate focus information to allow the scanner system 550 to adjust the relative distance between the sample 590 and the objective lens 600 to maintain focus on the sample during scanning.
In operation, the various components of the scanner system 550 and the programmed modules stored in memory 565 enable automatic scanning and digitizing of the sample 590, which is disposed on a glass slide 585. The glass slide 585 is securely placed on the movable stage 580 of the scanner system 550 for scanning the sample 590. Under control of the processor 555, the movable stage 580 accelerates the sample 590 to a substantially constant velocity for sensing by the line scan camera 615, where the speed of the stage is synchronized with the line rate of the line scan camera 615. After scanning a stripe of image data, the movable stage 580 decelerates and brings the sample 590 to a substantially complete stop. The movable stage 580 then moves orthogonal to the scanning direction to position the sample 590 for scanning of a subsequent stripe of image data, e.g., an adjacent stripe. Additional stripes are subsequently scanned until an entire portion of the sample 590 or the entire sample 590 is scanned.
For example, during digital scanning of the sample 590, a contiguous digital image of the sample 590 is acquired as a plurality of contiguous fields of view that are combined together to form an image strip. A plurality of adjacent image strips are similarly combined together to form a contiguous digital image of a portion or the entire sample 590. The scanning of the sample 590 may include acquiring vertical image strips or horizontal image strips. The scanning of the sample 590 may be either top-to-bottom, bottom-to-top, or both (bi-directional) and may start at any point on the sample. Alternatively, the scanning of the sample 590 may be either left-to-right, right-to-left, or both (bi-directional) and may start at any point on the sample. Additionally, it is not necessary that image strips be acquired in an adjacent or contiguous manner. Furthermore, the resulting image of the sample 590 may be an image of the entire sample 590 or only a portion of the sample 590.
In one embodiment, computer-executable instructions (e.g., programmed modules and software) are stored in the memory 565 and, when executed, enable the scanning system 550 to perform the various functions described herein. In this description, the term “computer-readable storage medium” is used to refer to any media used to store and provide computer executable instructions to the scanning system 550 for execution by the processor 555. Examples of these media include memory 565 and any removable or external storage medium (not shown) communicatively coupled with the scanning system 550 either directly or indirectly, for example via a network (not shown).
FIG. 12B illustrates a line scan camera having a single linear array 640, which may be implemented as a charge coupled device (“CCD”) array. The single linear array 640 comprises a plurality of individual pixels 645. In the illustrated embodiment, the single linear array 640 has 4096 pixels. In alternative embodiments, linear array 640 may have more or fewer pixels. For example, common formats of linear arrays include 512, 1024, and 4096 pixels. The pixels 645 are arranged in a linear fashion to define a field of view 625 for the linear array 640. The size of the field of view varies in accordance with the magnification of the scanner system 550.
FIG. 12C illustrates a line scan camera having three linear arrays, each of which may be implemented as a CCD array. The three linear arrays combine to form a color array 650. In one embodiment, each individual linear array in the color array 650 detects a different color intensity, for example red, green, or blue. The color image data from each individual linear array in the color array 650 is combined to form a single field of view 625 of color image data.
FIG. 12D illustrates a line scan camera having a plurality of linear arrays, each of which may be implemented as a CCD array. The plurality of linear arrays combine to form a TDI array 655. Advantageously, a TDI line scan camera may provide a substantially better SNR in its output signal by summing intensity data from previously imaged regions of a specimen, yielding an increase in the SNR that is in proportion to the square-root of the number of linear arrays (also referred to as integration stages). A TDI line scan camera may comprise any of a variety of numbers of linear arrays; for example, common formats of TDI line scan cameras include 24, 32, 48, 64, 96, 120, and even more linear arrays.
EXAMPLE EMBODIMENTS
The disclosure of the present application may be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, convert the first digital image and the second digital image to a common image pixel size, and align the converted image pixels of the first digital image with corresponding converted image pixels of the second digital image. Such a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
The disclosure of the present application may also be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to: obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, align image data of the first digital image with corresponding image data of the second digital image, and convert the first digital image and the second digital image to a common image pixel size, wherein the converted image pixels of the first digital image are aligned with corresponding converted image pixels of the second digital image; a look up table module stored in the non-transitory computer readable medium and configured to be executed by the processor, the look up table module configured to: generate a look up table to associate the XYZ color values of a plurality of converted image pixels of the first digital image with the RGB color values of a plurality of corresponding converted image pixels of the second digital image. Such a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
The disclosure of the present application may be embodied in a system comprising a non-transitory computer readable medium configured to store data and executable programmed modules; at least one processor communicatively coupled with the non-transitory computer readable medium and configured to execute instructions stored thereon; a register module stored in the non-transitory computer readable medium and configured to be executed by the processor, the register module configured to: obtain a first digital image of a specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtain a second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, align image data of the first digital image with corresponding image data of the second digital image, and convert the first digital image and the second digital image to a common image pixel size, wherein the converted image pixels of the first digital image are aligned with corresponding converted image pixels of the second digital image; a color module stored in the non-transitory computer readable medium and configured to be executed by the processor, the color module configured to: identify a first set of image pixels in the first digital image, wherein each image pixel in the first set of image pixels in the first digital image has substantially a same XYZ color value, determine an average XYZ color value for the first set of image pixels in the first digital image, identify a second set of image pixels in the second digital image, wherein each image pixel in the second set of image pixels in the second digital image corresponds to an image pixel in the first set of image pixels in the first digital image, and determine an average RGB color value for the second set of image pixels in the second digital image; a look up table module stored in the non-transitory computer readable medium and configured to be executed by the processor, the look up table module configured to: generate a look up table to associate the average XYZ color value of the first set of image pixels in the first digital image with the average RGB color value of the corresponding second set of image pixels in the second digital image. Such a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
Any of the three system embodiments described above may further embody wherein each of the image pixels in the first set of image pixels in the first digital image is contiguous with at least one other image pixel in the first set of image pixels in the first digital image.
Alternatively, any of the three system embodiments described above may further embody wherein at least some of the image pixels in the first set of image pixels in the first digital image are not contiguous, and furthermore, wherein the color module is further configured to identify a plurality of first sets of image pixels in the first digital image, wherein each image pixel in a first set of image pixels in the first digital image has substantially a same XYZ color value.
The disclosure of the present application may also be embodied in a technical system comprising a non-transitory computer readable medium configured to store executable programmed modules and at least one processor communicatively coupled with the non-transitory computer readable medium configured to execute instructions to perform steps comprising: scanning a specimen using a hyperspectral imaging system to generate a first digital image of the specimen in XYZ color; scanning the same specimen using a digital pathology imaging system to generate a second digital image of the specimen in RGB color; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; generating a lookup table that associates the XYZ color of the first digital image and the RGB color of the second digital image. Such a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
This system embodiment may further include providing XYZ color data to a display module for presentation of the second digital image on a display.
This system embodiment may further include storing the XYZ color data as part of the second digital image.
This system embodiment may further include using pattern matching to register the first digital image to the second digital image.
This system embodiment may further include overlaying a grid on the specimen prior to creating the first and second digital images and using the grid in the first and second digital images to register the first digital image to the second digital image.
This system embodiment may further include combining pixels in one or more of the first digital image and the second digital image to cause the pixel size in the first digital image to be substantially the same as the pixel size in the second digital image.
This system embodiment may further include generating a single lookup table for a single stain.
This system embodiment may further include generating a single lookup table for a plurality of stains.
The disclosure of the present application may also be embodied in a technical system comprising a non-transitory computer readable medium configured to store executable programmed modules and at least one processor communicatively coupled with the non-transitory computer readable medium configured to execute instructions to perform steps comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; presenting the second digital image on a display; using a color measurement device to measure the XYZ values of the color presented on the display in a first region; and comparing the measured XYZ values of the first region to the XYZ values of the first digital image for the first region to validate the digital pathology system. Such a system may be implemented as a processor enabled device such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
The disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, generating a lookup table to associate the XYZ color of the first digital image and the RGB color of the second digital image. Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
This method embodiment may further include aligning image data in the first digital image with image data in the second digital image; and generating a lookup table to associate the XYZ color of the first digital image with the corresponding RGB color of the second digital image in accordance with said alignment.
This method embodiment may further include, wherein the first digital image comprises a plurality of image pixels having a first image pixel size and the second digital image comprises a plurality of image pixels having a second image pixel size, converting the image pixels of the first digital image and the image pixels of the second digital image to a common image pixel size, and aligning the image pixels of the converted first digital image with corresponding image pixels of the converted second digital image.
This method embodiment may further include identifying a first set of image pixels in the first digital image, wherein each image pixel in the first set of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for the first set of image pixels in the converted first digital image; identifying a second set of image pixels in the converted second digital image, wherein each image pixel in the second set of image pixels in the converted second digital image corresponds to an image pixel in the first set of image pixels in the converted first digital image; determining an average RGB color value for the second set of image pixels in the converted second digital image, and generating a lookup table to associate the average XYZ color value for the first set of image pixels in the converted first digital image with the corresponding average RGB color value of the second set of image pixels in the converted second digital image.
The disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size, obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size, aligning image data in the first digital image with image data in the second digital image; and generating a lookup table to associate the XYZ color of the first digital image with the corresponding RGB color of the second digital image in accordance with said alignment. Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
This method embodiment may further include, wherein the first digital image comprises a plurality of image pixels having a first image pixel size and the second digital image comprises a plurality of image pixels having a second image pixel size, converting the first digital image and the second digital image to a common image pixel size, and aligning the converted image pixels of the first digital image with corresponding converted image pixels of the second digital image.
This method embodiment may further include identifying a plurality of first sets of image pixels in the converted first digital image, wherein each image pixel in each set of image pixels in the plurality of first sets of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for each set of image pixels in the plurality of first sets of image pixels in the converted first digital image; identifying a corresponding plurality of second sets of image pixels in the converted second digital image, wherein each image pixel in each set of image pixels in the plurality of second sets of image pixels in the converted second digital image corresponds to an image pixel in the converted first digital image; determining an average RGB color value for each set of image pixels in the plurality of second sets of image pixels in the converted second digital image; and generating a lookup table to associate the average XYZ color value for each of the first sets of image pixels in the converted first digital image with the corresponding average RGB color value of the corresponding second set of image pixels in the converted second digital image.
The disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; and generating a lookup table that associates the XYZ color of the first digital image with the RGB color of the second digital image. Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
This method embodiment may further include providing XYZ color data to a display module for presentation of the second digital image on a display.
This method embodiment may further include storing the XYZ color data as part of the second digital image file.
This method embodiment may further include using pattern matching to register the first digital image to the second digital image.
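A minimal sketch of pattern-matching registration by phase correlation, which recovers the translational offset between two grayscale renderings of the same specimen. Real registration may also require rotation and scale correction; this shows only the simplest case, and the names used are illustrative.

```python
import numpy as np

def translation_offset(reference: np.ndarray, moving: np.ndarray):
    """Return the (dy, dx) shift that best aligns `moving` to `reference`.
    Both inputs are 2-D arrays of the same shape (e.g., luminance images)."""
    assert reference.shape == moving.shape
    f_ref = np.fft.fft2(reference)
    f_mov = np.fft.fft2(moving)
    cross_power = f_ref * np.conj(f_mov)
    cross_power /= np.abs(cross_power) + 1e-12      # keep phase information only
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Interpret shifts larger than half the image size as negative offsets.
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    return dy, dx
```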
This method embodiment may further include overlaying a grid on the specimen prior to creating the first and second digital images and using the grid in the first and second digital images to register the first digital image to the second digital image.
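A minimal sketch of using the overlaid grid for registration: matched grid-intersection coordinates detected in each image serve as control points, and an affine transform is fitted by least squares. Detection of the grid intersections is assumed and not shown; all names are illustrative.

```python
import numpy as np

def fit_affine(points_xyz_img: np.ndarray, points_rgb_img: np.ndarray) -> np.ndarray:
    """points_*: (N, 2) arrays of matching (x, y) grid intersections.
    Returns a 2x3 affine matrix mapping XYZ-image coordinates to
    RGB-image coordinates."""
    n = points_xyz_img.shape[0]
    # Homogeneous coordinates [x, y, 1] for the source points.
    src = np.hstack([points_xyz_img, np.ones((n, 1))])
    # Solve src @ A.T ~= dst for the 2x3 matrix A in the least-squares sense.
    a_t, *_ = np.linalg.lstsq(src, points_rgb_img, rcond=None)
    return a_t.T
```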
This method embodiment may further include combining pixels in one or more of the first digital image and the second digital image to cause the first image pixel size in the first digital image to be substantially the same as the second image pixel size in the second digital image.
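A minimal sketch of combining pixels by integer binning: non-overlapping blocks of f x f pixels in the finer image are averaged so that its effective pixel size matches the coarser image. The integer factor f is an assumption; non-integer pixel-size ratios would require interpolation instead.

```python
import numpy as np

def bin_pixels(image: np.ndarray, factor: int) -> np.ndarray:
    """Average non-overlapping factor x factor blocks of an (H, W, C) image."""
    h, w, c = image.shape
    h_trim, w_trim = h - h % factor, w - w % factor   # drop ragged edge rows/columns
    trimmed = image[:h_trim, :w_trim].astype(float)
    blocks = trimmed.reshape(h_trim // factor, factor, w_trim // factor, factor, c)
    return blocks.mean(axis=(1, 3))
```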
This method embodiment may further include generating a single lookup table for a single stain.
This method embodiment may further include generating a single lookup table for a plurality of stains.
The disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; presenting the second digital image on a display; using a color measurement device to measure the XYZ values of the color presented on the display in a first region; and comparing the measured XYZ values of the first region to the XYZ values of the first digital image for the first region to validate the digital pathology system. Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
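A minimal sketch of the validation comparison: the XYZ value measured from a display region is compared with the reference XYZ value from the first digital image by converting both to CIELAB and computing a CIE76 color difference (Delta E). A D65 white point and XYZ values normalized so that white has Y = 100 are assumed; the actual white point and pass/fail tolerance would depend on the display and the workflow.

```python
import numpy as np

D65 = np.array([95.047, 100.0, 108.883])  # assumed reference white (X, Y, Z)

def xyz_to_lab(xyz, white=D65):
    """Convert a single XYZ triple to CIELAB relative to the given white point."""
    t = np.asarray(xyz, dtype=float) / white
    delta = 6.0 / 29.0
    f = np.where(t > delta ** 3, np.cbrt(t), t / (3 * delta ** 2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])

def validate_region(measured_xyz, reference_xyz, tolerance=2.0):
    """Return (Delta E, pass/fail) for one display region; the tolerance of
    2.0 is illustrative only."""
    de = float(np.linalg.norm(xyz_to_lab(measured_xyz) - xyz_to_lab(reference_xyz)))
    return de, de <= tolerance
```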
The disclosure of the present application may also be embodied in a method comprising: obtaining a first digital image of a specimen scanned by a first imaging system to generate the first digital image of the specimen in XYZ color, the first digital image having a plurality of image pixels having a first image pixel size; obtaining a second digital image of the specimen scanned by a second imaging system to generate the second digital image of the specimen in RGB color, the second digital image having a plurality of image pixels having a second image pixel size; registering the first digital image to the second digital image to align the image data in the first digital image and the second digital image; converting the first digital image and the second digital image to a common image pixel size; identifying a plurality of first sets of image pixels in the converted first digital image, wherein each image pixel in each set of image pixels in the plurality of first sets of image pixels in the converted first digital image has substantially a same XYZ color value; determining an average XYZ color value for each set of image pixels in the plurality of first sets of image pixels in the converted first digital image; identifying a corresponding plurality of second sets of image pixels in the converted second digital image, wherein each image pixel in each set of image pixels in the plurality of second sets of image pixels in the converted second digital image corresponds to an image pixel in the converted first digital image; determining an average RGB color value for each set of image pixels in the plurality of second sets of image pixels in the converted second digital image; generating a lookup table to associate the average XYZ color value for each of the first sets of image pixels in the converted first digital image with the corresponding average RGB color value of the corresponding second set of image pixels in the converted second digital image; presenting a first average RGB color value from the lookup table on a first region of a display; using a color measurement device to measure the XYZ value from the first region of the display; and comparing the measured XYZ value from the first region of the display to the average XYZ value corresponding to the first average RGB color value in the lookup table. Such a method may be implemented by a system such as the digital imaging device or the image processing device previously described with respect to FIGS. 4A and 12A-12D.
The above description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles described herein can be applied to other embodiments without departing from the spirit or scope of the invention. Thus, it is to be understood that the description and drawings presented herein represent a presently preferred embodiment of the invention and are therefore representative of the subject matter which is broadly contemplated by the present invention. It is further understood that the scope of the present invention fully encompasses other embodiments that may become obvious to those skilled in the art and that the scope of the present invention is accordingly not limited.

Claims (17)

What is claimed is:
1. A method comprising:
obtaining a first digital image of a specimen in XYZ color captured by a first imaging system, wherein the specimen is a tissue specimen, wherein the first digital image comprises a plurality of XYZ pixels, and wherein each of the plurality of XYZ pixels has an XYZ color value;
obtaining a second digital image of the specimen in RGB color captured by a second imaging system, wherein the second imaging system is separate from the first imaging system, wherein the second digital image comprises a plurality of RGB pixels, and wherein each of the plurality of RGB pixels has an RGB color value; and
generating a lookup table to associate each of a plurality of XYZ color values derived from the first digital image with one of a plurality of RGB color values derived from the second digital image.
2. The method of claim 1, wherein generating the lookup table comprises:
registering the first digital image and the second digital image to a common grid; and
mapping the plurality of XYZ pixels to the plurality of RGB pixels according to the common grid.
3. The method of claim 1, wherein generating the lookup table comprises:
aligning the first digital image with the second digital image based on pattern matching; and
mapping the plurality of XYZ pixels to the plurality of RGB pixels according to the alignment.
4. The method of claim 1, further comprising up-sampling or down-sampling one or both of the first digital image and the second digital image to a common pixel size.
5. The method of claim 1, wherein the plurality of XYZ color values derived from the first digital image are average XYZ color values for subsets of the plurality of XYZ pixels, and wherein the plurality of RGB color values derived from the second digital image are average RGB color values for subsets of the plurality of RGB pixels.
6. The method of claim 5, further comprising:
determining the subsets of XYZ pixels by grouping those of the plurality of XYZ pixels that have XYZ color values within an XYZ threshold; and
determining the subsets of RGB pixels by grouping those of the plurality of RGB pixels that have RGB color values within an RGB threshold.
7. The method of claim 6, wherein an amount of the subsets of XYZ pixels and an amount of the subsets of RGB pixels are both limited to a predefined number, such that an amount of the plurality of XYZ color values and an amount of the plurality of RGB color values in the lookup table are also both limited to the predefined number.
8. The method of claim 7, wherein the predefined number is ten.
9. The method of claim 6, wherein an amount of the subsets of XYZ pixels and an amount of the subsets of RGB pixels are limited to a smallest number that allocates each of the plurality of XYZ pixels to one of the plurality of XYZ color values and allocates each of the plurality of RGB pixels to one of the plurality of RGB color values, while minimizing an error between the respective color values of the pixels and the respective color values to which the pixels are allocated.
10. The method of claim 9, wherein the error is minimized using a root mean square analysis.
11. The method of claim 1, further comprising:
for each of a plurality of contiguous sets of two or more of the plurality of XYZ pixels that have similar XYZ color values, grouping the contiguous set of two or more XYZ pixels into an XYZ superpixel, and assigning an average XYZ color value of the contiguous set of two or more XYZ pixels to the XYZ superpixel; and
for each of a plurality of contiguous sets of two or more of the plurality of RGB pixels that have similar RGB color values, grouping the contiguous set of two or more RGB pixels into an RGB superpixel, and assigning an average RGB color value of the contiguous set of two or more RGB pixels to the RGB superpixel;
wherein the plurality of XYZ color values is derived from the average XYZ color values assigned to the XYZ superpixels, and
wherein the plurality of RGB color values is derived from the average RGB color values assigned to the RGB superpixels.
12. The method of claim 1, wherein capturing the first digital image of a specimen in XYZ color via the first imaging system comprises:
capturing a hyperspectral image stack comprising images of the specimen at different wavelengths of light; and
generating the first digital image from the hyperspectral image stack.
13. The method of claim 1, further comprising embedding the lookup table into a data structure that comprises the second digital image.
14. The method of claim 13, wherein the lookup table is comprised in an International Color Consortium (ICC) profile in the data structure.
15. The method of claim 1, further comprising converting the lookup table into a model, and embedding the model into a data structure that comprises the second digital image.
16. A system comprising:
at least one hardware processor; and
one or more software modules that are configured to, when executed by the at least one hardware processor,
obtain a first digital image of a specimen in XYZ color captured by a first imaging system, wherein the first digital image comprises a plurality of XYZ pixels, and wherein each of the plurality of XYZ pixels has an XYZ color value,
obtain a second digital image of the specimen in RGB color captured by a second imaging system, wherein the second imaging system is separate from the first imaging system, wherein the second digital image comprises a plurality of RGB pixels, and wherein each of the plurality of RGB pixels has an RGB color value, and
generate a lookup table to associate each of a plurality of XYZ color values derived from the first digital image with one of a plurality of RGB color values derived from the second digital image.
17. A non-transitory computer-readable medium having instructions stored thereon, wherein the instructions, when executed by a processor, cause the processor to:
obtain a first digital image of a specimen in XYZ color captured by a first imaging system, wherein the first digital image comprises a plurality of XYZ pixels, and wherein each of the plurality of XYZ pixels has an XYZ color value;
obtain a second digital image of the specimen in RGB color captured by a second imaging system, wherein the second imaging system is separate from the first imaging system, wherein the second digital image comprises a plurality of RGB pixels, and wherein each of the plurality of RGB pixels has an RGB color value; and
generate a lookup table to associate each of a plurality of XYZ color values derived from the first digital image with one of a plurality of RGB color values derived from the second digital image.
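Claim 12 above describes generating the XYZ image from a hyperspectral image stack. The following is a minimal sketch of that conversion, assuming the stack holds the specimen's spectral transmittance at known wavelengths; the color matching functions (cmf_x, cmf_y, cmf_z) and the illuminant spectrum, sampled at those same wavelengths (e.g., the CIE 1931 2-degree observer and CIE D65), are assumed inputs and are not values taken from the patent.

```python
import numpy as np

def stack_to_xyz(stack, cmf_x, cmf_y, cmf_z, illuminant):
    """stack: (H, W, N) spectral transmittance at N wavelengths;
    cmf_x, cmf_y, cmf_z, illuminant: (N,) arrays sampled at the same wavelengths.
    Returns an (H, W, 3) XYZ image (discrete approximation of the CIE integrals)."""
    weighted = stack * illuminant            # light reaching the sensor per band
    k = 100.0 / np.sum(illuminant * cmf_y)   # normalize so a clear field gives Y = 100
    X = k * np.tensordot(weighted, cmf_x, axes=([2], [0]))
    Y = k * np.tensordot(weighted, cmf_y, axes=([2], [0]))
    Z = k * np.tensordot(weighted, cmf_z, axes=([2], [0]))
    return np.stack([X, Y, Z], axis=-1)
```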
US17/086,099 2016-04-20 2020-10-30 Digital pathology color calibration and validation Active 2037-10-11 US11614363B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/086,099 US11614363B2 (en) 2016-04-20 2020-10-30 Digital pathology color calibration and validation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662325330P 2016-04-20 2016-04-20
PCT/US2017/028532 WO2017184821A1 (en) 2016-04-20 2017-04-20 Digital pathology color calibration and validation
US201816095267A 2018-10-19 2018-10-19
US17/086,099 US11614363B2 (en) 2016-04-20 2020-10-30 Digital pathology color calibration and validation

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US16/095,267 Continuation US10845245B2 (en) 2016-04-20 2017-04-20 Digital pathology color calibration and validation
PCT/US2017/028532 Continuation WO2017184821A1 (en) 2016-04-20 2017-04-20 Digital pathology color calibration and validation

Publications (2)

Publication Number Publication Date
US20210048344A1 US20210048344A1 (en) 2021-02-18
US11614363B2 true US11614363B2 (en) 2023-03-28

Family

ID=60117014

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/095,267 Active 2037-04-29 US10845245B2 (en) 2016-04-20 2017-04-20 Digital pathology color calibration and validation
US17/086,099 Active 2037-10-11 US11614363B2 (en) 2016-04-20 2020-10-30 Digital pathology color calibration and validation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/095,267 Active 2037-04-29 US10845245B2 (en) 2016-04-20 2017-04-20 Digital pathology color calibration and validation

Country Status (5)

Country Link
US (2) US10845245B2 (en)
EP (2) EP4273806A3 (en)
JP (1) JP6771584B2 (en)
CN (2) CN109073454B (en)
WO (1) WO2017184821A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109073454B (en) * 2016-04-20 2021-12-21 徕卡生物系统成像股份有限公司 Digital pathology color calibration and verification
EP3639237B1 (en) * 2017-06-13 2024-02-21 X-Rite, Incorporated Hyperspectral imaging spectrophotometer and system
JP6997312B2 (en) 2017-11-30 2022-01-17 ライカ バイオシステムズ イメージング インコーポレイテッド Update color monitor settings
GB201817092D0 (en) * 2018-10-19 2018-12-05 Cancer Research Tech Ltd Apparatus and method for wide-field hyperspectral imaging
US10937193B2 (en) * 2018-12-05 2021-03-02 Goodrich Corporation Multi-sensor alignment and real time distortion correction and image registration
WO2021026374A1 (en) 2019-08-06 2021-02-11 Leica Biosystems Imaging, Inc. Graphical user interface for slide-scanner control
CN110706190B (en) * 2019-09-30 2022-05-20 杭州智团信息技术有限公司 Color enhancement method and device for pathological digital image
WO2021091534A1 (en) * 2019-11-05 2021-05-14 Hewlett-Packard Development Company, L.P. Printer colour deviation detection

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101750147A (en) * 2008-12-11 2010-06-23 张辉 Method and device for measuring color spectrum data of object to be measured
FR2943038B1 (en) * 2009-03-13 2012-07-27 Aircelle Sa DEFROSTING DEVICE, IN PARTICULAR FOR AN AIRCRAFT NACELLE
US20140001005A1 (en) * 2009-06-17 2014-01-02 Borgwarner Inc. Mechanical Slip Failsafe System For A Heavy-Duty Multi-Speed Fan Clutch
GB201000835D0 (en) * 2010-01-19 2010-03-03 Akzo Nobel Coatings Int Bv Method and system for determining colour from an image
WO2015107889A1 (en) * 2014-01-14 2015-07-23 有限会社パパラボ Coloration testing device and coloration testing method

Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5119186A (en) * 1989-09-28 1992-06-02 International Business Machines Corporation Color mapping system and method
US6215892B1 (en) 1995-11-30 2001-04-10 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US6330349B1 (en) 1995-11-30 2001-12-11 Chromavision Medical Systems, Inc. Automated method for image analysis of residual protein
US20040066960A1 (en) 1995-11-30 2004-04-08 Chromavision Medical Systems, Inc., A California Corporation Automated detection of objects in a biological sample
US6718053B1 (en) 1996-11-27 2004-04-06 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US6118946A (en) 1999-06-29 2000-09-12 Eastman Kodak Company Method and apparatus for scannerless range image capture using photographic film
US20020001080A1 (en) 1999-08-06 2002-01-03 Cambridge Research & Instrumentation, Inc., Massachusetts Corporation Spectral imaging system
US20040252875A1 (en) * 2000-05-03 2004-12-16 Aperio Technologies, Inc. System and method for data management in a linear-array-based microscope slide scanner
US20020180764A1 (en) 2001-06-01 2002-12-05 John Gilbert Method and system for digital image management
JP2003209705A (en) 2002-01-15 2003-07-25 Seiko Epson Corp Adjustment for image data output
JP2003230154A (en) 2002-01-31 2003-08-15 Olympus Optical Co Ltd Digital camera for microscope
US20050036668A1 (en) 2003-02-12 2005-02-17 The University Of Iowa Research Foundation Methods and devices useful for analyzing color medical images
US20060118742A1 (en) 2004-12-06 2006-06-08 Richard Levenson Systems and methods for in-vivo optical imaging and measurement
US20140204112A1 (en) 2006-08-31 2014-07-24 Corel Corporation, Inc. Color selection and/or matching in a color image
US20090167893A1 (en) * 2007-03-05 2009-07-02 Fotonation Vision Limited RGBW Sensor Array
US20090185041A1 (en) 2008-01-22 2009-07-23 Samsung Electronics Co., Ltd. Apparatus and method of obtaining image and apparatus and method of processing image
JP2009239419A (en) 2008-03-26 2009-10-15 Seiko Epson Corp Profile preparation method, profile preparation apparatus, profile preparation program and printing device
US20100027072A1 (en) 2008-05-30 2010-02-04 Seiko Epson Corporation Image Processing Apparatus, Image Processing Method, Image Processing Program, and Printing Apparatus
US20100188672A1 (en) * 2009-01-26 2010-07-29 Xerox Corporation Gamut aim and gamut mapping method for spatially varying color lookup tables
US20100188424A1 (en) * 2009-01-26 2010-07-29 Hamamatsu Photonics K.K. Image outputting system, image outputting method, and image outputting program
US20110109735A1 (en) * 2009-11-09 2011-05-12 Olympus Corporation Virtual microscope system
US20130094758A1 (en) 2010-01-05 2013-04-18 Vikas Yadav Color Saturation-Modulated Blending of Exposure-Bracketed Images
US8611654B2 (en) 2010-01-05 2013-12-17 Adobe Systems Incorporated Color saturation-modulated blending of exposure-bracketed images
US20120063675A1 (en) 2010-08-12 2012-03-15 High Technology Video, Inc. Methods and systems for automatic coloring of digital images
US20120269417A1 (en) 2011-04-25 2012-10-25 Bautista Pinky A Computer-aided staining of multispectral images
US20130016885A1 (en) 2011-07-14 2013-01-17 Canon Kabushiki Kaisha Image processing apparatus, imaging system, and image processing system
US20130027720A1 (en) 2011-07-29 2013-01-31 Nobuyuki Satoh Color measuring device, image forming apparatus, image forming method, and computer-readable storage medium
CN103975382A (en) 2011-11-30 2014-08-06 高通Mems科技公司 Methods and apparatus for interpolating colors
US9514508B2 (en) * 2011-12-08 2016-12-06 Dolby Laboratories Licensing Corporation Mapping for display emulation based on image characteristics
US20140105480A1 (en) * 2012-05-30 2014-04-17 Panasonic Corporation Image measurement apparatus, image measurement method and image measurement system
US9418414B2 (en) 2012-05-30 2016-08-16 Panasonic Intellectual Property Management Co., Ltd. Image measurement apparatus, image measurement method and image measurement system
US20140055592A1 (en) 2012-08-24 2014-02-27 Datacolor, Inc. System and apparatus for color correction in transmission-microscope slides
US20140193050A1 (en) 2013-01-10 2014-07-10 Caliper Life Sciences, Inc. Multispectral Imaging Systems and Methods
US20190120694A1 (en) 2013-04-04 2019-04-25 Apple Inc. Spectrocolorimeter imaging system
US20140300753A1 (en) 2013-04-04 2014-10-09 Apple Inc. Imaging pipeline for spectro-colorimeters
JP2015035650A (en) 2013-08-07 2015-02-19 キヤノン株式会社 Image processing device, image processing method and program
JP2016048905A (en) 2013-11-15 2016-04-07 富士フイルム株式会社 Color conversion table creation device, method, and program
US20160094826A1 (en) 2014-09-26 2016-03-31 Spero Devices, Inc. Analog image alignment
US20170372117A1 (en) 2014-11-10 2017-12-28 Ventana Medical Systems, Inc. Classifying nuclei in histology images
EP3021096A1 (en) 2014-11-11 2016-05-18 Instrument Systems Optische Messtechnik Gmbh Colorimeter calibration
US20160131524A1 (en) 2014-11-11 2016-05-12 Instrument Systems Optische Messtechnik Gmbh Colorimeter calibration
US20180040307A1 (en) * 2015-02-24 2018-02-08 Barco N.V. Steady color presentation manager
CN108140249A (en) 2015-09-02 2018-06-08 文塔纳医疗系统公司 For showing the image processing system of the multiple images of biological sample and method
WO2017184821A1 (en) 2016-04-20 2017-10-26 Leica Biosystems Imaging, Inc. Digital pathology color calibration and validation
CN109073454A (en) 2016-04-20 2018-12-21 徕卡生物系统成像股份有限公司 Digital pathology colorific adjustment and verifying
JP6771584B2 (en) 2016-04-20 2020-10-21 ライカ バイオシステムズ イメージング インコーポレイテッドLeica Biosystems Imaging, Inc. Color calibration and demonstration of digital pathology
US10845245B2 (en) * 2016-04-20 2020-11-24 Leica Biosystems Imaging, Inc. Digital pathology color calibration and validation
US20180241908A1 (en) * 2017-02-21 2018-08-23 Fuji Xerox Co., Ltd. Image processing apparatus

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
Extended European Search Report dated Nov. 26, 2019 for related European Patent Application No. 17786613.4 in 11 pages.
First Office Action in Chinese Application No. 201780024621.5 in 25 pages.
International Search Report and Written Opinion issued in International Application No. PCT/US2017/028532 dated Jun. 30, 2017 in 29 pages.
Luo et al. "A colour management framework for medical imaging applications", Computerized Medical Imaging and Graphics, Pergamon Press, New York, NY, US (2006) 30(6-7), 1, pp. 357-361.
Martinez-Garcia et al., "Color calibration of an RGB digital camera for the microscopic observation of highly specular materials," Electronic Imaging 2015, conference: Measuring, Modeling, and Reproducing Material Appearance, May 27, 2015. http://hal-iogs.archives-ouvertes.fr/hal-01135969/document.
Masahiro et al., "Color correction in whole slide digital pathology", 20th Color and Imaging Conference: Color Science and Engineering Systems, Technologies, and Applications, CIC 2012, Dec. 1, 2012.
Notice of Reasons for Refusal dated Nov. 5, 2019 for related Japanese Patent Application No. 2018-555249 in 26 pages.
Office Action in European Application No. 17786613.4 dated May 7, 2021, in 8 pages.
Second Office Action in Chinese Application No. 201780024621.5 dated Feb. 24, 2021, in 8 pages.
Shrestha et al., "Color accuracy and reproducibility in Whole slide imaging scanners", Journal of Medical Imaging, vol. 1, No. 2, 027501, Feb. 17, 2014 in 9 pages.
Third Office Action in Chinese Application No. 201780024621.5 dated Jul. 19, 2021, in 31 pages.

Also Published As

Publication number Publication date
US10845245B2 (en) 2020-11-24
EP4273806A2 (en) 2023-11-08
US20190137339A1 (en) 2019-05-09
CN109073454B (en) 2021-12-21
EP3446083A4 (en) 2019-12-25
CN114419114A (en) 2022-04-29
WO2017184821A1 (en) 2017-10-26
US20210048344A1 (en) 2021-02-18
JP6771584B2 (en) 2020-10-21
EP3446083A1 (en) 2019-02-27
EP3446083B1 (en) 2023-09-13
JP2019520721A (en) 2019-07-18
CN109073454A (en) 2018-12-21
EP4273806A3 (en) 2024-01-24

Similar Documents

Publication Publication Date Title
US11614363B2 (en) Digital pathology color calibration and validation
US10852523B2 (en) Real-time autofocus scanning
US11454781B2 (en) Real-time autofocus focusing algorithm
US11446670B2 (en) Adjustable slide stage for differently sized slides
US10809514B2 (en) Low resolution slide imaging and slide label imaging and high resolution slide imaging using dual optical paths and a single imaging sensor
EP3625605B1 (en) Two-dimensional and three-dimensional fixed z scanning
US10930241B2 (en) Color monitor settings refresh
US11422349B2 (en) Dual processor image processing

Legal Events

Date Code Title Description
AS Assignment

Owner name: LEICA BIOSYSTEMS IMAGING, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OLSON, ALLEN;REEL/FRAME:054269/0259

Effective date: 20181020

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE