WO2017053891A1 - Real-time focusing in line scan imaging - Google Patents
- Publication number
- WO2017053891A1 (PCT/US2016/053581)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focusing
- image
- sensor
- field
- view
Classifications
- G02B27/40: Optical focusing aids
- G02B21/244: Microscopes; devices for focusing using image analysis techniques
- G02B21/02: Microscopes; objectives
- G02B21/06: Microscopes; means for illuminating specimens
- G02B21/18: Microscopes; arrangements with more than one light path, e.g. for comparing two specimens
- G02B21/245: Microscopes; devices for focusing using auxiliary sources, detectors
- G02B21/26: Microscopes; stages and adjusting means therefor
- G02B21/34: Microscope slides, e.g. mounting specimens on microscope slides
- G02B21/361: Microscopes arranged for digital imaging; optical details, e.g. image relay to the camera or image sensor
- G02B7/28: Systems for automatic generation of focusing signals
- G02B7/38: Autofocus signals using image sharpness techniques, measured at different points on the optical axis, e.g. focusing on two or more planes and comparing image data
- G03B13/36: Autofocus systems
- H04N23/67: Focus control based on electronic image sensor signals
Definitions
- the present invention generally relates to digital pathology and more particularly relates to a multiple independent linear sensor apparatus for performing real-time focusing in line scan imaging.
- a system for scanning a sample to acquire a digital image of the sample may comprise: a stage configured to support a sample; an objective lens having a single optical axis that is orthogonal to the stage; an imaging sensor; a focusing sensor; and at least one first beam splitter optically coupled to the objective lens and configured to receive a field of view corresponding to the optical axis of the objective lens, and simultaneously provide at least a first portion of the field of view to the imaging sensor and at least a second portion of the field of view to the focusing sensor.
- the focusing sensor may receive the second portion of the field of view along an optical path, wherein the focusing sensor is tilted at an angle with respect to the optical path, such that the second portion of the field of view is acquired, by the focusing sensor, as an image comprising pixels representing different focal distances.
- the focusing sensor may comprise a plurality of regions, wherein each region of the focusing sensor receives the second portion of the field of view along a separate optical path, and wherein the focusing sensor is tilted at an angle with respect to each of the separate optical paths, such that the second portion of the field of view is acquired, by each region of focusing sensor, at a different focal distance than the other regions of focusing sensor.
- the focusing sensor may comprise a plurality of regions, wherein each region of the focusing sensor receives the second portion of the field of view along a separate optical path, wherein the focusing sensor is orthogonal with respect to each of the separate optical paths, and wherein each of the separate optical paths has a different focus distance, such that the second portion of the field of view is acquired, by each region of focusing sensor, at a different focal distance than the other regions of focusing sensor.
- the focusing sensor may comprise a first portion and a second portion, wherein the first portion of the focusing sensor receives the second portion of the field of view along a first optical path and is tilted at a first angle with respect to the first optical path, and wherein the second portion of the focusing sensor receives the second portion of the field of view along a second optical path, that is separate from the first optical path, and is tilted at a second angle with respect to the second optical path that is reverse to the first angle.
- the focusing sensor may comprise a first region and a second region, wherein the first region receives the second portion of the field of view along a first optical path, wherein the second region receives a mirrored second portion of the field of view along a second optical path, and wherein the focusing sensor is tilted at an angle with respect to each of the first optical path and the second optical path.
- the tilted sensor can be substituted with a non-tilted sensor and a wedge prism placed in front of the non-tilted sensor. The angle of the tilt can have a negative value, a zero value, or a positive value.
- the focusing sensor may be a section of a sensor with a wedge optic in front of it to create focal variation covering this focusing section of the sensor along the sensor axis, while the other section of the sensor acts as an imaging sensor.
- the system may comprise a processor that is configured to, for each portion of the sample to be scanned: acquire a focusing image of the portion of the sample from the focusing sensor; for each of a plurality of positions on the focusing sensor, calculate a contrast measure for a region of the focusing image corresponding to that position on the focusing sensor; determine a peak for the contrast measures; and determine a position for the objective lens that provides the peak for the contrast measures at the one parfocal point on the focusing sensor.
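The contrast-peak search described in this processor step can be sketched in code. The following is a minimal illustration, not the patent's implementation; the function names, the variance-based contrast measure, and the calibration callable `z_of_region` are all assumptions:

```python
from statistics import pvariance

def focus_from_tilted_sensor(focusing_line, region_width, z_of_region):
    """Estimate the best objective height from one line image acquired by
    the tilted focusing sensor.

    focusing_line: list of pixel intensities from the focusing line sensor.
    region_width:  number of pixels per contrast region.
    z_of_region:   calibration mapping a region index to the objective Z
                   height that region is parfocal with (assumed known).
    """
    n_regions = len(focusing_line) // region_width
    # Contrast measure per region: variance of the pixel intensities.
    contrasts = [pvariance(focusing_line[i * region_width:(i + 1) * region_width])
                 for i in range(n_regions)]
    peak_region = max(range(n_regions), key=contrasts.__getitem__)
    return z_of_region(peak_region), contrasts
```

In practice the peak would likely be interpolated between neighboring regions rather than snapped to the arg-max region as done here.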
- a method for automatic real-time focusing may comprise using a processor in a slide scanner to, for each portion of a sample to be scanned: prior to, at the same time as, or after the portion of the sample being sensed by an imaging sensor, acquire a focusing image of the portion of the sample from a focusing sensor, wherein a portion of a field of view that is sensed by the focusing sensor is offset from a portion of the field of view that is sensed by the imaging sensor, such that, in a scan direction, the focusing sensor senses a portion of the field of view before, at the time that, or after the imaging sensor senses that same portion of the field of view, and wherein one point on the focusing sensor is parfocal with the imaging sensor; for each of a plurality of positions on the focusing sensor, calculate a contrast measure for a region of the focusing image corresponding to that position on the focusing sensor; determine a peak for the contrast measures; and determine a position for an objective lens that provides the peak for the contrast measures at the one parfocal point on the focusing sensor.
- a non-transitory computer-readable medium having instructions stored thereon.
- the instructions, when executed by a processor, cause the processor to, for each portion of a sample to be scanned: prior to, at the same time as, or after the portion of the sample being sensed by an imaging sensor, acquire a focusing image of the portion of the sample from a focusing sensor, wherein a portion of a field of view that is sensed by the focusing sensor is offset from a portion of the field of view that is sensed by the imaging sensor, such that, in a scan direction, the focusing sensor senses a portion of the field of view before, at the same time that, or after the imaging sensor senses that same portion of the field of view, and wherein one point on the focusing sensor is parfocal with the imaging sensor; for each of a plurality of positions on the focusing sensor, calculate a contrast measure for a region of the focusing image corresponding to that position on the focusing sensor; determine a peak for the contrast measures; and determine a position for an objective lens that provides the peak for the contrast measures at the one parfocal point on the focusing sensor.
- a relationship (e.g., a difference or a ratio) between the contrast measure from the focusing image and the contrast measure from the main image is defined, and the peak of this relationship is determined, to thereby determine the position of the objective lens with respect to the parfocal point.
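One way to realize the contrast relationship described above is as a simple ratio test. This sketch is illustrative only; the tolerance value and the mapping of the ratio's direction to physical lens motion are assumptions that would depend on the actual sensor geometry and calibration:

```python
def defocus_direction(focus_contrast, main_contrast, tol=0.05):
    """Compare a contrast measure from the focusing image against the same
    measure from the main image. Because the two sensors peak at offset
    focal planes, the ratio indicates which side of the parfocal point the
    objective sits on (the sign convention here is an assumption).
    """
    ratio = focus_contrast / main_contrast
    if abs(ratio - 1.0) <= tol:
        return "at_parfocal_point"
    return "objective_above" if ratio > 1.0 else "objective_below"
```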
- FIG. 1 is a block diagram illustrating an example side view configuration of a scanning system, according to an embodiment
- FIG. 2 is a block diagram illustrating an example configuration of a focusing sensor and imaging sensor with respect to a radius of illumination and a circular optical field of view, according to an embodiment
- FIG. 3A is a block diagram illustrating an example top view configuration of an imaging sensor, according to an embodiment
- FIG. 3B is a block diagram illustrating an example top view configuration of a tilted focusing sensor, according to an embodiment
- FIG. 3C is a block diagram illustrating an example of a sensor, in which half of the sensor is used to produce a normal image, and the other half is used to produce an image with various focal depths across it, according to an embodiment;
- FIG. 4 is a block diagram illustrating an example top view configuration of a tilted focusing sensor, according to an embodiment;
- FIG. 5 is a time chart diagram illustrating an example interplay between a focusing sensor and an imaging sensor during scanning, according to an embodiment
- FIG. 6 is a block diagram illustrating an example tilted focusing sensor with focusing optics, according to an embodiment
- FIGS. 7A-7C are block diagrams illustrating an example non-tilted focusing sensor with focusing optics, according to an embodiment
- FIG. 8 illustrates example results from a peak-finding algorithm, according to an embodiment.
- FIG. 9A illustrates a focal relationship between a tilted focusing sensor and imaging sensor, according to an embodiment.
- FIGS. 9B-9D illustrate the relationships of contrast functions for a tilted focusing sensor and imaging sensor, according to an embodiment.
- FIG. 10 illustrates an example tilted focusing sensor comprising two tilted line sensors, according to an embodiment.
- FIG. 11A is a block diagram illustrating an example tilted focusing sensor with focusing optics for acquiring reversed images, according to an embodiment.
- FIG. 11B is a block diagram illustrating an example tilted focusing sensor with focusing optics for acquiring reversed images, according to an embodiment.
- FIG. 12A illustrates the directionality of focal distances for two images acquired by a focusing sensor, according to an embodiment.
- FIG. 12B illustrates contrast functions for two reversed images acquired by a focusing sensor, according to an embodiment.
- FIG. 13 is a flow diagram of a real-time focusing process, according to an embodiment.
- FIG. 14A is a block diagram illustrating an example microscope slide scanner, according to an embodiment
- FIG. 14B is a block diagram illustrating an alternative example microscope slide scanner, according to an embodiment
- FIG. 14C is a block diagram illustrating example linear sensor arrays, according to an embodiment.
- FIG. 15 is a block diagram illustrating an example wired or wireless processor- enabled device that may be used in connection with various embodiments described herein.
- Certain embodiments are based on image content analysis (e.g., tissue finding and macro focus), and take advantage of line imaging and line focusing for accurate real-time auto-focusing.
- full stripe focusing is performed during a retrace process of line scanning.
- focusing is performed during image scanning. Both embodiments eliminate time delays in image scanning, thereby speeding up the entire digital image scanning process.
- certain embodiments provide for real-time (i.e., instantaneous or near-instantaneous) focusing in line scan imaging using multiple linear detectors or other components.
- one or more focus points are determined for a sample (e.g., a tissue sample prepared on a glass microscope slide).
- a macro focus point or a plurality of focus points may be determined for the sample.
- one or more positions on the sample may be determined.
- the sample may be moved along X and Y axes (e.g., by a motorized stage), such that each determined position on the sample is located under an objective lens.
- the objective lens may be moved along X and Y axes, or both the objective lens and the sample may be moved along X and Y axes, such that the objective lens is located above each position on the sample.
- an image of the region of the sample at that position may be acquired at a plurality of focus heights, while the sample is stationary in the X and Y axes, via a focusing sensor optically coupled with the objective lens as the objective lens is moved along a Z axis (i.e., orthogonal to both the X and Y axes) through the plurality of focus heights.
- Software may be used to compute the best focus height for each position, based on the images acquired at the plurality of focus heights for the position.
- a real-time focus mechanism may then constrain the objective lens at the computed best focus heights at the corresponding positions, via a feedback loop, while scanning the entire sample.
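The pre-scan procedure above (acquire images at several focus heights per position, pick the sharpest, then hold those heights during the scan) might be sketched as follows. The variance-based sharpness score and all function names are assumptions, not the patent's software:

```python
from statistics import pvariance

def best_focus_height(acquire_image, z_heights):
    """Sweep the objective through candidate Z heights at one (x, y)
    position and return the height whose image has the highest contrast.

    acquire_image: callable taking a Z height and returning a flat list of
                   pixel intensities (stands in for the focusing sensor).
    """
    scores = [pvariance(acquire_image(z)) for z in z_heights]
    return z_heights[max(range(len(z_heights)), key=scores.__getitem__)]

def build_focus_map(acquire_image_at, positions, z_heights):
    """Best focus height at each (x, y) reference position on the sample."""
    return {pos: best_focus_height(lambda z: acquire_image_at(pos, z), z_heights)
            for pos in positions}
```

The resulting position-to-height map is what the real-time focus mechanism would then track via its feedback loop during scanning.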
- focus height or "Z height” may be used throughout to describe a distance of the objective lens with respect to the sample, this term does not limit the disclosed embodiments to an objective lens positioned above the sample, but should instead be understood to encompass any distance that represents a distance between the objective lens and a plane of the sample, regardless of their orientations to each other.
- FIG. 1 is a block diagram illustrating an example side view configuration of a scanning system 11, according to an embodiment.
- scanning system 11 comprises a sample 120 (e.g., a tissue sample prepared on a glass microscope slide) that is placed on a motorized stage (not shown), illuminated by an illumination system (not shown), and moved in a scanning direction 65.
- An objective lens 130 has an optical field of view (FOV) 250 that is trained on sample 120 and provides an optical path for light from the illumination system that passes through the specimen on the slide, reflects off of the specimen on the slide, fluoresces from the specimen on the slide, or otherwise passes through objective lens 130.
- FIG. 1 illustrates the relative positions between an imaging sensor 20 and a focusing sensor 30 in space.
- the light travels on the optical path through objective lens 130 to a beam splitter 140 that allows some of the light to pass through lens 160 to imaging sensor 20.
- the light may be bent by a mirror 150 (e.g., at 90°) between lens 160 and imaging sensor 20.
- Imaging sensor 20 may be, for example, a line charge-coupled device (CCD) or a line complementary metal-oxide semiconductor (CMOS) device.
- Focusing sensor 30 may also be, for example, a line charge-coupled device (CCD) or line CMOS device.
- the light that travels to imaging sensor 20 and the light that travels to focusing sensor 30 each represents the complete optical field of view 250 from objective lens 130.
- the scanning direction 65 of sample 120 is logically oriented with respect to imaging sensor 20 and focusing sensor 30 such that the logical scanning direction 60 causes the optical field of view 250 of objective lens 130 to pass over the respective focusing sensor 30 and imaging sensor 20.
- FIG. 2 is a block diagram illustrating an example configuration of focusing sensor 30 and imaging sensor 20 with respect to an optical field of view 250 having a circular illumination radius 240, according to an embodiment. In the illustrated embodiment, the positioning of focusing sensor 30 is shown with respect to imaging sensor 20 and the logical scan direction 60.
- the scan direction 60 refers to the direction in which the stage or specimen (e.g., a tissue sample) is moving with respect to sensors 30 and 20 in space.
- imaging sensor 20 is centered within optical field of view 250 of objective lens 130, while focusing sensor 30 is offset from the center of optical field of view 250 of objective lens 130.
- the direction in which focusing sensor 30 is offset from the center of optical field of view 250 of the objective lens 130 is the opposite of the logical scanning direction 60.
- This placement logically orients focusing sensor 30 in front of imaging sensor 20, such that, as a specimen on a slide is scanned, focusing sensor 30 senses the image data before, at the same time that, or after the imaging sensor 20 senses that same image data.
- when imaging sensor 20 and focusing sensor 30 are projected onto the same plane using, for example, a beam splitter, focusing sensor 30 is within the illumination circle, which has a radius R, of the optical field of view 250, at a location ahead of the primary imaging sensor 20 in terms of the logical scanning direction 60.
- focus data can be captured and the focus height for objective lens 130 can be calculated based on one or more predetermined algorithms, prior to, at the same time as, or after the time of the view of the same section of tissue sample passing imaging sensor 20.
- the focus data and the calculated focus height for objective lens 130 can be used to control (e.g., by a controller) the height of objective lens 130 from sample 120 before the view of the same section of the tissue sample is sensed by imaging sensor 20 via objective lens 130. In this manner, imaging sensor 20 senses the view of the section of the tissue sample while objective lens 130 is at the calculated focus height.
- Circular illumination radius 240 preferably illuminates an optical field of view 250 covering both focusing sensor 30 and imaging sensor 20.
- Radius 240 is a function of the field of view on sample 120 and the optical magnification of the focusing optical path, M_focusing. From the geometry of FIG. 2, the function can be expressed as: R ≥ √((L/2)² + h²), where L is the projected sensor length within optical field of view 250 and h is the offset between focusing sensor 30 and imaging sensor 20.
- the available time t for focusing sensor 30 to capture multiple camera lines, for focus height calculation, and for moving objective lens 130 to the right focus height is a function of the distance h between focusing sensor 30 and imaging sensor 20, the magnification M_focusing, and the scan speed v: t = h / (M_focusing × v).
- the maximum number of camera lines captured by focusing sensor 30 that are available for the focus calculation is: N_max = t × κ, where κ is the line rate of focusing sensor 30.
- For example, N_max = 1,464 lines where objective lens 130 stays at the same height. Otherwise, N < N_max, to allow objective lens 130 time to move to the next focus height.
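Plugging numbers into these two relations gives a feel for the focusing budget. The values below are purely hypothetical and are not the parameters behind the 1,464-line figure above:

```python
def focus_budget(h_mm, m_focusing, scan_speed_mm_s, line_rate_hz):
    """Available focusing time t = h / (M_focusing * v), and the maximum
    number of focusing lines N_max = t * kappa that fit in that window."""
    t = h_mm / (m_focusing * scan_speed_mm_s)
    n_max = int(t * line_rate_hz)
    return t, n_max

# Hypothetical example: h = 10 mm between the projected sensors, 20x
# focusing-path magnification, 2.5 mm/s scan speed, 10 kHz line rate.
t, n_max = focus_budget(10.0, 20, 2.5, 10_000)  # t = 0.2 s, n_max = 2000
```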
- a sample 120 (e.g., tissue sample) is passed under objective lens 130 in an X direction.
- a portion of sample 120 is illuminated to create an illuminated optical field of view 250 in the Z direction of a portion of sample 120 (i.e., perpendicular to an X-Y plane of sample 120).
- the illuminated optical field of view 250 passes through objective lens 130 which is optically coupled to both focusing sensor 30 and imaging sensor 20, for example, using a beam splitter 140.
- Focusing sensor 30 and imaging sensor 20 are positioned, such that focusing sensor 30 receives a region or line of the optical field of view 250 before, at the same time as, or after imaging sensor 20 receives the same region or line.
- imaging sensor 20 is simultaneously receiving a second line of image data, which was previously received by focusing sensor 30 and which is a distance h/M_focusing on sample 120 from the first line of image data. It will take a time period Δt for imaging sensor 20 to receive the first line of image data after focusing sensor 30 has received the first line of image data, where Δt represents the time that it takes sample 120 to move a distance h/M_focusing in the logical scan direction 60.
- a processor of scanning system 11 calculates an optimal focus height in the Z direction for the first line of image data, and adjusts objective lens 130 to the calculated optimal focus height before, at the time that, or after imaging sensor 20 receives the first line of image data.
- focusing sensor 30 is separate from imaging sensor 20 and is tilted at an angle ⁇ with respect to a direction that is perpendicular to the optical imaging path.
- focusing sensor 30 simultaneously receives pixels of image data at a plurality of Z height values.
- the processor may then determine the pixel(s) having the best focus within the line of image data (e.g., having the highest contrast with respect to the other pixels within the line of image data).
- the processor or other controller may move objective lens 130 in the Z direction to the determined optimal Z height value before, simultaneously as, or after imaging sensor 20 receives the same line of image data.
- FIG. 3A is a block diagram illustrating an example top view configuration of imaging sensor 20 with respect to an imaging optical path 210, according to an embodiment.
- FIG. 3B is a block diagram illustrating an example top view configuration of a tilted focusing sensor 30, with respect to a focusing optical path 200, according to an embodiment. As can be seen in FIG. 3B, focusing sensor 30 is tilted at an angle ⁇ with respect to a direction that is perpendicular to focusing optical path 200.
- FIG. 3C is a block diagram illustrating an example of a sensor, in which half of the sensor is used to acquire a main image, and the other half of the sensor is used to acquire a focusing image.
- an image projected on tilted focusing sensor 30 and acquired as a line of image data by tilted focusing sensor 30 will have variable sharpness or contrast.
- This line of image data will have its highest focus (e.g., greatest sharpness or contrast) in a particular region or pixel location of tilted focusing sensor 30.
- Each region or pixel location of tilted focusing sensor 30 may be directly mapped or otherwise correlated to a Z height of objective lens 130, such that the Z height of objective lens 130 may be determined from a particular pixel location of tilted focusing sensor 30.
- the Z height of objective lens 130 providing the highest focus may be determined by identifying the Z height of objective lens 130 that is mapped to that pixel location of highest focus. Accordingly, a feedback loop may be constructed. By this feedback loop, for a given region on sample 120, the position of objective lens 130 may be automatically controlled (e.g., by increasing or decreasing the height of objective lens 130) to always correspond to the position on tilted focusing sensor 30 having the highest focus for that region, before, at the time that, or after imaging sensor 20 senses the same region of sample 120, such that the region of sample 120 being imaged by imaging sensor 20 is always at the best available focus.
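The pixel-to-height mapping and the feedback loop just described might be sketched as below. A linear calibration between pixel location and objective Z is assumed, and the gain term and function names are illustrative inventions:

```python
def z_from_pixel(pixel_index, n_pixels, z_min, z_max):
    """Linear calibration: map a pixel location on the tilted focusing
    sensor to the objective Z height that pixel is parfocal with."""
    return z_min + (pixel_index / (n_pixels - 1)) * (z_max - z_min)

def feedback_step(current_z, peak_pixel, n_pixels, z_min, z_max, gain=1.0):
    """One iteration of the focus feedback loop: move the objective toward
    the height mapped to the highest-contrast pixel."""
    target = z_from_pixel(peak_pixel, n_pixels, z_min, z_max)
    return current_z + gain * (target - current_z)
```

With gain below 1.0, the loop approaches the target height gradually, which may be preferable when consecutive focus estimates are noisy.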
- FIG. 4 is a block diagram illustrating an example tilted focusing sensor 30, according to an embodiment.
- tilted focusing sensor 30 comprises a plurality of sensor pixels 218 within a range of focusing (z) on a tissue sample (e.g., 20 μm).
- tilted focusing sensor 30 may be positioned at a location where the entire focusing range (z) in the Z direction is transferred by optics to the entire array of sensor pixels 218 in tilted focusing sensor 30 in the Y direction.
- the location of each sensor pixel 218 is directly correlated or mapped to a Z height of objective lens 130, as illustrated in FIG. 4.
- each dashed line, p1, p2, ... pi ... pn, across projected focusing range (d) represents a different focus value and corresponds to a different focus height of objective lens 130.
- the pi having the highest focus for a given region of a sample can be used by scanning system 11 to determine the optimal focus height of objective lens 130 for that region of sample 120.
- a scan line (e.g., one-dimensional image data), acquired by tilted focusing sensor 30 from sample 120, is analyzed.
- a figure of merit (FOM) (e.g., contrast of the data) may be defined.
- the location (corresponding to a focus height value of objective lens 130) of a pixel 218 of the maximum FOM on the sensor array can be found. In this manner, the focus height of objective lens 130, corresponding to the location of the pixel 218 of the maximum FOM, can be determined for that scan line.
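A sliding-window contrast figure of merit over a scan line, and the pixel where it peaks, might look like the sketch below. The window size and the standard-deviation FOM are assumptions; the patent only requires some contrast-based figure of merit:

```python
from statistics import pstdev

def max_fom_pixel(scan_line, window=9):
    """Return the pixel index of maximum figure of merit (local contrast)
    along a one-dimensional focusing scan line, plus the FOM profile."""
    pad = window // 2
    # Edge-pad so that every pixel has a full window around it.
    padded = [scan_line[0]] * pad + list(scan_line) + [scan_line[-1]] * pad
    fom = [pstdev(padded[i:i + window]) for i in range(len(scan_line))]
    return max(range(len(fom)), key=fom.__getitem__), fom
```

The returned index would then be mapped, via the calibration discussed above, to the focus height of objective lens 130 for that scan line.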
- the centers of both sensors are preferably aligned to each other along the Y axis.
- FIG. 5 is a time chart diagram illustrating an example interplay between a focusing sensor 30 and an imaging sensor 20 during scanning, according to an embodiment. Specifically, the timing of a scan using an imaging sensor 20 and focusing sensor 30 is illustrated.
- the focus height of objective lens 130 is at Z0 on tissue section X1, which is in the field of view of focusing sensor 30.
- Focusing sensor 30 receives focusing data corresponding to the tissue section X1.
- the focus height Z1 is determined to be the optimal focus height for tissue section X1, using the focusing data and, in some embodiments, associated focusing algorithms.
- the optimal focus height is then fed to the Z positioner to move objective lens 130 to the height Z1, for example, using a control loop.
- tissue section X1 is moved into the field of view of imaging sensor 20.
- imaging sensor 20 will sense an optimally-focused image of tissue section X1.
- focusing sensor 30 captures focusing data from tissue section X2, and the focusing data will be used to determine the optimal focus height Z2, which in turn will be fed into the Z positioner prior to, at the time that, or after tissue section X2 passes into the field of view of imaging sensor 20 at time t2.
- Such a process can continue until the entire tissue sample is scanned.
- tissue section Xn+1 is in the field of view of focusing sensor 30
- tissue section Xn is in the field of view of imaging sensor 20
- objective lens 130 is at a focus height of Zn
- the optimal focus height for tissue section Xn+1 is determined, and the focus height of objective lens 130 is adjusted to Zn+1.
- focusing sensor 30 senses tissue section X1 and determines the focus height as Z1 for tissue section X1; at time t1, tissue section X1 moves under imaging sensor 20 and objective lens 130 moves to focus height Z1, while focusing sensor 30 senses tissue section X2 and determines the focus height as Z2 for tissue section X2; at time tn, tissue section Xn moves under imaging sensor 20 and objective lens 130 moves to focus height Zn, while focusing sensor 30 senses tissue section Xn+1 and determines the focus height as Zn+1 for tissue section Xn+1.
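The pipelining in this time chart amounts to a loop in which the focus measurement for the next tissue section overlaps with imaging of the current one. A schematic sketch, where all the callables are stand-ins rather than the scanner's actual control interface:

```python
def pipelined_scan(sections, measure_focus, move_objective, image_section):
    """While section X_n is imaged at height Z_n, the focusing sensor has
    already produced height Z_{n+1} for the next section X_{n+1}."""
    z_next = measure_focus(sections[0])     # focus X_1 before imaging it
    images = []
    for n, section in enumerate(sections):
        move_objective(z_next)              # objective now at Z_n
        if n + 1 < len(sections):
            z_next = measure_focus(sections[n + 1])  # look ahead to X_{n+1}
        images.append(image_section(section))
    return images
```

In the real system the focus measurement and the imaging would run concurrently in hardware; here they are interleaved sequentially only to show the data dependencies.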
- Xn-1 and Xn do not necessarily represent consecutive or adjacent lines of image data, as long as a scan line is acquired by focusing sensor 30 and an optimal focus height for the scan line is determined and set prior to, at the same time as, or after the same scan line being acquired by imaging sensor 20.
- focusing sensor 30 and imaging sensor 20 may be arranged such that one or more scan lines exist between the field of view of focusing sensor 30 and the field of view of imaging sensor 20, i.e., that distance h between focusing sensor 30 and imaging sensor 20 comprises one or more scan lines of data.
- tissue section X6 would be in the field of view of focusing sensor 30 at the same time that tissue section X1 is in the field of view of imaging sensor 20.
- the focus height of objective lens 130 would be adjusted to the calculated optimal focus height after tissue section X5 is sensed by imaging sensor 20, but prior to, at the same time as, or after tissue section X6 being sensed by imaging sensor 20.
- the focus height of objective lens 130 may be smoothly controlled between tissue sections X1 and X6, such that there are incremental changes in focus height between X1 and X6 that approximate a gradual slope of the tissue sample.
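The smooth height control between two measured sections can be realized as a simple linear ramp. This sketch assumes the intermediate scan lines are equally spaced; the real controller could use any interpolation profile:

```python
def interpolated_heights(z_current, z_next, n_lines):
    """Incremental objective heights for the scan lines between two focus
    measurements, approximating a gradual slope of the tissue."""
    step = (z_next - z_current) / n_lines
    return [z_current + step * (i + 1) for i in range(n_lines)]
```

For example, `interpolated_heights(1.0, 2.0, 5)` steps the objective through five equal increments, ending exactly at the next measured height of 2.0.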
- FIG. 6 illustrates a tilted focusing sensor 30 that utilizes one or more beam splitters and one or more prism mirrors, according to an embodiment.
- the beam splitter(s) and mirror(s) are used to create a plurality of images of the same field of view on tilted focusing sensor 30, with each of the plurality of images at a different focal distance, thereby enabling focusing sensor 30 to simultaneously sense multiple images of the same region of sample 120 at different foci (corresponding to different focus heights for objective lens 130).
- Focusing sensor 30 may be a single large line sensor, or may comprise a plurality of line sensors (e.g., positioned in a row along a longitudinal axis).
- FIG. 6 illustrates a tilted focusing sensor 30 that utilizes beam splitters 620A and 620B and prism mirror 630 to guide a light beam 605 (illustrated as separate red, blue, and green channels, although they do not need to be separated channels) through a plurality of optical paths 610A-610C, each having different focal distances, onto a single tilted line sensor 30.
- light beam 605 conveys a field of view from objective lens 130.
- the optical paths, in order from highest focal distance to lowest focal distance, are 610A, 610B, and 610C.
- each optical path will reach tilted line sensor 30 at a range of focal distances, rather than at a single focal distance, due to the tilt of tilted line sensor 30.
- the image acquired by tilted line sensor 30 on each optical path will comprise pixels acquired at a focal distance that increases from a first side of the image to a second, opposite side of the image.
- light beam 605 enters beam splitter 620A and is split into an optical path 610A, which proceeds to a first region of tilted focusing sensor 30 at a first focal distance, and an optical path which proceeds to beam splitter 620B.
- the optical path that proceeds to beam splitter 620B is split into an optical path 610B, which proceeds to a second region of tilted focusing sensor 30 at a second focal distance, and an optical path 610C, which is reflected off of mirror 630 onto a third region of tilted focusing sensor 30 at a third focal distance.
- Each of the first, second, and third focal distances and first, second, and third regions are different from each other.
- a single tilted focusing sensor 30 simultaneously senses light beam 605 at a plurality of different focal distances (e.g., three in the illustrated example).
- fewer or more beam splitters 620 and/or mirrors 630 may be used to create fewer or more optical paths 610 with different focal distances (e.g., two optical paths or four or more optical paths, each with a different focal distance with respect to tilted focusing sensor 30).
- the best focus may be determined and correlated to a height of objective lens 130, in the same manner as described above. Redundant information from a plurality of images at different focal distances may provide higher confidence to a focus result.
- FIGS. 7A and 7B illustrate alternatives to a tilted focusing sensor.
- FIGS. 7A and 7B illustrate a non-tilted focusing sensor 30 that utilizes one or more beam splitters and one or more prism mirrors to achieve the same results as a tilted focusing sensor, according to two embodiments.
- the beam splitter(s) and mirror(s) are used to create a plurality of images of the same field of view on focusing sensor 30, with each of the plurality of images at a different focal distance, thereby enabling focusing sensor 30 to simultaneously sense multiple images of the same region of sample 120 at different foci (corresponding to different focus heights for objective lens 130).
- Focusing sensor 30 may be a single large line sensor, or may comprise a plurality of line sensors positioned in a row along a longitudinal axis.
- FIG. 7A illustrates a non-tilted focusing sensor 30 that utilizes beam splitters 620A and 620B and prism mirrors 630A and 630B to guide a light beam 605 (illustrated as separate red, blue, and green channels, although they do not need to be separated channels) through a plurality of optical paths 610A-610C, each having different focal distances, onto a single line sensor 30.
- light beam 605 conveys a field of view from objective lens 130.
- the optical paths, in order from highest focal distance to lowest focal distance, are 610A, 610B, and 610C.
- light beam 605 enters beam splitter 620A and is split into an optical path 610B, which is reflected off of mirror 630A and through glass block 640A onto a first region of focusing sensor 30 at a first focal distance, and an optical path which proceeds to beam splitter 620B.
- the optical path that proceeds to beam splitter 620B is split into an optical path 610A which passes onto a second region of focusing sensor 30 (e.g., adjacent to the first region of focusing sensor 30) at a second focal distance, and an optical path 610C, which is reflected off of mirror 630B and through glass block 640B onto a third region of focusing sensor 30 (e.g., adjacent to the second region of focusing sensor 30) at a third focal distance.
- Each of the first, second, and third focal distances and first, second, and third regions are different from each other.
- focusing sensor 30 simultaneously senses light beam 605 at a plurality of different focal distances (e.g., three in the illustrated example).
- fewer or more beam splitters 620, mirrors 630, glass blocks 640, and/or regions of focusing sensor 30 may be used to create fewer or more optical paths 610 with different focal distances (e.g., two optical paths or four or more optical paths, each with a different focal distance with respect to focusing sensor 30).
- FIG. 7B illustrates a non-tilted focusing sensor 30 that utilizes beam splitters 620A and 620B and prism mirrors 630A and 630B to guide a light beam 605 (illustrated as separate red, blue, and green channels, although they do not need to be separated channels) through a plurality of optical paths 610A-610C, each having different focal distances, onto respective ones of a plurality of line sensors 30A-30C. As illustrated, the optical paths, in order from highest focal distance to lowest focal distance, are 610A, 610B, and 610C.
- light beam 605 enters beam splitter 620A and is split into an optical path 610B, which is reflected off of mirror 630A and through glass block 640A onto a first region of focusing sensor 30 at a first focal distance, and an optical path which proceeds to beam splitter 620B.
- the optical path that proceeds to beam splitter 620B is split into an optical path 610A which passes onto a second region of focusing sensor 30 at a second focal distance, and an optical path 610C, which is reflected off of mirror 630B and through glass block 640B onto a third region of focusing sensor 30 at a third focal distance.
- Each of the first, second, and third focal distances and the first, second, and third regions of focusing sensor 30 are different from each other.
- focusing sensor 30 simultaneously senses light beam 605 at a plurality of different respective focal distances (e.g., three in the illustrated example). It should be understood that fewer or more beam splitters 620, mirrors 630, glass blocks 640, and/or regions of focusing sensor 30 may be used to create fewer or more optical paths 610 with different focal distances (e.g., two optical paths or four or more optical paths, each with a different focal distance with respect to a different focusing sensor 30).
- the beam splitters and mirrors are positioned after the imaging lens in the optical path.
- tube lenses may be positioned after the beam splitting optics.
- the locations of individual images with the same field of view are defined by the focal lengths and positions of the lenses.
- FIG. 7C illustrates an alternative non-tilted focusing sensor 30 in which the beam splitting optics are positioned before tube lenses, according to an embodiment.
- non-tilted focusing sensor 30 utilizes beam splitters 620A, 620B, and 620C and prism mirror 630 to guide a light beam 605 through a plurality of optical paths 610A-610D, each having different focal distances, onto a single line sensor 30.
- the optical paths, in order from highest focal distance to lowest focal distance, are 610A, 610B, 610C, and 610D.
- light beam 605 enters beam splitter 620A and is split into an optical path 610A, which is focused by lens 650A onto a first region of focusing sensor 30 at a first focal distance, and an optical path which proceeds to beam splitter 620B.
- the optical path that proceeds to beam splitter 620B is split into an optical path 610B, which is focused by lens 650B onto a second region of focusing sensor 30 at a second focal distance, and an optical path which proceeds to beam splitter 620C.
- the optical path that proceeds to beam splitter 620C is split into an optical path 610C, which is focused by lens 650C onto a third region of focusing sensor 30 at a third focal distance, and an optical path 610D, which is reflected off of mirror 630 and focused by lens 650D onto a fourth region of focusing sensor 30 at a fourth focal distance.
- Each of the first, second, third, and fourth focal distances and the first, second, third, and fourth regions are different from each other.
- focusing sensor 30 (e.g., comprising a single line sensor or a plurality of line sensors) simultaneously senses light beam 605 at a plurality of different focal distances (e.g., four in the illustrated example).
- fewer or more beam splitters 620, mirrors 630, and/or regions of focusing sensor 30 may be used to create fewer or more optical paths 610 with different focal distances (e.g., two optical paths, three optical paths, or five or more optical paths, each with a different focal distance with respect to focusing sensor 30).
- a given region of a sample 120 is simultaneously acquired by different regions of a focusing sensor 30 at a plurality of different focal distances, producing a plurality of images at different focal distances.
- An algorithm can then be applied to this plurality of images to determine a best focal distance, which can be correlated to a focus height of objective lens 130 along the Z axis.
- the plurality of images acquired by different regions of focusing sensor 30 can correlate or map to various focus spots from a focus buffer.
- the focus buffer may contain contrast measures for the focus points, calculated from image data that has been continuously acquired while objective lens 130 moves along the Z axis (i.e., as the focus height of objective lens 130 changes). For instance, a measure of contrast (e.g., averaged contrast) for each focus height represented by the plurality of images may be plotted, as illustrated in an example by the points in FIG. 8.
- the best focus (i.e., the peak of contrast measures in the focus buffer) can be determined by using a peak-finding algorithm (e.g., fitting, hill-climbing, etc.) to identify the peak of a curve that best fits the points, as illustrated in an example by the curve in FIG. 8.
- the peak of the curve represents the best contrast measure, and maps to a particular focus height that provides the best focus.
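This peak-finding step can be sketched with a parabolic fit, which is one of several possible fitting methods the passage allows; the helper name and fallback behavior below are illustrative assumptions:

```python
import numpy as np

def best_focus_height(heights, contrasts):
    """Fit a parabola to (focus height, contrast measure) points from the
    focus buffer and return the height at the fitted peak."""
    a, b, c = np.polyfit(heights, contrasts, 2)
    if a >= 0:
        # No concave peak found; fall back to the best sampled point.
        return heights[int(np.argmax(contrasts))]
    return -b / (2.0 * a)  # vertex of the fitted parabola

z = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
c = -(z - 2.2) ** 2 + 5.0  # synthetic contrast curve peaking at height 2.2
print(round(best_focus_height(z, c), 3))  # → 2.2
```

The returned height would then be mapped to an objective-lens Z position.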
- FIG. 9 A illustrates the focal relationship between a tilted focusing sensor 30 and imaging sensor 20, according to an embodiment.
- a point P of tilted focusing sensor 30 is parfocal with imaging sensor 20.
- the appropriate focus height of objective lens 130 from sample 120 may be determined as the focus height of objective lens 130 that positions the pixel(s) having the best focus at point P of focusing sensor 30. This determined focus height is what may then be used for objective lens 130 when sensing the same region using imaging sensor 20.
- FIGS. 9B-9D illustrate the focus functions for tilted focusing sensor 30 and imaging sensor 20.
- the focus function may be a function of contrast within the images sensed by tilted focusing sensor 30 and imaging sensor 20.
- CI represents the contrast function for imaging sensor 20
- CT represents the contrast function for tilted focusing sensor 30.
- CI(x) returns a contrast measure for an image pixel at position x along the array of imaging sensor 20
- CT(x) returns a contrast measure for an image pixel at position x along the array of tilted focusing sensor 30.
- the contrast measure may be a root mean square of contrast values at x.
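A root-mean-square contrast measure might be computed as follows; the helper name is illustrative and the choice of local contrast values is left open by the passage:

```python
import numpy as np

def contrast_measure(values):
    """Root mean square of local contrast values at a sensor position x --
    one possible contrast measure for CI(x) or CT(x)."""
    v = np.asarray(values, dtype=float)
    return float(np.sqrt(np.mean(v ** 2)))
```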
- CD represents the difference between CT and CI (e.g., CT - CI).
- CD(x) represents the difference between CT and CI at position x along the arrays of imaging sensor 20 and tilted focusing sensor 30 (e.g., CT(x) - CI(x)).
- CD(x) removes tissue-dependent spatial variations.
- a ratio between the contrast functions of the two images can be used as well to remove tissue-dependent spatial variations (e.g., CT(x)/CI(x)).
- a threshold can be defined to remove influences from background noise.
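The ratio with a background-noise threshold might look like this in outline; the function name and the `noise_floor` value are arbitrary placeholders, not values from the disclosure:

```python
import numpy as np

def focus_ratio(ct, ci, noise_floor=1e-3):
    """Ratio of focusing-sensor to imaging-sensor contrast, CT(x)/CI(x).

    Dividing the two contrast profiles cancels tissue-dependent spatial
    variation; positions where either contrast falls below noise_floor
    are treated as background and masked out (NaN)."""
    ct = np.asarray(ct, dtype=float)
    ci = np.asarray(ci, dtype=float)
    ratio = np.full_like(ct, np.nan)
    valid = (ct > noise_floor) & (ci > noise_floor)
    ratio[valid] = ct[valid] / ci[valid]
    return ratio

# Tissue structure appears identically in both profiles and cancels out.
tissue = np.array([1.0, 2.0, 0.0, 4.0])   # zero = background position
focus = np.array([0.5, 1.0, 1.5, 2.0])    # focus-dependent factor
r = focus_ratio(tissue * focus, tissue)
```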
- FIG. 9B illustrates the contrast function CI², which represents contrast measures as a function of a position on imaging sensor 20.
- FIG. 9C illustrates the contrast function CT², which represents contrast measures as a function of a position on tilted focusing sensor 30.
- FIG. 9D illustrates a ratio of the contrast function for tilted focusing sensor 30 to the contrast function for imaging sensor 20 (i.e., CT²/CI²).
- since tilted focusing sensor 30 and imaging sensor 20 both sense the same region of sample 120, the best focus will be at a position x on both sensors 30 and 20 at which the ratio of CT to CI (e.g., CT/CI) is 1.0.
- a predetermined point P on tilted focusing sensor 30 is parfocal with imaging sensor 20. This point P may be determined during system calibration.
- a figure-of-merit (FOM) function may be used to determine the best focus for a region of sample 120 based on data acquired by tilted focusing sensor 30. Specifically, a peak of the CT function may be determined.
- this CT peak will correspond to a position x on tilted focusing sensor 30 and is correlated to a focus height within the Z range of objective lens 130.
- if CT(P) does not represent the peak value, a command may be initiated to move objective lens 130 in real time along an axis that is orthogonal to sample 120 (i.e., the Z axis) until the peak of CT is at P (i.e., until CT(P) is the peak value for CT).
- de-focus is characterized by the shift of the peak value of CT away from parfocal point P on tilted focusing sensor 30, and auto-focusing can be achieved via a feedback loop that moves objective lens 130 along the Z axis until the peak value of CT is at parfocal point P on tilted focusing sensor 30.
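One iteration of such a feedback loop might be sketched as follows; the pixel-to-Z `gain` is a stand-in for a system-dependent calibration constant, and the function name is illustrative:

```python
def autofocus_step(ct_profile, parfocal_index, gain=0.1):
    """One iteration of the feedback loop: measure where the CT peak sits
    on the tilted focusing sensor and return a Z correction for the
    objective lens.

    ct_profile     -- contrast measures along the tilted focusing sensor
    parfocal_index -- pixel position P that is parfocal with imaging sensor 20
    gain           -- converts pixel offset to a Z move (system-dependent)
    """
    peak_index = max(range(len(ct_profile)), key=ct_profile.__getitem__)
    # De-focus is the shift of the CT peak away from parfocal point P;
    # return the Z correction that moves the peak back toward P.
    return gain * (parfocal_index - peak_index)
```

A controller would apply the returned correction and repeat until the correction is (near) zero.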
- an image of a field of view acquired by imaging sensor 20 is compared to an image of the same field of view acquired by focusing sensor 30 (e.g., tilted focusing sensor 30) using their ratio, difference, or other calculation.
- focusing sensor 30 may comprise a single line sensor or dual line sensors designed to acquire two images of the same field of view that are reversed in terms of the focal distances represented by the pixels of the images. For example, a first one of the two images has pixels representing the lowest focal distance at a first side (e.g., left side) of the captured field of view and pixels representing the highest focal distance at a second side of the captured field of view that is opposite the first side (e.g., right side), whereas the second one of the two images has pixels representing the highest focal distance at the first side (e.g., left side) of the captured field of view and pixels representing the lowest focal distance at the second side (e.g., right side) of the captured field of view.
- a line of pixels in the center of each image will be parfocal between the two images, and the corresponding lines of pixels emanating from the center to the side edges (e.g., left and right edges) of the captured field of view of each image will also be parfocal between the two images but in opposite directions.
- a vertical line of pixels in the first image that is a distance D from the center to the left edge of the field of view represented in the first image will be parfocal with a vertical line of pixels in the second image that is a distance D from the center to the right edge of the field of view in the second image.
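The distance-D correspondence above can be expressed as a simple index mapping; the helper name is illustrative:

```python
def parfocal_partner(column, width):
    """Column in the second (reversed) image that is parfocal with the
    given column of the first image: a vertical line a distance D left of
    center in one image pairs with the line a distance D right of center
    in the other."""
    center = (width - 1) / 2.0
    d = column - center
    return int(round(center - d))  # equivalent to width - 1 - column
```

For an 11-pixel-wide field of view, column 5 (the center) maps to itself, and column 2 maps to column 8.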
- FIG. 10 illustrates optical components for forming two images with reversed focal distances using two focusing sensors 30A and 30B, according to an embodiment.
- the tilt of focusing sensor 30A is reversed with respect to the tilt of focusing sensor 30B around the logical Z axis (i.e., the focus axis).
- the logical Z axis in FIG. 10 is not necessarily the same as the physical Z axis of objective lens 130, since the light may be bent (e.g., orthogonally by a beam splitter or prism mirror) after it passes through objective lens 130.
- optical paths 610A and 610B provide the same optical field of view to focusing sensors 30A and 30B, but since focusing sensors 30A and 30B are reversed in terms of their tilt, the focal distances of pixels in the two images are reversed.
- This is illustrated by the sets of three arrows labeled Z1, Z2, and Z3, which each represent a different sample focal height.
- Z1a and Z1b both represent a first focal height
- Z2a and Z2b both represent a second focal height
- Z3a and Z3b both represent a third focal height, where each of the first, second, and third focal heights are different from each other.
- FIGS. 11A and 11B illustrate optical components for forming two mirror images on a tilted focusing sensor 30, according to two different embodiments.
- one or more optical components may be used to form the field of view on a first region of tilted focusing sensor 30 and a reversed field of view on a second region of tilted focusing sensor 30.
- tilted focusing sensor 30 may be a single line sensor or a plurality of adjacent line sensors.
- FIG. 11 A illustrates optical components for forming two mirror images on a tilted focusing sensor 30, according to a first embodiment.
- a light beam 605 enters beam splitter 620 and is split into an optical path 610A, which is reflected off of mirror 630 onto a first region 30A of focusing sensor 30 such that a first image is acquired from first region 30A of focusing sensor 30, and an optical path 610B, which passes through dove prism 660.
- Dove prism 660 reverses light beam 605 so that a mirror image is formed on a second region 30B of focusing sensor 30 such that a second image is acquired from second region 30B of focusing sensor 30.
- optical path 610A provides the field of view to first region 30A of focusing sensor 30, and optical path 610B provides a mirrored field of view to second region 30B of focusing sensor 30.
- the second image is a mirror image of the first image around the logical Z axis. Since the angle of tilt ( θ ) is the same in both first region 30A and second region 30B of tilted focusing sensor 30, the fields of view depicted in the first and second images are reversed in terms of the direction of the focal distances (e.g., from highest to lowest) at which they were acquired.
- FIG. 11B illustrates optical components for forming two mirror images on a tilted focusing sensor 30, according to a second embodiment.
- a light beam 605 enters beam splitter 620 and is split into an optical path 610A, which is reflected off of mirror 630A (e.g., a flat plate) to a first region 30A of tilted focusing sensor 30, and an optical path 610B.
- Optical path 610B reflects off of two surfaces of mirror 630B back into beam splitter 620, where it is reflected onto a second region 30B of tilted focusing sensor 30.
- Light beam 605 traveling on optical path 610B is reversed such that it produces an image on second region 30B of tilted focusing sensor 30 that is the mirror image of the image formed in optical path 610A on first region 30A of tilted focusing sensor 30. Since the angle of tilt ( ⁇ ) is the same in both first region 30A and the second region 30B of tilted focusing sensor 30, the fields of view depicted in the first and second images are reversed in terms of the direction (e.g., from highest to lowest) of the focal distances at which they were acquired.
- FIG. 12A illustrates the directionality of the focal distances for the two images acquired by regions 30A and 30B of focusing sensor 30 in the embodiments illustrated in FIGS. 10, 11A, and 11B, according to an embodiment.
- a comparison of these two images using a difference, a ratio, or another calculation may provide the amount of movement and direction of movement required to place objective lens 130 at a focus height on the Z axis that achieves the best focus for focusing sensor 30.
- the center or a parfocal point of each of the region(s) of focusing sensor 30 (i.e., each region corresponding to its own separate optical path 610) is parfocal with the others, as well as with imaging sensor 20.
- determining a best focus for a given region of sample 120 comprises identifying a focus height for objective lens 130, such that the best foci for the two images acquired by focusing sensor 30 are at the center or a parfocal point of the respective regions of focusing sensor 30 that acquired the images.
- the focus height of objective lens 130 that corresponds to the centers or parfocal points of both regions is also the focus height at which imaging sensor 20 (which is parfocal with the centers of both regions of focusing sensor 30 or with a parfocal point, e.g., determined during system calibration) is at the best focus for the given region of sample 120.
- FIG. 12B illustrates the focus functions for two reversed images acquired by regions 30A and 30B of focusing sensor 30.
- in an embodiment in which a mirrored image is acquired (e.g., by region 30B in FIGS. 11A and 11B), the mirrored image is inverted by software or other means prior to operations performed with respect to the two reversed images.
- This inversion of the mirrored image results in the two images no longer being mirror images of each other in terms of content.
- the two images represent the same field of view in the same orientation.
- while the orientation of the field of view represented by the images is the same, the directions of their focal distances are reversed.
- the content on one side of a first one of the images will have been acquired at focal distance Z1, while the same content on the same side of the second one of the images will have been acquired at focal distance Z3, and the content on the other side of the first image will have been acquired at focal distance Z3, while the same content on the same side of the second image will have been acquired at Z1.
- the centers of the images will both have been acquired at Z2.
- the focus function may be a function of contrast within the reversed images.
- the functions may return a contrast measure for a given position x along each region of focusing sensor 30 (e.g., regions 30A and 30B) that acquires one of the reversed images (e.g., a root mean square of contrast values at a position x).
- Cb represents the contrast measures for region 30B of focusing sensor 30, which acquired the reversed image
- Ca represents the contrast measures for region 30A of focusing sensor 30, which acquired the non-reversed image.
- C2a and C2b represent the contrast measures for the middle portions of the reversed images
- C1a and C1b represent the contrast measures for the corresponding portions of one side of the reversed images
- C3a and C3b represent the contrast measures for the corresponding portions of the other side of the reversed images.
- the ratio Ca/Cb will be close to 1.0 across the entire field of view of focusing sensor 30 when the best foci for both images are centered in their corresponding regions (e.g., regions 30A and 30B) of focusing sensor 30.
- a command may be sent in a feedback loop to move objective lens 130 along the Z axis in a direction such that the minimum of C1a/C1b moves towards parfocal point P.
- a command may be sent in a feedback loop to move objective lens 130 along the Z axis in a direction such that the maximum of C3a/C3b moves towards parfocal point P.
- the same algorithm may be applied to the other half of the ratio data centered to parfocal point P (i.e., the right-hand side of the curve).
- the second set of data can be used in cases where half of the field of view contains no tissue or non-useful data, or simply for redundancy to increase a success rate.
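One simple way to derive a signed focus error from the two reversed images is to compare the positions of their contrast peaks, which shift in opposite directions under de-focus. This is a sketch of the idea rather than the specific ratio or difference calculation described above, and the function name is an assumption:

```python
import numpy as np

def focus_error(c_a, c_b):
    """Signed focus error (in sensor pixels) from two images whose focal
    directions are reversed: de-focus shifts the contrast peak of each
    image in opposite directions, so half the peak separation gives both
    the magnitude and the direction of the required Z move."""
    peak_a = int(np.argmax(c_a))  # peak position in the non-reversed image
    peak_b = int(np.argmax(c_b))  # peak position in the reversed image
    return (peak_a - peak_b) / 2.0
```

A zero return value corresponds to the in-focus condition in which both peaks coincide at the parfocal point.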
- focusing sensor 30 may be a single focusing sensor comprising the multiple regions, or a plurality of focusing sensors each consisting of one of the multiple regions. Furthermore, in embodiments in which a plurality of focusing sensors are used as the regions of focusing sensor 30, each of the plurality of focusing sensors may be arranged in the same plane as each other, or in different planes from each other, depending on the particular design.
- FIG. 13 illustrates a method for real-time focusing, according to an embodiment.
- Calibration step 1302 may comprise locating a parfocal point P (e.g., parfocal with imaging sensor 20) on a tilted focusing sensor 30 (in embodiments which utilize a tilted focusing sensor), determining an illumination profile for images from imaging sensor 20, and/or determining an illumination profile for images from focusing sensor 30.
- calibration step 1302 may be performed only once for a particular system 11, or periodically for the system 11 if recalibration is needed or desired.
- the real-time focusing process may begin in step 1304, in which one or more focus points (preferably three or more) are acquired using a focus-buffer method.
- Each focus point may comprise an X, Y, and Z position, where the X and Y positions represent a position in a plane of sample 120, and the Z position represents a focus height of objective lens 130.
- each focus point is obtained by positioning objective lens 130 over an X-Y position on sample 120 and sweeping objective lens 130 from one end of its height range to the other to determine the focus height providing the best focus (e.g., peak of a contrast function) at that X-Y position.
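The sweep described above might be sketched as follows; `measure_contrast` stands in for acquiring and scoring a line image at a given height, and the sampling count is an illustrative choice:

```python
import numpy as np

def acquire_focus_point(x, y, z_range, measure_contrast, n_samples=20):
    """Acquire one (X, Y, Z) focus point with the focus-buffer method:
    sweep the objective through its height range at position (x, y) and
    keep the height giving the peak contrast measure.
    """
    heights = np.linspace(z_range[0], z_range[1], n_samples)
    contrasts = [measure_contrast(z) for z in heights]
    best = int(np.argmax(contrasts))
    return (x, y, float(heights[best]))

# Synthetic contrast function peaking at Z = 3.0.
pt = acquire_focus_point(1.0, 2.0, (0.0, 6.0),
                         lambda z: -(z - 3.0) ** 2, n_samples=13)
```

In practice the sampled contrasts would be fit with a curve (as in FIG. 8) rather than simply taking the best sample.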
- a reference plane is created using the focus points obtained in step 1304. It should be understood that a reference plane can be created from as few as three focus points. When there are more than three focus points, focus points that are outliers with respect to a flat reference plane may be discarded. Otherwise, all focus points may be used to fit a reference plane.
- a focal surface, instead of a reference plane, may be created from any plurality of focus points. Different embodiments for creating a reference plane or focal surface are described in U.S. Patent App. No. 09/563,437, filed on May 3, 2000 and issued as U.S. Patent No. 6,711,283 on March 23, 2004, and U.S. Patent App. No. 10/827,207, filed on April 16, 2004 and issued as U.S. Patent No. 7,518,652 on April 14, 2009, the entireties of both of which are hereby incorporated herein by reference.
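A least-squares reference-plane fit from three or more focus points might look like the following sketch; outlier rejection is omitted, and the function names are illustrative:

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares fit of a plane z = a*x + b*y + c to focus points
    (each an (x, y, z) triple). Three points define the plane exactly;
    with more points, outliers could be discarded before refitting."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return a, b, c

def plane_z(coeffs, x, y):
    """Focus height predicted by the reference plane at (x, y)."""
    a, b, c = coeffs
    return a * x + b * y + c

# Four focus points lying on the plane z = 0.1x + 0.2y + 5.
coeffs = fit_reference_plane([(0, 0, 5.0), (10, 0, 6.0),
                              (0, 10, 7.0), (10, 10, 8.0)])
```

`plane_z` would supply the Z position used in step 1308 for each X-Y position to be scanned.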
- in step 1308, objective lens 130 is moved to a Z position defined by the reference plane as a function of the X-Y position to be scanned.
- in step 1310, a focusing image is acquired from focusing sensor 30.
- in step 1320, a main image is acquired from imaging sensor 20.
- in step 1312, the illumination in the focusing image acquired in step 1310 is corrected using any well-known illumination-correction technique.
- the illumination in the main image acquired in step 1320 is corrected using any well-known illumination-correction technique.
- the illumination correction for the focusing image may be based on the illumination profile for focusing sensor 30 that was determined in calibration step 1302, and the illumination correction for the main image may be based on the illumination profile for imaging sensor 20 that was determined in calibration step 1302.
- in step 1314, an absolute gradient of the illumination-corrected focusing image is calculated.
- in step 1324, an absolute gradient of the illumination-corrected main image is calculated.
- in step 1316, the rows in the focusing image gradient calculated in step 1314 are averaged.
- in step 1326, the rows in the main image gradient calculated in step 1324 are averaged.
- in step 1318, a low-pass filter is applied to the focusing image gradient.
- in step 1328, a low-pass filter is applied to the main image gradient.
- in step 1330, it is determined whether or not the background area (i.e., the area of the image without tissue) in the main image is less than the tissue area (i.e., the area of the image with tissue) in the main image. If the background area is greater than the tissue area in the main image (i.e., "No" in step 1330), the process may return to step 1308. Otherwise, if the background area is less than the tissue area in the main image (i.e., "Yes" in step 1330), the process may proceed to step 1332.
- in step 1332, ratio(s) are calculated between the focusing image gradient and the main image gradient.
- for example, the focusing image gradient may be divided by the main image gradient.
- in step 1334, a peak is fit to the ratio(s) calculated in step 1332 with minimal error. For example, a best-fit curve may be found for the ratio(s).
- in step 1336, the peak of the fitting in step 1334 is determined. For example, in an embodiment in which a best-fit curve is found for the ratio(s) in step 1334, the peak of the best-fit curve may be identified in step 1336.
- in step 1338, if the peak identified in step 1336 is not at the parfocal point P, objective lens 130 is moved until the peak is at the parfocal point P, for example, using a feedback loop as described elsewhere herein.
- in step 1340, it is determined whether or not the scan is complete. If the scan is not complete (i.e., "No" in step 1340), the process returns to steps 1310 and 1320. Otherwise, if the scan is complete (i.e., "Yes" in step 1340), the process ends.
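The per-image processing of steps 1312 through 1336 (illumination correction, absolute gradient, row averaging, low-pass filtering, gradient ratio, and peak fitting) might be sketched as follows; the kernel size, the moving-average filter, and the parabolic fit are illustrative choices, not specifics from the disclosure:

```python
import numpy as np

def gradient_profile(image, illumination_profile, kernel=5):
    """Steps 1312-1318 (or 1322-1328) for one image: correct illumination,
    take the absolute gradient along the line direction, average the rows,
    and low-pass filter the resulting 1-D profile."""
    corrected = image / illumination_profile           # illumination correction
    grad = np.abs(np.diff(corrected, axis=1))          # absolute gradient
    profile = grad.mean(axis=0)                        # average the rows
    # Simple moving-average low-pass filter.
    return np.convolve(profile, np.ones(kernel) / kernel, mode="same")

def focus_peak(focus_img, main_img, focus_illum, main_illum):
    """Steps 1332-1336: ratio of the focusing-image gradient profile to the
    main-image gradient profile, parabola fit, and fitted peak position."""
    ratio = (gradient_profile(focus_img, focus_illum)
             / gradient_profile(main_img, main_illum))
    x = np.arange(ratio.size)
    a, b, c = np.polyfit(x, ratio, 2)
    return -b / (2 * a) if a < 0 else float(np.argmax(ratio))

# Synthetic inputs: a single sharp line in the focusing image, uniform
# fine texture in the main image, flat illumination profiles.
f = np.zeros((2, 12)); f[:, 5] = 1.0
m = np.tile(np.array([0.0, 1.0]), (2, 6))
peak = focus_peak(f, m, 1.0, 1.0)
```

In step 1338, the returned peak position would be compared against the parfocal point P and the objective moved accordingly.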
- FIGS. 14A and 14B are block diagrams illustrating example microscope slide scanners, according to an embodiment.
- FIG. 14C is a block diagram illustrating example linear sensor arrays, according to an embodiment. These three figures will be described in more detail below. However, they will first be described in combination to provide an overview. It should be noted that the following description is just an example of a slide scanner device and that alternative slide scanner devices can also be employed.
- FIGS. 14A and 14B illustrate example microscope slide scanners that can be used in conjunction with the disclosed sensor arrangement.
- FIG. 14C illustrates example linear sensors, which can be used in any combination as the disclosed sensors (imaging sensor 20 or focusing sensor 30).
- imaging sensor 20 and focusing sensor 30 may be arranged, as discussed above, using line scan camera 18 as primary imaging sensor 20.
- line scan camera 18 may include both focusing sensor 30 and imaging sensor 20.
- Imaging sensor 20 and focusing sensor 30 can receive image information from a sample 120 through the microscope objective lens 130 and/or the focusing optics 34 and 290.
- Focusing optics 290 for focusing sensor 30 may comprise the various beam splitters 620, mirrors 630, and glass blocks 640 illustrated in FIGS. 6-8.
- Imaging sensor 20 and focusing sensor 30 can provide information to, and/or receive information from, data processor 21.
- Data processor 21 is communicatively connected to memory 36 and data storage 38.
- Data processor 21 may further be communicatively connected to a communications port, which may be connected by at least one network 42 to one or more computers 44, which may in turn be connected to display monitor(s) 46.
- Data processor 21 may also be communicatively connected to and provide instructions to a stage controller 22, which controls a motorized stage 14 of slide scanner 11.
- Motorized stage 14 supports sample 120 and moves in one or more directions in the X-Y plane. In one embodiment, motorized stage 14 may also move along the Z axis.
- Data processor 21 may also be communicatively connected to and provide instructions to a motorized controller 26, which controls a motorized positioner 24 (e.g., a piezo positioner). Motorized positioner 24 is configured to move objective lens 130 in the Z axis.
- Slide scanner 11 also comprises a light source 31 and/or illumination optics 32 to illuminate sample 120, either from above or below.
- FIG. 14A is a block diagram of an embodiment of an optical microscopy system 10, according to an embodiment.
- the heart of system 10 is a microscope slide scanner 11 that serves to scan and digitize a specimen or sample 120.
- Sample 120 can be anything that may be interrogated by optical microscopy.
- sample 120 may comprise a microscope slide or other sample type that may be interrogated by optical microscopy.
- a microscope slide is frequently used as a viewing substrate for specimens that include tissues and cells, chromosomes, DNA, protein, blood, bone marrow, urine, bacteria, beads, biopsy materials, or any other type of biological material or substance that is either dead or alive, stained or unstained, labeled or unlabeled.
- Sample 120 may also be an array of any type of DNA or DNA-related material such as cDNA or RNA or protein that is deposited on any type of slide or other substrate, including any and all samples commonly known as microarrays.
- Sample 120 may be a microtiter plate, for example a 96-well plate.
- Other examples of sample 120 include integrated circuit boards, electrophoresis records, petri dishes, film, semiconductor materials, forensic materials, or machined parts.
- Scanner 11 includes a motorized stage 14, a microscope objective lens 130, a line scan camera 18, and a data processor 21.
- Sample 120 is positioned on motorized stage 14 for scanning.
- Motorized stage 14 is connected to a stage controller 22 which is connected in turn to data processor 21.
- Data processor 21 determines the position of sample 120 on motorized stage 14 via stage controller 22.
- motorized stage 14 moves sample 120 in at least the two axes (x/y) that are in the plane of sample 120. Fine movements of sample 120 along the optical z-axis may also be necessary for certain applications of scanner 11, for example, for focus control.
- Z-axis movement may be accomplished with a piezo positioner 24, such as the PIFOC from Polytec PI or the MIPOS 3 from Piezosystem Jena.
- Piezo positioner 24 is attached directly to microscope objective 130 and is connected to and directed by data processor 21 via piezo controller 26.
- a means of providing a coarse focus adjustment may also be needed and can be provided by Z-axis movement as part of motorized stage 14 or a manual rack-and-pinion coarse focus adjustment (not shown).
- motorized stage 14 includes a high-precision positioning table with ball bearing linear ways to provide smooth motion and excellent straight line and flatness accuracy.
- motorized stage 14 could include two Daedal model 106004 tables stacked one on top of the other.
- motorized stages 14 are also suitable for scanner 11, including stacked single-axis stages based on ways other than ball bearings, single- or multiple-axis positioning stages that are open in the center and are particularly suitable for trans-illumination from below the sample, or larger stages that can support a plurality of samples.
- motorized stage 14 includes two stacked single-axis positioning tables, each coupled to a lead screw with a two-millimeter lead and a Nema-23 stepping motor. At the maximum lead-screw speed of twenty-five revolutions per second, the maximum speed of sample 120 on motorized stage 14 is fifty millimeters per second. Selection of a lead screw with a larger lead, for example five millimeters, can increase the maximum speed to more than 100 millimeters per second.
- Motorized stage 14 can be equipped with mechanical or optical position encoders, but these add significant expense to the system; consequently, such an embodiment does not include position encoders. However, if servo motors were used in place of stepping motors, position feedback would be required for proper control.
- stage controller 22 includes a 2-axis servo/stepper motor controller (Compumotor 6K2) and two 4-amp microstepping drives (Compumotor OEMZL4).
- Microstepping provides a means for commanding the stepper motor in much smaller increments than the relatively large single 1.8 degree motor step. For example, at a microstep of 100, sample 120 can be commanded to move at steps as small as 0.1 micrometer. In an embodiment, a microstep of 25,000 is used. Smaller step sizes are also possible. It should be understood that the optimum selection of motorized stage 14 and stage controller 22 depends on many factors, including the nature of sample 120, the desired time for sample digitization, and the desired resolution of the resulting digital image of sample 120.
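The stage-speed and microstep arithmetic above can be checked with a short calculation. This is a sketch using only the example values stated in the text (two-millimeter lead, twenty-five revolutions per second, 1.8-degree full steps, microstep of 100):

```python
# Worked check of the lead-screw speed and microstep-size figures above.
FULL_STEPS_PER_REV = 200  # 360 degrees / 1.8 degrees per full motor step

def max_stage_speed_mm_s(lead_mm, max_rev_per_s):
    """Linear stage speed at the maximum lead-screw rotation rate."""
    return lead_mm * max_rev_per_s

def microstep_size_um(lead_mm, microsteps_per_full_step):
    """Smallest commanded move, in micrometers, at a given microstep setting."""
    steps_per_rev = FULL_STEPS_PER_REV * microsteps_per_full_step
    return lead_mm * 1000 / steps_per_rev

print(max_stage_speed_mm_s(2, 25))   # 50 mm/s, as stated
print(max_stage_speed_mm_s(5, 25))   # 125 mm/s (more than 100 mm/s)
print(microstep_size_um(2, 100))     # 0.1 micrometer per microstep
```

The five-millimeter case confirms why a larger lead raises the maximum speed above 100 millimeters per second.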
- Microscope objective lens 130 can be any microscope objective lens commonly available. One of ordinary skill in the art will recognize that the choice of which objective lens to use will depend on the particular circumstances. In an embodiment, microscope objective lens 130 is of the infinity-corrected type.
- Sample 120 is illuminated by an illumination system 28 that includes a light source 31 and illumination optics 32.
- light source 31 includes a variable intensity halogen light source with a concave reflective mirror to maximize light output and a KG-1 filter to suppress heat.
- light source 31 could also be any other type of arc-lamp, laser, light emitting diode (“LED”), or other source of light.
- illumination optics 32 include a standard Kohler illumination system with two conjugate planes that are orthogonal to the optical axis. Illumination optics 32 are representative of the bright-field illumination optics that can be found on most commercially-available compound microscopes sold by companies such as Leica, Carl Zeiss, Nikon, or Olympus.
- One set of conjugate planes includes (i) a field iris aperture illuminated by light source 31, (ii) the object plane that is defined by the focal plane of sample 120, and (iii) the plane containing the light-responsive elements of line scan camera 18.
- a second conjugate plane includes (i) the filament of the bulb that is part of light source 31, (ii) the aperture of a condenser iris that sits immediately before the condenser optics that are part of illumination optics 32, and (iii) the back focal plane of the microscope objective lens 130.
- sample 120 is illuminated and imaged in transmission mode, with line scan camera 18 sensing optical energy that is transmitted by sample 120, or conversely, optical energy that is absorbed by sample 120.
- Scanner 11 is equally suitable for detecting optical energy that is reflected from sample 120, in which case light source 31, illumination optics 32, and microscope objective lens 130 must be selected based on compatibility with reflection imaging.
- a possible embodiment may therefore include illumination through a fiber optic bundle that is positioned above sample 120.
- Other possibilities include excitation that is spectrally conditioned by a monochromator.
- microscope objective lens 130 is selected to be compatible with phase-contrast microscopy, then the incorporation of at least one phase stop in the condenser optics that are part of illumination optics 32 will enable scanner 11 to be used for phase contrast microscopy.
- the modifications required for other types of microscopy such as differential interference contrast and confocal microscopy should be readily apparent.
- scanner 11 is suitable, with appropriate but well-known modifications, for the interrogation of microscopic samples in any known mode of optical microscopy.
- line scan camera focusing optics 34 that focus the optical signal captured by microscope objective lens 130 onto the light-responsive elements of line scan camera 18 (e.g., imaging sensor 20).
- the focusing optics between the microscope objective lens and the eyepiece optics, or between the microscope objective lens and an external imaging port, comprise an optical element known as a tube lens that is part of a microscope's observation tube. The tube lens often consists of multiple optical elements to prevent the introduction of coma or astigmatism.
- Infinity-corrected microscope objective lenses are typically inscribed with an infinity mark.
- the magnification of an infinity-corrected microscope objective lens is given by the focal length of the tube lens divided by the focal length of the objective lens. For example, a tube lens with a focal length of 180 millimeters will result in 20x magnification if an objective lens with a 9 millimeter focal length is used.
- One of the reasons that the objective lenses manufactured by different microscope manufacturers are not compatible is because of a lack of standardization in the tube lens focal length.
- a 20x objective lens from Olympus, a company that uses a 180 millimeter tube lens focal length, will not provide a 20x magnification on a Nikon microscope, which is based on a different tube lens focal length of 200 millimeters.
- the effective magnification of such an Olympus objective lens engraved with 20x and having a 9 millimeter focal length will be 22.2x, obtained by dividing the 200 millimeter tube lens focal length by the 9 millimeter focal length of the objective lens.
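The magnification relationship described above is a simple ratio, and the Olympus-on-Nikon example can be verified directly. A sketch using the focal lengths given in the text:

```python
# Magnification of an infinity-corrected objective: tube lens focal length
# divided by objective focal length.
def magnification(tube_lens_focal_mm, objective_focal_mm):
    return tube_lens_focal_mm / objective_focal_mm

# A 9 mm objective behind a 180 mm tube lens (Olympus) gives its rated 20x:
print(magnification(180, 9))             # 20.0
# The same objective behind a 200 mm tube lens (Nikon) gives 22.2x instead:
print(round(magnification(200, 9), 1))   # 22.2
```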
- Changing the tube lens on a conventional microscope is virtually impossible without disassembling the microscope.
- the tube lens is part of a critical fixed element of the microscope.
- Line scan camera focusing optics 34 include a tube lens optic mounted inside of a mechanical tube. Since scanner 11, in an embodiment, lacks binoculars or eyepieces for traditional visual observation, the problem suffered by conventional microscopes of potential incompatibility between objective lenses and binoculars is immediately eliminated. One of ordinary skill will similarly realize that the problem of achieving parfocality between the eyepieces of the microscope and a digital image on a display monitor is also eliminated by virtue of not having any eyepieces. Since scanner 11 also overcomes the field of view limitation of a traditional microscope by providing a field of view that is practically limited only by the physical boundaries of sample 120, the importance of magnification in an all-digital imaging microscope such as provided by scanner 11 is limited.
- Scanner 11 provides diffraction-limited digital imaging by selection of a tube lens focal length that is matched according to the well-known Nyquist sampling criteria to both the size of an individual pixel element in a light-sensing camera such as line scan camera 18 and to the numerical aperture of microscope objective lens 130. It is well known that numerical aperture, not magnification, is the resolution-limiting attribute of a microscope objective lens.
- the tube lens must be selected to achieve a magnification of 46.7, obtained by dividing 28 micrometers, which corresponds to two 14 micrometer pixels, by 0.6 micrometers, the smallest resolvable feature dimension.
- the optimum tube lens optic focal length is therefore about 420 millimeters, obtained by multiplying 46.7 by 9.
- Line scan focusing optics 34 with a tube lens optic having a focal length of 420 millimeters will therefore be capable of acquiring images with the best possible spatial resolution, similar to what would be observed by viewing a specimen under a microscope using the same 20x objective lens.
- scanner 11 utilizes a traditional 20x microscope objective lens 130 in a higher magnification optical configuration (about 47x in the example above) in order to acquire diffraction-limited digital images.
- If a traditional 20x magnification objective lens 130 with a higher numerical aperture were used, say 0.75, the required tube lens optic focal length for diffraction-limited imaging would be about 615 millimeters, corresponding to an overall optical magnification of 68x.
- If the numerical aperture of the 20x objective lens were only 0.3, the optimum tube lens optic magnification would be only about 28x, which corresponds to a tube lens optic focal length of approximately 252 millimeters.
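The Nyquist-matching calculation that runs through the three examples above can be sketched as follows. The 14-micrometer pixel size, the two-pixels-per-feature criterion, and the 9 millimeter objective focal length are the values from the text; the 0.41-micrometer feature size for the NA 0.75 case is inferred from the stated 68x result:

```python
def required_tube_lens_focal_mm(pixel_um, resolvable_feature_um, objective_focal_mm):
    """Tube lens focal length that maps two camera pixels onto the smallest
    resolvable feature, per the Nyquist sampling criterion."""
    required_mag = 2 * pixel_um / resolvable_feature_um
    return required_mag * objective_focal_mm

# 0.6 um feature (NA 0.5 example): ~46.7x, requiring a ~420 mm tube lens
print(round(required_tube_lens_focal_mm(14, 0.6, 9)))    # 420
# ~0.41 um feature (NA 0.75 example): ~68x, requiring a ~615 mm tube lens
print(round(required_tube_lens_focal_mm(14, 0.41, 9)))   # 615
# 1.0 um feature (NA 0.3 example): 28x, requiring a 252 mm tube lens
print(required_tube_lens_focal_mm(14, 1.0, 9))           # 252.0
```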
- Line scan camera focusing optics 34 may be modular elements of scanner 11 that can be interchanged as necessary for optimum digital imaging.
- the advantage of diffraction-limited digital imaging is particularly significant for applications, for example bright field microscopy, in which the reduction in signal brightness that accompanies increases in magnification is readily compensated by increasing the intensity of an appropriately designed illumination system 28.
- In a conventional microscope, different power objective lenses are typically used to view the specimen at different resolutions and magnifications. Standard microscopes have a nosepiece that holds five objective lenses. In an all-digital imaging system, such as scanner 11, there is a need for only one microscope objective lens 130 with a numerical aperture corresponding to the highest spatial resolution desirable. Thus, in an embodiment, scanner 11 consists of only one microscope objective lens 130. Once a diffraction-limited digital image has been captured at this resolution, it is straightforward, using standard digital image-processing techniques, to present imagery information at any desired reduced resolutions and magnifications.
- scanner 11 is based on a Dalsa SPARK line scan camera 18 with 1024 pixels (picture elements) arranged in a linear array, with each pixel having a dimension of 14 by 14 micrometers. Any other type of linear array, whether packaged as part of a camera or custom-integrated into an imaging electronic module, can also be used.
- the linear array in one embodiment effectively provides eight bits of quantization, but other arrays providing higher or lower level of quantization may also be used. Alternate arrays based on three-channel red-green-blue (RGB) color information or time delay integration (TDI), may also be used.
- TDI arrays provide a substantially better signal-to-noise ratio (SNR) in the output signal by summing intensity data from previously imaged regions of a specimen, yielding an increase in the SNR that is in proportion to the square-root of the number of integration stages.
- TDI arrays can comprise multiple stages of linear arrays. TDI arrays are available with 24, 32, 48, 64, 96, or even more stages.
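The square-root SNR scaling stated above can be tabulated for the stage counts mentioned. A sketch, assuming only the proportionality given in the text (gain relative to a single linear array):

```python
import math

def tdi_snr_gain(stages):
    """SNR improvement of a TDI array over a single linear array,
    proportional to the square root of the number of integration stages."""
    return math.sqrt(stages)

for stages in (24, 32, 48, 64, 96):
    print(f"{stages:3d} stages -> ~{tdi_snr_gain(stages):.1f}x SNR gain")
```

For example, a 64-stage TDI array yields an 8x SNR improvement while maintaining the same line rate.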
- Scanner 11 also supports linear arrays that are manufactured in a variety of formats including some with 512 pixels, some with 1024 pixels, and others having as many as 4096 pixels. Appropriate, but well known, modifications to illumination system 28 and line scan camera focusing optics 34 may be required to accommodate larger arrays.
- Linear arrays with a variety of pixel sizes can also be used in scanner 11.
- The salient requirement for the selection of any type of line scan camera 18 is that sample 120 can be in motion with respect to line scan camera 18 during digitization, in order to obtain high-quality images while overcoming the static requirements of the conventional image-tiling approaches known in the prior art.
- the output signal of line scan camera 18 is connected to data processor 21.
- data processor 21 includes a central processing unit with ancillary electronics (e.g., a motherboard) to support at least one signal digitizing electronics board such as an imaging board or a frame grabber.
- the imaging board is an EPIX PIXCID24 PCI bus imaging board.
- An alternative embodiment could be a line scan camera that uses an interface such as IEEE 1394, also known as Firewire, to bypass the imaging board altogether and store data directly on data storage 38 (e.g., a hard disk).
- Data processor 21 is also connected to a memory 36, such as random access memory (RAM), for the short-term storage of data, and to data storage 38, such as a hard drive, for long-term data storage. Further, data processor 21 is connected to a communications port 40 that is connected to a network 42 such as a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), an intranet, an extranet, or the Internet. Memory 36 and data storage 38 are also connected to each other. Data processor 21 is also capable of executing computer programs, in the form of software, to control critical elements of scanner 11 such as line scan camera 18 and stage controller 22, or for a variety of image-processing functions, image-analysis functions, or networking. Data processor 21 can be based on any operating system, including operating systems such as Windows, Linux, OS/2, Mac OS, and Unix. In an embodiment, data processor 21 operates based on the Windows NT operating system.
- Data processor 21, memory 36, data storage 38, and communication port 40 are each elements that can be found in a conventional computer.
- a personal computer such as a Dell Dimension XPS T500 that features a Pentium III 500 MHz processor and up to 756 megabytes (MB) of RAM.
- the computer elements which include the data processor 21, memory 36, data storage 38, and communications port 40 are all internal to scanner 11, so that the only connection of scanner 11 to the other elements of system 10 is via communication port 40.
- the computer elements could be external to scanner 11 with a corresponding connection between the computer elements and scanner 11.
- scanner 11 integrates optical microscopy, digital imaging, motorized sample positioning, computing, and network-based communications into a single-enclosure unit.
- the major advantages of packaging scanner 11 as a single-enclosure unit, with communications port 40 as the primary means of data input and output, are reduced complexity and increased reliability.
- the various elements of scanner 11 are optimized to work together, in sharp contrast to traditional microscope-based imaging systems in which the microscope, light source, motorized stage, camera, and computer are typically provided by different vendors and require substantial integration and maintenance.
- Communication port 40 provides a means for rapid communications with the other elements of system 10, including network 42.
- One communications protocol for communications port 40 is a carrier-sense multiple-access collision detection protocol such as Ethernet, together with the TCP/IP protocol for transmission control and internetworking.
- Scanner 11 is intended to work with any type of transmission media, including broadband, baseband, coaxial cable, twisted pair, fiber optics, DSL, or wireless.
- control of scanner 11 and review of the imagery data captured by scanner 11 are performed on a computer 44 that is connected to network 42.
- computer 44 is connected to a display monitor 46 to provide imagery information to an operator.
- a plurality of computers 44 may be connected to network 42.
- computer 44 communicates with scanner 11 using a network browser such as Internet Explorer™ from Microsoft™, Chrome™ from Google™, Safari™ from Apple™, etc.
- Images are stored on scanner 11 in a common compressed format, such as JPEG, which is an image format that is compatible with standard image-decompression methods that are already built into most commercial browsers. Other standard or nonstandard, lossy or lossless, image compression formats will also work.
- scanner 11 is a web server providing an operator interface that is based on web pages that are sent from scanner 11 to computer 44.
- an embodiment of scanner 11 is based on playing back, for review on display monitor 46 that is connected to computer 44, multiple frames of imagery data using standard multiple-frame browser-compatible software packages such as Media-Player™ from Microsoft™, Quicktime™ from Apple™, or RealPlayer™ from Real Networks™.
- the browser on computer 44 uses the Hypertext Transfer Protocol (HTTP) together with TCP for transmission control.
- scanner 11 could communicate with computer 44, or a plurality of computers. While one embodiment is based on standard means and protocols, the approach of developing one or multiple customized software modules known as applets is equally feasible and may be desirable for selected future applications of scanner 11. Furthermore, there are no constraints that computer 44 be of any specific type such as a personal computer (PC) or be manufactured by any specific company such as Dell™. One of the advantages of a standardized communications port 40 is that any type of computer 44, operating common network browser software, can communicate with scanner 11.
- Spectrally-resolved images are images in which spectral information is measured at every image pixel.
- Spectrally-resolved images could be obtained by replacing line scan camera 18 of scanner 11 with an optical slit and an imaging spectrograph.
- the imaging spectrograph uses a two-dimensional CCD detector to capture wavelength-specific intensity data for a column of image pixels by using a prism or grating to disperse the optical signal that is focused on the optical slit along each of the rows of the detector.
- FIG. 14B is a block diagram of a second embodiment of an optical microscopy system 10, according to an embodiment.
- scanner 11 is more complex and expensive than the embodiment shown in FIG. 14A.
- the additional attributes of scanner 11 that are shown do not all have to be present for any alternative embodiment to function correctly.
- FIG. 14B is intended to provide a reasonable example of additional features and capabilities that could be incorporated into scanner 11.
- FIG. 14B provides for a much greater level of automation than the embodiment of FIG. 14A.
- a more complete level of automation of illumination system 28 is achieved by connections between data processor 21 and both light source 31 and illumination optics 32 of illumination system 28.
- the connection to light source 31 may control the voltage, or current, in an open or closed loop fashion, in order to control the intensity of light source 31. Recall that light source 31 may be a halogen bulb.
- the connection between data processor 21 and illumination optics 32 could provide closed-loop control of the field iris aperture and the condenser iris to provide a means for ensuring that optimum Kohler illumination is maintained.
- FIG. 14B provides for a fluorescence filter cube 50 that includes an excitation filter, a dichroic filter, and a barrier filter. Fluorescence filter cube 50 is positioned in the infinity-corrected beam path that exists between microscope objective lens 130 and line scan camera focusing optics 34.
- An embodiment for fluorescence imaging could include the addition of a filter wheel or tunable filter into illumination optics 32 to provide appropriate spectral excitation for the variety of fluorescent dyes or nano-crystals available on the market.
- the addition of at least one beam splitter 52 into the imaging path allows the optical signal to be split into at least two paths.
- the primary path is via line scan camera focusing optics 34, as discussed previously, to enable diffraction-limited imaging by line scan camera 18 (which may include imaging sensor 20).
- a second path is provided via area scan camera focusing optics 54 for imaging by an area scan camera 56. It should be readily apparent that proper selection of these two focusing optics can ensure diffraction-limited imaging by the two camera sensors having different pixel sizes.
- Area scan camera 56 can be one of many types that are currently available, including a simple color video camera, a high performance, cooled, CCD camera, or a variable integration-time fast frame camera. Area scan camera 56 provides a traditional imaging system configuration for scanner 11. Area scan camera 56 is connected to data processor 21.
- both camera types could be connected to the data processor using either a single dual-purpose imaging board, two different imaging boards, or the IEEE 1394 FireWire interface, in which case one or both imaging boards may not be needed.
- Other related methods of interfacing imaging sensors to data processor 21 are also available.
- While the primary interface of scanner 11 to computer 44 is via network 42, there may be instances, for example a failure of network 42, where it is beneficial to be able to connect scanner 11 directly to a local output device such as display monitor 58 and to also provide local input devices such as a keyboard and mouse 59 that are connected directly into data processor 21 of scanner 11. In this instance, the appropriate driver software and hardware would have to be provided as well.
- the second embodiment shown in FIG. 14B also provides for a much greater level of automated imaging performance. Enhanced automation of the imaging of scanner 11 can be achieved by closing the focus-control loop comprising piezo positioner 24, piezo controller 26, and data processor 21 using well-known methods of autofocus.
- the second embodiment also provides for a motorized nose-piece 62 to accommodate several objective lenses.
- the motorized nose-piece 62 is connected to and directed by data processor 21 through a nose-piece controller 64.
- there are other features and capabilities of scanner 11 which could be incorporated.
- the process of scanning sample 120 with respect to microscope objective lens 130 that is substantially stationary in the X-Y plane of sample 120 could be modified to comprise scanning of microscope objective lens 130 with respect to a stationary sample 120 (i.e., moving microscope objective lens 130 in an X-Y plane).
- Scanning sample 120, or scanning microscope objective lens 130, or scanning both sample 120 and microscope objective lens 130 simultaneously are possible embodiments of scanner 11 which can provide the same large contiguous digital image of sample 120 as discussed previously.
- Scanner 11 also provides a general purpose platform for automating many types of microscope-based analyses.
- Illumination system 28 could be modified from a traditional halogen lamp or arc-lamp to a laser-based illumination system to permit scanning of sample 120 with laser excitation.
- Modifications, including the incorporation of a photomultiplier tube or other non-imaging detector, in addition to or in lieu of line scan camera 18 or area scan camera 56, could be used to provide a means of detecting the optical signal resulting from the interaction of the laser energy with sample 120.
- line scan camera field of view 250 comprises the region of sample 120 of FIG. 14A that is imaged by a multitude of individual pixel elements 72 that are arranged in a linear fashion into a linear array 74.
- Linear array 74 of an embodiment comprises 1024 of the individual pixel elements 72, with each of pixel elements 72 being 14 micrometers square.
- the physical dimensions of linear array 74 are 14.34 millimeters by 14 micrometers.
- line scan camera field of view 250 corresponds to a region of sample 120 that has dimensions equal to 1.43 millimeters by 1.4 micrometers.
- Each pixel element 72 images an area about 1.4 micrometers by 1.4 micrometers.
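The field-of-view geometry above follows from the array dimensions and the optical magnification. A sketch, assuming the 10x magnification implied by the text (14-micrometer pixels each imaging a 1.4-micrometer sample region):

```python
def fov_mm(num_pixels, pixel_um, magnification):
    """Width of the sample region imaged by a linear array."""
    array_width_mm = num_pixels * pixel_um / 1000
    return array_width_mm / magnification

def pixel_footprint_um(pixel_um, magnification):
    """Width of the sample region imaged by one pixel element."""
    return pixel_um / magnification

print(round(1024 * 14 / 1000, 2))      # 14.34 mm physical array width
print(round(fov_mm(1024, 14, 10), 2))  # 1.43 mm field of view on the sample
print(pixel_footprint_um(14, 10))      # 1.4 micrometers per pixel
```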
- the scanning and digitization is performed in a direction of travel that alternates between image strips.
- This type of bi-directional scanning provides for a more rapid digitization process than uni-directional scanning, a method of scanning and digitization which requires the same direction of travel for each image strip.
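The bi-directional strip ordering described above amounts to a serpentine traversal, which can be sketched as follows. This is a hypothetical illustration of the ordering, not the scanner's actual control code:

```python
def strip_scan_directions(num_strips, bidirectional=True):
    """Direction of travel for each image strip. Bi-directional scanning
    alternates direction between strips; uni-directional scanning uses the
    same direction for every strip and must retrace between strips,
    which slows the digitization process."""
    if bidirectional:
        return ["forward" if i % 2 == 0 else "reverse" for i in range(num_strips)]
    return ["forward"] * num_strips

print(strip_scan_directions(4))         # ['forward', 'reverse', 'forward', 'reverse']
print(strip_scan_directions(4, False))  # ['forward', 'forward', 'forward', 'forward']
```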
- The characteristics of line scan camera 18 (e.g., comprising imaging sensor 20) and focusing sensor 30 typically determine whether scanning and focusing can be done bi-directionally or uni-directionally.
- Uni-directional systems often comprise more than one linear array 74, such as a three channel color array 86 or a multi-channel TDI array 88 shown in FIG. 14C.
- Color array 86 detects the RGB intensities required for obtaining a color image.
- An alternative embodiment for obtaining color information uses a prism to split the broadband optical signal into the three color channels.
- TDI array 88 could be used in an alternate embodiment of scanner 11 to provide a means of increasing the effective integration time of line scan camera 18, while maintaining a fast data rate, and without significant loss in the signal-to-noise ratio of the digital imagery data.
- FIG. 15 is a block diagram illustrating an example wired or wireless system 1500 that may be used in connection with various embodiments described herein.
- system 1500 may be used as or in conjunction with one or more of the mechanisms, processes, methods, or functions described above, and may represent components of slide scanner 11, such as data processor 21.
- System 1500 can be any processor-enabled device that is capable of wired or wireless data communication.
- Other computer systems and/or architectures may be also used, as will be clear to those skilled in the art.
- System 1500 preferably includes one or more processors, such as processor 1510. Additional processors may be provided, such as an auxiliary processor to manage input/output, an auxiliary processor to perform floating point mathematical operations, a special-purpose microprocessor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processing system (e.g., back-end processor), an additional microprocessor or controller for dual or multiple processor systems, or a coprocessor.
- auxiliary processors may be discrete processors or may be integrated with the processor 1510. Examples of processors which may be used with system 1500 include, without limitation, the Pentium® processor, Core i7® processor, and Xeon® processor, all of which are available from Intel Corporation of Santa Clara, California.
- Processor 1510 is preferably connected to a communication bus 1505.
- Communication bus 1505 may include a data channel for facilitating information transfer between storage and other peripheral components of system 1500.
- Communication bus 1505 further may provide a set of signals used for communication with processor 1510, including a data bus, address bus, and control bus (not shown).
- Communication bus 1505 may comprise any standard or non-standard bus architecture such as, for example, bus architectures compliant with industry standard architecture (ISA), extended industry standard architecture (EISA), Micro Channel Architecture (MCA), peripheral component interconnect (PCI) local bus, or standards promulgated by the Institute of Electrical and Electronics Engineers (IEEE) including IEEE 488 general-purpose interface bus (GPIB), IEEE 696/S-100, and the like.
- System 1500 preferably includes a main memory 1515 and may also include a secondary memory 1520.
- Main memory 1515 provides storage of instructions and data for programs executing on processor 1510, such as one or more of the functions and/or modules discussed above. It should be understood that programs stored in the memory and executed by processor 1510 may be written and/or compiled according to any suitable language, including without limitation C/C++, Java, JavaScript, Perl, Visual Basic, .NET, and the like.
- Main memory 1515 is typically semiconductor-based memory such as dynamic random access memory (DRAM) and/or static random access memory (SRAM). Other semiconductor-based memory types include, for example, synchronous dynamic random access memory (SDRAM), Rambus dynamic random access memory (RDRAM), ferroelectric random access memory (FRAM), and the like, including read only memory (ROM).
- Secondary memory 1520 may optionally include an internal memory 1525 and/or a removable medium 1530.
- Removable medium 1530 is read from and/or written to in any well-known manner.
- Removable storage medium 1530 may be, for example, a magnetic tape drive, a compact disc (CD) drive, a digital versatile disc (DVD) drive, other optical drive, a flash memory drive, etc.
- Removable storage medium 1530 is a non-transitory computer-readable medium having stored thereon computer-executable code (i.e., software) and/or data.
- The computer software or data stored on removable storage medium 1530 is read into system 1500 for execution by processor 1510.
- Secondary memory 1520 may include other similar means for allowing computer programs or other data or instructions to be loaded into system 1500. Such means may include, for example, an external storage medium 1545 and a communication interface 1540 (e.g., communication port 40), which allows software and data to be transferred from external storage medium 1545 to system 1500. Examples of external storage medium 1545 may include an external hard disk drive, an external optical drive, an external magneto-optical drive, etc. Other examples of secondary memory 1520 may include semiconductor-based memory such as programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable read-only memory (EEPROM), or flash memory (block-oriented memory similar to EEPROM).
- System 1500 may include a communication interface 1540.
- Communication interface 1540 allows software and data to be transferred between system 1500 and external devices (e.g. printers), networks, or other information sources.
- Computer software or executable code may be transferred to system 1500 from a network server via communication interface 1540.
- Examples of communication interface 1540 include a built-in network adapter, a network interface card (NIC), a Personal Computer Memory Card International Association (PCMCIA) network card, a card bus network adapter, a wireless network adapter, a Universal Serial Bus (USB) network adapter, a modem, a wireless data card, a communications port, an infrared interface, an IEEE 1394 FireWire interface, or any other device capable of interfacing system 1500 with a network or another computing device.
- Communication interface 1540 preferably implements industry-promulgated protocol standards, such as Ethernet IEEE 802 standards, Fibre Channel, digital subscriber line (DSL), asymmetric digital subscriber line (ADSL), frame relay, asynchronous transfer mode (ATM), integrated services digital network (ISDN), personal communications services (PCS), transmission control protocol/Internet protocol (TCP/IP), serial line Internet protocol/point-to-point protocol (SLIP/PPP), and so on, but may also implement customized or non-standard interface protocols.
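The TCP/IP data transfer described above can be sketched conceptually. The following is a purely illustrative example, not part of the disclosed apparatus: the loopback address, automatic port selection, and payload are assumptions chosen only so the sketch is self-contained.

```python
import socket
import threading

def serve_once(server_sock: socket.socket, payload: bytes) -> None:
    """Accept one connection and send the payload (standing in for software
    or data transferred via a communication interface such as 1540)."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(payload)

def transfer_over_tcp(payload: bytes) -> bytes:
    """Send `payload` over a loopback TCP connection and return what arrives."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))        # the OS picks a free port
    server.listen(1)
    port = server.getsockname()[1]

    # Serve the payload on a background thread while the client receives it.
    t = threading.Thread(target=serve_once, args=(server, payload))
    t.start()

    received = b""
    with socket.create_connection(("127.0.0.1", port)) as client:
        while chunk := client.recv(4096):
            received += chunk
    t.join()
    server.close()
    return received

if __name__ == "__main__":
    data = b"example executable code or data"
    assert transfer_over_tcp(data) == data
```

In practice the payload would arrive from a remote network server rather than a loopback socket, but the transport semantics are the same.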
- Software and data transferred via communication interface 1540 are generally in the form of electrical communication signals 1555. These signals 1555 may be provided to communication interface 1540 via a communication channel 1550.
- Communication channel 1550 may be a wired or wireless network, or any of a variety of other communication links.
- Communication channel 1550 carries signals 1555 and can be implemented using a variety of wired or wireless communication means including wire or cable, fiber optics, conventional phone line, cellular phone link, wireless data communication link, radio frequency (“RF”) link, or infrared link, just to name a few.
- Computer-executable code (i.e., computer programs or software) is stored in main memory 1515 and/or secondary memory 1520. Computer programs can also be received via communication interface 1540 and stored in main memory 1515 and/or secondary memory 1520. Such computer programs, when executed, enable system 1500 to perform the various functions of the disclosed embodiments as described elsewhere herein.
- The term "computer-readable medium" is used to refer to any non-transitory computer-readable storage media used to provide computer-executable code (e.g., software and computer programs) to system 1500.
- Examples of such media include main memory 1515, secondary memory 1520 (including internal memory 1525, removable medium 1530, and external storage medium 1545), and any peripheral device communicatively coupled with communication interface 1540 (including a network information server or other network device).
- These non-transitory computer-readable media are means for providing executable code, programming instructions, and software to system 1500.
- The software may be stored on a computer-readable medium and loaded into system 1500 by way of removable medium 1530, I/O interface 1535, or communication interface 1540.
- The software may also be loaded into system 1500 in the form of electrical communication signals 1555.
- The software, when executed by processor 1510, preferably causes processor 1510 to perform the features and functions described elsewhere herein.
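The load-and-execute path described above can be sketched conceptually. This is an illustrative example only, not part of the disclosure: a temporary file stands in for a removable medium, and the module name, function name, and return value are invented for the demonstration.

```python
import importlib.util
import pathlib
import tempfile
import textwrap

def load_and_run(source: str, func_name: str):
    """Write `source` to a file (standing in for a computer-readable medium),
    load it into the running system as a module, and execute one function."""
    with tempfile.TemporaryDirectory() as d:
        path = pathlib.Path(d) / "stored_program.py"   # hypothetical name
        path.write_text(source)

        # Load the stored program into memory and execute it.
        spec = importlib.util.spec_from_file_location("stored_program", path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)

        return getattr(module, func_name)()

# Illustrative "software stored on the medium".
code_on_medium = textwrap.dedent("""
    def feature():
        return "performed feature"
""")

if __name__ == "__main__":
    assert load_and_run(code_on_medium, "feature") == "performed feature"
```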
- I/O interface 1535 provides an interface between one or more components of system 1500 and one or more input and/or output devices.
- Example input devices include, without limitation, keyboards, touch screens or other touch-sensitive devices, biometric sensing devices, computer mice, trackballs, pen-based pointing devices, and the like.
- Examples of output devices include, without limitation, cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), printers, vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), and the like.
- System 1500 also includes optional wireless communication components that facilitate wireless communication over a voice network and/or a data network.
- The wireless communication components comprise an antenna system 1570, a radio system 1565, and a baseband system 1560.
- Antenna system 1570 may comprise one or more antennae and one or more multiplexors (not shown) that perform a switching function to provide antenna system 1570 with transmit and receive signal paths.
- Received RF signals can be coupled from a multiplexor to a low noise amplifier (not shown) that amplifies the received RF signal and sends the amplified signal to radio system 1565.
- Radio system 1565 may comprise one or more radios that are configured to communicate over various frequencies.
- Radio system 1565 may combine a demodulator (not shown) and a modulator (not shown) in one integrated circuit (IC). The demodulator and modulator can also be separate components. In the incoming path, the demodulator strips away the RF carrier signal, leaving a baseband receive audio signal, which is sent from radio system 1565 to baseband system 1560.
- Baseband system 1560 decodes the received signal and converts it to an analog signal, which is then amplified and sent to a speaker. Baseband system 1560 also receives analog audio signals from a microphone and converts them to digital signals.
- Baseband system 1560 also encodes the digital signals for transmission and generates a baseband transmit audio signal that is routed to the modulator portion of radio system 1565.
- The modulator mixes the baseband transmit audio signal with an RF carrier signal, generating an RF transmit signal that is routed to antenna system 1570 and may pass through a power amplifier (not shown).
- The power amplifier amplifies the RF transmit signal and routes it to antenna system 1570, where the signal is switched to the antenna port for transmission.
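The transmit path described above (a modulator mixing a baseband audio signal with an RF carrier, followed by a power amplifier) can be sketched numerically. This is a purely illustrative model: the sample rate, frequencies, and gain are arbitrary values, not from the disclosure.

```python
import math

def modulate(baseband, carrier_hz, sample_rate, gain=2.0):
    """Mix baseband samples with an RF carrier and apply amplifier gain.

    Models the modulator (multiplication by a carrier) followed by the
    power amplifier (scaling by `gain`); all parameters are illustrative.
    """
    out = []
    for n, sample in enumerate(baseband):
        t = n / sample_rate
        carrier = math.cos(2 * math.pi * carrier_hz * t)
        out.append(gain * sample * carrier)   # mixing + power amplification
    return out

if __name__ == "__main__":
    sample_rate = 48_000
    # A 1 kHz tone standing in for the baseband transmit audio signal.
    baseband = [math.sin(2 * math.pi * 1_000 * n / sample_rate)
                for n in range(480)]
    rf = modulate(baseband, carrier_hz=10_000, sample_rate=sample_rate)
    # The mixed signal is bounded by gain * max(|baseband|).
    assert max(abs(s) for s in rf) <= 2.0
```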
- Baseband system 1560 is also communicatively coupled with processor 1510, which may be a central processing unit (CPU).
- Processor 1510 has access to data storage areas 1515 and 1520.
- Processor 1510 is preferably configured to execute instructions (i.e., computer programs or software) that can be stored in main memory 1515 or secondary memory 1520.
- Computer programs can also be received from baseband system 1560 and stored in main memory 1515 or in secondary memory 1520, or executed upon receipt.
- Such computer programs, when executed, enable system 1500 to perform the various functions of the disclosed embodiments.
- Data storage areas 1515 and 1520 may include various software modules.
- Any of the software components described herein may take a variety of forms.
- A component may be a stand-alone software package, or it may be a software package incorporated as a "tool" in a larger software product. It may be downloadable from a network, for example, a website, as a stand-alone product or as an add-in package for installation in an existing software application. It may also be available as a client-server software application, as a web-enabled software application, and/or as a mobile application.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Microscopes, Condensers (AREA)
- Automatic Focus Adjustment (AREA)
- Studio Devices (AREA)
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3000053A CA3000053C (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
BR112018005822A BR112018005822A2 (en) | 2015-09-24 | 2016-09-23 | real-time focus in line scan imaging |
KR1020187009072A KR20180058730A (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan video |
EP16849812.9A EP3353601A4 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
CN201680055451.2A CN108139650B (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
JP2018515597A JP6865740B2 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
US15/763,061 US10634894B2 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
AU2016326723A AU2016326723B2 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
US16/818,916 US11422350B2 (en) | 2015-09-24 | 2020-03-13 | Real-time focusing in line scan imaging |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562232229P | 2015-09-24 | 2015-09-24 | |
US62/232,229 | 2015-09-24 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/763,061 A-371-Of-International US10634894B2 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
US16/818,916 Continuation US11422350B2 (en) | 2015-09-24 | 2020-03-13 | Real-time focusing in line scan imaging |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017053891A1 (en) | 2017-03-30 |
Family
ID=58387414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2016/053581 WO2017053891A1 (en) | 2015-09-24 | 2016-09-23 | Real-time focusing in line scan imaging |
Country Status (9)
Country | Link |
---|---|
US (2) | US10634894B2 (en) |
EP (1) | EP3353601A4 (en) |
JP (1) | JP6865740B2 (en) |
KR (1) | KR20180058730A (en) |
CN (1) | CN108139650B (en) |
AU (1) | AU2016326723B2 (en) |
BR (1) | BR112018005822A2 (en) |
CA (1) | CA3000053C (en) |
WO (1) | WO2017053891A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019068043A1 (en) | 2017-09-29 | 2019-04-04 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
WO2019068038A1 (en) | 2017-09-29 | 2019-04-04 | Leica Biosystems Imaging, Inc. | Real-time autofocus focusing algorithm |
WO2019108691A1 (en) * | 2017-11-28 | 2019-06-06 | Leica Biosystems Imaging, Inc. | Dual processor image processing |
US11422350B2 (en) | 2015-09-24 | 2022-08-23 | Leica Biosystems Imaging, Inc. | Real-time focusing in line scan imaging |
US11704003B2 (en) | 2019-08-06 | 2023-07-18 | Leica Biosystems Imaging, Inc. | Graphical user interface for slide-scanner control |
US11709155B2 (en) | 2017-09-18 | 2023-07-25 | Waters Technologies Corporation | Use of vapor deposition coated flow paths for improved chromatography of metal interacting analytes |
US11709156B2 (en) | 2017-09-18 | 2023-07-25 | Waters Technologies Corporation | Use of vapor deposition coated flow paths for improved analytical analysis |
US11918936B2 (en) | 2020-01-17 | 2024-03-05 | Waters Technologies Corporation | Performance and dynamic range for oligonucleotide bioanalysis through reduction of non specific binding |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019062462A (en) * | 2017-09-27 | 2019-04-18 | キヤノン株式会社 | Display unit and control apparatus thereof, and control method therefor |
CN111133359B (en) * | 2017-09-29 | 2022-12-13 | 徕卡生物系统成像股份有限公司 | Two-dimensional and three-dimensional stationary Z-scan |
US11055105B2 (en) * | 2018-08-31 | 2021-07-06 | Micron Technology, Inc. | Concurrent image measurement and execution |
JPWO2020066043A1 (en) | 2018-09-28 | 2021-08-30 | オリンパス株式会社 | Microscope system, projection unit, and image projection method |
CN112714888B (en) | 2018-09-28 | 2023-02-17 | 仪景通株式会社 | Microscope system, projection unit, and image projection method |
CN112714886B (en) * | 2018-09-28 | 2023-03-21 | 仪景通株式会社 | Microscope system, projection unit, and image projection method |
JP7150867B2 (en) | 2018-09-28 | 2022-10-11 | 株式会社エビデント | microscope system |
CN109342322A (en) * | 2018-10-23 | 2019-02-15 | 南京光声超构材料研究院有限公司 | Auto focusing method in time domain heat reflection spectrometry |
CN109905618B (en) * | 2019-03-26 | 2021-03-16 | 中国科学院长春光学精密机械与物理研究所 | Sandwich imaging unit structure and one-step sample design method |
CN113366364A (en) | 2019-08-06 | 2021-09-07 | 徕卡生物系统成像股份有限公司 | Real-time focusing in slide scanning system |
US11053540B1 (en) | 2020-01-17 | 2021-07-06 | Element Biosciences, Inc. | High performance fluorescence imaging module for genomic testing assay |
CN111866387B (en) * | 2020-07-27 | 2021-11-02 | 支付宝(杭州)信息技术有限公司 | Depth image imaging system and method |
JP2022049808A (en) * | 2020-09-17 | 2022-03-30 | オリンパス株式会社 | Photographing device, photographing system, and control method |
CN111932542B (en) * | 2020-10-14 | 2021-02-02 | 深圳市瑞图生物技术有限公司 | Image identification method and device based on multiple focal lengths and storage medium |
JP2022127536A (en) * | 2021-02-19 | 2022-08-31 | 株式会社キーエンス | Enlarging observation device, enlarged image observation method, enlarged image observation program, and computer-readable recording medium, and apparatus storing program |
CN113031243B (en) * | 2021-03-26 | 2021-08-31 | 杭州辰景光电科技有限公司 | Reflective on-chip digital holographic microscopic device based on waveguide sheet |
CN113281874A (en) * | 2021-06-30 | 2021-08-20 | 成都易迅光电科技有限公司 | Telescopic multi-lens module |
KR102717663B1 (en) | 2021-08-11 | 2024-10-15 | 주식회사 뷰웍스 | Image acquisition device and method for determining focus position using the same |
CN117908219A (en) * | 2024-03-06 | 2024-04-19 | 东莞市沃德普自动化科技有限公司 | Focusing module, automatic focusing microscopic imaging system and method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050105174A1 (en) * | 2003-10-03 | 2005-05-19 | Nikon Corporation | Microscope system |
US20080137938A1 (en) * | 2006-12-11 | 2008-06-12 | Cytyc Corporation | Method for assessing image focus quality |
US20130093874A1 (en) * | 2010-06-24 | 2013-04-18 | Koninklijke Philips Electronics N.V. | Autofocus based on differential measurements |
WO2013165576A1 (en) | 2012-05-02 | 2013-11-07 | Aperio Technologies, Inc. | Real-time focusing in line scan imaging |
Family Cites Families (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3967110A (en) * | 1973-09-21 | 1976-06-29 | Corning Glass Works | Automatic focusing system |
US4636627A (en) | 1983-01-17 | 1987-01-13 | Canon Kabushiki Kaisha | Focus detecting apparatus |
US5756307A (en) | 1991-09-20 | 1998-05-26 | The United States Of America As Represented By The Department Of Health And Human Services | Sequence of human dopamine transporter cDNA |
JP2966311B2 (en) * | 1995-03-10 | 1999-10-25 | 科学技術振興事業団 | Microphotometer |
JPH08320430A (en) * | 1995-05-23 | 1996-12-03 | Nikon Corp | Automatic focus detector |
JPH09189850A (en) * | 1996-01-09 | 1997-07-22 | Olympus Optical Co Ltd | Automatic focusing microscope |
JPH1152224A (en) | 1997-06-04 | 1999-02-26 | Hitachi Ltd | Automatic focus detecting method and device and inspecting device |
US6091075A (en) | 1997-06-04 | 2000-07-18 | Hitachi, Ltd. | Automatic focus detection method, automatic focus detection apparatus, and inspection apparatus |
US20060060781A1 (en) | 1997-08-11 | 2006-03-23 | Masahiro Watanabe | Charged-particle beam apparatus and method for automatically correcting astigmatism and for height detection |
US6640014B1 (en) * | 1999-01-22 | 2003-10-28 | Jeffrey H. Price | Automatic on-the-fly focusing for continuous image acquisition in high-resolution microscopy |
US6839469B2 (en) * | 2000-01-21 | 2005-01-04 | Lam K. Nguyen | Multiparallel three dimensional optical microscopy system |
US6724489B2 (en) | 2000-09-22 | 2004-04-20 | Daniel Freifeld | Three dimensional scanning camera |
US7009651B2 (en) | 2000-10-12 | 2006-03-07 | Amnis Corporation | System and method for high numeric aperture imaging systems |
DE10127284A1 (en) | 2001-06-05 | 2002-12-12 | Zeiss Carl Jena Gmbh | Automatic focussing of a microscope using an analysis unit that compares measured values with stored design values and adjusts the microscope accordingly |
JP2004046132A (en) | 2002-05-17 | 2004-02-12 | Olympus Corp | Automatic focusing system |
US6760154B1 (en) | 2002-06-04 | 2004-07-06 | Biotechs, Inc. | Microscope system with continuous autofocus |
JP4370554B2 (en) | 2002-06-14 | 2009-11-25 | 株式会社ニコン | Autofocus device and microscope with autofocus |
DE10319182B4 (en) | 2003-04-29 | 2008-06-12 | Carl Zeiss Jena Gmbh | Method and arrangement for determining the focus position when imaging a sample |
US7330574B2 (en) | 2003-05-08 | 2008-02-12 | Ometrix, Inc. | Best-focus estimation by lateral scanning |
US20050089208A1 (en) | 2003-07-22 | 2005-04-28 | Rui-Tao Dong | System and method for generating digital images of a microscope slide |
JP2005070225A (en) | 2003-08-21 | 2005-03-17 | Tokyo Seimitsu Co Ltd | Surface image projector and the surface image projection method |
US7115890B2 (en) | 2003-11-18 | 2006-10-03 | Applied Materials, Inc. | Method and apparatus for inspecting a sample having a height measurement ahead of a focal area |
US7813579B2 (en) * | 2004-05-24 | 2010-10-12 | Hamamatsu Photonics K.K. | Microscope system |
US7232980B2 (en) * | 2004-05-24 | 2007-06-19 | Hamamatsu Photonics K.K. | Microscope system |
US8164622B2 (en) | 2005-07-01 | 2012-04-24 | Aperio Technologies, Inc. | System and method for single optical axis multi-detector microscope slide scanner |
JP4680052B2 (en) | 2005-12-22 | 2011-05-11 | シスメックス株式会社 | Specimen imaging apparatus and specimen analyzer including the same |
EP1989508A4 (en) | 2006-02-10 | 2009-05-20 | Monogen Inc | Method and apparatus and computer program product for collecting digital image data from microscope media-based specimens |
JP5094048B2 (en) | 2006-06-09 | 2012-12-12 | 株式会社日立ハイテクノロジーズ | Appearance inspection device |
DE102006027836B4 (en) * | 2006-06-16 | 2020-02-20 | Carl Zeiss Microscopy Gmbh | Microscope with auto focus device |
DE102007017598A1 (en) | 2007-04-13 | 2008-10-16 | Carl Zeiss Microimaging Gmbh | Method and arrangement for positioning a light sheet in the focal plane of a detection optical system |
US7576307B2 (en) | 2007-04-30 | 2009-08-18 | General Electric Company | Microscope with dual image sensors for rapid autofocusing |
US8179432B2 (en) | 2007-04-30 | 2012-05-15 | General Electric Company | Predictive autofocusing |
US8878923B2 (en) | 2007-08-23 | 2014-11-04 | General Electric Company | System and method for enhanced predictive autofocusing |
JP4885156B2 (en) | 2008-02-07 | 2012-02-29 | 株式会社日立製作所 | Focus control apparatus and method |
WO2010067256A1 (en) | 2008-12-09 | 2010-06-17 | Koninklijke Philips Electronics N.V. | Autofocus for a microscope system. |
ATE551841T1 (en) | 2009-04-22 | 2012-04-15 | Raytrix Gmbh | DIGITAL IMAGING METHOD FOR SYNTHESIZING AN IMAGE USING DATA RECORDED BY A PLENOPTIC CAMERA |
US8304704B2 (en) | 2009-07-27 | 2012-11-06 | Sensovation Ag | Method and apparatus for autofocus using a light source pattern and means for masking the light source pattern |
SG187479A1 (en) | 2009-10-19 | 2013-02-28 | Ventana Med Syst Inc | Imaging system and techniques |
ES2561937T3 (en) | 2009-12-30 | 2016-03-01 | Koninklijke Philips N.V. | Sensor for microscopy |
US10061108B2 (en) | 2010-05-18 | 2018-08-28 | Koninklijke Philips N.V. | Autofocus imaging for a microscope |
US8175452B1 (en) | 2010-10-26 | 2012-05-08 | Complete Genomics, Inc. | Method and system for imaging high density biochemical arrays with sub-pixel alignment |
EP2715321A4 (en) | 2011-05-25 | 2014-10-29 | Huron Technologies Internat Inc | 3d pathology slide scanner |
GB201113071D0 (en) * | 2011-07-29 | 2011-09-14 | Ffei Ltd | Method and apparatus for image scanning |
IN2014DN03134A (en) | 2011-09-21 | 2015-05-22 | Huron Technologies Internat Inc | |
US9402036B2 (en) | 2011-10-17 | 2016-07-26 | Rudolph Technologies, Inc. | Scanning operation with concurrent focus and inspection |
WO2014112086A1 (en) | 2013-01-17 | 2014-07-24 | 浜松ホトニクス株式会社 | Image acquisition device and focus method for image acquisition device |
US10298833B2 (en) | 2011-12-19 | 2019-05-21 | Hamamatsu Photonics K.K. | Image capturing apparatus and focusing method thereof |
JP6019998B2 (en) | 2012-02-17 | 2016-11-02 | ソニー株式会社 | Imaging apparatus, imaging control program, and imaging method |
GB2505691B (en) | 2012-09-07 | 2018-02-21 | Ffei Ltd | Method and apparatus for image scanning |
EP2947487A4 (en) * | 2013-01-17 | 2016-08-24 | Hamamatsu Photonics Kk | Image acquisition device and focus method for image acquisition device |
JP2014178474A (en) | 2013-03-14 | 2014-09-25 | Sony Corp | Digital microscope apparatus, focusing position searching method therefor, and program |
CN105143952B (en) | 2013-04-26 | 2018-09-28 | 浜松光子学株式会社 | The focus method of image capturing device and image capturing device |
JP6010505B2 (en) | 2013-06-11 | 2016-10-19 | 浜松ホトニクス株式会社 | Image acquisition device and focus method of image acquisition device |
JP6394850B2 (en) | 2013-09-20 | 2018-09-26 | 大学共同利用機関法人自然科学研究機構 | Compensating optical system and optical apparatus |
TWI515503B (en) | 2013-12-09 | 2016-01-01 | 聯詠科技股份有限公司 | Automatic-focusing imaging capture device and imaging capture method |
AU2016326723B2 (en) | 2015-09-24 | 2021-11-11 | Leica Biosystems Imaging, Inc. | Real-time focusing in line scan imaging |
- 2016
- 2016-09-23 AU AU2016326723A patent/AU2016326723B2/en active Active
- 2016-09-23 WO PCT/US2016/053581 patent/WO2017053891A1/en active Application Filing
- 2016-09-23 EP EP16849812.9A patent/EP3353601A4/en active Pending
- 2016-09-23 BR BR112018005822A patent/BR112018005822A2/en not_active Application Discontinuation
- 2016-09-23 CN CN201680055451.2A patent/CN108139650B/en active Active
- 2016-09-23 CA CA3000053A patent/CA3000053C/en active Active
- 2016-09-23 US US15/763,061 patent/US10634894B2/en active Active
- 2016-09-23 JP JP2018515597A patent/JP6865740B2/en active Active
- 2016-09-23 KR KR1020187009072A patent/KR20180058730A/en not_active Application Discontinuation
- 2020
- 2020-03-13 US US16/818,916 patent/US11422350B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050105174A1 (en) * | 2003-10-03 | 2005-05-19 | Nikon Corporation | Microscope system |
US20080137938A1 (en) * | 2006-12-11 | 2008-06-12 | Cytyc Corporation | Method for assessing image focus quality |
US20130093874A1 (en) * | 2010-06-24 | 2013-04-18 | Koninklijke Philips Electronics N.V. | Autofocus based on differential measurements |
WO2013165576A1 (en) | 2012-05-02 | 2013-11-07 | Aperio Technologies, Inc. | Real-time focusing in line scan imaging |
Non-Patent Citations (1)
Title |
---|
See also references of EP3353601A4 |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11422350B2 (en) | 2015-09-24 | 2022-08-23 | Leica Biosystems Imaging, Inc. | Real-time focusing in line scan imaging |
US11709156B2 (en) | 2017-09-18 | 2023-07-25 | Waters Technologies Corporation | Use of vapor deposition coated flow paths for improved analytical analysis |
US11709155B2 (en) | 2017-09-18 | 2023-07-25 | Waters Technologies Corporation | Use of vapor deposition coated flow paths for improved chromatography of metal interacting analytes |
EP3625600A4 (en) * | 2017-09-29 | 2021-01-27 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
WO2019068038A1 (en) | 2017-09-29 | 2019-04-04 | Leica Biosystems Imaging, Inc. | Real-time autofocus focusing algorithm |
CN111149037A (en) * | 2017-09-29 | 2020-05-12 | 徕卡生物系统成像股份有限公司 | Real-time automatic focusing algorithm |
CN114779459A (en) * | 2017-09-29 | 2022-07-22 | 徕卡生物系统成像股份有限公司 | Real-time autofocus scanning |
US12078790B2 (en) | 2017-09-29 | 2024-09-03 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
JP2020535477A (en) * | 2017-09-29 | 2020-12-03 | ライカ バイオシステムズ イメージング インコーポレイテッドLeica Biosystems Imaging, Inc. | Real-time autofocus scanning |
WO2019068043A1 (en) | 2017-09-29 | 2019-04-04 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
US11422351B2 (en) | 2017-09-29 | 2022-08-23 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
AU2018339006B2 (en) * | 2017-09-29 | 2024-02-15 | Leica Biosystems Imaging, Inc. | Real-time autofocus focusing algorithm |
EP3625602A4 (en) * | 2017-09-29 | 2021-02-24 | Leica Biosystems Imaging, Inc. | Real-time autofocus focusing algorithm |
CN115061271B (en) * | 2017-09-29 | 2024-02-02 | 徕卡生物系统成像股份有限公司 | Real-time auto-focusing algorithm |
CN111183385B (en) * | 2017-09-29 | 2022-04-08 | 徕卡生物系统成像股份有限公司 | Real-time autofocus scanning |
KR102411099B1 (en) | 2017-09-29 | 2022-06-22 | 라이카 바이오시스템즈 이미징 인크. | Real-time autofocus scanning |
EP4254037A3 (en) * | 2017-09-29 | 2023-12-27 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
KR102419163B1 (en) * | 2017-09-29 | 2022-07-08 | 라이카 바이오시스템즈 이미징 인크. | Real-time autofocus focusing algorithm |
CN111149037B (en) * | 2017-09-29 | 2022-07-12 | 徕卡生物系统成像股份有限公司 | Real-time automatic focusing algorithm |
CN111183385A (en) * | 2017-09-29 | 2020-05-19 | 徕卡生物系统成像股份有限公司 | Real-time autofocus scanning |
AU2018339011B2 (en) * | 2017-09-29 | 2023-11-02 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
KR20220092999A (en) * | 2017-09-29 | 2022-07-04 | 라이카 바이오시스템즈 이미징 인크. | A digital scanning apparatus |
KR20200041982A (en) * | 2017-09-29 | 2020-04-22 | 라이카 바이오시스템즈 이미징 인크. | Real-time autofocus scanning |
CN115061271A (en) * | 2017-09-29 | 2022-09-16 | 徕卡生物系统成像股份有限公司 | Real-time automatic focusing algorithm |
US11454781B2 (en) | 2017-09-29 | 2022-09-27 | Leica Biosystems Imaging, Inc. | Real-time autofocus focusing algorithm |
KR102523559B1 (en) | 2017-09-29 | 2023-04-19 | 라이카 바이오시스템즈 이미징 인크. | A digital scanning apparatus |
EP4254037A2 (en) | 2017-09-29 | 2023-10-04 | Leica Biosystems Imaging, Inc. | Real-time autofocus scanning |
KR20200041983A (en) * | 2017-09-29 | 2020-04-22 | 라이카 바이오시스템즈 이미징 인크. | Real-time autofocus focusing algorithm |
WO2019108691A1 (en) * | 2017-11-28 | 2019-06-06 | Leica Biosystems Imaging, Inc. | Dual processor image processing |
AU2018375358B2 (en) * | 2017-11-28 | 2021-02-04 | Leica Biosystems Imaging, Inc. | Dual processor image processing |
US11422349B2 (en) | 2017-11-28 | 2022-08-23 | Leica Biosystems Imaging, Inc. | Dual processor image processing |
CN111279242B (en) * | 2017-11-28 | 2022-03-29 | 徕卡生物系统成像股份有限公司 | Dual processor image processing |
EP3625611A4 (en) * | 2017-11-28 | 2021-02-24 | Leica Biosystems Imaging, Inc. | Dual processor image processing |
CN111279242A (en) * | 2017-11-28 | 2020-06-12 | 徕卡生物系统成像股份有限公司 | Dual processor image processing |
US11704003B2 (en) | 2019-08-06 | 2023-07-18 | Leica Biosystems Imaging, Inc. | Graphical user interface for slide-scanner control |
US11918936B2 (en) | 2020-01-17 | 2024-03-05 | Waters Technologies Corporation | Performance and dynamic range for oligonucleotide bioanalysis through reduction of non specific binding |
Also Published As
Publication number | Publication date |
---|---|
US10634894B2 (en) | 2020-04-28 |
US20180275388A1 (en) | 2018-09-27 |
CA3000053A1 (en) | 2017-03-30 |
EP3353601A4 (en) | 2019-10-02 |
CN108139650A (en) | 2018-06-08 |
CN108139650B (en) | 2020-10-30 |
JP6865740B2 (en) | 2021-04-28 |
US20200218053A1 (en) | 2020-07-09 |
JP2018534610A (en) | 2018-11-22 |
KR20180058730A (en) | 2018-06-01 |
AU2016326723B2 (en) | 2021-11-11 |
BR112018005822A2 (en) | 2018-10-09 |
US11422350B2 (en) | 2022-08-23 |
CA3000053C (en) | 2023-07-04 |
EP3353601A1 (en) | 2018-08-01 |
AU2016326723A1 (en) | 2018-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11422350B2 (en) | Real-time focusing in line scan imaging | |
US11243387B2 (en) | Real-time focusing in line scan imaging | |
US9729749B2 (en) | Data management in a linear-array-based microscope slide scanner | |
US9235041B2 (en) | System and method for single optical axis multi-detector microscope slide scanner | |
EP2737453B1 (en) | Standardizing fluorescence microscopy systems | |
EP1989583B1 (en) | System and method for single optical axis multi-detector microscope slide scanner |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16849812; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 15763061; Country of ref document: US. Ref document number: 2018515597; Country of ref document: JP |
ENP | Entry into the national phase | Ref document number: 3000053; Country of ref document: CA |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 20187009072; Country of ref document: KR; Kind code of ref document: A |
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018005822; Country of ref document: BR |
ENP | Entry into the national phase | Ref document number: 2016326723; Country of ref document: AU; Date of ref document: 20160923; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 2016849812; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 112018005822; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20180323 |