US10422987B2 - Image acquisition device and image acquisition method for image acquisition device


Info

Publication number
US10422987B2
Authority
US
United States
Legal status
Active
Application number
US15/030,198
Other versions
US20160252717A1 (en)
Inventor
Fumio Iwase
Current Assignee
Hamamatsu Photonics KK
Original Assignee
Hamamatsu Photonics KK
Application filed by Hamamatsu Photonics KK filed Critical Hamamatsu Photonics KK
Assigned to HAMAMATSU PHOTONICS K.K. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IWASE, FUMIO
Publication of US20160252717A1
Application granted
Publication of US10422987B2


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/365 Control or image processing arrangements for digital or video microscopes
    • G02B 21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G02B 21/06 Means for illuminating specimens
    • G02B 21/08 Condensers
    • G02B 21/086 Condensers for transillumination only
    • G02B 21/24 Base structure
    • G02B 21/241 Devices for focusing
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G06T 5/002
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/689 Motion occurring during a rolling shutter mode
    • H04N 5/23212
    • H04N 5/2329


Abstract

In an image acquisition device, an optical image of a sample irradiated with instantaneous light is captured using an imaging element capable of rolling reading. Therefore, the image can be captured with a sufficient S/N ratio even when the optical image is weak. Further, the image acquisition device includes a light source control unit that causes the instantaneous light to be emitted from a light source during a period of time from the start of exposure of the pixel row of which the reading last starts to the start of reading of the pixel row of which the reading first starts. That is, in the image acquisition device, the light source control unit causes the sample to be irradiated with the instantaneous light from the light source during a period of time in which all pixel rows of the imaging element are exposed.

Description

TECHNICAL FIELD
The present invention relates to an image acquisition device and an image acquisition method for an image acquisition device.
BACKGROUND ART
In an image acquisition device for acquiring still images of a sample such as tissue cells, when the sample is larger than an imaging field of view of an imaging element, for example, partial images of the sample are sequentially acquired while moving the sample relative to the objective lens, and then, the partial images are combined so as to acquire an image of the entire sample (see, for example, Patent Literature 1 to Patent Literature 3).
CITATION LIST
Patent Literature
[Patent Literature 1] Japanese Unexamined Patent Publication No. 2003-222801
[Patent Literature 2] Japanese Unexamined Patent Publication No. 2000-501844
[Patent Literature 3] Japanese Unexamined Patent Publication No. S63-191063
SUMMARY OF INVENTION
Technical Problem
Imaging methods for the image acquisition device described above include a method using global reading, in which exposure and reading are performed at the same time in all pixels of an imaging element, and a method using rolling reading, in which reading is performed sequentially for each pixel row of an imaging element. Among these, the method using rolling reading, which is advantageous for improving the S/N ratio, has been studied. However, with this method, an optical image of a sample is received at a different timing in each pixel row. Accordingly, when a moving sample is imaged, the obtained still image may be distorted.
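The timing skew behind this distortion can be made concrete with a short sketch that is not part of the patent disclosure; the row period, row count, stage speed, and pixel size below are assumed values chosen only for illustration.

    # Illustrative model of rolling reading: each pixel row starts exposing one
    # row period after the previous one, so each row sees the sample at a
    # slightly different time (all numbers are assumptions, not from the patent).
    ROW_PERIOD = 10e-6        # assumed offset between consecutive rows (s)
    SAMPLE_SPEED_UM_S = 1000  # assumed stage speed (micrometres per second)
    PIXEL_SIZE_UM = 0.25      # assumed sample-plane pixel size (micrometres)

    def exposure_offset(row_i, row_j, row_period=ROW_PERIOD):
        """Time offset between the exposure windows of two pixel rows."""
        return abs(row_j - row_i) * row_period

    # Skew between the first and last of 2048 rows for a moving sample:
    dt = exposure_offset(0, 2047)
    shear_px = SAMPLE_SPEED_UM_S * dt / PIXEL_SIZE_UM
    print(f"rows 0 and 2047 exposed {dt * 1e3:.2f} ms apart -> ~{shear_px:.0f} px shear")

With these assumed numbers the last row sees the sample roughly 82 pixels away from where the first row saw it, which is the distortion the invention suppresses by confining the illumination to the interval in which all rows are exposed.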
The present invention has been made to solve the above problem, and an object thereof is to provide an image acquisition device and an image acquisition method for an image acquisition device capable of acquiring, with a sufficient S/N ratio, a still image of a moving sample in which distortion is suppressed.
Solution to Problem
In order to solve the above problem, an image acquisition device according to the present invention includes a stage on which a sample is placed; a light emitting means for emitting instantaneous light; a light guiding optical system including an objective lens arranged to face the sample on the stage; a driving unit for moving a focal position of the objective lens relative to the sample in an optical axis direction of the objective lens; an imaging element for capturing an optical image of the sample guided by the light guiding optical system; an image processing unit for processing image data output from the imaging element; and a control unit for controlling the light emitting means, wherein the imaging element is a two-dimensional imaging element that includes a plurality of pixel rows and is capable of rolling reading, the control unit causes the instantaneous light to be emitted from the light emitting means during a period of time from the start of exposure of the pixel row of which the reading last starts in the rolling reading to the start of the reading of the pixel row of which the reading first starts, and the image processing unit sequentially acquires image data from the imaging element and combines the acquired image data to form combined image data.
In the image acquisition device, an optical image of the sample irradiated with the instantaneous light is captured using the two-dimensional imaging element capable of rolling reading. Therefore, noise in the imaging element can be reduced and the image can be captured with a sufficient S/N ratio even when the optical image is weak. Further, this image acquisition device includes the control unit that causes the instantaneous light to be emitted from the light emitting means from the start of exposure of the pixel row of which reading last starts in the rolling reading to the start of reading of the pixel row of which the reading first starts. That is, in this image acquisition device, the control unit causes the sample to be irradiated with the instantaneous light from the light emitting means during a period of time in which all of the pixel rows of the imaging element capable of rolling reading are exposed. Accordingly, since an optical image of the sample is received in the respective pixel rows at the same timing, it is possible to acquire a still image of which the distortion is suppressed.
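As an illustration of the condition the control unit enforces, the following minimal sketch assumes a simple linear rolling readout; the function names and numeric values are not taken from the disclosure.

    def all_rows_exposed_window(n_rows, exposure_time, row_period):
        """Interval during which every pixel row is exposing simultaneously.

        T1: exposure start of the pixel row whose reading starts last (row n_rows - 1).
        T2: reading start of the pixel row whose reading starts first (row 0).
        """
        t1 = (n_rows - 1) * row_period
        t2 = exposure_time
        if t1 >= t2:
            raise ValueError("exposure too short: the rows never all expose together")
        return t1, t2

    def flash_fits(flash_start, flash_end, t1, t2):
        """The instantaneous light must fall entirely inside [T1, T2]."""
        return t1 <= flash_start and flash_end <= t2

    t1, t2 = all_rows_exposed_window(n_rows=2048, exposure_time=30e-3, row_period=10e-6)
    print(flash_fits(flash_start=t1 + 1e-3, flash_end=t1 + 3e-3, t1=t1, t2=t2))  # True

Under these assumed values the window runs from about 20.5 ms to 30 ms after the first row starts exposing, so a flash of a few milliseconds fits comfortably inside it.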
It is preferable that the imaging element outputs a trigger signal indicating the start of exposure of the pixel row of which the reading last starts in the rolling reading to the control unit, and the control unit starts the irradiation of the instantaneous light based on the trigger signal output from the imaging element. In this case, it is possible to cause the sample to be irradiated with the instantaneous light more reliably during a period of time in which all of the pixel rows are exposed.
It is preferable for the control unit to control the light emitting means so that there is a predetermined period of time from the end of the irradiation of the instantaneous light to the start of the reading of the pixel row of which the reading first starts in the rolling reading. Accordingly, even when there is delayed light arriving at the imaging element with a delay after the end of the irradiation of the instantaneous light, the delayed light is received during the predetermined period of time. Accordingly, the optical image of the sample can be received in all of the pixel rows under the same conditions.
It is preferable for the control unit to control the light emitting means so that there is a predetermined period of time from the start of exposure of the pixel row of which the reading last starts in the rolling reading to the start of the irradiation of the instantaneous light. In this case, it is possible to cause the sample to be irradiated with the instantaneous light more reliably during a period of time in which all of the pixel rows are exposed.
It is preferable for the driving unit to move a focal position of the objective lens to a focusing position during a period of time from the start of reading of the pixel row of which the reading first starts in the rolling reading to the start of exposure of the pixel row of which the reading last starts. In this case, since focusing of the objective lens can be performed using a period of time in which each pixel row does not capture the optical image of the sample, faster acquisition of still images can be realized.
It is preferable for the driving unit to move the focal position of the objective lens so that there is a predetermined period of time from the end of the movement of the focal position of the objective lens to the start of exposure of the pixel row of which the reading last starts in the rolling reading. In this case, it is possible to reduce an influence of vibration due to driving of the objective lens or the stage when the image data is acquired.
An image acquisition method for an image acquisition device according to the present invention is an image acquisition method for an image acquisition device including a stage on which a sample is placed; a light emitting means for emitting instantaneous light; a light guiding optical system including an objective lens arranged to face the sample on the stage; a driving unit for moving a focal position of the objective lens relative to the sample in an optical axis direction of the objective lens; an imaging element for capturing an optical image of the sample guided by the light guiding optical system; an image processing unit for processing image data output from the imaging element; and a control unit for controlling the light emitting means, the image acquisition method including: using, as the imaging element, a two-dimensional imaging element that includes a plurality of pixel rows and is capable of rolling reading; causing, by the control unit, the instantaneous light to be emitted from the light emitting means during a period of time from the start of exposure of the pixel row of which the reading last starts in the rolling reading to the start of the reading of the pixel row of which the reading first starts; and sequentially acquiring, by the image processing unit, image data from the imaging element, and combining the acquired image data to form combined image data.
In this image acquisition method for an image acquisition device, an optical image of the sample irradiated with the instantaneous light is captured using the two-dimensional imaging element capable of rolling reading. Therefore, noise in the imaging element can be reduced and the image can be captured with a sufficient S/N ratio even when the optical image is weak. Further, in this image acquisition method for an image acquisition device, instantaneous light is caused to be emitted from the light emitting means by the control unit from the start of exposure of the pixel row of which reading last starts in the rolling reading to the start of reading of the pixel row of which the reading first starts. That is, in this image acquisition method for an image acquisition device, the sample is caused to be irradiated with the instantaneous light from the light emitting means by the control unit during a period of time in which all of the pixel rows of the imaging element capable of rolling reading are exposed. Accordingly, since an optical image of the sample is received in the respective pixel rows at the same timing, it is possible to acquire a still image of which the distortion is suppressed.
Advantageous Effects of Invention
According to the present invention, it is possible to acquire a still image of which the distortion is suppressed with respect to a moving sample with a sufficient S/N ratio.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a diagram illustrating an embodiment of an image acquisition device according to the present invention.
FIGS. 2A and 2B are diagrams illustrating an example of an imaging element, where FIG. 2A illustrates a light reception surface of the imaging element, and FIG. 2B illustrates a relationship between rolling reading in the imaging element and, for example, irradiation of instantaneous light.
FIG. 3 is a diagram illustrating an example of scan of an image acquisition region for a sample.
FIG. 4 is a diagram illustrating an example of a configuration of pixel rows in the imaging element.
FIGS. 5A and 5B are diagrams illustrating an imaging element according to a modification example, where FIG. 5A illustrates a light reception surface of the imaging element, and FIG. 5B illustrates a relationship between rolling reading in the imaging element and irradiation of instantaneous light.
FIGS. 6A and 6B are diagrams illustrating an imaging element according to another modification example, where FIG. 6A illustrates a light reception surface of the imaging element, and FIG. 6B illustrates a relationship between rolling reading in the imaging element and irradiation of instantaneous light.
DESCRIPTION OF EMBODIMENTS
Hereinafter, preferred embodiments of an image acquisition device according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a diagram illustrating an embodiment of an image acquisition device according to the present invention. As illustrated in FIG. 1, an image acquisition device 1 includes a stage 2 on which a sample S is placed, a light source 3 (light emitting means) that irradiates the sample with instantaneous light, a light guiding optical system 5 including an objective lens 25 arranged to face the sample S on the stage 2, and an imaging element 6 that captures an optical image of the sample S guided by the light guiding optical system 5.
Further, the image acquisition device 1 includes a stage driving unit 11 that moves a position of a field of view of the objective lens 25 relative to the sample S, an objective lens driving unit 12 (driving unit) that changes a focal position of the objective lens 25 relative to the sample S, a light source control unit 13 (control unit) that controls the light source 3, and an image processing unit 14.
The sample S observed by the image acquisition device 1 is, for example, a biological sample, such as tissue cells, and is placed on the stage 2 in a state in which the sample S is sealed in a glass slide. The light source 3 is arranged on the bottom side of the stage 2. For example, a laser diode (LD), a light emitting diode (LED), a super luminescent diode (SLD), a flash lamp light source such as a xenon flash lamp, or the like is used as the light source 3.
The light guiding optical system 5 includes an illumination optical system 21 arranged between the light source 3 and the stage 2, and a microscope optical system 22 arranged between the stage 2 and the imaging element 6. The illumination optical system 21 includes, for example, a Koehler illumination optical system including a condenser lens 23 and a projection lens 24, and guides light from the light source 3 and irradiates the sample S with uniform light.
Meanwhile, the microscope optical system 22 includes the objective lens 25, and an image forming lens 26 arranged on the downstream side (imaging element 6 side) of the objective lens 25, and guides an optical image of the sample S to the imaging element 6. The optical image of the sample S is an image formed by transmitted light in the case of bright field illumination, scattered light in the case of dark field illumination, or emitted light (fluorescence) in the case of emitted light measurement. Further, the optical image of the sample S may also be an image formed by reflected light from the sample S. In these cases, an optical system corresponding to image acquisition of the transmitted optical image, the scattered optical image, and the emitted light (fluorescence) image of the sample S can be adopted as the light guiding optical system 5.
The imaging element 6 is a two-dimensional imaging element that has a plurality of pixel rows and is capable of rolling reading. Such an imaging element 6 may include, for example, a CMOS image sensor. A plurality of pixel rows 31 (a first pixel row 31 1, a second pixel row 31 2, a third pixel row 31 3, . . . , an N-th pixel row 31 N) in which a plurality of pixels are arranged in a direction perpendicular to a reading direction are arranged in the reading direction on a light reception surface 6 a of the imaging element 6, as illustrated in FIG. 2A. Since sequential exposure and reading are performed in each pixel row of the imaging element 6, for example, as illustrated in FIG. 2B, a period of time of exposure and a period of time of reading in each pixel row are different.
The stage driving unit 11 includes, for example, a motor such as a stepping motor (pulse motor) or an actuator such as a piezoelectric actuator. The stage driving unit 11 drives the stage 2 in an XY direction relative to a surface having a predetermined angle (for example, 90°) with respect to a plane perpendicular to an optical axis of the objective lens 25. Accordingly, the sample S fixed to the stage 2 is moved relative to the optical axis of the objective lens, and a position of the field of view of the objective lens 25 relative to the sample S is moved.
In the image acquisition device 1, imaging of the sample S is performed at a high magnification such as 20× or 40×. Therefore, the field of view V of the objective lens 25 is smaller than the sample S and, as illustrated in FIG. 3, the region in which an image can be acquired by a single imaging operation is smaller than the sample S. Accordingly, in order to image the entire sample S, the field of view V of the objective lens 25 must be moved relative to the sample S.
Therefore, in the image acquisition device 1, an image acquisition region 32 is set to include the sample S with respect to a sample container (for example, a glass slide) holding the sample S, and positions of a plurality of divided regions 33 are set based on the image acquisition region 32 and the field of view V of the objective lens 25 on the sample S. A portion of the sample S corresponding to one divided region 33 is imaged so as to acquire partial image data corresponding to that divided region 33, and then, when the field of view V of the objective lens 25 reaches the position of the divided region 33 to be imaged next, imaging is performed again to acquire the next partial image data. During this operation, the stage driving unit 11 continues to drive the stage 2 at a constant velocity. Thereafter, in the image acquisition device 1, this operation is repeatedly executed, and the image processing unit 14 combines the acquired partial image data to form the entire image (combined image data) of the sample S.
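The tiling and combining step can be sketched as follows; the pixel-based grid layout, the paste-style combination, and all names are illustrative assumptions rather than the method prescribed by the embodiment.

    import numpy as np

    def divided_region_origins(region_w, region_h, fov_w, fov_h):
        """Origins (x, y) of the divided regions 33 tiling the image acquisition
        region 32, one field of view per region (all sizes in pixels)."""
        cols = -(-region_w // fov_w)   # ceiling division
        rows = -(-region_h // fov_h)
        return [(c * fov_w, r * fov_h) for r in range(rows) for c in range(cols)]

    def combine(partials, origins, fov_w, fov_h):
        """Paste each partial image at its origin to form the combined image data."""
        width = max(x for x, _ in origins) + fov_w
        height = max(y for _, y in origins) + fov_h
        canvas = np.zeros((height, width), dtype=partials[0].dtype)
        for img, (x, y) in zip(partials, origins):
            canvas[y:y + fov_h, x:x + fov_w] = img
        return canvas

With, for example, a 5000 × 5000 pixel acquisition region and a 2048 × 2048 pixel field of view, this yields a 3 × 3 grid of divided regions, one flash and one partial image per region while the stage moves at constant velocity.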
The objective lens driving unit 12 includes, for example, a motor such as a stepping motor (pulse motor) or an actuator such as a piezoelectric actuator, similar to the stage driving unit 11. The objective lens driving unit 12 drives the objective lens 25 in a Z direction along the optical axis of the objective lens 25. Accordingly, the focal position of the objective lens 25 relative to the sample S is moved.
The light source control unit 13 causes the instantaneous light to be emitted from the light source 3, as illustrated in FIG. 2B. That is, first, the imaging element 6 performs exposure and reading sequentially in order of the first pixel row 31 1, the second pixel row 31 2, the third pixel row 31 3, . . . , the N-th pixel row 31 N. The imaging element 6 outputs a trigger signal to the light source control unit 13 after the start T1 of exposure of the N-th pixel row 31 N.
The light source control unit 13 starts irradiation of the instantaneous light from the light source 3 based on the trigger signal output from the imaging element 6. In this case, there is a predetermined period of time (F1-T1) from the start T1 of exposure of the N-th pixel row 31 N of which the exposure last starts to the start F1 of irradiation of the instantaneous light. Subsequently, the light source control unit 13 ends the irradiation of the instantaneous light from the light source 3. In this case, there is a predetermined period of time (T2-F2) from the end F2 of the irradiation of the instantaneous light to the start T2 of reading of the first pixel row 31 1 of which the exposure first starts.
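The sequence of T1, F1, F2 and T2 described above can be sketched as a trigger-driven routine; the margin values and the device interfaces (wait_for_trigger, pulse) are hypothetical and serve only to show the ordering.

    import time

    PRE_MARGIN = 1e-3    # assumed margin F1 - T1 after the last row starts exposing
    POST_MARGIN = 2e-3   # assumed margin T2 - F2 reserved for delayed light

    def expose_one_frame(sensor, light_source, flash_duration, window_length):
        """Fire the instantaneous light inside the all-rows-exposed window [T1, T2].

        sensor and light_source are hypothetical device objects; the trigger is the
        signal the imaging element outputs after the start T1 of exposure of the
        N-th pixel row, as in the embodiment described above.
        """
        if PRE_MARGIN + flash_duration + POST_MARGIN > window_length:
            raise ValueError("flash and margins do not fit inside the window T1..T2")
        sensor.wait_for_trigger()           # returns after T1
        time.sleep(PRE_MARGIN)              # irradiation start F1 comes a margin after T1
        light_source.pulse(flash_duration)  # irradiation from F1 to F2
        # the remaining time before T2 (POST_MARGIN) absorbs delayed light before
        # the first pixel row begins to be read out

The ordering the routine preserves is T1 < F1 < F2 < T2; a shorter flash or a longer exposure simply enlarges the slack inside the window.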
Further, the objective lens driving unit 12 starts driving of the objective lens 25 to the focusing position in the divided region 33 to be imaged next after the start T2 of reading of the first pixel row 31 1 of which the reading first starts. Then, the objective lens driving unit 12 ends the driving of the objective lens 25. In this case, there is a predetermined period of time from the end of driving of the objective lens 25 to the start of exposure of the N-th pixel row 31 N of which the exposure last starts. Here, for the focusing position, for example, a focusing position or a focus map acquired in each divided region 33 prior to imaging of the sample S can be used.
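The interleaving of focusing with readout can be sketched as follows, assuming a focus map keyed by divided region and a hypothetical lens-drive interface.

    def refocus_during_readout(lens_drive, focus_map, next_region, readout_period,
                               settle_margin=2e-3):
        """Move the objective lens to the focusing position of the next divided
        region while the current frame is read out, finishing a settle margin
        before the N-th row of the next frame starts its exposure."""
        target_z = focus_map[next_region]                  # focus map acquired beforehand
        move_time = lens_drive.estimate_move_time(target_z)
        if move_time + settle_margin > readout_period:
            raise RuntimeError("lens move would not settle before the next exposure")
        lens_drive.move_to(target_z)                       # starts after T2 of this frame

Performing the move during readout hides the focusing time inside the frame period, which is the speed benefit noted in the embodiment.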
Thereafter, in the image acquisition device 1, this operation is repeatedly performed. While the operation is repeated, the stage driving unit 11 drives the stage 2, for example, at a speed such that partial image data of the sample S in each divided region 33 can be acquired by the irradiation of each instantaneous light from the light source 3.
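As a worked example of the constant stage velocity, with an assumed divided-region pitch of 0.5 mm along the scan direction and an assumed frame period of 50 ms (neither value is from the disclosure):

    TILE_PITCH_MM = 0.5    # assumed width of one divided region along the scan direction
    FRAME_PERIOD_S = 0.05  # assumed time between successive flashes (one frame per region)

    stage_velocity_mm_s = TILE_PITCH_MM / FRAME_PERIOD_S
    print(f"constant stage velocity: {stage_velocity_mm_s:.1f} mm/s")  # 10.0 mm/s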
As described above, since the imaging element 6 performs the rolling reading, for example, it is possible to reduce noise as compared to global reading and acquire an image with a sufficient S/N ratio even when the optical image of the sample S is weak. Further, since the sample S is irradiated with the instantaneous light from the light source 3 during the period of time in which all of the pixel rows necessary for imaging of the imaging element 6 are exposed, the optical image of the sample S is received in the respective pixel rows at the same timing, and it is possible to acquire a still image of which the distortion is suppressed.
In this case, since the light source control unit 13 starts the irradiation of the instantaneous light from the light source 3 based on the trigger signal from the imaging element 6 and there is the predetermined period of time (F1-T1) from the start T1 of exposure of the N-th pixel row 31 N of which the exposure last starts to the start of the irradiation F1 of the instantaneous light from the light source 3, it is possible to irradiate the sample S with the instantaneous light more reliably during the period of time in which all of the pixel rows of the imaging element 6 are exposed.
Further, there is the predetermined period of time (T2-F2) from the end F2 of the irradiation of the instantaneous light to the start T2 of reading of the first pixel row 31 1 of which the reading first starts. Accordingly, if there is delayed light arriving at the imaging element 6 with a delay after the end of the irradiation of the instantaneous light (for example, the fluorescence of the sample S is acquired), the delayed light is received in the first pixel row 31 1 during the predetermined period of time (T2-F2). Accordingly, the optical image of the sample S can be received in all of the pixel rows of the imaging element 6 under the same conditions.
Further, since the objective lens driving unit 12 performs focusing of the objective lens 25 using a period of time in which each pixel row does not capture the optical image of the sample S, faster acquisition of still images can be realized. Further, since there is the predetermined period of time from the end of driving of the objective lens 25 to the start of exposure of the N-th pixel row of which the exposure last starts, it is possible to reduce an influence of vibration due to driving of the objective lens 25 when the image data is acquired.
The present invention is not limited to the above embodiment. For example, while the light source control unit 13 starts the irradiation of the instantaneous light from the light source 3 based on the trigger signal output from the imaging element 6 in the above embodiment, the light source control unit 13 may, independently of the imaging element 6, start the irradiation of the instantaneous light from the light source 3 during a period of time from the start of exposure of the N-th pixel row 31 N of which the exposure last starts to the start of reading of the first pixel row 31 1 of which the reading first starts. In this case as well, it is possible to acquire a still image of which the distortion is suppressed.
Further, while the imaging element 6 outputs the trigger signal to the light source control unit 13 after the start T1 of exposure of the N-th pixel row 31 N of which the exposure last starts in the above embodiment, the imaging element 6 may output the trigger signal to the light source control unit 13 at the same time as the start T1 of exposure of the N-th pixel row 31 N of which the exposure last starts. In this case, the light source control unit 13 starts the irradiation of the instantaneous light from the light source 3 at substantially the same time as the start T1 of exposure of the N-th pixel row 31 N of which the exposure last starts based on the trigger signal output from the imaging element 6. In this case, it is possible to irradiate the sample S with the instantaneous light during a period of time in which all of the pixel rows of the imaging element 6 are exposed.
Further, while the light source control unit 13 controls the light source 3 so that there is a predetermined period of time (T2-F2) from the end of irradiation F2 of the instantaneous light to the start T2 of reading of the first pixel row 31 1 of which the reading first starts in the above embodiment, the light source control unit 13 may end the irradiation of the instantaneous light at the same time as the start T2 of reading of the first pixel row 31 1 of which the reading first starts. In this case, it is possible to irradiate the sample S with the instantaneous light during a period of time in which all of the pixel rows of the imaging element 6 are exposed.
Further, while the objective lens driving unit 12 drives the objective lens 25 to the focusing position during a period of time from the start of reading of the first pixel row 31 1 of which the reading first starts to the start of exposure of the N-th pixel row 31 N of which the reading last starts in the above embodiment, the objective lens driving unit 12 may drive the objective lens 25 to the focusing position during a period of time other than the above period.
Further, while the objective lens driving unit 12 ends the driving of the objective lens 25 so that there is a predetermined period of time from the end of driving of the objective lens 25 to the start of exposure of the N-th pixel row 31 N of which the reading last starts, the driving of the objective lens 25 may end at the same time as the start of exposure of the N-th pixel row 31 N of which the reading last starts.
Further, while the instantaneous light is caused to be emitted from the light source 3 in the above embodiment, continuous light (CW light) may be emitted from the light source 3 and a shutter may be provided between the light source 3 and the sample S. In this case, a light emitting means is configured with the light source 3 and the shutter, and the light source control unit 13 controls opening and closing of the shutter. Accordingly, the sample S can be irradiated with the instantaneous light during the period of time in which all of the pixel rows 31 of the imaging element 6 are exposed.
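A minimal sketch of this variant, in which the light emitting means is the pair of continuous light source and shutter; the shutter interface is assumed.

    import time

    def pulse_with_shutter(shutter, duration):
        """Emulate instantaneous light from a continuous (CW) source by opening a
        shutter only inside the all-rows-exposed window (shutter API is assumed)."""
        shutter.open()
        time.sleep(duration)
        shutter.close()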
Further, while the focal position of the objective lens 25 relative to the sample S is moved in the optical axis direction of the objective lens 25 by the objective lens driving unit 12 moving the objective lens 25 in the optical axis direction thereof in the above embodiment, the focal position of the objective lens 25 relative to the sample S may be moved in the optical axis direction of the objective lens 25 by the stage driving unit 11 (driving unit) moving the stage 2 in the optical axis direction of the objective lens 25.
Further, while the pixel row 31 constituting the light reception surface 6 a of the imaging element 6 includes a group of pixel rows such as the first pixel row 31 1, the second pixel row 31 2, the third pixel row 31 3, . . . , the N-th pixel row 31 N in the above embodiment, a plurality of pixel rows 31 constituting the light reception surface 6 a of the imaging element 6 may be divided into a first pixel row group 31 a and a second pixel row group 31 b each including a plurality of pixel rows, as illustrated in FIG. 4, and each of the first pixel row group 31 a and the second pixel row group 31 b may be read.
In this case, a reading direction of the first pixel row group 31 a and a reading direction of the second pixel row group 31 b may be set to be opposite directions. Specifically, the first pixel row group 31 a and the second pixel row group 31 b may be read sequentially from the pixel row at a center to the pixel row at an end, or the first pixel row group 31 a and the second pixel row group 31 b may be read sequentially from the pixel row at the end to the pixel row at the center.
A relationship between the configuration of the light reception surface 6 a and the periods of time of exposure and reading in this case is illustrated in FIGS. 5 and 6. As illustrated in FIGS. 5A and 5B, when the first pixel row group 31 a and the second pixel row group 31 b are read sequentially from the pixel row at the center to the pixel row at the end, the light source control unit 13 causes the instantaneous light to be emitted from the light source 3 during a period of time from the start of the exposure of the (1-m)-th and (2-m)-th pixel rows that are located at the ends of the light reception surface 6 a and of which the exposure last starts to the start of reading of the (1-l)-th and (2-l)-th pixel rows that are located at the center of the light reception surface 6 a and of which the reading first starts. On the other hand, as illustrated in FIGS. 6A and 6B, when the first pixel row group 31 a and the second pixel row group 31 b are read sequentially from the pixel row at the end to the pixel row at the center, the light source control unit 13 causes the instantaneous light to be emitted from the light source 3 during a period of time from the start of the exposure of the (1-n)-th and (2-n)-th pixel rows that are located at the center of the light reception surface 6 a and of which the exposure last starts to the start of reading of the (1-l)-th and (2-l)-th pixel rows that are located at the ends of the light reception surface 6 a and of which the reading first starts. In this case, similarly, since the sample S can be irradiated with the instantaneous light during a period of time in which all of the pixel rows of the imaging element 6 are exposed, it is possible to acquire a still image in which distortion is suppressed.
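As a final illustrative sketch (hypothetical names and an assumed timing model, not the disclosed implementation), the window in which every row of both pixel row groups is exposed can be computed in the same way for the center-to-end and end-to-center reading orders; only the physical location of the rows whose exposure starts last differs between the two cases.

# Illustrative sketch only: when the light reception surface is split into two
# pixel row groups read in opposite directions, the all-rows-exposed window
# runs from the exposure start of the rows whose exposure starts last to the
# read start of the rows whose reading starts first. The timing model and all
# parameter names are assumptions for illustration.

def split_readout_window(rows_per_group, exposure_time, row_read_time,
                         center_to_end=True):
    """Return (window_start, window_end) for two groups read in parallel.

    center_to_end=True : reading proceeds from the center rows outwards,
                         so the rows at the ends start exposing last.
    center_to_end=False: reading proceeds from the end rows inwards,
                         so the rows at the center start exposing last.
    In both cases the first rows to be read define the end of the window;
    only which physical rows those are differs.
    """
    last_exposure_start = (rows_per_group - 1) * row_read_time
    first_read_start = exposure_time
    if last_exposure_start >= first_read_start:
        raise ValueError("no window in which all rows are exposed")
    where = "ends" if center_to_end else "center"
    print(f"rows at the {where} of the light reception surface start exposing last")
    return last_exposure_start, first_read_start


if __name__ == "__main__":
    print(split_readout_window(rows_per_group=1000, exposure_time=25e-3,
                               row_read_time=10e-6, center_to_end=True))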
REFERENCE SIGNS LIST
1: image acquisition device, 2: stage, 3: light source (light emitting means), 5: light guiding optical system, 6: imaging element, 11: stage driving unit (driving unit), 12: objective lens driving unit (driving unit), 13: light source control unit (control unit), 14: image processing unit, 25: objective lens, 31: pixel row, 32: image acquisition region, 33: divided region, S: sample, V: field of view of objective lens.

Claims (6)

The invention claimed is:
1. An image acquisition device, comprising:
a stage on which a sample is placed;
a light source configured to emit instantaneous light;
a light guiding optical system including an objective lens arranged to face the sample on the stage;
a driver circuit configured to move a focal position of the objective lens relative to the sample in an optical axis direction of the objective lens;
an imaging sensor configured to capture an optical image of the sample guided by the light guiding optical system;
an image processor configured to process image data output from the imaging sensor; and
a controller configured to start irradiation of the instantaneous light from the light source based on a trigger signal from the imaging sensor,
wherein the imaging sensor is a two-dimensional imaging sensor including a plurality of pixel rows and is configured to be capable of rolling reading in a reading direction, the plurality of pixel rows including a plurality of pixels arranged in a direction perpendicular to the reading direction on a light reception surface of the imaging sensor so that sequential exposure and reading are performed in each pixel row of the sensor, the plurality of pixel rows including a first pixel row being rolling read first in the reading direction and a last pixel row being rolling read after the first pixel row, and the last pixel row being rolling read last in the reading direction,
the controller causes the instantaneous light to be emitted from the light source during a period of time when no reading of a pixel row occurs and during a period of time from a start of exposure of the last pixel row of which the reading last starts in the rolling reading to a start of the reading of the first pixel row of which the reading first starts,
the image processor sequentially acquires image data from the imaging sensor, and combines the acquired image data to form combined image data, and
the driver circuit moves a focal position of the objective lens to a focusing position during a period of time from the start of reading of the first pixel row of which the reading first starts in the rolling reading to a start of next exposure of the last pixel row of which the reading last starts.
2. The image acquisition device according to claim 1,
wherein the trigger signal indicates the start of exposure of the last pixel row of which the reading last starts in the rolling reading.
3. The image acquisition device according to claim 1,
wherein the controller controls the light source so that there is a predetermined period of time from the end of the irradiation of the instantaneous light to the start of the reading of the first pixel row of which the reading first starts in the rolling reading.
4. The image acquisition device according to claim 1,
wherein the controller controls the light source so that there is a predetermined period of time from the start of exposure of the last pixel row of which the reading last starts in the rolling reading to the start of the irradiation of the instantaneous light.
5. The image acquisition device according to claim 1,
wherein the driver circuit moves the focal position of the objective lens so that there is a predetermined period of time from an end of the movement of the focal position of the objective lens to the start of exposure of the last pixel row of which the reading last starts in the rolling reading.
6. An image acquisition method, comprising:
emitting instantaneous light by a light source,
moving a focal position of an objective lens relative to a sample in an optical axis direction of the objective lens,
capturing, by a two-dimensional imaging sensor including a plurality of pixel rows capable of rolling reading in a reading direction, the plurality of pixel rows including a plurality of pixels arranged in a direction perpendicular to the reading direction on a light reception surface of the imaging sensor so that sequential exposure and reading are performed in each pixel row of the sensor, the plurality of pixel rows including a first pixel row being rolling read first in the reading direction and a last pixel row being rolling read after the first pixel row, and the last pixel row being rolling read last in the reading direction, an optical image of the sample guided by a light guiding optical system including the objective lens arranged to face the sample on a stage,
causing the instantaneous light to be emitted from the light source based on a trigger signal from the two-dimensional imaging sensor during a period of time when no reading of a pixel row occurs and during a period of time from a start of exposure of the last pixel row of which the reading last starts in the rolling reading to a start of the reading of the first pixel row of which the reading first starts,
sequentially acquiring image data output from the two-dimensional imaging sensor,
combining the acquired image data to form combined image data, and
moving the focal position of the objective lens to a focusing position during a period of time from the start of reading of the first pixel row of which the reading first starts in the rolling reading to a start of next exposure of the last pixel row of which the reading last starts.
US15/030,198 2013-11-01 2014-01-28 Image acquisition device and image acquisition method for image acquisition device Active US10422987B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-228606 2013-11-01
JP2013228606A JP6134249B2 (en) 2013-11-01 2013-11-01 Image acquisition device and image acquisition method of image acquisition device
PCT/JP2014/051805 WO2015064117A1 (en) 2013-11-01 2014-01-28 Image acquisition device and image acquisition method for image acquisition device

Publications (2)

Publication Number Publication Date
US20160252717A1 US20160252717A1 (en) 2016-09-01
US10422987B2 true US10422987B2 (en) 2019-09-24

Family

ID=53003741

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/030,198 Active US10422987B2 (en) 2013-11-01 2014-01-28 Image acquisition device and image acquisition method for image acquisition device

Country Status (7)

Country Link
US (1) US10422987B2 (en)
EP (1) EP3064982B1 (en)
JP (1) JP6134249B2 (en)
CN (1) CN105683806B (en)
DK (1) DK3064982T3 (en)
HU (1) HUE058663T2 (en)
WO (1) WO2015064117A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6812149B2 (en) * 2016-06-30 2021-01-13 オリンパス株式会社 Scanning microscope and control method of scanning microscope
JP6842387B2 (en) * 2017-08-31 2021-03-17 浜松ホトニクス株式会社 Image acquisition device and image acquisition method
CN108169783A (en) * 2018-02-26 2018-06-15 苏州大学 A kind of real-time measurement apparatus and measuring method of the distribution of radiation space dosage
CN109669264A (en) * 2019-01-08 2019-04-23 哈尔滨理工大学 Self-adapting automatic focus method based on shade of gray value


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000056302A2 (en) * 1999-03-19 2000-09-28 Parker Hughes Institute Vanadium (iv) complexes containing catacholate ligand and having spermicidal activity
US20070022380A1 (en) * 2005-07-20 2007-01-25 Microsoft Corporation Context aware task page

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63191063A (en) 1987-02-03 1988-08-08 Sumitomo Electric Ind Ltd System for processing microscopic image
JP2000501844A (en) 1995-07-19 2000-02-15 モルフォメトリックス テクノロジーズ インク. Automatic scanning of microscope slides
JP2003043363A (en) 2001-07-27 2003-02-13 Olympus Optical Co Ltd Confocal microscope
US20030063851A1 (en) * 2001-09-27 2003-04-03 Bio-Rad Laboratories, Inc. Biochemical assay detection in a liquid receptacle using a fiber optic exciter
US20030136980A1 (en) * 2002-01-18 2003-07-24 Malcolm Lin Image pickup apparatus and exposure control method therefor
JP2003222801A (en) 2002-01-29 2003-08-08 Olympus Optical Co Ltd Microscopic image photographing device
JP2003270545A (en) 2002-03-14 2003-09-25 Olympus Optical Co Ltd Optical scanning stage
EP1439385A1 (en) 2003-01-15 2004-07-21 Negevtech Ltd. Method and system for fast on-line electro-optical detection of wafer defects
JP2008507719A (en) 2004-07-23 2008-03-13 ジーイー・ヘルスケア・ナイアガラ・インク Confocal fluorescence microscopy and equipment
WO2006098443A1 (en) 2005-03-17 2006-09-21 Hamamatsu Photonics K.K. Microscopic image capturing device
US20070206097A1 (en) * 2006-03-01 2007-09-06 Hamamatsu Photonics K.K. Image acquiring apparatus, image acquiring method, and image acquiring program
US20070223802A1 (en) * 2006-03-23 2007-09-27 Fujitsu Limited Inspection apparatus, lamination apparatus, and inspection method
JP2007263868A (en) 2006-03-29 2007-10-11 Toray Eng Co Ltd Automatic visual inspection apparatus and automatic visual inspection method
US20070230939A1 (en) 2006-04-03 2007-10-04 Samsung Techwin Co., Ltd. Photographing apparatus and method
US20100067780A1 (en) * 2006-12-04 2010-03-18 Tokyo Electron Limited Defect detecting apparatus, defect detecting method, information processing apparatus, information processing method, and program therefor
JP2012108184A (en) 2010-11-15 2012-06-07 Sony Corp Focal position information detector, microscope device and focal position information detection method
US20120147224A1 (en) 2010-12-08 2012-06-14 Canon Kabushiki Kaisha Imaging apparatus
JP2012138068A (en) 2010-12-08 2012-07-19 Canon Inc Image generation apparatus
US20130088634A1 (en) 2011-10-05 2013-04-11 Sony Corporation Image acquisition apparatus, image acquisition method, and computer program
CN102854615A (en) 2012-04-27 2013-01-02 麦克奥迪实业集团有限公司 Full-automatic scanning system and method for microscopic section
JP2015536482A (en) 2012-11-16 2015-12-21 モレキュラー デバイシーズ, エルエルシー System and method for acquiring an image using a rolling shutter camera while sequencing a microscope device asynchronously

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Preliminary Report on Patentability dated May 3, 2016 for PCT/JP2014/051805.

Also Published As

Publication number Publication date
EP3064982A1 (en) 2016-09-07
JP6134249B2 (en) 2017-05-24
EP3064982B1 (en) 2022-02-23
CN105683806B (en) 2019-01-01
US20160252717A1 (en) 2016-09-01
DK3064982T3 (en) 2022-04-11
WO2015064117A1 (en) 2015-05-07
EP3064982A4 (en) 2017-11-15
HUE058663T2 (en) 2022-09-28
CN105683806A (en) 2016-06-15
JP2015087723A (en) 2015-05-07

Similar Documents

Publication Publication Date Title
US9911028B2 (en) Image acquisition device and image acquisition method for image acquisition device
US10602087B2 (en) Image acquisition device, and imaging device
JP6266601B2 (en) Image acquisition device, method and system for creating focus map of sample
US10598916B2 (en) Image acquisition device and method and system for acquiring focusing information for specimen
US9667858B2 (en) Image acquisition device and image acquisition device focusing method
US10422987B2 (en) Image acquisition device and image acquisition method for image acquisition device
JP5848596B2 (en) Image acquisition device and focus method of image acquisition device
WO2014112084A1 (en) Image acquisition device and focus method for image acquisition device
JP5296861B2 (en) Image acquisition device and focus method of image acquisition device
WO2014112085A1 (en) Image acquisition device and focus method for image acquisition device
JP6496772B2 (en) Image acquisition apparatus and image acquisition method
WO2014112086A1 (en) Image acquisition device and focus method for image acquisition device
US11921277B2 (en) Image acquisition device and image acquisition method
WO2014112083A1 (en) Image acquisition device and focus method for image acquisition device
JP5986041B2 (en) Image acquisition device and focus method of image acquisition device

Legal Events

Date Code Title Description
AS Assignment

Owner name: HAMAMATSU PHOTONICS K.K., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IWASE, FUMIO;REEL/FRAME:038612/0898

Effective date: 20160425

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4