WO2015037387A1 - Imaging device, microscope system, imaging method, and imaging program - Google Patents

Imaging device, microscope system, imaging method, and imaging program

Info

Publication number
WO2015037387A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
light
image
imaging range
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/071395
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
三由 貴史
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of WO2015037387A1 publication Critical patent/WO2015037387A1/ja
Priority to US15/065,010 priority Critical patent/US10254530B2/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/16Microscopes adapted for ultraviolet illumination ; Fluorescence microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0927Systems for changing the beam intensity distribution, e.g. Gaussian to top-hat
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/645Specially adapted constructive features of fluorimeters
    • G01N21/6456Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458Fluorescence microscopy

Definitions

  • the present invention relates to an imaging apparatus, a microscope system, an imaging method, and an imaging program that create a wide-field image by pasting together a plurality of images acquired by imaging a sample.
  • There is known a so-called virtual slide technique, in which an image of a sample placed on a slide glass is recorded as electronic data so that the user can observe the image on a monitor of, for example, a personal computer.
  • In the virtual slide technique, parts of an image of a sample magnified by a microscope are sequentially pasted together to construct a wide-field, high-resolution image showing the entire sample.
  • the virtual slide technique is a technique for acquiring a plurality of images with different fields of view for the same subject, and generating an enlarged image of the subject by connecting these images.
  • processing for connecting a plurality of images is sometimes called stitching processing (see, for example, Patent Document 1).
  • a stained sample may be faded by light irradiation, or the health of cultured cells may be impaired by light irradiation.
  • Such an action of light is also called phototoxicity. Therefore, a sample that is deteriorated by light irradiation is stored in a light-shielded state, and is handled so that a minimum amount of light is irradiated only during imaging (see, for example, Non-Patent Document 1).
  • The present invention has been made in view of the above, and an object thereof is to provide an imaging apparatus, a microscope system, an imaging method, and an imaging program that can suppress local deterioration of a sample caused by light irradiation.
  • In order to solve the problem described above, an imaging apparatus according to the present invention includes: an imaging unit that images a subject in an imaging range and generates image data; an illumination unit that generates the light irradiated to the imaging range;
  • an illumination control unit that controls the luminance distribution of the light so that the light attenuates in a peripheral region within the imaging range;
  • an imaging range movement control unit that sequentially changes the imaging range with respect to the subject;
  • and a calculation unit that sequentially acquires, from the imaging unit, image data of a plurality of images having different imaging ranges and creates a composite image by combining the plurality of images.
  • The imaging range movement control unit changes the imaging range so that the peripheral regions where the light attenuates overlap each other in imaging ranges adjacent to each other, and the calculation unit pastes together the images corresponding to the adjacent imaging ranges so that the image regions corresponding to the peripheral regions overlap.
  • the illumination control unit includes a field stop provided in an optical path of the light from the illumination unit to the subject.
  • the illumination control unit further includes an aperture adjustment unit that adjusts the opening size of the field stop so that an image of the opening end portion of the field stop overlaps a peripheral part of the imaging range, and the imaging range movement control unit sequentially changes the imaging range in accordance with the size of the opening.
  • the illumination control unit further includes a position adjusting unit that moves the field stop along the optical axis of the illumination unit, and the imaging range movement control unit sequentially changes the imaging range in accordance with the position of the field stop in the optical axis direction.
  • the field stop is formed of a member whose light transmittance continuously changes as the distance from the opening end portion increases.
  • the calculation unit further extracts a feature region from the image regions corresponding to the imaging ranges adjacent to each other, and performs alignment of the corresponding images based on the feature region.
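As a rough illustration of this alignment step, the following sketch (ours, not the patent's; the exhaustive search over small integer translations is one common choice, and the function name is hypothetical) estimates the shift between the overlapping image regions of two adjacent tiles by minimizing the mean squared difference:

```python
import numpy as np

def align_overlap(ref, mov, max_shift=3):
    """Find the integer shift (dy, dx) minimizing the mean squared
    difference between `ref` and `mov` over their common support,
    where ref[y, x] is compared with mov[y - dy, x - dx].
    Both inputs are 2-D luminance patches cut from the superimposed
    regions of two adjacent tiles."""
    h, w = ref.shape
    best_err, best_shift = np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # crop both patches to the area where they overlap after shifting
            r = ref[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            m = mov[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            err = np.mean((r - m) ** 2)
            if err < best_err:
                best_err, best_shift = err, (dy, dx)
    return best_shift
```

In practice a feature-based method would search only around detected feature regions rather than over the whole overlap, but the residual-minimization idea is the same.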
  • the calculation unit performs shading correction on the image areas of the images corresponding to the imaging ranges adjacent to each other, and performs the alignment based on the image areas after shading correction.
  • the calculation unit determines a gain in the shading correction based on a maximum value of a shading component in the image area.
  • the calculation unit determines a gain in the shading correction with reference to a median value of shading components in the image area.
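The two gain conventions described above, normalizing to the maximum or to the median of the shading component, can be sketched as follows. This is an illustrative reading, not code from the patent; the function name and the exact form of the shading input are assumptions:

```python
import numpy as np

def shading_gain(shading, reference="max"):
    """Per-pixel correction gain for an image region, given its estimated
    shading component (relative illumination, values in (0, 1]).
    reference="max":    gain = S_max / S, so the brightest point keeps gain 1
    reference="median": gain = S_median / S, balancing brightening/darkening"""
    s = np.asarray(shading, dtype=float)
    ref = s.max() if reference == "max" else np.median(s)
    return ref / s
```

Applying `image * shading_gain(shading)` flattens the illumination falloff in the superimposed region before the tiles are compared or aligned.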
  • a microscope system includes the imaging device, a stage on which the subject is placed, and an objective lens provided to face the stage.
  • the subject is a sample that has been stained for fluorescence observation
  • the illumination unit is an epi-illumination unit that generates excitation light that excites the sample.
  • The imaging method according to the present invention includes: a luminance distribution control step of controlling the luminance distribution of light so that the light attenuates in a peripheral region within the imaging range of the imaging unit; a light irradiation step of irradiating the imaging range with the light whose luminance distribution has been controlled; an imaging step of imaging a subject in the imaging range irradiated with the light to generate image data;
  • and an imaging range movement control step of sequentially changing the imaging range with respect to the subject.
  • In the imaging range movement control step, the imaging range is changed so that the peripheral regions where the light attenuates overlap each other in imaging ranges adjacent to each other.
  • The imaging program according to the present invention causes a computer to execute: a luminance distribution control step of controlling the luminance distribution of light so that the light attenuates in a peripheral region within the imaging range of the imaging unit; a light irradiation step of causing the illumination unit to irradiate the imaging range with the light whose luminance distribution has been controlled; an imaging step of causing the imaging unit to image a subject within the imaging range irradiated with the light and to generate image data;
  • an imaging range movement control step of sequentially changing the imaging range with respect to the subject; and a calculation step of sequentially acquiring a plurality of images having different imaging ranges based on the image data generated in the imaging step and creating a composite image by combining the plurality of images.
  • In the imaging range movement control step, the imaging range is changed so that the peripheral regions where the light attenuates overlap each other in imaging ranges adjacent to each other, and in the calculation step, the images corresponding to the adjacent imaging ranges are pasted together so that the image regions corresponding to the peripheral region overlap.
  • According to the present invention, the luminance distribution of the light irradiated to the imaging range is controlled so that the light attenuates in the peripheral region of the imaging range, and the imaging range is changed so that the peripheral regions overlap in adjacent imaging ranges. The cumulative amount of light irradiated to the peripheral region can therefore be made comparable to that of the other regions, and local deterioration of the sample due to light irradiation can be suppressed.
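The effect claimed here, that overlapping the attenuated margins evens out the cumulative light dose, can be checked numerically. The sketch below is ours rather than the patent's: it assumes a simple linear attenuation ramp over the margin at each edge and a stage step of (width − margin):

```python
import numpy as np

def profile(width, margin):
    """Illumination across one imaging range: flat in the centre, linearly
    attenuating over `margin` pixels at each edge (a stand-in for the
    penumbra of the field stop's opening-end image)."""
    p = np.ones(width)
    ramp = (np.arange(margin) + 0.5) / margin  # rising edge; mirrored on the right
    p[:margin] = ramp
    p[-margin:] = ramp[::-1]
    return p

def cumulative_dose(n_tiles, width, margin):
    """Total exposure when successive imaging ranges are stepped by
    (width - margin), so that the attenuated margins overlap pairwise."""
    step = width - margin
    total = np.zeros(step * (n_tiles - 1) + width)
    p = profile(width, margin)
    for i in range(n_tiles):
        total[i * step:i * step + width] += p
    return total
```

With this ramp, a rising edge and the falling edge it overlaps sum to exactly 1, so the accumulated dose is uniform everywhere except the outer border of the mosaic.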
  • FIG. 1 is a block diagram showing a configuration example of a microscope system according to Embodiment 1 of the present invention.
  • FIG. 2 is a schematic diagram showing a configuration of the microscope apparatus shown in FIG.
  • FIG. 3 is a schematic diagram showing the configuration of the epi-illumination field stop shown in FIG.
  • FIG. 4 is a schematic diagram showing the configuration of the epi-illumination field stop shown in FIG.
  • FIG. 5 is a schematic diagram illustrating a luminance profile of light irradiated to the imaging range.
  • FIG. 6 is a flowchart showing the operation of the microscope system shown in FIG.
  • FIG. 7 is a flowchart showing a process for acquiring the luminance distribution of light irradiated to the imaging range.
  • FIG. 8 is a schematic diagram illustrating an example of a preferable luminance distribution of light irradiated on the imaging range.
  • FIG. 9 is a flowchart showing image combining processing according to Embodiment 1 of the present invention.
  • FIG. 10 is a schematic diagram for explaining the image pasting process shown in FIG.
  • FIG. 11 is a schematic diagram for explaining a case where images are pasted in a 3 × 2 matrix.
  • FIG. 12 is a flowchart showing the correction process of the luminance distribution in the composite image.
  • FIG. 13 is a schematic diagram for explaining the correction process of the luminance distribution in the composite image.
  • FIG. 14 is a schematic diagram for explaining the arrangement of the field stop in Modification 1-1 of Embodiment 1 of the present invention.
  • FIG. 15 is a graph showing a luminance profile of irradiation light according to the arrangement of the field stop shown in FIG.
  • FIG. 16 is a diagram showing a configuration of a microscope system according to Embodiment 2 of the present invention.
  • FIG. 17 is a flowchart showing the operation of the microscope system shown in FIG.
  • FIG. 18 is a diagram showing a configuration of a microscope system according to Embodiment 3 of the present invention.
  • FIG. 19 is a block diagram showing a configuration of the relative position correction unit shown in FIG.
  • FIG. 20 is a flowchart showing image combining processing according to Embodiment 3 of the present invention.
  • FIG. 21 is a schematic diagram illustrating a part of a superimposed region of two adjacent images.
  • FIG. 22 is a schematic diagram for explaining the shading correction processing for the overlapping region shown in FIG.
  • FIG. 23 is a schematic diagram showing a superposed region on which shading correction has been performed.
  • FIG. 24 is a schematic diagram for explaining the shading correction processing for the overlapped area in the modified example 3-1 of the third embodiment of the present invention.
  • FIG. 1 is a diagram showing a microscope system according to Embodiment 1 of the present invention.
  • The microscope system 1 includes a microscope apparatus 2 provided with an imaging unit 200, and a control device 3 that controls the operation of the microscope apparatus 2, captures image data from the imaging unit 200, and performs predetermined arithmetic processing.
  • FIG. 2 is a schematic diagram showing the configuration of the microscope apparatus 2.
  • The microscope apparatus 2 includes a substantially C-shaped arm 201 provided with an epi-illumination unit 210 and a transmission illumination unit 220, a specimen stage 202 that is mounted on the arm 201 and on which the sample SP as a subject is placed, a fluorescent cube 206, an eyepiece tube 208 provided with an eyepiece lens 207, and a stage position changing unit 209 that moves the specimen stage 202.
  • The imaging unit 200 is provided on the other end side of the lens barrel 203, and the trinocular tube unit 204 splits the observation light of the sample SP incident from the objective lens 205 between the direction of the imaging unit 200 and the direction of the eyepiece tube 208.
  • the epi-illumination unit 210 includes an epi-illumination light source 211 that generates epi-illumination light irradiated on the sample SP, and an epi-illumination optical system that guides the epi-illumination light in the direction of the observation optical path L.
  • the epi-illumination optical system includes a collector lens 212, an epi-illumination filter 213, an epi-illumination field stop 214, and an epi-illumination aperture stop 215.
  • the epi-illumination field stop 214 and the epi-illumination aperture stop 215 are provided with an aperture adjustment unit that adjusts the respective aperture amounts.
  • the epi-illumination light emitted from such an epi-illumination unit 210 has its optical path bent at the fluorescent cube 206 and is irradiated onto the sample SP on the specimen stage 202.
  • FIGS. 3 and 4 are schematic diagrams showing the configuration of the incident field stop 214. In the following, the direction parallel to the optical axis L1 of the epi-illumination optical system is referred to as the Az direction, and the plane orthogonal to the optical axis L1 is referred to as the Ax-Ay plane.
  • the imaging range refers to a region of the sample SP surface on the specimen stage 202 that enters the field of view of the imaging unit 200.
  • The relative coordinates of the imaging range, taking the position of the sample SP on the specimen stage 202 as a reference, are denoted by (u, v).
  • the incident field stop 214 includes a rectangular field stop formed by combining two L-shaped diaphragm members 214a and 214b, and an aperture adjusting unit 214c that adjusts the size of the opening of the incident field stop 214.
  • The diaphragm members 214a and 214b are movable symmetrically about the optical axis L1, in interlock with each other, within the Ax-Ay plane. By adjusting the position of one of the diaphragm members 214a and 214b with the aperture adjusting unit 214c, the size of the rectangular opening is changed.
  • the area where the imaging range V and the diaphragm members 214a and 214b overlap is shaded.
  • the transmission illumination unit 220 includes a transmission illumination light source 221 that generates transmission illumination light irradiated on the sample SP, and a transmission illumination optical system that guides the transmission illumination light in the direction of the observation optical path L.
  • the transmission illumination optical system includes a collector lens 222, a transmission filter 223, a transmission field stop 224, and a transmission aperture stop 225.
  • the transmission field stop 224 and the transmission aperture stop 225 are each provided with an aperture adjustment unit for adjusting the respective aperture amounts.
  • The configuration of the transmission field stop 224 is the same as that of the incident field stop 214 shown in FIGS. 3 and 4.
  • the transmitted illumination light emitted from such a transmitted illumination unit 220 has its optical path bent by the reflection mirror 226 and irradiated onto the sample SP through the window lens 227 and the transmission condenser lens 228.
  • the objective lens 205 is attached to a revolver 230 that can hold a plurality of objective lenses (for example, objective lenses 205 and 205 ′) having different magnifications.
  • the imaging magnification can be changed by rotating the revolver 230 and changing the objective lenses 205 and 205 ′ facing the sample stage 202.
  • the revolver 230 is provided with an encoder, and an output value of the encoder is input to the control device 3.
  • the control device 3 can obtain the magnification of the objective lens (for example, the objective lens 205) facing the sample SP, that is, the observation magnification in the microscope device 2, based on the output value of the encoder.
  • A zoom unit, including a plurality of zoom lenses and a drive unit (neither of which is shown) that changes the positions of these zoom lenses, may be provided inside the lens barrel 203.
  • the zoom unit enlarges or reduces the subject image by adjusting the position of each zoom lens.
  • An encoder may be further provided in the drive unit in the lens barrel 203. In this case, the output value of the encoder may be output to the control device 3, and the control device 3 may detect the position of the zoom lens from the output value of the encoder and automatically calculate the imaging magnification.
  • the fluorescent cube 206 is arranged in the observation optical path L when performing fluorescence observation in the microscope apparatus 2.
  • The fluorescent cube 206 is an optical member in which the following are combined in a cube shape: an excitation filter 231 that selectively transmits light (excitation light) of a specific wavelength band out of the light emitted from the epi-illumination light source 211 and passing through the epi-illumination optical system (the collector lens 212 through the epi-illumination aperture stop 215); a dichroic mirror 232 that reflects the light transmitted through the excitation filter 231 toward the sample SP and transmits the light incident from the direction of the sample SP; and an absorption filter 233 that selectively transmits only light (fluorescence) of a specific wavelength band out of the light incident from the direction of the sample SP.
  • the stage position changing unit 209 includes, for example, motors 234 and 235, and moves the position of the sample stage 202 within the XY plane under the control of the control device 3, thereby relatively changing the imaging range with respect to the sample SP.
  • the objective lens 205 is focused on the sample SP.
  • the stage position changing unit 209 is provided with a position detecting unit 236 that detects the position of the sample stage 202 and outputs a detection signal to the control device 3.
  • the position detection unit 236 is configured by an encoder that detects the amount of rotation of the motor 234, for example.
  • the stage position changing unit 209 may be configured by a pulse generating unit that generates a pulse under the control of the control device 3 and a stepping motor.
  • The imaging unit 200 includes an imaging device such as a CCD or a CMOS sensor, and is, for example, either a camera capable of capturing monochrome images, in which each pixel of the imaging device outputs a luminance value Y as its pixel level (pixel value), or a camera capable of capturing color images having pixel levels (pixel values) in each of the R (red), G (green), and B (blue) bands; it operates at a predetermined timing under the control of the control device 3.
  • the imaging unit 200 receives light (observation light) incident from the objective lens 205 via the optical system in the lens barrel 203, generates image data corresponding to the observation light, and outputs the image data to the control device 3.
  • the imaging unit 200 may convert the pixel value represented in the RGB color space into a pixel value represented in the YCbCr color space and output the converted pixel value to the control device 3.
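For reference, a common RGB-to-YCbCr conversion for 8-bit pixels looks like the following. This is the full-range BT.601 matrix used by JPEG; the patent does not specify which YCbCr variant the imaging unit would use, so the choice here is an assumption:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 conversion of one 8-bit RGB pixel to YCbCr.
    Y carries the luminance; Cb and Cr are chrominance offsets around 128."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr
```

Working in YCbCr lets downstream steps such as luminance-unevenness correction operate on the Y channel alone.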
  • The control device 3 includes an input unit 31 used for inputting various commands and information to the control device 3, an image acquisition unit 32 that acquires image data from the imaging unit 200, a storage unit 33, a calculation unit 34 that performs predetermined image processing on the image data acquired by the image acquisition unit 32 and performs various calculation processes, an output unit 35 that outputs the image processed by the calculation unit 34, and a control unit 36 that controls these units in an integrated manner.
  • the input unit 31 includes an input device such as a keyboard, various buttons, and various switches, and a pointing device such as a mouse and a touch panel, and inputs a signal corresponding to an operation performed by the user from the outside to the control unit 36.
  • the image acquisition unit 32 is an interface that can be connected to an external device such as the imaging unit 200, and sequentially acquires the image data generated by the imaging unit 200.
  • The storage unit 33 includes a recording device such as an updatable flash memory, a RAM, and a ROM; a recording medium such as a hard disk, an MO, a CD-R, or a DVD-R that is built in or connected via a data communication terminal; and a writing/reading device that writes and reads information to and from the recording medium.
  • the storage unit 33 includes an image data storage unit 331 that stores image data on which image processing has been performed by the calculation unit 34.
  • The calculation unit 34 includes an illumination luminance distribution estimation unit 341 that estimates the luminance distribution of the light irradiated to the imaging range (hereinafter also simply referred to as irradiation light), a positioning unit 342 that determines the amount by which the sample stage 202 is to be moved each time imaging is performed in the microscope apparatus 2 (hereinafter referred to as the stage movement amount), an image composition unit 343 that creates a composite image by combining a plurality of images with different imaging ranges, and an image luminance distribution correction unit 344 that corrects the luminance distribution in the composite image created by the image composition unit 343.
  • The illumination luminance distribution estimation unit 341 estimates the luminance distribution of the light irradiated when observing the sample SP, based on an image acquired by imaging a standard sample for luminance distribution measurement in the microscope apparatus 2, and creates luminance distribution information for the light.
  • The positioning unit 342 determines the stage movement amount based on the luminance distribution estimated by the illumination luminance distribution estimation unit 341, and also determines, based on the stage movement amount, the coordinate values of the image region corresponding to the "margin" used when the image composition unit 343 pastes adjacent images together.
  • Hereinafter, the image region corresponding to the "margin" is referred to as the superimposed region.
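A minimal sketch of what the positioning unit computes, under our assumption that the margin equals the width of the attenuated peripheral region and that the tiles are stepped horizontally (the function names are hypothetical):

```python
def stage_step(field_width, margin):
    """Stage movement between exposures: adjacent imaging ranges then
    overlap by exactly `margin` (in the same units as `field_width`)."""
    return field_width - margin

def superimposed_region(tile_width_px, margin_px):
    """Pixel-column ranges of the superimposed region for a horizontally
    adjacent pair: (right edge of left tile, left edge of right tile)."""
    return (tile_width_px - margin_px, tile_width_px), (0, margin_px)
```

The same step and margin arithmetic applies per axis when the tiles form a 2-D grid.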
  • the image composition unit 343 creates a composite image with a wide field of view by pasting together a plurality of images acquired by sequentially capturing the sample SP while changing the imaging range. More specifically, the image composition unit 343 includes a trimming unit 343a that trims a region used for image composition from each image, and a composition unit 343b that pastes the trimmed images.
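The trimming-and-pasting step might look like the following for one horizontally adjacent pair. The linear cross-fade inside the superimposed region is our choice of blend, since the patent only states that the tiles are pasted so that their margin regions overlap:

```python
import numpy as np

def stitch_pair(left, right, margin):
    """Paste two same-sized tiles that overlap by `margin` columns,
    cross-fading linearly inside the superimposed region."""
    h, w = left.shape
    out = np.zeros((h, 2 * w - margin), dtype=float)
    out[:, :w - margin] = left[:, :w - margin]     # left-only part
    out[:, w:] = right[:, margin:]                 # right-only part
    alpha = np.linspace(0.0, 1.0, margin)          # 0 = left tile, 1 = right tile
    out[:, w - margin:w] = (1 - alpha) * left[:, w - margin:] + alpha * right[:, :margin]
    return out
```

A full mosaic would apply this pairwise along rows and then between rows, after the alignment and shading-correction steps described elsewhere in the specification.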
  • The image luminance distribution correction unit 344 corrects the luminance distribution in the composite image obtained by pasting together the plurality of images, based on the luminance distribution of the irradiation light estimated by the illumination luminance distribution estimation unit 341. More specifically, the image luminance distribution correction unit 344 includes a luminance unevenness calculation unit 344a that calculates, based on the luminance distribution of the irradiation light, the luminance unevenness generated in the superimposed region where the images are pasted together; a correction gain calculation unit 344b that calculates a correction gain for correcting the calculated luminance unevenness; and a correction unit 344c that corrects the luminance unevenness in the superimposed region of the composite image using the correction gain.
  • the output unit 35 is an external interface that outputs the composite image created by the calculation unit 34 and other predetermined information to an external device such as an LCD, an EL display, or a CRT display.
  • In the present embodiment, the display device that displays the composite image and the like is provided outside the microscope system 1, but it may instead be provided inside the microscope system 1.
  • the control unit 36 comprehensively controls the operation of the entire microscope system 1 based on various commands input from the input unit 31 and various information input from the microscope apparatus 2.
  • the control unit 36 includes an illumination luminance distribution control unit 361 and a stage position control unit 362.
  • The illumination luminance distribution control unit 361 outputs a field stop control signal for adjusting the aperture size of the incident field stop 214 or the transmission field stop 224, based on the luminance distribution information created by the illumination luminance distribution estimation unit 341, thereby controlling the luminance distribution of the light irradiated to the imaging range.
  • FIG. 5 is a schematic diagram illustrating a luminance profile of irradiation light in the imaging range V.
  • FIG. 5 shows a luminance profile in the u direction passing through the optical axis L0 of the objective lens 205.
  • When the aperture is fully opened by the diaphragm members 214a and 214b, the image of the opening end portion 214d of the incident field stop 214 (or the transmission field stop 224) does not overlap the imaging range V.
  • The incident field stop 214 or the transmission field stop 224, together with the illumination luminance distribution control unit 361 that controls the aperture sizes of these field stops, constitutes an illumination control means for controlling the luminance distribution of the light irradiated to the imaging range V.
  • the stage position control unit 362 is an imaging range movement control unit that changes the imaging range V by moving the sample stage 202 on which the sample SP is placed relative to the objective lens 205.
  • The stage position control unit 362 outputs a stage control signal for operating the stage position changing unit 209 based on the stage movement amount calculated by the positioning unit 342, thereby controlling the position of the sample stage 202 at each imaging timing of the imaging unit 200.
  • the calculation unit 34 and the control unit 36 may be configured by dedicated hardware, or may be configured by reading a predetermined program into hardware such as a CPU.
  • The storage unit 33 further stores a control program for controlling the operations of the microscope apparatus 2 and the control device 3, an image processing program for causing the calculation unit 34 to execute various arithmetic processes including image processing, and various parameters and setting information used during the execution of these programs.
  • FIG. 6 is a flowchart showing the operation of the microscope system 1.
  • the sample SP that has been subjected to fluorescence staining is irradiated with excitation light via an epi-illumination optical system, and an image of fluorescence generated from the sample SP (fluorescence observation image) is acquired.
  • In step S10, an observation magnification for observing the sample SP is set. Specifically, the revolver 230 is rotated so that the objective lens 205 having the desired magnification is disposed at the position facing the specimen stage 202.
  • In step S11, the microscope system 1 acquires the luminance distribution of the light (excitation light) irradiated onto the imaging range of the imaging unit 200.
  • FIG. 7 is a flowchart showing a process for acquiring the luminance distribution of light irradiated to the imaging range.
  • In step S11, a fluorescent sample that uniformly generates fluorescence according to the intensity of the excitation light is first set on the specimen stage 202 of the microscope apparatus 2 as a standard sample.
  • In step S111, the illumination luminance distribution control unit 361 outputs a field stop control signal and adjusts the aperture size of the incident-light field stop 214 (the positions of the diaphragm members 214a and 214b). As an initial setting, the aperture is set to its maximum size.
  • The control unit 36 then causes the imaging unit 200 to image the standard sample set on the specimen stage 202.
  • In step S113, the calculation unit 34 acquires an image of the standard sample based on the image data output from the imaging unit 200.
  • In step S114, the illumination luminance distribution estimation unit 341 estimates the luminance distribution of the excitation light irradiated onto the imaging range from the luminance values of the pixels constituting the image of the standard sample, and creates luminance distribution information. This estimation relies on the correlation between the excitation-light intensity and the fluorescence intensity.
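The estimation in step S114 can be sketched as follows. This is a minimal Python illustration, not part of the patent: the function name, the box-filter smoothing, and the assumption that the fluorescence luminance of the standard sample is directly proportional to the excitation intensity are all illustrative.

```python
import numpy as np

def estimate_illumination_profile(standard_image, smooth=5):
    """Estimate the u-direction luminance profile of the excitation light from
    an image of a uniformly fluorescing standard sample. The normalized image
    luminance is used directly as the estimate (fluorescence assumed
    proportional to excitation intensity)."""
    img = np.asarray(standard_image, dtype=float)
    profile = img.mean(axis=0)                 # average over rows -> u profile
    kernel = np.ones(smooth) / smooth          # box filter against sensor noise
    profile = np.convolve(profile, kernel, mode="same")
    return profile / profile.max()             # flat region normalized to ~1.0
```

The returned profile has its flat region near 1.0, so the attenuation region appears as values decaying toward zero at the edges.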
  • In step S115, the illumination luminance distribution control unit 361 determines whether the luminance distribution of the excitation light irradiated onto the imaging range is appropriate, based on the luminance distribution information created by the illumination luminance distribution estimation unit 341. This determination may instead be made by the user by inspecting the image of the standard sample.
  • FIG. 8 is a schematic diagram illustrating an example of a preferable luminance distribution of light (excitation light) irradiated to the imaging range.
  • The luminance profile P1 shown in FIG. 8 is the profile, in the u direction (u(1)min ≤ u ≤ u(1)max), of the excitation light irradiated onto the imaging range V1.
  • Within the imaging range V1, the central region where the intensity of the excitation light is substantially constant is referred to as the flat region, the region where the intensity is zero as the light-shielding region, and the peripheral region where the light attenuates as the attenuation region.
  • In FIG. 8, the light-shielding region of the imaging range V1 is hatched. Note that the attenuation rate in the attenuation region may be controlled so that the intensity of the excitation light becomes exactly zero at the edge of the imaging range V1; in that case no light-shielding region occurs.
  • If the luminance distribution is appropriate (step S115: Yes), the process returns to the main routine. If it is not (step S115: No), the process returns to step S111; in that case, the illumination luminance distribution control unit 361 adjusts the epi-illumination field stop 214 so that the luminance distribution shifts in an appropriate direction, based on the distribution estimated in step S114.
  • In step S12 following step S11, the positioning unit 342 determines, based on the final luminance distribution acquired in step S11, the stage movement amount per imaging and the coordinate values of the overlapping region that serves as the pasting margin when adjacent imaging ranges are joined.
  • Specifically, the stage movement amount ΔX is determined so that the attenuation regions of the luminance profile P1 overlap between the adjacent imaging ranges V1 and V2. More precisely, ΔX is determined from the luminance profile P1 so that the cumulative amount of excitation light in the overlapping region C of the imaging ranges V1 and V2 is approximately equal to the intensity in the flat region.
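The determination of the stage movement amount from the luminance profile can be sketched as a small search. This is hypothetical Python, not the patent's method: the linear-ramp profile and the brute-force search over candidate shifts are illustrative.

```python
import numpy as np

def stage_shift_for_uniform_dose(profile):
    """Find the shift (in profile samples) between adjacent fields such that
    the summed intensity of two overlapping exposures best matches the
    flat-region intensity across the overlap."""
    flat = profile.max()
    width = len(profile)
    best_shift, best_err = width, np.inf
    for shift in range(width // 2, width):
        total = profile.copy()
        overlap = width - shift
        # Add the intensity contributed by the next field in the overlap tail.
        total[shift:] += profile[:overlap]
        err = np.abs(total[shift:] - flat).max()
        if err < best_err:
            best_err, best_shift = err, shift
    return best_shift
```

For a profile with complementary linear ramps, the best shift places the ramps exactly on top of each other so that the two doses sum to the flat level.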
  • the positioning unit 342 determines a region on the image corresponding to the region C overlapping between the imaging ranges V1 and V2 as a superimposed region, and acquires the coordinate value of the superimposed region in the coordinate system on the image.
  • the positioning unit 342 similarly determines the stage movement amount in the Y direction and the coordinate value of the overlapping region.
  • In step S13, the microscope system 1 sequentially images the observation sample SP set on the specimen stage 202 while changing the imaging range. That is, the operation of irradiating the sample SP with excitation light having the luminance distribution acquired in step S11, performing imaging, and then moving the specimen stage 202 by the stage movement amount determined in step S12 is repeated until the entire sample SP has been covered. As a result, image data for images with different imaging ranges are sequentially output from the imaging unit 200.
  • In this way, the cumulative amount of light received by a peripheral region exposed to the excitation light multiple times is approximately the same as the amount received by the flat region. That is, the cumulative amount of excitation light applied to the sample SP is substantially constant over the entire region, regardless of position.
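The claim that the accumulated dose becomes uniform can be checked numerically. Illustrative Python only; the ramp-shaped profile and the field spacing are assumptions.

```python
import numpy as np

def accumulated_dose(profile, shift, n_fields):
    """Total excitation dose at each point when n adjacent fields are exposed,
    each displaced by `shift` samples from the previous one."""
    total = np.zeros(shift * (n_fields - 1) + len(profile))
    for k in range(n_fields):
        start = k * shift
        total[start:start + len(profile)] += profile  # each exposure adds dose
    return total
```

With complementary ramps and the matching shift, every interior point of the tiled strip receives the same dose as the flat region; only the two outermost ramps remain partially exposed.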
  • In step S14, the calculation unit 34 sequentially acquires images with different imaging ranges based on the image data output from the imaging unit 200. At this time, the calculation unit 34 also acquires, from the position detection unit 236 (see FIG. 2), the position information of the specimen stage 202 at the time each image was captured.
  • In step S15, the image composition unit 343 pastes together the sequentially input images based on the coordinate values of the overlapping region determined in step S12.
  • FIG. 9 is a flowchart showing image combining processing.
  • FIG. 10 is a schematic diagram for explaining image pasting processing, and shows a case where images M1 and M2 whose imaging ranges are adjacent in the u direction are pasted.
  • The synthesis unit 343b calculates the relative position of the trimmed image M2' with respect to the trimmed image M1' based on the position information of the specimen stage 202 acquired from the position detection unit 236.
  • In step S153, the synthesis unit 343b determines the superimposing region over which the images M1' and M2' are to be superimposed, based on the coordinate values of the overlapping region calculated by the positioning unit 342.
  • In step S154, the synthesis unit 343b pastes the images M1' and M2' together so that they overlap in the superimposing region determined in step S153, as shown in FIG. 10B.
  • This pasting process is performed sequentially each time image data is input from the imaging unit 200, creating a wide-field composite image. For example, for imaging ranges arranged in a 3 × 2 matrix as shown in FIG. 11, imaging is performed while moving the imaging range along the U-shaped path indicated by the arrow in FIG. 11, and the pasting is performed in the order of acquisition. That is, after the trimmed images M1' to M3' are pasted sequentially in the u direction, the trimmed image M4' is pasted in the -v direction, and the trimmed images M5' and M6' are then pasted sequentially in the -u direction. Thereafter, the process returns to the main routine.
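The U-shaped acquisition order can be sketched as follows. Hypothetical Python; it assumes the 3 × 2 arrangement of FIG. 11 is two rows of three fields, and the row/column indexing is illustrative.

```python
def serpentine_order(rows, cols):
    """Visit a rows x cols grid of imaging ranges along a U-shaped (serpentine)
    path, reversing the scan direction on every other row."""
    order = []
    for r in range(rows):
        # Even rows scan left-to-right, odd rows right-to-left.
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        order.extend((r, c) for c in cs)
    return order
```

This ordering keeps each newly acquired field adjacent to the previously pasted one, so every image can be pasted as soon as it arrives.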
  • In step S16, the image luminance distribution correction unit 344 corrects the luminance distribution of the composite image created in step S15.
  • FIG. 12 is a flowchart showing the correction process of the luminance distribution in the composite image.
  • FIG. 13 is a schematic diagram for explaining luminance distribution correction processing in a composite image.
  • In step S161, the luminance unevenness calculation unit 344a acquires, from the illumination luminance distribution estimation unit 341, the luminance distribution of the fluorescence generated when the standard sample was imaged (see step S11), and calculates from it the luminance unevenness that occurs when the overlapping regions of adjacent images are superimposed. As shown in FIG. 13A, the difference between the fluorescence intensity I(u) and the fluorescence intensity I0 is the luminance unevenness in the u direction.
  • In step S162, the correction gain calculation unit 344b calculates the correction gain G(u) at each point in the superimposed region based on the fluorescence intensity calculated in step S161, and creates a gain map (see FIG. 13B).
  • Alternatively, the image luminance distribution correction unit 344 may capture the luminance distribution information at the time the illumination luminance distribution estimation unit 341 estimates the luminance distribution of the excitation light (see step S11), and create and hold the gain map at that point.
  • In step S163, the correction unit 344c corrects the luminance unevenness in the superimposed region of the composite image using the gain map created in step S162. Thereafter, the process returns to the main routine.
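Steps S162 and S163 can be sketched together. Illustrative Python only: taking I0 as the maximum of the fluorescence profile, and applying the gain row-wise, are assumptions, not the patent's exact procedure.

```python
import numpy as np

def gain_map(fluorescence_profile):
    """Correction gain G(u) = I0 / I(u), where I0 is taken as the flat-region
    (maximum) fluorescence intensity of the standard sample."""
    I = np.asarray(fluorescence_profile, dtype=float)
    I0 = I.max()
    return I0 / np.clip(I, 1e-6, None)   # clip guards against division by zero

def correct_row(image_row, gains):
    """Apply the gain map to one row of the composite image."""
    return np.asarray(image_row, dtype=float) * gains
```

Points where the accumulated fluorescence falls below I0 receive a gain above 1 and are boosted back to the flat-region level.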
  • In step S17 following step S16, the calculation unit 34 outputs the created composite image, displays it on an external device such as a display device, and stores it in the storage unit 33. The operation of the microscope system 1 then ends.
  • As described above, in the first embodiment, the luminance distribution of the irradiation light is controlled so that the light attenuates in the peripheral region of the imaging range, and adjacent imaging ranges overlap in that peripheral region; the accumulated amount of irradiation light in the peripheral region can therefore be brought to the same level as in the central portion of the imaging range. That is, since the accumulated amount of light irradiating the sample SP can be made uniform overall, local deterioration of the sample SP due to light irradiation can be suppressed.
  • Further, since the luminance distribution of the irradiation light is controlled by adjusting the aperture size so that the image of the aperture end of the epi-illumination field stop 214 overlaps the imaging range, light irradiation suited to the observation conditions is possible.
  • In the above description, the case where fluorescence observation is performed in the microscope apparatus 2 has been described; however, the first embodiment can also be applied to other spectroscopic methods performed in the microscope apparatus 2, such as transmission observation, phase-contrast observation, and differential interference observation. In such cases, the aperture size of the incident-light field stop 214 or the transmitted-light field stop 224 may be controlled according to the spectroscopic method.
  • In the above description, the position of the observation optical system including the objective lens 205 is fixed and the imaging field of view is changed by moving the sample stage 202 on which the sample SP is placed; however, the sample SP side may instead be fixed and the observation optical system moved. Alternatively, both the sample SP and the observation optical system may be moved relative to each other.
  • FIG. 14 is a schematic diagram illustrating the arrangement of the field stop in Modification 1-1.
  • FIG. 15 is a graph showing the luminance profile of the irradiation light according to the arrangement shown in FIG.
  • In Modification 1-1, the luminance distribution of the irradiation light is controlled by adjusting both the aperture size of the field stop and the position of the field stop along the optical axis.
  • In this case, the rise of the luminance distribution of the irradiation light is relatively sharp, and the contrast at the boundary of the irradiated region is clear.
  • By adjusting the position of the incident-light field stop 214 in the Az direction, the width of the attenuation region and the rate of luminance attenuation within it can be controlled to desired values.
  • Similarly, the luminance distribution of the irradiation light can be controlled by adjusting the position of the transmitted-light field stop 224 in the optical axis direction.
  • In the embodiment above, the incident-light field stop 214 and the transmitted-light field stop 224 are configured using the diaphragm members 214a and 214b (see FIGS. 3 and 4), which are made of a light-shielding material.
  • Instead, the diaphragm members 214a and 214b may be formed of a member whose transmittance continuously decreases with distance from the opening end.
  • Such a transmittance gradient can be produced by depositing a metal thin film on a transparent member such as glass and adjusting the film thickness.
  • In this case, the irradiation light can be gradually attenuated in the peripheral region of the imaging range without moving the diaphragm members 214a and 214b along the optical axis Az.
  • Furthermore, by adjusting the magnitude and the rate of change of the transmittance of the diaphragm members 214a and 214b, a luminance distribution of the irradiation light suited to the spectroscopic method and the type of fluorescence can be realized.
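The relation between the deposited film thickness and the resulting transmittance can be sketched with the Beer-Lambert law. Illustrative Python only; the absorption coefficient value is hypothetical, and real metal films also reflect light, which this sketch ignores.

```python
import math

def transmittance(thickness_nm, alpha_per_nm=0.02):
    """Beer-Lambert sketch: transmittance of a deposited metal film as a
    function of its thickness. alpha_per_nm is a hypothetical absorption
    coefficient."""
    return math.exp(-alpha_per_nm * thickness_nm)

def thickness_for(target_transmittance, alpha_per_nm=0.02):
    """Film thickness needed to reach a target transmittance."""
    return -math.log(target_transmittance) / alpha_per_nm
```

Grading the thickness from zero at the opening end to larger values outward yields a transmittance that decays continuously, which is the behavior Modification 1-2 relies on.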
  • Next, Modification 1-3 of Embodiment 1 of the present invention will be described. In Embodiment 1, various control information (field stop position information, the stage movement amount, and the coordinate values of the overlapping region) is acquired using a standard sample; such control information may instead be acquired using the sample to be observed itself.
  • FIG. 16 is a diagram showing a configuration of a microscope system according to Embodiment 2 of the present invention.
  • the microscope system 4 according to the second embodiment includes a microscope device 2 and a control device 5.
  • the configuration and operation of the microscope apparatus 2 are the same as those in the first embodiment.
  • The control device 5 differs from the control device 3 shown in FIG. in that its storage unit 51 further includes a table storage unit 511.
  • the configuration and operation of each unit of the control device 5 other than the storage unit 51 are the same as those in the first embodiment.
  • The table storage unit 511 stores a luminance distribution table that holds, for each observation magnification of the microscope apparatus 2, the luminance distribution of the irradiation light together with the field stop control information for realizing that distribution, that is, position information of the diaphragm members 214a and 214b in the Ax, Ay, and Az directions. Such a luminance distribution table may be created from the results of imaging a standard sample in advance in the microscope apparatus 2, or from separate measurements or simulations.
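A luminance distribution table of this kind can be sketched as a simple lookup. Hypothetical Python: the magnifications and the diaphragm position values below are invented for illustration and do not come from the patent.

```python
# Hypothetical table keyed by observation magnification; each entry holds the
# Ax/Ay/Az positions of the diaphragm members 214a, 214b (units arbitrary).
LUMINANCE_TABLE = {
    10: {"ax": 1.2, "ay": 1.2, "az": 0.5},
    20: {"ax": 0.8, "ay": 0.8, "az": 0.7},
    40: {"ax": 0.5, "ay": 0.5, "az": 0.9},
}

def field_stop_settings(magnification):
    """Return the stored field stop control info for the selected objective."""
    if magnification not in LUMINANCE_TABLE:
        raise ValueError("no table entry for magnification %sx" % magnification)
    return LUMINANCE_TABLE[magnification]
```

Because the table is keyed by the encoder-reported magnification, no standard-sample imaging or iterative stop adjustment is needed at observation time.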
  • FIG. 17 is a flowchart showing the operation of the microscope system 4.
  • In step S20, an observation magnification for observing the sample SP is set. Specifically, the revolver 230 is rotated so that the objective lens 205 having the desired magnification faces the sample stage 202. As a result, the output value of the encoder provided in the revolver 230 is input to the control device 5, and the control unit 36 acquires the observation magnification of the microscope apparatus 2 based on that value.
  • In step S21, the control unit 36 acquires from the luminance distribution table the luminance distribution of the irradiation light and the field stop control information corresponding to the observation magnification of the microscope apparatus 2. The positioning unit 342 then calculates the stage movement amount per imaging and the coordinate values of the overlapping region between adjacent images based on the acquired luminance distribution.
  • the subsequent operations in steps S13 to S17 are the same as those in the first embodiment.
  • In the second embodiment, there is no need to perform work such as imaging a standard sample in advance and adjusting the position of the field stop. Therefore, a wide-field image capturing the sample SP can be acquired in a shorter time.
  • In the second embodiment, the stage movement amount and the coordinate values of the superimposed region are calculated from the luminance distribution of the irradiation light acquired from the luminance distribution table; however, the stage movement amount and the coordinate values of the superimposed region may themselves be associated with the observation magnification and tabulated in advance.
  • FIG. 18 is a diagram showing a configuration of a microscope system according to Embodiment 3 of the present invention.
  • the microscope system 6 according to the third embodiment includes a microscope device 2 and a control device 7. Among these, the configuration and operation of the microscope apparatus 2 are the same as those in the first embodiment.
  • The control device 7 differs from the control device 3 shown in FIG. in that its calculation unit 71 has an image composition unit 711 in place of the image composition unit 343.
  • The configuration and operation of each unit of the control device 7 other than the image composition unit 711 are the same as those in the first embodiment.
  • The image composition unit 711 includes a trimming unit 711a that trims the region used for composition from each image acquired via the image acquisition unit 32, and a relative position correction unit 711b that corrects the relative position between images based on features in the superimposed region between adjacent images.
  • the operation of the trimming unit 711a is the same as that of the trimming unit 343a shown in FIG.
  • In the first embodiment, the stage movement amount and the coordinate values of the overlapping region between adjacent images are calculated based on the luminance distribution of the irradiation light, and imaging is performed while moving the specimen stage 202 according to that movement amount. However, the specimen stage 202 may be displaced by backlash or similar effects, and in that case the pasting accuracy decreases. In the third embodiment, therefore, a feature region is extracted from the superimposed region of the actually acquired images and alignment is performed individually based on it, improving the accuracy with which the images are pasted together.
  • FIG. 19 is a block diagram showing the configuration of the relative position correction unit 711b.
  • The relative position correction unit 711b includes a shading correction unit 711b-1 that performs shading correction on the superimposed region of each image, a feature region extraction unit 711b-2 that extracts feature regions from the shading-corrected superimposed region, a search unit 711b-3 that searches for highly correlated areas between the superimposed regions based on the extracted feature regions, and a position correction unit 711b-4 that corrects the relative position between adjacent images based on the search result.
  • FIG. 20 is a flowchart illustrating image combining processing according to the third embodiment. Note that the processing in steps S151 to S153 is the same as that in the first embodiment (see FIG. 9).
  • step S301 following step S153, the shading correction unit 711b-1 performs shading correction on the superimposed region of each image.
  • FIG. 21 is a schematic diagram illustrating part of the superimposed region of two adjacent images. The overlapping region C1 shown in FIG. 21 is the image region corresponding to the right-end region C of the imaging range V1 shown in FIG. 8, and the overlapping region C2 is the image region corresponding to the left-end region C of the imaging range V2.
  • FIG. 22 is a schematic diagram for explaining the shading correction processing for the overlapping region.
  • Specifically, the shading correction unit 711b-1 normalizes the luminance by performing shading correction on each of the overlapping regions C1 and C2.
  • For this purpose, the luminance profiles p1 and p2 of the shading components are obtained in advance for the overlapping regions C1 and C2, respectively, as shown in FIG. 22A. Then, as shown in FIG. 22B, a correction gain is applied to the superimposed region C1 that, taking the median value Imed of the shading profile p1 as the reference, reduces the luminance on the inner side of the image (left side in the figure) and increases it on the outer side (right side in the figure).
  • By adjusting the gain in each of the overlapping regions C1 and C2 with reference to the median value Imed of the shading profiles p1 and p2, the gain increase applied to dark portions can be kept small, and amplification of noise can be suppressed.
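The median-referenced shading correction can be sketched as follows. Illustrative Python; the one-dimensional shading profile and the element-wise gain are simplifying assumptions.

```python
import numpy as np

def median_referenced_correction(overlap, shading_profile):
    """Normalize an overlap region by its shading profile, using the profile's
    median I_med as the reference level: pixels brighter than I_med are
    attenuated (gain < 1), darker ones are boosted (gain > 1), which keeps the
    boost applied to dark, noisy pixels moderate."""
    p = np.asarray(shading_profile, dtype=float)
    gain = np.median(p) / p
    return np.asarray(overlap, dtype=float) * gain
```

After this normalization the two overlap regions have comparable luminance, so the correlation search of the next step is not dominated by the shading gradient.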
  • FIG. 23 shows an example in which two feature regions c1 and c2 are extracted from the overlapping region C1'.
  • In step S303, the search unit 711b-3 searches the other superimposed region C2' for an area cn' that is highly correlated with each feature region cn extracted from the superimposed region C1', and detects its position.
  • In step S304, the position correction unit 711b-4 calculates the amount of positional shift between the superimposed regions C1' and C2' based on the detection result of step S303, and corrects by that amount the relative position coordinates used when pasting together the images that include the overlapping regions C1 and C2.
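The correlation search of step S303 and the shift calculation of step S304 can be sketched in one dimension. Illustrative Python; normalized cross-correlation is one plausible realization of the "highly correlated area" search, not necessarily the patent's exact measure.

```python
import numpy as np

def find_shift(template, strip):
    """Locate `template` (a feature region c_n from one overlap) inside `strip`
    (the other overlap) by normalized cross-correlation; returns the offset of
    the best match."""
    t = template - template.mean()
    best_off, best_score = 0, -np.inf
    for off in range(len(strip) - len(template) + 1):
        w = strip[off:off + len(template)]
        wc = w - w.mean()
        denom = np.linalg.norm(t) * np.linalg.norm(wc)
        # Uniform windows carry no information; score them as worst.
        score = float(t @ wc) / denom if denom > 0 else -1.0
        if score > best_score:
            best_score, best_off = score, off
    return best_off
```

The difference between the found offset and the stage-predicted offset is the positional shift used to correct the relative position coordinates.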
  • In step S305, the shading correction unit 711b-1 restores the original shading of each of the overlapping regions C1 and C2, undoing the correction performed in step S301.
  • In step S306, the synthesis unit 343b pastes together the images including the overlapping regions C1 and C2 based on the relative position coordinates corrected in step S304. Thereafter, the process returns to the main routine (see FIG. 6).
  • As described above, in the third embodiment the luminance in each superimposed region is normalized by shading correction, which makes correlation calculation based on the feature regions present in the superimposed region possible. Positional shifts that occur when adjacent images are pasted together can therefore be corrected, improving the pasting accuracy.
  • In particular, since the background of a fluorescence image is a dark region with few features, using the feature regions extracted from the overlapping region as windows for the correlation calculation makes it possible to calculate the positional shift between images accurately.
  • the image combining process in the third embodiment may be applied to the second embodiment.
  • In the third embodiment, the correction gain is determined with reference to the median value Imed of the shading profiles p1 and p2; alternatively, the shading correction may increase the gain on the low-luminance side with reference to the maximum value Imax of the profiles. In that case, the correlation calculation can be based on feature regions cn that have sufficient luminance, so the positional shift between images can be calculated even more accurately.
  • The present invention is not limited to the first to third embodiments and their modifications; various inventions can be formed by appropriately combining the constituent elements disclosed in them. For example, some components may be excluded from all the components shown in the first to third embodiments and modifications, or components shown in different embodiments may be combined as appropriate.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Microscopes, Condensers (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
PCT/JP2014/071395 2013-09-13 2014-08-13 撮像装置、顕微鏡システム、撮像方法及び撮像プログラム Ceased WO2015037387A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/065,010 US10254530B2 (en) 2013-09-13 2016-03-09 Imaging apparatus, microscope system, imaging method, and computer-readable recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-190820 2013-09-13
JP2013190820A JP6270388B2 (ja) 2013-09-13 2013-09-13 撮像装置、顕微鏡システム、撮像方法及び撮像プログラム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/065,010 Continuation US10254530B2 (en) 2013-09-13 2016-03-09 Imaging apparatus, microscope system, imaging method, and computer-readable recording medium

Publications (1)

Publication Number Publication Date
WO2015037387A1 true WO2015037387A1 (ja) 2015-03-19

Family

ID=52665506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/071395 Ceased WO2015037387A1 (ja) 2013-09-13 2014-08-13 撮像装置、顕微鏡システム、撮像方法及び撮像プログラム

Country Status (3)

Country Link
US (1) US10254530B2 (en)
JP (1) JP6270388B2 (en)
WO (1) WO2015037387A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6494294B2 (ja) * 2014-05-15 2019-04-03 キヤノン株式会社 画像処理装置、及び撮像システム
JP6253509B2 (ja) * 2014-05-21 2017-12-27 オリンパス株式会社 画像表示方法、制御装置、顕微鏡システム
JP2017068302A (ja) * 2015-09-28 2017-04-06 株式会社Screenホールディングス 画像作成装置および画像作成方法
WO2017197217A1 (en) * 2016-05-12 2017-11-16 Life Technologies Corporation Systems, methods, and apparatuses for image capture and display
JP2018054690A (ja) * 2016-09-26 2018-04-05 オリンパス株式会社 顕微鏡撮像システム
JP2018066845A (ja) * 2016-10-19 2018-04-26 オリンパス株式会社 顕微鏡システム
WO2018109796A1 (ja) * 2016-12-12 2018-06-21 株式会社オプティム 遠隔制御システム、遠隔制御方法、およびプログラム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007065669A (ja) * 2005-08-31 2007-03-15 Trestle Acquisition Corp イメージ・ブロックを合成し顕微鏡スライドのシームレスな拡大イメージを作成するシステム及び方法
JP2008191427A (ja) * 2007-02-05 2008-08-21 Olympus Corp バーチャルスライド作成装置、バーチャルスライド作成方法およびバーチャルスライド作成プログラム
JP2010134374A (ja) * 2008-12-08 2010-06-17 Olympus Corp 顕微鏡システム及び該動作方法
JP2012003214A (ja) * 2010-05-19 2012-01-05 Sony Corp 情報処理装置、情報処理方法、プログラム、撮像装置、及び光学顕微鏡を搭載した撮像装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7456377B2 (en) 2004-08-31 2008-11-25 Carl Zeiss Microimaging Ais, Inc. System and method for creating magnified images of a microscope slide
JP2012090051A (ja) * 2010-10-19 2012-05-10 Sony Corp 撮像装置及び撮像方法
JP5997039B2 (ja) * 2012-12-26 2016-09-21 株式会社日立ハイテクノロジーズ 欠陥検査方法および欠陥検査装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007065669A (ja) * 2005-08-31 2007-03-15 Trestle Acquisition Corp イメージ・ブロックを合成し顕微鏡スライドのシームレスな拡大イメージを作成するシステム及び方法
JP2008191427A (ja) * 2007-02-05 2008-08-21 Olympus Corp バーチャルスライド作成装置、バーチャルスライド作成方法およびバーチャルスライド作成プログラム
JP2010134374A (ja) * 2008-12-08 2010-06-17 Olympus Corp 顕微鏡システム及び該動作方法
JP2012003214A (ja) * 2010-05-19 2012-01-05 Sony Corp 情報処理装置、情報処理方法、プログラム、撮像装置、及び光学顕微鏡を搭載した撮像装置

Also Published As

Publication number Publication date
US10254530B2 (en) 2019-04-09
US20160187638A1 (en) 2016-06-30
JP2015055849A (ja) 2015-03-23
JP6270388B2 (ja) 2018-01-31

Similar Documents

Publication Publication Date Title
JP6270388B2 (ja) 撮像装置、顕微鏡システム、撮像方法及び撮像プログラム
US9596416B2 (en) Microscope system
US10379329B2 (en) Microscope system and setting value calculation method
US9990752B2 (en) Image processing device, imaging device, microscope system, image processing method, and computer-readable recording medium
JP5911296B2 (ja) 画像処理装置、撮像装置、顕微鏡システム、画像処理方法、及び画像処理プログラム
US20120075455A1 (en) Imaging method and microscope device
JP2012237693A (ja) 画像処理装置、画像処理方法及び画像処理プログラム
JP2009163155A (ja) 顕微鏡装置
US20190025213A1 (en) Microscopy system, microscopy method, and computer-readable storage medium
JP6099477B2 (ja) 撮像装置、顕微鏡システム及び撮像方法
JP6061619B2 (ja) 顕微鏡システム
US10656406B2 (en) Image processing device, imaging device, microscope system, image processing method, and computer-readable recording medium
JP6312410B2 (ja) アライメント装置、顕微鏡システム、アライメント方法、及びアライメントプログラム
US20170243386A1 (en) Image processing apparatus, imaging apparatus, microscope system, image processing method, and computer-readable recording medium
JP6253509B2 (ja) 画像表示方法、制御装置、顕微鏡システム
JP6246551B2 (ja) 制御装置、顕微鏡システム、制御方法およびプログラム
JP5996462B2 (ja) 画像処理装置、顕微鏡システム及び画像処理方法
JP6346455B2 (ja) 画像処理装置及び顕微鏡システム
JP7369801B2 (ja) 顕微鏡システム、設定探索方法、及び、プログラム
JP2018116197A (ja) 顕微鏡システム、貼り合わせ画像生成プログラム、及び貼り合わせ画像生成方法
CN115222613A (zh) 用于产生亮度校正图像的方法和设备
JP2013092636A (ja) 画像処理装置、顕微鏡システム、画像処理方法、画像処理プログラム
JP2017083790A (ja) 画像取得装置、及びそれを用いた画像取得方法
JP2013160816A (ja) 顕微鏡
WO2017068655A1 (ja) 画像処理装置、画像処理システム、顕微鏡システム、画像処理方法、及び画像処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14844410

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14844410

Country of ref document: EP

Kind code of ref document: A1