WO2016166871A1 - Microscope observation system, microscope observation method, and microscope observation program - Google Patents
Microscope observation system, microscope observation method, and microscope observation program
- Publication number
- WO2016166871A1 (application PCT/JP2015/061742)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- observation
- imaging
- unit
- omnifocal
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/24—Base structure
- G02B21/241—Devices for focusing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/02—Objectives
- G02B21/025—Objectives with variable magnification
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/06—Means for illuminating specimens
- G02B21/08—Condensers
- G02B21/088—Condensers for both incident illumination and transillumination
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/24—Base structure
- G02B21/26—Stages; Adjusting means therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/368—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements details of associated display arrangements, e.g. mounting of LCD monitor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10148—Varying focus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Definitions
- the present invention relates to a microscope observation system, a microscope observation method, and a microscope observation program for observing a subject through an image acquired in a microscope apparatus.
- in known techniques, a multifocal image generated by superimposing a Z-stack image is restored using a blur function, or an in-focus region is extracted from each of a plurality of images having different focal planes and the extracted regions are combined.
- Patent Document 1 discloses a technique in which two images focused respectively on the near-end side and the far-end side of a subject are acquired, together with an omnifocal image generated by imaging while sweeping the image sensor from the near-end side to the far-end side of the subject; using the omnifocal image and the images focused on the near-end and far-end sides, the amount of blur in each partial region of the images is calculated, the distance is acquired from the blur, and a distance map is created.
- according to Patent Document 1, generating a distance map makes it possible to visually grasp the Z position of a structure shown in the image.
- with the distance map alone, however, the front-rear relationship between structures in the Z direction cannot be visually reproduced, and it is difficult for the user to grasp it intuitively.
- the present invention has been made in view of the above, and an object thereof is to provide a microscope observation system, a microscope observation method, and a microscope observation program that can generate, in a shorter time than before, an image that allows the user to visually and intuitively grasp the Z-direction position of each structure in the image and the front-rear relationship between structures, while suppressing the data amount and the amount of calculation in image processing more than before.
- to solve the problems described above, a microscope observation system according to the present invention includes: an imaging unit that captures a subject image generated by an observation optical system of a microscope and thereby acquires an image; an imaging control unit that shifts the positions of the focal plane and the visual field of the observation optical system during one exposure period of the imaging unit, thereby causing the imaging unit to acquire a multifocal superimposed image containing image information from a plurality of planes along the optical axis of the observation optical system; a shift amount acquisition processing unit that acquires the shift amount by which the position of the visual field is shifted; an omnifocal image generation unit that generates a plurality of omnifocal images based on the plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts; and a display unit that displays the plurality of omnifocal images.
- in the microscope observation system, the imaging control unit determines the imaging start position for acquiring each of the plurality of multifocal superimposed images based on the shift amount acquired by the shift amount acquisition processing unit.
- the microscope observation system may further include: an observation region determination processing unit that determines, as an observation region, a region selected from any one of the plurality of omnifocal images in response to an external operation; and a target slice acquisition unit that extracts the region corresponding to the observation region from each omnifocal image other than the one in which the observation region was selected, and acquires the position of the slice containing the structure in the subject corresponding to the observation region, based on the displacement between the position of the observation region in the selected omnifocal image and the position of the extracted region in each other omnifocal image.
- the microscope observation system may further include an imaging position determination processing unit that, based on the position of the observation region in the omnifocal image in which the observation region was selected, determines the imaging position at which a multifocal superimposed image is to be acquired by shifting the position of the field of view of the observation optical system during one exposure period of the imaging unit.
- in the microscope observation system, the imaging position determination processing unit determines the imaging position so that the position of the observation region does not change between the plurality of omnifocal images.
- a microscope observation method according to the present invention is a method in which a subject image generated by an observation optical system of a microscope is captured by an imaging unit to acquire an image, and includes: an imaging control step of shifting the positions of the focal plane and the visual field of the observation optical system during one exposure period of the imaging unit to acquire a multifocal superimposed image containing image information from a plurality of planes along the optical axis of the observation optical system; an omnifocal image generation step of generating a plurality of omnifocal images based on a plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts; and a display step of displaying the plurality of omnifocal images on a display unit.
- a microscope observation program according to the present invention causes a computer, which acquires an image by capturing a subject image generated by an observation optical system of a microscope using an imaging unit, to execute: an imaging control step of performing control to acquire a multifocal superimposed image containing image information from a plurality of planes along the optical axis of the observation optical system by shifting the positions of the focal plane and the visual field of the observation optical system during one exposure period of the imaging unit; an omnifocal image generation step of generating a plurality of omnifocal images based on a plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts of the visual field position; and a display step of displaying the plurality of omnifocal images on a display unit.
- in the present invention, the multifocal superimposed image is acquired by shifting the positions of the focal plane and the visual field during one exposure period of the imaging unit. Compared with acquiring a Z-stack image and generating a multifocal image by image processing, the imaging time can therefore be greatly shortened, and the data amount and the amount of calculation can be greatly reduced.
- furthermore, since a plurality of omnifocal images generated under a plurality of conditions with different shift amounts of the visual field position are displayed on the screen, the user can, by comparing these omnifocal images, visually and intuitively grasp the Z-direction positions of the structures shown in the images and the front-rear relationship between the structures.
- FIG. 1 is a block diagram showing a configuration example of a microscope observation system according to Embodiment 1 of the present invention.
- FIG. 2 is a schematic diagram illustrating a configuration example of the microscope apparatus illustrated in FIG. 1.
- FIG. 3 is a flowchart showing the operation of the microscope observation system shown in FIG.
- FIG. 4 is a schematic diagram for explaining a process for acquiring a plurality of multi-focus superimposed images.
- FIG. 5 is a flowchart showing details of the processing for acquiring a plurality of multi-focus superimposed images.
- FIG. 6 is a flowchart showing details of a process for generating a plurality of omnifocal images.
- FIG. 7 is a schematic diagram illustrating an example in which two omnifocal images are displayed side by side on the display device illustrated in FIG. 1.
- FIG. 8 is a schematic diagram for explaining a method of acquiring a multifocal superimposed image in Modification 1 of Embodiment 1 of the present invention.
- FIG. 9 is a schematic diagram for explaining a method for acquiring a multifocal superimposed image in Modification 1 of Embodiment 1 of the present invention.
- FIG. 10 is a flowchart showing details of a process for acquiring a plurality of multifocal superimposed images in the third modification of the first embodiment of the present invention.
- FIG. 11 is a schematic diagram for explaining a process for acquiring a plurality of multi-focus superimposed images in the third modification of the first embodiment of the present invention.
- FIG. 12 is a schematic diagram for explaining the multifocal superimposed image acquisition process according to the fourth modification of the first embodiment of the present invention.
- FIG. 13 is a block diagram illustrating a configuration example of a microscope observation system according to Embodiment 2 of the present invention.
- FIG. 14 is a flowchart showing the operation of the microscope observation system shown in FIG.
- FIG. 15 is a schematic diagram showing a plurality of multi-focus superimposed images.
- FIG. 16 is a schematic diagram illustrating an example of a method for selecting an observation region.
- FIG. 17 is a flowchart showing details of the processing for acquiring the Z position information of the observation area.
- FIG. 18 is a block diagram illustrating a configuration example of a microscope observation system according to Embodiment 3 of the present invention.
- FIG. 19 is a flowchart showing the operation of the microscope observation system shown in FIG.
- FIG. 20 is a schematic diagram for explaining the operation of the microscope observation system shown in FIG.
- FIG. 21 is a schematic diagram illustrating a method of shifting the position of the visual field in a modification of the third embodiment.
- FIG. 22 is a schematic diagram illustrating another method of shifting the position of the visual field in the modification of the third embodiment.
- FIG. 1 is a block diagram showing a configuration example of a microscope observation system according to Embodiment 1 of the present invention.
- as shown in FIG. 1, a microscope observation system 1 according to Embodiment 1 includes a microscope apparatus 10 that generates a subject image, an imaging device 20 that acquires and processes the magnified image generated by the microscope apparatus 10, and a display device 30 that displays the image processed by the imaging device 20.
- FIG. 2 is a schematic diagram illustrating a configuration example of the microscope apparatus 10.
- as shown in FIG. 2, the microscope apparatus 10 includes a substantially C-shaped arm 100; a lens barrel 102 and an eyepiece unit 103 supported on the arm 100 via a trinocular tube unit 101; an epi-illumination unit 110, a transmitted illumination unit 120, and an electric stage unit 130 provided on the arm 100; and an objective lens 140 that forms an image of the observation light from the subject S.
- the objective lens 140, the lens barrel 102 connected via the trinocular tube unit 101, and an imaging unit 211 (described later) provided at the other end of the lens barrel 102 constitute an observation optical system (imaging optical system) 104.
- the trinocular tube unit 101 splits the observation light entering from the objective lens 140 into light directed toward the eyepiece unit 103, with which the user directly observes the subject S, and light directed toward the imaging unit 211 described later.
- the epi-illumination unit 110 includes an epi-illumination light source 111 and an epi-illumination optical system 112 and irradiates the subject S with epi-illumination light.
- the epi-illumination optical system 112 includes various optical members that condense the illumination light emitted from the epi-illumination light source 111 and guide it in the direction of the optical axis L of the observation optical system 104, specifically a filter unit, a shutter, a field stop, an aperture stop, and the like.
- the transmitted illumination unit 120 includes a transmitted illumination light source 121 and a transmitted illumination optical system 122 and irradiates the subject S with transmitted illumination light.
- the transmitted illumination optical system 122 includes various optical members that condense the illumination light emitted from the transmitted illumination light source 121 and guide it in the direction of the optical axis L, specifically a filter unit, a shutter, a field stop, an aperture stop, and the like.
- one of the epi-illumination unit 110 and the transmitted illumination unit 120 is selected and used according to the observation method.
- the microscope apparatus 10 may be provided with only one of the epi-illumination unit 110 and the transmission illumination unit 120.
- the electric stage unit 130 includes a stage 131, a stage drive unit 132 that moves the stage 131, and a position detection unit 133.
- the stage drive unit 132 is configured by a motor, for example.
- a subject placement surface 131 a of the stage 131 is provided so as to be orthogonal to the optical axis of the objective lens 140.
- the subject placement surface 131a is the XY plane, and the normal direction of the XY plane, that is, the direction parallel to the optical axis is the Z direction.
- the downward direction in the figure, that is, the direction away from the objective lens 140 is the plus direction.
- the position of the field of view of the objective lens 140 can be shifted by moving the stage 131 in the XY plane. Further, the focal plane of the objective lens 140 can be shifted along the optical axis L by moving the stage 131 in the Z direction. That is, the electric stage unit 130 is a shift unit that shifts the position of the focal plane and the visual field by moving the stage 131 under the control of the imaging control unit 22 described later.
- in Embodiment 1, the position of the observation optical system 104, from the lens barrel 102 to the objective lens 140, is fixed and the stage 131 is moved; conversely, the stage 131 may be fixed and the observation optical system 104 side moved, or the stage 131 and the observation optical system 104 may both be moved in opposite directions. That is, any configuration may be used as long as the observation optical system 104 and the subject S are movable relative to each other.
- the focal plane may be shifted by moving the observation optical system 104 in the Z direction, and the position of the visual field V may be shifted by moving the stage 131 in the XY plane.
- the position detection unit 133 is configured by an encoder that detects the amount of rotation of the stage drive unit 132 made of a motor, for example, and detects the position of the stage 131 and outputs a detection signal.
- alternatively, a pulse generation unit that generates pulses under the control of the imaging control unit 22 (described later) and a stepping motor may be provided.
- the objective lens 140 is attached to a revolver 142 that can hold a plurality of objective lenses having different magnifications (for example, the objective lenses 140 and 141).
- the imaging magnification can be changed by rotating the revolver 142 to switch which of the objective lenses 140 and 141 faces the stage 131.
- FIG. 2 shows a state in which the objective lens 140 faces the stage 131.
- the imaging device 20 includes: an image acquisition unit 21 that acquires an image by capturing the subject image generated by the observation optical system 104 of the microscope apparatus 10; an imaging control unit 22 that controls the imaging operation of the image acquisition unit 21; a control unit 23 that controls various operations in the imaging device 20 and processes the image acquired by the image acquisition unit 21; a storage unit 24 that stores various information such as the image data acquired by the image acquisition unit 21 and control programs; an input unit 25 used to input instructions and information to the imaging device 20; and an output unit 26 that outputs images based on the image data stored in the storage unit 24 and other various information to external devices.
- the image acquisition unit 21 includes an imaging unit 211 and a memory 212.
- the imaging unit 211 is configured using a camera that includes an imaging element (imager) 211a such as a CCD or CMOS sensor and that can capture a color image having pixel levels (pixel values) for the R (red), G (green), and B (blue) components at each pixel of the imaging element 211a. Alternatively, the imaging unit 211 may be configured using a camera that captures a monochrome image, outputting a luminance value Y as the pixel level (pixel value) at each pixel.
- the imaging unit 211 is provided at one end of the lens barrel 102 so that the optical axis L passes through the center of the light-receiving surface of the imaging element 211a, and photoelectrically converts the observation light incident on the light-receiving surface through the observation optical system 104, which extends from the objective lens 140 to the lens barrel 102, thereby generating image data of the subject image that falls within the field of view of the objective lens 140.
- the memory 212 is configured by a recording device such as an updatable flash memory, or a semiconductor memory such as a RAM or a ROM, and temporarily stores the image data generated by the imaging unit 211.
- the imaging control unit 22 outputs control signals to the microscope apparatus 10 and moves the stage 131 during one exposure period of the imaging unit 211 so as to shift the positions of the focal plane and field of view of the objective lens 140, thereby performing control to acquire a multifocal superimposed image containing image information from a plurality of planes along the optical axis L of the observation optical system 104.
- the control unit 23 is configured by hardware such as a CPU; by reading programs stored in the storage unit 24, it controls the overall operation of the imaging device 20 and of the microscope observation system 1 based on various parameters stored in the storage unit 24, information input from the input unit 25, and the like. The control unit 23 also performs processing that generates an omnifocal image by applying predetermined image processing to the image data input from the image acquisition unit 21.
- the control unit 23 includes a shift amount acquisition processing unit 231 that acquires the shift amount by which the position of the field of view of the observation optical system 104 is shifted when a multifocal superimposed image is acquired, and an omnifocal image generation unit 232 that generates an omnifocal image by restoring the multifocal superimposed image using a point spread function (PSF) representing image blur.
- the storage unit 24 is configured by a recording device such as an updatable flash memory, a RAM, or a ROM, or by a recording medium such as a hard disk, MO, CD-R, or DVD-R, either built in or connected via a data communication terminal, together with a writing/reading apparatus that writes information to the recording medium and reads information recorded on it.
- the storage unit 24 includes a parameter storage unit 241 that stores parameters used for calculation in the control unit 23 and a program storage unit 242 that stores various programs.
- the parameter storage unit 241 stores parameters such as a shift amount for shifting the position of the visual field when acquiring a multi-focus superimposed image.
- the program storage unit 242 stores a control program for causing the imaging device 20 to execute a predetermined operation, an image processing program, and the like.
- the input unit 25 includes input devices such as a keyboard, various buttons, and various switches, as well as pointing devices such as a mouse and a touch panel, and inputs signals corresponding to operations performed on these devices to the control unit 23.
- the output unit 26 is an external interface that outputs images based on the image data acquired by the image acquisition unit 21, omnifocal images generated by the control unit 23, and other various information to an external device such as the display device 30 so that they are displayed in a predetermined format.
- such an imaging device 20 can be configured by connecting a general-purpose digital camera to a general-purpose device such as a personal computer or a workstation via an external interface.
- the display device 30 is configured by, for example, an LCD, an EL display, a CRT display, or the like, and displays an image and related information output from the output unit 26.
- the display device 30 is provided outside the imaging device 20, but may be provided inside the imaging device 20.
- FIG. 3 is a flowchart showing the operation of the microscope observation system 1.
- in step S10, the image acquisition unit 21 acquires a plurality of multifocal superimposed images.
- FIG. 4 is a schematic diagram for explaining a process for acquiring a plurality of multi-focus superimposed images.
- as shown in FIG. 4, a range of thickness D (µm) in the subject S is imaged in a superimposed manner.
- this range of thickness D is referred to as the superimposed imaging range; it consists of slices Fj whose thickness Δz corresponds to the depth of field of the observation optical system 104.
- in each slice Fj, the region surrounded by a bold line is the field of view V of the observation optical system 104 to be imaged, and the arrow superimposed on the field of view V indicates the direction in which the positions of the focal plane and the field of view are shifted.
- FIG. 5 is a flowchart showing details of the processing for acquiring a plurality of multi-focus superimposed images.
- the shift amount acquisition processing unit 231 acquires a shift amount for shifting the position of the visual field V when acquiring the multifocus superimposed image.
- This shift amount may be a preset amount or may be acquired based on information input from the input unit 25 in response to a user operation.
- when the shift amount σ is determined according to a user operation, it is convenient, as shown in FIG. 4B, to have the user input the angle θ by which the line of sight is inclined with respect to the direction directly above the subject S.
- in this case, the shift amount σ (in pixels) is given by the following equation (2):
- σ = (Z / tan θ) / p … (2)
- the distance Z can be approximated by the distance from the objective lens 140 to each depth in the subject S.
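- as a numerical sketch, equation (2) can be evaluated directly. The helper below is illustrative only: the function name is hypothetical, and the interpretation of p as the pixel pitch (so that the result comes out in pixels) is an assumption not spelled out in the source.

```python
import math

def shift_amount_pixels(Z: float, theta_deg: float, p: float) -> float:
    """Evaluate Eq. (2): sigma = (Z / tan(theta)) / p.

    Z         : distance from the objective lens to the depth of interest
    theta_deg : angle theta of the line of sight from directly above (degrees)
    p         : pixel pitch, assumed to be in the same length unit as Z
    """
    theta = math.radians(theta_deg)
    return (Z / math.tan(theta)) / p
```

For θ = 45°, tan θ = 1, so σ reduces to Z / p.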
- next, the imaging control unit 22 calculates, as an imaging parameter, the shift speed v1 at which the visual field V is shifted along the X direction during one exposure period.
- in step S104, under the control of the imaging control unit 22 based on the imaging parameters set in step S103, the image acquisition unit 21 performs imaging while shifting the positions of the focal plane and field of view V of the observation optical system 104 during one exposure period of the imaging unit 211, thereby acquiring a multifocal superimposed image.
- the direction in which the positions of the focal plane and the visual field V are shifted is not limited to the direction of the arrow shown in FIG. 4.
- the visual field V may be shifted in the + X direction while shifting the focal plane in the + Z direction.
- preferably, the order in which the multifocal superimposed images SI0 and SI1 are acquired and the shift directions of the focal plane and the field of view V are set so that the number and amount of movements of the stage 131 are as small as possible.
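- the single-exposure acquisition can be sketched numerically as follows: each slice Fj contributes to the sensor with the field of view displaced by j·σ pixels along X, and the exposure integrates (here, averages) the contributions. This is a simulation sketch, not the patent's hardware procedure; np.roll stands in for the stage motion, and its wrap-around at the image border is a simplification of the sketch.

```python
import numpy as np

def multifocal_superimposed(stack: np.ndarray, sigma: int) -> np.ndarray:
    """Simulate one exposure with a moving focal plane and field of view:
    slice j is seen shifted by j*sigma pixels along X, and all slice
    contributions are averaged over the exposure period.

    stack : (n_slices, height, width) array of slice images F_j
    sigma : per-slice shift in pixels (0 yields SI0, a nonzero value SI1)
    """
    n = stack.shape[0]
    acc = np.zeros(stack.shape[1:], dtype=float)
    for j in range(n):
        acc += np.roll(stack[j], j * sigma, axis=1)  # field of view shifted along X
    return acc / n
```

Running this once with σ = 0 and once with σ ≠ 0 yields the two superimposed images SI0 and SI1 that the flow of FIG. 5 acquires.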
- in step S11, the omnifocal image generation unit 232 generates omnifocal images based on the plurality of multifocal superimposed images acquired in step S10.
- FIG. 6 is a flowchart showing details of the omnifocal image generation process.
- first, the omnifocal image generation unit 232 acquires point spread function (PSF) information representing the image blur in the image of each slice Fj, and generates a PSF image based on the PSF information.
- the point spread function is stored in advance in the parameter storage unit 241 in association with imaging conditions such as the magnification of the objective lens 140 in the microscope apparatus 10 and the slice F j .
- the omnifocal image generation unit 232 reads out the point spread function corresponding to each slice Fj from the parameter storage unit 241 based on the imaging conditions such as the magnification of the objective lens 140, and, based on the point spread function, generates a PSF image for each slice Fj by calculating the pixel value corresponding to each pixel position.
- the omnifocal image generation unit 232 generates the multifocal superimposed PSF image PI0 with zero shift amount, corresponding to the multifocal superimposed image SI0.
- similarly, using the plurality of PSF images after the shift processing in step S113, the omnifocal image generation unit 232 generates the multifocal superimposed PSF image PI1 for shift amount σ. Specifically, the pixel value of each pixel of the multifocal superimposed PSF image PI1 is calculated by averaging the pixel values of the positionally corresponding pixels among the plurality of shifted PSF images.
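- the construction of the multifocal superimposed PSF images mirrors the acquisition of the superimposed images themselves: the PSF image of each slice is shifted by the same per-slice amount and the positionally corresponding pixels are averaged. A sketch assuming integer pixel shifts (the function name is hypothetical):

```python
import numpy as np

def superimposed_psf(psf_stack: np.ndarray, sigma: int) -> np.ndarray:
    """Build a multifocal superimposed PSF image (PI0 for sigma = 0,
    PI1 for sigma > 0) by shifting the PSF image of slice F_j by
    j*sigma pixels along X and averaging corresponding pixels.

    psf_stack : (n_slices, height, width) array of per-slice PSF images
    """
    shifted = [np.roll(psf_stack[j], j * sigma, axis=1)
               for j in range(psf_stack.shape[0])]
    return np.mean(shifted, axis=0)  # pixel-wise average over slices
```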
- step S115 the total focus image generating unit 232, by using the multi-focal superimposed PSF image PI 0, PI 1, to restore each generated plurality of multiple focuses superimposed image SI 0, SI 1 to the step S10. Thereby, an omnifocal image AI 0 is generated from the multifocal superimposed image SI 0, and an omnifocal image AI 1 is generated from the multifocal superimposed image SI 1 . Thereafter, the operation of the control unit 23 returns to the main routine.
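The two steps above, building a superimposed PSF image by averaging per-slice PSFs shifted by s_j = σ×j, and restoring the superimposed image with it, can be sketched as below. This is a minimal illustration, not the patent's implementation: the document does not specify the restoration algorithm, so Wiener deconvolution (with a regularization constant `k`) is used here as one common choice.

```python
import numpy as np

def superimposed_psf(psfs, sigma):
    """Shift the PSF image of slice j by sigma*j pixels in X (s_j = sigma*j),
    then average the shifted PSFs into one superimposed PSF image PI."""
    shifted = [np.roll(psf, sigma * j, axis=1) for j, psf in enumerate(psfs)]
    return np.mean(shifted, axis=0)

def restore_omnifocal(superimposed_image, psf_image, k=1e-3):
    """Restore an omnifocal image from a multifocal superimposed image by
    Wiener deconvolution with the matching superimposed PSF image
    (an assumed restoration method, not stated in the text)."""
    H = np.fft.fft2(np.fft.ifftshift(psf_image))
    G = np.fft.fft2(superimposed_image)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))
```

With shift amount zero the same routine yields PI 0; with shift amount σ it yields PI 1, so one code path covers both images.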
- in step S12, the imaging device 20 outputs the image data of the plurality of omnifocal images AI 0 and AI 1 generated in step S11 to the display device 30, which displays these omnifocal images AI 0 and AI 1 .
- the display method of the omnifocal images AI 0 and AI 1 is not particularly limited.
- the omnifocal images AI 0 and AI 1 may be displayed side by side, or the omnifocal images AI 0 and AI 1 may be alternately displayed in the same region.
- the omnifocal images AI 0 and AI 1 may be switched automatically at a predetermined cycle, or may be switched manually in response to user input via the input unit 25.
- FIG. 7 is a schematic diagram illustrating a display example of an omnifocal image on the display device 30.
- two omnifocal images AI 0 and AI 1 are displayed side by side. Thereafter, the operation of the microscope observation system 1 ends.
- a multifocal superimposed image is acquired by performing imaging while shifting the focal plane and the field of view during one exposure period, and an omnifocal image is generated by restoring it.
- a state in which the subject S is virtually viewed from a plurality of viewpoints can be reproduced.
- as shown in FIG. 4, the state of the subject S viewed from directly above can be reproduced by setting the shift amount to zero, and the state of viewing the subject S from the upper left can be reproduced by setting the shift amount to σ.
- the user can therefore visually and intuitively grasp the position in the Z direction of the structures in the subject S, the anteroposterior relationship between the structures, the overlapping state of the structures, and the like.
- compared with the case where a Z-stack image is acquired by imaging a plurality of times and the Z-stack images are averaged to obtain a multifocal superimposed image, imaging can be completed in a shorter time, and the amount of data and the computational load of image processing can be greatly reduced.
- the case where the field of view V of the observation optical system 104 is shifted only in the X direction has been described in order to facilitate understanding.
- similar processing can be performed in the Y direction.
- an omnifocal image corresponding to the case where the virtual viewpoint with respect to the subject S is moved along the Y direction can be generated.
- an omnifocal image corresponding to the case where the virtual viewpoint with respect to the subject S is moved within the horizontal plane can also be generated.
- FIG. 8 and FIG. 9 are schematic diagrams for explaining a method for acquiring a multifocal superimposed image in the first modification.
- in Embodiment 1 described above, the optical axis of the observation optical system 104 is orthogonal to the stage 131, and the multifocal superimposed image SI 1 with shift amount σ was acquired by performing imaging while moving the stage 131 in the Z direction and the X direction. However, imaging may instead be performed with the optical axis of the observation optical system 104 inclined in advance with respect to the stage 131.
- the subject placement surface 131a of the stage 131 is installed horizontally, and the optical axis L of the observation optical system 104 is inclined by an angle ⁇ with respect to the normal line of the subject placement surface 131a. Accordingly, the focal plane Pf of the imaging unit 211 is inclined at an angle ⁇ with respect to the subject placement surface 131a.
- the focal plane Pf thereby moves in the +Z direction with respect to the subject S while the field of view shifts in the +X direction. That is, control for moving the observation optical system 104 two-dimensionally becomes unnecessary, and drive control of the observation optical system 104 can be simplified.
- the subject placement surface 131a of the stage 131 is installed horizontally, and the optical axis L of the observation optical system 104 is placed perpendicular to the subject placement surface 131a. Then, a pedestal 106 having an inclined surface with an angle ⁇ with respect to the bottom surface is installed on the stage 131. By placing the subject S on the inclined surface 161a of the pedestal 106, the focal plane Pf of the imaging unit 211 is inclined at an angle ⁇ with respect to the inclined surface 161a.
- the focal plane Pf with respect to the subject S moves in the + Z direction, and the field of view shifts in the + X direction.
- the shift amount acquisition processing unit 231 calculates and outputs the angle ⁇ based on the shift amount ⁇ . Based on this angle ⁇ , the imaging control unit 22 performs control to incline the focal plane Pf of the observation optical system 104 with respect to the subject placement surface 131a by the angle ⁇ , as shown in FIG.
- the shift amount acquisition processing unit 231 calculates a shift amount ⁇ corresponding to the angle ⁇ from Expression (4), and the imaging control unit 22 calculates various control parameters based on the shift amount ⁇ .
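The conversion between the tilt angle α and the shift amount σ is given by Expression (4) in the description, α = tan⁻¹{(p×σ×N/M)/D}. The sketch below implements that formula and its inverse; the roles assigned to the symbols (p: camera pixel pitch, N: number of slices, M: magnification, D: depth of the superimposed imaging range) are our reading of the text, not values it defines explicitly.

```python
import math

def tilt_angle(p, sigma, N, M, D):
    """Expression (4): alpha = atan((p * sigma * N / M) / D).
    Symbol roles (p, N, M, D) are assumptions based on the surrounding text."""
    return math.atan((p * sigma * N / M) / D)

def shift_for_angle(alpha, p, N, M, D):
    """Invert Expression (4): the shift amount sigma needed for tilt alpha."""
    return math.tan(alpha) * D * M / (p * N)
```

This mirrors the two directions used in Modification 1: the shift amount acquisition processing unit 231 can go from σ to α, or from a requested angle α back to σ.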
- the multifocal superimposed image is acquired by continuously shifting the focal plane and the position of the visual field V while the shutter is opened during one exposure period of the imaging unit 211.
- the shutter that blocks the incidence of light on the imaging unit 211 may instead be opened and closed at a predetermined cycle, with the positions of the focal plane and the visual field V shifted stepwise while the shutter is closed.
- the number of times the shutter is opened and closed during one exposure period, that is, the number of times the imaging unit 211 is exposed to the subject S, the number of times the positions of the focal plane and the visual field are shifted, and the per-step shift amounts of the focal plane and the visual field V are appropriately set according to the exposure period, shutter speed, and the like of the imaging unit 211.
- when acquiring the multifocal superimposed image SI 0 with a shift amount of zero, the focal plane is moved stepwise within a predetermined superimposed imaging range, each step being a multiple of the depth of field (k × Δz, where k is a natural number). Further, when acquiring the multifocal superimposed image SI 1 with shift amount σ shown in FIG. 4B, the position of the visual field V is additionally shifted in steps of a multiple of the shift amount σ (k × σ) while the focal plane is moved in steps of k × Δz within the superimposed imaging range.
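The stepwise schedule of Modification 2 can be sketched as a list of (Z offset, X field shift) pairs applied between exposures. This is an illustrative helper, not part of the patent text: the function name and signature are hypothetical.

```python
def stepwise_schedule(exposure_count, dz, sigma, k=1):
    """One (Z offset, X field shift) pair per shutter opening within one
    exposure period: between exposures the focal plane advances k*dz
    (a multiple of the depth of field) and the visual field shifts
    k*sigma, instead of moving continuously."""
    return [(i * k * dz, i * k * sigma) for i in range(exposure_count)]
```

Setting `sigma=0` gives the schedule for SI 0 (focal steps only); a nonzero `sigma` gives the schedule for SI 1.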
- in step S111 of FIG. 6, PSF images are generated for the plurality of slices covered from when the shutter is opened to when it is closed.
- in step S113 of FIG. 6, each PSF image is shifted in accordance with the shift amount of the position of the visual field V corresponding to the opening/closing cycle of the shutter.
- the processes in steps S112, S114, and S115 are the same as those in Embodiment 1.
- FIG. 10 is a flowchart showing details of the acquisition processing of a plurality of multi-focus superimposed images in the third modification.
- FIG. 11 is a schematic diagram for explaining a process for acquiring a plurality of multi-focus superimposed images in the third modification.
- the visual field V is shifted in the + X direction when the focal plane is shifted in the + Z direction.
- the subscript i is a variable indicating the acquisition order of the multi-focus superimposed image.
- the shift amounts σ i may be preset, or may be acquired based on information input from the input unit 25 in response to a user operation. In the latter case, it is preferable to have the user input the angle θ i by which the user's line of sight is inclined with respect to the direction directly above the subject S.
- the imaging control unit 22 calculates a shift speed for shifting the visual field V along the X direction during one exposure period of the imaging unit 211 as an imaging parameter.
- the method for calculating the shift speed is the same as in the first embodiment (see step S103 in FIG. 5).
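The shift speed referred to here is given by Expression (3) in the description, v1 = (p×σ/M)/(T1/N). A minimal sketch; the interpretation of the symbols (p: pixel pitch, M: magnification, T1: one exposure period, N: number of slices) is our reading of the surrounding text.

```python
def shift_speed(p, sigma, M, T1, N):
    """Expression (3): v1 = (p * sigma / M) / (T1 / N), i.e. the stage
    speed at which the visual field shifts sigma pixels per slice when
    N slices are swept during one exposure period T1."""
    return (p * sigma / M) / (T1 / N)
```

In Modification 3 the same formula is evaluated once per image i with its own σ i.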
- under the control of the imaging control unit 22 based on the imaging parameters set in step S122, the image acquisition unit 21 captures the subject S while shifting the positions of the focal plane and the field of view V of the observation optical system 104 during one exposure period of the imaging unit 211, thereby acquiring the multifocal superimposed image SI i with shift amount σ i .
- the method for obtaining the multifocal superimposed image SI i is the same as in the first embodiment (see step S104 in FIG. 5).
- in step S124, the control unit 23 determines whether or not the variable i has reached the maximum value n.
- if not (step S124: No), the control unit 23 increments the variable i (step S125). Thereafter, the operation of the control unit 23 returns to step S121.
- by appropriately setting the imaging parameters that control the acquisition order of the multifocal superimposed images SI i , the imaging start position, and the shift directions of the focal plane and the field of view V, the amount of movement of the stage 131 is suppressed, the total imaging time can be shortened, and the multifocal superimposed images SI i can be acquired efficiently.
- as shown in FIG. 11A, the multifocal superimposed image SI 11 with shift amount σ 11 is acquired by shifting the position of the visual field V in the X direction at the pace of that shift amount while shifting the focal plane in the +Z direction during one exposure period.
- if the variable i has reached the maximum value n (step S124: Yes), the operation of the microscope observation system 1 returns to the main routine.
- the omnifocal image generation process (see step S11 in FIG. 3 and FIG. 6) based on the plurality of multifocal superimposed images SI i generated in this way is the same as that of the first embodiment as a whole.
- in steps S112 to S115 shown in FIG. 6, a multifocal superimposed PSF image is generated for each multifocal superimposed image SI i using the shift amount σ i used when that image was acquired.
- each multifocal superimposed image SI i is then restored using the corresponding multifocal superimposed PSF image.
- as a result, a plurality of omnifocal images with different shift amounts σ i are generated.
- the plurality of omnifocal images may be displayed side by side, or may be sequentially switched and displayed in the same region.
- in Modification 3, a plurality of omnifocal images with different shift directions of the visual field V are generated and displayed, so that the state of the subject S can be reproduced as viewed from directly above (σ 12 ), from the upper left (σ 11 ), and from the upper right (σ 13 ). By referring to such omnifocal images, the user can grasp in more detail the degree of overlap between the structures in the subject S and their front-rear relationship in the Z direction.
- FIG. 12 is a schematic diagram for explaining the multifocal superimposed image acquisition process according to the fourth modification.
- the magnitudes of the shift amounts σ 21 to σ 25 are set in increasing order, σ 21 < σ 22 < σ 23 < σ 24 < σ 25 . By changing the magnitude of the shift amount σ i in this way, a state in which the subject S is virtually observed from various angles can be reproduced.
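The per-image shift amounts follow Expression (5) in the description, σ i = (Z/tanθ i )/p, so a set of virtual viewing angles maps directly to a set of shift amounts. A minimal sketch; the symbol roles (Z: object-side distance used in Expression (2), p: pixel size) are our reading of the text.

```python
import math

def shift_amounts(angles_deg, Z, p):
    """Expression (5): sigma_i = (Z / tan(theta_i)) / p, mapping each
    virtual viewing angle theta_i to a visual-field shift amount."""
    return [(Z / math.tan(math.radians(t))) / p for t in angles_deg]
```

Note that with this formula a steeper viewing angle θ i yields a smaller shift amount, so the increasing sequence σ 21 < ... < σ 25 corresponds to progressively shallower virtual viewpoints.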
- the acquisition process of these multifocal superimposed images SI i is the same as that in the third modification (see FIG. 10).
- for example, the multifocal superimposed image SI 22 is acquired by shifting the visual field V in the −X direction at the pace of the shift amount σ 22 while shifting the focal plane.
- the processing for generating a plurality of omnifocal images and the processing for displaying a plurality of images based on these multifocal superimposed images SI i are the same as in the third modification.
- in Modification 4 of Embodiment 1 of the present invention, since the magnitude of the shift amount σ i is varied among the plurality of multifocal superimposed images SI i , the state of observing the subject S virtually from a plurality of directions over a wider range can be reproduced. The user can therefore understand intuitively and more realistically the position in the Z direction of the structures in the subject S, the degree of overlap between the structures, and their front-rear relationship.
- FIG. 13 is a block diagram showing a configuration of a microscope observation system according to Embodiment 2 of the present invention.
- the microscope observation system 2 according to Embodiment 2 includes the microscope apparatus 10, an imaging apparatus 40 that acquires and processes the image of the subject image generated by the microscope apparatus 10, and a display device 50 that displays the image processed by the imaging apparatus 40 and related information.
- the configuration and operation of the microscope apparatus 10 are the same as those in the first embodiment (see FIG. 2).
- the imaging device 40 includes a control unit 41 instead of the control unit 23 shown in FIG.
- the control unit 41 includes an attention slice acquisition unit 411 in addition to the components of the control unit 23.
- the operations of the shift amount acquisition processing unit 231 and the omnifocal image generation unit 232 are the same as those in the first embodiment.
- the attention slice acquisition unit 411 acquires the Z-direction position of the slice that contains the structure in the subject S corresponding to the observation region input from the display device 50 (described later) via the input unit 25, and determines this slice as the slice of interest.
- the display device 50 is configured by, for example, an LCD, an EL display, or a CRT display, and includes an image display unit 51 that displays the images and related information output from the output unit 26, and an observation region determination unit 52 that, in accordance with an externally performed operation, determines a region in the omnifocal image displayed on the image display unit 51 as an observation region and inputs a signal representing the observation region to the control unit 41.
- FIG. 14 is a flowchart showing the operation of the microscope observation system 2.
- the operations in steps S10 to S12 are the same as in the first embodiment.
- a plurality of omnifocal images AI i are generated by restoring the respective multifocal superimposed images SI i , and are sequentially switched and displayed on the image display unit 51.
- in step S21 following step S12, the observation region determination unit 52 determines whether or not an operation selecting an arbitrary region has been performed on any of the omnifocal images AI 31 , AI 32 , AI 33 , and AI 34 displayed on the image display unit 51.
- if no such user operation is performed (step S21: No), the operation of the microscope observation system 2 returns to step S12.
- FIG. 16 is a schematic diagram illustrating an example of a method for selecting an observation region.
- the observation area is selected by surrounding a desired area in the omnifocal image displayed on the image display unit 51 by a pointer operation using a mouse or the like.
- in step S23, the control unit 41 acquires the Z position information of the observation region based on the information indicating the observation region input from the observation region determination unit 52.
- FIG. 17 is a flowchart showing details of the processing for acquiring the Z position information of the observation area. In the following description, it is assumed that the region R 34 in the omnifocal image AI 34 shown in FIG. 15 is determined as the observation region as an example.
- in step S231, the attention slice acquisition unit 411 acquires the XY position information of the observation region R 34 in the omnifocal image AI 34 .
- in step S232, the attention slice acquisition unit 411 extracts the regions R′ 31 , R′ 32 , and R′ 33 corresponding to the observation region R 34 from the omnifocal images AI 31 , AI 32 , and AI 33 other than AI 34 , and acquires the XY position information of each region.
- Regions R ′ 31 , R ′ 32 , and R ′ 33 can be extracted using a known image recognition technique such as pattern matching.
- these regions R ′ 31 , R ′ 32 and R ′ 33 are also referred to as observation regions.
- in step S233, the attention slice acquisition unit 411 calculates the shift amounts of the XY positions of the observation regions R′ 31 , R′ 32 , R′ 33 , and R 34 among the omnifocal images AI 31 , AI 32 , AI 33 , and AI 34 , for example, the shift amount between the X position of the observation region R′ 31 in the omnifocal image AI 31 and that of the observation region R′ 32 in the omnifocal image AI 32 .
- in step S234, the attention slice acquisition unit 411 acquires, based on the shift amounts of the observation regions R′ 31 , R′ 32 , R′ 33 , and R 34 , the slice F j that contains these observation regions.
- as shown in FIGS. 15(c) and 15(d), for example, when the shift amount σ 33 of the observation region R′ 33 in the omnifocal image AI 33 is 2 pixels and the shift amount σ 34 in the omnifocal image AI 34 is 3 pixels, the slice F j containing these observation regions can be obtained from the difference between the shifts.
- the attention slice acquisition unit 411 outputs the slice F j acquired in this way as the Z position information of the observation region. Thereafter, the operation of the control unit 41 returns to the main routine.
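The slice computation above corresponds to Expressions (6) and (7) in the description: a structure in slice F j appears displaced by s i,j = σ i × j in the omnifocal image acquired with shift amount σ i , so comparing two images with known shift amounts isolates j. A minimal sketch of that calculation:

```python
def slice_index(shift_a, shift_b, sigma_a, sigma_b):
    """Expressions (6)-(7): since s_{i,j} = sigma_i * j, the slice index is
    j = |s_b - s_a| / (sigma_b - sigma_a) for two images a, b with
    per-slice shift amounts sigma_a != sigma_b."""
    return abs(shift_b - shift_a) / (sigma_b - sigma_a)
```

With the example values in the text (per-slice shifts of 2 and 3 pixels, observed region displacements of 4 and 6 pixels), this yields slice index j = 2.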
- in step S24 following step S23, the control unit 41 acquires an image of the slice containing the observation region by performing imaging while focusing on the Z position indicated by the Z position information output by the attention slice acquisition unit 411, and displays it on the display device 50. At this time, the control unit 41 may also acquire and display the images of the slices adjacent to (that is, immediately before and after) the slice containing the observation region. Thereafter, the operation of the microscope observation system 2 ends.
- the user can intuitively and easily grasp the position in the Z direction of the structures that appear to overlap each other on the plane and the front-rear relationship between the structures.
- FIG. 18 is a block diagram illustrating a configuration example of a microscope observation system according to Embodiment 3 of the present invention.
- the microscope observation system 3 according to Embodiment 3 includes the microscope apparatus 10, an imaging apparatus 60 that acquires and processes the image of the subject image generated by the microscope apparatus 10, and a display device 50 that displays the image processed by the imaging apparatus 60 and related information.
- the configuration and operation of the microscope apparatus 10 are the same as those in the first embodiment (see FIG. 2).
- the configuration and operation of the display device 50 are the same as those in the second embodiment (see FIG. 13).
- the imaging device 60 includes a control unit 61 instead of the control unit 41 shown in FIG.
- the control unit 61 includes an omnifocal image generation unit 611 in place of the omnifocal image generation unit 232 of the control unit 41.
- the configuration and operation of the units of the imaging device 60 other than the control unit 61, and of the units of the control unit 61 other than the omnifocal image generation unit 611, are the same as those in Embodiment 2.
- the omnifocal image generation unit 611 includes an imaging position determination processing unit 612 that determines the imaging position for acquiring a multifocal superimposed image based on the position of the slice of interest determined by the attention slice acquisition unit 411.
- FIG. 19 is a flowchart showing the operation of the microscope observation system 3. Steps S10 to S24 are the same as in the second embodiment (see FIG. 14).
- in step S31 following step S24, the imaging position determination processing unit 612 determines the imaging position for acquiring the multifocal superimposed images so that the position of the observation region determined in step S22 does not change among the plurality of omnifocal images.
- specifically, the imaging position determination processing unit 612 determines the position of the visual field V at the slice F j containing the observation region R so that the position of the observation region R does not change among the omnifocal images AI 41 , AI 42 , AI 43 , and AI 44 . In FIGS. 20A to 20D, the position of the visual field V is determined so that the observation region R is at the center of the visual field V.
- the imaging start position of each multifocal superimposed image SI 41 , SI 42 , SI 43 , and SI 44 is then calculated, and based on the imaging start position and the shift amounts σ 41 , σ 42 , σ 43 , and σ 44 , the imaging position for each slice F j when acquiring each multifocal superimposed image is determined.
- in step S32, the imaging control unit 22 controls the position of the stage 131 and the imaging unit 211 based on the imaging positions determined by the imaging position determination processing unit 612 to reacquire a plurality of multifocal superimposed images.
- the reacquisition of the multifocal superimposed images is the same as in step S10 except that the imaging parameters are different.
- in step S33, the omnifocal image generation unit 611 generates a plurality of omnifocal images by restoring the multifocal superimposed images acquired in step S32 using the PSF.
- the omnifocal image generation process is the same as in step S11.
- in step S34, the imaging device 60 causes the display device 50 to display the plurality of omnifocal images generated in step S33. Thereafter, the operation of the microscope observation system 3 ends.
- a plurality of omnifocal images with different virtual viewpoints can thus be displayed without changing the position of the observation region selected by the user in the omnifocal image. The user can therefore intuitively grasp the Z-direction position of the observation region, its front-rear relationship with other structures, and the like without moving his or her line of sight from the selected observation region.
- FIG. 21 is a schematic diagram illustrating a method of shifting the position of the visual field in a modification of the third embodiment.
- FIG. 22 is a schematic diagram showing another method of shifting the visual field in the modification of the third embodiment.
- the present invention is not limited to Embodiments 1 to 3 and the modifications described above as they are; various inventions can be formed by appropriately combining a plurality of the components disclosed in the embodiments and modifications. For example, some components may be excluded from all the components shown in an embodiment, or components shown in different embodiments may be combined as appropriate.
- Microscope observation system 10 Microscope apparatus 20, 40, 60 Imaging device 21 Image acquisition unit 22 Imaging control unit 23, 41, 61 Control unit 24 Storage unit 25 Input unit 26 Output unit 30, 50 Display device 51 Image display unit 52 Observation region determination unit 100 Arm 101 Trinocular tube unit 102 Lens tube 103 Eyepiece unit 104 Observation optical system 110 Epi-illumination unit 111 Epi-illumination light source 112 Epi-illumination optical system 120 Trans-illumination unit 121 Trans-illumination light source 122 Trans-illumination optical system 130 Electric stage unit 131 Stage 132 Stage drive unit 133 Position detection unit 140, 141 Objective lens 142 Revolver 161 Pedestal 161a Slope 211 Imaging unit 212 Memory 231 Shift amount acquisition processing unit 232, 611 Omnifocal image generation unit 241 Parameter storage unit 242 Program storage unit 411 Attention slice acquisition unit 612 Imaging position determination processing unit
Description
FIG. 1 is a block diagram showing a configuration example of the microscope observation system according to Embodiment 1 of the present invention. As shown in FIG. 1, the microscope observation system 1 according to Embodiment 1 includes a microscope apparatus 10 that generates a subject image, an imaging device 20 that acquires and processes the magnified image generated by the microscope apparatus 10, and a display device 30 that displays the image processed by the imaging device 20.
sj=σ×j …(1)
σ=(Z/tanθ)/p …(2)
In Expression (2), the distance Z can be approximated by the distance from the objective lens 140 to each depth within the subject S.
v1=(p×σ/M)/(T1/N) …(3)
Next, Modification 1 of Embodiment 1 of the present invention will be described. FIGS. 8 and 9 are schematic diagrams for explaining the method of acquiring multifocal superimposed images in Modification 1.
α = tan⁻¹{(p×σ×N/M)/D} …(4)
Next, Modification 2 of Embodiment 1 of the present invention will be described. In Embodiment 1 above, the multifocal superimposed image was acquired by continuously shifting the positions of the focal plane and the visual field V while keeping the shutter open during one exposure period of the imaging unit 211. However, the shutter that blocks the incidence of light on the imaging unit 211 may instead be opened and closed at a predetermined cycle during one exposure period, with the positions of the focal plane and the visual field V shifted stepwise while the shutter is closed.
Next, Modification 3 of Embodiment 1 of the present invention will be described. In Embodiment 1 above, two multifocal superimposed images with shift amounts of zero and σ were acquired, but multifocal superimposed images with further different shift amounts may also be acquired.
σi=(Z/tanθi)/p …(5)
Next, Modification 4 of Embodiment 1 of the present invention will be described. In Modification 3 above, the magnitude of the shift amount σi of the position of the visual field V was the same for the two multifocal superimposed images SI11 and SI13, but the magnitude of the shift amount σi may be varied among the plurality of multifocal superimposed images SIi.
Next, Embodiment 2 of the present invention will be described. FIG. 13 is a block diagram showing the configuration of the microscope observation system according to Embodiment 2 of the present invention. As shown in FIG. 13, the microscope observation system 2 according to Embodiment 2 includes the microscope apparatus 10, an imaging apparatus 40 that acquires and processes the image of the subject image generated by the microscope apparatus 10, and a display device 50 that displays the image processed by the imaging apparatus 40 and related information. The configuration and operation of the microscope apparatus 10 are the same as those in Embodiment 1 (see FIG. 2).
si,j = σi × j …(6)
|s(i+1),j − si,j| = σi+1 × j − σi × j
j = |s(i+1),j − si,j| / (σi+1 − σi) …(7)
Next, Embodiment 3 of the present invention will be described. FIG. 18 is a block diagram showing a configuration example of the microscope observation system according to Embodiment 3 of the present invention. As shown in FIG. 18, the microscope observation system 3 according to Embodiment 3 includes the microscope apparatus 10, an imaging apparatus 60 that acquires and processes the image of the subject image generated by the microscope apparatus 10, and a display device 50 that displays the image processed by the imaging apparatus 60 and related information. The configuration and operation of the microscope apparatus 10 are the same as those in Embodiment 1 (see FIG. 2), and the configuration and operation of the display device 50 are the same as those in Embodiment 2 (see FIG. 13).
Next, a modification of Embodiment 3 of the present invention will be described. In Embodiment 3 above, the shift amount between adjacent slices was the same within each multifocal superimposed image, but the shift amount between adjacent slices may also be varied within a single multifocal superimposed image.
Claims (7)
- An imaging unit that captures a subject image generated by an observation optical system of a microscope to acquire an image;
a shift means that shifts the positions of the focal plane and the visual field of the observation optical system;
an imaging control unit that causes the imaging unit to acquire a multifocal superimposed image containing image information on a plurality of planes in the optical-axis direction of the observation optical system by shifting the positions of the focal plane and the visual field during one exposure period of the imaging unit;
a shift amount acquisition processing unit that acquires a shift amount by which the position of the visual field is shifted;
an omnifocal image generation unit that generates a plurality of omnifocal images based on a plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts; and
a display unit that displays the plurality of omnifocal images,
- wherein a microscope observation system comprises the above. - The microscope observation system according to claim 1, wherein the imaging control unit determines, based on the shift amount acquired by the shift amount acquisition processing unit, an imaging start position for acquiring each of the plurality of multifocal superimposed images.
- An observation region determination processing unit that determines, as an observation region, a region selected from one of the plurality of omnifocal images in accordance with an externally performed operation; and
an attention slice acquisition unit that extracts a region corresponding to the observation region from omnifocal images other than the one in which the observation region was selected, and acquires the position of the slice containing the structure in the subject corresponding to the observation region based on the shift amount between the position of the observation region in the omnifocal image in which it was selected and the position of the extracted region in the omnifocal image from which it was extracted,
- the microscope observation system according to claim 1, further comprising the above. - The microscope observation system according to claim 3, wherein the omnifocal image generation unit has an imaging position determination processing unit that determines, based on the position of the observation region in the omnifocal image in which the observation region was selected, the imaging position for acquiring a multifocal superimposed image while shifting the position of the visual field of the observation optical system during one exposure period of the imaging unit.
- The microscope observation system according to claim 4, wherein the imaging position determination processing unit determines the imaging position so that the position of the observation region in each omnifocal image does not change among the plurality of omnifocal images.
- In a microscope observation method in which a subject image generated by an observation optical system of a microscope is captured by an imaging unit to acquire an image, the method comprising:
an imaging step of acquiring a multifocal superimposed image containing image information on a plurality of planes in the optical-axis direction of the observation optical system by shifting the positions of the focal plane and the visual field of the observation optical system during one exposure period of the imaging unit;
an omnifocal image generation step of generating a plurality of omnifocal images based on a plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts of the visual field position; and
a display step of displaying the plurality of omnifocal images on a display unit.
- In a microscope observation program for capturing, with an imaging unit, a subject image generated by an observation optical system of a microscope to acquire an image, the program causing a computer to execute:
an imaging control step of performing control to acquire a multifocal superimposed image containing image information on a plurality of planes in the optical-axis direction of the observation optical system by shifting the positions of the focal plane and the visual field of the observation optical system during one exposure period of the imaging unit;
an omnifocal image generation step of generating a plurality of omnifocal images based on a plurality of multifocal superimposed images respectively acquired under a plurality of conditions with different shift amounts of the visual field position; and
a display step of displaying the plurality of omnifocal images on a display unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017512154A JP6563486B2 (ja) | 2015-04-16 | 2015-04-16 | 顕微鏡観察システム、顕微鏡観察方法、及び顕微鏡観察プログラム |
DE112015006271.8T DE112015006271T5 (de) | 2015-04-16 | 2015-04-16 | Mikroskopiesystem, Mikroskopieverfahren und Mikroskopieprogramm |
PCT/JP2015/061742 WO2016166871A1 (ja) | 2015-04-16 | 2015-04-16 | 顕微鏡観察システム、顕微鏡観察方法、及び顕微鏡観察プログラム |
US15/782,920 US10613313B2 (en) | 2015-04-16 | 2017-10-13 | Microscopy system, microscopy method, and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/061742 WO2016166871A1 (ja) | 2015-04-16 | 2015-04-16 | 顕微鏡観察システム、顕微鏡観察方法、及び顕微鏡観察プログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/782,920 Continuation US10613313B2 (en) | 2015-04-16 | 2017-10-13 | Microscopy system, microscopy method, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016166871A1 true WO2016166871A1 (ja) | 2016-10-20 |
Family
ID=57125922
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/061742 WO2016166871A1 (ja) | 2015-04-16 | 2015-04-16 | 顕微鏡観察システム、顕微鏡観察方法、及び顕微鏡観察プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10613313B2 (ja) |
JP (1) | JP6563486B2 (ja) |
DE (1) | DE112015006271T5 (ja) |
WO (1) | WO2016166871A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107367515A (zh) * | 2017-07-14 | 2017-11-21 | 华南理工大学 | 一种超薄柔性ic基板油墨异物检测系统及方法 |
CN109241812A (zh) * | 2017-07-10 | 2019-01-18 | 上海爱观视觉科技有限公司 | 一种原物识别装置及识别方法 |
JP2020134228A (ja) * | 2019-02-15 | 2020-08-31 | 日本分光株式会社 | 自動サンプル検出機能を有する顕微分光装置 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018066845A (ja) * | 2016-10-19 | 2018-04-26 | オリンパス株式会社 | 顕微鏡システム |
JP7034636B2 (ja) * | 2017-09-07 | 2022-03-14 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用観察装置、および医療用観察システム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01309478A (ja) * | 1988-02-23 | 1989-12-13 | Olympus Optical Co Ltd | 画像入出力装置 |
JP2005017557A (ja) * | 2003-06-25 | 2005-01-20 | National Institute Of Advanced Industrial & Technology | 三次元顕微鏡システムおよび画像表示方法 |
JP2011107669A (ja) * | 2009-06-23 | 2011-06-02 | Sony Corp | 生体サンプル像取得装置、生体サンプル像取得方法及び生体サンプル像取得プログラム |
JP2014021490A (ja) * | 2012-07-19 | 2014-02-03 | Sony Corp | スタックされた顕微鏡画像をナビゲートするための方法及び装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3905619C2 (de) | 1988-02-23 | 2000-04-13 | Olympus Optical Co | Image input/output device |
DE3931934C2 (de) | 1988-10-03 | 1994-11-10 | Olympus Optical Co | Image input/output device |
WO2003023482A1 (de) * | 2001-09-11 | 2003-03-20 | Leica Microsystems Wetzlar Gmbh | Method and device for the optical examination of an object |
JP2008067915A (ja) * | 2006-09-14 | 2008-03-27 | Canon Inc | Medical image display device |
JP5868183B2 (ja) | 2010-06-15 | 2016-02-24 | Panasonic Corporation | Imaging apparatus and imaging method |
WO2014149554A1 (en) * | 2013-03-15 | 2014-09-25 | Hologic, Inc. | System and method for navigating a tomosynthesis stack including automatic focusing |
WO2016166858A1 (ja) * | 2015-04-15 | 2016-10-20 | Olympus Corporation | Microscope observation system, microscope observation method, and microscope observation program |
CN108700733A (zh) * | 2016-02-22 | 2018-10-23 | Koninklijke Philips N.V. | System for generating a synthetic 2D image of a biological sample with enhanced depth of field |
US20180192030A1 (en) * | 2016-06-07 | 2018-07-05 | Gary Greenberg | 4-D Video Of An Object Using A Microscope |
US10838190B2 (en) * | 2016-06-21 | 2020-11-17 | Sri International | Hyperspectral imaging methods and apparatuses |
- 2015-04-16 | DE | DE112015006271.8T patent/DE112015006271T5/de | not_active Withdrawn
- 2015-04-16 | WO | PCT/JP2015/061742 patent/WO2016166871A1/ja | active Application Filing
- 2015-04-16 | JP | JP2017512154 patent/JP6563486B2/ja | active Active
- 2017-10-13 | US | US15/782,920 patent/US10613313B2/en | not_active Expired - Fee Related
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109241812A (zh) * | 2017-07-10 | 2019-01-18 | 上海爱观视觉科技有限公司 | Original-object recognition apparatus and recognition method |
CN107367515A (zh) * | 2017-07-14 | 2017-11-21 | South China University of Technology | Ink foreign-matter detection system and method for ultra-thin flexible IC substrates |
CN107367515B (zh) * | 2017-07-14 | 2019-11-15 | South China University of Technology | Ink foreign-matter detection method for ultra-thin flexible IC substrates |
JP2020134228A (ja) * | 2019-02-15 | 2020-08-31 | JASCO Corporation | Microspectroscopic device with automatic sample detection function |
JP7265248B2 (ja) | 2019-02-15 | 2023-04-26 | JASCO Corporation | Microspectroscopic device with automatic sample detection function |
Also Published As
Publication number | Publication date |
---|---|
US20180081162A1 (en) | 2018-03-22 |
JPWO2016166871A1 (ja) | 2018-02-08 |
DE112015006271T5 (de) | 2018-01-18 |
US10613313B2 (en) | 2020-04-07 |
JP6563486B2 (ja) | 2019-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6563486B2 (ja) | Microscope observation system, microscope observation method, and microscope observation program | |
JP5555014B2 (ja) | Virtual slide creation device | |
JP5090188B2 (ja) | Microscope apparatus | |
EP3035104B1 (en) | Microscope system and setting value calculation method | |
US10429632B2 (en) | Microscopy system, microscopy method, and computer-readable recording medium | |
EP2804145B1 (en) | Microscope system and stitched area decision method | |
US10089768B2 (en) | Image processing device, image processing method, image processing program, and imaging system | |
JP2018181333A (ja) | Method for operating a medical-optical display system | |
JP2015197623A (ja) | Microscope system | |
US10721413B2 (en) | Microscopy system, microscopy method, and computer readable recording medium | |
JP2007017930A (ja) | Microscope apparatus | |
JPWO2016092674A1 (ja) | Observation system, optical component, and observation method | |
JP2018194780A (ja) | Microscope system, control method, and program | |
JP6312410B2 (ja) | Alignment device, microscope system, alignment method, and alignment program | |
JP7030986B2 (ja) | Image generation device, image generation method, and image generation program | |
JP5996462B2 (ja) | Image processing device, microscope system, and image processing method | |
JP6422761B2 (ja) | Microscope system, and method for calculating the relationship between Z position and correction device setting value | |
JP6423261B2 (ja) | Microscope system, function calculation method, and program | |
JP2016206228A (ja) | Focus position detection device, focus position detection method, imaging device, and imaging system | |
JP2012150142A (ja) | Microscope control device, microscope system, and control method therefor | |
US11119303B2 (en) | Whole slide image creation device | |
JP2013255100A (ja) | Microscope system | |
JP2013250400A (ja) | Image processing device, image processing method, and image processing program | |
JP2018033059A (ja) | Image processing method, image processing device, image processing program, and imaging device | |
JP2017003874A (ja) | Imaging device and control method for imaging device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15889209; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017512154; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 112015006271; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15889209; Country of ref document: EP; Kind code of ref document: A1 |