WO2010109811A1 - Observation Apparatus - Google Patents
- Publication number
- WO2010109811A1 (PCT/JP2010/001876)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sample
- cut surface
- observation device
- main controller
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
- G01N21/6456—Spatial resolved fluorescence measurements; Imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N1/00—Sampling; Preparing specimens for investigation
- G01N1/02—Devices for withdrawing samples
- G01N1/04—Devices for withdrawing samples in the solid state, e.g. by cutting
- G01N1/06—Devices for withdrawing samples in the solid state, e.g. by cutting providing a thin slice, e.g. microtome
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/693—Acquisition
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N1/00—Sampling; Preparing specimens for investigation
- G01N1/02—Devices for withdrawing samples
- G01N1/04—Devices for withdrawing samples in the solid state, e.g. by cutting
- G01N1/06—Devices for withdrawing samples in the solid state, e.g. by cutting providing a thin slice, e.g. microtome
- G01N2001/065—Drive details
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/01—Arrangements or apparatus for facilitating the optical investigation
- G01N21/03—Cuvette constructions
- G01N2021/0339—Holders for solids, powders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/21—Polarisation-affecting properties
Definitions
- the present invention relates to an observation apparatus for observing the internal structure of a sample such as a biological sample.
- a method is conventionally known in which a sample is continuously sliced using a microtome, each resulting section is mounted on a glass slide to form a prepared slide, and the section is observed or imaged.
- on the other hand, a sample observation apparatus that can observe the internal structure of a sample without preparing such slides is also known (see, for example, Patent Document 1).
- the observation apparatus described in Patent Document 1 raises the sample held in a holding cylinder upward by a predetermined amount with a movable stage so that the sample protrudes from the upper end of the holding cylinder. The observation apparatus then rotates a rotating plate and cuts the protruding portion of the sample with a cutting blade to form a cut surface.
- the cut surfaces newly formed one after another by the cutting blade are imaged by a camera, and a three-dimensional image is displayed on a monitor based on the image data of the cut surfaces.
- JP-A-10-206296 (paragraphs [0036] to [0039], FIG. 10)
- in Patent Document 1, only one set of image data can be acquired for one cut surface. Therefore, with the observation device described in Patent Document 1, the user cannot observe a high-resolution image.
- an object of the present invention is to provide an observation apparatus capable of observing a high-resolution image.
- an observation apparatus includes a holding unit, a cutting unit, an imaging mechanism, a scanning mechanism, and a control unit.
- the holding unit holds a sample or a solid material including the sample.
- the cutting section cuts the held sample or the solid material and sequentially forms a new cut surface.
- the imaging mechanism captures a partial image that is an image within an imaging range smaller than the cut surface and includes an image of a part of the cut surface.
- the scanning mechanism scans the imaging range along the cut surface.
- the control unit drives the scanning mechanism and causes the imaging mechanism to capture a partial image for each imaging range, thereby generating, for each cut surface, composite image information in which the plurality of partial images are combined.
- the imaging mechanism can capture an image within an imaging range that is smaller than the cut surface, and thus can acquire a high-resolution partial image.
- This high-resolution partial image is synthesized for each cut surface, and information of the synthesized image is generated.
- the observation device may display a display image such as a planar image or a stereoscopic image of the sample based on the composite image information. Thereby, the user can observe a high-resolution image.
- the control unit may set a scanning region in which the imaging range is scanned, based on the composite image information, every time a cut surface is newly formed.
- a scanning area having a size suitable for the newly formed cut surface can be set. As a result, it is possible to prevent an unnecessary area from being scanned, so that high-speed processing is possible.
- the control unit may set the scanning region corresponding to the newly formed cut surface based on past composite image information of the cut surface. As a result, it is possible to prevent an unnecessary area from being scanned, so that high-speed processing is possible.
- the control unit may perform edge detection of the image corresponding to the sample in a past composite image of the cut surface, and set the scanning region based on the detected edge information.
- the control unit may change the image region surrounded by the detected edge and set a region including the edge of the changed image region as the scanning region. As a result, it is possible to prevent an unnecessary area from being scanned, so that high-speed processing is possible.
- the imaging mechanism may capture an entire image, that is, an image within a range including at least the entire cut surface of the sample.
- in that case, the control unit may set the scanning region corresponding to the cut surface based on the whole image information every time a cut surface is newly formed.
- a scanning area having a size suitable for the newly formed cut surface can be set. As a result, it is possible to prevent an unnecessary area from being scanned, so that high-speed processing is possible.
- the control unit may variably control the interval at which the sample is cut by the cutting unit.
- the interval at which the sample is cut is variably controlled.
- the interval at which the sample is cut corresponds to the Z resolution of the image data. That is, the Z resolution of the image data is variably controlled. Thereby, for example, in a range where high Z resolution is not required, high speed processing is possible by increasing the interval. On the other hand, in a range where high Z resolution is required, image data with higher resolution can be acquired by reducing the interval.
- the control unit may extract a feature amount in the image of the sample based on the composite image information, and may variably control the interval according to the extracted feature amount.
- the interval can be varied according to the feature amount in the image of the sample. Therefore, for example, the interval (Z resolution) can be variably controlled using cancer cells in a biological sample as the feature amount.
- the control unit may control the interval so that the interval decreases as the feature amount increases.
- the interval can be reduced and the Z resolution can be increased as the number of cancer cells increases.
- an observation apparatus capable of observing a high-resolution image can be provided.
- FIG. 5 is a schematic diagram for explaining the operation shown in FIG. 4.
- FIG. 1 is a schematic diagram showing an observation apparatus 100 according to the first embodiment of the present invention.
- the observation apparatus 100 includes a sample holder 8, a blade 7, an optical system 3, an electronic camera 2, and a control system 5.
- the sample holder 8 has a movable part 8a that is movable in the horizontal direction (XY direction), and fixes the sample P by sandwiching it between the movable part 8a and the opposing side part 8b.
- the sample holder 8 is connected to the XYZ stage 4.
- the XYZ stage 4 includes, for example, an elevating mechanism 14 that is connected to the sample holder 8 and moves the sample holder 8 up and down, and an XY stage 15 that moves the elevating mechanism 14 in the X-axis direction and the Y-axis direction.
- the blade 7 is rotated by a rotation mechanism (not shown) and is configured to cut the sample P held by the sample holder 8 along the XY plane.
- the blade 7 is rotated by a rotation mechanism at a fixed position with respect to the observation apparatus 100.
- the blade 7 may be configured to cut the sample P by horizontal movement.
- the blade 7 may have any form as long as it is configured to cut the sample P along the XY plane.
- Drive mechanisms such as the elevating mechanism 14, the XY stage 15, and the rotation mechanism are realized by, for example, a rack and pinion, a belt, a chain, a linear motor, a ball screw, or a fluid pressure cylinder.
- the optical system 3 includes an illumination light source 19, two objective lenses 11 and 12, and a revolver 13 that switches between the two objective lenses 11 and 12.
- the revolver 13 switches between the two objective lenses 11 and 12 by being rotated by a rotation mechanism (not shown).
- the light source 19 is, for example, a light emitting diode or a xenon lamp.
- the light from the light source 19 may be reflected by a mirror (not shown) and incident on the objective lens to illuminate the sample P.
- as the first objective lens 11, for example, a lens having a magnification of about 40 to 60 times is used.
- the second objective lens 12 is a wide-angle lens having a lower magnification than the first objective lens 11.
- as the second objective lens 12, for example, a lens having a magnification of several times to several tens of times is used.
- the magnification of each lens is not limited to the above range.
- the optical system 3 may have a filter, a dichroic mirror, or the like. This allows the optical system to be configured to acquire fluorescence images, multi-color images, and the like.
- the electronic camera 2 includes an imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the control system 5 includes a main controller 16, an image processing unit 17, and a storage device 18.
- the main controller 16 controls the entire observation apparatus 100 in an integrated manner.
- the main controller 16 controls driving of the XYZ stage 4, the rotation mechanism of the blade 7, and the rotation mechanism of the revolver 13, and outputs image data obtained by the electronic camera 2 to the storage device 18.
- the main controller 16 acquires, from the XYZ stage 4, the position of the sample holder 8 on the XYZ stage 4, that is, the three-dimensional position information of the sample P.
- the storage device 18 stores and holds the image data output from the main controller 16 in a table together with the position information of the XYZ stage 4.
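The pairing of image data with stage coordinates described above can be sketched as a simple lookup table. This is an illustrative Python sketch, not the patent's implementation; the class and method names (`ImageStore`, `tiles_at_z`) are hypothetical.

```python
# Hypothetical sketch of the table in storage device 18: each partial
# image is stored together with the XYZ stage position at capture time.

class ImageStore:
    """Associates partial-image data with the stage position (x, y, z)."""

    def __init__(self):
        self._table = {}  # (x, y, z) -> image data

    def put(self, x, y, z, image):
        self._table[(x, y, z)] = image

    def get(self, x, y, z):
        return self._table[(x, y, z)]

    def tiles_at_z(self, z):
        """All partial images belonging to one cut surface (fixed z)."""
        return {k: v for k, v in self._table.items() if k[2] == z}

store = ImageStore()
store.put(0, 0, 50, "tile00")
store.put(1, 0, 50, "tile10")
store.put(0, 0, 100, "tile00_next_layer")
```

Keying every tile by its stage position is what later allows the composite image for one cut surface to be assembled without cross-correlation between tiles.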
- the image processing unit 17 extracts image data and position information stored in the storage device 18 and executes predetermined image processing based on the extracted image data and position information.
- the observation apparatus 100 includes a display unit 6 such as a liquid crystal display or an organic EL display.
- under the control of the main controller 16, the image processing unit 17 outputs the image generated by the image processing to the display unit 6 for display.
- the observation apparatus 100 may include a printing device such as a printer in addition to the display unit 6 or instead of the display unit 6.
- Examples of hardware for realizing the main controller 16 and the image processing unit 17 include a CPU (Central Processing Unit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), and the like, or combinations thereof.
- the image processing unit 17 may be realized by both software and hardware.
- the hardware includes at least a storage device (for example, ROM or other storage device) that stores a software program.
- the storage device 18 may be a storage medium such as a solid (semiconductor, dielectric or magnetoresistive) memory in addition to a disk-shaped storage medium such as a magnetic disk or an optical disk.
- as the sample P, for example, a pathological sample or a biological tissue sample of a non-human animal or a plant is used.
- the kind of the sample P is not particularly limited; a sample appropriately selected from the medical, pharmaceutical, food, agricultural, or other fields may be used.
- the sample P may be embedded in an embedding material such as resin or paraffin and held in the sample holder 8 as a solid containing the sample P, for example.
- the sample P may be held in the sample holder 8 by freeze embedding.
- the sample holder 8 may have a cooling unit (not shown).
- the sample P itself may be held by the sample holder 8.
- the sample P may be stained before imaging is executed.
- examples of the staining include bright-field staining using a staining solution, such as hematoxylin-eosin staining (HE staining), staining by IHC (immunohistochemistry), and the FISH (fluorescence in situ hybridization) method.
- staining methods using a fluorescent substance, such as nucleic acid fluorescence staining with DAPI (4′,6-diamidino-2-phenylindole), staining methods using an antibody or a nucleic acid probe, and the like can also be mentioned.
- the cut surface of the sample P can also be stained by using the whole-mount method.
- FIG. 2 is a flowchart showing the operation of the observation apparatus 100.
- FIG. 3 is a schematic diagram for explaining the operation shown in FIG.
- the main controller 16 drives a rotation mechanism provided in the blade 7 and cuts the upper end portion of the sample P by the blade 7 (step 101).
- a cut surface of the nth layer of the sample P is formed.
- the main controller 16 controls the electronic camera 2 so as to image the cut surface of the nth layer of the sample P through the second objective lens 12 (step 102).
- the image captured through the second objective lens 12 is an image within a range including at least the entire cut surface of the nth layer of the sample P.
- when the main controller 16 acquires the whole image data including the entire cut surface of the nth layer, it sets the scanning region 1A based on the whole image data (step 103).
- the scanning region 1A refers to the region over which the imaging range 2A, captured by the electronic camera 2 through the first objective lens 11, is scanned (see FIG. 3C).
- specifically, the main controller 16 performs edge detection of the cut surface of the nth layer based on the whole image data (see FIG. 3B), and sets a region encompassing the inside of this edge as the scanning region 1A.
- the edge detection may be executed by, for example, threshold determination of luminance information of the entire image.
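The threshold determination mentioned above can be sketched as follows: pixels whose luminance exceeds a threshold are taken to belong to the cut surface, and the scanning region becomes their bounding box plus a margin. A minimal Python sketch under assumed conventions (the image is a 2-D list of luminance values; the function name `scan_region` and the margin handling are hypothetical):

```python
# Sketch: derive a scanning region from the whole image by luminance
# thresholding. Pixels >= threshold are treated as the cut surface; the
# region is their bounding box expanded by a margin and clipped to the
# image bounds.

def scan_region(whole_image, threshold, margin=1):
    rows = len(whole_image)
    cols = len(whole_image[0])
    xs, ys = [], []
    for y in range(rows):
        for x in range(cols):
            if whole_image[y][x] >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # nothing detected: caller falls back to a full scan
    return (max(min(xs) - margin, 0), max(min(ys) - margin, 0),
            min(max(xs) + margin, cols - 1), min(max(ys) + margin, rows - 1))

image = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
region = scan_region(image, threshold=5)  # bounding box of the bright 2x2 block
```

A real implementation would work on the camera's luminance data and calibrate the threshold, but the bounding-box idea is the same.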
- the main controller 16 drives the rotation mechanism of the revolver 13 to switch from the second objective lens 12 to the first objective lens 11 (step 104).
- the main controller 16 moves the sample holder 8 in the X axis direction and the Y axis direction by driving the XY stage 15 based on the set information of the scanning region 1A (step 105).
- the distance from the first objective lens 11 to the nth-layer cut surface of the sample P is measured by an active distance measuring method using near-infrared light or the like.
- the main controller 16 moves the elevating mechanism 14 up and down according to the measured distance to adjust the focus (step 106).
- the distance measurement method is not limited to the active distance measurement method.
- a passive distance measurement method such as a TTL (Through-the-Lens) method may be used, and the distance measurement method is not particularly limited.
- the main controller 16 controls the electronic camera 2 to capture a partial image corresponding to a part of the cut surface of the nth layer via the first objective lens 11 (step 107).
- the range in which the electronic camera 2 can capture images through the first objective lens 11 will be described as the imaging range 2A (see FIG. 3C).
- the main controller 16 acquires the three-dimensional position information of the sample P from the XYZ stage 4 and outputs it to the storage device 18 together with the partial image data.
- the storage device 18 stores and holds, in a table, the partial image data output from the main controller 16 together with the position information of the XYZ stage 4 (step 108).
- the main controller 16 determines whether all the partial image data in the scanning region 1A have been acquired (step 109). When they have not all been acquired (step 109: NO), the main controller 16 moves the XY stage 15 by a predetermined distance (step 105). In this way, the imaging range 2A in which the electronic camera 2 can image through the first objective lens 11 is scanned along the cut surface of the sample P.
- Steps 105 to 109 are repeated until all the images in the scanning region 1A are acquired.
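The loop of steps 105 to 109 amounts to a raster scan of the imaging range 2A over the scanning region 1A. The sketch below is illustrative only: stage movement, focusing, and storage are collapsed into a single `capture` callback, and the names are hypothetical.

```python
# Sketch of the tile-scanning loop: step the imaging range across a
# region_w x region_h scanning region, one tile_w x tile_h tile at a
# time, calling capture(x, y) at each stage position.

def scan_cut_surface(region_w, region_h, tile_w, tile_h, capture):
    """Visit every tile position needed to cover the scanning region."""
    tiles = []
    y = 0
    while y < region_h:
        x = 0
        while x < region_w:
            # in the apparatus: move XY stage to (x, y), adjust focus,
            # image through the first objective lens, store the data
            tiles.append(capture(x, y))
            x += tile_w
        y += tile_h
    return tiles

# record the visited stage positions for a 30 x 20 region with 10 x 10 tiles
positions = scan_cut_surface(30, 20, 10, 10, capture=lambda x, y: (x, y))
```

In practice adjacent tiles would overlap slightly to support stitching, which this sketch omits.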
- the main controller 16 raises the elevating mechanism 14 and raises the sample P (Step 110).
- the distance by which the sample P is raised is, for example, 50 ⁇ m to 100 ⁇ m, but is not limited to this range.
- the distance by which the sample P is raised corresponds to the Z resolution of the acquired image data.
- the main controller 16 then rotates the blade 7. As a result, the sample P is cut by the distance it was raised, and the (n+1)th-layer cut surface of the sample P is formed. Thereafter, the processing shown in steps 101 to 110 is executed until the entire sample P has been cut.
- the image processing unit 17 acquires the partial image data and the position information of the XYZ stage 4 from the storage device 18, and generates composite image data for each cut surface by combining the partial image data based on the position information.
- the image processing unit 17 displays a display image such as a planar image or a stereoscopic image of the sample P based on the composite image data.
- the image processing unit 17 may execute processing such as position adjustment, color tone, and brightness correction when combining the partial image data.
- the image processing unit 17 may artificially create a stereoscopic image cut at an arbitrary cross section based on the partial image data, and display the stereoscopic image on the display unit 6.
- the main controller 16 may perform various types of image analysis instead of displaying a planar or stereoscopic display image as it is. For example, the main controller 16 may identify specific cells or specific tissues, detect the presence or absence of image features specific to a lesion, identify characteristic regions, detect the presence or absence of expression of specific genes, and analyze their spatial distribution.
- the partial image data of the sample P imaged at high magnification through the first objective lens 11 are acquired, and the partial image data are combined to generate planar image data and stereoscopic image data. Thereby, the user can observe a high-resolution image.
- since the scanning region 1A is set based on the whole image data captured through the second objective lens 12, a scanning region 1A of a size suited to the cut surface can be set every time a cut surface is newly formed. As a result, it is possible to prevent an unnecessary area from being scanned, so that high-speed processing is possible.
- since the position information of the XYZ stage 4 is recorded together with the partial image data, alignment of the composite image data for each cut surface, obtained from a plurality of partial image data, can be performed based on the recorded position information. This is much simpler and faster than stereoscopic image construction by digital pathology using slide glass images.
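Position-based composition can be sketched as pasting each tile into a canvas at its recorded offset. This is an illustrative Python sketch, assuming stage positions have already been converted to pixel offsets; the function name `compose` and the plain-list image representation are hypothetical.

```python
# Sketch: assemble one cut surface from partial images using recorded
# stage positions (as pixel offsets). The composite is a plain 2-D list;
# real pixel data, overlap blending, and calibration are omitted.

def compose(tiles, width, height, tile_w, tile_h):
    """tiles: list of ((x_off, y_off), 2-D pixel block)."""
    canvas = [[0] * width for _ in range(height)]
    for (x0, y0), block in tiles:
        for dy in range(tile_h):
            for dx in range(tile_w):
                canvas[y0 + dy][x0 + dx] = block[dy][dx]
    return canvas

tile = [[1, 1], [1, 1]]
composite = compose([((0, 0), tile), ((2, 0), tile)], width=4, height=2,
                    tile_w=2, tile_h=2)
```

Because every tile carries its stage coordinates, no feature matching between tiles is needed, which is the source of the speed advantage noted above.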
- the observation apparatus 100 is particularly effective when the pathological sample P is observed.
- in the above description, in step 104, a region including the inside of the detected edge is set as the scanning region 1A. That is, the scanning region 1A has been described as a region that encompasses the entire edge and is larger than the edge.
- the present invention is not limited to this, and the scanning region 1A may be a region smaller than the edge.
- for example, the main controller 16 may set the scanning region 1A to a region smaller than the edge. Thereby, the user can observe a display image of only the necessary part of the sample P.
- high-speed processing is also attained.
- FIG. 4 is a flowchart showing the operation of the observation apparatus 100 according to the second embodiment.
- FIG. 5 is a schematic diagram for explaining the operation shown in FIG.
- the main controller 16 drives the blade 7 and cuts the upper end portion of the sample P with the blade 7 (step 201).
- a cut surface of the nth layer of the sample P is formed.
- next, the main controller 16 determines whether the whole image data of the previous, (n−1)th-layer cut surface of the sample P is stored in the storage device 18 (step 202).
- the whole image data of the (n−1)th-layer cut surface determined in step 202 may be composite image data formed by combining the partial image data, or may be whole image data of the cut surface acquired via the second objective lens 12.
- when it is not stored, the main controller 16 sets the maximum scanning region, which is the largest region the scanning region 1A can take, as the scanning region 1A (step 203).
- the main controller 16 scans the imaging range 2A within the maximum scanning region (steps 204 to 207). In this case, movement of the XY stage 15 (step 204), focusing (step 205), imaging through the first objective lens 11 (step 206), data storage (step 207), and determination (step 208) are repeated within the maximum scanning region.
- when the imaging range 2A has been scanned over the entire maximum scanning region, the process returns to step 201.
- in step 202, if the whole image data of the (n−1)th-layer cut surface is stored, the main controller 16 sets the scanning region 1A for the nth-layer cut surface based on the whole image data of the (n−1)th-layer cut surface (step 209).
- specifically, the main controller 16 performs edge detection on the image data of the (n−1)th layer (see FIG. 5A). Then, the main controller 16 forms a region (hereinafter referred to as the change region) by expanding the region surrounded by the detected edge by a fixed amount (see FIG. 5B). Next, the main controller 16 sets a region including the change region as the scanning region 1A (see FIG. 5C).
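Expanding the previous layer's region by a fixed amount corresponds to a morphological dilation of a binary mask. A minimal Python sketch (the mask representation and the function name `dilate` are assumptions, not from the patent):

```python
# Sketch of forming the "change region": the region of the (n-1)th-layer
# cut surface, given as a binary mask, is expanded by a fixed amount
# (a square-neighborhood dilation); the scanning region is then set to
# cover this expanded mask.

def dilate(mask, amount=1):
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            if mask[y][x]:
                # mark every pixel within `amount` of a set pixel
                for dy in range(-amount, amount + 1):
                    for dx in range(-amount, amount + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < rows and 0 <= xx < cols:
                            out[yy][xx] = 1
    return out

mask = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
changed = dilate(mask)  # the single set pixel grows into a 3x3 block
```

The fixed expansion gives the next layer room to differ slightly from the previous one while still avoiding a full-frame scan.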
- the main controller 16 executes the processing shown in Step 204 to Step 210.
- since the scanning region 1A for a new cut surface can be set based on the image data of the previous cut surface, scanning of unnecessary areas can be omitted. Thereby, high-speed processing is realized.
- in step 202, when there is no whole image data of the (n−1)th-layer cut surface, the main controller 16 may acquire the whole image data of the nth-layer cut surface via the second objective lens 12. In this case, the main controller 16 sets the scanning region 1A based on the acquired whole image data of the nth layer and scans the imaging range 2A within the scanning region 1A. That is, when there is no whole image data of the (n−1)th-layer cut surface, the main controller 16 may execute the processing shown in steps 102 to 109 of FIG. 2.
- the change region is formed by expanding the region surrounded by the edge.
- the present invention is not limited to this, and the change region may be formed by reducing the region surrounded by the edge.
- FIG. 6 is a flowchart showing the operation of the observation apparatus 100 according to the third embodiment.
- the main controller 16 rotates the blade 7, cuts the end of the sample P, and forms a cut surface of the nth layer of the sample P (step 301).
- next, the main controller 16 executes the same processing as that shown in steps 105 to 109 of FIG. 2.
- alternatively, the same processing as steps 101 to 109 shown in FIG. 2 may be executed, or the same processing as steps 201 to 208 shown in FIG. 4 may be executed. Further, all the modifications shown in the above-described embodiments can be applied to this embodiment.
- when the cut surface of the nth layer of the sample P has been imaged, the main controller 16 combines the partial image data obtained by the processing in steps 302 to 306 to generate composite image data. Then, the main controller 16 extracts an image feature amount by image analysis based on the composite image data (step 306).
- the image feature amount is determined based on, for example, luminance information of the composite image data.
- Various image feature amounts can be used as the index; in the present embodiment, an image pattern of cancer cells is described as the index. When the image pattern of cancer cells is used, the size of the cancer cells may be used as the index, or the ratio of the size of the cancer cells to the size of the cut surface may be used.
- next, the main controller 16 raises the sample P by raising the elevating mechanism 14 by a distance corresponding to the size of the cancer cells (step 307).
- the distance by which the sample P is raised is set to become smaller as the cancer cells become larger.
- the distance by which the sample P is raised corresponds to the Z resolution of the image data as described above. Therefore, the Z resolution of the image data increases as the cancer cells become larger.
- the distance by which the sample P is raised may decrease stepwise as the cancer cells grow, or may decrease linearly or exponentially.
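The three decreasing schedules mentioned above can be sketched as simple mappings from a feature amount to a cutting interval. All constants below are illustrative assumptions (a feature amount in [0, 1] and an interval range of 50 to 100 micrometres, matching the example range given earlier), not values from the patent.

```python
import math

# Sketch of variable cutting intervals: three monotonically decreasing
# schedules mapping a feature amount (e.g. the cancer-cell area fraction
# in [0, 1]) to the cut interval in micrometres.

MAX_UM, MIN_UM = 100.0, 50.0  # illustrative interval bounds

def stepwise(f):
    """Interval drops in discrete steps as the feature amount grows."""
    return MAX_UM if f < 0.3 else (75.0 if f < 0.7 else MIN_UM)

def linear(f):
    """Interval decreases linearly from MAX_UM to MIN_UM."""
    return MAX_UM - (MAX_UM - MIN_UM) * f

def exponential(f, k=3.0):
    """Interval decays exponentially toward MIN_UM."""
    return MIN_UM + (MAX_UM - MIN_UM) * math.exp(-k * f)
```

Whichever schedule is used, a smaller interval at large feature amounts yields a higher Z resolution exactly where the sample is most interesting.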
- as a result, the working time can be shortened.
- other examples of the image feature amount include the area of the cut surface of the sample P occupied in the composite image data. In this case as well, the distance by which the sample P is raised is set to be smaller as the image feature amount is larger.
- thereby, the time until the sample P appears in the cut surface to be imaged can be shortened, so that work efficiency can be improved.
- the image feature to be used and the corresponding Z resolution vary depending on the properties of the object to be observed. Therefore, by preparing these parameters on a computer in advance and providing a mechanism that allows the user to select the parameters to be used when performing an experiment, the user's work efficiency can be improved.
- the mode in which the sample P is moved in the XY directions and the optical system 3 and the electronic camera 2 are fixed has been described.
- the present invention is not limited to this, and the sample P may be fixed in the XY directions, and the optical system 3 and the electronic camera 2 may be moved in the XY directions.
- the sample P and both the optical system 3 and the electronic camera 2 may be configured to move in the XY directions. That is, any form may be used as long as the relative positions of the sample P, the optical system 3 and the electronic camera 2 in the XY directions can be changed.
- regarding movement in the Z direction, the form in which the sample P moves toward the optical system 3 and the electronic camera 2 has been described.
- the present invention is not limited to this, and the optical system 3 and the electronic camera 2 may be moved to the sample P side.
- the blade 7 is also moved to the sample P side in accordance with the movement of the optical system 3 and the electronic camera 2.
- the staining of the sample P is performed as a pretreatment.
- the method is not limited to this; a method of applying a staining chemical to the newly formed cut surface every time a cut surface of the sample P is formed may also be used.
- in this case, an application mechanism for applying the staining chemical may be arranged at a position facing the cut surface of the sample P.
- FIG. 7 is a schematic diagram showing another form of the optical system.
- the optical system 20 includes a light source 21, a polarizer 22, a beam splitter 23, a Wollaston prism 24, an objective lens 25, and an analyzer 26.
- the light from the light source 21 enters the polarizer 22 and becomes linearly polarized light in a predetermined vibration direction.
- the linearly polarized light from the polarizer 22 is reflected by the beam splitter 23, enters the Wollaston prism 24, and is separated into two linearly polarized beams whose vibration directions are perpendicular to each other.
- the two linearly polarized lights become substantially parallel condensed light through the objective lens 25 and are vertically illuminated at different positions on the cut surface of the sample P.
- the light reflected at the two different positions again enters the Wollaston prism 24 through the objective lens 25, where the two beams are combined and travel on the same optical path.
- the two beams from the Wollaston prism 24 enter the analyzer 26 via the beam splitter 23, where the components in the same vibration direction are extracted and undergo polarization interference. The polarized interference light is then guided to the image plane of the electronic camera 2 to form a differential interference image.
Abstract
Description
On the other hand, a sample observation apparatus that can observe the internal structure of a sample without preparing a slide specimen is also known (see, for example, Patent Document 1).
The holding section holds a sample or a solid body containing the sample.
The cutting section cuts the held sample or solid body and sequentially forms new cut surfaces.
The imaging mechanism captures a partial image, which is an image within an imaging range smaller than the cut surface and contains a part of the cut surface.
The scanning mechanism scans the imaging range along the cut surface.
The control means drives the scanning mechanism and causes the imaging mechanism to capture the partial image for each imaging range, thereby generating, for each cut surface, composite image information of the cut surface, which is an image in which the plurality of partial images are combined.
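As a rough illustration of this tiling scheme — not the patent's actual implementation — the following Python sketch combines partial images captured on a regular grid into one composite image of the cut surface. The tile ordering, the uniform grid, and the absence of overlap blending are simplifying assumptions.

```python
import numpy as np

def stitch_partial_images(tiles, tile_shape, grid_shape):
    """Assemble partial images captured at each imaging range into one
    composite image of the cut surface (illustrative sketch)."""
    th, tw = tile_shape
    rows, cols = grid_shape
    composite = np.zeros((rows * th, cols * tw), dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        # Tiles are assumed to arrive in row-major scan order.
        r, c = divmod(idx, cols)
        composite[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
    return composite
```

In a real instrument the stage positions reported for each tile, rather than a fixed grid index, would determine where each partial image is placed.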
In the present invention, each time a new cut surface is formed, a scan region of a size suited to the newly formed cut surface can be set. This prevents unnecessary regions from being scanned, enabling faster processing.
The control means may set, each time a new cut surface is formed, the scan region corresponding to that cut surface based on the whole-image information.
In this case as well, a scan region of a size suited to each newly formed cut surface can be set, so that unnecessary regions are not scanned and faster processing is possible.
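A minimal sketch of how such a scan region could be derived from the previous cut surface, assuming a simple intensity threshold stands in for the edge detection described here; the function name, threshold, and margin are illustrative assumptions.

```python
import numpy as np

def scan_region_from_previous_slice(prev_composite, threshold, margin):
    """Derive the next scan region from the previous cut surface.

    Threshold the previous composite image to find the sample, then
    take its bounding box enlarged by `margin` pixels so the slightly
    different outline of the new cut surface is still covered.
    """
    mask = prev_composite > threshold
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no sample detected; fall back to a full scan
    h, w = prev_composite.shape
    y0 = max(ys.min() - margin, 0)
    y1 = min(ys.max() + margin + 1, h)
    x0 = max(xs.min() - margin, 0)
    x1 = min(xs.max() + margin + 1, w)
    return (y0, x0, y1, x1)  # region the imaging range is scanned over
```

Restricting the XY scan to this box, instead of the whole holder area, is what yields the speed-up claimed above.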
In the present invention, the interval at which the sample is cut is variably controlled. This cutting interval corresponds to the Z resolution of the image data; in other words, the Z resolution of the image data is variably controlled. Thus, for example, in a range where high Z resolution is not required, the interval can be enlarged to speed up processing, while in a range where high Z resolution is required, the interval can be reduced to acquire higher-resolution image data.
In the present invention, the interval can be varied according to a feature quantity in the image of the sample; for example, cancer cells in a biological sample can be used as the feature quantity to variably control the interval (the Z resolution).
Thus, for example, when the feature quantity is cancer cells in a biological sample, the interval can be reduced as the number of cancer cells increases, raising the Z resolution.
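One hypothetical control rule with this behavior — reducing the cutting interval as the feature quantity grows — might look like the following Python sketch. All constants and the specific formula are assumptions, not taken from the patent.

```python
def next_cut_interval(feature_count, base_interval_um=10.0,
                      min_interval_um=1.0, gain=0.5):
    """Choose the next cutting pitch (Z resolution) from a feature
    quantity extracted from the current composite image, e.g. a count
    of cells of interest. The more features, the finer the pitch.
    Constants are illustrative, not from the patent."""
    interval = base_interval_um / (1.0 + gain * feature_count)
    # Never cut thinner than the mechanism can reliably slice.
    return max(interval, min_interval_um)
```

Any monotonically decreasing, bounded function of the feature quantity would satisfy the behavior described in the claims; this reciprocal form is just one convenient choice.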
<First Embodiment>
(Overall Configuration of the Observation Apparatus)
FIG. 1 is a schematic diagram showing an observation apparatus 100 according to a first embodiment of the present invention.
As shown in FIG. 1, the observation apparatus 100 includes a sample holder 8, a blade 7, an optical system 3, an electronic camera 2, and a control system 5.
The electronic camera 2 includes an imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
The main controller 16 centrally controls the entire observation apparatus 100. For example, the main controller 16 controls the driving of the XYZ stage 4, the rotation mechanism of the blade 7, and the rotation mechanism of the revolver 13, and outputs the image data obtained by the electronic camera 2 to the storage device 18. The main controller 16 also acquires from the XYZ stage 4 the position of the sample holder 8 on the XYZ stage 4, that is, the three-dimensional position information of the sample P.
The image processing unit 17 retrieves the image data and position information stored in the storage device 18 and executes predetermined image processing based on the retrieved image data and position information.
The storage device 18 may be a disk-shaped storage medium such as a magnetic disk or an optical disc, or a solid-state (semiconductor, dielectric, or magnetoresistive) memory.
FIG. 2 is a flowchart showing the operation of the observation apparatus 100. FIG. 3 is a schematic diagram for explaining the operation shown in FIG. 2.
Thereafter, the processing shown in steps 101 to 110 is executed repeatedly until the entire sample P has been cut.
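The repeated cut-and-scan cycle can be summarized in skeleton form in Python; the callback names and the fixed slice thickness below are illustrative stand-ins for the blade, stage, and camera control described above, not the patent's steps 101 to 110 themselves.

```python
def acquire_volume(cut_slice, scan_slice, thickness_um, sample_depth_um):
    """Skeleton of the repeated cut-then-scan cycle: cut a new surface,
    scan it tile by tile into a composite image, and record it with its
    Z position until the sample is used up. `cut_slice` and `scan_slice`
    are hypothetical callbacks standing in for the blade/stage control
    and the tiled imaging, respectively."""
    volume = []
    z = 0.0
    while z < sample_depth_um:
        cut_slice()                    # blade forms a new cut surface
        composite = scan_slice()       # XY scan + stitching of tiles
        volume.append((z, composite))  # pair the image with its Z position
        z += thickness_um
    return volume
```

Stacking the recorded composites in Z order yields the three-dimensional image data of the sample's internal structure.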
Next, a second embodiment of the present invention will be described.
FIG. 4 is a flowchart showing the operation of the observation apparatus 100 according to the second embodiment. FIG. 5 is a schematic diagram for explaining the operation shown in FIG. 4.
Next, a third embodiment of the present invention will be described.
FIG. 6 is a flowchart showing the operation of the observation apparatus 100 according to the third embodiment.
In each of the embodiments described above, the sample P is moved in the XY directions while the optical system 3 and the electronic camera 2 are fixed. However, the configuration is not limited to this: the sample P may be fixed in the XY directions while the optical system 3 and the electronic camera 2 are moved in the XY directions, or both the sample P and the optical system 3 and electronic camera 2 may be moved in the XY directions. That is, any configuration may be used as long as the relative position in the XY directions between the sample P and the optical system 3 and electronic camera 2 can be changed.
As shown in FIG. 7, the optical system 20 includes a light source 21, a polarizer 22, a beam splitter 23, a Wollaston prism 24, an objective lens 25, and an analyzer 26.
1A…scan region
2A…imaging range
2…electronic camera
3…optical system
4…XYZ stage
5…control system
6…display unit
7…blade
8…sample holder
11…first objective lens
12…second objective lens
13…revolver
14…elevating mechanism
15…XY stage
16…main controller
17…image processing unit
18…storage device
100…observation apparatus
Claims (9)
- An observation apparatus comprising:
a holding section that holds a sample or a solid body containing the sample;
a cutting section that cuts the held sample or solid body and sequentially forms new cut surfaces;
an imaging mechanism that captures a partial image, which is an image within an imaging range smaller than the cut surface and contains a part of the cut surface;
a scanning mechanism that scans the imaging range along the cut surface; and
control means that drives the scanning mechanism and causes the imaging mechanism to capture the partial image for each imaging range, thereby generating, for each cut surface, composite image information of the cut surface, which is an image in which the plurality of partial images are combined.
- The observation apparatus according to claim 1, wherein
the control means sets, each time a new cut surface is formed, a scan region over which the imaging range is scanned, based on the composite image information.
- The observation apparatus according to claim 2, wherein
the control means sets the scan region corresponding to the newly formed cut surface based on composite image information of a past cut surface.
- The observation apparatus according to claim 3, wherein
the control means executes edge detection of the image corresponding to the sample in the composite image of the past cut surface, and sets the scan region based on information on the detected edges.
- The observation apparatus according to claim 4, wherein
the control means changes the image region enclosed by the detected edges and sets, as the scan region, a region encompassing the edges of the changed image region.
- The observation apparatus according to claim 1, wherein
the imaging mechanism captures a whole image, which is an image within a range encompassing at least the entire cut surface of the sample, and
the control means sets, each time a new cut surface is formed, the scan region corresponding to the cut surface based on the whole-image information.
- The observation apparatus according to claim 1, wherein
the control means variably controls the interval at which the sample is cut by the cutting section.
- The observation apparatus according to claim 7, wherein
the control means extracts a feature quantity in the image of the sample based on the composite image information, and variably controls the interval according to the extracted feature quantity.
- The observation apparatus according to claim 8, wherein
the control means controls the interval so that the interval decreases as the feature quantity increases.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080012996.8A CN102362168B (zh) | 2009-03-27 | 2010-03-16 | 观察设备 |
EP10755615A EP2413130A1 (en) | 2009-03-27 | 2010-03-16 | Observation device |
US13/256,379 US20120002043A1 (en) | 2009-03-27 | 2010-03-16 | Observation apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009078578A JP5316161B2 (ja) | 2009-03-27 | 2009-03-27 | 観察装置 |
JP2009-078578 | 2009-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010109811A1 true WO2010109811A1 (ja) | 2010-09-30 |
Family
ID=42780511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/001876 WO2010109811A1 (ja) | 2009-03-27 | 2010-03-16 | 観察装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120002043A1 (ja) |
EP (1) | EP2413130A1 (ja) |
JP (1) | JP5316161B2 (ja) |
CN (1) | CN102362168B (ja) |
WO (1) | WO2010109811A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10401608B2 (en) | 2016-05-19 | 2019-09-03 | Olympus Corporation | Image acquisition apparatus |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
BRPI1011689B1 (pt) | 2009-03-11 | 2019-12-17 | Sakura Finetek Usa Inc | método de autofocalização e dispositivo de autofocalização |
US10139613B2 (en) | 2010-08-20 | 2018-11-27 | Sakura Finetek U.S.A., Inc. | Digital microscope and method of sensing an image of a tissue sample |
US20130050431A1 (en) * | 2011-08-29 | 2013-02-28 | Shiseido Company, Ltd. | Method of observing cross-section of cosmetic material |
WO2013105373A1 (ja) | 2012-01-11 | 2013-07-18 | ソニー株式会社 | 情報処理装置、撮像制御方法、プログラム、デジタル顕微鏡システム、表示制御装置、表示制御方法及びプログラム |
DE102013103971A1 (de) | 2013-04-19 | 2014-11-06 | Sensovation Ag | Verfahren zum Erzeugen eines aus mehreren Teilbildern zusammengesetzten Gesamtbilds eines Objekts |
US10007102B2 (en) | 2013-12-23 | 2018-06-26 | Sakura Finetek U.S.A., Inc. | Microscope with slide clamping assembly |
AT518719A1 (de) * | 2016-05-19 | 2017-12-15 | Luttenberger Herbert | Mikrotom |
US11280803B2 (en) | 2016-11-22 | 2022-03-22 | Sakura Finetek U.S.A., Inc. | Slide management system |
EP3550009A4 (en) * | 2016-11-29 | 2019-12-18 | Sony Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM, AND OBSERVATION SYSTEM |
US10684199B2 (en) | 2017-07-27 | 2020-06-16 | Agilent Technologies, Inc. | Preparation of tissue sections using fluorescence-based detection |
JP7231345B2 (ja) * | 2017-07-27 | 2023-03-01 | アジレント・テクノロジーズ・インク | 蛍光に基づく検知を使用する組織切片の調製 |
CN108020503B (zh) * | 2017-11-20 | 2020-09-08 | 苏州博芮恩光电科技有限公司 | 一种光片照明显微切片成像系统及成像结果处理方法 |
US20190333399A1 (en) * | 2018-04-25 | 2019-10-31 | General Electric Company | System and method for virtual reality training using ultrasound image data |
WO2021092725A1 (en) * | 2019-11-11 | 2021-05-20 | Leica Biosystems Nussloch Gmbh | Moving and clamping device, and blade holder |
JP7362008B1 (ja) | 2023-04-19 | 2023-10-16 | 三菱電機株式会社 | 消弧板および遮断器 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1026586A (ja) * | 1996-07-10 | 1998-01-27 | Res Dev Corp Of Japan | 試料観察方法及びその装置 |
JPH10206296A (ja) | 1997-01-21 | 1998-08-07 | Kagaku Gijutsu Shinko Jigyodan | 観察試料支持方法 |
JPH1195125A (ja) * | 1997-09-22 | 1999-04-09 | Olympus Optical Co Ltd | 顕微鏡デジタル画像撮影システム及びその撮影方法 |
JP2001338276A (ja) * | 2000-05-29 | 2001-12-07 | Japan Science & Technology Corp | 試料内の氷結晶構造の計測方法 |
JP2002148153A (ja) * | 2000-11-15 | 2002-05-22 | Inst Of Physical & Chemical Res | 3次元内部構造の解析方法及び装置 |
JP2002533670A (ja) * | 1998-12-21 | 2002-10-08 | ヒストオテック エーピーエス | 組織の塊を切断する方法及び装置 |
JP2003504627A (ja) * | 1999-07-13 | 2003-02-04 | クロマビジョン メディカル システムズ インコーポレイテッド | 生物試料中の物体の自動検出 |
JP2004101871A (ja) * | 2002-09-10 | 2004-04-02 | Olympus Corp | 顕微鏡画像撮影装置 |
JP2004343222A (ja) * | 2003-05-13 | 2004-12-02 | Olympus Corp | 画像処理装置 |
JP2008139143A (ja) * | 2006-12-01 | 2008-06-19 | Sysmex Corp | 標本画像作成方法及び装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6198532B1 (en) * | 1991-02-22 | 2001-03-06 | Applied Spectral Imaging Ltd. | Spectral bio-imaging of the eye |
US5793879A (en) * | 1992-04-13 | 1998-08-11 | Meat Research Corporation | Image analysis for meat |
US7312921B2 (en) * | 2004-02-27 | 2007-12-25 | Hamamatsu Photonics K.K. | Microscope and sample observation method |
US7588438B2 (en) * | 2005-11-01 | 2009-09-15 | The Board Of Regents, The University Of Texas System | System, method and apparatus for fiber sample preparation for image analysis |
JP4840765B2 (ja) * | 2006-02-09 | 2011-12-21 | セイコーインスツル株式会社 | 薄切片作製装置及び薄切片の作製方法 |
2009
- 2009-03-27 JP JP2009078578A patent/JP5316161B2/ja not_active Expired - Fee Related
2010
- 2010-03-16 CN CN201080012996.8A patent/CN102362168B/zh not_active Expired - Fee Related
- 2010-03-16 WO PCT/JP2010/001876 patent/WO2010109811A1/ja active Application Filing
- 2010-03-16 EP EP10755615A patent/EP2413130A1/en not_active Withdrawn
- 2010-03-16 US US13/256,379 patent/US20120002043A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN102362168B (zh) | 2014-04-09 |
CN102362168A (zh) | 2012-02-22 |
JP2010230495A (ja) | 2010-10-14 |
EP2413130A1 (en) | 2012-02-01 |
JP5316161B2 (ja) | 2013-10-16 |
US20120002043A1 (en) | 2012-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5316161B2 (ja) | Observation apparatus | |
CA2868263C (en) | Slide scanner with dynamic focus and specimen tilt and method of operation | |
KR102523559B1 (ko) | 디지털 스캐닝 장치 | |
EP2758825B1 (en) | Slide scanner with a tilted image plane | |
US11391936B2 (en) | Line-scanning, sample-scanning, multimodal confocal microscope | |
US20120307037A1 (en) | Objective-coupled selective plane illumination microscopy | |
US10876970B2 (en) | Light-sheet microscope with parallelized 3D image acquisition | |
US20150185456A1 (en) | Microscope system and control method therefor | |
EP2005235A1 (en) | Confocal microscopy with a two-dimensional array of light emitting diodes | |
JP6972188B2 (ja) | 異なるサイズのスライド用の調整可能なスライドステージ | |
US10409049B2 (en) | Low resolution slide imaging and slide label imaging and high resolution slide imaging using dual optical paths and a single imaging sensor | |
JP7004808B2 (ja) | スライドガラスの走査および処理のための対向縁部システム | |
WO2012002886A1 (en) | Confocal fluorescence lifetime imaging system | |
JP2006275964A (ja) | 走査型蛍光顕微鏡のシェーディング補正方法 | |
EP3435136B1 (en) | Multi-surface image acquisition system | |
JP7119085B2 (ja) | 衝撃再走査システム | |
Ivanov et al. | Correlative imaging of spatio-angular dynamics of molecular assemblies and cells with multimodal instant polarization microscope | |
WO2023161375A1 (en) | Device for measuring intrinsic autofluorescence of a biological sample and method using thereof | |
CN110869747B (zh) | 试样观察装置和试样观察方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080012996.8 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10755615 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2010755615 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 13256379 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 7180/DELNP/2011 Country of ref document: IN |
NENP | Non-entry into the national phase |
Ref country code: DE |