WO2015133475A1 - Microscope device and analysis method - Google Patents

Microscope device and analysis method

Info

Publication number: WO2015133475A1
Authority: WO (WIPO / PCT)
Prior art keywords: spectral, dimensional, image, imaging, wavelength
Application number: PCT/JP2015/056217
Other languages: English (en), Japanese (ja)
Inventors: 直樹 野呂, 洋平 高良, 史識 安藤, 雄大 藤森
Original Assignee: エバ・ジャパン株式会社, 直樹 野呂
Application filed by エバ・ジャパン株式会社 and 直樹 野呂
Publication of WO2015133475A1


Classifications

    • G01J3/06 Scanning arrangements; arrangements for order-selection
    • G01J3/0208 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows, using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/14 Generating the spectrum; monochromators using refracting elements, e.g. prisms
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G02B21/00 Microscopes
    • G02B21/002 Scanning microscopes
    • G02B26/0891 Control of the direction of light by means of one or more refracting elements, the refracting element being a prism forming an optical wedge
    • G02B26/108 Scanning systems having one or more prisms as scanning elements

Definitions

  • the present invention relates to a microscope apparatus and an analysis method, and more particularly, to a microscope apparatus capable of acquiring spectral information of an observation object and an analysis method using the microscope apparatus.
  • In a conventional microscope, an object is magnified using visible light and light in nearby wavelength regions, and it is known that a person can thereby visually recognize details of the object or a minute object.
  • In such observation, light from a light source is irradiated onto an object to be observed (hereinafter, the "object to be observed" is simply referred to as the "object"), and the transmitted light, reflected light, or radiated light from the object (hereinafter referred to collectively, as appropriate, as the "light flux from the object") is observed.
  • By observing the image formed by the light flux from the object, either directly or with an imaging device, the object is observed and subjected to processing such as classification and discrimination (hereinafter, "classification, discrimination, and the like" is simply referred to as "classification processing" as appropriate).
  • However, in both direct visual observation and visual inspection of a captured image, only monochrome or RGB composite information can be obtained as the color information (spectral information) on which the classification processing of the object is based. For this reason, it has been pointed out that classification processing based on such approximate color information is difficult and that the accuracy of the classification processing is limited.
  • Furthermore, when observing an object with poor color information (for example, a living cell, a tissue section, a crystal, etc.), it is usually necessary with a microscope apparatus to perform processing such as staining the object in order to classify its structure and properties.
  • However, such staining changes the state or properties of the object (for example, living cells, tissue sections, crystals), so the object cannot be observed in its natural state (for example, its biological activity).
  • the present invention has been made in view of the various problems of the conventional techniques as described above.
  • An object of the present invention is to acquire spectral information with high wavelength resolution over a wide area of the object.
  • An object of the present invention is to provide a microscope apparatus that can classify and discriminate objects with high accuracy.
  • Another object of the present invention is to provide a microscope apparatus capable of observing an object (for example, a living cell, a tissue slice, a crystal, etc.) over time without causing state deterioration or property change.
  • an object of the present invention is to provide an analysis method for analyzing an object without staining.
  • In order to achieve the above objects, a microscope apparatus according to the present invention includes: a light source that irradiates an object with light; an imaging optical unit that forms an image of the object on a primary image plane with a light beam incident through an objective lens; an imaging means comprising a two-dimensional imaging detector, which disperses the light flux from the image incident through a lens in a direction orthogonal to a predetermined direction with a dispersion optical element and acquires a signal based on the incident light flux, and a moving means for moving the photographing position photographed by the two-dimensional imaging detector in the predetermined direction; first control means for controlling the photographing timing of the two-dimensional imaging detector; second control means for controlling the movement of the moving means; spectral data creation means for creating, based on the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information from the two-dimensional spectral data at the photographing positions; and image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data.
  • Here, the one-dimensional wavelength information is high-wavelength-resolution spectral information (hyperspectral data) in which the wavelength range from ultraviolet to infrared (e.g., 200 nm to 13 μm), including visible light visible to humans, is split into several tens to several hundreds of bands with a wavelength resolution of 0.1 nm to 100 nm.
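As a rough, non-authoritative sketch of what such hyperspectral data looks like in practice (the wavelength range, resolution, and image size below are illustrative assumptions, not values taken from this publication), the number of bands follows from the range and the resolution, and the three-dimensional spectral data can be held as an array with two spatial axes and one wavelength axis:

```python
import numpy as np

# Illustrative sketch only: assumed visible-range scan, 5 nm resolution, 512x512 pixels.
wavelength_start_nm = 400.0
wavelength_end_nm = 1000.0
resolution_nm = 5.0  # within the stated 0.1 nm to 100 nm range

wavelengths = np.arange(wavelength_start_nm, wavelength_end_nm + resolution_nm, resolution_nm)
n_bands = wavelengths.size  # "several tens to several hundreds of bands"

# Three-dimensional spectral data: (spatial Y, spatial Z, wavelength)
cube = np.zeros((512, 512, n_bands), dtype=np.float32)
print(n_bands, cube.shape)  # 121 (512, 512, 121)
```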
  • Further, a microscope apparatus according to the present invention includes: a light source that irradiates an object with light; a microscope unit that includes an imaging optical unit that forms an image of the object on a primary image plane with a light beam incident through an objective lens; an imaging means, detachably disposed on the microscope unit, comprising a two-dimensional imaging detector, which disperses the light flux from the image incident through a lens in a direction orthogonal to a predetermined direction with a dispersion optical element and acquires a signal based on the incident light flux, and a moving means for moving the photographing position photographed by the two-dimensional imaging detector in the predetermined direction; first control means for controlling the photographing timing of the two-dimensional imaging detector; second control means for controlling the movement of the moving means; spectral data creation means for creating, based on the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information from the two-dimensional spectral data at the photographing positions; image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data; acquisition means for obtaining the spectral radiance, spectral luminance, and spectral reflectance or spectral transmittance at each pixel of the spectral image; and analysis processing means for analyzing the spectral image by a predetermined analysis method.
  • Further, in the above-described microscope apparatus, the two-dimensional imaging detector can be attached and detached together with optical elements including the lens and the dispersion optical element.
  • the analysis method according to the present invention is to analyze the object without staining using the above-described microscope apparatus.
  • the microscope apparatus according to the present invention is such that in the above-described microscope apparatus, the moving means is a moving member on which a configuration subsequent to the lens is mounted.
  • Further, a microscope apparatus according to the present invention includes: a light source that irradiates an object with light; an imaging optical unit that forms an image of the object on a primary image plane with a light beam incident through an objective lens; an imaging means with a two-dimensional imaging detector that receives the light flux from the image, incident through a lens and dispersed in a direction orthogonal to a predetermined direction by a dispersion optical element, and acquires a signal based on the incident light flux; control means for controlling the photographing timing of the two-dimensional imaging detector; spectral data creation means for creating, based on the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information from the two-dimensional spectral data at the photographing positions; image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data; acquisition means for obtaining the spectral radiance, spectral luminance, and spectral reflectance or spectral transmittance at each pixel of the spectral image; and analysis processing means for analyzing the spectral image by a predetermined analysis method. The imaging means captures images while moving the photographing position in the predetermined direction by changing its positional relationship with the object in the predetermined direction.
  • Further, a microscope apparatus according to the present invention includes: a light source that irradiates an object with light; a microscope unit that includes an imaging optical unit that forms an image of the object on a primary image plane with a light beam incident through an objective lens; an imaging means, detachably disposed on the microscope unit, with a two-dimensional imaging detector that disperses the light flux from the image incident through a lens in a direction orthogonal to a predetermined direction with a dispersion optical element and acquires a signal based on the incident light flux; control means for controlling the photographing timing of the two-dimensional imaging detector; spectral data creation means for creating, based on the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information from the two-dimensional spectral data at the photographing positions; image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data; acquisition means for obtaining the spectral radiance, spectral luminance, and spectral reflectance or spectral transmittance at each pixel of the spectral image; and analysis processing means for analyzing the spectral image by a predetermined analysis method. The imaging means captures images while moving the photographing position in the predetermined direction by changing its positional relationship with the object in the predetermined direction.
  • Since the present invention is configured as described above, it has the excellent effect that objects which cannot be classified by the prior art can be classified and discriminated with high accuracy.
  • Since the present invention is configured as described above, it also has the effect that an object such as a living cell, a tissue section, or a crystal can be observed over time without causing deterioration of its state or a change in its properties.
  • Since the present invention is configured as described above, it further has the excellent effect that the object can be analyzed without staining.
  • FIG. 1A is an explanatory diagram of a schematic configuration of a microscope apparatus according to the present invention
  • FIG. 1B is a view taken in the direction of arrow A in FIG. 1A.
  • FIG. 2 is a block diagram illustrating the control unit of the microscope apparatus according to the present invention.
  • FIG. 3 is a flowchart showing a processing routine of imaging processing in the microscope apparatus.
  • FIG. 4 is a flowchart showing a processing routine of analysis processing in the microscope apparatus.
  • FIG. 5 is a flowchart showing the processing routine of the calibration process.
  • FIG. 6 (a) is an RGB image of a nematode photographed by the microscope apparatus according to the present invention
  • FIG. 6 (b) is a transmission spectrum of the photographed image obtained by the microscope apparatus according to the present invention.
  • FIG. 6C is a spectrum analysis image acquired by the microscope apparatus according to the present invention.
  • FIG. 7 (a) is an RGB image of alga bodies photographed by the microscope apparatus according to the present invention
  • FIG. 7 (b) is a transmission spectrum of the photographed image acquired by the microscope apparatus according to the present invention.
  • FIG. 7C is a spectrum analysis image acquired by the microscope apparatus according to the present invention.
  • FIG. 8 is a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • FIG. 9 is a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • FIG. 10 is a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • FIG. 11(a) is an explanatory diagram of the pair of achromatic prisms arranged in the initial state.
  • FIG. 11(b) is a view taken in the direction of arrow B in FIG. 11(a).
  • FIG. 11(c) is an explanatory diagram of the pair of achromatic prisms in the state in which they function as a plane-parallel plate.
  • FIG. 11(d) is a view taken in the direction of arrow C in FIG. 11(c).
  • FIGS. 12(a) and 12(b) are schematic configuration explanatory views showing a modification of the microscope apparatus according to the present invention.
  • FIG. 1 (a) shows a schematic configuration explanatory diagram of a microscope apparatus according to the present invention
  • FIG. 1 (b) shows a view as seen from the arrow A in FIG. 1 (a).
  • FIG. 2 is an explanatory diagram of the block configuration of the control unit.
  • The microscope apparatus 10 shown in FIG. 1 includes: an imaging optical unit 12 having a variable-magnification optical system (not shown) that changes the diameter of a substantially parallel light beam incident through an objective lens (not shown) and emits it again as a substantially parallel light beam, the imaging optical unit 12 forming an image of the light flux from the object 200 on a primary image plane via this variable-magnification optical system; an imaging unit 14 that photographs a predetermined region of the primary image plane including the image 210 formed on the primary image plane by the imaging optical unit 12; and a control unit 16 that controls photographing by the imaging unit 14, controls the variable-magnification optical system (not shown) in the imaging optical unit 12, and adjusts the magnification of the image formed on the primary image plane.
  • The microscope apparatus 10 is also provided with a light source unit 18, which is a transmission-type light source unit that irradiates the object 200 with light that passes through it or an epi-illumination-type light source unit that irradiates the object 200 with light that is reflected by it.
  • a halogen lamp, a xenon lamp, a mercury lamp, an LED lamp, a laser, or the like is used.
  • In FIG. 1, the case where a transmission-type light source unit is used is illustrated.
  • The imaging unit 14 includes: an objective lens 22 that receives the light beam from the image 210 formed on the primary image plane; a precision linear motion stage 24 that moves in the Y-axis direction of an XYZ orthogonal coordinate system; a slit plate 26 disposed so that a slit opening 26a extending in the Z-axis direction is positioned on the image plane of the objective lens 22; a collimating lens 28 that converts the light beam that has passed through the slit opening 26a into parallel light; a dispersion optical element 30 that disperses the parallel light from the collimating lens 28 in the Y-axis direction; an imaging lens 32 that forms an image of the light beam emitted from the dispersion optical element 30; and a two-dimensional imaging detector 34 disposed so that its detection unit 34a is positioned on the image plane of the imaging lens 32.
  • On the precision linear motion stage 24, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are fixedly disposed. Therefore, as the precision linear motion stage 24 moves in the Y-axis direction, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 move together in the Y-axis direction.
  • the slit plate 26 is arranged so that the light beam from the objective lens 22 passes through a slit opening 26a extending in the Z-axis direction.
  • a diffraction grating, a prism, or a grism can be used for the dispersion optical element 30.
  • the grism is a direct-view diffraction grating that combines a transmission diffraction grating and a prism.
  • the diffraction grating, prism, or grism is disposed so as to disperse the incident light beam in the Y-axis direction perpendicular to the Z-axis direction that is the extension direction of the slit opening 26a.
  • the objective lens 22, the slit plate 26, the collimating lens 28, the dispersion optical element 30, and the imaging lens 32 are configured to be exchangeable as appropriate.
  • the two-dimensional imaging detector 34 is an area sensor in which the detection unit 34a is arranged in parallel with the Z axis and the Y axis (that is, parallel to the primary image plane and parallel to the YZ plane).
  • The two-dimensional imaging detector 34 has a spatial pixel count of 1 to 29 million pixels, an obtainable wavelength range of 200 nm to 13 μm, and a wavelength resolution of 0.1 nm to 100 nm, which can be changed according to the purpose of observation and the type of object.
  • That is, by replacing the two-dimensional imaging detector 34 with one having a different wavelength range, wavelength resolution, number of spatial pixels, and so on, and by appropriately replacing the objective lens 22, the slit plate 26, the collimating lens 28, the dispersion optical element 30, and the imaging lens 32, the wavelength range, wavelength resolution, and number of spatial pixels that can be captured by the imaging unit 14 can be changed.
  • As a result, the wavelength range, wavelength resolution, and number of spatial pixels of the acquired spectral image can be changed.
  • The control unit 16 is connected to the precision linear motion stage 24 and the two-dimensional imaging detector 34, and is configured by, for example, a microcomputer or a general-purpose personal computer. The control unit 16 is also connected to the imaging optical unit 12 to control the variable-magnification optical system (not shown), and is connected to the light source unit 18 to control switching the light source ON and OFF.
  • The control unit 16 includes: an imaging control unit 40 that controls the timing at which the two-dimensional imaging detector 34 acquires an electrical signal (that is, the photographing timing); a movement control unit 42 that controls the direction, amount, and timing of movement of the precision linear motion stage 24 in the Y-axis direction; a spectral data creation unit 44 that creates spectral data based on the electrical signal from the two-dimensional imaging detector 34; an image creation unit 46 that creates a spectral image based on the spectral data created by the spectral data creation unit 44; an image processing unit 48 that processes the spectral image for each wavelength acquired by photographing the object 200; and an analysis processing unit 56 that performs analysis processing on the spectral images processed by the image processing unit 48.
  • The imaging control unit 40 causes the two-dimensional imaging detector 34 to perform photographing (that is, to acquire an electrical signal in the two-dimensional imaging detector), and outputs information indicating that photographing has been performed to the movement control unit 42.
  • The imaging control unit 40 also determines whether the precision linear motion stage 24 has been moved until the photographing position within the predetermined photographing region including the image 210 reaches the photographing end position; when information indicating that the stage has moved to the photographing end position is output from the movement control unit 42, the photographing processing is terminated after a final photograph is taken.
  • The movement control unit 42 sequentially moves the precision linear motion stage 24 in the Y-axis direction, at a predetermined interval corresponding to the slit width of the slit opening 26a, within the predetermined photographing region extending from the one end 210c of the image 210 in the Y-axis direction to the other end 210d. Thereby, the photographing position within the predetermined photographing region is moved in the Y-axis direction.
  • That is, when the operator instructs the start of photographing, the movement control unit 42 moves the precision linear motion stage 24 so that the photographing position of the two-dimensional imaging detector 34 becomes the photographing start position, and outputs information indicating that the precision linear motion stage 24 has been moved to the imaging control unit 40.
  • Further, the movement control unit 42 moves the precision linear motion stage 24 at the predetermined interval corresponding to the slit width of the slit opening 26a, and outputs information indicating that the precision linear motion stage 24 has been moved to the imaging control unit 40.
  • When the movement control unit 42 has moved the precision linear motion stage 24 until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position, it outputs information indicating that it has moved to the photographing end position to the imaging control unit 40.
  • The spectral data creation unit 44 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 34, and outputs the created two-dimensional spectral data to the storage unit 50 provided in the control unit 16 for storage.
  • When photographing at every photographing position is completed, the spectral data creation unit 44 uses the two-dimensional spectral data stored in the storage unit 50 to create three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information, and outputs the created three-dimensional spectral data to the storage unit 50 for storage.
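A minimal sketch of how the three-dimensional spectral data can be assembled from the per-position two-dimensional spectral data (the array shapes and the capture_frame stand-in below are assumptions for illustration, not the actual detector interface):

```python
import numpy as np

n_scan_positions = 300   # stage positions along Y (assumed)
n_slit_pixels = 512      # spatial pixels along the slit (Z) (assumed)
n_bands = 121            # spectral bands (assumed)

def capture_frame(position_index: int) -> np.ndarray:
    """Stand-in for one detector readout: 2-D spectral data of shape (Z, wavelength)."""
    return np.random.rand(n_slit_pixels, n_bands).astype(np.float32)

# One 2-D spectral frame per photographing position, stacked along the scan axis (Y).
frames = [capture_frame(i) for i in range(n_scan_positions)]
cube = np.stack(frames, axis=0)          # (Y, Z, wavelength): 3-D spectral data
assert cube.shape == (n_scan_positions, n_slit_pixels, n_bands)

# A "spectral image for each wavelength" is one slice of the cube at a band index.
spectral_image = cube[:, :, 60]
```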
  • the image creation unit 46 creates a spectral image for each wavelength based on the three-dimensional spectral data created by the spectral data creation unit 44.
  • That is, the image creation unit 46 acquires a spectral image for each band (each wavelength) based on the three-dimensional spectral data, which is split into several hundred bands within a predetermined wavelength range of 200 nm to 13 μm at a predetermined wavelength resolution of 0.1 nm to 100 nm.
  • The image processing unit 48 includes a calibration processing unit 52 that performs dark noise correction processing, inter-pixel sensitivity deviation correction processing, luminance calibration processing, and light source correction processing on the spectral image for each wavelength acquired by photographing the object 200, and a calculation unit 54 that calculates the spectral luminance (cd/(m²·nm)) from the spectral radiance (W/(sr·m²·nm)) obtained in the luminance calibration processing and that calculates the sensitivity correction coefficient, the luminance calibration coefficient, and the spectral reflectance or spectral transmittance.
  • First, the calibration processing unit 52 performs dark noise correction (dark correction) processing for removing noise caused by dark current in the two-dimensional imaging detector 34.
  • Next, inter-pixel sensitivity deviation correction processing is performed, using the sensitivity correction coefficient, on the spectral image that has undergone the dark noise correction processing.
  • Next, luminance calibration processing is performed, using the luminance calibration coefficient, on the spectral image that has undergone the inter-pixel sensitivity deviation correction processing.
  • In addition, light source correction processing is performed, in which unevenness of the light source illumination across the space of the dark-noise-corrected spectral image is corrected and the spectral reflectance or spectral transmittance of the object 200 is acquired.
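A minimal sketch of this calibration flow, assuming the coefficient arrays described in the following paragraphs have already been prepared (array names are hypothetical; the per-pixel arrays share the cube shape, and the luminance calibration coefficient is one value per wavelength):

```python
import numpy as np

def calibrate(raw_cube, dark_cube, sensitivity_coeff, luminance_coeff, light_source_cube):
    """raw_cube, dark_cube, sensitivity_coeff, light_source_cube: (Y, Z, wavelength);
    luminance_coeff: (wavelength,)."""
    # 1. Dark noise correction: subtract the dark-current data.
    dark_corrected = raw_cube - dark_cube
    # 2. Inter-pixel sensitivity deviation correction: multiply per-pixel coefficients.
    sensitivity_corrected = dark_corrected * sensitivity_coeff
    # 3. Luminance calibration: per-wavelength conversion to spectral radiance.
    spectral_radiance = sensitivity_corrected * luminance_coeff
    # 4. Light source correction: divide the dark-corrected data by the light source
    #    data to obtain spectral reflectance or spectral transmittance.
    reflectance = dark_corrected / light_source_cube
    return spectral_radiance, reflectance
```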
  • The dark noise correction data used in the dark noise correction processing is spectral image data for each wavelength created from three-dimensional spectral data acquired by photographing in a light-blocked state, without placing the object 200 in the photographing region, before the object 200 is observed (that is, before the object 200 is photographed). The photographing procedure at this time is the same as the photographing processing described later.
  • The sensitivity correction coefficient is created from three-dimensional spectral data acquired by photographing, before the object 200 is observed (that is, before the object 200 is photographed) and without placing the object 200 in the photographing region, light from a light source whose spatial radiance distribution has been made uniform by an integrating sphere or the like (hereinafter referred to as a "uniform standard light source"). The photographing procedure at this time is the same as the photographing processing described later.
  • Specifically, in the spectral image for each wavelength of the uniform standard light source imaging data, the output value of a designated pixel (a single pixel or the average of a plurality of pixels) is taken as a reference value, and the correction coefficient for each pixel is calculated by dividing this reference value by the output value of each pixel; the result is created as three-dimensional spectral data holding a correction coefficient for each pixel of the spectral image for each wavelength. The calculation of the sensitivity correction coefficient is performed by the calculation unit 54 and the result is output to the storage unit 50 for storage.
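A minimal sketch of this sensitivity correction coefficient calculation (hypothetical function and variable names; the reference pixel index is an assumption):

```python
import numpy as np

def sensitivity_coefficients(uniform_cube, ref_pixel=(0, 0)):
    """uniform_cube: (Y, Z, wavelength) data of the uniform standard light source.
    Returns per-pixel coefficients: reference pixel value / each pixel value."""
    reference = uniform_cube[ref_pixel[0], ref_pixel[1], :]   # reference value per band
    return reference[np.newaxis, np.newaxis, :] / uniform_cube
```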
  • The luminance calibration coefficient is one-dimensional data created from three-dimensional spectral data acquired by photographing, before the object 200 is observed (that is, before the object 200 is photographed) and without placing the object 200 in the photographing region, a light source whose spectral radiance values are known (hereinafter referred to as a "spectral radiance standard light source"). The photographing procedure at this time is the same as the photographing processing described later.
  • That is, the luminance calibration coefficient is one-dimensional data corresponding to each wavelength: the known spectral radiance value of the spectral radiance standard light source at each wavelength is divided by the output value of a designated pixel (a single pixel or the average of a plurality of pixels) in the spectral image of that wavelength obtained by photographing the standard light source, yielding a conversion coefficient for each wavelength. The calculation of the luminance calibration coefficient is performed by the calculation unit 54 and the result is output to the storage unit 50 for storage.
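A minimal sketch of the luminance calibration coefficient calculation (hypothetical names; known_radiance stands for the certified spectral radiance values of the standard light source):

```python
import numpy as np

def luminance_calibration_coefficients(standard_cube, known_radiance, ref_pixel=(0, 0)):
    """standard_cube: (Y, Z, wavelength) data of the spectral radiance standard light source;
    known_radiance: (wavelength,) certified spectral radiance of that source.
    Returns one conversion coefficient per wavelength."""
    measured = standard_cube[ref_pixel[0], ref_pixel[1], :]  # output of the designated pixel
    return known_radiance / measured
```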
  • The light source data is created as three-dimensional spectral data acquired, before the object 200 is observed (that is, before the object 200 is photographed) and without placing the object 200 in the photographing region, by photographing the reflected light obtained by irradiating a reflection reference such as a standard white plate with the light source light. The photographing procedure at this time is the same as the photographing processing described later.
  • The spectral reflectance or spectral transmittance R(λ) at each pixel of the spectral image for each wavelength is expressed by the following equation:
  • R(λ) = C(λ) / E(λ), where C(λ) is the dark-noise-corrected output value of the pixel at wavelength λ obtained by photographing the object 200 and E(λ) is the output value of the corresponding pixel of the light source data.
  • The calculation unit 54 calculates the spectral luminance L(λ) (cd/(m²·nm)) from the spectral radiance Le(λ) (W/(sr·m²·nm)) in the spectral image for each wavelength processed in the calibration processing unit 52, using the following equation:
  • L(λ) = Km × Le(λ) × V(λ)
  • L(λ): spectral luminance (cd/(m²·nm)) at each pixel of the spectral image; Le(λ): spectral radiance (W/(sr·m²·nm)); V(λ): spectral luminous efficiency; Km: maximum luminous efficacy (683 lm/W)
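A minimal sketch of this conversion (the tabulated V(λ) values are assumed to be available at the same wavelength sampling as the spectral radiance data):

```python
import numpy as np

KM = 683.0  # maximum luminous efficacy, lm/W

def spectral_luminance(spectral_radiance, v_lambda):
    """spectral_radiance: (..., wavelength) in W/(sr*m^2*nm);
    v_lambda: (wavelength,) spectral luminous efficiency V(lambda) at the same bands.
    Returns spectral luminance in cd/(m^2*nm): L = Km * Le * V."""
    return KM * spectral_radiance * v_lambda
```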
  • data used for spectral analysis is generally spectral radiance or spectral reflectance (or spectral transmittance), and spectral luminance is used only in specific situations such as color relations.
  • the process of calculating the luminance is executed when an instruction from the operator is given.
  • the analysis processing unit 56 uses, for example, spectral radiance, spectral luminance, spectral reflectance, or spectral transmittance at each pixel of the spectral image for each wavelength acquired by photographing the object 200 processed by the image processing unit 48. Then, analysis processing is performed based on the set data analysis method.
  • In the analysis processing unit 56, various analysis methods can be used, for example:
  • K-means method
  • ISODATA method
  • Spectral Angle Mapper (SAM)
  • Support Vector Machine (SVM)
  • Support Vector Regression (SVR)
  • Cross-Correlation method
  • Linear Unmixing method
  • Principal Component Analysis (PCA)
  • Minimum Noise Fraction (MNF) transformation
  • Matched Filter
  • Constrained Energy Minimization (CEM)
  • Orthogonal Subspace Projection (OSP)
  • Decision Boundary Feature Extraction
  • Bhattacharyya Distance method
  • Cluster analysis
  • Discriminant Analysis
  • Multiple Linear Regression (MLR)
  • Principal Components Regression (PCR)
  • Classical Least Squares (CLS)
  • Partial Least Squares Regression (PLSR)
  • Fourier Transform Regression analysis
  • Normalized Difference Vegetation Index (NDVI)
  • Normalized spectral reflectance index (NDSI)
  • Normalized water index (NDWI)
  • Moisture ratio prediction index analysis (PP)
  • With the two-dimensional imaging detector 34 positioned so that it can photograph from the one end 210a to the other end 210b of the image 210 in the Z-axis direction, the operator gives an instruction to start photographing.
  • Then, the control unit 16 starts the photographing processing.
  • the flowchart of FIG. 3 shows the detailed processing contents of the imaging process in the microscope apparatus 10 according to the present invention.
  • First, the precision linear motion stage 24 is moved so that the photographing position of the two-dimensional imaging detector 34 coincides with the photographing start position (step S302).
  • the imaging start position is set in advance, and is, for example, one end in the Y-axis direction in a predetermined imaging area including the image 210.
  • Specifically, the movement control unit 42 moves the precision linear motion stage 24 to a position at which the photographing position of the two-dimensional imaging detector 34 coincides with the photographing start position, and outputs information indicating that the precision linear motion stage 24 has been moved to the imaging control unit 40.
  • the imaging control unit 40 executes processing for imaging in the two-dimensional imaging detector 34 (step S304).
  • the photographing control unit 40 performs photographing (acquisition of an electrical signal) in the two-dimensional imaging detector 34 and outputs information indicating that photographing has been performed to the movement control unit 42.
  • At this time, the two-dimensional imaging detector 34 converts the light intensity distribution of the incident light into an electrical signal, and this electrical signal is output to the spectral data creation unit 44.
  • the spectral data creation unit 44 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 34.
  • the created two-dimensional spectroscopic data is output to and stored in the storage unit 50.
  • the movement control unit 42 moves the precision linear motion stage 24 at a predetermined interval (step S306).
  • Specifically, the movement control unit 42 moves the precision linear motion stage 24 by the predetermined interval and outputs information indicating that the precision linear motion stage 24 has been moved to the imaging control unit 40.
  • When the stage has been moved to the photographing end position, the movement control unit 42 outputs information indicating that it has moved to the photographing end position.
  • the imaging control unit 40 determines whether or not the precise linear motion stage 24 has moved until the imaging position of the two-dimensional imaging detector 34 reaches the imaging end position (step S308).
  • this photographing end position is set in advance, and is, for example, the other end in the Y-axis direction in a predetermined photographing region.
  • the shooting control unit 40 determines whether or not information indicating that the movement control unit 42 has moved to the shooting end position has been output.
  • In the determination of step S308, if information indicating that the movement control unit 42 has moved to the photographing end position has not been output, the imaging control unit 40 determines that the precision linear motion stage 24 has not yet been moved until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position. On the other hand, when information indicating that the movement control unit 42 has moved to the photographing end position has been output, the imaging control unit 40 determines that the precision linear motion stage 24 has been moved until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position.
  • If it is determined in step S308 that the precision linear motion stage 24 has not yet been moved until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position, the processing returns to step S304 and the processing from step S304 onward is performed again.
  • On the other hand, if it is determined in step S308 that the precision linear motion stage 24 has been moved until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position, photographing (acquisition of an electrical signal) is performed by the two-dimensional imaging detector 34 at the photographing end position (step S310), and the photographing processing is terminated.
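A minimal sketch of this photographing routine (steps S302 to S310); the stage and detector objects are hypothetical stand-ins for the hardware interfaces, which are not described in this publication:

```python
def photographing_process(stage, detector, start_pos, end_pos, step):
    """Scan the photographing position from start_pos to end_pos at slit-width steps."""
    stage.move_to(start_pos)                 # S302: move to the photographing start position
    position = start_pos
    frames = []
    while True:
        frames.append(detector.capture())    # S304 / S310: photograph (acquire a signal)
        if position >= end_pos:              # S308: photographing end position reached?
            break
        position += step                     # S306: move the stage by the predetermined interval
        stage.move_to(position)
    return frames                            # one 2-D spectral frame per photographing position
```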
  • the flowchart of FIG. 4 shows the detailed processing contents of the analysis processing in the microscope apparatus 10 according to the present invention.
  • First, the spectral data creation unit 44 integrates the two-dimensional spectral data, each having one-dimensional spatial information and one-dimensional wavelength information, acquired at each photographing position, and creates three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information representing the predetermined photographing region (step S402).
  • the image creation unit 46 creates a spectral image for each wavelength in the created three-dimensional spectral data (step S404), and performs calibration processing for the created spectral image for each wavelength (step S406).
  • FIG. 5 shows a flowchart showing the detailed processing contents of the calibration processing.
  • First, dark noise correction processing is performed (step S502).
  • In step S502, dark noise correction processing is performed using the dark noise correction data created in advance and stored in the storage unit 50: from the output value of each pixel of the spectral image for each wavelength, the output value of the corresponding pixel in the dark noise correction data is subtracted.
  • a sensitivity deviation correction process between pixels is performed on the spectral image for each wavelength subjected to the dark noise correction process in this way (step S504).
  • Specifically, the calibration processing unit 52 multiplies each pixel of the dark-noise-corrected spectral image for each wavelength by the sensitivity correction coefficient created in advance and stored in the storage unit 50.
  • a luminance calibration process is performed on the spectral image for each wavelength subjected to the inter-pixel sensitivity deviation correction process (step S506).
  • Specifically, the calibration processing unit 52 multiplies the output value of each pixel of the spectral image for each wavelength that has undergone the inter-pixel sensitivity deviation correction processing by the luminance calibration coefficient created in advance and stored in the storage unit 50. The values obtained by this multiplication represent the spectral radiance at each pixel of the spectral image for each wavelength.
  • Then, the calculation unit 54 calculates the spectral luminance at each pixel of the spectral image for each wavelength from the acquired spectral radiance at each pixel (step S508), and the processing proceeds to step S408.
  • Since the spectral luminance is used only in specific cases as described above, the processing of step S508 may be omitted, for example, when there is no instruction from the operator to calculate the spectral luminance.
  • Next, the light source correction processing (step S408) is performed: in the calibration processing unit 52, the output value of each pixel of the dark-noise-corrected spectral image for each wavelength acquired in the processing of step S502 is divided by the output value of the corresponding pixel of the light source data at the corresponding wavelength, whereby the spectral reflectance or spectral transmittance at each pixel of the spectral image for each wavelength of the object 200 is obtained.
  • Then, the analysis processing unit 56 performs analysis processing using, for example, an analysis method set by the user, the analysis image analyzed by the predetermined analysis method is displayed on a display unit (not shown) (step S412), and the analysis processing is terminated.
  • the worker classifies or discriminates the object with reference to the analysis image data displayed on the display unit (not shown).
  • Nematodes were grown on NGM plates on which E. coli (OP50) had been cultured, at a constant temperature of 20 °C at 4-day intervals, until they became adults.
  • The spectral data were analyzed by a supervised maximum likelihood method, classified by tissue, and visualized.
  • FIG. 6A shows an RGB image of a nematode photographed by the microscope apparatus 10
  • FIG. 6(b) shows a transmission spectrum of the photographed image acquired by the microscope apparatus 10, and FIG. 6(c) shows a spectrum analysis image acquired by the microscope apparatus 10.
  • In FIG. 6(b), the spectral transmittance R(λ) is calculated for each pixel of the spectral image, a representative location is selected for each nematode organ, and the average spectral transmittance of each region of interest (ROI) is calculated and displayed as a graph.
  • In FIG. 6(c), each of the selected ROIs is assigned to a class as a training (teacher) region, the occurrence probability (likelihood) of each class is obtained from the following formula based on the spectral information of each spatial pixel, each pixel is classified into a class by the maximum likelihood method, and organ classification of the unstained nematode is thereby performed.
  • Lci(x) = (2π)^(-n/2) × |Σci|^(-1/2) × exp{ -(1/2) × (x - μci)ᵀ Σci⁻¹ (x - μci) }
  • where Lci(x) is the likelihood that the spectrum x of a pixel belongs to class ci, Σci⁻¹ is the inverse of the variance-covariance matrix of class ci, |Σci| is its determinant, (x - μci) is the deviation of x from the mean vector μci of class ci, and n is the number of bands.
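A minimal sketch of such a supervised maximum likelihood classification (a generic implementation under the usual multivariate Gaussian assumption, not the patent's own code; class statistics are estimated from the training ROIs):

```python
import numpy as np

def fit_class_statistics(training_spectra):
    """training_spectra: dict of class name -> (n_samples, n_bands) spectra from the ROIs."""
    stats = {}
    for name, samples in training_spectra.items():
        mean = samples.mean(axis=0)
        cov = np.cov(samples, rowvar=False)
        stats[name] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify_pixel(x, stats):
    """Return the class whose log-likelihood is highest for the pixel spectrum x (n_bands,)."""
    n = x.size
    best_name, best_score = None, -np.inf
    for name, (mean, inv_cov, logdet) in stats.items():
        d = x - mean                          # deviation from the class mean vector
        log_likelihood = -0.5 * (n * np.log(2 * np.pi) + logdet + d @ inv_cov @ d)
        if log_likelihood > best_score:
            best_name, best_score = name, log_likelihood
    return best_name
```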
  • Two types of germinated algal zoospores were cultured in a beaker containing a culture solution, and the grown algal bodies were used.
  • The cultured algal bodies were collected and placed on a slide preparation in an unfixed and unstained state, and the algal bodies on the preparation were observed with the microscope apparatus 10 using transmitted light from a halogen light source.
  • For the spectral data, the slope at characteristic feature points was analyzed, and algae and bacteria were classified and visualized.
  • FIG. 7(a) shows an RGB image of the algal bodies photographed by the microscope apparatus 10, FIG. 7(b) shows a transmission spectrum of the photographed image acquired by the microscope apparatus 10, and FIG. 7(c) shows a spectrum analysis image acquired by the microscope apparatus 10.
  • In FIG. 7(b), the spectral transmittance R(λ) is calculated for each pixel of the spectral image, and the spectral transmittance graph at selected locations is displayed.
  • In FIG. 7(c), the characteristic wavelengths at which the difference between the transmission spectra of algae and bacteria appears are estimated from the spectral transmittance graph of FIG. 7(b), the normalized spectral reflectance index is calculated at the estimated wavelengths using the following formula, and algae and bacteria are classified by thresholding the index.
  • NDSI = (R(1) - R(2)) / (R(1) + R(2))
  • NDSI: normalized spectral reflectance index; R(1), R(2): spectral transmittances at the characteristic wavelengths
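A minimal sketch of this NDSI calculation and threshold classification (the band indices and the threshold value below are illustrative assumptions, not values from the example):

```python
import numpy as np

def ndsi_map(transmittance_cube, band1, band2):
    """transmittance_cube: (Y, Z, wavelength); band1, band2: indices of the characteristic
    wavelengths estimated from the transmittance graphs."""
    r1 = transmittance_cube[:, :, band1]
    r2 = transmittance_cube[:, :, band2]
    return (r1 - r2) / (r1 + r2)

# Classification by threshold (illustrative values):
# mask = ndsi_map(cube, band1=40, band2=75) > 0.1   # True -> one class, False -> the other
```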
  • As described above, in the microscope apparatus 10, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are provided after the objective lens 22 in the imaging unit 14, so that one-dimensional spatial information extending in the Z-axis direction and one-dimensional wavelength information extending in the Y-axis direction are acquired.
  • Further, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are disposed on the precision linear motion stage 24, which is movably arranged in the Y-axis direction, so that the photographing position is changed in the Y-axis direction.
  • Consequently, in the microscope apparatus 10, the entire predetermined photographing region, including the image 210 on the primary image plane on which the image of the object 200 is formed, can be photographed without moving the object 200.
  • Therefore, with the microscope apparatus 10 of the present invention, it is not necessary to perform processing such as moving the object 200 when observing the object 200.
  • Accordingly, even when an object that is sensitive to physical stimulation (for example, vibration) is observed, the microscope apparatus 10 can observe the object 200 without applying physical stimulation.
  • Further, the microscope apparatus 10 performs calibration processing and the like, obtains the spectral radiance, spectral luminance, and spectral reflectance or spectral transmittance at each pixel of the spectral image for each wavelength acquired by photographing the object 200, and displays an image analyzed by a predetermined analysis method on a display unit (not shown).
  • Since an analysis image analyzed using the spectral radiance and the like in the spectral image can be acquired in this way, unstained cells can be observed in addition to stained cells.
  • With the microscope apparatus 10, for example, it becomes possible to observe live cells without staining them.
  • the object 200 can be observed over time without causing state deterioration or property change.
  • In the embodiment described above, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are disposed on the precision linear motion stage 24, which is movable in the Y-axis direction, but the arrangement is not limited to this.
  • For example, the components after the slit plate 26 may be moved together in the Y-axis direction by any mechanism capable of controlled movement along a straight line, for example an actuator of fluid (hydraulic or pneumatic), electromagnetic, ultrasonic, or piezo type.
  • the imaging unit 14 may have the following three configurations.
  • FIG. 8 shows a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • The imaging unit 104 shown in FIG. 8 includes: an objective lens 112 that receives the light beam from the image 210; a precision linear motion stage 114 on which the objective lens 112 is mounted and which moves in the Y-axis direction; a slit plate 26 disposed so that a slit opening 26a extending in the Z-axis direction is positioned on the image plane of the objective lens 112; a collimating lens 28 that collimates the light beam that has passed through the slit opening 26a; a dispersion optical element 30 that disperses the parallel light from the collimating lens 28 in the Y-axis direction; an imaging lens 32 that forms an image of the light beam emitted from the dispersion optical element 30; and a two-dimensional imaging detector 34 arranged so that its detection unit 34a is positioned on the image plane of the imaging lens 32.
  • the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are fixedly disposed.
  • The movement control unit 42 sequentially moves the precision linear motion stage 114 in the Y-axis direction, at a predetermined interval corresponding to the width of the slit opening 26a in the Y-axis direction, within the predetermined photographing region extending from the one end 210c of the image 210 in the Y-axis direction to the other end 210d.
  • That is, when the operator gives an instruction to start photographing, the movement control unit 42 moves the precision linear motion stage 114 so that the photographing position of the two-dimensional imaging detector 34 becomes the photographing start position, and outputs information indicating that the precision linear motion stage 114 has been moved to the imaging control unit 40.
  • Further, the movement control unit 42 moves the precision linear motion stage 114 at the predetermined interval corresponding to the width of the slit opening 26a in the Y-axis direction, and outputs information indicating that the precision linear motion stage 114 has been moved to the imaging control unit 40.
  • When the movement control unit 42 has moved the precision linear motion stage 114 until the photographing position of the two-dimensional imaging detector 34 reaches the photographing end position, it outputs information indicating that it has moved to the photographing end position to the imaging control unit 40.
  • In this manner, the movement control unit 42 moves the precision linear motion stage 114 to move the photographing position, the imaging control unit 40 performs photographing, and the spectral data creation unit 44 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal from the two-dimensional imaging detector 34.
  • FIG. 9 shows a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • The imaging unit 124 shown in FIG. 9 includes: an F-θ lens 132 that receives the light beam from the image 210; a galvano mirror 134 provided behind the F-θ lens 132; an imaging lens 136 that forms an image of the light beam reflected by the galvano mirror 134; a slit plate 26 disposed so that a slit opening 26a extending in the Z-axis direction is positioned on the image plane of the imaging lens 136; a collimating lens 28 that converts the light beam that has passed through the slit opening 26a into parallel light; a dispersion optical element 30 that disperses the parallel light from the collimating lens 28 in the X-axis direction; an imaging lens 32 that forms an image of the light beam emitted from the dispersion optical element 30; and a two-dimensional imaging detector 34 arranged so that its detection unit 34a is positioned on the image plane of the imaging lens 32.
  • The F-θ lens 132, the imaging lens 136, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are fixedly disposed.
  • The galvano mirror 134 reflects, on its reflecting surface, the light substantially collimated by the F-θ lens 132 so that it enters the imaging lens 136, and is arranged so that the reflecting surface rotates about the Z axis.
  • the movement control unit 42 controls the rotation angle and the rotation direction of the reflecting surface. Note that such a rotation angle and a rotation direction are set before photographing processing.
  • Hereinafter, "rotating the galvano mirror 134" means "rotating the reflecting surface of the galvano mirror 134".
  • the galvanometer mirror 134 rotates the reflection surface to change the reflection angle of the light collimated by the F- ⁇ lens 132.
  • The movement control unit 42 sequentially rotates the galvano mirror 134 so that the photographing position moves in the Y-axis direction within the predetermined photographing region extending from the one end 210c of the image 210 in the Y-axis direction to the other end 210d.
  • the movement control unit 42 rotates the galvano mirror 134 so that the shooting position of the two-dimensional imaging detector 34 becomes the shooting start position. Information about the rotation is output to the imaging control unit 40.
  • the movement control unit 42 rotates the galvano mirror 134 at a predetermined rotation angle corresponding to the width of the slit opening 26a in the Y-axis direction.
  • the information indicating that the galvano mirror 134 is rotated is output to the imaging control unit 40.
  • When the photographing position of the two-dimensional imaging detector 34 has reached the photographing end position, the movement control unit 42 outputs information indicating that it has moved to the photographing end position to the imaging control unit 40.
  • In this manner, the imaging control unit 40 performs photographing, and the spectral data creation unit 44 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal from the two-dimensional imaging detector 34.
  • FIG. 10 shows a schematic configuration explanatory diagram of a modified example of the imaging unit in the microscope apparatus according to the present invention.
  • The imaging unit 144 shown in FIG. 10 includes: a pair of achromatic prisms 146 disposed in front of the objective lens 22, on which the light beam from the image 210 is incident; the objective lens 22; a slit plate 26 disposed so that a slit opening 26a extending in the Z-axis direction is positioned on the image plane of the objective lens 22; a collimating lens 28 that converts the light beam that has passed through the slit opening 26a into parallel light; a dispersion optical element 30 that disperses the parallel light from the collimating lens 28 in the Y-axis direction; an imaging lens 32 that forms an image of the light beam emitted from the dispersion optical element 30; and a two-dimensional imaging detector 34 arranged so that its detection unit 34a is positioned on the image plane of the imaging lens 32.
  • the objective lens 22, the slit plate 26, the collimator lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are fixedly disposed.
  • the pair of achromatic prisms 146 is constituted by achromatic prisms 146-1 and 146-2, which are arranged side by side in the X-axis direction with their apex angles oriented in the same direction.
  • the state of the pair of achromatic prisms shown in FIGS. 11A and 11B is referred to as “initial state”.
  • under the control of the movement control unit 42, the achromatic prisms 146-1 and 146-2 are each rotated from the initial state by the same angle in opposite directions.
  • when the achromatic prisms 146-1 and 146-2 have been rotated in opposite directions so far that their apex angle directions are reversed relative to each other (see FIGS. 11C and 11D), the pair functions as a plane-parallel plate.
  • FIG. 11C shows the state in which the achromatic prism 146-1 of FIG. 11A has been rotated by 90° in the direction of arrow I and the achromatic prism 146-2 has been rotated by 90° in the direction of arrow II.
  • the achromatic prisms 146-1 and 146-2 are rotated in opposite directions about their common central axis O: the achromatic prism 146-1 rotates in the direction of arrow I, and the achromatic prism 146-2 rotates in the direction of arrow II (see FIG. 13A).
  • the achromatic prisms 146-1 and 146-2 can be rotated in the range of 0 ° to 180 °.
  • with this configuration, the pair of achromatic prisms 146 can change the position at which the light beam strikes the two-dimensional imaging detector 34 disposed downstream of the pair, and as a result the imaging position is moved in the Y-axis direction (a small numerical sketch of this relationship is given below).
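As a rough illustration of why counter-rotating the two wedges shifts the beam along only one axis, the thin-prism (small-angle) approximation can be used: each wedge deviates the beam by a fixed angle δ toward its apex, so rotating the wedges by +φ and −φ makes the transverse components cancel while the remaining component varies as 2δ·cos φ, reaching zero at φ = 90°, where the pair behaves as a plane-parallel plate as noted above. This is a generic model under stated assumptions, not a calculation from the patent, and the numerical values are hypothetical.

```python
import math

def pair_deviation(delta_deg, phi_deg):
    """Net deviation of a counter-rotated wedge pair in the thin-prism model.

    delta_deg : assumed deviation of a single wedge (small-angle model)
    phi_deg   : rotation of each wedge away from the initial state,
                applied as +phi to one wedge and -phi to the other
    Returns (axial, transverse) deviation components in degrees.
    """
    d, p = math.radians(delta_deg), math.radians(phi_deg)
    axial = d * math.cos(p) + d * math.cos(-p)        # adds up to 2*d*cos(p)
    transverse = d * math.sin(p) + d * math.sin(-p)   # cancels to zero
    return math.degrees(axial), math.degrees(transverse)

if __name__ == "__main__":
    for phi in (0, 30, 60, 90):                       # hypothetical angles
        print(f"phi = {phi:3d} deg -> deviation = {pair_deviation(2.0, phi)}")
```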
  • the movement control unit 42 sequentially rotates the achromatic prisms 146-1 and 146-2 about the X axis by respective predetermined rotation angles corresponding to the width of the slit opening 26a in the Y-axis direction, so that the imaging position moves in the Y-axis direction within a predetermined imaging region extending from one end 210c to the other end 210d of the image 210 in the Y-axis direction.
  • when the operator gives an instruction to start imaging, the movement control unit 42 rotates the achromatic prisms 146-1 and 146-2 in opposite directions so that the imaging position of the two-dimensional imaging detector 34 is at the imaging start position, and outputs information indicating that the achromatic prisms 146-1 and 146-2 have been rotated to the imaging control unit 40.
  • the movement control unit 42 then rotates the achromatic prisms 146-1 and 146-2 by the predetermined rotation angles corresponding to the width of the slit opening 26a in the Y-axis direction, and outputs information indicating that the achromatic prisms 146-1 and 146-2 have been rotated to the imaging control unit 40.
  • when the two-dimensional imaging detector 34 has moved to the imaging end position, the movement control unit 42 outputs information indicating this to the imaging control unit 40.
  • while the movement control unit 42 rotates the achromatic prisms 146-1 and 146-2 to move the imaging position, imaging is performed by the imaging control unit 40, and based on the electrical signal from the two-dimensional imaging detector 34 the spectral data creating unit 44 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information.
  • in the embodiment described above, the calibration process is performed after the spectral image for each wavelength, acquired by photographing the object 200, has been created; however, the present invention is not limited to this, and the dark noise correction process and the light source correction process in the calibration process may instead be performed, for example, in response to an instruction from the operator (a generic sketch of such corrections follows below).
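One common way to realize a dark-noise correction and a light-source correction of the kind mentioned here is to subtract a dark frame and normalize by a reference frame, pixel by pixel and band by band. The sketch below shows that generic scheme as a minimal illustration; it is not asserted to be the patent's specific calibration procedure, and the array names are placeholders.

```python
import numpy as np

def calibrate(raw, dark, reference, eps=1e-12):
    """Generic dark-noise and light-source correction for spectral data.

    raw, dark, reference : arrays of identical shape (pixels, wavelengths).
    `dark` is assumed to be captured with the light path blocked and
    `reference` with the light source alone (e.g. a white standard), so the
    result can be read as relative reflectance or transmittance.
    """
    raw = np.asarray(raw, dtype=float)
    dark = np.asarray(dark, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return (raw - dark) / np.maximum(reference - dark, eps)
```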
  • in the embodiment described above, the microscope apparatus 10 is configured by the imaging optical unit 12, the light source unit 18, and the imaging unit 14; however, it goes without saying that the present invention is not limited to this.
  • the microscope apparatus according to the present invention may also be configured by arranging the imaging unit on a microscope unit, specifically a conventionally known optical microscope, having an imaging optical unit for forming an image of an object on a primary image plane and a light source for irradiating the object with light.
  • in the microscope apparatus 10 described above, the precision linear motion stage 24 provided in the imaging unit 14 moves the components downstream of the objective lens 22 in the Y-axis direction so that the imaging position moves in the Y-axis direction; however, the present invention is not limited to this. The objective lens 22, the slit plate 26, the collimator lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 may instead be fixedly arranged, and the imaging position may be moved in the Y-axis direction by changing the positional relationship between the imaging unit and the object 200 in the Y-axis direction.
  • as one technique for moving the imaging position in the Y-axis direction, the imaging unit 14, in which the objective lens 22, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 are fixedly arranged, may itself be moved in the Y-axis direction with respect to the fixed object 200 by a moving means that is provided separately from the imaging unit 14 and is controlled by a control unit different from the control unit 16 (see FIG. 12A). In this case, the control unit that controls the moving means and the control unit 16 of the imaging unit 14 are connected, and the timing of movement and the timing of imaging are coordinated between them.
  • alternatively, with the objective lens 22, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34 fixedly arranged in the imaging unit 14, the object 200 may be placed on a stage movable in the Y-axis direction and moved in the Y-axis direction by that stage, under the control of its own control unit, with respect to the fixed imaging unit 14 (see FIG. 12B).
  • in this case as well, the control unit that controls the moving means and the control unit 16 of the imaging unit 14 are connected, and the timing of movement and the timing of imaging are coordinated between them (a simple hand-shake loop of this kind is sketched below).
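Because the moving means and the imaging unit 14 are governed by separate control units in these variations, movement and exposure must be interleaved so that no frame is captured while the scan axis is still settling. The minimal loop below illustrates one way to do that; the `stage` and `camera` objects, their method names, and the move-then-capture protocol are assumptions for illustration, not details taken from the patent.

```python
def scan_with_external_stage(stage, camera, n_steps, step_um):
    """Hypothetical coordination between a stage controller and the imaging
    control: advance one slit width in Y, wait for the stage to settle, then
    trigger a single exposure, and repeat over the imaging region.
    """
    frames = []
    for i in range(n_steps):
        stage.move_to(i * step_um)       # command from the stage's control unit
        stage.wait_until_settled()       # hand-shake: do not expose while moving
        frames.append(camera.capture())  # exposure triggered by the imaging control
    return frames
```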
  • in the embodiment described above, the wavelength region, wavelength resolution, and number of spatial pixels are changed by exchanging hardware components such as the objective lens 22, the slit plate 26, the collimating lens 28, the dispersion optical element 30, the imaging lens 32, and the two-dimensional imaging detector 34; however, the present invention is not limited to this. By reading an externally defined file or by changing parameters, the operator may specify not only the wavelength region, wavelength resolution, and number of spatial pixels, but also the imaging speed, exposure time, and gain (an example of such a settings file is sketched below).
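An externally defined file of the kind mentioned here could, for example, be a small key–value file read at start-up. The file name, format, and parameter names below are purely illustrative assumptions; the patent does not prescribe any particular format.

```python
import json

# Hypothetical contents of a settings file such as "acquisition.json":
# {
#   "wavelength_range_nm": [400, 800],
#   "wavelength_resolution_nm": 5,
#   "spatial_pixels": 2048,
#   "imaging_speed_fps": 30,
#   "exposure_time_ms": 10,
#   "gain_db": 6
# }

def load_acquisition_settings(path="acquisition.json"):
    """Read operator-specified acquisition parameters from an external file."""
    with open(path, "r", encoding="utf-8") as f:
        settings = json.load(f)
    # The values would then be applied to the imaging and movement control
    # (hardware-specific details omitted in this sketch).
    return settings
```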
  • the present invention is suitable for use as a microscope apparatus for observing cells and tissues.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The invention concerns a microscope device capable of classifying subjects with high accuracy and of performing contamination-free analysis. The device according to the invention comprises: a light source for illuminating a subject with light; imaging means for forming an image of the subject on a primary image plane; image-capturing means provided with a detector which, during image capture, receives diffuse light from the image and acquires a signal based on the incident light rays, and with moving means for moving the image-capture position in a desired direction; control means for controlling the timing of image capture; control means for controlling the moving means; means for creating, on the basis of the signal, first data having one-dimensional spatial information at the image-capture positions and high-resolution wavelength information, and for creating, from the first data at the image-capture positions, second data having two-dimensional spatial information and one-dimensional wavelength information; means for creating a spectral image for each wavelength from the second data; means for acquiring the spectral radiance, spectral luminance, and spectral reflectance or spectral transmittance at each pixel of a spectral image; and means for carrying out an analysis process on the spectral image by a prescribed analysis technique.
PCT/JP2015/056217 2014-03-03 2015-03-03 Microscope device and analysis method WO2015133475A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-040620 2014-03-03
JP2014040620A JP2015166763A (ja) Microscope device and analysis method

Publications (1)

Publication Number Publication Date
WO2015133475A1 true WO2015133475A1 (fr) 2015-09-11

Family

ID=54055281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/056217 WO2015133475A1 (fr) Microscope device and analysis method

Country Status (2)

Country Link
JP (1) JP2015166763A (fr)
WO (1) WO2015133475A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114270134A (zh) * 2019-08-21 2022-04-01 株式会社V技术 Microscope image measuring device and microscope image measuring method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4001890A4 (fr) * 2019-07-19 2023-08-02 Hitachi High-Tech Corporation Quantitative particle measurement device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001292369A (ja) * 2000-02-04 2001-10-19 Olympus Optical Co Ltd Microscope system
JP2005072967A (ja) * 2003-08-25 2005-03-17 Olympus Corp Microscope image capturing apparatus and microscope image capturing method
US20130229663A1 (en) * 2010-05-10 2013-09-05 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Spatial-domain low-coherence quantitative phase microscopy
JP2014016531A (ja) * 2012-07-10 2014-01-30 Jasco Corp Confocal microscope apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3667397B2 (ja) * 1995-09-04 2005-07-06 日本分光株式会社 Raman spectroscopy apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001292369A (ja) * 2000-02-04 2001-10-19 Olympus Optical Co Ltd Microscope system
JP2005072967A (ja) * 2003-08-25 2005-03-17 Olympus Corp Microscope image capturing apparatus and microscope image capturing method
US20130229663A1 (en) * 2010-05-10 2013-09-05 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Spatial-domain low-coherence quantitative phase microscopy
JP2014016531A (ja) * 2012-07-10 2014-01-30 Jasco Corp Confocal microscope apparatus

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114270134A (zh) * 2019-08-21 2022-04-01 株式会社V技术 Microscope image measuring device and microscope image measuring method

Also Published As

Publication number Publication date
JP2015166763A (ja) 2015-09-24

Similar Documents

Publication Publication Date Title
Gao et al. Optical hyperspectral imaging in microscopy and spectroscopy–a review of data acquisition
US10229310B2 (en) High throughput partial wave spectroscopic microscopy and associated systems and methods
JP2019058681A (ja) System and method for optical detection of skin disease
Klein et al. Quantitative hyperspectral reflectance imaging
US20080144013A1 (en) System and method for co-registered hyperspectral imaging
Bedard et al. Image mapping spectrometry: calibration and characterization
JP2013238576A (ja) Goniospectral imaging measurement method and apparatus therefor
JP2012526269A (ja) Method for identifying a scene from multi-wavelength polarized images
JP6068375B2 (ja) Spectral radiance meter
KR101819602B1 (ko) Imaging apparatus
US8634067B2 (en) Method and apparatus for detecting microscopic objects
US20170146786A1 (en) Microscope
JP2010151801A (ja) Raman imaging apparatus
WO2015133475A1 (fr) Microscope device and analysis method
WO2012090416A1 (fr) Testing device
WO2017141291A1 (fr) Imaging apparatus and image processing method
US20230092749A1 (en) High throughput snapshot spectral encoding device for fluorescence spectral microscopy
JP5217046B2 (ja) Optical characteristic measuring apparatus and optical characteristic measuring method
JP5246798B2 (ja) Biological tissue identification apparatus and method
JP2012189342A (ja) Microspectroscopic measurement apparatus
JP5408527B2 (ja) Method for creating an image for melanoma diagnosis
JP2012098244A (ja) Component distribution analysis method, component distribution analysis apparatus, and program
Hjeij et al. Mid-infrared speckle reduction technique for hyperspectral imaging
Arnold et al. Hyper-spectral video endoscopy system for intra-surgery tissue classification
JP2021001777A (ja) Plant growth state evaluation method and evaluation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15758569

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15758569

Country of ref document: EP

Kind code of ref document: A1