CN112639582A - Hyperspectral apparatus and method - Google Patents

Hyperspectral apparatus and method

Info

Publication number
CN112639582A
CN112639582A
Authority
CN
China
Prior art keywords
sample
array
beamlets
beamlet
structured illumination
Prior art date
Legal status
Granted
Application number
CN201980056673.XA
Other languages
Chinese (zh)
Other versions
CN112639582B
Inventor
Steven James Frisken
Current Assignee
Cylite Pty Ltd
Original Assignee
Cylite Pty Ltd
Priority date
Filing date
Publication date
Application filed by Cylite Pty Ltd
Publication of CN112639582A
Application granted granted Critical
Publication of CN112639582B
Legal status: Active


Classifications

    • A61B3/1025 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for confocal scanning
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/0025 Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A61B3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/14 Arrangements specially adapted for eye photography
    • A61B5/0071 Measuring for diagnostic purposes using light, by measuring fluorescence emission
    • A61B5/0075 Measuring for diagnostic purposes using light, by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • G01J3/0208 Optical elements not provided otherwise, using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/0229 Optical elements not provided otherwise, using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters
    • G01J3/0289 Field-of-view determination; Aiming or pointing of a spectrometer; Adjusting alignment; Encoding angular position; Size of measurement area; Position tracking
    • G01J3/06 Scanning arrangements; arrangements for order-selection
    • G01J3/10 Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J3/14 Generating the spectrum; Monochromators using refracting elements, e.g. prisms
    • G01J3/2803 Investigating the spectrum using photoelectric array detector
    • G01J3/2823 Imaging spectrometer
    • G01J3/42 Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • G01J3/4406 Fluorescence spectrometry
    • G01J3/4412 Scattering spectrometry
    • G02B27/0955 Lenses (beam shaping using refractive optical elements)
    • A61B2562/046 Arrangements of multiple sensors of the same type in a matrix array
    • A61B2576/02 Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • A61B3/102 Objective types for optical coherence tomography [OCT]
    • A61B5/004 Features or image-related aspects of imaging apparatus, adapted for image acquisition of a particular organ or body part
    • A61B5/0068 Confocal scanning
    • A61B5/0082 Measuring for diagnostic purposes using light, adapted for particular medical purposes
    • A61B5/4088 Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • G01J2003/064 Use of other elements for scan, e.g. mirror, fixed grating
    • G01J2003/425 Reflectance
    • G01N2021/4709 Backscatter
    • G01N21/55 Specular reflectivity
    • G01N21/64 Fluorescence; Phosphorescence

Abstract

The invention relates to a hyperspectral apparatus and method. One aspect of the invention provides an apparatus for analyzing a sample. The apparatus includes a light source configured to generate a broadband input radiation field, and a structured light generator for converting the input radiation field into a structured illumination field comprising an array of beamlets. An optical system projects the structured illumination field onto an area of the sample, such as the retina of an eye. A spectrometer is configured to spectrally analyze a portion of the light reflected, backscattered, or fluoresced from the area of the sample. A processor operatively associated with the spectrometer is configured to generate a hyperspectral image comprising two or more frontal images of the illuminated area, the frontal images including spectral response information of the sample from each beamlet of the structured illumination field.

Description

Hyperspectral apparatus and method
Technical Field
The present invention relates to apparatus and methods for interrogating a sample, and in particular to apparatus and methods for performing hyperspectral imaging and fluorescence spectroscopy of a sample. Although some embodiments will be described herein with specific reference to this application, it should be understood that the invention is not limited to such fields of use, but rather can be applied in a broader context.
RELATED APPLICATIONS
This application claims priority from U.S. provisional patent application 62/727,492, filed on 5 September 2018, the contents of which are incorporated herein by reference.
Background
Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
Hyperspectral imaging of the human eye can be used to extract both spectral and spatial information about the eye. However, current techniques typically involve point-scanning systems, which have long acquisition times and suffer from motion-blur artifacts. Furthermore, current systems are generally not confocal and may therefore capture stray light from different sample locations.
Another limitation of existing systems is cost. Hyperspectral imaging systems require a spatially coherent broadband light source to operate efficiently. Such light sources are often expensive.
Retinal autofluorescence (AF) is a source of contrast that can be used to identify the build-up or concentration of certain molecules in the retina, particularly in the retinal pigment epithelium (RPE). Blue-light-excited AF of the retina is preferably detected confocally, rather than by fundus photography, to prevent large signals from other parts of the eye from blurring the image. Commercially available instruments can provide AF measurements using blue light excitation, which is particularly valuable for lipofuscin contrast, lipofuscin being an important molecule related to the aging and health of the RPE layer and of the photoreceptors.
It is also of interest to identify molecules, such as collagen, in other parts of the eye, such as the sclera and the lens.
Alternatively, information about the nature of the fluorescence can be obtained from the lifetime of the AF signal, but this requires very high processing speeds and permits only point-by-point imaging, which limits the region of the eye that can be scanned in a time comfortable for the patient. An alternative to lifetime measurement is to discriminate on the basis of fluorescence and/or absorption spectra. However, only limited work has been done in this area, and much of it has been in vitro.
The inventors have recognized that it would be advantageous to have an improved apparatus and method for more efficiently and accurately performing hyperspectral imaging and fluorescence spectroscopy of an eye.
Disclosure of Invention
It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
According to a first aspect of the present invention there is provided an apparatus for analysing a sample, the apparatus comprising:
a light source configured to generate an input radiation field having a wavelength band comprised of a plurality of wavelengths;
a structured light generator for converting an input radiation field into a structured illumination field comprising an array of beamlets;
an optical system for projecting the structured illumination field onto an area of the sample, including angle-encoding the beamlets such that each beamlet is projected onto a location of the sample corresponding to the encoded angle;
a spectrometer comprising a two-dimensional sensor array, the spectrometer configured to spectrally analyze a portion of light reflected, backscattered, or fluoresced from a region of a sample; and
a processor operatively associated with the spectrometer, the processor comprising:
a spectral mapping module configured to map locations on the sensor array to two-dimensional locations on the sample based on a predefined mapping function and wavelengths of light within a plurality of predefined wavelength intervals; and
a hyperspectral image generator configured to generate a hyperspectral image from the sensor signals of the sensor array and a predefined mapping function, the hyperspectral image comprising two or more frontal images of an area of the sample, the two or more frontal images comprising spectral response information of the sample from each beamlet of the structured illumination field.
Preferably, the structured light generator is configured to convert the input radiation field into a structured illumination field comprising a two-dimensional array of beamlets.
In some embodiments, the light sources are spatially incoherent.
In some embodiments, the optical system includes a means for translating the structured illumination field relative to the sample. In one embodiment, the means for translating the structured illumination field relative to the sample comprises a tiltable mirror.
In some embodiments, the optical system includes a dispersive element for angularly dispersing each of the beamlets into elongated bands of spectral components of the beamlets corresponding to each of the plurality of wavelengths. The optical system preferably comprises a lens relay, which is arranged, in use, between the dispersive element and the sample, to project the beamlet spectral components onto the sample at a position that depends on both the original beamlet position in the beamlet array and the wavelength of the beamlet spectral components. Preferably, the beamlet spectral components of different beamlets overlap over an area of the sample, such that a plurality of wavelengths is imaged by the spectrometer at each point in the overlapping area over the area of the sample.
The spectral response information preferably includes a reflected power spectrum corresponding to each point on the area of the sample.
In some embodiments, the dispersive element may be mechanically movable so that it can be removed from the optical path. In some embodiments, the dispersive element is interchangeable with a non-dispersive optical guiding element.
In some embodiments, the processor is configured to divide wavelengths corresponding to elongated bands of spectral components of the beamlets into wavelength intervals, wherein wavelengths within a wavelength interval are designated to originate from a common location on the sample and wavelengths of different wavelength intervals are designated to originate from different locations on the sample.
In some embodiments, each of the angularly encoded beamlets projected onto an area of the sample includes each of a plurality of wavelengths.
In some embodiments, the spectral response information includes a fluorescence spectrum corresponding to the response of the sample to each beamlet of the structured illumination field.
In some embodiments, the structured light generator comprises a beamlet generation device having a two-dimensional array of optical power elements, the beamlet generation device being positioned such that, in use, the input radiation field is incident on the array of optical power elements to generate the two-dimensional array of at least partially collimated beamlets. Preferably, the beamlet generation means comprise a first array of microlenses, wherein the optical power elements are microlenses.
In some embodiments, the structured light generator further comprises:
a first optical power arrangement for converging the array of at least partially collimated beamlets to a predefined width at a convergence plane; and
an aperture disposed at the convergence plane, the aperture having a diameter smaller than a diameter of the array of at least partially collimated beamlets at the convergence plane.
In one embodiment, the first optical power device is a high numerical aperture lens. In another embodiment, the first optical power means is a variable index lens.
Preferably, the structured light generator further comprises:
a second optical power device having a predefined focal length and disposed at a distance from the convergence plane equal to the predefined focal length, the second optical power device generating a near-collimated beam comprising a set of at least partially overlapping beamlets; and
a second microlens array having a two-dimensional array of microlenses and positioned to receive the near-collimated beam and generate a two-dimensional array of beamlets for illuminating the sample.
In some embodiments, the apparatus includes an aperture array positioned to confocally project light reflected, backscattered, or fluoresced from a region of the sample into the spectrometer, the aperture array having a pitch corresponding to the beamlet array.
In some embodiments, the light source is configured to generate an input radiation field having a plurality of wavelength bands, each wavelength band consisting of a corresponding plurality of wavelengths. Preferably, the first wavelength band is selected from the ultraviolet A, violet, blue or green region of the electromagnetic spectrum. Preferably, the second wavelength band is in the near infrared region of the electromagnetic spectrum.
In some embodiments, the optical system includes a multiplexing element for multiplexing a plurality of wavelength bands together. Preferably, the multiplexing element is a volume phase grating. The volume phase grating is preferably configured to spatially disperse a plurality of wavelengths from one wavelength band while maintaining a plurality of wavelengths of another wavelength band in a spatially limited state.
In some embodiments, the apparatus includes a reference arm and a power distribution element configured to direct a portion of the optical power of the structured illumination field along the reference arm and a remaining portion of the optical power of the structured illumination field toward the sample.
In certain embodiments, the length of the reference arm is selectively adjustable to select the coherent wavelength to be imaged by the spectrometer.
Preferably, the device is configured for analyzing a sample comprising an eye.
According to a second aspect of the invention, there is provided a system for generating a structured illumination field, the system comprising:
a spatially incoherent light source configured to generate an input radiation field having a predefined spectral output;
a beamlet generation device having a two-dimensional array of optical power elements, the beamlet generation device being positioned such that the input radiation field is incident on the array of optical power elements to generate a two-dimensional array of at least partially collimated beamlets;
a first optical power arrangement for converging the array of at least partially collimated beamlets to a predefined width at a convergence plane;
a second optical power device having a predefined focal length and disposed at a distance equal to the focal length from the convergence plane, the second optical power device generating a near-collimated beam comprising a set of at least partially overlapping beamlets; and
a microlens array having an array of microlenses and positioned to receive the near-collimated light beam and generate a structured illumination field comprising an array of beamlets.
The microlens array preferably comprises a two-dimensional array of microlenses, such that the structured illumination field comprises a two-dimensional array of beamlets.
In some embodiments, the system includes an aperture disposed at the convergence plane, the aperture having a diameter smaller than a predefined width of the array of at least partially collimated beamlets at the convergence plane.
In one embodiment, the first optical power device is a high numerical aperture lens. In another embodiment, the first optical power means is a variable index lens.
Preferably, the beamlet generation means comprise a microlens array, wherein the optical power element is a microlens.
According to a third aspect of the present invention there is provided a method of analysing a sample, the method comprising the steps of:
generating a structured illumination field comprising an array of beamlets from an input radiation field having a wavelength band comprised of a plurality of wavelengths;
projecting a structured illumination field onto an area of a sample, including angle-encoding beamlets such that each beamlet is projected onto a location of the sample corresponding to an encoding angle; and
spectrally analyzing a portion of light reflected, backscattered, or fluoresced from a region of a sample using a two-dimensional sensor array, the spectral analysis comprising:
mapping locations on the sensor array to two-dimensional locations on the sample based on a predefined mapping function and wavelengths of light within a plurality of predefined wavelength intervals; and
a hyperspectral image is generated from the sensor signals of the sensor array and the predefined mapping function, the hyperspectral image comprising two or more frontal images of an area of the sample, the two or more frontal images comprising spectral response information of the sample from each beamlet of the structured illumination field.
Preferably, the structured illumination field comprises a two-dimensional array of beamlets.
According to a fourth aspect of the present invention there is provided an article of manufacture comprising a computer usable medium having computer readable program code configured to operate the apparatus of the first aspect.
According to a fifth aspect of the invention, there is provided an article of manufacture comprising a computer usable medium having computer readable program code configured to implement the method of the third aspect.
Drawings
Preferred embodiments of the present disclosure will now be described, by way of example only, with reference to the accompanying drawings, in which:
FIG. 1A is a schematic system-level diagram of an apparatus for analyzing a sample in the form of a human eye according to a first embodiment of the invention;
FIG. 1B shows an alternative structured illumination field produced by the apparatus of FIG. 1A;
FIG. 2 is a schematic side view of an instrument incorporating the apparatus of FIG. 1A;
FIGS. 3 and 4 show plan and side views, respectively, of a square array of 16 microlenses;
FIG. 5 is a schematic view of a grid of beamlets dispersed in a beamlet strip extending along an axis of the grid;
FIG. 6 is an example map of a two-dimensional grid of beamlets dispersed over a unique set of pixels of a sensor array;
FIG. 7 is a schematic system-level diagram of an apparatus for analyzing the human eye according to a second embodiment of the present invention; and
FIG. 8 is a schematic system-level diagram of an apparatus for analyzing the human eye according to a third embodiment of the present invention.
Detailed Description
Overview of the System
The invention will be described with reference to applications for studying the physiology and morphology of a sample, such as the human eye, in vivo. However, it should be understood that the present invention may be applied to a wider range of non-ocular applications, such as skin examination, food analysis and high resolution microscopy.
A first embodiment of the invention is schematically illustrated in fig. 1A. Referring to the figure, an apparatus 100 for analyzing a sample in the form of a human eye 102 is shown. The apparatus 100 includes a light source 104 configured to generate an input radiation field 106 having a first wavelength band comprised of a plurality of wavelengths. In the arrangement shown, the light source 104 comprises a spectrally broadband source in the form of a Light Emitting Diode (LED) with low or no spatial coherence. In one embodiment, light source 104 includes an array of 16 substantially square emission areas arranged in a 4 x 4 configuration. However, in other embodiments, the light source 104 may include an LED or an array of LEDs having a single emission area. In alternative embodiments, the light source 104 may include one or more spatially highly coherent sources, such as superluminescent diodes or fiber-based supercontinuum sources.
In a preferred embodiment, the light source 104 emits visible light such that the first wavelength band is in the visible light region. In general, however, light in the Near Infrared (NIR), visible, or UV spectral regions may be used depending on the particular application. Further, the light source 104 may have a spectral signature that includes a plurality of spectral peaks. For example, the source may comprise a white light phosphor LED source extending from 400nm to 720nm with intensity peaks at 450nm and 550 nm.
In the context of a light source, spatial coherence can be characterised as follows: a spatially coherent source generates a radiation field in which one continuous region of the field is coherent with another region of the field at any point in time. A spatially incoherent source, by contrast, may be considered a single light source generating a continuous radiation field in which different regions of the field are incoherent at any point in time. In such a source, light is emitted independently from different points across the surface of the source, so that different points generate light with differing phase relationships. Mathematically, the spatial coherence between two physical points is the cross-correlation of the field at those points over a period of time; an incoherent source therefore produces a radiation field in which different points in the cross-section of the field are uncorrelated (e.g., have a cross-correlation value of less than 0.1). Examples of spatially incoherent light sources include incandescent bulbs, LEDs, LED-pumped phosphor arrays, blackbody radiators, and plasma light sources.
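For reference, the degree of spatial coherence between two points of the field is commonly expressed as a normalised cross-correlation (the complex degree of coherence). The expression below is the standard textbook form, included only to make the 0.1 threshold above concrete; it is not taken from the patent itself:

```latex
\gamma_{12}(\tau) =
  \frac{\left\langle E(\mathbf{r}_1,\,t)\, E^{*}(\mathbf{r}_2,\,t+\tau) \right\rangle}
       {\sqrt{\left\langle |E(\mathbf{r}_1,\,t)|^{2}\right\rangle
              \left\langle |E(\mathbf{r}_2,\,t)|^{2}\right\rangle}},
  \qquad 0 \le |\gamma_{12}| \le 1 .
```

A spatially incoherent source such as the LED 104 then corresponds to |γ12| of roughly 0.1 or less between distinct points across the cross-section of the field.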
In a preferred embodiment, the structured light generator 108 converts the input radiation field 106 into a structured illumination field 110 comprising a two-dimensional array of beamlets 111, as shown in the inset of FIG. 1A. In an alternative embodiment, the structured light generator 108 converts the input radiation field 106 into a structured illumination field 110 comprising a one-dimensional array of linear beamlets 119, as shown in FIG. 1B. The operation of the structured light generator 108 with a spatially incoherent light source is described in detail below. In alternative embodiments where the light source is spatially coherent, the structured light generator 108 may be simplified to include, for example, only a focusing lens and a microlens array. Examples of spatially coherent light sources include SLEDs, single-mode fibre ASE sources, and supercontinuum white-light or swept-wavelength diode lasers in single-mode fibre.
An optical system 114 is provided for projecting the structured illumination field 110 onto an area of the eye 102, such as an area of the retina 145. Optical system 114 includes an optical device configured to angularly encode beamlets 111 such that each beamlet is projected to a location, e.g., 147, of eye 102 that corresponds to the encoded angle. The components and operation of the optical system 114 are described in detail below.
Finally, the apparatus 100 includes a spectrometer 116 that includes a two-dimensional sensor array 118, such as a CMOS camera, for spectrally analyzing a portion of the light 149 that is reflected, backscattered, or fluoresced from the illuminated area of the eye 102. The spectrometer 116 is operatively associated with a processor 158, as described below, which processor 158 is configured to generate a hyperspectral image comprising two or more frontal images of the illuminated area of the eye 102. The two or more frontal images include spectral response information of the sample from each beamlet 111 of the structured illumination field 110. In the context of the present invention, the spectral response information comprises one or more of: (1) reflected or backscattered color spectral information from the illuminated area of the eye 102 contained in the reflected or backscattered beamlets; or (2) fluorescence spectrum information emitted from the illuminated region of the eye 102.
The term "frontal image" is used in imaging applications such as Optical Coherence Tomography (OCT) to refer to a frontal laterally resolved image of a sample. The frontal image comprises a lateral image of a thin depth layer of the sample, where the data from which the image is generated may be initially captured in two dimensions (e.g., using a laser scanning ophthalmoscope) or three dimensions (e.g., in an OCT C-scan). The frontal image may also be an integrated view of multiple layers or surfaces, as is common in OCT angiography. Front-side imaging is in contrast to other imaging techniques, such as "a-scan" and "B-scan" imaging, which involve imaging one or more depth slices of a sample in one or two dimensions.
Referring to fig. 2, the device 100 is adapted to be incorporated into an instrument 200 for imaging one or both eyes of a person. The instrument 200 may include a main housing 202 incorporating the device 100, a user support 204 for supportively holding a person's head during operation, and a support base 206 for supporting the instrument 200 on a surface. In some embodiments, the housing 202 and the user support 204 are manually or electromechanically translatable relative to one another, as indicated by arrow 208.
Hyperspectral dispersed structured illumination from spatially and spectrally incoherent sources
Referring again to FIG. 1A, a first aspect of the invention is the use of a structured light generator 108 to generate hyperspectral dispersed structured illumination. This allows the use of low cost, spectrally broadband and spatially incoherent sources, such as LEDs, to generate high density structured illumination patterns consisting of, for example, tens, hundreds or thousands of dots or lines.
For example, the light source 104 may be a low-cost LED device comprising a 4 x 4 array of 16 blue LED emitting areas with a pitch of 200 microns, such as provided by the Luxeon Z LED device sold by Lumileds Holding B.V. Such a device is capable of providing short-pulse illumination of over 1 W power at an excitation wavelength centered at 425 nm, which may be further cascaded through dielectric filter multiplexing cubes if desired. The short pulse time reduces motion artifacts during the measurement, and the lower duty cycle allows a relatively high instantaneous peak power with a lower average power. Dielectric multiplexing cubes provide the ability to combine more than one source.
Within structured light generator 108, a two-dimensional array of beamlets 111 may be formed by a combination of optics as now described. Initially, beamlet generation means in the form of a first two-dimensional microlens array 112 is used to form the input radiation field 106 into a plurality of co-propagating beamlets 120. The array 112 comprises a two-dimensional array of optical power elements in the form of substantially similar microlenses having a common focal length. The microlens array is preferably matched to the array of LEDs or emission areas in the source 104, such that each LED or emission area has a corresponding microlens. The microlenses are preferably arranged in a substantially linear array to match the array of LEDs in the light source 104. However, it will be understood that the microlenses may be arranged in other types of arrays to match other arrays of LEDs. For example, microlens array 112 may be formed from an array of crossed cylindrical microlenses of suitable numerical apertures. In an alternative embodiment, microlens array 112 may be replaced with microbeads, such as those available from Nippon Electric Glass, Inc. These beads can be placed on the grid ring and directly in contact with the LED light source.
FIGS. 3 and 4 illustrate the geometry of an example 4 x 4 array 112 of square microlenses 151. The microlenses are disposed at a pitch of 200 microns with respect to each other. The thickness to the apex of each microlens 151 is about 0.450 mm, and the radius of curvature of each microlens is about 0.200 mm. The overall substrate size may vary, e.g. 1.6 mm x 1.6 mm, but a shaded area 113 on the planar side extending to one edge, and a circular 1.2 mm shaded area 115 at the height of the lens vertices on the convex side of the optic, should be taken into account.
As described above, structured light generator 108 includes first microlens array 112. Array 112 is positioned such that input radiation field 106 is incident on the array of microlenses 151 to generate a two-dimensional array of at least partially collimated initial beamlets 120.
Initial beamlets 120 are projected through a first optical power device in the form of converging lens 122 to spatially converge at least partially collimated beamlets 120 to a predefined width at a convergence plane 124. The lens 122 may be, for example, a high numerical aperture lens having a numerical aperture in the range of 0.4 to 1.5, or a variable index lens.
An aperture 126 may be provided at the convergence plane 124 to limit the spatial extent of the LED output, allowing a percentage of the generated light to pass through while absorbing or reflecting the remaining light, preferably for recycling. The diameter of the aperture 126 is preferably smaller than the diameter of the array of at least partially collimated beamlets 120 at the convergence plane. For example, the aperture 126 may be about 250 microns in diameter, passing light within the 0.1 numerical aperture of the 1.25 mm focal length variable index lens 122. Further, assuming the light source 104 is a fixed-size 16-element LED array, the numerical aperture of the light emitted from the aperture 126 will be about 0.4 mm/1.25 mm, or 0.32, in each axis, where the 0.4 mm is set by the extent of the LED array 104 relative to the lens 122.
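The paraxial numerical-aperture figures quoted above follow from the simple ratio NA ≈ (half-extent)/(focal length). The short sketch below reproduces them; the dimensions are those given in the text, while the small-angle approximation itself is an illustrative simplification rather than part of the described apparatus:

```python
# Paraxial (small-angle) numerical aperture estimates, NA ~ half_extent / focal_length.
# Dimensions in millimetres, taken from the text; the approximation is illustrative.

def paraxial_na(half_extent_mm: float, focal_length_mm: float) -> float:
    return half_extent_mm / focal_length_mm

f_lens_122 = 1.25        # focal length of variable index lens 122 (mm)
aperture_diam = 0.250    # diameter of aperture 126 (mm)
led_array_half = 0.4     # half-extent of the 4 x 4, 200-micron-pitch LED array (mm)

na_through_aperture = paraxial_na(aperture_diam / 2, f_lens_122)   # ~0.1
na_from_aperture = paraxial_na(led_array_half, f_lens_122)         # ~0.32 per axis

print(f"NA passed by aperture 126: {na_through_aperture:.2f}")
print(f"NA of light emitted from aperture 126: {na_from_aperture:.2f}")
```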
A second optical power device in the form of an optical lens 128 is disposed downstream of the convergence plane 124 by a distance equal to its focal length. Lens 128 generates a near-collimated beam 129, where the total radiation is a combination of the individual fields of radiation from beamlets 120 that may partially or completely overlap. The numerical aperture of the near-collimated beam 129 may be, for example, 0.125mm/25mm or 0.005. The light field at this point represents the expanded and at least partially overlapping form of beamlets 120 modified by lenses 122 and 128 and aperture 126.
A second microlens array 130 having a broad two-dimensional array of microlenses is positioned to receive the expanded beamlets output from lens 128 and form a two-dimensional array of imaging beamlets 111, which in a preferred embodiment constitutes the structured illumination field 110. For example, the second microlens array 130 may comprise an array of 30 x 30 microlenses to form a grid of 900 imaging beamlets 111. Assuming each microlens of array 130 has a focal length of 1000 microns, the second microlens array 130 takes the set of beamlets in the near-collimated beam 129 and generates a two-dimensional array of demagnified beamlets 111: the 250 micron spot at aperture 126 is demagnified by a factor of 1/25 at the focal plane, creating an array of 10 micron images of the aperture 126 that represents the structured illumination field 110.
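As a cross-check of the quoted beamlet size, the demagnification of the aperture image is simply the ratio of the microlens focal length to the focal length of lens 128. The sketch below treats the 25 mm denominator quoted for the near-collimated beam NA as the focal length of lens 128; this is an inference from the numbers given, not an explicit statement in the text:

```python
# Illustrative demagnification of the aperture image by microlens array 130.
# Assumes the 25 mm denominator quoted for the near-collimated beam NA is the
# focal length of lens 128; the 1000 micron microlens focal length is as stated.

f_lens_128_mm = 25.0          # inferred focal length of lens 128 (mm)
f_microlens_130_mm = 1.0      # focal length of each microlens in array 130 (mm)
aperture_diam_um = 250.0      # diameter of aperture 126 (microns)

magnification = f_microlens_130_mm / f_lens_128_mm          # 1/25
image_size_um = aperture_diam_um * magnification            # ~10 microns
na_near_collimated = 0.125 / f_lens_128_mm                  # ~0.005, as quoted

print(f"Demagnification: 1/{1/magnification:.0f}")
print(f"Aperture image size at focal plane 131: {image_size_um:.0f} microns")
print(f"NA of near-collimated beam 129: {na_near_collimated:.3f}")
```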
The second microlens array 130 focuses the beamlets to a first focal plane 131, producing the structured illumination field 110 at the first focal plane 131.
Depending on the wavelength, the Rayleigh diffraction limit of the microlenses used in array 130 is smaller than, but comparable to, the aperture image size, so the aperture image will no longer be sharply resolved; the convolution of the 10 micron aperture image with the Rayleigh-limited spot gives a beamlet diameter of slightly more than 10 microns in the structured illumination field 110. The structured illumination field 110 can be used for various imaging applications, such as those described below.
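A rough sense of the "slightly more than 10 microns" figure can be obtained by combining the geometric aperture image and the Rayleigh-limited spot of a microlens in quadrature. The wavelength and the microlens clear aperture below are assumed values chosen only for illustration; neither is specified in the text:

```python
import math

# Illustrative beamlet-size estimate: geometric 10 micron aperture image broadened
# by the Rayleigh diffraction limit of a microlens in array 130.
# ASSUMED values (not stated in the text): wavelength and microlens clear aperture.

wavelength_um = 0.45            # assumed blue illumination wavelength (microns)
f_microlens_um = 1000.0         # microlens focal length, as stated (microns)
microlens_aperture_um = 150.0   # ASSUMED clear aperture of a microlens (microns)
aperture_image_um = 10.0        # geometric image of aperture 126, as stated (microns)

# Rayleigh spot diameter (first-zero diameter of the Airy pattern)
rayleigh_diam_um = 2.44 * wavelength_um * f_microlens_um / microlens_aperture_um

# Quadrature sum as a crude stand-in for the convolution of the two profiles
beamlet_diam_um = math.hypot(aperture_image_um, rayleigh_diam_um)

print(f"Rayleigh-limited spot: {rayleigh_diam_um:.1f} microns")
print(f"Estimated beamlet diameter: {beamlet_diam_um:.1f} microns")  # a little over 10 um
```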
Hyperspectral dispersed illumination of a sample to extract color spectral information
A second aspect of the invention relates to hyperspectral structured confocal imaging of a sample to extract colour spectral information. This involves simultaneously measuring multiple wavelengths at multiple points on the sample to extract color or color spectrum information from across the sample location.
Extending from the above description, and still referring to the embodiment shown in FIG. 1A, the structured illumination field 110 is projected onto the eye 102 through an optical system 114. The system 114 includes an optical power distribution element in the form of a beam splitter 132. The beam splitter 132 is preferably formed by a partially reflective mirror 134 angled at about 45 degrees to the plane of the structured illumination field 110. The partially reflective mirror transmits a portion of the optical power and reflects the remaining optical power, which is coupled out of the optical path and attenuated (not shown). The reflective characteristics of the mirror are selected to be relatively uniform across the first wavelength band, and the reflectivity is selected so that a suitable level of power is incident on the eye 102 for comfort and safety. It will be appreciated that in other embodiments the beam splitter 132 may comprise a polarizing beam splitter to selectively transmit only certain polarization components.
In the case where the light source 104 is sufficiently bright, the beam splitter 132 may transmit approximately 20% of the incident light power, thereby reducing the power delivered to the eye 102 to within the maximum permissible exposure limits for the LED output.
A portion of the light transmitted through the beam splitter 132 is projected onto a first collimating lens 136, which is preferably positioned at a distance equal to its focal length from the first focal plane 131. The lens 136 focuses the structured illumination field 110 onto a two-dimensional steering element 138 in the form of a micro-electromechanical mirror system (MEMS), located at the opposite focal plane of the lens 136. Preferably, each beamlet 111 in the structured illumination field 110 is collimated to fill the MEMS 138, which may have a diameter of about 5 mm. The MEMS 138 can include an electrically controllable mirror that can be selectively rotated in two dimensions by a connected controller (not shown) to translate the structured illumination field 110 across different areas of the eye 102.
The reflected and steered light from the MEMS element 138 is now angularly dispersed by a dispersive element 140 in the form of a transmissive diffraction grating. In some embodiments, the grating 140 has a line density of 300 lines/mm and is configured to provide efficient coupling near the Littrow angle over the first wavelength band. The grating 140 angularly disperses each beamlet 111 into an elongated band of spectral components corresponding to the plurality of wavelengths emitted by the light source 104. Thus, at this point each beamlet propagates at a different angle determined by its position in the array of beamlets 111, and each constituent wavelength of a beamlet is angularly dispersed at a unique angle. Described another way, each constituent wavelength component from source 104 has a corresponding beamlet grid, where each grid is shifted along the dispersion axis of the grating 140.
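For orientation, the angular spread produced by the 300 line/mm grating 140 can be estimated from the standard grating equation; in the Littrow configuration the incidence and diffraction angles are equal. The centre wavelength used below is an assumed example value, not a parameter given in the text:

```python
import math

# Illustrative first-order dispersion of transmissive grating 140 (300 lines/mm).
# Grating equation d*(sin(theta_in) + sin(theta_out)) = m*lambda; at Littrow,
# 2*d*sin(theta) = m*lambda. The centre wavelength is an ASSUMED example.

lines_per_mm = 300.0
d_um = 1000.0 / lines_per_mm           # groove spacing, ~3.33 microns
m = 1                                   # diffraction order
centre_wavelength_um = 0.55             # assumed centre of the first wavelength band

theta_littrow = math.asin(m * centre_wavelength_um / (2.0 * d_um))

# Angular dispersion d(theta)/d(lambda) near Littrow incidence
dispersion = m / (d_um * math.cos(theta_littrow))   # rad per micron of wavelength

print(f"Littrow angle: {math.degrees(theta_littrow):.2f} degrees")
print(f"Angular dispersion: {dispersion:.3f} mrad/nm")  # rad/um is numerically mrad/nm
```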
It should be understood that in alternative embodiments, diffraction grating 140 may be replaced with alternative dispersive elements, such as a reflective grating or a prism.
The dispersed beamlets then pass through the 8-f lens relay of lenses 142 and 144 and are projected onto the eye 102. Lenses 142 and 144 represent a retinal lens arrangement that directs the structured illumination pattern toward the eye 102. In this system the eye 102 performs an angle-to-offset conversion to define a grid of beamlets on the retina 145: the optical power elements of the eye 102 act to focus the collimated beamlets onto the retina 145 at locations 147 corresponding to their propagation angles. This angle, and hence the position on the retina, is uniquely determined by the position of the original beamlet and the wavelength dispersion angle (i.e. the color of the spectral component). That is, the location 147 at which each beamlet and wavelength component impinges on the eye depends on both the original beamlet position in the array of beamlets 111 and the wavelength of the beamlet spectral component.
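The angle-to-position conversion performed by the eye can be summarised as a simple mapping: the retinal coordinate of a spot is the focal length of the eye times the tangent of the beamlet's propagation angle, which is the sum of the grid-encoded angle and the wavelength-dependent dispersion angle. The sketch below is a minimal paraxial model; the eye focal length, grid angular pitch and dispersion rate are assumed scale factors for illustration, not calibrated values of the apparatus:

```python
import math

# Minimal paraxial model of the angle-encoded beamlet-to-retina mapping.
# ASSUMED parameters (illustration only): eye focal length, angular pitch of the
# beamlet grid, and linear dispersion rate of grating 140.

EYE_FOCAL_MM = 17.0           # nominal reduced-eye focal length (assumed)
GRID_PITCH_MRAD = 3.0         # angular separation between adjacent beamlets (assumed)
DISPERSION_MRAD_PER_NM = 0.3  # grating dispersion (assumed, cf. the estimate above)
CENTRE_WAVELENGTH_NM = 550.0

def retinal_position_mm(row: int, col: int, wavelength_nm: float,
                        dispersion_axis: str = "x") -> tuple[float, float]:
    """Return (x, y) on the retina for beamlet (row, col) at a given wavelength."""
    theta_x = col * GRID_PITCH_MRAD * 1e-3
    theta_y = row * GRID_PITCH_MRAD * 1e-3
    delta = (wavelength_nm - CENTRE_WAVELENGTH_NM) * DISPERSION_MRAD_PER_NM * 1e-3
    if dispersion_axis == "x":
        theta_x += delta
    else:
        theta_y += delta
    return EYE_FOCAL_MM * math.tan(theta_x), EYE_FOCAL_MM * math.tan(theta_y)

# The same beamlet lands at different retinal positions for different colors.
print(retinal_position_mm(0, 5, 500.0))
print(retinal_position_mm(0, 5, 600.0))
```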
The retinal lens arrangements 142, 144 may include electromechanical adjustment along the axial dimension to adjust the focal position in a manner similar to autofocus techniques.
The dispersion angle of the beamlets with respect to the beamlet grid is determined by the orientation of microlens array 130 relative to the dispersion axis of grating 140, and may be chosen to lie along one of the axes of the beamlet grid, as shown in FIG. 5. This projects a series of lines onto the retina, with multiple locations on the retina 145 illuminated by multiple wavelength components (colors) originating from different microlens locations within the microlens array 130, owing to the overlap of the elongated beamlet strips 501. The spectrometer can therefore image multiple wavelengths from each illuminated point in the overlapping region of the sample to obtain color spectral information.
Alternatively, the wavelength components may be dispersed at an angle with respect to the axes of the beamlet grid 110. This results in a more uniform and less intense exposure of the retina, which can help reduce patient discomfort. In both cases, a complete image of the retina at multiple colors is obtained by dithering the MEMS 138 in two axes using a controller. A complete scan over a predetermined angular range on the eye 102 allows each position within a selected region of the retina 145 to be illuminated by all wavelengths within a dithering cycle.
The return path of light 149 reflected or backscattered from the retina of the eye 102 is now described.
The unaccommodated eye acts essentially as a retroreflector, returning at least a portion of the light in each beamlet along the original beamlet trajectory. The angular dispersion introduced on the forward path to the retina 145 is therefore reversed on the return path by lenses 144 and 142 and the diffraction grating 140. The returned light reflects from the mirror 134 of the beam splitter 132, and the structured illumination field 110 (comprising a grid of full-wavelength beamlets) is re-formed at the conjugate focal plane 146. Following the above example in which the beam splitter 132 transmits 20% of the light from the light source 104, 80% of the returning light is reflected from the beam splitter 132 to the plane 146 with little power loss. At plane 146, each spectral component or color in a given beamlet represents the reflectance of the retina 145 for that color at the mapped retinal location 147.
Thus, at plane 146, the returning light represents the incident structured illumination field 110 but with retinal reflections and color spectral information encoded therein.
The returning light then passes through another microlens array 148, where each beamlet is collimated. The collimated beamlets are then passed through an aperture array 150, the aperture array 150 having a pitch corresponding to a two-dimensional array of beamlets. The aperture array 150 is positioned to confocally project light collected from the illuminated area of the sample into the spectrometer 116. In this manner, aperture array 150 spatially filters out peripheral light and unfocused light from the beamlets in a confocal manner. The beamlets are then analyzed by a spectrometer 116.
The spectrometer 116 includes a relay of lenses 152 and 154 arranged in a 4-f confocal configuration. The first lens 152 spatially converges the beamlets to a focus at the common focal plane between lenses 152 and 154. A dispersive element 156 in the form of a wedge pair is located at this common focal plane and is configured to angularly disperse the wavelength components of each beamlet. The lens 154 images the angularly dispersed wavelength components onto the two-dimensional sensor array 118. An example mapping of a two-dimensional grid of beamlets 600 dispersed over a unique set of pixels 602 of sensor array 118 is shown in FIG. 6. In other embodiments, the wedge pair 156 may be replaced with alternative dispersive elements, such as a transmissive or reflective diffraction grating or a prism.
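Conceptually, the spectrometer assigns each beamlet its own strip of pixels: the beamlet's grid position fixes a base pixel, and the wedge pair 156 spreads the beamlet's wavelengths along one sensor axis from that base, producing the layout of FIG. 6. A minimal sketch of such a forward mapping is given below; the strip pitch, spectral pixel count and band limits are hypothetical values used only to illustrate the idea:

```python
# Hypothetical forward map from (beamlet row, beamlet column, wavelength) to a
# sensor pixel, illustrating how each beamlet occupies a unique strip of pixels
# (cf. FIG. 6). The strip pitch, spectral sampling and band limits are assumptions.

BEAMLET_PITCH_PX = 40      # pixel spacing between adjacent beamlet strips (assumed)
SPECTRAL_PIXELS = 32       # pixels allotted to one beamlet's spectrum (assumed)
BAND_START_NM, BAND_STOP_NM = 450.0, 720.0   # assumed band limits

def sensor_pixel(row: int, col: int, wavelength_nm: float) -> tuple[int, int]:
    """Map a beamlet and wavelength to a (pixel_row, pixel_col) on sensor array 118."""
    frac = (wavelength_nm - BAND_START_NM) / (BAND_STOP_NM - BAND_START_NM)
    spectral_offset = round(frac * (SPECTRAL_PIXELS - 1))
    # Dispersion is taken along the column axis; each beamlet keeps its own rows.
    return row * BEAMLET_PITCH_PX, col * BEAMLET_PITCH_PX + spectral_offset

# Two wavelengths from the same beamlet land on distinct pixels of the same strip.
print(sensor_pixel(2, 3, 460.0))
print(sensor_pixel(2, 3, 700.0))
```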
The spectrometer 116 is operatively associated with a processor 158, which processor 158 processes the sensor data to form a frontal image of the eye for each wavelength component (color). To construct the image, the processor 158 includes a spectral mapping module 160, the spectral mapping module 160 configured to map locations on the sensor array 118 to two-dimensional locations on the sample 102 and wavelengths of light (from each beamlet spectral component) within a plurality of predefined wavelength intervals based on a predefined mapping function.
In this manner, the dispersed beamlet spectral components may be used for both position information and spectral information. In these embodiments, spectral mapping module 160 is configured to divide the wavelengths corresponding to the elongated bands of detected spectral components of each dispersed beamlet into wavelength intervals. Wavelengths within a given wavelength interval are designated to originate from a common location on the sample 102 and thus represent the spectral response of that retinal location. Wavelengths in different wavelength intervals are designated to originate from different locations on the sample, e.g., spatially different retinal locations.
For example, the input first wavelength band from source 104 may be spread over a bandwidth of 500nm, which defines the size of each elongated band of the spectral components of the beamlets. By dividing the 500nm range into 100 intervals of 5nm, each beamlet can contribute 100 spatially distinct retinal points, each having a spectral bandwidth of 5 nm. Thus, using an arrangement of only 30 × 30 beamlets, 90,000 different spatial points across the retina can be imaged at multiple wavelengths simultaneously. By adjusting the size of the wavelength interval, the spectral bandwidth and spatial resolution of the image can be balanced.
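The trade-off described above is simple bookkeeping, reproduced below for concreteness using the figures quoted in the text:

```python
# Worked example of the spectral-interval bookkeeping quoted above.
band_span_nm = 500          # spectral span of each elongated beamlet band, as quoted
interval_nm = 5             # chosen wavelength-interval width
beamlet_grid = (30, 30)     # beamlets in the structured illumination field

intervals_per_beamlet = band_span_nm // interval_nm                          # 100
spatial_points = beamlet_grid[0] * beamlet_grid[1] * intervals_per_beamlet   # 90,000

print(f"{intervals_per_beamlet} retinal points per beamlet, "
      f"{spatial_points} points in total, each with {interval_nm} nm bandwidth")
```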
The processor 158 further comprises a hyperspectral image generator 162, the hyperspectral image generator 162 being configured to generate a hyperspectral image from the sensor signals of the sensor array 118 and the predefined mapping function of the module 160. The generated hyperspectral image includes two or more frontal images of the illuminated area of the eye, where each of the frontal images includes spectral response information of the eye from each beamlet 111 of the structured illumination field 110. Thus, the system 100 can image both spatially and spectrally.
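One way to picture the hyperspectral image generator 162 is as an accumulator that uses the mapping function to route every sensor sample into an (x, y, wavelength-bin) cube, from which each frontal image is one wavelength slice. The sketch below is a schematic of that bookkeeping under assumed array sizes, with a placeholder standing in for the predefined mapping function of module 160; it is not the implementation described in the patent:

```python
import numpy as np

# Schematic accumulation of a hyperspectral cube from spectrometer frames.
# `mapping`, the grid size and the bin count are placeholders for the predefined
# mapping function of spectral mapping module 160.

NX, NY, N_BINS = 300, 300, 100      # lateral sample grid and wavelength bins (assumed)

def accumulate(frames, mapping):
    """frames: iterable of 2-D sensor images; mapping(px_row, px_col, frame_idx)
    returns (ix, iy, ibin) for a used pixel, or None otherwise."""
    cube = np.zeros((NX, NY, N_BINS))
    hits = np.zeros((NX, NY, N_BINS))
    for k, frame in enumerate(frames):
        for (r, c), value in np.ndenumerate(frame):
            target = mapping(r, c, k)
            if target is not None:
                ix, iy, ibin = target
                cube[ix, iy, ibin] += value
                hits[ix, iy, ibin] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        cube = np.where(hits > 0, cube / hits, 0.0)   # average repeated samples
    return cube

# A frontal image at wavelength bin k is then simply cube[:, :, k].
```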
It should be understood that the spectral mapping module 160 and the hyperspectral image generator 162 represent functional elements of the processor 158 that may be implemented as algorithms in software, hardware (e.g., CMOS), or a combination of both.
To image different areas across the retina 145, the MEMS 138 is tilted by an electronic dither signal to translate the illuminated area laterally across the retina in two dimensions as described above. Accordingly, a full frontal image of a selected area of the retina 145 can be constructed for each wavelength within the first wavelength band. In this embodiment, spectral response information may be extracted from the retina, including a reflected power spectrum corresponding to each retinal point within the selected region.
Due to the confocal nature of the apparatus 100, only a small range of depths of the sample will be imaged during the two-dimensional angular scan of the MEMS 138. To generate a three-dimensional image of the retina, the structured illumination field may be translated so as to focus at different retinal depths. To this end, the optical system 114 preferably includes means for translating the focal plane of the structured illumination field 110 in an axial direction relative to the retina 145. In some embodiments, the translation component includes a manual or electromechanical adjustment mechanism or actuator for physically adjusting the position of the system 100, embodied in the instrument 200, relative to the eye 102. For example, as shown in fig. 2, the housing 202 containing the device 100 and the user support 204 can be manually or electromechanically translated relative to one another, as indicated by arrow 208. For example, the user support 204 may translate while the housing remains stationary. Alternatively, the housing 202 may be translated while the user support 204 remains stationary. The translation may be effected by a linear actuator disposed on the support base 206. In other embodiments, the optical system 114 may include a retinal lens whose focus can be adjusted, for example by a motorized stage or a variable focal length lens, to select different imaging depths.
This process is typically performed to initially set the desired imaging depth. To subsequently change the imaging depth, the axial position of one or more of the lenses 142 and 144 is mechanically actuated by a linear or other actuator to adjust the focal position of the structured illumination field 110 on the eye 102. For example, the axial position of the lens 144 may be linearly translated relative to the lens 142 to adjust the focal position and thus adjust the imaging depth within the eye 102.
During the acquisition of a series of images, there may be some eye motion as the various frames are acquired, in addition to the expected translation of the illumination across the retina 145 at different MEMS settings. Although each color interval provides only a sparse image of the retina, each of the image frames may be accurately co-registered based on a color-averaged version of the image derived from the full scan. In this process, the color information is used as additional position information to overcome, or at least reduce, the effects of movement of the eye 102 during the image sequence. For each of the registered image frames, the hyperspectral component corresponding to a given pixel location may be acquired. Once the number of frames is high enough, a full high-pixel-count hyperspectral image can be obtained by dithering, i.e., translating the imaged area via the MEMS 138, and oversampling.
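One conventional way to realize this co-registration is to estimate the lateral shift between the color-averaged versions of successive frames, for example by phase correlation, and then apply the inverse shift before accumulating the hyperspectral data. The sketch below is an assumed implementation detail; no particular registration algorithm is prescribed here.

```python
import numpy as np

def colour_average(cube):
    """Average a (n_bands, H, W) frame cube over wavelength to get a registration image."""
    return cube.mean(axis=0)

def estimate_shift(reference, frame):
    """Estimate the integer (dy, dx) shift aligning `frame` to `reference` by phase correlation."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    cross_power /= np.abs(cross_power) + 1e-12     # keep only the phase term
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape                              # wrap peak indices into signed offsets
    return (dy - h if dy > h // 2 else dy), (dx - w if dx > w // 2 else dx)
```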
For many applications, this amount of information is overwhelming and may not be helpful in the visualization of specific retinal pathologies, and thus simplification of the processed data may be applied. For example, a simple true-color image of the retina 145 may be provided by grouping the detected wavelength components into red, green, and blue (RGB) bands. Alternatively, narrow spectral regions can be combined with appropriate sums or differences to provide a single value that can be represented as an intensity or a false color.
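A minimal sketch of the RGB grouping is given below, assuming nominal visible-color band edges; the specific edges are not given in the text and are chosen here purely for illustration.

```python
import numpy as np

def cube_to_rgb(cube, band_centres_nm):
    """Collapse a (n_bands, H, W) cube into an (H, W, 3) true-colour image.
    band_centres_nm gives the centre wavelength of each detected interval."""
    centres = np.asarray(band_centres_nm)
    rgb = np.zeros(cube.shape[1:] + (3,))
    # Nominal red / green / blue ranges (assumed values, for illustration only).
    for channel, (lo, hi) in enumerate([(580, 700), (490, 580), (400, 490)]):
        sel = (centres >= lo) & (centres < hi)
        if sel.any():
            rgb[..., channel] = cube[sel].mean(axis=0)
    return rgb / (rgb.max() + 1e-12)           # normalise for display
```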
The apparatus 100 can be combined with a NIR OCT system, wherein the OCT image can be registered relative to a confocal image obtained by the apparatus 100. A somewhat similar system is described in published PCT patent application WO 2018/000036 A1 entitled "Apparatus and method for confocal microscopy using dispersed structured illumination" by Steven James Frisken, the contents of which are incorporated herein by cross-reference. However, in this embodiment, the confocal images obtained by the apparatus 100 can now be processed separately into different spectral bands to provide a hyperspectral analysis of the retina, which can help identify spectral features. Thus, in contrast to or in addition to the structural information provided by confocal or OCT images of the retina, the present embodiments provide the ability to diagnose certain pathologies that can be identified or characterized by chemical or molecular spectral features of the retina.
In some embodiments, the 8-f retinal lens relay 142, 144 shown in FIG. 1A may be replaced with an anterior lens arrangement configured to project the angularly encoded beamlets from the MEMS 138 onto a plurality of locations on one or more anterior regions of the eye 102 (such as the cornea or the crystalline lens).
Structured white light illumination of samples to extract color and fluorescence spectral information
A third aspect of the invention relates to illumination of a sample with white light and provides for obtaining reflected or backscattered color spectral information as well as fluorescence spectral information from the sample. A device 700 according to a second embodiment is schematically shown in fig. 7. The apparatus 700 includes many elements in common with the apparatus 100, and for simplicity, these common elements are identified with the same reference numerals.
Illumination in the device 700 is provided by the light source 104, which is spectrally broadband, for illuminating a sample (such as the eye 102) with "white light" having wavelengths extending across substantially the entire visible and NIR wavelength ranges. For example, the light source 104 may comprise an LED, an LED array, or a superluminescent diode, wherein the emitted wavelength range covers hundreds of nanometers in the visible and NIR spectral regions. Thus, the same or similar light sources may be used for both the first and second embodiments. The light source may be spatially incoherent or spatially coherent. As in fig. 1A, the device is shown with a spatially incoherent light source 104 and a structured light generator 108 for providing a structured illumination field 110.
The main difference between the apparatus 100 and the apparatus 700 is the replacement of the diffraction grating 140 with a non-dispersive optical guiding element 702, such as a mirror, in the main sample path 705. Mirror 702 directly reflects all beamlets without dispersing the constituent wavelength components, such that all wavelengths co-propagate along path 705 to the eye 102. This results in the eye 102 being illuminated by a grid of "white light" beamlets, each beamlet comprising all wavelengths emitted by the light source 104. This provides more concentrated illumination of the retina 145 than the dispersed beamlets of the apparatus 100, and thus the light source 104 is typically operated at a lower intensity than in the apparatus 100 to improve safety and patient comfort.
The remaining operation of the apparatus 700 is substantially similar to that of the apparatus 100, with each returned beamlet being confocally imaged via the aperture array 150 and the spectrometer 116. The illumination is thus decoupled from the spectrum of the measured returned light, enabling additional information about the eye to be extracted. For example, the sensor array 118 is now able to simultaneously detect both the reflectance spectrum and any fluorescence spectrum due to autofluorescence in the eye tissue as spectral response information. This is possible because the fluorescence spectrum is detected at different wavelengths from the excitation spectrum, so fluorescence can be distinguished without requiring controlled dispersion of the beamlets by a dedicated dispersive optical element.
The apparatus 700 may optionally include a reference arm 704 for performing OCT. In this embodiment, the beam splitter 132 operates as a power distribution element configured to direct a portion of the optical power of the structured illumination field 110 (referred to as reference light 706) along the reference arm 704 and the remainder of the optical power of the structured illumination field towards the eye 102.
In the illustrated embodiment, the reference arm 704 includes a reflective element 708, such as a prism or mirror, for directing reference light 706 along the reference arm 704. The reference arm 704 further includes a lens relay comprising lenses 710, 712, 714, and 716 for substantially matching the path length of the main sample path 705, including the optical power elements of the eye 102. A reflective element in the form of a curved metal reflector 718 is provided for reflecting light back along the reference arm 704 to recombine with light returning from the sample 102 at the beam splitter 132. Preferably, the characteristics of the lens 716 and the reflector 718 are defined in combination such that they substantially match the dispersion characteristics of the eye 102. It should be appreciated that the number, type, and location of optical elements in the reference arm 704 may vary significantly in different designs, provided that their collective effect substantially matches the path length and dispersion characteristics of the main sample path 705.
The inclusion of the reference arm 704 allows coherent gating of the received signal to perform a more axially localized examination of the eye 102. This form of full-color OCT can be used to localize the hyperspectral signal to an axial region of interest. For a reasonable simplification of the system, the spectrum of the input radiation field 106 detected on the two-dimensional sensor array 118 can be divided into a plurality of wavelength intervals (e.g., 100 intervals), each interval having a defined spectral width (e.g., 5 nm). Within each wavelength interval, a subset of the individual wavelengths (e.g., 32 wavelength points) can be used for coherence gating, as in OCT, with a Fast Fourier Transform (FFT) performed over the subset to obtain structural (depth) information for each spectral band. These FFTs provide a range of axial positions across the sample, allowing a plurality of hyperspectral images to be built, each comprising a full two-dimensional mapping across the full range of colors.
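A schematic sketch of this per-interval processing is given below. The 100 intervals and 32 wavelength samples per interval are the example figures quoted above; a practical implementation would additionally resample each interval to uniform wavenumber and apply per-interval dispersion correction, which is omitted here.

```python
import numpy as np

def depth_profiles(interferograms, n_intervals=100, samples_per_interval=32):
    """interferograms: (n_points, n_intervals * samples_per_interval) spectral
    interference data, one row per sampled retinal point.
    Returns (n_points, n_intervals, samples_per_interval) A-scan magnitudes,
    i.e. a coarse depth profile for every spectral band at every point."""
    spectra = np.asarray(interferograms).reshape(-1, n_intervals, samples_per_interval)
    spectra = spectra - spectra.mean(axis=-1, keepdims=True)   # suppress the DC term
    return np.abs(np.fft.fft(spectra, axis=-1))
```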
By limiting the requirements on the range and resolution of OCT, the device can be kept relatively simple without the need for highly accurate dispersion compensation over an extended wavelength range. That is, only dispersion needs to be compensated for within a single hyperspectral wavelength interval.
In some embodiments, the length of the reference arm 704 may be selectively adjustable to select the coherent wavelength to be imaged by the spectrometer. This may be accomplished by an actuator mechanism that mechanically translates the reflector 718 and/or other optical elements in the reference arm 704. In one embodiment, the means for adjusting the reference path length uses the same mechanism as the means for translating the focal plane of the main sample path 705 to a different sample depth.
Thus, the depth within the eye 102 being analyzed can be varied by adjusting the position of the instrument 700 relative to the eye 102. If retinal lenses 142, 144 are present, the focal positions of the retinal lenses can be adjusted to select both the confocal and coherence gating simultaneously and provide hyperspectral mapping of the retinal layers. Where the presence of absorption or scattering features is sought, spectral difference techniques can be applied to different axial layers. For example, wavelength intervals may be differenced to produce information about the blood oxygen saturation of the eye 102.
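A sketch of such a spectral-difference computation on a single axial layer is shown below. The selection of an oxygenation-sensitive interval and a reference interval, and any calibration of the resulting contrast to an actual oxygen-saturation value, are assumptions for illustration and are not specified here.

```python
import numpy as np

def interval_difference(cube, band_a, band_b):
    """Per-pixel difference between two wavelength intervals of a (n_bands, H, W) layer."""
    return cube[band_a] - cube[band_b]

def interval_ratio(cube, band_a, band_b, eps=1e-12):
    """Per-pixel ratio between two wavelength intervals; a crude oximetry-style contrast."""
    return cube[band_a] / (cube[band_b] + eps)
```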
In some embodiments, the device 100 and the device 700 may be combined into a single instrument whose functions can be used in conjunction with each other. For example, in some embodiments, the diffraction grating 140 and the mirror 702 are interchangeable. This interchangeability can be achieved by mechanically moving the diffraction grating 140 and the mirror 702 into or out of the optical path by an actuator as needed. This may be performed by a linear actuator or the like controlled by an external controller or the processor 158.
In some embodiments, system 700 may utilize an eye tracker and/or a retinal camera to accurately register the location on the eye currently being imaged. This allows accurate positional registration of light incident on different areas of the sensor array 118.
Hyperspectral fluorescence imaging of samples
A third embodiment of the present invention is shown in fig. 8 as device 800. The apparatus 800 provides hyperspectral fluorescence imaging of a sample, such as the eye 102, with accurate positional registration to account for eye movement. This function represents a fourth aspect of the invention. The apparatus 800 includes many elements that are common to both apparatuses 100 and 700, and for simplicity, these common elements are identified with the same reference numerals.
The device 800 comprises two light sources 802 and 804, the light sources 802 and 804 being configured to generate respective input light beams 806 and 808. As in figs. 1A and 7, both light sources 802 and 804 are spatially incoherent and are associated with structured light generators 108a and 108b, as described above, to generate structured illumination fields 810 and 812, respectively. However, it will be understood that the apparatus 800 may employ spatially coherent light sources, such as SLEDs or supercontinuum light sources.
The light source 802 is configured to provide a light beam 806 in the ultraviolet or visible wavelength range and preferably in one of the UVA, violet, blue or green wavelength ranges, which will be referred to as "colored light". The light source 804 is configured to provide a light beam 808 in the NIR wavelength range, which will be referred to as "NIR light".
In an alternative embodiment, different wavelength bands of light beams 806 and 808 are obtained from a single light source using wavelength selective optics such as diffractive elements, optical filters, or bulk optical dichroic beam splitters.
The structured color light illumination field 810 from source 802 passes through a primary imaging path 814 that includes the beam splitter 132, lens 136, MEMS 138, mirror 702, and lenses 142 and 144 as previously described. The main imaging path 814 also includes a diffractive multiplexing element in the form of a volume phase grating 815, the operation of which is described below. In other embodiments, the volume phase grating 815 may be replaced with a dichroic mirror and a conventional diffraction grating.
At the same time, the NIR structured illumination field 812 from source 804 passes through a second path 816. Path 816 includes a beam splitter 818, a focusing lens 820, and a mirror 822 to couple the NIR light onto the volume phase grating 815. Beam splitter 818 includes a partially reflective mirror 819 and has similar characteristics to the beam splitter 132 described above. The beam splitter 818 directs NIR light scattered or reflected back along the second path 816 to a spectrometer 824 for spectral analysis in a manner similar to the spectrometer 116. The spectrometer 824 has a corresponding sensor array 826 operatively associated with the processor 158 (connection not shown). The spectrometer 824 also includes a dispersive element 825 similar to that of the spectrometer 116.
Both the color structured illumination field 810 and the NIR structured illumination field 812 are focused to a limited area on the volume phase grating 815. The volume phase grating 815 is configured to both multiplex light from the fields 810 and 812 together and spatially disperse the NIR wavelengths from the NIR field 812. All wavelengths of the color structured illumination field 810 are transmitted through the volume phase grating 815 in a spatially confined (non-dispersed) state. This is achieved by configuring the volume phase grating 815 such that the beamlets of the NIR field 812 are incident on the grating at angles that ensure coupling of the first diffraction order along the main imaging path 814, while the beamlets of the color structured illumination field 810 are incident on the volume phase grating 815 at an angle that ensures that the zeroth diffraction order, comprising all co-located wavelengths, is coupled along the main imaging path 814.
After transmission through the volume phase grating 815, the NIR field 812 comprises a dispersed grid of beamlets that are elongated along the dispersion axis of the volume phase grating 815. In other words, after being dispersed by the volume phase grating 815, the NIR field 812 comprises multiple angularly encoded co-propagating grids of beamlets of different wavelengths, where each grid has a propagation angle that depends on its wavelength. The color structured illumination field 810 is not substantially modified by the volume phase grating 815 and includes all wavelengths of light (e.g., UVA, violet, blue, or green) within the color band emitted from the light source 802. Thus, all wavelengths within the color field 810 co-propagate along the same trajectory as a single grid of beamlets.
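The order-selective behavior described above can be illustrated with the standard grating equation d(sin θ_m - sin θ_i) = m λ: the NIR beamlets are arranged to couple into the first order (m = 1), which deviates and disperses them, while the color beamlets use the zeroth order (m = 0), which is undeviated for all wavelengths. The grating period and angles in the sketch below are arbitrary illustrative values, not parameters of the volume phase grating 815.

```python
import numpy as np

def diffraction_angle_deg(wavelength_nm, period_nm, incidence_deg, order):
    """Diffraction angle from the grating equation; returns nan for evanescent orders."""
    s = order * wavelength_nm / period_nm + np.sin(np.radians(incidence_deg))
    return float(np.degrees(np.arcsin(s))) if abs(s) <= 1 else float("nan")

print(diffraction_angle_deg(850.0, 1600.0, 0.0, 1))   # first order: NIR deviated (~32 degrees)
print(diffraction_angle_deg(550.0, 1600.0, 0.0, 0))   # zeroth order: visible undeviated (0 degrees)
```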
The co-propagating color structured illumination field 810 and NIR structured illumination field 812 are projected onto the eye 102 and imaged at multiple locations 147 on the retina 145. Upon incidence on the retina 145, the beamlets of the colored structured illumination field 810 are encoded with fluorescence information from the retina 145 due to autofluorescence from certain molecules.
Upon reflection, backscattering, or fluorescence 149 from the retina 145, the structured illumination fields 810, 812 return to the volume phase grating 815 along the main imaging path 814. The color structured illumination field 810 and the generated fluorescence return through the volume phase grating 815 without dispersion and then reflect from the mirror 134 of the beam splitter 132 and pass through the aperture array 150 for confocal gating before being detected by the grid spectrometer 116. Confocal gating is important in fluorescence imaging to reject light that does not originate from the measured sample portion (in terms of depth). For example, in the case of imaging an eye, fluorescence or backscatter from the cornea or aqueous humor of the eye can provide a strong signal, which is preferably removed in order to achieve a stronger signal-to-noise ratio for the fluorescence signal from the retina 145.
The dispersion of light in the NIR structured illumination field 812 reflected or backscattered from the retina 145 is reversed as it returns through the volume phase grating 815. The recovered NIR beamlets then return along a second path 816 where they are spatially filtered with an appropriately sized aperture array 850 and imaged or analyzed at a spectrometer 824, which includes a dispersive element 825 and a sensor array 826.
The sensor array 118 of the spectrometer 116 receives a grid of colored beamlets from the first light source 802 that are dispersed by the pair of dispersion wedges 156 according to wavelength. Thus, spectral information for each grid point on the sample may be obtained, including fluorescence spectral information due to local molecules at that location and depth.
At the same time, sensor array 826 receives a grid of NIR beamlets dispersed by dispersive element 825. Processor 158 processes signals received at sensor array 826 to generate spectral mapping information. This information provides positional registration of the sensor locations to locations on the retina 145 because the processor 158 can apply a mapping function to map the trace of dispersed NIR wavelengths. The known locations may be used to reference the fluorescence spectra measured at the sensor 118 to retinal locations to provide accurate mapping and reduce motion blur artifacts.
Through the above process, the apparatus 800 can produce a two-dimensional fluorescence spectrum of a particular region and layer of the eye 102. The device 800 may be used to image different regions of the eye 102 by mechanically tilting the MEMS 138 in the manner described above. Different depths of the eye 102 can be imaged by mechanically translating the device 800 relative to the patient in the manner described above. By imaging different lateral regions and depths, a three-dimensional fluorescence spectrum of the eye 102 can be generated.
In a preferred embodiment, as shown in FIG. 1A, the structured illumination field 110 comprises a two-dimensional array of beamlets 111. However, in an alternative embodiment, as shown in FIG. 1B, the structured illumination field 110 comprises a one-dimensional array of linear beamlets 119. Referring to FIG. 1A, this form of structured illumination field can be produced by replacing the two-dimensional microlens array 130 with a cylindrical microlens array. In other variations, the returned linear beamlets are passed through a microlens array 148 comprising a cylindrical microlens array that collimates them along the short axis, and the collimated beamlets are then passed through a linear aperture array 150 having a pitch corresponding to the one-dimensional array of linear beamlets. The spatially filtered return beamlets are then passed to the spectrometer 116 for spectral analysis, as previously described. The devices described with reference to figs. 7 and 8 may be modified in a similar manner.
Conclusion
It will be appreciated that the above-described invention provides the ability to image color or fluorescence spectral information from a sample at multiple points simultaneously. The present invention can be operated in different modes to obtain different spectral information from a sample, including full-field color spectral information and fluorescence spectral information. The present invention can also be combined with an OCT system if desired.
Embodiments of the present invention provide an efficient retinal confocal imaging system that can rapidly capture fluorescence spectra with sufficient resolution and across a large field of view to be suitable for screening and disease diagnosis. Further embodiments (not shown) can perform the above-described operations as a function of excitation wavelength by selectively changing the wavelength of the input light source.
The fluorescence information obtained from the eye by the apparatus and method of the present invention can be used to identify important information related to the aging and health of the RPE layer, and thus of the photoreceptors. The fluorescence information can also be used to diagnose Alzheimer's disease in a patient.
Explanation of the invention
Those skilled in the art will appreciate that the frequency and wavelength of the laser beam are connected by the following equation:
speed of light = wavelength × frequency, i.e., c = λν.
As a result, when referring to terms such as wavelength band, wavelength dispersion, wavelength correlation, and the like, these terms are interchangeable with the corresponding terms frequency band, frequency dispersion, frequency correlation, and the like.
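As a trivial numerical illustration of this interchangeability (the only assumed constant is the vacuum speed of light):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s

def frequency_thz(wavelength_nm):
    """Convert a vacuum wavelength in nanometres to the corresponding frequency in THz."""
    return C / (wavelength_nm * 1e-9) / 1e12

print(round(frequency_thz(850.0)))   # an 850 nm NIR wavelength is roughly 353 THz
```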
It should be understood that various optical elements described as lenses, prisms, or mirrors are interchangeable with corresponding optical power elements to produce the same overall effect (focusing, converging, collimating, etc.). For example, a mirror, a prism, or a relay of optical elements having an equivalent optical effect may be substituted for a lens. Similarly, one dispersive element, such as a diffraction grating, may replace another dispersive element, such as a prism, to perform an equivalent dispersive operation.
Throughout the specification, the use of the term "element" is intended to refer to a single integral component or a collection of components that are combined to perform a particular function or purpose.
Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "analyzing," or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
In a similar manner, the term "controller" or "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory, to transform that electronic data into other electronic data that may be stored, e.g., in registers and/or memory. A "computer" or "computing machine" or "computing platform" may include one or more processors.
In one embodiment, the methods described herein may be performed by one or more processors accepting computer readable (also referred to as machine readable) code comprising a set of instructions which, when executed by the one or more processors, perform at least one of the methods described herein. Any processor capable of executing a set of instructions that specify actions to be taken, sequentially or otherwise, is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or static RAM and/or ROM. A bus subsystem may be included for communicating between the components. The processing system may further be a distributed processing system having processors coupled by a network. If the processing system requires a display, such a display may be included, for example, a Liquid Crystal Display (LCD) or Cathode Ray Tube (CRT) display. If manual data entry is required, the processing system also includes input devices such as one or more of an alphanumeric input unit (such as a keyboard), a pointing control device (such as a mouse), and the like. The term memory unit as used herein also includes storage systems such as disk drive units, if clear from the context and unless otherwise explicitly indicated. The processing system in some configurations may include a sound output device and a network interface device. Thus, the memory subsystem includes a computer-readable carrier medium carrying computer-readable code (e.g., software) comprising a set of instructions which, when executed by one or more processors, will result in the performance of one or more of the methods described herein. Note that when a method includes multiple elements, e.g., several steps, no order of the elements is implied unless specifically stated. The software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.
Reference throughout this specification to "one embodiment," "some embodiments," or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases "in one embodiment," "in some embodiments," or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments as would be apparent to one of ordinary skill in the art in view of this disclosure.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
In the appended claims and the description herein, any of the terms "comprising," "comprised of," or "including" is an open term that means including at least the elements/features that follow, but not excluding others. Thus, when the term "comprising" is used in the claims, it should not be interpreted as being limited to the means or elements or steps listed thereafter. For example, the scope of the expression "a device comprising A and B" should not be limited to devices consisting of only elements A and B. Any of the terms "including" or "which includes" or "that includes", as used herein, is also an open term that also means including at least the elements/features that follow the term, but not excluding other elements/features. Thus, "including" is synonymous with and means "comprising".
It should be appreciated that in the foregoing description of exemplary embodiments of the disclosure, various features of the disclosure are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the methods of the present disclosure should not be construed as reflecting the intent: the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this disclosure.
Furthermore, as one of ordinary skill in the art will appreciate, although some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are intended to be within the scope of the present disclosure and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
In the description provided herein, numerous specific details are set forth.
However, it is understood that embodiments of the disclosure may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it is to be noticed that the term 'coupled', when used in the claims, should not be interpreted as being restricted to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended to be equivalent to each other. Thus, the scope of the expression "device a coupled to device B" should not be limited to devices or systems in which the output of device a is directly connected to the input of device B. This means that there exists a path between the output of a and the input of B, which may be a path including other devices or components. "coupled" may mean that two or more elements are in direct physical, electrical, or optical contact, or that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
Thus, while there has been described what is believed to be the preferred embodiments of the disclosure, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the disclosure, and it is intended to claim all such changes and modifications as fall within the scope of the disclosure. For example, any of the formulas given above are merely representative of processes that may be used. Functions may be added or deleted from the block diagrams and operations may be interchanged among the functional blocks. Steps may be added or deleted to methods described within the scope of the present disclosure.

Claims (40)

1. An apparatus for analyzing a sample, the apparatus comprising:
a light source configured to generate an input radiation field having a wavelength band comprised of a plurality of wavelengths;
a structured light generator for converting the input radiation field into a structured illumination field comprising an array of beamlets;
an optical system for projecting the structured illumination field onto an area of the sample, including angularly encoding beamlets such that each beamlet is projected onto a position of the sample corresponding to an encoded angle;
a spectrometer comprising a two-dimensional array of sensors, the spectrometer configured to spectrally analyze a portion of light reflected, backscattered, or fluoresced from the region of the sample; and
a processor operatively associated with the spectrometer, the processor comprising:
a spectral mapping module configured to map locations on the sensor array to two-dimensional locations on the sample based on a predefined mapping function and wavelengths of light within a plurality of predefined wavelength intervals; and
a hyperspectral image generator configured to generate a hyperspectral image from sensor signals of the sensor array and the predefined mapping function, the hyperspectral image comprising two or more frontal images of the area of the sample, the two or more frontal images comprising spectral response information of the sample from each beamlet of the structured illumination field.
2. The apparatus of claim 1, wherein the structured light generator is configured to convert the input radiation field into the structured illumination field comprising the two-dimensional array of beamlets.
3. The apparatus of claim 1, wherein the light source is spatially incoherent.
4. The apparatus of claim 1, wherein the optical system comprises means for translating the structured illumination field relative to the sample.
5. The apparatus of claim 4, wherein the means for translating the structured illumination field relative to the sample comprises a tiltable mirror.
6. The apparatus of claim 1, wherein the optical system comprises a dispersive element for angularly dispersing each of the beamlets into an elongated band of spectral components of the beamlets corresponding to each of the plurality of wavelengths.
7. The apparatus of claim 6, wherein the optical system comprises a lens relay disposed, in use, between the dispersive element and the sample to project the beamlet spectral components onto the sample at a location that depends on both the location of original beamlets in the array of beamlets and the wavelength of the beamlet spectral components.
8. The apparatus of claim 7, wherein the beamlet spectral components of different beamlets overlap over the region of the sample such that the plurality of wavelengths are imaged by the spectrometer at each point in the overlapping region over the region of the sample.
9. The apparatus of claim 7, wherein the spectral response information comprises a reflected power spectrum corresponding to each point on the area of the sample.
10. The apparatus of claim 6, wherein the dispersive element is mechanically movable to be removed from the optical path.
11. The apparatus of claim 6, wherein the dispersive element is interchangeable with a non-dispersive optical guiding element.
12. The apparatus of claim 6, wherein the processor is configured to divide the wavelengths corresponding to the elongated bands of the beamlet spectral components into the wavelength intervals, wherein wavelengths within a given wavelength interval are designated to originate from a common location on the sample and wavelengths of different wavelength intervals are designated to originate from different locations on the sample.
13. The apparatus of claim 1, wherein each of the angularly encoded beamlets projected onto the region of the sample comprises each of the plurality of wavelengths.
14. The apparatus of claim 1, wherein the spectral response information comprises a fluorescence spectrum corresponding to a response of the sample to each beamlet of the structured illumination field.
15. The apparatus of claim 3, wherein the structured light generator comprises a beamlet generation device having a two-dimensional array of optical power elements, the beamlet generation device being positioned such that, in use, the input radiation field is incident on the array of optical power elements to generate a two-dimensional array of at least partially collimated beamlets.
16. The apparatus of claim 15, wherein the beamlet generation device comprises a first array of microlenses, wherein the optical power elements are microlenses.
17. The apparatus (100, 700) of claim 15, wherein the structured light generator (108) further comprises:
first optical power means for spatially converging the array of at least partially collimated beamlets to a predefined width at a convergence plane; and
an aperture disposed at the convergence plane, the aperture having a diameter smaller than a diameter of the array of at least partially collimated beamlets at the convergence plane.
18. The apparatus of claim 17, wherein the first optical power device comprises a high numerical aperture lens.
19. The apparatus of claim 17, wherein the first optical power device comprises a variable index lens.
20. The apparatus of claim 17, wherein the structured light generator further comprises:
a second optical power device having a predefined focal length and disposed at a distance from the convergence plane equal to the predefined focal length, the second optical power device generating a near-collimated beam comprising a set of at least partially overlapping beamlets; and
a second microlens array having a two-dimensional array of microlenses and positioned to receive the near-collimated beam and generate a two-dimensional array of beamlets for illuminating the sample.
21. The apparatus of claim 1, wherein the apparatus comprises an aperture array positioned to confocally project the light reflected, backscattered, or fluoresced from the region of the sample into the spectrometer, the aperture array having a pitch corresponding to the beamlet array.
22. The device of claim 1, wherein the light source is configured to generate the input radiation field having a plurality of the wavelength bands, each wavelength band consisting of a corresponding plurality of wavelengths.
23. The apparatus of claim 22, wherein the first wavelength band is selected from the ultraviolet A, violet, blue, or green regions of the electromagnetic spectrum.
24. The apparatus of claim 22, wherein the second wavelength band is in the near infrared region of the electromagnetic spectrum.
25. The apparatus of claim 22, wherein the optical system comprises a multiplexing element for multiplexing a plurality of the wavelength bands together.
26. The apparatus of claim 25, wherein the multiplexing element comprises a volume phase grating.
27. The device of claim 26, wherein the volume-phase grating is configured to spatially disperse the plurality of wavelengths from one of the wavelength bands while maintaining the plurality of wavelengths of another wavelength band in a spatially-limited state.
28. The apparatus of claim 1, wherein the apparatus comprises a reference arm and a power distribution element configured to direct a portion of an optical power of the structured illumination field along the reference arm and a remaining portion of the optical power of the structured illumination field toward the sample.
29. The apparatus of claim 28, wherein a length of the reference arm is selectively adjusted to select a coherent wavelength to be imaged by the spectrometer.
30. The device of claim 1, wherein the device is configured to analyze the sample comprising an eye.
31. A system for generating a structured illumination field, the system comprising:
a spatially incoherent light source configured to generate an input radiation field having a predefined spectral output;
a beamlet generation device having a two-dimensional array of optical power elements, the beamlet generation device being positioned such that the input radiation field is incident on the array of optical power elements to generate a two-dimensional array of at least partially collimated beamlets;
first optical power means for spatially converging the array of at least partially collimated beamlets to a predefined width at a convergence plane;
a second optical power device having a predefined focal length and disposed at a distance from the convergence plane equal to the focal length, the second optical power device generating a near-collimated beam comprising a set of at least partially overlapping beamlets; and
a microlens array having an array of microlenses and positioned to receive the near-collimated light beam and generate a structured illumination field comprising an array of beamlets.
32. The system of claim 31, wherein the array of microlenses comprises a two-dimensional array of the microlenses such that the structured illumination field comprises a two-dimensional array of the beamlets.
33. The system of claim 31, comprising an aperture disposed at the convergence plane, the aperture having a diameter smaller than the predefined width of the array of at least partially collimated beamlets at the convergence plane.
34. The system of claim 31, wherein the first optical power device comprises a high numerical aperture lens.
35. The system of claim 31, wherein the first optical power device comprises a variable index lens.
36. The system of claim 31, wherein the beamlet generation device comprises an array of microlenses, wherein the optical power element is the microlens.
37. A method of analyzing a sample, the method comprising the steps of:
generating a structured illumination field comprising an array of beamlets from an input radiation field having a wavelength band comprised of a plurality of wavelengths;
projecting the structured illumination field onto an area of the sample, including angularly encoding the beamlets such that each beamlet is projected onto a location of the sample corresponding to an encoded angle; and
spectrally analyzing a portion of the light reflected, backscattered, or fluoresced from the region of the sample using a two-dimensional sensor array, the spectral analysis comprising:
mapping locations on the sensor array to two-dimensional locations on the sample based on a predefined mapping function and wavelengths of light within a plurality of predefined wavelength intervals; and
generating a hyperspectral image from sensor signals of the sensor array and the predefined mapping function, the hyperspectral image comprising two or more frontal images of the area of the sample, the two or more frontal images comprising spectral response information of the sample from each beamlet of the structured illumination field.
38. The method of claim 37, wherein the structured illumination field comprises a two-dimensional array of the beamlets.
39. An article of manufacture comprising a computer usable medium having computer readable program code configured to operate the apparatus of claim 1.
40. An article of manufacture comprising a computer usable medium having computer readable program code configured to implement the method of claim 37.
CN201980056673.XA 2018-09-05 2019-09-04 Hyperspectral apparatus and method Active CN112639582B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862727492P 2018-09-05 2018-09-05
US62/727,492 2018-09-05
PCT/AU2019/050943 WO2020047594A1 (en) 2018-09-05 2019-09-04 Hyperspectral apparatus and method

Publications (2)

Publication Number Publication Date
CN112639582A true CN112639582A (en) 2021-04-09
CN112639582B CN112639582B (en) 2024-02-23

Family

ID=69721441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980056673.XA Active CN112639582B (en) 2018-09-05 2019-09-04 Hyperspectral apparatus and method

Country Status (6)

Country Link
US (1) US20210330184A1 (en)
EP (1) EP3847502A4 (en)
JP (1) JP7414807B2 (en)
CN (1) CN112639582B (en)
AU (1) AU2019333923A1 (en)
WO (1) WO2020047594A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114063275A (en) * 2022-01-17 2022-02-18 北京九辰智能医疗设备有限公司 Corneal endothelial cell imaging system, method, apparatus and storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023517677A (en) * 2020-03-13 2023-04-26 ユニバーシティ オブ サザン カリフォルニア A High Throughput Snapshot Spectral Encoding Device for Fluorescence Spectral Microscopy
CN116249475A (en) * 2020-07-14 2023-06-09 澳大利亚眼科研究中心有限公司 Non-mydriatic hyperspectral eye base camera
TWI800991B (en) * 2021-11-18 2023-05-01 宏碁股份有限公司 Visual field detection method and visual field detection system
US20240094054A1 (en) * 2022-09-14 2024-03-21 Alcon Inc. Generation of multispectral imaging information using analytical multispectral imaging

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003046607A1 (en) * 2001-11-26 2003-06-05 Leonid Viktorovich Volkov Method for forming the image in millimetre and sub-millimetre wave band (variants), system for forming the image in millimetre and sub-millimeter wave band (variants), diffuser light (variants) and transceiver (variants)
US20060192971A1 (en) * 2005-02-28 2006-08-31 Princeton Lightwave, Inc. Scanning spectrum analyzer
US20090154886A1 (en) * 2007-12-13 2009-06-18 Microvision, Inc. Multi-zone scanned-beam imager
WO2009098516A2 (en) * 2008-02-08 2009-08-13 University Of Kent Camera adapter based optical imaging apparatus
US20130038941A1 (en) * 2011-08-09 2013-02-14 Primesense Ltd. Lens Array Projector
CN104568777A (en) * 2015-01-12 2015-04-29 南京理工大学 Spectrum-coding-based confocal microscopy imaging device and method
CN105324649A (en) * 2013-06-20 2016-02-10 赛莱特私人有限公司 Ocular metrology employing spectral wavefront analysis of reflected light
CN105352923A (en) * 2014-10-21 2016-02-24 清华大学 Fast wide-field-of-view volume holographic fluorescent microscopic imaging system
TW201638628A (en) * 2015-04-29 2016-11-01 國立中央大學 Structured illumination fluorescence hyperspectral microscopy system with parallel recording
US20170176338A1 (en) * 2015-12-21 2017-06-22 Verily Life Sciences Llc Spectrally And Spatially Multiplexed Fluorescent Probes For In Situ Cell Labeling
US20170343477A1 (en) * 2016-05-27 2017-11-30 Verily Life Sciences Llc Systems and Methods for 4-D Hyperspectral Imaging
WO2018000036A1 (en) * 2016-07-01 2018-01-04 Cylite Pty Ltd Apparatus and method for confocal microscopy using dispersed structured illumination
CN107850530A (en) * 2015-05-04 2018-03-27 港大科桥有限公司 Apparatus and method for the optical imagery of quantitative phase gradient linearity frequency modulation Wavelength-encoding

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018528830A (en) * 2015-09-11 2018-10-04 サイモン フレーザー ユニバーシティーSimon Fraser University Coherence gate wavefront sensorless adaptive optics multi-photon microscopy and related systems and methods
US20170176368A1 (en) 2015-12-22 2017-06-22 Shell Oil Company Apparatus to measure conductivity of non-aqueous liquids at variable temperatures and applied voltages
EP3743711A1 (en) * 2018-01-22 2020-12-02 Verily Life Sciences LLC High-throughput hyperspectral imaging systems

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003046607A1 (en) * 2001-11-26 2003-06-05 Leonid Viktorovich Volkov Method for forming the image in millimetre and sub-millimetre wave band (variants), system for forming the image in millimetre and sub-millimeter wave band (variants), diffuser light (variants) and transceiver (variants)
US20060192971A1 (en) * 2005-02-28 2006-08-31 Princeton Lightwave, Inc. Scanning spectrum analyzer
US20090154886A1 (en) * 2007-12-13 2009-06-18 Microvision, Inc. Multi-zone scanned-beam imager
WO2009098516A2 (en) * 2008-02-08 2009-08-13 University Of Kent Camera adapter based optical imaging apparatus
US20130038941A1 (en) * 2011-08-09 2013-02-14 Primesense Ltd. Lens Array Projector
CN105324649A (en) * 2013-06-20 2016-02-10 赛莱特私人有限公司 Ocular metrology employing spectral wavefront analysis of reflected light
CN105352923A (en) * 2014-10-21 2016-02-24 清华大学 Fast wide-field-of-view volume holographic fluorescent microscopic imaging system
CN104568777A (en) * 2015-01-12 2015-04-29 南京理工大学 Spectrum-coding-based confocal microscopy imaging device and method
TW201638628A (en) * 2015-04-29 2016-11-01 國立中央大學 Structured illumination fluorescence hyperspectral microscopy system with parallel recording
US20160320305A1 (en) * 2015-04-29 2016-11-03 National Central University Fluorescence hyperspectral microscopy system featuring structured illumination and parallel recording
CN107850530A (en) * 2015-05-04 2018-03-27 港大科桥有限公司 Apparatus and method for the optical imagery of quantitative phase gradient linearity frequency modulation Wavelength-encoding
US20170176338A1 (en) * 2015-12-21 2017-06-22 Verily Life Sciences Llc Spectrally And Spatially Multiplexed Fluorescent Probes For In Situ Cell Labeling
US20170343477A1 (en) * 2016-05-27 2017-11-30 Verily Life Sciences Llc Systems and Methods for 4-D Hyperspectral Imaging
WO2017205857A1 (en) * 2016-05-27 2017-11-30 Verily Life Sciences Llc Systems and methods for 4-d hyperspectrial imaging
WO2018000036A1 (en) * 2016-07-01 2018-01-04 Cylite Pty Ltd Apparatus and method for confocal microscopy using dispersed structured illumination


Also Published As

Publication number Publication date
WO2020047594A1 (en) 2020-03-12
AU2019333923A1 (en) 2021-05-06
EP3847502A1 (en) 2021-07-14
EP3847502A4 (en) 2022-06-01
CN112639582B (en) 2024-02-23
JP2021536274A (en) 2021-12-27
US20210330184A1 (en) 2021-10-28
JP7414807B2 (en) 2024-01-16

Similar Documents

Publication Publication Date Title
CN112639582B (en) Hyperspectral apparatus and method
JP6928623B2 (en) Equipment and methods for confocal microscopy using distributed structured illumination
US7488070B2 (en) Optical measuring system and optical measuring method
JP4734326B2 (en) Eye Fourier domain OCT ray tracing method
JP5078004B2 (en) Spectroscopic measurement apparatus and spectral measurement method
JP5243774B2 (en) Ophthalmic surgical microscope with OCT system
US8651662B2 (en) Optical tomographic imaging apparatus and imaging method for optical tomographic image
US9566001B2 (en) Ophthalmologic apparatus
JP7421175B2 (en) Optical unit and retinal imaging device used for retinal imaging
JP2017522066A (en) Imaging system and method with improved frequency domain interferometry
WO2014077057A1 (en) Optical image measuring device
US9517009B2 (en) Structured illumination ophthalmoscope
KR101830320B1 (en) Apparatus and method for optical coherence tomography
WO2022057402A1 (en) High-speed functional fundus three-dimensional detection system based on near-infrared light
KR20150043115A (en) Optical Coherence Tomography Device
JP5587395B2 (en) Ophthalmic surgical microscope with OCT system
JP6898716B2 (en) Optical tomography imaging device
WO2019002256A1 (en) Eye fundus inspection apparatus
US20230087685A1 (en) Apparatus and method for spectral domain optical imaging
WO2023010174A1 (en) Spectral domain optical imaging with wavelength comb illumination
CN115040066B (en) Multifunctional fundus scanning method and system
JP2020174852A (en) Objective lens and fundus imaging apparatus including objective lens
JP2019213609A (en) Imaging device and control method thereof
JPWO2018016409A1 (en) Eye analysis device and eye analysis method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant