WO2021148119A1 - Spectral imaging apparatus and method for simultaneous capture of multiple spectral images - Google Patents

Spectral imaging apparatus and method for simultaneous capture of multiple spectral images

Info

Publication number
WO2021148119A1
WO2021148119A1 (PCT/EP2020/051556)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging apparatus
spectral imaging
resonating cavity
sections
image
Prior art date
Application number
PCT/EP2020/051556
Other languages
French (fr)
Inventor
Mikko Muukki
Mikko Aulis PERÄLÄ
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to PCT/EP2020/051556
Publication of WO2021148119A1

Classifications

    • GPHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0205 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208 using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/0256 Compact construction
    • G01J3/0259 Monolithic
    • G01J3/0272 Handheld
    • G01J3/0291 Housings; Spectrometer accessories; Spatial arrangement of elements, e.g. folded path arrangements
    • G01J3/12 Generating the spectrum; Monochromators
    • G01J3/26 Generating the spectrum; Monochromators using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G01J3/30 Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36 Investigating two or more bands of a spectrum by separate detectors
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging

Definitions

  • the present application relates to an imaging apparatus and method, and more specifically, although not exclusively, to a spectral imaging apparatus and method.
  • Objects can emit, absorb or reflect electromagnetic radiation, and the spectroscopic properties of an object determined from measurements of such emission, absorption or reflection can enable various characteristics of the object to be determined. That is, the spectroscopic response of an object or material, which varies as a function of wavelength (or frequency), can be used to determine various characteristics and properties of an object or environment. For example, emission spectra can enable useful information to be derived about an object, such as its composition.
  • The provision of multiple sections, each of which has a different wavelength bandpass response, enables simultaneous capture of images in different EM bands across the EM spectrum. This provides a reduction in size of the apparatus since complicated structures to vary the response of the apparatus to differing EM bands are not required. Concurrently, a speed increase follows since the apparatus does not require tuning between image capture events to enable the differing EM bands to be imaged.
  • the resonating cavity is adjustable. This enables multiple sets of images to be captured in reduced time since, as noted above, simultaneous capture of multiple images in different EM bands across the EM spectrum can be performed. Thus, with an adjustment to the resonating cavity following capture of these multiple images, a further set of multiple images can be captured, thereby reducing the overall time required to generate, e.g. multi- or hyperspectral images that can be derived or parsed from the captured images.
  • the resonating cavity is an interferometer comprising first and second partially reflective (or semi-transparent) elements.
  • the first reflective element may be stepped, whereby to define multiple partially reflective surfaces. As such, gaps between each of the multiple reflective surfaces and the second reflective element define the multiple sections of the spectral imaging apparatus.
  • a single element may be provided that is fabricated in a stepped configuration, thereby enabling an interferometer structure to be provided that has multiple spectral responses.
  • At least one of the first and second reflective elements is movable. Accordingly, it is possible to vary the spectral response of the multiple sections in one fell swoop.
  • at least one piezoelectric and/or micro-electro-mechanical actuator is provided in order to effect the movement of the first and/or second reflective element. A single actuator reduces size, weight, complexity and power requirements.
  • the imaging structure comprises an image sensor.
  • the multiple sections of the apparatus can each illuminate a respective sub-portion of the image sensor.
  • the image sensor can comprise a colour, low-pass, high-pass or bandpass filter over at least a portion thereof. This can augment or amplify the response of the apparatus to a particular EM band of interest for example.
  • the imaging structure can comprise an array of multiple image sensors.
  • Such an array may be one- or two-dimensional.
  • Each section of the resonating cavity can be arranged to illuminate an image sensor. So, for example, each section has a corresponding image sensor that receives EM radiation that has passed through that section of the resonant cavity.
  • This has the advantage that each image sensor in the array can simultaneously generate image data representing a different band across the EM spectrum.
  • Each resultant image can then be processed in order to derive or generate a multi- or hyperspectral image.
  • At least one of the image sensors in the array can comprise a colour, low-pass, high-pass or bandpass filter over at least a portion thereof.
  • each section of the resonating cavity may be so profiled as to define a plano-convex or a plano-concave surface.
  • a plano-convex or a plano-concave surface of a section of the resonating cavity may be adjacent to an image sensor.
  • the profile of the sections thus provides different thicknesses enabling a reduction in wavelength shift (which may be experienced in the center of an image sensor compared to its corner) that results from EM radiation entering the resonant cavity at different angles.
  • the resonating cavity is a Fabry-Perot interferometer.
  • a method for generating multiple narrow-band images of a portion of a scene or object can comprise providing a resonating cavity defining multiple sections having respective different wavelength bandpass responses, and generating data representing the multiple narrow-band images using an imaging structure configured to receive electromagnetic radiation from the multiple sections of the resonating cavity.
  • the resonating cavity can comprise first and second reflective elements, such that the method further comprises modifying the position of at least one of the first and second reflective elements, whereby to tune the wavelength bandpass responses of the sections of the resonating cavity.
  • the method further comprises generating data representing the multiple narrow-band images at multiple different positions of the at least one of the first and second reflective elements.
  • the imaging structure comprises an image sensor, and the method further comprises generating data representing the multiple narrow-band images from respective sub-portions of the image sensor.
  • the imaging structure comprises an array of multiple image sensors, and the method further comprises generating data representing the multiple narrow-band images from respective image sensors of the array.
  • the data representing the multiple narrow-band images can be generated by exposing the imaging structure to the scene or object, or a portion thereof. Exposing the imaging structure can further comprise exposing the imaging structure via the multiple sections of the resonating cavity at substantially the same time.
  • the data representing the multiple narrow-band images can be used to generate a multispectral or hyperspectral image of the portion of a scene or object.
  • Figure 1 is a schematic representation of a spectral imaging apparatus according to an embodiment
  • Figure 2 is a schematic representation of an imaging apparatus according to an embodiment
  • Figure 3 is a schematic representation of an imaging apparatus according to an embodiment
  • Figure 4 is a graph depicting transmission response of an apparatus according to an embodiment
  • Figure 5 is a schematic representation of an imaging apparatus according to an embodiment
  • Figure 6 is a schematic representation of an imaging apparatus according to an embodiment
  • Figure 7 is a flow diagram of a method according to an embodiment
  • Figure 8 is a flow diagram of a method according to an embodiment
  • Figure 9 is a block diagram of an apparatus according to an example.
  • spectral imaging can be used to image in multiple bands across the EM spectrum, such as the infrared (around 700 nm - 10^5 nm wavelength range), the visible spectrum (around 380 nm to 760 nm), the ultraviolet (around 10 nm - 400 nm wavelength range), x-rays (around 1 nm - 1 pm wavelength range) and so on, or some combination of these in order to acquire image data that can be used to determine information or insight about an object or environment that would otherwise not be determinable using visible light alone. That is, image data can be generated in visible and non-visible bands of the EM spectrum.
  • multispectral imaging, in which up to around 10-15 different wavelength ranges across the EM spectrum can be used (including wavelengths beyond the visible light range), can be used to capture image data in multiple spectral bands.
  • Multispectral imaging can be used in, for example, printed circuit board inspection, detection of counterfeit currency, skin characterisation and inspection of foodstuffs, to name but a few.
  • Hyperspectral imaging defines another subcategory of spectral imaging in which, broadly speaking, in excess of 10-15 different wavelength ranges across the EM spectrum can be used for capture. Hyperspectral imaging thus measures continuous spectral bands, and a complete spectrum can be generated at every pixel in an image plane. Multiple such planes define a hyperspectral image cube.
  • Apparatus for generating such data is typically large due to the nature of the physical components used to enable the capture of images in the multiple different EM bands, whilst also being slow to capture data due to the way in which the apparatus is tuned between image capture events in order to reconfigure the apparatus to sample the next desired EM band.
  • Such apparatus used to create spectral images can comprise, for example, imaging devices that include mechanically rotating filters, electronically tunable filters (e.g., liquid crystal based), pixel level filter arrays, scanning prism based systems, camera arrays with FPI (Fabry-Perot Interferometer) filters, cameras with MEMS (Micro-Electro-Mechanical System) tunable FPIs (scanning type), and so on.
  • Such apparatus can generate a sequence of images within respective different EM bands, which can then be combined in a desired way to create a multi- or hyperspectral image.
  • a capture apparatus utilizing a Fabry-Perot interferometer can comprise a pair of partially reflective glass optical flats that may be spaced nanometers to millimeters apart, with the reflective surfaces facing each other.
  • Optical waves can pass through the optical cavity only when they are in resonance with it, thereby enabling the FPI to act as a band pass filter.
  • the cavity between two optical surfaces can be changed in order to change the filter band and thus the wavelength of the captured light.
  • spectral information could be used for more reliable illuminant detection, which can lead to improvements in the color of photos or videos, such as by enabling colour temperature to be more accurately determined to enable white balance adjustments.
  • Spectral images can also provide additional information (in comparison to traditional RGB images), which can help image processing algorithms, for example in the field of object or edge detection where certain parts of an image which may otherwise be difficult to detect in an RGB image can be more easily determined when imaged in a different, e.g. non-visible, band of the EM spectrum.
  • a spectral imaging apparatus can be used to simultaneously generate data representing multiple images of a scene or an object.
  • Each of the images can comprise data relating to a selected EM band, such as those noted above for example.
  • the spectral imaging apparatus is compact enough to be provided as part of a mobile device, and can generate image data of an object or scene relating to multiple different EM bands without the need for physical adjustment stemming from a requirement to vary a band pass response of the apparatus between image capture events. This has an advantage that simultaneous image capture across several different EM bands can be performed, as opposed to the quasi-simultaneous capture of devices in which capture events in differing bands proceed subsequent to physical modification of a band pass structure, thereby affecting the rate at which images can be captured.
  • spectral image data can be generated which can be used to improve image or video processing by, for example, enabling illuminant detection to facilitate corresponding white balance adjustments to be made that otherwise more accurately reflect the situation being imaged.
  • spectral imaging can also be used to enable improved edge detection (e.g. for parts of an image that may be substantially undetectable in a typical RGB image).
  • Such detected edges can be used for image segmentation, which can be used to facilitate various image processing effects that can be isolated to certain parts of an image as desired, such as sharpening, blurring, enhanced bokeh effects and so on.
  • the spectral imaging apparatus comprises a resonating (or resonant) cavity disposed over an imaging structure.
  • the resonating cavity comprises multiple sections each having respective different wavelength bandpass responses.
  • exposure of the imaging structure to a scene or object results in generation of data representing multiple spectral images, the nature of which depend on the response of the sections of the resonating cavity to incident EM radiation.
  • object, scene and environment do not preclude consideration of portions or parts of the same. That is, when referring to an object, scene or environment, such references can include the case that a part, portion or region of the object, scene or environment is that being considered.
  • the resonating cavity of the imaging apparatus can be adjusted, thereby increasing the number of measurements that can be taken across the EM spectrum.
  • Figure 1 is a schematic representation of a spectral imaging apparatus according to an embodiment.
  • Apparatus 100 can be used to generate image data representing multiple images of a scene or an object (or portion thereof), and comprises a resonating cavity 101 disposed over an imaging structure 103.
  • the resonating cavity 101 comprises multiple sections 105a-e.
  • the bandpass response of each section 105a-e of the apparatus is different. That is, EM radiation incident on the apparatus at the outwardly facing surface of element 107 passes, via the cavity 101 (which may comprise free space for example), to the upper surfaces 109a-e of the sections 105a-e.
  • each of the upper surfaces 109a-e of the sections 105a-e is disposed at a different distance from the element 107.
  • the resonating cavity of the apparatus is in the form of a Fabry-Perot interferometer (FPI).
  • the inwardly facing surface of element 107 and the upper surfaces 109a-e are optical flats that are partially reflective, with their reflective surfaces substantially parallel to and facing each other.
  • EM radiation incident on the apparatus can pass through element 107 into the resonating cavity 101.
  • Such incident EM radiation will then only pass through the partially reflective upper surfaces 109a-e of the sections 105a-e when it is in resonance.
  • Whether a particular wavelength of incident radiation will pass by virtue of it being in resonance is determined by the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107, which increase as shown.
  • separations 111a-e may be of the order of micrometres or centimetres.
  • each of the sections 105a-e forms a band pass filter since each section is structured as an FPI, each of which has a different thickness parameter thereby leading to differing resonant enhancements.
  • one or more of element 107 and the surfaces 109a-e can be slightly wedge shaped, as is common in FPI structures, to prevent the generation of interference fringes.
  • an anti-reflective coating may be applied to one or more of the rear (i.e. non-reflective surface) of the element 107 and the interface 113 between the section 105a-e and the imaging structure 103.
  • the sections 105a-e define stepped surfaces of an FPI element.
  • Such stepped surfaces may be manufactured by, for example, etching a stepped plate from a substrate material, and coating the stepped plate with, e.g., a thin metallic layer (such as silver) to form the semitransparent surfaces.
  • the element 107 may also be made semitransparent (or, put another way, partially reflective) by coating its inwardly facing (i.e. facing towards the inside of the resonant cavity 101) surface with a thin metallic layer (again, such as silver). That is, the inwardly facing surface of element 107 and the upper surfaces 109a-e (facing towards the inside of the resonant cavity 101) can be made partially reflective by coating with a thin layer of reflective material.
  • the thickness of such layers may be between 20-100 nm, and more preferably between 40-60 nm.
  • the nominal value for 111a-e (representing the size of an airgap for example) can be different for each section, and the step between adjacent sections need not be uniform.
  • the bandpass responses need not all have the same width.
  • the full-width at half-maximum (FWHM) of a bandpass response may be, for example, between 10-100 nm, and more preferably between 10-50 nm.
  • the resonant cavity can be adjustable. That is, element 107 may be movable using, e.g., one or more piezoelectric and/or MEMS actuators, and/or by some other actuating mechanism. In addition to element 107 being moveable, one or more of the surfaces 109a-e may also be movable using an actuation mechanism such as one of those mentioned above. That is, the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107 can, in an embodiment, be varied by moving element 107 up or down (relative to the surfaces 109a-e), and/or moving the surfaces 109a-e up or down (relative to the element 107).
  • Figure 1 depicts actuators 150, 151. One or both may be provided to move element 107 and/or section 105a. Actuator 151 may also be configured to move sections 105b-e, or separate actuators (not shown) may be used.
  • movement of element 107 relative to the surfaces 109a-e results in a change to the band pass properties of the sections of the resonant cavity 101.
  • image data may be generated at several different positions of the element 107 resulting in multiple spectral images.
  • alternatively or additionally, the surfaces 109a-e may be moved.
  • a reduction in complexity results from simply enabling movement of element 107 using a single actuation mechanism.
  • movement of element 107 provides an elegant way to adjust multiple band pass responses (i.e. of sections 105a-e) in one go, resulting in the ability to simultaneously generate image data across multiple bands of the EM spectrum.
  • When the gap (e.g. air gap) defining the resonant cavity 101 between element 107 and the stepped sections 105a-e is changed, the wavelength of the passing radiation changes; with different gaps, multiple different transmissions can therefore be imaged.
  • figure 2 is a schematic representation of an imaging apparatus according to an embodiment.
  • the imaging structure 103 comprises multiple image sensors 201a-e, each disposed beneath a corresponding section 105a-e of the apparatus 100.
  • Each image sensor generates image data from a section of the apparatus that sits above it. So, for example, image sensor 201a captures an image from EM radiation that passes through the resonant cavity 101 via the section 105a.
  • image sensor 201b captures an image from EM radiation that passes through the resonant cavity 101 via the section 105b, and so on.
  • each image sensor will image a different part of the EM spectrum, as defined or selected based on the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107.
  • a broad wavelength range can be covered in a single capture operation without the requirement to adjust components between capture events to vary the response of the apparatus.
  • Figure 3 is a schematic representation of an imaging apparatus according to an embodiment.
  • the apparatus of figure 3 is similar to the apparatus described above with reference to figure 1 or 2, but is shown from a different viewpoint. Specifically, the apparatus of figure 3 is depicted in plan view (i.e. from the top, as opposed to the side views as shown in figures 1 and 2).
  • an array of 15 image sensors is depicted, arranged in three rows 301a-e, 303a-e, 305a-e. It will be appreciated that more or fewer image sensors may be provided as desired.
  • each of the image sensors is associated with a corresponding section of an apparatus 100.
  • sections 105a-e may be associated with image sensors in one of the three rows depicted, with corresponding additional sections for each of the other sensors in the other two rows. That is, each image sensor of figure 3 can capture an image from a selected band of the EM spectrum that has passed via the resonant cavity 101.
  • a group or combination of image sensors may capture an image from a single (e.g. relatively larger) section.
  • a group or any combination of two or more image sensors can capture an image relating to one EM band, although preferably the sections are configured such that they each enable a different EM band to be imaged using corresponding single image sensors, thereby enabling simultaneous capture of multiple spectrally different images of a scene or an object, or portion thereof.
  • each image sensor in an image sensor array can have a different gap above it that defines the size between the corresponding FPI element surfaces.
  • a single image sensor is provided, which comprises 15 sub-portions arranged in three rows 301a-e, 303a-e, 305a-e. It will be appreciated that more or fewer sub-portions may be provided as desired.
  • each of the sub-portions is associated with a corresponding section of an apparatus 100.
  • sections 105a-e may be associated with sub-portions in one of the three rows depicted, with corresponding additional sections for each of the sub-portions in the other two rows. That is, each sub-portion of figure 3 can capture an image from a selected band of the EM spectrum that has passed via the resonant cavity 101.
  • a group or combination of sub-portions may capture an image from a single (e.g. relatively larger) section.
  • a group or any combination of two or more sub-portions can capture an image relating to one EM band, although preferably the sections are configured such that they each enable a different EM band to be imaged using a corresponding single sub-portion, thereby enabling simultaneous capture of multiple spectrally different images of a scene or an object, or portion thereof.
  • each sub-portion in a camera array can have a different gap above it that defines the size between the corresponding FPI element surfaces.
  • an imaging structure, which can be in the form of a camera array, may thus comprise a single image sensor with multiple sub-portions and/or an array of multiple image sensors as described above.
  • the camera array can include the tunable FPI element and a lens or multiple lenses.
  • lenses may be located between an image sensor and the tunable FPI element (e.g. element 107) and/or on top of the FPI element.
  • the camera array may have a size of NxM, where at least one of N or M is larger than 1.
  • Values for an array of image sensors and/or sensor sub-portions may be, for example, 1x2, 1x3, 1x4, 2x2, 2x3, 2x4, 3x3, 4x3, 4x4.
  • one or more of the surfaces defining the FPI resonant cavity can be moved in order to tune the FPI. That is, the distance between FPI element surfaces can be changed by moving the element 107, the stepped plate defining the sections of the apparatus, or both.
  • figure 4 is a graph depicting transmission response of an apparatus according to an embodiment.
  • three capture events are depicted, each at a different position of the element 107 and/or the stepped plate defining the sections of the apparatus, for three image sensors (and three corresponding FPI sections).
  • As the element 107, the stepped plate defining the sections of the apparatus, or both are moved between capture events, there is a wavelength shift. Accordingly, a high number of images at differing wavelengths can be captured with relatively few changes to the physical structure. This provides an increase in speed compared to systems in which a single wavelength is imaged at a time.
  • FIG. 5 is a schematic representation of an imaging apparatus according to an embodiment.
  • each section 105a-e is modified so that it varies spatially.
  • a region 301a-e of different material may be provided at the interface 113 between sections 105a-e and the imaging structure 103.
  • regions 301a-e may comprise lenses. The size and/or shape of the regions 301a-e may vary between one another.
  • each section 105a-e is not flat, but defines a curved region, e.g. the spherically shaped regions 301a-e in figure 5.
  • regions 301a-e may have a plano-convex and/or plano-concave surface to guide radiation to a corresponding image sensor.
  • Figure 6 is a schematic representation of an imaging apparatus according to an embodiment in which the regions 301a-e are plano-concave (compared to the example shown in figure 5).
  • the FPI element has an optical aperture that passes selected bands of incident EM radiation.
  • the shape of the aperture may be circular or elliptical, or there may be an array of optical apertures, e.g. one for each image sensor in an array of image sensors.
  • the element 107 may be provided in a stepped form, either along with or in place of the stepped nature of the sections as depicted in the accompanying figures.
  • each sub-portion of a camera array may have an optical aperture that defines a limiting aperture that passes radiation/light. Those apertures may be the same as the optical apertures of FPI elements or there can be additional apertures in at least one optical path of the camera array.
  • one or more of image sensors can comprise a filter, such as a low pass, high pass or bandpass filter, to limit the wavelengths that the image sensor detects.
  • one or more of the image sensors may comprise a polarizing filter.
  • Transmission curves may be tuned by effectively modifying the size of the gap between reflective surfaces of the resonant cavity.
  • tuning of the FPI filters can be performed using an actuator (e.g., Piezo or MEMS actuator) or by some other actuating mechanism.
  • the size of the system can be balanced against its speed by way of the number of image sensors in the imaging structure.
  • Tuning of the element can be performed with a single actuator (e.g. 150), which saves space and cost.
  • a final spectral image of an object or scene/environment can be parsed from the multiple images captured by the image sensors of the imaging structure.
  • the sequence can be captured to mimic a multispectral camera or to mimic a hyperspectral camera where scanning is performed in small increments.
  • Figure 7 is a flow diagram of a method according to an embodiment.
  • the method of figure 7 comprises generating multiple narrow-band images of a scene or object, or portion thereof.
  • a resonating cavity defining multiple sections having respective different wavelength bandpass responses is provided.
  • data representing the multiple narrow-band images is generated.
  • the data is generated, in the example of figure 7, using an imaging structure configured to receive electromagnetic radiation from the multiple sections of the resonating cavity.
  • Figure 8 is a flow diagram of a method according to an embodiment.
  • the method of figure 8 comprises generating multiple narrow-band images of a scene or object, or portion thereof, similarly to that provided according to the method of figure 7.
  • the resonating cavity is adjusted in order to modify the response of the various sections therein (such as described with reference to figure 4 for example).
  • the size of the gap between reflective surfaces of the resonant cavity can be modified by moving one or other or both of the element 107 and surfaces 109a-e.
  • the position of the surfaces may be modified any desired number of times between capture events of block 703 resulting in, e.g., a set of transmission responses such as shown in figure 4.
  • Figure 9 is a block diagram of an apparatus according to an example.
  • An apparatus 900 as shown in figure 9 may be configured to implement each step and method in the foregoing method embodiments.
  • the apparatus 900 may be applied to, or be present as part of, a mobile device, such as a smart phone or a terminal in various communications systems for example.
  • the apparatus 900 includes a processing unit (including one or more processors) 901, and a memory 902.
  • the processing unit 901 controls the operation of the apparatus 900, and may also be called a CPU (Central Processing Unit).
  • the memory 902 may include a read-only memory and a random-access memory (RAM), and can provide instructions 903 and data for the processing unit 901.
  • a part of the memory 902 may further include a nonvolatile random-access memory (NVRAM).
  • the apparatus 900 may be embedded into or may be a wireless communications device such as a mobile phone or other portable communications device such as a smart phone or tablet.
  • each step of the method may be completed by using an integrated logic circuit of hardware in the processing unit 901 or instructions in a software form. These instructions may be implemented and controlled by using the processing unit 901.
  • the foregoing processing unit may include a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component; and can implement or execute each disclosed method, step, and logic block diagram in the embodiments of the present invention.
  • the general-purpose processor may be a microprocessor or the processor may be any common processor or decoder, and so on.
  • a step with reference to a method disclosed herein may be directly executed and completed by a hardware decoding processor or executed and completed by a combination of hardware and a software module in a decoding processor.
  • the software module may be located in a mature storage medium in the art, such as a random-access memory, a flash memory, a read-only memory, a programmable read-only memory, an electronically erasable programmable memory, or a register.
  • the storage medium is located in the memory 902, and the processing unit 901 can read information in the memory 902, and can complete the steps of a method with reference to the hardware.
  • the memory 902 may store information 904 about a narrow band image for the processing unit 901 to process.
  • the memory 902 may store information 904 about (or representing) multiple narrow band images for the processing unit 901 to parse in order to generate a final spectral image of an object or scene.
  • units and algorithm steps may be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on particular applications and design constraint conditions of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that such implementation goes beyond the scope of the present invention.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the described apparatus embodiment is merely exemplary.
  • a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual couplings or direct couplings or communications connections may be implemented through some interfaces.
  • the indirect couplings or communications connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
  • the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. A part or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • functional units in the embodiments of the present invention may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
  • When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or a part of the technical solutions may be implemented in the form of a software product.
  • the computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or a part of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes: any medium that can store program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random-access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

According to an example, a spectral imaging apparatus for generating multiple images of a scene or an object comprises a resonating cavity disposed over an imaging structure, wherein the resonating cavity comprises multiple sections each having respective different wavelength bandpass responses.

Description

SPECTRAL IMAGING APPARATUS AND METHOD FOR SIMULTANEOUS CAPTURE
OF MULTIPLE SPECTRAL IMAGES
TECHNICAL FIELD
The present application relates to an imaging apparatus and method, and more specifically, although not exclusively, to a spectral imaging apparatus and method.
BACKGROUND
Objects can emit, absorb or reflect electromagnetic radiation, and the spectroscopic properties of an object determined from measurements of such emission, absorption or reflection can enable various characteristics of the object to be determined. That is, the spectroscopic response of an object or material, which varies as a function of wavelength (or frequency), can be used to determine various characteristics and properties of an object or environment. For example, emission spectra can enable useful information to be derived about an object, such as its composition.
SUMMARY
In a first aspect, a spectral imaging apparatus for generating multiple images of a scene or an object comprises a resonating cavity disposed over an imaging structure, wherein the resonating cavity comprises multiple sections each having respective different wavelength bandpass responses.
The provision of the multiple sections, each of which has a different wavelength bandpass response, enables simultaneous capture of images in different EM bands across the EM spectrum. This provides a reduction in size of the apparatus since complicated structures to vary the response of the apparatus to differing EM bands are not required. Concurrently, a speed increase follows since the apparatus does not require tuning between image capture events to enable the differing EM bands to be imaged.
In an implementation of the first aspect, the resonating cavity is adjustable. This enables multiple sets of images to be captured in reduced time since, as noted above, simultaneous capture of multiple images in different EM bands across the EM spectrum can be performed. Thus, with an adjustment to the resonating cavity following capture of these multiple images, a further set of multiple images can be captured, thereby reducing the overall time required to generate, e.g. multi- or hyperspectral images that can be derived or parsed from the captured images.
In an implementation of the first aspect, the resonating cavity is an interferometer comprising first and second partially reflective (or semi-transparent) elements. The first reflective element may be stepped, whereby to define multiple partially reflective surfaces. As such, gaps between each of the multiple reflective surfaces and the second reflective element define the multiple sections of the spectral imaging apparatus.
Thus, a single element may be provided that is fabricated in a stepped configuration, thereby enabling an interferometer structure to be provided that has multiple spectral responses.
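Purely as an illustration of this geometry, the sketch below describes the stepped sections in software as a list of air gaps, each paired with the sensor (or sensor sub-portion) it illuminates. The gap values and sensor assignments are hypothetical placeholders, not figures taken from the application.

```python
# Illustrative only: a hypothetical description of the stepped sections of the
# resonating cavity. Gap values (the separations 111a-e) and sensor indices are
# placeholders; the application does not specify numeric values.
from dataclasses import dataclass

@dataclass
class Section:
    name: str          # section label, e.g. "105a"
    gap_nm: float      # nominal gap between surface 109x and element 107
    sensor_index: int  # image sensor (or sub-portion) illuminated by this section

sections = [
    Section("105a", 300.0, 0),
    Section("105b", 350.0, 1),
    Section("105c", 400.0, 2),
    Section("105d", 450.0, 3),
    Section("105e", 500.0, 4),
]
```

Because each gap is different, each entry corresponds to a different bandpass response, which is what allows a single exposure to yield several spectrally distinct images.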
In an implementation of the first aspect, at least one of the first and second reflective elements is movable. Accordingly, it is possible to vary the spectral response of the multiple sections in one fell swoop. In an implementation of the first aspect, at least one piezoelectric and/or micro-electro-mechanical actuator is provided in order to effect the movement of the first and/or second reflective element. A single actuator reduces size, weight, complexity and power requirements.
In an implementation of the first aspect, the imaging structure comprises an image sensor.
The multiple sections of the apparatus can each illuminate a respective sub-portion of the image sensor. The image sensor can comprise a colour, low-pass, high-pass or bandpass filter over at least a portion thereof. This can augment or amplify the response of the apparatus to a particular EM band of interest for example.
According to an implementation of the first aspect, the imaging structure can comprise an array of multiple image sensors. Such an array may be one- or two-dimensional. Each section of the resonating cavity can be arranged to illuminate an image sensor. So, for example, each section has a corresponding image sensor that receives EM radiation that has passed through that section of the resonant cavity. This has the advantage that each image sensor in the array can simultaneously generate image data representing a different band across the EM spectrum. Each resultant image can then be processed in order to derive or generate a multi- or hyperspectral image. At least one of the image sensors in the array can comprise a colour, low-pass, high-pass or bandpass filter over at least a portion thereof.
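As a rough illustration of how the simultaneously captured frames might be combined downstream, the sketch below stacks one frame per sensor into a single multispectral array. The read_frame() helper, the frame size and the band-centre values are assumptions made for the example, not details given in the application.

```python
# Illustrative sketch: stacking one narrow-band frame per image sensor into a
# multispectral cube. read_frame(), the frame size and the band centres are
# hypothetical placeholders.
import numpy as np

def read_frame(sensor_index: int) -> np.ndarray:
    """Placeholder for reading one frame from the sensor at `sensor_index`."""
    return np.zeros((480, 640), dtype=np.uint16)

band_centres_nm = [450.0, 550.0, 650.0, 750.0, 850.0]  # one per section 105a-e

# One simultaneous exposure yields one frame per sensor; stacking gives a
# (bands, height, width) array from which a multi- or hyperspectral image can be derived.
cube = np.stack([read_frame(i) for i in range(len(band_centres_nm))], axis=0)
print(cube.shape)  # (5, 480, 640)
```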
In an implementation of the first aspect, each section of the resonating cavity may be so profiled as to define a plano-convex or a plano-concave surface. Such a plano-convex or a plano-concave surface of a section of the resonating cavity may be adjacent to an image sensor.
The profile of the sections thus provides different thicknesses enabling a reduction in wavelength shift (which may be experienced in the center of an image sensor compared to its corner) that results from EM radiation entering the resonant cavity at different angles.
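The wavelength shift mentioned here follows from the standard Fabry-Perot resonance condition; the relations below are textbook forms given for orientation rather than equations quoted from the application. For an air-spaced cavity the passband centre falls as the internal angle of incidence grows, so pixels towards the corner of a sensor (with a larger chief-ray angle) see a slightly shorter wavelength than pixels at the centre, and a suitably curved cavity surface can offset this.

```latex
% Standard Fabry-Perot relations (textbook form, not quoted from the application).
% Resonance for an air gap d at internal angle of incidence \theta:
2 d \cos\theta = m \lambda_m, \qquad m = 1, 2, 3, \dots
% The passband centre therefore shifts with angle as
\lambda(\theta) = \lambda_0 \cos\theta \approx \lambda_0 \left(1 - \tfrac{\theta^{2}}{2}\right),
% e.g. \lambda_0 = 800\,\mathrm{nm} and \theta = 10^{\circ} give \lambda \approx 788\,\mathrm{nm},
% a shift that a locally different cavity thickness can be shaped to compensate.
```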
In an implementation of the first aspect, the resonating cavity is a Fabry-Perot interferometer.
In a second aspect, there is provided a method for generating multiple narrow-band images of a portion of a scene or object. The method can comprise providing a resonating cavity defining multiple sections having respective different wavelength bandpass responses, and generating data representing the multiple narrow-band images using an imaging structure configured to receive electromagnetic radiation from the multiple sections of the resonating cavity.
In an implementation of the second aspect, the resonating cavity can comprise first and second reflective elements, such that the method further comprises modifying the position of at least one of the first and second reflective elements, whereby to tune the wavelength bandpass responses of the sections of the resonating cavity. In an implementation of the second aspect, the method further comprises generating data representing the multiple narrow-band images at multiple different positions of the at least one of the first and second reflective elements.
As such, multiple sets of images can be captured in reduced time since, as noted above, simultaneous capture of multiple images in different EM bands across the EM spectrum can be performed. In another implementation of the second aspect, the imaging structure comprises an image sensor, and the method further comprises generating data representing the multiple narrow-band images from respective sub-portions of the image sensor. In another implementation of the second aspect, the imaging structure comprises an array of multiple image sensors, and the method further comprises generating data representing the multiple narrow-band images from respective image sensors of the array.
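The capture sequence implied by this implementation can be pictured as a short loop: move the reflective element, expose all sections at once, repeat. The sketch below is a schematic outline only; set_gap_offset() and capture_all_sections() stand in for hypothetical actuator and sensor drivers, and the offset values are arbitrary.

```python
# Schematic outline of the second-aspect capture loop (hypothetical driver calls
# and arbitrary actuator offsets; not an implementation from the application).
from typing import List

def set_gap_offset(offset_nm: float) -> None:
    """Placeholder: command a piezo/MEMS actuator to shift a reflective element by offset_nm."""
    pass

def capture_all_sections() -> List[bytes]:
    """Placeholder: expose the imaging structure once, returning one frame per section."""
    return [b""] * 5

def acquire_spectral_stack(gap_offsets_nm=(0.0, 20.0, 40.0)) -> List[List[bytes]]:
    """One simultaneous multi-band capture per position of the reflective element."""
    captures = []
    for offset in gap_offsets_nm:
        set_gap_offset(offset)                 # re-tunes every section's passband at once
        captures.append(capture_all_sections())
    return captures
```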
In an implementation of the second aspect, the data representing the multiple narrow-band images can be generated by exposing the imaging structure to the scene or object, or a portion thereof. Exposing the imaging structure can further comprise exposing the imaging structure via the multiple sections of the resonating cavity at substantially the same time.
In an implementation of the second aspect, the data representing the multiple narrow-band images can be used to generate a multispectral or hyperspectral image of the portion of a scene or object.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic representation of a spectral imaging apparatus according to an embodiment;
Figure 2 is a schematic representation of an imaging apparatus according to an embodiment;
Figure 3 is a schematic representation of an imaging apparatus according to an embodiment;
Figure 4 is a graph depicting transmission response of an apparatus according to an embodiment;
Figure 5 is a schematic representation of an imaging apparatus according to an embodiment;
Figure 6 is a schematic representation of an imaging apparatus according to an embodiment; Figure 7 is a flow diagram of a method according to an embodiment;
Figure 8 is a flow diagram of a method according to an embodiment; and Figure 9 is a block diagram of an apparatus according to an example.
DESCRIPTION
Example embodiments are described below in sufficient detail to enable those of ordinary skill in the art to embody and implement the systems and processes herein described. It is important to understand that embodiments can be provided in many alternate forms and should not be construed as limited to the examples set forth herein.
Accordingly, while embodiments can be modified in various ways and take on various alternative forms, specific embodiments thereof are shown in the drawings and described in detail below as examples. There is no intent to limit to the particular forms disclosed. On the contrary, all modifications, equivalents, and alternatives falling within the scope of the appended claims should be included. Elements of the example embodiments are consistently denoted by the same reference numerals throughout the drawings and detailed description where appropriate.
The terminology used herein to describe embodiments is not intended to limit the scope. The articles “a,” “an,” and “the” are singular in that they have a single referent, however the use of the singular form in the present document should not preclude the presence of more than one referent. In other words, elements referred to in the singular can number one or more, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, items, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, items, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein are to be interpreted as is customary in the art. It will be further understood that terms in common usage should also be interpreted as is customary in the relevant art and not in an idealized or overly formal sense unless expressly so defined herein. Spectral imaging can be used to generate image data of an object or environment in various EM bands. For example, spectral imaging can be used to image in multiple bands across the EM spectrum, such as the infrared (around 700 nm - 10^5 nm wavelength range), the visible spectrum (around 380 nm to 760 nm), the ultraviolet (around 10 nm - 400 nm wavelength range), x-rays (around 1 nm - 1 pm wavelength range) and so on, or some combination of these in order to acquire image data that can be used to determine information or insight about an object or environment that would otherwise not be determinable using visible light alone. That is, image data can be generated in visible and non-visible bands of the EM spectrum.
For example, multispectral imaging (in which up to around 10-15 different wavelength ranges across the EM spectrum can be used, including light from wavelengths beyond the visible light range) can be used to capture image data in multiple spectral bands. Multispectral imaging can be used in, for example, printed circuit board inspection, detection of counterfeit currency, skin characterisation and inspection of foodstuffs, to name but a few.
Hyperspectral imaging defines another subcategory of spectral imaging in which, broadly speaking, in excess of 10-15 different wavelength ranges across the EM spectrum can be used for capture. Hyperspectral imaging thus measures continuous spectral bands, and a complete spectrum can be generated at every pixel in an image plane. Multiple such planes define a hyperspectral image cube.
Apparatus for generating such data is typically large due to the nature of the physical components used to enable the capture of images in the multiple different EM bands, whilst also being slow to capture data due to the way in which the apparatus is tuned between image capture events in order to reconfigure the apparatus to sample the next desired EM band. Such apparatus used to create spectral images can comprise, for example, imaging devices that include mechanically rotating filters, electronically tunable filters (e.g., liquid crystal based), pixel level filter arrays, scanning prism based systems, camera arrays with FPI (Fabry-Perot Interferometer) filters, cameras with MEMS (Micro-Electro-Mechanical System) tunable FPIs (scanning type), and so on. Such apparatus can generate a sequence of images within respective different EM bands, which can then be combined in a desired way to create a multi- or hyperspectral image. For example, a capture apparatus utilizing a Fabry-Perot interferometer can comprise a pair of partially reflective glass optical flats that may be spaced nanometers to millimeters apart, with the reflective surfaces facing each other. Optical waves can pass through the optical cavity only when they are in resonance with it, thereby enabling the FPI to act as a band pass filter. In MEMS tunable FPI systems, the cavity between two optical surfaces can be changed in order to change the filter band and thus the wavelength of the captured light.
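For reference, the band pass behaviour of such a Fabry-Perot cavity is described by the standard Airy transmission function; the expressions below are textbook relations, not formulas reproduced from the application.

```latex
% Textbook Airy transmission of a Fabry-Perot cavity with mirror reflectance R
% and air gap d at normal incidence (for orientation only):
T(\lambda) = \frac{1}{1 + F \sin^{2}(\delta/2)}, \qquad
\delta = \frac{4\pi d}{\lambda}, \qquad
F = \frac{4R}{(1-R)^{2}}.
% Transmission peaks (the pass bands) occur where 2d = m\lambda, and they narrow
% as the finesse \mathcal{F} = \pi\sqrt{R}/(1-R) increases.
```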
However, as noted, image capture with these types of system is slow since every band is captured individually. Furthermore, such systems are generally large to accommodate the mechanism used to vary the position of the physical components to enable scanning. This makes them impractical for certain use cases, such as for use in mobile devices for example. The same is true for prism-based and tuneable FPI-based systems. Camera arrays based on fixed filters (e.g. FPI or thin-films) are not flexible since the pass band is fixed. This has the knock-on effect that the size of the apparatus increases when the ability to be able to capture data in more bands is added. Furthermore, during the time between physical variation of components the object or scene being imaged may have changed, which can adversely affect the final multi- or hyperspectral image since it will be difficult, if not impossible, to combine images into a sensible final result.
The size and speed constraints imposed by typical spectral imaging apparatus hinder their use in applications in which such spectral information may otherwise usefully be employed. For example, in mobile devices, such as smart phones for example, spectral information could be used for more reliable illuminant detection, which can lead to improvements in the color of photos or videos, such as by enabling colour temperature to be more accurately determined to enable white balance adjustments. Spectral images can also provide additional information (in comparison to traditional RGB images), which can help image processing algorithms, for example in the field of object or edge detection where certain parts of an image which may otherwise be difficult to detect in an RGB image can be more easily determined when imaged in a different, e.g. non-visible, band of the EM spectrum.
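As a loose illustration of how such narrow-band measurements could feed illuminant detection, the sketch below matches measured band energies against a small set of stored reference illuminant shapes. The band centres, reference values and matching rule are all assumptions made for the example; the application does not prescribe a particular algorithm.

```python
# A minimal sketch (not the application's method) of using narrow-band energies
# for illuminant estimation: pick the stored reference illuminant whose spectral
# shape best matches the measured bands. Band centres and reference values are
# illustrative placeholders only.
import numpy as np

band_centres_nm = np.array([450.0, 550.0, 650.0, 750.0, 850.0])

# Hypothetical relative spectral power of candidate illuminants at those bands.
reference_illuminants = {
    "daylight_6500K":     np.array([1.00, 0.95, 0.85, 0.75, 0.65]),
    "incandescent_2850K": np.array([0.25, 0.55, 0.90, 1.00, 1.00]),
}

def estimate_illuminant(measured: np.ndarray) -> str:
    """Return the reference illuminant whose normalised shape best matches `measured`."""
    m = measured / np.linalg.norm(measured)
    scores = {name: float(m @ (ref / np.linalg.norm(ref)))
              for name, ref in reference_illuminants.items()}
    return max(scores, key=scores.get)

print(estimate_illuminant(np.array([0.3, 0.5, 0.9, 1.0, 0.95])))  # incandescent_2850K
```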
According to an embodiment, a spectral imaging apparatus is provided that can be used to simultaneously generate data representing multiple images of a scene or an object. Each of the images can comprise data relating to a selected EM band, such as those noted above for example. The spectral imaging apparatus is compact enough to be provided as part of a mobile device, and can generate image data of an object or scene relating to multiple different EM bands without the need for physical adjustment stemming from a requirement to vary a band pass response of the apparatus between image capture events. This has an advantage that simultaneous image capture across several different EM bands can be performed, as opposed to the quasi-simultaneous capture of devices in which capture events in differing bands proceed subsequent to physical modification of a band pass structure, thereby affecting the rate at which images can be captured. Thus, when provided as part of a mobile device for example, spectral image data can be generated which can be used to improve image or video processing by, for example, enabling illuminant detection to facilitate corresponding white balance adjustments to be made that otherwise more accurately reflect the situation being imaged. As noted above, such spectral imaging can also be used to enable improved edge detection (e.g. for parts of an image that may be substantially undetectable in a typical RGB image). Such detected edges can be used for image segmentation, which can be used to facilitate various image processing effects that can be isolated to certain parts of an image as desired, such as sharpening, blurring, enhanced bokeh effects and so on.
The spectral imaging apparatus according to an embodiment comprises a resonating (or resonant) cavity disposed over an imaging structure. The resonating cavity comprises multiple sections each having respective different wavelength bandpass responses. As such, exposure of the imaging structure to a scene or object results in generation of data representing multiple spectral images, the nature of which depend on the response of the sections of the resonating cavity to incident EM radiation. It should be noted that, as used herein, the terms object, scene and environment do not preclude consideration of portions or parts of the same. That is, when referring to an object, scene or environment, such references can include the case that a part, portion or region of the object, scene or environment is that being considered. In an embodiment, the resonating cavity of the imaging apparatus can be adjusted, thereby increasing the number of measurements that can be taken across the EM spectrum.
Figure 1 is a schematic representation of a spectral imaging apparatus according to an embodiment. Apparatus 100 can be used to generate image data representing multiple images of a scene or an object (or portion thereof), and comprises a resonating cavity 101 disposed over an imaging structure 103. The resonating cavity 101 comprises multiple sections 105a-e. The bandpass response of each section 105a-e of the apparatus is different. That is, EM radiation incident on the apparatus at the outwardly facing surface of element 107 passes, via the cavity 101 (which may comprise free space for example), to the upper surfaces 109a-e of the sections 105a-e. As can be seen from the example of figure 1, each of the upper surfaces 109a-e of the sections 105a-e is disposed at a different distance from the element 107.
In an embodiment, the resonating cavity of the apparatus is in the form of a Fabry-Perot interferometer (FPI). As such, the inwardly facing surface of element 107 and the upper surfaces 109a-e are optical flats that are partially reflective, with their reflective surfaces substantially parallel to and facing each other. In this way, EM radiation incident on the apparatus can pass through element 107 into the resonating cavity 101. Such incident EM radiation will then only pass through the partially reflective upper surfaces 109a-e of the sections 105a-e when it is in resonance. Whether a particular wavelength of incident radiation will pass by virtue of it being in resonance is determined by the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107, which increase as shown. It will be appreciated that no scale should be implied with reference to figure 1 (or any other figure herein). For example, separations 111a-e may be of the order of micrometres or centimetres.
As incident EM radiation passes into the resonant cavity 101, it is multiply reflected between the reflective surfaces 109a-e and the reflective surface of the element 107, thereby generating multiple transmitted beams which pass through the remainder of the section in question and are incident on the imaging structure 103. That is, there is constructive interference between in-phase components of beams of incident EM radiation, whereas beams that are out of phase destructively interfere. Whether the multiply reflected beams are in phase or not depends, amongst other factors (such as the refractive index of the material between the reflective surfaces, for example), on the wavelength of the incident radiation and on the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107, which define the thicknesses of the FPI for each section of the apparatus. Incident radiation that is transmitted through the resonant cavity 101 to the imaging structure 103 is thus spectrally modified compared to the incident radiation. That is, different sections 105a-e will pass different wavelengths of incident EM radiation as a result of the differences 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107. Put another way, each of the sections 105a-e forms a band pass filter since each section is structured as an FPI, each of which has a different thickness parameter, thereby leading to differing resonant enhancements.
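For an idealised, lossless cavity, this behaviour can be modelled by the standard Airy transmission function. The following is a minimal sketch in Python, assuming illustrative gap values standing in for the separations 111a-e, a mirror reflectivity of 0.8 and a 400-1000nm wavelength range; these numbers are assumptions for illustration rather than values taken from the present disclosure.

```python
import numpy as np

def fpi_transmission(wavelength_nm, gap_nm, reflectivity=0.8, n=1.0, theta_rad=0.0):
    # Airy transmission of an ideal, lossless Fabry-Perot cavity:
    # T = 1 / (1 + F * sin^2(delta / 2)), with coefficient of finesse
    # F = 4R / (1 - R)^2 and round-trip phase delta = 4*pi*n*gap*cos(theta)/lambda.
    delta = 4.0 * np.pi * n * gap_nm * np.cos(theta_rad) / wavelength_nm
    coeff = 4.0 * reflectivity / (1.0 - reflectivity) ** 2
    return 1.0 / (1.0 + coeff * np.sin(delta / 2.0) ** 2)

# Hypothetical gaps for five sections (stand-ins for separations 111a-e).
wavelengths = np.linspace(400.0, 1000.0, 1201)   # nm
for gap in (400.0, 500.0, 600.0, 700.0, 800.0):  # nm
    t = fpi_transmission(wavelengths, gap)
    # Several resonance orders may fall within the band; report the first maximum.
    peak = wavelengths[np.argmax(t)]
    print(f"gap {gap:.0f} nm -> transmission peak near {peak:.0f} nm")
```

Each gap value thus selects a different set of transmission peaks, which is the mechanism by which each section acts as a distinct band pass filter.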
In an embodiment, one or more of element 107 and the surfaces 109a-e can be slightly wedge shaped, as is common in FPI structures, to prevent the generation of interference fringes. Furthermore, in an embodiment, an anti-reflective coating may be applied to one or more of the rear (i.e. non-reflective surface) of the element 107 and the interface 113 between the section 105a-e and the imaging structure 103.
In the example of figure 1, the sections 105a-e define stepped surfaces of an FPI element.
Such stepped surfaces may be manufactured by, for example, etching a stepped plate from a substrate material, and coating the stepped plate with, e.g., a thin metallic layer (such as silver) to form the semitransparent surfaces. The element 107 may also be made semitransparent (or, put another way, partially reflective) by coating its inwardly facing (i.e. facing towards the inside of the resonant cavity 101) surface with a thin metallic layer (again, such as silver). That is, the inwardly facing surface of element 107 and the upper surfaces 109a-e (facing towards the inside of the resonant cavity 101) can be made partially reflective by coating with a thin layer of reflective material. In an embodiment, the thickness of such layers, such as silver layers, may be between 20-100nm, and more preferably between 40-60nm. In an embodiment, the step size between different surfaces may be in the range 50-500nm, and more preferably between 100-300nm (the step meaning the difference between 111a and 111b, 111b and 111c, 111c and 111d, and so on; for example, if 111a = 400nm and 111b = 450nm, the step size is 50nm). For example, the nominal value for 111a-e (representing the size of an airgap for example) can be different for each section, and the difference between two steps may be different. The bandpass responses need not all have the same width. The full-width at half-maximum (FWHM) of a bandpass response may be, for example, between 10-100nm, and more preferably between 10-50nm.
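For orientation, the resonance wavelengths implied by a given gap (from the condition m·λ = 2·n·gap at normal incidence) and the approximate passband width (free spectral range divided by the reflectivity finesse) can be estimated as in the following sketch; the gap values, detection band and mirror reflectivity used here are illustrative assumptions.

```python
import numpy as np

def resonance_peaks_nm(gap_nm, band=(400.0, 1000.0), n=1.0):
    # Resonance condition at normal incidence: m * lambda = 2 * n * gap.
    lo, hi = band
    m_min = int(np.ceil(2 * n * gap_nm / hi))
    m_max = int(np.floor(2 * n * gap_nm / lo))
    return [2 * n * gap_nm / m for m in range(m_min, m_max + 1)]

def fwhm_nm(peak_nm, gap_nm, reflectivity=0.8, n=1.0):
    # Approximate passband width: free spectral range / reflectivity finesse.
    free_spectral_range = peak_nm ** 2 / (2 * n * gap_nm)
    finesse = np.pi * np.sqrt(reflectivity) / (1.0 - reflectivity)
    return free_spectral_range / finesse

# Hypothetical gaps stepped by 150 nm (a step size within the range noted above).
for gap in (450.0, 600.0, 750.0, 900.0, 1050.0):
    report = [f"{p:.0f} nm (FWHM ~{fwhm_nm(p, gap):.0f} nm)" for p in resonance_peaks_nm(gap)]
    print(f"gap {gap:.0f} nm -> {', '.join(report)}")
```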
According to an embodiment, the resonant cavity can be adjustable. That is, element 107 may be movable using, e.g., one or more piezoelectric and/or MEMS actuators, and/or by some other actuating mechanism. In addition to element 107 being movable, one or more of the surfaces 109a-e may also be movable using an actuation mechanism such as one of those mentioned above. That is, the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107 can, in an embodiment, be varied by moving element 107 up or down (relative to the surfaces 109a-e), and/or by moving the surfaces 109a-e up or down (relative to the element 107). Figure 1 depicts actuators 150, 151. One or both may be provided to move element 107 and/or section 105a. Actuator 151 may also be configured to move sections 105b-e, or separate actuators (not shown) may be used.
According to an embodiment, movement of element 107 relative to the surfaces 109a-e results in a change to the band pass properties of the sections of the resonant cavity 101.
Accordingly, image data may be generated at several different positions of the element 107, resulting in multiple spectral images. The same is true of the situation in which the surfaces 109a-e are moved relative to element 107, or of the situation in which both parts are moved relative to one another. In an embodiment, the positions of individual surfaces 109a-e may be moved. However, a reduction in complexity (and thus physical size of the apparatus) results from simply enabling movement of element 107 using a single actuation mechanism. Given the relative displacements of surfaces 109a-e from element 107, movement of element 107 provides an elegant way to adjust multiple band pass responses (i.e. of sections 105a-e) in one go, resulting in the ability to simultaneously generate image data across multiple bands of the EM spectrum.
Put another way, when the gap defining the resonant cavity 101 between element 107 and the stepped sections 105a-e is changed, the wavelength of passing radiation is changed. Thus, due to different gap (e.g. air gap) thicknesses on top of the imaging structure 103, multiple different transmissions can be imaged.
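The following sketch illustrates this: adding the same displacement of element 107 to every gap retunes all five passbands in a single actuation. The gap values, displacements and mirror reflectivity are illustrative assumptions, not values specified by the disclosure.

```python
import numpy as np

def fpi_transmission(wl_nm, gap_nm, reflectivity=0.8):
    # Ideal air-gap FPI at normal incidence (same Airy expression as above).
    delta = 4.0 * np.pi * gap_nm / wl_nm
    coeff = 4.0 * reflectivity / (1.0 - reflectivity) ** 2
    return 1.0 / (1.0 + coeff * np.sin(delta / 2.0) ** 2)

wavelengths = np.linspace(400.0, 1000.0, 1201)                  # nm
base_gaps_nm = np.array([400.0, 500.0, 600.0, 700.0, 800.0])    # hypothetical gaps 111a-e

# One displacement of element 107 adds the same offset to every gap, so a
# single actuation retunes all five passbands simultaneously.
for displacement in (0.0, 20.0, 40.0):                          # hypothetical positions, nm
    peaks = [int(round(wavelengths[np.argmax(fpi_transmission(wavelengths, g + displacement))]))
             for g in base_gaps_nm]
    print(f"element displacement {displacement:>4.0f} nm -> peaks near {peaks} nm")
```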
In this connection, figure 2 is a schematic representation of an imaging apparatus according to an embodiment. In the example of figure 2, the imaging structure 103 comprises multiple image sensors 201a-e, each disposed beneath a corresponding section 105a-e of the apparatus 100. Each image sensor generates image data from a section of the apparatus that sits above it. So, for example, image sensor 201a captures an image from EM radiation that passes through the resonant cavity 101 via the section 105a. Similarly, image sensor 201b captures an image from EM radiation that passes through the resonant cavity 101 via the section 105b, and so on. Since each section acts as a band pass filter, each image sensor will image a different part of the EM spectrum, as defined or selected based on the relative separations 111a-e between the reflective surfaces 109a-e and the reflective surface of the element 107. Thus, a broad wavelength range can be covered in a single capture operation without the requirement to adjust components between capture events to vary the response of the apparatus.
Figure 3 is a schematic representation of an imaging apparatus according to an embodiment. The apparatus of figure 3 is similar to the apparatus described above with reference to figure 1 or 2, but is shown from a different viewpoint. Specifically, the apparatus of figure 3 is depicted in plan view (i.e. from the top, as opposed to the side views as shown in figures 1 and 2). In one embodiment of the example of figure 3, an array of 15 image sensors is depicted, arranged in three rows 301a-e, 303a-e, 305a-e. It will be appreciated that more or fewer image sensors may be provided as desired. According to an embodiment, each of the image sensors is associated with a corresponding section of an apparatus 100. For example, as viewed in figures 1 or 2, sections 105a-e may be associated with image sensors in one of the three rows depicted, with corresponding additional sections for each of the other sensors in the other two rows. That is, each image sensor of figure 3 can capture an image from a selected band of the EM spectrum that has passed via the resonant cavity 101. Of course, it is possible that a group or combination of image sensors may capture an image from a single (e.g. relatively larger) section. For example, a group or any combination of two or more image sensors can capture an image relating to one EM band, although preferably the sections are configured such that they each enable a different EM band to be imaged using corresponding single image sensors, thereby enabling simultaneous capture of multiple spectrally different images of a scene or an object, or portion thereof. Thus, each image sensor in an image sensor array can have a different gap above it that defines the size between the corresponding FPI element surfaces.
In another embodiment of the example of figure 3, a single image sensor is provided, which comprises 15 sub-portions arranged in three rows 301a-e, 303a-e, 305a-e. It will be appreciated that more or fewer sub-portions may be provided as desired. According to an embodiment, each of the sub-portions is associated with a corresponding section of an apparatus 100. For example, as viewed in figures 1 or 2, sections 105a-e may be associated with sub-portions in one of the three rows depicted, with corresponding additional sections for each of the sub-portions in the other two rows. That is, each sub-portion of figure 3 can capture an image from a selected band of the EM spectrum that has passed via the resonant cavity 101. Of course, it is possible that a group or combination of sub-portions may capture an image from a single (e.g. relatively larger) section. For example, a group or any combination of two or more sub-portions can capture an image relating to one EM band, although preferably the sections are configured such that they each enable a different EM band to be imaged using a corresponding single sub-portion, thereby enabling simultaneous capture of multiple spectrally different images of a scene or an object, or portion thereof. Thus, each sub-portion in a camera array can have a different gap above it that defines the size between the corresponding FPI element surfaces. In an embodiment, an imaging structure, which can be in the form of a camera array, may thus comprise a single image sensor with multiple sub-portions and/or an array of multiple image sensors as described above. The camera array can include the tunable FPI element and a lens or multiple lenses. For example, lenses may be located between an image sensor and the tunable FPI element (e.g. element 107) and/or on top of the FPI element. In an embodiment, the camera array may have a size of NxM, where at least one of N or M is larger than 1. Values for an array of image sensors and/or sensor sub-portions may be, for example, 1x2, 1x3, 1x4, 2x2, 2x3, 2x4, 3x3, 4x3, 4x4.
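A minimal sketch of how a single full-sensor readout could be partitioned into such sub-portions is given below, assuming a hypothetical 3x5 layout corresponding to the three rows of five shown in figure 3; the frame dimensions are likewise assumed for illustration only.

```python
import numpy as np

def split_into_subportions(frame, rows, cols):
    # Crop one full-sensor readout into rows x cols tiles, one tile per section
    # of the resonating cavity disposed above the sensor.
    h, w = frame.shape[:2]
    th, tw = h // rows, w // cols
    return [frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(rows) for c in range(cols)]

# Hypothetical 3x5 layout matching the three rows of five depicted in figure 3.
raw_frame = np.zeros((600, 1000), dtype=np.uint16)   # stand-in for a raw readout
tiles = split_into_subportions(raw_frame, rows=3, cols=5)
print(len(tiles), tiles[0].shape)                     # 15 narrow-band tiles of 200 x 200
```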
As noted above, one or more of the surfaces defining the FPI resonant cavity can be moved in order to tune the FPI. That is, the distance between FPI element surfaces can be changed by moving the element 107, the stepped plate defining the sections of the apparatus, or both.
In this connection, figure 4 is a graph depicting the transmission response of an apparatus according to an embodiment. In the example of figure 4, for the sake of clarity, three capture events are depicted for three image sensors (and three corresponding FPI sections), each capture event at a different position of the element 107, the stepped plate defining the sections of the apparatus, or both. As can be seen from figure 4, as the element 107, the stepped plate defining the sections of the apparatus, or both are moved between capture events, there is a wavelength shift. Accordingly, a high number of images at differing wavelengths can be captured with relatively few changes to the physical structure. This provides an increase in speed compared to systems in which a single wavelength is imaged at a time. Concomitantly, depending on the nature of the object or environment being imaged, simultaneous capture of multiple wavelengths avoids image artefacts that would otherwise result from, e.g., movement of the object or of some aspects of the environment.

Figure 5 is a schematic representation of an imaging apparatus according to an embodiment.
In the example of figure 5, the thickness of each section 105a-e is modified so that it varies spatially. For example, a region 301a-e of a different material may be provided at the interface 113 between the sections 105a-e and the imaging structure 103. For example, the regions 301a-e may comprise lenses. The sizes and/or shapes of the regions 301a-e may differ from one another.
The provision of different thicknesses enables a reduction in the wavelength shift (which may be experienced in the centre of an image sensor compared to its corner) that results from EM radiation entering the resonant cavity 101 at different angles. Thus, in the example of figure 5, the bottom part of each section 105a-e is not flat, but defines a curved region, e.g. the spherically shaped regions 301a-e in figure 5. In an embodiment, regions 301a-e may have a plano-convex and/or plano-concave surface to guide radiation to a corresponding image sensor.
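The underlying angular dependence follows from the resonance condition m·λ = 2·n·gap·cos(θ): rays arriving off-axis (towards a sensor corner) see a passband shifted towards shorter wavelengths. The following sketch shows the direction and approximate magnitude of this shift; the gap, resonance order and angles are illustrative values only.

```python
import numpy as np

# Resonance condition m * lambda = 2 * n * gap * cos(theta): off-axis rays
# (towards a sensor corner) see their passband shifted to shorter wavelengths.
gap_nm, n, order = 550.0, 1.0, 2   # illustrative values, not from the disclosure
for angle_deg in (0.0, 5.0, 10.0, 15.0):
    peak_nm = 2.0 * n * gap_nm * np.cos(np.radians(angle_deg)) / order
    print(f"incidence {angle_deg:>4.1f} deg -> passband centre ~{peak_nm:.1f} nm")
```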
Figure 6 is a schematic representation of an imaging apparatus according to an embodiment in which the regions 301a-e are plano-concave (compared to the example shown in figure 5).
According to an embodiment, the FPI element has an optical aperture that passes selected bands of incident EM radiation. Although depicted as rectangular in the figures, the shape of the aperture may be circular or elliptical, or an array of optical apertures may be provided, e.g. one for each image sensor in an array of image sensors. It will also be noted that the element 107 may be provided in a stepped form, either along with or in place of the stepped nature of the sections as depicted in the accompanying figures. In an embodiment, each sub-portion of a camera array may have an optical aperture that defines a limiting aperture that passes radiation/light. Those apertures may be the same as the optical apertures of the FPI elements, or there can be additional apertures in at least one optical path of the camera array.
In an embodiment, one or more of image sensors can comprise a filter, such as a low pass, high pass or bandpass filter, to limit the wavelengths that the image sensor detects. Furthermore, one or more of the image sensors may comprise a polarizing filter.
There is therefore provided a multi-spectral imaging apparatus. Transmission curves may be tuned by effectively modifying the size of the gap between reflective surfaces of the resonant cavity. For example, tuning of the FPI filters can be performed using an actuator (e.g., a piezoelectric or MEMS actuator) or by some other actuating mechanism. The size of the system can be balanced against its speed by selecting the number of image sensors in the imaging structure. Tuning of the element can be performed with a single actuator (e.g. 150), which saves space and cost.
According to an embodiment, a final spectral image of an object or scene/environment (or part thereof) can be parsed from the multiple images captured by the image sensors of the imaging structure. When multiple captures are performed (e.g., as described with reference to figure 4), the sequence can be captured to mimic a multispectral camera or to mimic a hyperspectral camera where scanning is performed in small increments.
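A minimal sketch of such parsing is given below: co-registered narrow-band images from one or more capture events are stacked into a single spectral cube ordered by band centre. The band centres, image dimensions and the assumption that the images are already co-registered are all illustrative; in practice, images from different sensors or sub-portions may first need to be registered to a common view.

```python
import numpy as np

def assemble_spectral_cube(captures):
    # captures: list of capture events, each a dict mapping a passband centre
    # wavelength (nm) to a co-registered narrow-band image (2-D array).
    merged = {}
    for event in captures:
        merged.update(event)                 # later events contribute further bands
    wavelengths = sorted(merged)
    cube = np.stack([merged[wl] for wl in wavelengths], axis=-1)
    return cube, wavelengths

# Hypothetical data: two actuator positions, five bands per position.
h, w = 120, 160
event_1 = {wl: np.zeros((h, w)) for wl in (450, 500, 550, 600, 650)}
event_2 = {wl: np.zeros((h, w)) for wl in (470, 520, 570, 620, 670)}
cube, bands = assemble_spectral_cube([event_1, event_2])
print(cube.shape, bands)                     # (120, 160, 10) and the ten band centres
```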
Figure 7 is a flow diagram of a method according to an embodiment. The method of figure 7 comprises generating multiple narrow-band images of a scene or object, or portion thereof.
As such, in block 700, a resonating cavity defining multiple sections having respective different wavelength bandpass responses is provided. In block 703, data representing the multiple narrow-band images is generated. The data is generated, in the example of figure 7, using an imaging structure configured to receive electromagnetic radiation from the multiple sections of the resonating cavity.
Figure 8 is a flow diagram of a method according to an embodiment. The method of figure 8 comprises generating multiple narrow-band images of a scene or object, or portion thereof, similarly to that provided according to the method of figure 7. However, in block 801 of figure 8, the resonating cavity is adjusted in order to modify the response of the various sections therein (such as described with reference to figure 4 for example). For example, the size of the gap between reflective surfaces of the resonant cavity can be modified by moving one or other or both of the element 107 and surfaces 109a-e. The position of the surfaces may be modified any desired number of times between capture events of block 703 resulting in, e.g., a set of transmission responses such as shown in figure 4.
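A minimal control-loop sketch of this method is given below; set_element_position and expose_all_sections are hypothetical stubs standing in for hardware-specific actuator and readout drivers, and the displacement values are illustrative assumptions.

```python
import numpy as np

def set_element_position(position_nm):
    # Hypothetical actuator stub; a real driver would command, e.g., a
    # piezoelectric or MEMS actuator to move element 107.
    pass

def expose_all_sections():
    # Hypothetical readout stub; stands in for a single simultaneous exposure
    # of the imaging structure through all sections of the resonating cavity.
    return [np.zeros((120, 160)) for _ in range(5)]

def capture_sequence(positions_nm):
    # Method of figure 8: adjust the resonating cavity (block 801), then
    # generate data for all sections in one exposure (block 703), per position.
    captures = []
    for position in positions_nm:
        set_element_position(position)
        captures.append(expose_all_sections())
    return captures

frames = capture_sequence([0.0, 20.0, 40.0])   # hypothetical displacements, nm
print(len(frames), len(frames[0]))              # 3 positions x 5 narrow-band images
```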
Figure 9 is a block diagram of an apparatus according to an example. An apparatus 900 as shown in figure 9 may be configured to implement each step and method in the foregoing method embodiments. The apparatus 900 may be applied to, or be present as part of, a mobile device, such as a smart phone or a terminal in various communications systems for example. In the embodiment shown in figure 9, the apparatus 900 includes a processing unit (including one or more processors) 901 and a memory 902. The processing unit 901 controls the operation of the apparatus 900 and may also be called a CPU (central processing unit). The memory 902 may include a read-only memory and a random-access memory (RAM), and can provide instructions 903 and data for the processing unit 901. A part of the memory 902 may further include a nonvolatile random-access memory (NVRAM). In an actual application, the apparatus 900 may be embedded into, or may be, a wireless communications device such as a mobile phone or other portable communications device such as a smart phone or tablet.
The methods disclosed according to the examples described herein may be applied in the processing unit 901. In a process of implementation, each step of the method may be completed by using an integrated logic circuit of hardware in the processing unit 901 or instructions in a software form. These instructions may be implemented and controlled by using the processing unit 901. To execute the methods disclosed in the foregoing examples, the processing unit may include a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute each disclosed method, step, and logic block diagram in the embodiments of the present invention. The general-purpose processor may be a microprocessor, or the processor may be any common processor or decoder, and so on. A step of a method disclosed herein may be directly executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and a software module in a decoding processor. The software module may be located in a storage medium that is mature in the art, such as a random-access memory, a flash memory, a read-only memory, a programmable read-only memory, an electronically erasable programmable memory, or a register. The storage medium is located in the memory 902, and the processing unit 901 can read information in the memory 902 and complete the steps of a method with reference to the hardware. For example, the memory 902 may store information 904 about (or representing) multiple narrow-band images for the processing unit 901 to parse in order to generate a final spectral image of an object or scene.

A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on the particular application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to go beyond the scope of the present invention.
It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, reference may be made to the corresponding process in the foregoing method embodiments, and details are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiment is merely exemplary. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communications connections may be implemented through some interfaces. The indirect couplings or communications connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. A part or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or a part of the technical solutions may be implemented in the form of a software product. The computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or a part of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes: any medium that can store program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random-access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc.
The present invention can be embodied in other specific apparatus and/or methods. The described embodiments are to be considered in all respects as illustrative and not restrictive.
In particular, the scope of the invention is indicated by the appended claims rather than by the description and figures herein. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims

1. A spectral imaging apparatus for generating multiple images of a scene or an object, the spectral imaging apparatus comprising: a resonating cavity disposed over an imaging structure, wherein the resonating cavity comprises multiple sections each having respective different wavelength bandpass responses.
2. The spectral imaging apparatus as claimed in claim 1, wherein the resonating cavity is adjustable.
3. The spectral imaging apparatus as claimed in claim 1 or 2, wherein the resonating cavity is an interferometer comprising first and second partially reflective elements.
4. The spectral imaging apparatus as claimed in claim 3, wherein the first reflective element is stepped, whereby to define multiple partially reflective surfaces.
5. The spectral imaging apparatus as claimed in claim 4, wherein respective gaps between each of the multiple reflective surfaces and the second reflective element define the multiple sections.
6. The spectral imaging apparatus as claimed in claim 4, wherein the second reflective element is stepped, whereby to define multiple reflective surfaces.
7. The spectral imaging apparatus as claimed in any of claims 2 to 6 wherein at least one of the first and second reflective elements is movable.
8. The spectral imaging apparatus as claimed in any preceding claim, further comprising at least one piezoelectric and/or micro-electro-mechanical actuator to move the first and/or second reflective element.
9. The spectral imaging apparatus as claimed in any preceding claim, wherein the imaging structure comprises an image sensor.
10. The spectral imaging apparatus as claimed in claim 9, wherein the multiple sections illuminate a sub-portion of the image sensor.
11. The spectral imaging apparatus as claimed in claim 9 or 10, wherein the image sensor comprises a colour, low-pass, high-pass or bandpass filter over at least a portion thereof.
12. The spectral imaging apparatus as claimed in any of claims 1 to 8, wherein the imaging structure comprises an array of multiple image sensors.
13. The spectral imaging apparatus as claimed in claim 12, wherein respective sections of the resonating cavity are arranged to illuminate respective ones of the image sensors.
14. The spectral imaging apparatus as claimed in claim 12 or 13, wherein at least one of the image sensors in the array comprises a colour, low-pass, high-pass or bandpass filter over at least a portion thereof.
15. The spectral imaging apparatus as claimed in any preceding claim, wherein respective sections of the resonating cavity are so profiled as to define a plano-convex or plano-concave surface.
16. The spectral imaging apparatus as claimed in claim 15, wherein a plano-convex or a plano-concave surface of a section of the resonating cavity is adjacent to an image sensor.
17. The spectral imaging apparatus as claimed in any preceding claim, wherein the resonating cavity is a Fabry-Perot interferometer.
18. A method for generating multiple narrow-band images of a portion of a scene or object, the method comprising: providing a resonating cavity defining multiple sections having respective different wavelength bandpass responses; and generating data representing the multiple narrow-band images using an imaging structure configured to receive electromagnetic radiation from the multiple sections of the resonating cavity.
19. The method as claimed in claim 18, wherein the resonating cavity comprises first and second reflective elements, the method further comprising: modifying the position of at least one of the first and second reflective elements, whereby to tune the wavelength bandpass responses of the sections of the resonating cavity.
20. The method as claimed in claim 18 or 19, further comprising: generating data representing the multiple narrow-band images at multiple different positions of the at least one of the first and second reflective elements.
21. The method as claimed in any of claims 18 to 20, wherein the imaging structure comprises an image sensor, the method further comprising: generating data representing the multiple narrow-band images from respective sub-portions of the image sensor.
22. The method as claimed in any of claims 18 to 20, wherein the imaging structure comprises an array of multiple image sensors, the method further comprising: generating data representing the multiple narrow-band images from respective image sensors of the array.
23. The method as claimed in any of claims 18 to 22, further comprising: generating the data representing the multiple narrow-band images by exposing the imaging structure to the portion of the scene or object.
24. The method as claimed in claim 23, wherein exposing the imaging structure further comprises: exposing the imaging structure via the multiple sections of the resonating cavity at substantially the same time.
25. The method as claimed in any of claims 18 to 24, further comprising: using the data representing the multiple narrow-band images, generating a multispectral or hyperspectral image of the portion of a scene or object.

