WO2013064507A1 - Spectral camera with overlapping segments of image copies interleaved onto sensor array - Google Patents

Spectral camera with overlapping segments of image copies interleaved onto sensor array Download PDF

Info

Publication number
WO2013064507A1
WO2013064507A1, PCT/EP2012/071506, EP2012071506W
Authority
WO
WIPO (PCT)
Prior art keywords
image
array
filters
different
spectral
Prior art date
Application number
PCT/EP2012/071506
Other languages
English (en)
French (fr)
Inventor
Bert Geelen
Andy Lambrechts
Klaas Tack
Original Assignee
Imec
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imec filed Critical Imec
Priority to IN2966CHN2014 priority Critical patent/IN2014CN02966A/en
Priority to JP2014539314A priority patent/JP6082401B2/ja
Priority to EP12788152.2A priority patent/EP2776797B1/en
Publication of WO2013064507A1 publication Critical patent/WO2013064507A1/en
Priority to US14/267,776 priority patent/US9366573B2/en

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2803 Investigating the spectrum using photoelectric array detector
    • G01J3/02 Details
    • G01J3/0205 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/0229 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters
    • G01J3/0235 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using means for replacing an element by another, for replacing a filter or a grating
    • G01J3/0264 Electrical interface; User interface
    • G01J3/2823 Imaging spectrometer
    • G01J3/46 Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50 Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J3/51 Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors using colour filters
    • G01J3/513 Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors using colour filters having fixed filter-detector pairs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10T TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
    • Y10T29/00 Metal working
    • Y10T29/49 Method of mechanical manufacture
    • Y10T29/49826 Assembling or joining

Definitions

  • An object of the invention is to provide improved apparatus or methods.
  • a first aspect provides an integrated circuit for an imaging system as set out in independent claim 1.
  • This provides a spectral camera for producing a spectral output and having an objective lens for producing an image, an array of lenses for producing optical copies of segments of the image on different optical channels, an array of filters aligned with the different optical channels and having an interleaved spatial pattern of different passbands across the array, and one or more sensor arrays arranged to detect the copies of the image segments on the different filtered optical channels, the interleaved spatial pattern of the array of filters being arranged so that detected segment copies of spatially adjacent optical channels have different passbands and represent segments of the image which at least partially overlap each other, and so that at least some of the detected segment copies of the same passband on spatially non-adjacent optical channels represent segments of the image which are adjacent and fit together.
  • the sensor array can be arranged to detect the copies of the segments of the image simultaneously.
  • the array of filters can be integrated on the sensor array. This can enable reduced cross talk as there is effectively no cavity between the filters and the sensors.
  • At least some parts of the interleaved spatial pattern can have a superimposed finer pattern of the passbands at a finer granularity than that of the interleaved spatial pattern.
  • the spectral camera can have a post processing part for electrically processing the detected segment copies for a given passband to stitch them together. This can enable more contiguous images to be output without gaps.
  • Figure 5 shows a schematic view of a spectral camera according to another embodiment with a processor for restitching, and a database for the image cube,
  • Figure 7 shows a view of hexagon shaped image copies
  • Figures 8 to 12 show views of different arrangements of overlapping image segment copies on the sensor array with different arrangements of passbands
  • Figures 14 to 16 show projected image copies with other arrangements of filters according to embodiments
  • Figures 13 and 14 show steps in methods of operation of the cameras
  • Figures 15 and 16 show steps in methods of configuring such cameras during manufacture
  • Figure 17 shows a cross section view of a sensor array integrated with an array of Fabry Perot filters.
  • Elements or parts of the described receivers may comprise logic encoded in media for performing any kind of information processing.
  • Logic may comprise software encoded in a disk or other computer-readable medium and/or instructions encoded in an application specific integrated circuit (ASIC), field programmable gate array (FPGA), or other processor or hardware.
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • references to software can encompass any type of programs in any language executable directly or indirectly by a processor.
  • References to arrays of optical filters or arrays of optical sensors are intended to encompass 2-dimensional arrays, rectangular or non-rectangular arrays, irregularly spaced arrays, and non-planar arrays, for example.
  • References to integrated circuits are intended to encompass at least dies or packaged dies for example having the array of optical filters monolithically integrated onto the array of sensors, or devices in which the array of optical filters is manufactured separately and added later onto the die or into the same integrated circuit package.
  • Hyperspectral imaging systems or cameras can consist of different discrete components, e.g. the optical sub-system for receiving the incoming electromagnetic spectrum, the array of filters for creating the different bands within the received spectrum and the image sensor array for detecting the different bands.
  • the optical sub-system can consist of a single or a combination of different lenses, apertures and/or slits.
  • the array of filters can consist of one or more prisms, gratings, optical filters, acousto-optical tunable filters, liquid crystal tunable filters etc or a combination of these.
  • Sensing spectral information is typically achieved either using dispersive optical elements or spectral filters.
  • Dispersive optical elements such as prisms or gratings have limited light throughput and require time-consuming spatial scanning over multiple frames, as they sense only 1 spectral and 1 spatial dimension at a time.
  • Spectral filters capture only 2 spatial dimensions (width W and height H) at one wavelength and thus require spectral scanning over multiple frames, requiring significant amounts of time due to the large switching overhead.
  • Embodiments as described below can now enable higher-speed acquisition of the 3D HSI cube to enable snapshot imaging, by mapping each point in the WxHxNB-sized cube (where NB is the number of passbands) to a sensor element on the sensor array more efficiently.
  • Each of the WxH spatial points sensed in the scene is optically duplicated NB times on the sensor array, by subdividing each of the NB spectral subimages into a number of segments and interleaving these segments alongside each other on the 2D sensor (a simplified sketch of this mapping is given at the end of this section). Due to the limited space on the sensor array, there will usually be a trade-off between spatial and spectral resolution.
  • the lenses should be positioned at a distance of (M ± 1)·f_lens, depending on the system configuration.
  • M represents the inverse magnification and is determined based on the number of spectral bands.
  • Image segment copies on the sensor array are interleaved, so that those of the same passband are spaced apart so that they represent adjacent segments of the image and are interleaved with overlapping segment copies of different passbands.
  • All detected segments each corresponding to one specific band can be stitched together to produce one plane of the image cube.
  • the size of the segments is a design parameter, and determines the corresponding maximal size of its lens and the number of segments. Since the resulting segments and lenses can be significantly smaller than those of the tiled imager, (smaller) microlenses may be used. This improves the lenses' optical performance and resolution.
  • producing a spectral slice out of the cube requires stitching together the segments of that band (a sketch of this restitching step is given at the end of this section).
  • Light throughput is likely to be higher for the interleaved imager, due to the positioning of the (micro)lenses near to the objective image.
  • Crosstalk can be avoided either by matching the objective speed to the microlens speed and using the objective aperture as a field stop, resulting in a trade-off between system light throughput and vignetting inside the segments, or by adding (micro)baffles between the segments, enabling both high light throughput and suppression of crosstalk between segments.
  • Figure 3 shows a magnified view of part of the camera of figure 2. This shows a side view of the lenses and the sensor array with integrated filters. There is a filter of a particular band for each lens. Hence only the light of a given colour will be detected for each lens. Hence although the image shows coloured beams on the left of the lens, this is not intended to be realistic, as all colours would reach the filters from the left, and just one colour would pass through to reach the detectors.
  • light from one segment of the image reaches a block of sensor elements as shown in the plan view of the sensor array. Each block of sensor elements corresponds to one of the lenses (which can be microlenses if the block is small enough).
  • Figure 4 shows a plan view of image segment copies with different passbands.
  • Figure 7 shows a view of hexagon shaped image copies on hexagon shaped filters. This shows three complete hexagon shaped image copies 230, 240, and 250, and a number of half image copies, 260, 270, 280, and 290. The half copies as well as the corner segments could be restitched to make complete images at different bands.
  • Vignetting is a reduction of an image's brightness or saturation at the periphery compared to the image center.
  • one part of the design of the hyperspectral imaging module is the arrangement of the different filters over the image sensor array.
  • the design process can be split into the following parts:
  • Figure 15 shows steps in methods of configuring such cameras during manufacture, with step 500 showing a first step of selecting how many image copies to provide and how to arrange them on the sensor array.
  • Step 510 shows selecting passbands and selecting the spatial arrangement of the passbands on the image copies.
  • Step 520 shows manufacturing the layers of the integrated filters according to the passbands and their spatial arrangement.
  • Figure 16 is similar to figure 15 except that step 510 is replaced by step 515 in which the selection of passbands and their spatial arrangement is such as to have some variation of which passbands are detected in different parts of the image cube, or variation of spatial or spectral resolution of the detection in different parts of the image cube. This may involve the spatial pattern having a finer granularity than the image copies, so for a part of the array of filters for a respective one of the image copies there is a spatial pattern of multiple different ones of the passbands.
  • Figure 17 shows a cross section view of a sensor array 40 integrated with an array of Fabry Perot filters 31. This has a top semi mirror coating 33 and a bottom semi mirror coating 32. Although gaps are shown between the parts, this is for clarity and in practice there would be no gaps. More details of examples of this part are now set out.
  • the image sensor array can be manufactured in a lower cost technology having a large critical dimension (CD), e.g. 130 nm, resulting in larger pixels and a lower spatial resolution of the image sensor array. Alternatively, one can choose to manufacture the image sensor array in a higher cost technology having a smaller critical dimension (CD), e.g. 45 nm, resulting in smaller pixels and a higher spatial resolution of the image sensor array.
  • CD critical dimension
  • the image sensor array can be a front-illuminated sensor, whereby the array of filters is post processed on top of the substrate comprising the sensor. Optionally this substrate is thinned afterwards thereby removing the bulk of the substrate and leaving a thin slice containing the image sensor array and the spectral unit monolithically integrated therewith.
  • the image sensor array can be a back-illuminated sensor, whereby first the substrate comprising the sensor is thinned from the backside onwards. On the backside of the thinned substrate, the spectral unit is then post-processed.
  • while any order of Fabry-Perot filters can be manufactured and used, preferably only first-order Fabry-Perot filters are formed on the image sensor array, thereby reducing the complexity of removing and/or blocking higher order components.
  • a monolithically integrated hyperspectral imaging system with a 1st order Fabry-Perot filter array typically doesn't require a focusing lens in the optical subsystem.
  • the design of the filters e.g. the thickness which defines the cavity length of the filters, can take into account the location of the particular filter on the chip to reduce the dependency on variations in the incident angle of the incoming electromagnetic spectrum.
  • the filter is post-processed on top of an image sensor array and every filter is aligned with the rows or columns of the image sensor array.
  • Photons that exit the filter structure above a certain pixel can cross the gap and fall onto a neighboring pixel. This effect will be heavily reduced when the gap is reduced or completely removed by a direct postprocessing of the filter onto the pixels. There can still be some cross-talk as a result of the thickness of the filter itself however, as a photon that enters the filter above one pixel can still propagate through the filter and fall onto a neighboring pixel. This is reduced by designing thinner filters and by controlling the angle of incidence.
  • the extra non-functional layer gives rise to extra reflections on its boundaries if the refractive indices are not matched and therefore to extra stray light on top of the cross-talk discussed above.
  • stray light is reduced.
  • the distance that is traveled by the stray light (D) is well within normal pixel dimensions (e.g. 1 to 15 µm). This is not the case for more macroscopic integration distances, e.g. a 1 mm substrate, in which case the distance D traveled by the light ranges over tens to hundreds of pixels, leading to a severe deterioration of the spatial and spectral resolution. In some cases, the distance D can become so large that an additional focusing lens is required to focus the light back onto the pixel.
  • the dielectric stack and metals on top of the photodiodes reflect part of the light. Together with the gap because of the heterogeneous integration and the bottom mirror of the cavity, this forms a parasitic Fabry-Perot interfering with the actual one.
  • This process can be optimized with the monolithic integration as the dielectric layers in the imager become part of the bottom Bragg stack, made in similar materials (e.g. oxide) and which is not very sensitive to the width of these layers.
  • the thin filters are much less sensitive to this and the displacement D stays in most cases below the pixel dimensions, i.e. preferably in the 1 to 10 nm range, for all but the largest angles of incidence and the smallest pixel sizes.
  • Traditional production techniques in combination with hybrid integration of the filter structure and the image sensor, can not reach the required accuracy to fabricate Fabry-Perot filters of the first order.
  • higher order Fabry-Perot structures have to be used.
  • additional dichroic or other filters have to be added to the module, in order to select the required order only. This gives rise to additional energy loss, additional costs and hence reduced overall system optimality.
  • the output of the filter exhibits phase differences that, when focused by a lens, take on the form of concentric circles.
  • the concentric circles are a result of the different interfering waves where you have at different locations constructive and destructive interference.
  • the focusing lens is needed for macroscopic filters because of the large distances covered by reflections inside the filter and in order to focus all these reflections back onto one pixel.
  • the distance between the filter structure and the image sensor is very small and as the filter is designed for the first order, there is no need for a focusing lens.
  • the concentric circles that are the result of the phase difference, will still be there, but will all be focused inside the same pixel and their effect is all integrated in the output of that pixel.
  • the direct post-processing of the filter structure on top of an active IC, in this case the image sensor, should be compatible with the contamination, mechanical, temperature and other limitations of that IC. This means that e.g. none of the steps used in the fabrication of the filter can use materials or processing steps that would damage the image sensor below.
  • the material selection has been done such that standard materials have been used, that are fully compatible with standard processing.
  • Using some materials is not possible, e.g. Au or Ag, as they tend to diffuse into the different layers and into the tools and thereby negatively affect the yield of the current and even future processing steps.
  • such a layer can still be acceptable as a final step (top layer), when the deposition is done outside of the normal processing line and when the tool is only used for that purpose. This can only be done as a final step, as the wafer can not enter the normal flow after this operation.
  • a Fabry-Perot filter is made of a transparent layer (called cavity) with two reflecting surfaces at each side of that layer.
  • Fabry-Perot wavelength selection involves multiple light rays within the cavity being reflected, which results in constructive and destructive interference, based on the wavelength of the light, on the distance l between the semi-mirrors and on the incident angle θ.
  • Higher orders are also selected, which results in an order selection problem.
  • the filter operation is based on this well-known Fabry-Perot principle, in which the height of each filter is selected to be tuned to the desired passband (the standard resonance condition is recalled at the end of this section).
  • Each filter is formed by a resonant cavity of which the resonance frequency is determined by the height of the filter.
  • a second important parameter of the mirrors is their absorption, as this will determine the efficiency of the filter. If a full range of Fabry-Perot optical filters has to be constructed over a certain wavelength range, it is beneficial that these two parameters (reflectivity and absorption) stay as constant as possible over this spectral range. In that case, the wavelength range can be covered/sampled by varying only the cavity length of the Fabry-Perot filters and the materials and mirror layers can be kept constant.
  • the selected wavelength range has to match the sensitivity of the selected image sensor, which is the second component of the module.
  • a high-n material is amorphous silicon, with reduced absorption index because of process parameter tuning, e.g. temperature and hydrogen content. Hard oxides have better tolerances but cannot be used because of the need for higher temperatures than allowed by standard CMOS processing.
  • the bandwidth Δλ₀ depends on both the central wavelength λ₀ and the refractive indices n₁ and n₂ of the selected materials; for example, a 600 nm spectral range around a 700 nm central wavelength.
  • SiO2 has one of the lowest refractive indices (1.46) and a very low absorption coefficient. Both parameters are stable over a very large spectral range.
  • the second material in the Bragg stack will ideally need to have a refractive index equal to 6.4, in addition to an absorption coefficient as close as possible to 0.
  • the refractive index of SiO2 can be lowered by making it porous (mixing it with air, which has a refractive index of 1). This lowers the required refractive index of the second material to a more easily manufacturable value of about 5 for the same spectral range and central wavelength.
  • Another example of material engineering is lowering the absorption index of amorphous silicon by changing process (deposition) parameters, like temperature, concentration of hydrogen, etc.
  • As expressed by Equation 2, the reflectivity R of such a Bragg mirror is easily controlled by the number of pairs of dielectric layers: the more layers, the higher the reflectivity and the higher the finesse of the Fabry-Perot filter that will be built with that particular mirror (a conventional form of this relation, using the symbols defined below, is recalled at the end of this section).
  • n₀ is the refractive index of the surrounding medium
  • nₛ is the refractive index of the substrate
  • n₁ is the refractive index of the first material
  • n₂ is the refractive index of the second material
  • N is the number of pairs in the Bragg stack.
  • One instantiation of a distributed Bragg stack is a combination of SiO2 and engineered amorphous silicon for a central wavelength around 700 nm and a range from 540 nm to 1000 nm.
  • a second instantiation is a combination of SiO2 and SiGe for a central wavelength of 1500 nm and a bandwidth of 1000 nm, in casu from 1000 nm to 2000 nm.
  • a consequence of using Bragg stacks for the mirror layers is an additional phase shift during the reflection of the light.
  • Fabrication methods for manufacturing 1D or 2D Fabry-Perot filters can include successive patterning and etching steps, requiring a large number of processing steps in order to produce k different thicknesses.
  • Planarity of the image sensor:
  • this planarization layer can to some extent be taken into account during the design of the filter structure. However, this layer is not a part of the active filter structure and does not have a large effect on the filter itself, as long as the correct material transition (important for the refractive index) is correctly taken into account.
  • a variation in deposited thicknesses in the components of the Fabry-Perot filters, in casu the layers of the Bragg stack and the thickness of the cavity, will result in a mismatch between the designed filter and the produced filter.
  • the effect of the variations on the thickness of the cavity is that the thickness of all filters will be changed by more or less an equal amount, causing a shift of the spectral range to the right or the left of the theoretical design.
  • This global shift in the selected wavelengths, either up or down, with respect to the designed filter location, can be tolerated if it is a small proportion of the spectral width of the passbands, which can be one of the design parameters.
  • the sharp edges that form the transition between one filter and the next one can show rounding.
  • the width of each filter can cover multiple columns of sensor elements, in other cases just one or two sensor elements, in which case such corner rounding may have more effect on the passband.
  • Some of the method steps discussed above for image processing may be implemented by logic in the form of hardware or, for example, in software using a processing engine such as a microprocessor or a programmable logic device (PLD) such as a PLA (programmable logic array), PAL (programmable array logic) or FPGA (field programmable gate array).
  • PLA programmable logic array
  • PAL programmable array logic
  • FPGA field programmable gate array
  • Software programs may be stored in an internal ROM (read only memory) and/or on any other non-volatile memory, e.g. they may be stored in an external memory. Access to an external memory may be provided by conventional hardware which can include an external bus interface if needed, with address, data and control busses.
  • Features of the method and apparatus of the present invention may be implemented as software to run on a processor. In particular image processing in accordance with the present invention may be implemented by suitable programming of the processor.
  • the methods and procedures described above may be written as computer programs in a suitable computer language such as C and then compiled for the specific processor in the embedded design. For example, the software may be written in C and then compiled using a known compiler and known assembler.
  • the software has code, which when executed on a processing engine provides the methods and image processor for the present invention.
  • the software programs may be stored on any suitable machine readable medium such as magnetic disks, diskettes, solid state memory, tape memory, optical disks such as CD-ROM or DVD-ROM, etc. Other variations can be envisaged within the claims.
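The mapping from cube points to sensor elements referred to earlier in this section (each of the WxH spatial points is duplicated NB times, and the copies are interleaved so that neighbouring channels carry different passbands while channels of one passband cover adjacent segments) can be made concrete with a short sketch. This is a minimal illustration only, not the patent's actual layout: the cube dimensions, the rectangular segment grid and the cyclic band assignment are assumptions chosen for clarity, and the partial overlap between neighbouring segment copies is omitted.

    # Illustrative sketch (assumed layout, not from the patent): map a point
    # (x, y, band) of the W x H x NB hyperspectral cube to a sensor element,
    # assuming the scene is split into Gy x Gx segments and each segment is
    # duplicated NB times with the copies interleaved on the sensor.
    W, H, NB = 12, 8, 4          # hypothetical cube dimensions
    seg_w, seg_h = 3, 4          # hypothetical segment size in sensor pixels
    Gx, Gy = W // seg_w, H // seg_h

    def cube_to_sensor(x, y, b):
        """Return the (row, col) sensor element holding cube point (x, y, b)."""
        j, x0 = divmod(x, seg_w)   # segment column and offset within the segment
        i, y0 = divmod(y, seg_h)   # segment row and offset within the segment
        k = (b - i) % NB           # which of the NB copies of segment (i, j) carries band b
        return i * seg_h + y0, (j * NB + k) * seg_w + x0

    # Each cube point lands on exactly one sensor element (no gaps, no collisions):
    hits = {cube_to_sensor(x, y, b) for x in range(W) for y in range(H) for b in range(NB)}
    assert len(hits) == W * H * NB

With this assumed layout, horizontally and vertically adjacent channels always carry different passbands, while the channels sharing one passband cover adjacent segments that fit together to tile the full scene, matching the interleaved pattern described above.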
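Continuing the same simplified layout, the post-processing step of cutting the detected segment copies of one passband out of the raw sensor frame and restitching them into a single spectral plane can be sketched as follows; the dimensions and the band assignment are again illustrative assumptions rather than the patent's actual arrangement.

    import numpy as np

    W, H, NB = 12, 8, 4
    seg_w, seg_h = 3, 4
    Gx, Gy = W // seg_w, H // seg_h

    def stitch_band(frame, b):
        """Reassemble the W x H image plane of passband index b from a raw frame."""
        plane = np.zeros((H, W), dtype=frame.dtype)
        for i in range(Gy):                  # segment rows
            k = (b - i) % NB                 # copy index carrying band b in this row
            for j in range(Gx):              # segment columns
                block = frame[i * seg_h:(i + 1) * seg_h,
                              (j * NB + k) * seg_w:(j * NB + k + 1) * seg_w]
                plane[i * seg_h:(i + 1) * seg_h, j * seg_w:(j + 1) * seg_w] = block
        return plane

    frame = np.arange(H * W * NB).reshape(H, W * NB)   # stand-in for a raw sensor readout
    cube = np.stack([stitch_band(frame, b) for b in range(NB)], axis=-1)   # H x W x NB cube

In a real camera the overlap margins between neighbouring segment copies would be cropped or blended during this restitching, and the per-band planes would be assembled into the image cube stored in the database shown in Figure 5.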
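For the Fabry-Perot filters discussed above, the wavelengths transmitted by an ideal cavity follow the standard textbook resonance condition; it is quoted here in its conventional form, as the patent's own equations are not reproduced in this text:

    m \lambda = 2\, n\, l \cos\theta, \qquad m = 1, 2, 3, \ldots

where l is the cavity length (the height of the filter), n the refractive index of the cavity material, θ the angle of incidence inside the cavity and m the order. Tuning the cavity length l shifts the selected passband, and a first-order design (m = 1) places the unwanted higher-order responses (near λ/2, λ/3, and so on) far from the band of interest, which is consistent with the reduced order-selection complexity described above.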
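For the distributed Bragg mirrors, the quantities n₀, nₛ, n₁, n₂ and N listed above are those appearing in the standard quarter-wave Bragg-stack formulas; since the patent's Equation 2 is not reproduced in this text, the conventional expressions are recalled here only as an assumed equivalent:

    R = \left( \frac{n_0\, n_2^{2N} - n_s\, n_1^{2N}}{n_0\, n_2^{2N} + n_s\, n_1^{2N}} \right)^{2}

    \Delta\lambda_0 = \frac{4 \lambda_0}{\pi} \arcsin\!\left( \frac{n_2 - n_1}{n_2 + n_1} \right)

The first expression shows why the reflectivity R, and hence the finesse of the resulting Fabry-Perot filter, grows with the number of pairs N. The second relates the mirror bandwidth to the index contrast and is consistent with the figures quoted above: with n₁ = 1.46 (SiO2), a bandwidth of about 600 nm around a 700 nm central wavelength calls for n₂ of roughly 6.4, and lowering n₁ by making the SiO2 porous relaxes the required n₂ to about 5.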

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)
PCT/EP2012/071506 2011-11-04 2012-10-30 Spectral camera with overlapping segments of image copies interleaved onto sensor array WO2013064507A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
IN2966CHN2014 IN2014CN02966A 2011-11-04 2012-10-30
JP2014539314A JP6082401B2 (ja) 2011-11-04 2012-10-30 Spectral camera with overlapping segments of image copies interleaved onto sensor array
EP12788152.2A EP2776797B1 (en) 2011-11-04 2012-10-30 Spectral camera with overlapping segments of image copies interleaved onto sensor array
US14/267,776 US9366573B2 (en) 2011-11-04 2014-05-01 Spectral camera with overlapping segments of image copies interleaved onto sensor array

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161555933P 2011-11-04 2011-11-04
US61/555,933 2011-11-04

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/267,776 Continuation US9366573B2 (en) 2011-11-04 2014-05-01 Spectral camera with overlapping segments of image copies interleaved onto sensor array

Publications (1)

Publication Number Publication Date
WO2013064507A1 true WO2013064507A1 (en) 2013-05-10

Family

ID=47215512

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/071506 WO2013064507A1 (en) 2011-11-04 2012-10-30 Spectral camera with overlapping segments of image copies interleaved onto sensor array

Country Status (5)

Country Link
US (1) US9366573B2
EP (1) EP2776797B1
JP (1) JP6082401B2
IN (1) IN2014CN02966A
WO (1) WO2013064507A1

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9366573B2 (en) 2011-11-04 2016-06-14 Imec Leuven Spectral camera with overlapping segments of image copies interleaved onto sensor array
US9923007B2 (en) 2015-12-29 2018-03-20 Viavi Solutions Inc. Metal mirror based multispectral filter array
US9960199B2 (en) 2015-12-29 2018-05-01 Viavi Solutions Inc. Dielectric mirror based multispectral filter array

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9456198B2 (en) * 2011-10-13 2016-09-27 Panasonic Intellectual Property Management Co., Ltd. Depth estimating image capture device and image sensor
JP2015111241A (ja) * 2013-10-30 2015-06-18 Nihon Dempa Kogyo Co., Ltd. Optical component
JP2017208585A (ja) * 2014-09-30 2017-11-24 Nikon Corporation Imaging device and image data generation program
SG10201911462UA (en) * 2015-08-04 2020-02-27 Agency Science Tech & Res Hyperspectral imaging apparatus and method
US9992477B2 (en) 2015-09-24 2018-06-05 Ouster, Inc. Optical system for collecting distance information within a field
US10612976B1 (en) * 2015-10-06 2020-04-07 J.A. Woollan Co., Inc. Systems and methods for producing a more uniform intensity wavelength dispersed beam of electromagnetic radiation entering a multi-element detector, while maintaining information content therein
US10018560B2 (en) * 2016-02-02 2018-07-10 Kla-Tencor Corporation System and method for hyperspectral imaging metrology
US10914961B2 (en) 2017-02-13 2021-02-09 Viavi Solutions Inc. Optical polarizing filter
US10798316B2 (en) * 2017-04-04 2020-10-06 Hand Held Products, Inc. Multi-spectral imaging using longitudinal chromatic aberrations
US10594935B2 (en) 2017-04-12 2020-03-17 Spectrum Optix Inc. Along track flat optical lens imaging device
US11131773B2 (en) * 2017-05-15 2021-09-28 Ouster, Inc. Lidar unit with an optical link between controller and photosensor layer
CN110945561B (zh) * 2017-06-13 2024-06-14 X-Rite, Inc. Hyperspectral imaging spectrophotometer and system
US11143803B2 (en) * 2018-07-30 2021-10-12 Viavi Solutions Inc. Multispectral filter
WO2023283272A2 (en) * 2021-07-07 2023-01-12 Owl Autonomous Imaging, Inc. Split-field optics for imaging and ranging
CN113885267A (zh) * 2021-12-06 2022-01-04 Shenzhen Hypernano Optics Technology Co., Ltd. Optical filter assembly
WO2024110595A1 (en) * 2022-11-23 2024-05-30 ams Sensors Germany GmbH Optical sensor element, multi-spectral optical sensor and electronic device
WO2025097165A1 (en) * 2023-11-03 2025-05-08 Imagia, Inc. Systems and methods for ultra-low-power, high-speed sensors using optical filters
WO2025104170A1 (en) * 2023-11-15 2025-05-22 Ams-Osram Ag Multi-zone hyperspectral sensor
CN118999785B (zh) * 2024-08-19 2025-03-25 University of Science and Technology of China Optimized focusing and alternating chromatic aberration correction method for a two-dimensional hyperspectral imaging device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5479015A (en) * 1994-08-18 1995-12-26 Grumman Aerospace Corporation Multi-image detector assembly
WO2006046898A1 (en) * 2004-10-25 2006-05-04 Forskarpatent I Uppsala Ab A system for multi- and hyperspectral imaging
US7242478B1 (en) * 2003-12-05 2007-07-10 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
WO2007103385A2 (en) * 2006-03-09 2007-09-13 Northrop Grumman Corporation Spectral filter for optical sensor
WO2008012715A2 (en) * 2006-07-28 2008-01-31 Koninklijke Philips Electronics N.V. An integrated image recognition and spectral detection device and a device and method for automatically controlling the settings of a light by image recognition and spectral detection of the light
WO2009120928A2 (en) * 2008-03-28 2009-10-01 The Trustees Of Columbia University In The City Of New York Generalized assorted pixel camera systems and methods
US20090256927A1 (en) * 2008-04-11 2009-10-15 Olympus Corporation Image capturing apparatus
WO2010019711A1 (en) * 2008-08-12 2010-02-18 Digital Fusion, Inc. System and method for optically co-registering pixels
WO2011064403A1 (en) 2009-11-30 2011-06-03 Imec Integrated circuit for spectral imaging system
WO2011076975A1 (en) * 2009-12-23 2011-06-30 Nokia Corporation Filter setup learning for binary sensor

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08233658A (ja) * 1995-02-24 1996-09-13 Olympus Optical Co Ltd Spectroscopic device and spectroscopic image recording device
US6057925A (en) 1998-08-28 2000-05-02 Optical Coating Laboratory, Inc. Compact spectrometer device
US8174694B2 (en) 2001-12-21 2012-05-08 Bodkin Design And Engineering Llc Hyperspectral imaging systems
US6927857B2 (en) * 2002-03-09 2005-08-09 Kimberly-Clark Worldwide, Inc. Process for the detection of marked components of a composite article using infrared blockers
JP2004228662A (ja) * 2003-01-20 2004-08-12 Minolta Co Ltd Imaging device
US7433042B1 (en) * 2003-12-05 2008-10-07 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
EP1880165A2 (en) 2005-03-24 2008-01-23 Infotonics Technology Center, Inc. Hyperspectral imaging system and methods thereof
US20080204744A1 (en) 2005-07-11 2008-08-28 Jose Mir High Speed, Optically-Multiplexed, Hyperspectral Imagers and Methods Thereof
US7924483B2 (en) * 2006-03-06 2011-04-12 Smith Scott T Fused multi-array color image sensor
NZ613219A (en) * 2008-01-04 2014-11-28 Intellikine Llc Heterocyclic containing entities, compositions and methods
FI20080305L (fi) * 2008-04-22 2009-10-23 Serlachius Jarl Fredrik Jerk damper
US8902321B2 (en) * 2008-05-20 2014-12-02 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8687055B2 (en) * 2010-03-16 2014-04-01 Eli Margalith Spectral imaging of moving objects with a stare down camera
US20110240858A1 (en) 2010-03-30 2011-10-06 General Electric Company Multi-spectral pyrometry imaging system
US8724000B2 (en) * 2010-08-27 2014-05-13 Adobe Systems Incorporated Methods and apparatus for super-resolution in integral photography
EP2776797B1 (en) 2011-11-04 2018-12-05 IMEC vzw Spectral camera with overlapping segments of image copies interleaved onto sensor array

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5479015A (en) * 1994-08-18 1995-12-26 Grumman Aerospace Corporation Multi-image detector assembly
US7242478B1 (en) * 2003-12-05 2007-07-10 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
WO2006046898A1 (en) * 2004-10-25 2006-05-04 Forskarpatent I Uppsala Ab A system for multi- and hyperspectral imaging
WO2007103385A2 (en) * 2006-03-09 2007-09-13 Northrop Grumman Corporation Spectral filter for optical sensor
WO2008012715A2 (en) * 2006-07-28 2008-01-31 Koninklijke Philips Electronics N.V. An integrated image recognition and spectral detection device and a device and method for automatically controlling the settings of a light by image recognition and spectral detection of the light
WO2009120928A2 (en) * 2008-03-28 2009-10-01 The Trustees Of Columbia University In The City Of New York Generalized assorted pixel camera systems and methods
US20090256927A1 (en) * 2008-04-11 2009-10-15 Olympus Corporation Image capturing apparatus
WO2010019711A1 (en) * 2008-08-12 2010-02-18 Digital Fusion, Inc. System and method for optically co-registering pixels
WO2011064403A1 (en) 2009-11-30 2011-06-03 Imec Integrated circuit for spectral imaging system
WO2011076975A1 (en) * 2009-12-23 2011-06-30 Nokia Corporation Filter setup learning for binary sensor

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9366573B2 (en) 2011-11-04 2016-06-14 Imec Leuven Spectral camera with overlapping segments of image copies interleaved onto sensor array
US9923007B2 (en) 2015-12-29 2018-03-20 Viavi Solutions Inc. Metal mirror based multispectral filter array
US9960199B2 (en) 2015-12-29 2018-05-01 Viavi Solutions Inc. Dielectric mirror based multispectral filter array
US10651216B2 (en) 2015-12-29 2020-05-12 Viavi Solutions Inc. Metal mirror based multispectral filter array
US11114485B2 (en) 2015-12-29 2021-09-07 Viavi Solutions Inc. Metal mirror based multispectral filter array
US11450698B2 (en) 2015-12-29 2022-09-20 Viavi Solutions Inc. Dielectric mirror based multispectral filter array
US11670658B2 (en) 2015-12-29 2023-06-06 Viavi Solutions Inc. Metal mirror based multispectral filter array
US12170300B2 (en) 2015-12-29 2024-12-17 Viavi Solutions Inc. Metal mirror based multispectral filter array of optical sensor device

Also Published As

Publication number Publication date
JP6082401B2 (ja) 2017-02-15
EP2776797A1 (en) 2014-09-17
IN2014CN02966A 2015-07-03
US9366573B2 (en) 2016-06-14
US20140267878A1 (en) 2014-09-18
EP2776797B1 (en) 2018-12-05
JP2014533463A (ja) 2014-12-11

Similar Documents

Publication Publication Date Title
US9366573B2 (en) Spectral camera with overlapping segments of image copies interleaved onto sensor array
US9848135B2 (en) Spectral camera with mirrors for projecting multiple adjacent image copies onto sensor array
US9857222B2 (en) Spectral camera with mosaic of filters for each image pixel
EP2776798B1 (en) Spectral camera with integrated filters and multiple adjacent image copies projected onto sensor array
US11029207B2 (en) Integrated circuit for spectral imaging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12788152

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014539314

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2012788152

Country of ref document: EP