WO2023222278A1 - Multispectral optical sensor, camera system and parallax compensation method - Google Patents


Info

Publication number
WO2023222278A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical
channels
multispectral
spectral
compensation
Application number
PCT/EP2023/055542
Other languages
French (fr)
Inventor
Gunter Siess
Alexander Gaiduk
Mohsen Mozaffari
Original Assignee
ams Sensors Germany GmbH
Application filed by ams Sensors Germany GmbH filed Critical ams Sensors Germany GmbH
Publication of WO2023222278A1 publication Critical patent/WO2023222278A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0205 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • H04N23/15 Image signal generation with circuitry for avoiding or correcting image misregistration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2803 Investigating the spectrum using photoelectric array detector
    • G01J2003/2806 Array and filter array
    • G01J2003/2809 Array and correcting filter
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2803 Investigating the spectrum using photoelectric array detector
    • G01J2003/2813 2D-array
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging

Definitions

  • the present disclosure relates to a multispectral optical sensor, a multispectral optical camera system including such a multispectral optical sensor, and a method of using the multispectral optical sensor, in particular for compensating a captured image of an object or a scene for the effects of parallax.
  • Due to the increased number of spectral channels compared to conventional digital photography, a number of application fields for multispectral technology arises, among others in professional photography. If the spectral channels are arranged in a suitable manner across the visible range of the electromagnetic spectrum, the spectra of the photographed objects can be mathematically reconstructed from the sensor signals. This poses a significant advantage over conventional digital photography, which typically provides only three channels, e.g. RGB channels. From the spectra known in each pixel, the appearance of the captured object under any given type of light can be calculated with very high accuracy.
  • Multispectral camera systems are based on a chip array of optical sensors, or sub-cameras, with different spectral characteristics for generating different monochromatic images of a scene or an object in order to enable spectral reconstruction.
  • parallax errors become a serious limitation particularly for near-field imaging, as the parallax error increases steeply, roughly inversely with the distance of the object or scene to be captured.
  • the image on the detector array is shifted radially with respect to the center, depending on the object distance.
  • for each signal from the individual sub-cameras, it is therefore necessary either to employ a distance-adjusted image array or to interpolate the shifted images to a typical center-camera range.
  • the latter can be achieved by means of image processing, comparing the structural shift between the images taken by each sub-camera.
  • the geometrical information can overlap with spectral information, and the image adjustment may fail in consequence. It can be shown that a practical spectral reconstruction requires an accuracy of the raw data of 0.5%.
  • one approach is to analyze the signals taken by each sub-camera for structural patterns in order to determine a parallax shift for each image.
  • with the individual sensors of a multispectral camera having different spectral sensitivities, however, identifying the very same structural pattern can be extremely difficult or even impossible. In such cases, the alignment of parallax by comparing the signals via image processing will fail and lead to falsified results.
  • An object to be achieved is to provide a multispectral optical sensor that is capable of determining a parallax error in a reliable manner.
  • a further object is to provide a multispectral camera system comprising such an optical sensor.
  • a further object is to provide a parallax compensation method.
  • the improved concept is based on the idea of providing a multispectral image sensor that comprises multiple spectral channels that have identical spectral characteristics and are distributed across the sensor surface .
  • a compensation algorithm is rendered independent from any spectral overlap that occurs for channels with different filter characteristics.
  • the parallax detection is performed on these so-called compensation channels via feature recognition .
  • a shift of these features across the compensation channels directly leads to the parallax error, which can on the one hand be used to directly correct the captured signals of the compensation channels, and on the other hand can be interpolated for the remaining spectral channels that typically have distinct spectral characteristics.
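Because the compensation channels share one spectral response, the feature shift between them can be found by direct intensity comparison. The following is a minimal sketch of such a shift search; the function name, the exhaustive-search approach, and the toy data are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def estimate_shift(ref, img, max_shift=2):
    """Find the integer-pixel (dy, dx) displacement of `img` relative
    to `ref` by exhaustive search: identical spectral responses make a
    raw squared-difference comparison between channels meaningful."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # hypothesis: img equals ref shifted by (dy, dx)
            cand = np.roll(ref, (dy, dx), axis=(0, 1))
            err = np.mean((cand - img) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# toy feature shifted one pixel to the right between two channels
ref = np.zeros((5, 5)); ref[2, 2] = 1.0
img = np.roll(ref, (0, 1), axis=(0, 1))
print(estimate_shift(ref, img))  # -> (0, 1)
```

A production implementation would use sub-pixel correlation, but the principle of matching raw signals is only valid because the compared channels have identical filters.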
  • a multispectral optical sensor comprises a substrate portion having an array of subarrays of optical detector regions, a plurality of lens elements and a plurality of optical filters.
  • the optical detector regions, the lens elements and the optical filters form a plurality of spectral channels, with each spectral channel including a lens element, an optical filter and a subarray of optical detector regions.
  • Of the spectral channels, at least three are compensation channels that are characterized by an identical spectral response.
  • each of the compensation channels has an optical filter with identical spectral transmission characteristics, comprises a corresponding plurality of optical detector regions, and each subarray has an identical relative spatial arrangement of the optical detector regions.
  • the substrate portion is for example a semiconductor substrate, e.g. a chip substrate, which comprises a plurality of optical detector regions, e.g. optical pixels, which are configured to receive electromagnetic radiation and generate electrical signals based on a photocurrent that is generated in response to the captured electromagnetic radiation.
  • the pixels can comprise a photodiode.
  • all pixels of the optical sensor can comprise the same type of photodiode, e.g. a silicon-based photodiode for applications in the visible and/or NIR domain or a germanium-based photodiode for applications in the SWIR range of the electromagnetic spectrum.
  • the spectral channels are defined in terms of the transmission behavior of the individual optical filters .
  • the working principle of pixels of an image sensor is a well-known concept and not further detailed throughout this disclosure .
  • the multispectral optical sensor may be manufactured at least in part using on-chip integration enabling wafer scale packaging.
  • the optical detector regions are arranged in an array, e.g. a 1D or 2D array, of subarrays.
  • the optical sensor further comprises lens elements and optical filters such that one optical filter and one lens element are associated with each subarray of optical detector regions. This way, the spectral channels of the multispectral optical sensor are formed.
  • each subarray of optical detector regions may be considered to act as a monochromatic, i.e. narrowband, subarray of optical detector regions.
  • the spatial arrangement of the optical detector regions of each subarray, the optical filters and the lens elements define the sectoring of the field of view of the multispectral optical sensor .
  • each optical detector region detects light from the same sector of a scene , which is necessary for spectral reconstruction .
  • Such a multi-spectral optical sensor may be used to generate sectored color and spectral information for each different region of a scene, e.g. the center of the scene, border of the scene and outside areas of the scene.
  • the sectored color and spectral information may be used to realize a gradient white balancing of a captured image of the scene with respect to different ambient light conditions in the same scene.
  • the optical filters can be interference filters, optical absorption filters, Fabry-Perot filters, plasmonic filters or metasurface structures configured to perform filtering.
  • the optical filters can be characteri zed by possessing corresponding spectral passbands defining the range of optical frequencies or wavelengths that can pass through a given filter.
  • the passbands of the optical filters are typically designed such that each optical filter transmits a narrow wavelength range, e.g. a 10-100 nm wide range, and such that all passbands of the optical filters combined span across a broad wavelength range, e.g. the UV, the visible, NIR or SWIR portion of the electromagnetic spectrum.
  • the passbands of the individual filter elements can partially overlap each other.
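As a numerical illustration of such a passband layout (the concrete center wavelengths, bandwidth and channel count below are assumptions for the sketch, not values from the disclosure), equally spaced 50 nm bands across the visible range overlap their neighbors:

```python
import numpy as np

# nine hypothetical channel centers spanning the visible range
centers = np.linspace(400.0, 700.0, 9)   # nm, equally spaced (37.5 nm apart)
bandwidth = 50.0                          # nm, equal width for all filters

# passband edges; adjacent 50 nm bands centered 37.5 nm apart overlap
passbands = [(c - bandwidth / 2, c + bandwidth / 2) for c in centers]
print(passbands[0])   # (375.0, 425.0)
print(centers[4])     # 550.0, the band at the center of the covered range
```

The combined passbands then span the full 400-700 nm range while each individual filter stays narrowband.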
  • a multispectral optical sensor provides at least three spectral channels as compensation channels that show the same spectral response.
  • these compensation channels have optical filters with identical transmission characteristics, i.e. identical spectral passbands, an identical number of optical detector regions, e.g. pixels, and an identical relative spatial arrangement of the latter.
  • the subarrays of optical detector regions are identical in terms of pixel type, pixel number and spatial pixel arrangement for all compensation channels.
  • each of the compensation channels is designed to generate identical photo signals of an object or a scene from which the electromagnetic radiation is received, if parallax errors are disregarded.
  • the filter elements are the same for the compensation channels, hence leading to a slight reduction in resolution compared to conventional optical sensors that feature entirely distinct optical filters and thus entirely distinct spectral responses of the individual channels.
  • the optical sensor can comprise further elements that are typical to optical sensors such as housings, apertures, diffusors, and cut filters, for instance.
  • the multispectral optical sensor according to the improved concept thus provides compensation channels that, owing to their identical spectral response, provide efficient means to accurately perform pattern recognition in the generated signals without having to compensate for different spectral responses. This enables determining the exact spatial shifts caused by parallax errors. These errors are of particular relevance in the near-field due to the non-negligible spatial separation of the spectral channels. The determined spatial shifts can be interpolated to the remaining non-compensation spectral channels and used for also correcting the signals of these channels for any parallax error.
  • the compensation channels are not directly adjacent to each other within the array of subarrays. Choosing a position of the compensation channels to not be adjacent to each other ensures more accurate results of the parallax compensation as the spatial shift of a recognized pattern, also due to the non-zero dimension of each detector region, i.e. pixel, can be determined more precisely.
  • the compensation channels occupy corner positions of the respective array, e.g. four corners of a rectangular 2D array while all other positions of the subarray are occupied by non-compensation channels.
  • the compensation channels are arranged maximally distant from each other within the array. A maximum distance between the compensation channels further improves the result of the spatial shift determination.
  • determining a spatial shift within a subarray is limited to a spatial extent of a single detector region, i.e. pixel. Determining the spatial shift across compensation channels over the largest possible distance compensates for this limitation, as the non-zero pixel size becomes negligible if the subarrays of the compensation channels are maximally separated across the respective array.
  • the compensation channels are arranged at equal distance from each other.
  • the optical sensor comprises three compensation channels that are arranged on endpoints of an imaginary equilateral triangle across the array of the spectral channels. Having at least three compensation channels that are arranged at an equal distance to each other further enhances the determination of a spatial shift of a recorded pattern or structure, as the signal of a third compensation channel can act as a confirmation for a shift determined between the other two channels, since the relative shifts are expected to be the same.
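With three compensation channels, the pairwise shift measurements are redundant, which is what allows the third channel to confirm a result. A minimal sketch of that closure check, with invented shift values:

```python
# pairwise shifts (dy, dx) measured between compensation channels A, B, C
# (values are invented for illustration)
shift_ab = (0, 2)   # shift of B relative to A
shift_bc = (2, -1)  # shift of C relative to B
shift_ac = (2, 1)   # shift of C relative to A, measured independently

# the loop A -> B -> C must close: shift_ab + shift_bc == shift_ac
predicted_ac = (shift_ab[0] + shift_bc[0], shift_ab[1] + shift_bc[1])
print(predicted_ac == shift_ac)  # True if the three measurements agree
```

A failed closure check would flag an unreliable pattern match before the parallax is propagated to the other channels.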
  • each of the subarrays comprises a corresponding plurality of optical detector regions.
  • each subarray of optical detector regions has the same relative spatial arrangement of optical detector regions as each of the other subarrays of optical detector regions .
  • each spectral channel and therefore each subarray comprises an equal amount of detection regions, i.e. pixels. This way, an interpolation of the non-compensation spectral channels can be performed in a straightforward manner if the subarrays of these channels match the subarrays of the compensation channels in terms of spatial arrangement and number of detection regions.
  • each of the spectral channels other than the compensation channels is characterized by an optical filter having a distinct spectral transmission characteristic that is different from that of the compensation channels and from those of the other spectral channels.
  • the optical filters of the non-compensation channels feature passbands that are distinct from each other, as is the case for all channels in conventional multispectral sensors, and from those of the compensation channels.
  • choosing three or four of the spectral channels to be compensation channels slightly sacrifices resolution but on the other hand enables a more accurate spectral reconstruction due to the reliable parallax compensation process.
  • the optical filters of the compensation channels are optical bandpass filters that transmit a wavelength range that is substantially centered between a minimum and a maximum transmission wavelength across all optical filters .
  • the compensation channels, rather than being sensitive at boundaries of said wavelength range, are sensitive in a range that is substantially centered within the wavelength range of all spectral channels combined.
  • if the spectral channels are sensitive across the visible range from 400-700 nm, the compensation channels can be sensitive for green light around 550 nm. This minimizes errors in the parallax compensation and spectral reconstruction caused by wavelength dispersion in the spatial shifts of each spectral channel due to the parallax error.
  • the optical filters are arranged between the lens elements and the optical detector regions; in particular, the optical filters are disposed or formed on a front surface of the substrate portion.
  • the optical filters can be formed on, or attached to, a monolithic multispectral semiconductor chip in front of corresponding subarrays of optical detector regions.
  • the lens elements are arranged between the optical filters and the optical detector regions.
  • the filters can be arranged, or formed, on a front surface of an optical substrate that comprises the lens elements, wherein the front surface faces away from the substrate portion.
  • the plurality of lens elements forms a micro lens array, MLA, or a micro Fresnel lens array.
  • the plurality of lens elements can be defined by, or formed on, an optical substrate.
  • Said substrate can be arranged on a spacer that is located between the substrate portion and the optical substrate of the MLA.
  • the spacer can define a plurality of apertures, wherein each aperture is aligned with a corresponding lens element, a corresponding optical filter and a corresponding subarray of optical detector regions.
  • the lens elements are Fresnel lens elements provided as a micro Fresnel lens array, wherein each Fresnel lens element is defined by, or formed on, a corresponding optical filter of the multispectral optical sensor.
  • the optical filters are transmissive in the visible domain .
  • Spectral imaging particularly in the visible domain can allow extraction of additional information that the human eye fails to capture.
  • multispectral imaging allows for an improved color balancing as the precise spectrum of the imaged object or scene can be determined.
  • white balancing on objects e.g. a white surface or wall, wherein different portions are illuminated by different light sources, e.g. sunlight on one side and artificial light on an opposite side, can be efficiently performed in an extremely reliable manner.
  • the compensation channels are sensitive for green light, which is located in the center of the visible domain.
  • the optical filters are transmissive in the infrared domain, in particular in the SWIR domain.
  • a transparency in the SWIR domain allows for various applications. Unlike medium and long wavelength infrared light that is emitted by objects themselves, e.g. as temperature radiation, the SWIR spectrum is similar in properties to visible light, meaning photons are reflected or absorbed by an object, creating the strong contrast required for high resolution imaging. Natural sources of SWIR light include starlight and night sky glow, which provide excellent illumination for outdoor imaging at night. SWIR imaging is used for a variety of applications, such as RGB, solar cell and food inspection, identification and sorting, surveillance, counterfeit detection, process quality control, etc.
  • the optical sensor further comprises a plurality of apertures, wherein each aperture is aligned with a corresponding lens element, a corresponding optical filter and a corresponding subarray of optical detector regions.
  • the array is a 4x4 2D array and the compensation channels are formed from the subarrays of optical detector regions located at corner positions of the array.
  • the compensation channels are formed from the subarrays of optical detector regions located at corner positions of the array.
  • the array is a 2D array formed from the subarrays of optical detector regions arranged in rows and columns of the array.
  • the compensation channels are arranged such that each row and column of the array comprises at most one compensation channel.
  • the array is a 3x3 array of subarrays, wherein the compensation channels are arranged such that they are located on endpoints of an equilateral triangle, wherein each row and column of the 3x3 array comprises a single compensation channel.
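One placement satisfying the row/column constraint for a 3x3 array can be sketched as follows; the specific positions are an illustrative assumption, with a check that each row and column holds at most one compensation channel:

```python
# illustrative 3x3 channel layout: 'C' marks compensation channels,
# digits mark distinct non-compensation spectral channels
layout = [
    ["C", "1", "2"],
    ["3", "4", "C"],
    ["5", "C", "6"],
]

# verify the constraint: at most one compensation channel per row/column
rows_ok = all(row.count("C") <= 1 for row in layout)
cols_ok = all(col.count("C") <= 1 for col in zip(*layout))
print(rows_ok and cols_ok)  # True
```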
  • a multispectral optical camera system comprises a multispectral optical sensor according to one of the embodiments described above.
  • the multispectral optical camera system further comprises a processing resource, wherein the multispectral optical sensor and the processing resource are configured for communication with one another.
  • the processing resource is configured to perform a parallax compensation process, which comprises the steps of: reading out electrical signals from the spectral channels of the multispectral optical sensor, wherein the electrical signals are generated by the subarrays of optical detector regions in response to incident electromagnetic radiation, identifying a common structural pattern in the electrical signals from each of the compensation channels, and determining a parallax of the common structural pattern between the compensation channels.
  • the parallax compensation process further comprises calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels, correcting the electrical signals from the compensation channels for the determined parallax, correcting the electrical signals from the spectral channels other than the compensation channels for the interpolated parallax, and performing spectral reconstruction of the electrical signals from all spectral channels.
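The interpolation and correction steps above can be sketched as follows, under the simplifying assumption that a channel's parallax scales linearly with its offset from a reference position; the function name, the whole-pixel rounding and the toy data are illustrative:

```python
import numpy as np

def compensate(images, positions, k, ref_pos=(0, 0)):
    """Shift every channel image back by its interpolated parallax.
    `positions` are channel offsets in units of subarray pitch and `k`
    is the parallax in pixels per unit pitch, as measured between the
    compensation channels; shifts are rounded to whole pixels here."""
    out = []
    for img, pos in zip(images, positions):
        dy = int(round(k * (pos[0] - ref_pos[0])))
        dx = int(round(k * (pos[1] - ref_pos[1])))
        out.append(np.roll(img, (-dy, -dx), axis=(0, 1)))  # undo the shift
    return out

# two channels 2 pitches apart; measurements gave k = 1 px per pitch
ref = np.zeros((5, 5)); ref[2, 2] = 1.0
shifted = np.roll(ref, (0, 2), axis=(0, 1))
corrected = compensate([ref, shifted], [(0, 0), (0, 2)], k=1.0)
print(np.array_equal(corrected[1], ref))  # True: parallax removed
```

After this registration step, the per-pixel signals of all channels refer to the same scene point and spectral reconstruction can proceed.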
  • the processing resource is further configured to calculate, from the determined parallax of the compensation channels, a distance to an object the electromagnetic radiation is received from.
  • the multispectral optical camera system further comprises a time-of-flight, TOF, sensor that is configured to determine a distance between the multispectral optical camera system and an object the electromagnetic radiation is received from.
  • the interpolated parallax for each of the remaining spectral channels is calculated based on the determined parallax from the compensation channels and the determined distance.
  • the TOF sensor can determine whether the object or scene to be captured is located in the near field where a significant parallax error is to be expected.
  • the distance determined via the TOF sensor can be used during the parallax compensation process to further enhance the determination of the spatial shifts of the patterns identified in the compensation channels.
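Under a pinhole-camera model, the determined parallax yields a distance estimate by triangulation, and conversely a TOF distance predicts the expected shift. The focal length and baseline below are invented numbers for the sketch:

```python
# assumed optics: focal length expressed in pixels and baseline between
# two compensation channels (both values are purely illustrative)
F_PX = 500.0        # focal length in pixels
BASELINE_M = 0.002  # 2 mm channel separation

def distance_from_disparity(disparity_px):
    """Triangulation: distance = f * baseline / disparity."""
    return F_PX * BASELINE_M / disparity_px

def disparity_from_distance(distance_m):
    """Inverse relation, e.g. to seed or cross-check the pattern search
    with a TOF distance reading."""
    return F_PX * BASELINE_M / distance_m

print(distance_from_disparity(10.0))  # 0.1 m for a 10-pixel shift
print(disparity_from_distance(1.0))   # 1.0 px at 1 m: shift nearly gone
```

The inverse relation also shows why parallax only matters in the near field: the shift shrinks hyperbolically with distance.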
  • Further embodiments of the multispectral optical camera system become apparent to the skilled reader from the embodiments of the multispectral optical sensor described above, and vice versa.
  • a parallax compensation method comprises the steps of capturing electromagnetic radiation using a multispectral optical sensor according to one of the embodiments described above, reading out electrical signals from the spectral channels of the multispectral optical sensor, wherein the electrical signals are generated by the subarrays of optical detector regions in response to the incident electromagnetic radiation, and identifying a structural pattern in the electrical signals from each of the compensation channels.
  • the method further comprises determining a parallax of the structural pattern between the compensation channels, calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels, correcting the electrical signals from the compensation channels for the determined parallax, correcting the electrical signals from the remaining spectral channels for the interpolated parallax, and performing spectral reconstruction of the electrical signals from all spectral channels.
  • the method further comprises calculating, from the determined parallax of the compensation channels, a distance to an object the electromagnetic radiation is received from.
  • Figure 1 shows a schematic cross-section of an exemplary embodiment of a multispectral optical sensor according to the improved concept;
  • Figure 2 shows a schematic of an exemplary embodiment of a monolithic semiconductor chip of a multispectral optical sensor;
  • Figures 3 and 4 illustrate the working principle of a multispectral optical sensor in the far field and near field, respectively;
  • Figures 5 and 6 illustrate image fields of the spectral channels of a multispectral optical sensor in the far field and near field, respectively;
  • Figure 7 shows an exemplary spatial shift within a spectral channel evaluated against a distance to an object imaged by the multispectral optical sensor.
  • Figures 8 to 10 illustrate various embodiments of arrangements of the spectral channels of a multispectral optical sensor.
  • Figure 11 shows a schematic of a rear side of an electronic device in the form of a smartphone having a multispectral optical sensor.
  • FIG. 1 shows a schematic cross-section of an exemplary embodiment of a multispectral optical sensor 1 according to the improved concept.
  • the multispectral optical sensor 1 in this embodiment comprises a monolithic semiconductor chip, e.g. a silicon chip, as the substrate portion 10, which for this embodiment is shown in more detail in Fig. 2.
  • the substrate portion 10 defines a plurality of subarrays 11 of optical detector regions 11a, 11b, 11c, ... 11i arranged in a rectangular, e.g. 3x4 or 4x4, array of subarrays 11, wherein the optical detector regions 11a, 11b, 11c, ... 11i of each subarray 11 are pixel structures, for instance, which have the same relative spatial arrangement as the optical detector regions 11a, 11b, 11c, ... 11i of each of the other subarrays 11.
  • each of the subarrays 11 in this embodiment defines a 3x3 array of optical detector regions 11a, 11b, 11c, ... 11i.
  • the number of optical detector regions 11a, 11b, 11c, ... 11i in each subarray and their spatial arrangement defines an image resolution of the multispectral optical sensor 1.
  • a 3x3 resolution as illustrated in this embodiment is sufficient for applications such as color balancing or analysis applications that do not rely on capturing an object in high resolution but merely on the spectral composition of the received light.
  • Higher image resolutions, e.g. 100x100 or even higher, can however likewise be implemented for applications that do require resolving structural features of an object or a scene, e.g. augmented reality applications.
  • the multispectral optical sensor 1 further comprises a plurality of optical filters 12a, 12b, 12c, ... 12i as well as a plurality of lens elements 13 in the form of a micro lens array (MLA) defined by, or formed on, an optical substrate 15.
  • the multispectral optical sensor 1 also includes a spacer 14 located between the substrate portion 10 and the optical substrate 15 of the MLA. The substrate portion 10 and the optical substrate 15 are attached to opposite sides of the spacer 14.
  • the spacer 14 defines a plurality of apertures 16, wherein each aperture 16 is aligned with a corresponding lens element 13, a corresponding optical filter 12 and a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i.
  • each of the subarrays 11, the corresponding optical filter 12 and the corresponding lens element 13 form a respective spectral channel of the multispectral optical sensor 1.
  • Each of the optical filters 12a, 12b, 12c, ... 12i has a corresponding optical transmission spectrum.
  • Each optical filter 12a, 12b, 12c, ... 12i is a passband optical interference filter, for example, which defines a corresponding spectral passband.
  • the optical filters 12a, 12b, 12c, ... 12i define different spectral passbands with the additional requirement that at least three of the optical filters 12a define identical spectral passbands, thus forming the compensation spectral channels. In other words, the compensation channels are independent from spectral characteristics of an object or scene to be captured.
  • the remaining optical filters 12b, 12c, ... 12i define distinct spectral passbands that are different from each other and from the optical filters 12a of the compensation channels.
  • the optical filters 12a, 12b, 12c, ... 12i define passbands of equal bandwidth and can partially overlap each other, such that a predefined range of the electromagnetic spectrum is covered, e.g. the visible or SWIR domain.
  • the center wavelengths of transmission of the optical filters 12a, 12b, 12c, ... 12i are equally spaced from each other.
  • each optical filter 12a, 12b, 12c, ... 12i is formed on, or attached to, the substrate portion 10 in front of a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i.
  • Each optical filter 12a, 12b, 12c, ... 12i is aligned between a corresponding lens element 13 and a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i such that, during operation of the multispectral optical sensor 1, any light which is incident on any one of the lens elements 13 along any given direction of incidence converges through the corresponding optical filter 12a, 12b, 12c, ... 12i onto a corresponding one of the optical detector regions 11a, 11b, 11c, ... 11i of the corresponding subarray 11, wherein the corresponding one of the optical detector regions 11a, 11b, 11c, ... 11i depends on the given direction of incidence.
  • light incident on any one of the lens elements 13 along a direction of incidence which is parallel to the optical axis 20 of the multispectral optical sensor 1, as represented by the solid rays shown in FIG. 1, is focused by the lens element 13 to the central optical detector region 11e of the corresponding subarray 11 through the corresponding optical filter 12a, 12b, 12c, ... 12i.
  • light incident on any one of the lens elements 13 along a direction of incidence which is oblique to the optical axis 20 of the multispectral optical sensor 1, as represented by the dashed and dotted-dashed rays shown in FIG. 1, is focused by the lens element 13 to one of the peripheral optical detector regions 11a, 11b, 11c, 11d, 11f, 11g, 11h, 11i of the corresponding subarray 11 through the corresponding optical filter 12a, 12b, 12c, ... 12i which depends on the particular direction of incidence.
  • Fig. 2 shows a schematic of the substrate portion 10 of the exemplary embodiment of a multispectral optical sensor of Fig. 1.
  • the substrate portion 10 comprises twelve subarrays 11 that are arranged in a rectangular 3x4 (or 4x3) array, wherein each of the subarrays 11 comprises nine optical detector regions 11a, 11b, 11c, ... 11i arranged in a square 3x3 array.
  • the 3x3 subarrays 11 define the image resolution of the multispectral sensor 1, while the 3x4 array defines its spectral resolution.
  • compared to conventional multispectral sensors that do not comprise means for parallax compensation, the spectral sensor 1 equips the corner subarrays 11 each with an optical filter 12a having an identical spectral passband, while the remaining optical filters 12b, 12c, ... 12i have passbands that are distinct from each other and from those of the corner subarrays 11.
  • the multispectral optical sensor according to this embodiment comprises nine different spectral channels and four compensation channels with a global image resolution of 3x3.
  • higher image and spectral resolutions can easily be achieved by increasing the number of optical detector regions 11a, 11b, 11c, ... 11i per subarray 11 and the number of subarrays 11 in the array, respectively.
  • the compensation channels having the identical optical filters 12a are arranged at corner positions of the array as illustrated, thus resulting in a maximal spacing of the compensation channels.
  • arranging the compensation channels maximally distant from each other enables a maximum sensitivity to said spatial shifts, thus providing an efficient means to detect parallax errors and a basis for a compensation process.
  • Fig. 3 illustrates a ray optics scheme of an object captured by the multispectral image sensor 1 in its far field, in which the distance d to said object is significantly larger, e.g. by several orders of magnitude, than a pitch of the array of subarrays 11 or a distance of the outermost subarray 11 from the optical axis 20, here indicated by the arrow labelled a.
  • the multispectral optical sensor 1 comprises an array having four rows of spectral channels, with each spectral channel having a subarray 11 of optical detector regions 11a, ... 11i, wherein the latter form a 100x100 square array as the subarray 11.
  • Each spectral channel further comprises a lens element 13 and an optical filter 12, in this case disposed on a surface of the lens element 13.
  • a square subarray 11 of 100x100 pixels has a side length of 0.25 mm, thus a pixel pitch of 2.5 µm.
  • a pitch of the subarrays with respect to each other can correspondingly be on the order of 0.25 to 0.5 mm, for instance.
  • a focal length f of the lens elements is on the order of 2 mm, for instance.
  • the image field of each spectral channel is concentric to the corresponding lens element 13.
  • a spatial shift Δx of a certain feature, here indicated as an imaged arrow, is zero or nearly zero for all spectral channels, particularly for the outermost spectral channels.
  • the parallax error in the far field is nonexistent or at least negligible.
  • Fig. 4 illustrates a ray optics scheme of an object captured by the multispectral image sensor 1 in its near field, meaning that a distance d of the object or scene is of the same order of magnitude as the pitch of the array of subarrays 11.
  • the distance d is comparable to a distance of the subarrays 11 from the optical axis 20, here indicated by the arrows labelled a1 and a2.
  • each spectral channel at a radial distance r_i from the optical axis 20 images the object under an angle α_i, resulting in a spatial shift Δx_i = f · tan(α_i), with tan(α_i) = r_i / d. Therein, r_i denotes the radial distance of the respective spectral channel from the optical axis 20, d the distance of the object to be imaged, f the focal length of the lens elements 13, and Δx_i the spatial shift of the channel with radial distance r_i in distance units.
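The relation above can be illustrated with a short numerical sketch. It is illustrative only: the function name is an assumption, and the values follow the example geometry given in the surrounding text (focal length f = 2 mm, outermost channel at roughly r = 0.5 mm, pixel pitch 2.5 µm).

```python
# Parallax shift per spectral channel: delta_x_i = f * tan(alpha_i) = f * r_i / d.
# Geometry values taken from the example figures in the text.

def parallax_shift_mm(r_mm: float, d_mm: float, f_mm: float = 2.0) -> float:
    """Spatial shift (mm) of a channel at radial distance r for object distance d."""
    return f_mm * r_mm / d_mm

PIXEL_PITCH_MM = 0.0025  # 2.5 um

# Far field: object at 1 m -> shift of the outermost channel is a fraction
# of a pixel, i.e. negligible.
far = parallax_shift_mm(0.5, 1000.0)   # 0.001 mm = 0.4 pixel
# Near field: object at 5 mm -> shift of tens of pixels, i.e. severe.
near = parallax_shift_mm(0.5, 5.0)     # 0.2 mm = 80 pixels

print(far / PIXEL_PITCH_MM, near / PIXEL_PITCH_MM)
```

This reproduces the qualitative statement of Figs. 3 and 4: the parallax error vanishes in the far field and grows inversely with the object distance d in the near field.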
  • Figures 5 and 6 illustrate the image regions, as highlighted squares, with respect to the lens elements 13, indicated as circles, and the optical filters 12a, 12b, ... 12m for the cases of far field and near field imaging of Figs. 3 and 4, respectively.
  • the subarrays 11 of optical detector regions are congruent with the square-shaped optical filters 12a, 12b, ... 12m.
  • all image fields in case of far field imaging are concentric with the lens elements 13, meaning that a parallax error is nonexistent or at least negligible as also described in the context of Fig. 3.
  • the four corner spectral channels, here designed as the compensation channels having the identical optical filters 12a, show a significant spatial shift in both the x and y directions.
  • any attempt to perform spectral reconstruction by comparing the signals of pixels located at the same position within the subarray 11 across all spectral channels will lead to a strongly falsified result.
  • a compensation mechanism is required that enables spectral reconstruction not only in the far field but also in the near field and at intermediate object distances d.
  • Figs. 8 to 10 illustrate various embodiments realizing a core idea of the multispectral optical sensor 1 according to the improved concept.
  • a multispectral optical sensor according to this disclosure realizes at least three so-called compensation channels, which can be regarded as usual spectral channels that, however, have an optical filter 12a with identical spectral response, i.e. identical transmission behavior.
  • the spectral resolution is slightly decreased, as the compensation channels, neglecting any parallax, record the same spectral component of an object or scene to be imaged.
  • Figure 8 shows a first embodiment of an optical filter configuration of a 4x4 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept.
  • the corner subarrays 11 of the array are equipped with optical filters 12a, which have identical spectral responses, thus forming four identical compensation channels.
  • the remaining spectral channels in this embodiment are characterized by twelve distinct optical filters 12b, 12c, ... 12m that differ in terms of their spectral response from each other and from the compensation channels.
  • the optical filters 12a, 12b, ... 12m each have optical passbands characterized by a respective center wavelength and bandwidth, wherein the optical passbands are engineered such that at least one optical filter 12a, 12b, ... 12m is transmissive for any given optical wavelength selected from a predefined range.
  • the optical filters 12a, 12b, ... 12m cover the visible domain, e.g. a wavelength range between 400 nm and 700 nm, such that any wavelength within this range is transmitted by at least one of the optical filters 12a, 12b, ... 12m.
  • the passbands of the optical filters 12a, 12b, ... 12m can at least partially overlap each other.
  • the passbands of the optical filters 12a of the compensation channels can be located in or towards a center of the wavelength range, which is covered by all optical filters 12a, 12b, ... 12m.
  • the optical filters 12a, 12b, ... 12m are characterized by passbands of 25-50 nm bandwidth, wherein a passband of the optical filters 12a of the compensation channels is transmissive for green light at around 550 nm, for instance.
  • the spectral channels with the identical optical filters 12a are arranged maximally distant from each other and from the center of the multispectral optical sensor 1, i.e. from the center of the array. This is due to the fact that the parallax error is maximum at these locations, as it increases radially from the center of the array. Hence, the maximum spatial shift due to the parallax error in both dimensions x and y can be determined at these locations and used to interpolate the spatial shifts of the remaining non-compensation channels not located at corners of the array in this embodiment.
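The interpolation step mentioned above can be sketched as follows. This is a minimal sketch under an assumed linear model: since Δx_i = f · r_i / d, the shift measured at a corner compensation channel can be rescaled by the radial distance ratio; the function name and channel coordinates are hypothetical.

```python
# Interpolate the parallax shift of a non-compensation channel from the
# shift measured at a corner compensation channel. The shift of each
# channel points radially away from the array center, with a magnitude
# proportional to the channel's radial distance (delta_x_i = f * r_i / d).
import math

def interpolate_shift(corner_shift_xy, corner_pos_mm, channel_pos_mm):
    """Scale a measured corner shift to a channel at another position.

    corner_shift_xy : (dx, dy) shift measured at the corner channel, in mm.
    corner_pos_mm   : (x, y) position of the corner channel, in mm.
    channel_pos_mm  : (x, y) position of the channel to interpolate, in mm.
    """
    r_corner = math.hypot(*corner_pos_mm)
    shift_mag = math.hypot(*corner_shift_xy)
    scale = shift_mag / r_corner  # common factor f / d for all channels
    return (scale * channel_pos_mm[0], scale * channel_pos_mm[1])

# A corner channel at (0.5, 0.5) mm measured a shift of (0.2, 0.2) mm;
# an inner channel at (0.25, 0.0) mm is then assigned roughly (0.1, 0.0) mm.
print(interpolate_shift((0.2, 0.2), (0.5, 0.5), (0.25, 0.0)))
```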
  • Fig. 9 shows a second embodiment of an optical filter configuration of a 4x4 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept.
  • with this embodiment likewise comprising a total of 16 channels, three of these channels are designed as the compensation channels, wherein the identical optical filters 12a are arranged maximally distant from each other in a manner in which each row and column of the array comprises at most one compensation channel, i.e. at most one of the identical filters 12a.
  • the compensation channels are arranged such that they form endpoints of an imaginary equilateral triangle indicated in the figure.
  • this embodiment features an additional distinct spectral channel with an optical filter 12m, thus increasing the spectral resolution of the sensor.
  • the three compensation channels suffice to reliably detect spatial shifts due to the parallax error in both dimensions x and y.
  • the signal in one of the compensation channels can be used to verify spatial shifts determined by the respective other two.
  • Figure 10 shows a third embodiment of an optical filter configuration of a 3x3 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept.
  • the compensation channels are arranged maximally distant from each other such that the parallax error can be reliably detected using the signals from these three channels.
  • the remaining channels feature distinct optical filters 12b, 12c, ... 12g for achieving an optimal spectral resolution.
  • the spectral channel arrangements, and particularly the arrangements of the compensation channels, of Figs. 8 to 10 merely represent exemplary embodiments for illustrating the improved concept. It is obvious that alternative arrangements with larger arrays, a larger number of compensation channels, different locations of the compensation channels, etc. are likewise possible and fulfill the improved concept.
  • Figure 11 shows an electronic device 100, e.g. a smartphone, comprising a multispectral optical camera system 101 having a multispectral optical sensor 1 according to the improved concept.
  • the optical camera system 101 can further comprise a camera module 3 for imaging purposes.
  • the optional camera module 3 has a known spatial relationship relative to the multispectral optical sensor 1.
  • the multispectral optical camera system 101 is coupled to a processing resource 2 of the electronic device 100 which is configured to receive data from the multispectral optical sensor 1 and the image sensor (not shown) of the camera 3.
  • the multispectral optical sensor 1 and the processing resource 2 are configured for communication with one another.
  • the multispectral optical camera system 101 can further comprise a TOF sensor for determining a distance to an object or scene to be captured, e.g. for determining whether a parallax compensation process is necessary, or for serving as additional input for a parallax compensation process.
  • the processing resource 2 is configured to perform spectral reconstruction based on signals received from the multispectral optical sensor 1, wherein the spectral reconstruction comprises a parallax compensation process.
  • the processing resource 2, after an exposure phase of the multispectral optical sensor 1, is configured to read out electrical signals from the spectral channels of the multispectral optical sensor 1, wherein the electrical signals are generated by the subarrays 11 of optical detector regions 11a-11i in response to incident electromagnetic radiation. Each of the optical detector regions 11a-11i of each subarray 11 generates an individual electrical photo signal.
  • the processing resource 2 is configured to identify a common structural pattern in the electrical signals from each of the compensation channels and to determine a spatial shift of this common structural pattern between the compensation channels in both dimensions x and y.
  • the processing resource 2 calculates from the determined spatial shift an interpolated spatial shift for each of the remaining spectral channels that are non-compensation channels.
  • the processing resource 2 corrects the signals of the spectral channels, e.g. by assigning each of the detector regions 11a-11i a modified position within the subarray 11 such that the common structural feature is located at the same modified position across all spectral channels.
  • the processing resource 2 performs spectral reconstruction of the parallax-corrected electrical signals from the spectral channels, e.g. from the non-compensation channels and at least one of the compensation channels.
  • the processing resource 2 evaluates the information of each compensation channel, e.g. the corner channels of a 4x4 array, in order to calculate the geometrical shift of each spectral channel position and to interpolate each spectral channel image to a typical position.
  • the interpolated, geometrically adjusted spectral images can be used for distance- and parallax-compensated spectral reconstruction.
  • the geometrical shift can be accurately determined and calculated for all spectral channels.
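The signal-correction step described above can be sketched as follows. This is a minimal sketch, not the claimed implementation: it assumes integer-pixel shifts and uses NumPy's `roll` as a stand-in for the reassignment of modified pixel positions; a real implementation would also handle sub-pixel shifts and image borders.

```python
# Undo a known per-channel parallax shift by re-indexing the channel image
# so that the common structural feature lands at the same position in
# every subarray.
import numpy as np

def correct_channel(image: np.ndarray, shift_px: tuple) -> np.ndarray:
    """Undo a known (dy, dx) parallax shift by rolling the image back."""
    dy, dx = shift_px
    return np.roll(image, shift=(-dy, -dx), axis=(0, 1))

# A feature located at (5, 5) in the reference channel appears displaced
# by (2, 3) in another channel; correcting moves it back to (5, 5).
shifted = np.zeros((9, 9))
shifted[7, 8] = 1.0
corrected = correct_channel(shifted, (2, 3))
print(np.argwhere(corrected == 1.0))  # [[5 5]]
```

After this correction, comparing pixels at the same position within the subarray across all spectral channels becomes meaningful again, which is the precondition for the spectral reconstruction step.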

Abstract

A multispectral optical sensor (1) comprises a substrate portion (10) having an array of subarrays (11) of optical detector regions (11a-11i), a plurality of lens elements (13), and a plurality of optical filters (12a-12n). The subarrays (11), the lens elements (13) and the optical filters (12a-12n) form a plurality of spectral channels, with each spectral channel including a lens element (13), an optical filter (12a-12n) and a subarray (11) of optical detector regions (11a-11i). At least three of the plurality of spectral channels are compensation channels that are characterized by having an optical filter (12a) with identical spectral transmission characteristics, comprising a corresponding plurality of optical detector regions (11a-11i), and each subarray (11) having an identical relative spatial arrangement of the optical detector regions (11a-11i).

Description

MULTISPECTRAL OPTICAL SENSOR, CAMERA SYSTEM AND PARALLAX COMPENSATION METHOD
The present disclosure relates to a multispectral optical sensor, a multispectral optical camera system including such a multispectral optical sensor, and a method of using the multispectral optical sensor, in particular for compensating a captured image of an object or a scene for the effects of parallax.
BACKGROUND OF THE INVENTION
Due to the increased number of spectral channels compared to conventional digital photography, for example, a number of application fields for multispectral technology arises, among others in professional photography. If the spectral channels are arranged in a suitable manner across the visible range of the electromagnetic spectrum, the spectra of the photographed objects can be mathematically reconstructed from the sensor signals. This poses a significant advantage over conventional digital photography, which typically provides only three channels, e.g. RGB channels. From the spectra known in each pixel, the appearance of the captured object under any given type of light can be calculated with very high accuracy.
Multispectral camera systems are based on a chip array of optical sensors, or sub-cameras, with different spectral characteristics for generating different monochromatic images of a scene or an object in order to enable spectral reconstruction. However, because of the non-zero lateral geometrical distance between the sub-cameras, parallax errors become a serious limitation particularly for near-field imaging, as the parallax error increases steeply with decreasing distance of the object or scene to be captured. As a result, depending on the distance to the object, the image on the detector array is shifted radially with respect to the center. To match the spectral image information, it is either required to employ a distance-adjusted image array for each signal from the individual sub-cameras, or to interpolate the shifted images to a typical center-camera range. The latter can be achieved by means of image processing and comparing the structural shift between the images taken by each sub-camera. However, the geometrical information can overlap with spectral information and the image adjusting may fail in consequence. It can be shown that a practical spectral reconstruction requires an accuracy of the raw data of 0.5%.
For multispectral cameras, one approach is to take the signals from each sub-camera and analyze them for structural patterns in order to determine a parallax shift for each image. With the individual sensors of a multispectral camera having different spectral sensitivities, however, identifying the very same structural pattern can be extremely difficult or even impossible. In such cases, the alignment of parallax by comparing the signals via image processing will fail and lead to falsified results.
An object to be achieved is to provide a multispectral optical sensor that is capable of determining a parallax error in a reliable manner. A further object is to provide a multispectral camera system comprising such an optical sensor. A further object is to provide a parallax compensation method. These objects are achieved with the subject-matter of the independent claims. Further developments and embodiments are described in the dependent claims.
SUMMARY OF THE INVENTION
The improved concept is based on the idea of providing a multispectral image sensor that comprises multiple spectral channels that have identical spectral characteristics and are distributed across the sensor surface. This way, a compensation algorithm is rendered independent from any spectral overlap that occurs for channels with different filter characteristics. Specifically, the parallax detection is performed on these so-called compensation channels via feature recognition. A shift of these features across the compensation channels directly yields the parallax error, which can on the one hand be used to directly correct the captured signals of the compensation channels, and on the other hand can be interpolated for the remaining spectral channels that typically have distinct spectral characteristics.
In an embodiment, a multispectral optical sensor comprises a substrate portion having an array of subarrays of optical detector regions, a plurality of lens elements and a plurality of optical filters. Therein, the optical detector regions, the lens elements and the optical filters form a plurality of spectral channels, with each spectral channel including a lens element, an optical filter and a subarray of optical detector regions. Of these spectral channels, at least three are compensation channels that are characterized by an identical spectral response. To this end, each of the compensation channels has an optical filter with identical spectral transmission characteristics and comprises a corresponding plurality of optical detector regions, and each subarray has an identical relative spatial arrangement of the optical detector regions.
The substrate portion is for example a semiconductor substrate, e.g. a chip substrate, which comprises a plurality of optical detector regions, e.g. optical pixels, which are configured to receive electromagnetic radiation and generate electrical signals based on a photocurrent that is generated in response to the captured electromagnetic radiation. To this end, the pixels can comprise a photodiode. In particular, all pixels of the optical sensor can comprise the same type of photodiode, e.g. a silicon-based photodiode for applications in the visible and/or NIR domain or a germanium-based photodiode for applications in the SWIR range of the electromagnetic spectrum. Thus, the spectral channels are defined in terms of the transmission behavior of the individual optical filters. The working principle of pixels of an image sensor is a well-known concept and not further detailed throughout this disclosure. The multispectral optical sensor may be manufactured at least in part using on-chip integration enabling wafer scale packaging.
The optical detector regions are arranged in an array, e.g. a 1D or 2D array, of subarrays. This means that pixels of the optical sensor are arranged in a 2D pixel matrix in rows and columns, for instance, while these matrices are in turn arranged in an array, which can be one or two dimensional, e.g. a greater matrix, wherein the matrix elements of this greater matrix are the subarrays of optical detector regions. The optical sensor further comprises lens elements and optical filters such that one optical filter and one lens element are associated with each subarray of optical detector regions. This way, the spectral channels of the multispectral optical sensor are formed. In other words, since light detected by each subarray of optical detector regions is transmitted through a corresponding optical filter and lens element, each subarray of optical detector regions may be considered to act as a monochromatic, i.e. narrowband, subarray of optical detector regions. The spatial arrangement of the optical detector regions of each subarray, the optical filters and the lens elements defines the sectoring of the field of view of the multispectral optical sensor.
As a result of the symmetrical design of the monochromatic subarrays and the corresponding optical filters and lens elements, each optical detector region detects light from the same sector of a scene, which is necessary for spectral reconstruction. Such a multispectral optical sensor may be used to generate sectored color and spectral information for each different region of a scene, e.g. the center of the scene, the border of the scene and outside areas of the scene. The sectored color and spectral information may be used to realize a gradient white balancing of a captured image of the scene with respect to different ambient light conditions in the same scene.
The optical filters can be interference filters, optical absorption filters, Fabry-Perot filters, plasmonic filters or meta surface structures configured to perform filtering. The optical filters can be characterized by possessing corresponding spectral passbands defining the range of optical frequencies or wavelengths that can pass through a given filter. The passbands of the optical filters are typically designed such that each optical filter transmits a narrow wavelength range, e.g. a 10-100 nm wide range, and such that all passbands of the optical filters combined span a broad wavelength range, e.g. the UV, visible, NIR or SWIR portion of the electromagnetic spectrum. The passbands of the individual filter elements can partially overlap each other.
Compared to conventional solutions, a multispectral optical sensor according to the disclosure provides at least three spectral channels as compensation channels that show the same spectral response. To this end, these compensation channels have optical filters with identical transmission characteristics, i.e. identical spectral passbands, an identical number of optical detector regions, e.g. pixels, and an identical relative spatial arrangement of the latter. In other words, the subarrays of optical detector regions are identical in terms of pixel type, pixel number and spatial pixel arrangement for all compensation channels. This means that each of the compensation channels is designed to generate identical photo signals of an object or a scene the electromagnetic radiation is received from if parallax errors are neglected. Thus, the filter elements are the same for the compensation channels, hence leading to a slight reduction in spectral resolution compared to conventional optical sensors that feature entirely distinct optical filters and thus entirely distinct spectral responses of the individual channels.
The optical sensor can comprise further elements that are typical of optical sensors, such as housings, apertures, diffusors and cut filters, for instance. The multispectral optical sensor according to the improved concept thus provides compensation channels that, owing to their identical spectral response, provide efficient means to accurately perform pattern recognition in the generated signals without having to compensate for different spectral responses. This enables determining the exact spatial shifts caused by parallax errors. These errors are of particular relevance in the near field due to the non-negligible spatial separation of the spectral channels. The determined spatial shifts can be interpolated to the remaining non-compensation spectral channels and used for also correcting the signals of these channels for any parallax error.
In an embodiment, the compensation channels are not directly adjacent to each other within the array of subarrays. Choosing a position of the compensation channels to not be adjacent to each other ensures more accurate results of the parallax compensation as the spatial shift of a recognized pattern, also due to the non-zero dimension of each detector region, i.e. pixel, can be determined more precisely. For example, the compensation channels occupy corner positions of the respective array, e.g. four corners of a rectangular 2D array while all other positions of the subarray are occupied by non-compensation channels.
In an embodiment, the compensation channels are arranged maximally distant from each other within the array. A maximum distance between the compensation channels further improves the result of the spatial shift determination. Ultimately, determining a spatial shift within a subarray is limited to a spatial extent of a single detector region, i.e. pixel. Determining the spatial shift across compensation channels over the largest possible distance compensates for this limitation, as the non-zero pixel size becomes negligible if the subarrays of the compensation channels are maximally separated across the respective array.
In an embodiment, the compensation channels are arranged at equal distance from each other. For example, the optical sensor comprises three compensation channels that are arranged on endpoints of an imaginary equilateral triangle across the array of the spectral channels. Having at least three compensation channels that are arranged at an equal distance to each other further enhances the determination of a spatial shift of a recorded pattern or structure, as a signal of a third compensation channel can act as a confirmation for a shift determined between the other two channels in a three-compensation-channel arrangement, since the relative shifts are expected to be the same.
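The verification role of the third compensation channel can be sketched as a simple consistency check: the pairwise shifts between three channels must compose, i.e. the shift from A to C should equal the shift from A to B plus the shift from B to C. The function name, tolerance and shift values below are hypothetical.

```python
# Consistency check across three compensation channels: pairwise (dx, dy)
# shifts between channels A, B and C must compose within a tolerance,
# otherwise at least one shift measurement is unreliable.

def shifts_consistent(ab, bc, ac, tol=1.0):
    """True if shift(A->B) + shift(B->C) matches shift(A->C) within tol pixels."""
    return all(abs(ab[i] + bc[i] - ac[i]) <= tol for i in range(2))

print(shifts_consistent((4, 0), (-2, 3), (2, 3)))   # True
print(shifts_consistent((4, 0), (-2, 3), (9, 9)))   # False
```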
In an embodiment, each of the spectral channels comprises a corresponding plurality of optical detector regions. Moreover, each subarray of optical detector regions has the same relative spatial arrangement of optical detector regions as each of the other subarrays of optical detector regions. In this embodiment, each spectral channel and therefore each subarray comprises an equal number of detection regions, i.e. pixels. This way, an interpolation of the non-compensation spectral channels can be performed in a straightforward manner if the subarrays of these channels match the subarrays of the compensation channels in terms of spatial arrangement and number of detection regions.
In an embodiment, each of the spectral channels other than the compensation channels is characterized by an optical filter having a distinct spectral transmission characteristic that is different from that of the compensation channels and from that of the other spectral channels. In order to still feature enough spectral resolution, the optical filters of the non-compensation channels feature passbands that are distinct from each other, similar to the case for all channels in conventional multispectral sensors, and from those of the compensation channels. Thus, choosing three or four of the spectral channels to be compensation channels slightly sacrifices resolution but on the other hand enables a more accurate spectral reconstruction due to the reliable parallax compensation process.
In an embodiment, the optical filters of the compensation channels are optical bandpass filters that transmit a wavelength range that is substantially centered between a minimum and a maximum transmission wavelength across all optical filters. With all spectral channels spanning a certain wavelength range, e.g. the visible or SWIR range, the compensation channels, rather than being sensitive at boundaries of said wavelength range, are sensitive in a range that is substantially centered within the wavelength range of all spectral channels combined. For example, if the spectral channels are sensitive across the visible range from 400-700 nm, the compensation channels can be sensitive for green light around 550 nm. This minimizes errors in the parallax compensation and spectral reconstruction caused by wavelength dispersion in the spatial shifts of each spectral channel due to the parallax error.
In an embodiment, the optical filters are arranged between the lens elements and the optical detector regions; in particular, the optical filters are disposed or formed on a front surface of the substrate portion. The optical filters can be formed on, or attached to, a monolithic multispectral semiconductor chip in front of corresponding subarrays of optical detector regions.
In an embodiment, the lens elements are arranged between the optical filters and the optical detector regions. As an alternative to an arrangement of the filter elements on a surface of the substrate portion, the filters can be arranged on, or formed on, a front surface of an optical substrate that comprises the lens elements, wherein the front surface faces away from the substrate portion.
In an embodiment, the plurality of lens elements forms a micro lens array, MLA, or a micro Fresnel lens array. The plurality of lens elements can be defined by, or formed on, an optical substrate. Said substrate can be arranged on a spacer that is located between the substrate portion and the optical substrate of the MLA. Furthermore, the spacer can define a plurality of apertures, wherein each aperture is aligned with a corresponding lens element, a corresponding optical filter and a corresponding subarray of optical detector regions. For example, the lens elements are Fresnel lens elements provided as a micro Fresnel lens array, wherein each Fresnel lens element is defined by, or formed on, a corresponding optical filter of the multispectral optical sensor.
In an embodiment , the optical filters are transmissive in the visible domain . Spectral imaging particularly in the visible domain can allow extraction of additional information the human eye fails to capture . Moreover, multispectral imaging allows for an improved color balancing as the precise spectrum of the imaged object or scene can be determined. Therein, white balancing on objects, e.g. a white surface or wall, wherein different portions are illuminated by different light sources, e.g. sunlight on one side and artificial light on an opposite side, can be efficiently performed in an extremely reliable manner. To this end, the compensation channels are sensitive for green light, which is located in the center of the visible domain.
In an embodiment, the optical filters are transmissive in the infrared domain, in particular in the SWIR domain. In addition or alternatively to being transmissive in the visible domain, a transparency in the SWIR domain allows for various applications. Unlike medium and long wavelength infrared light that is emitted by objects themselves, e.g. as temperature radiation, the SWIR spectrum is similar in properties to visible light, meaning photons are reflected or absorbed by an object, creating the strong contrast required for high resolution imaging. Natural sources of SWIR light include starlight and night sky glow, which provide excellent illumination for outdoor imaging at night. SWIR imaging is used for a variety of applications, such as RGB, solar cell and food inspection, identification and sorting, surveillance, counterfeit detection, process quality control, etc .
In an embodiment, the optical sensor further comprises a plurality of apertures, wherein each aperture is aligned with a corresponding lens element, a corresponding optical filter and a corresponding subarray of optical detector regions.
In an embodiment, the array is a 4x4 2D array and the compensation channels are formed from the subarrays of optical detector regions located at corner positions of the array. Providing a rectangular, or even square, array in which the four corner positions are turned into compensation channels with identical spectral response combines two advantages: maximally distancing the compensation channels improves the interpolation for the remaining channels in two dimensions, and two of the channels can be used to confirm spatial shifts of an identified pattern detected with the respective other two channels.
In an embodiment, the array is a 2D array formed from the subarrays of optical detector regions arranged in rows and columns of the array. The compensation channels are arranged such that each row and column of the array comprises at most one compensation channel. For example, the array is a 3x3 array of subarrays, wherein the compensation channels are arranged such that they are located on the endpoints of an equilateral triangle, wherein each row and column of the 3x3 array comprises a single compensation channel. Thus, the overall spectral resolution is sacrificed only minimally by providing the minimal number of three compensation channels with identical spectral response, while on the other hand efficient means are enabled for detecting spatial shifts of patterns in both dimensions of the array, as the compensation channels are located at equal distances from each other.
Furthermore, a multispectral optical camera system is provided that comprises a multispectral optical sensor according to one of the embodiments described above. The multispectral optical camera system further comprises a processing resource, wherein the multispectral optical sensor and the processing resource are configured for communication with one another. The processing resource is configured to perform a parallax compensation process, which comprises the steps of: reading out electrical signals from the spectral channels of the multispectral optical sensor, wherein the electrical signals are generated by the subarrays of optical detector regions in response to incident electromagnetic radiation, identifying a common structural pattern in the electrical signals from each of the compensation channels, and determining a parallax of the common structural pattern between the compensation channels.
The parallax compensation process further comprises calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels, correcting the electrical signals from the compensation channels for the determined parallax, correcting the electrical signals from the spectral channels other than the compensation channels for the interpolated parallax, and performing spectral reconstruction of the electrical signals from all spectral channels.
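The process above can be sketched in code. The following Python sketch is merely illustrative and not part of the disclosure: it assumes integer pixel shifts, a brute-force alignment standing in for the pattern identification, channel positions given relative to the optical axis, and hypothetical function names.

```python
import numpy as np

def estimate_shift(ref, img, max_shift):
    """Integer (dy, dx) that best aligns img onto ref (minimal SSD);
    stands in for identifying the common structural pattern."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.sum((np.roll(img, (-dy, -dx), axis=(0, 1)) - ref) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def parallax_compensate(channels, positions, comp_keys, max_shift=4):
    """channels: {key: 2D image}; positions: {key: (x, y)} channel centres
    relative to the optical axis; comp_keys: channels sharing one filter.
    Returns corrected images and the fitted scale s (roughly f/d)."""
    ref = comp_keys[0]
    samples = []
    # parallax of the shared pattern between the compensation channels
    for k in comp_keys[1:]:
        dy, dx = estimate_shift(channels[ref], channels[k], max_shift)
        bx = positions[k][0] - positions[ref][0]
        by = positions[k][1] - positions[ref][1]
        if bx:
            samples.append(dx / bx)
        if by:
            samples.append(dy / by)
    s = float(np.mean(samples))  # shift grows linearly with channel radius
    # interpolated parallax for every channel, then correction
    corrected = {}
    for k, img in channels.items():
        dy, dx = round(s * positions[k][1]), round(s * positions[k][0])
        corrected[k] = np.roll(img, (-dy, -dx), axis=(0, 1))
    return corrected, s
```

The property exploited here is that the parallax shift of a channel grows linearly with its radial distance from the optical axis, so a single scale factor fitted from the compensation channels interpolates the parallax of all remaining channels; spectral reconstruction would then operate pixel by pixel on the aligned images.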
In an embodiment, the processing resource is further configured to calculate, from the determined parallax of the compensation channels, a distance to an object the electromagnetic radiation is received from.
In an embodiment, the multispectral optical camera system further comprises a time-of-flight, TOF, sensor that is configured to determine a distance between the multispectral optical camera system and an object the electromagnetic radiation is received from. Therein, the interpolated parallax for each of the remaining spectral channels is calculated based on the determined parallax from the compensation channels and the determined distance. For example, the TOF sensor can determine whether the object or scene to be captured is located in the near field, where a significant parallax error is to be expected. This way, it can be determined whether a parallax compensation process is performed or not, in order to maintain energy-efficient operation, especially for battery-powered devices such as smartphones, tablets or laptop computers. Alternatively or in addition, the distance determined via the TOF sensor can be used during the parallax compensation process to further enhance the determination of the spatial shifts of the patterns identified in the compensation channels.
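Such distance gating can be sketched as a simple threshold test. This Python fragment is illustrative only; the function name, the half-pixel threshold and the dimensions in the note below are assumptions, not part of the disclosure.

```python
def needs_parallax_compensation(d_mm, a_max_mm, f_mm, pixel_pitch_mm,
                                threshold_px=0.5):
    """Run the (costly) compensation only when the worst-case parallax
    shift, (a/d) * f, exceeds a fraction of a pixel."""
    shift_px = (a_max_mm / d_mm * f_mm) / pixel_pitch_mm
    return shift_px > threshold_px
```

With, e.g., f = 2 mm, an outermost channel at a = 0.425 mm and a 2.5 µm pixel pitch, an object at 10 m yields a shift of roughly 0.03 pixels, so the compensation can be skipped, while an object at 0.2 m yields a shift of well over one pixel.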
Further embodiments of the optical camera system become apparent to the skilled reader from the embodiments of the multispectral optical sensor described above, and vice versa.
Furthermore, a parallax compensation method is provided, which comprises the steps of capturing electromagnetic radiation using a multispectral optical sensor according to one of the embodiments described above, reading out electrical signals from the spectral channels of the multispectral optical sensor, wherein the electrical signals are generated by the subarrays of optical detector regions in response to the incident electromagnetic radiation, and identifying a structural pattern in the electrical signals from each of the compensation channels. The method further comprises determining a parallax of the structural pattern between the compensation channels, calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels, correcting the electrical signals from the compensation channels for the determined parallax, correcting the electrical signals from the remaining spectral channels for the interpolated parallax, and performing spectral reconstruction of the electrical signals from all spectral channels.
In an embodiment, the method further comprises calculating, from the determined parallax of the compensation channels, a distance to an object the electromagnetic radiation is received from.
Further embodiments of the method become apparent to the skilled reader from the embodiments of the multispectral optical sensor described above, and vice versa.
BRIEF DESCRIPTION OF THE DRAWINGS
The following description of figures may further illustrate and explain aspects of the multispectral optical sensor and the parallax compensation method. Components and parts of the multispectral optical sensor that are functionally identical or have an identical effect are denoted by identical reference symbols. Identical or effectively identical components and parts might be described only with respect to the figures where they occur first. Their description is not necessarily repeated in successive figures.
DETAILED DESCRIPTION
In the figures:
Figure 1 shows a schematic cross-section of an exemplary embodiment of a multispectral optical sensor according to the improved concept; Figure 2 shows a schematic of an exemplary embodiment of a monolithic semiconductor chip of a multispectral optical sensor;
Figures 3 and 4 illustrate the working principle of a multispectral optical sensor in the far field and near field, respectively;
Figures 5 and 6 illustrate image fields of the spectral channels of a multispectral optical sensor in the far field and near field, respectively;
Figure 7 shows an exemplary spatial shift within a spectral channel evaluated against a distance to an object imaged by the multispectral optical sensor;
Figures 8 to 10 illustrate various embodiments of arrangements of the spectral channels of a multispectral optical sensor; and
Figure 11 shows a schematic of a rear side of an electronic device in the form of a smartphone having a multispectral optical sensor .
Figure 1 shows a schematic cross-section of an exemplary embodiment of a multispectral optical sensor 1 according to the improved concept. The multispectral optical sensor 1 in this embodiment comprises a monolithic semiconductor chip, e.g. a silicon chip, as the substrate portion 10, which for this embodiment is shown in more detail in Fig. 2. The substrate portion 10 defines a plurality of subarrays 11 of optical detector regions 11a, 11b, 11c, ... 11i in the form of a number of subarrays 11 arranged in a rectangular, e.g. 3x4 or 4x4, array of subarrays 11, wherein the optical detector regions 11a, 11b, 11c, ... 11i of each subarray 11 are pixel structures, for instance, which have the same relative spatial arrangement as the optical detector regions 11a, 11b, 11c, ... 11i of each of the other subarrays 11. Specifically, each of the subarrays 11 in this embodiment defines a 3x3 array of optical detector regions 11a, 11b, 11c, ... 11i. The number of optical detector regions 11a, 11b, 11c, ... 11i in each subarray and their spatial arrangement defines an image resolution of the multispectral optical sensor 1. For example, a 3x3 resolution as illustrated in this embodiment is sufficient for applications such as color balancing or analysis applications that do not rely on capturing an object in high resolution but merely a spectral composition of the received light. Higher image resolutions, e.g. 100x100 or even higher, can however likewise be implemented for applications that do require resolving structural features of an object or a scene, e.g. augmented reality applications.
The multispectral optical sensor 1 further comprises a plurality of optical filters 12a, 12b, 12c, ... 12i as well as a plurality of lens elements 13 in the form of a micro lens array (MLA) defined by, or formed on, an optical substrate 15. The multispectral optical sensor 1 also includes a spacer 14 located between the substrate portion 10 and the optical substrate 15 of the MLA. The substrate portion 10 and the optical substrate 15 are attached to opposite sides of the spacer 14. Furthermore, the spacer 14 defines a plurality of apertures 16, wherein each aperture 16 is aligned with a corresponding lens element 13, a corresponding optical filter 12 and a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i. In consequence, each of the subarrays 11, the corresponding optical filter 12 and the corresponding lens element 13 form a respective spectral channel of the multispectral optical sensor 1.
Each of the optical filters 12a, 12b, 12c, ... 12i has a corresponding optical transmission spectrum. Each optical filter 12a, 12b, 12c, ... 12i is a passband optical interference filter, for example, which defines a corresponding spectral passband. The optical filters 12a, 12b, 12c, ... 12i define different spectral passbands with the additional requirement that at least three of the optical filters 12a define identical spectral passbands, thus forming the compensation spectral channels. In other words, the compensation channels are independent from spectral characteristics of an object or scene to be captured. The remaining optical filters 12b, 12c, ... 12i define distinct spectral passbands that are different from each other and from the optical filters 12a of the compensation channels. For example, the optical filters 12a, 12b, 12c, ... 12i define passbands of equal bandwidth and can partially overlap each other, such that a predefined range of the electromagnetic spectrum is covered, e.g. the visible or SWIR domain. For example, the center wavelengths of transmission of the optical filters 12a, 12b, 12c, ... 12i are equally spaced from each other. In this embodiment, each optical filter 12a, 12b, 12c, ... 12i is formed on, or attached to, the substrate portion 10 in front of a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i.
Each optical filter 12a, 12b, 12c, ... 12i is aligned between a corresponding lens element 13 and a corresponding subarray 11 of optical detector regions 11a, 11b, 11c, ... 11i such that, during operation of the multispectral optical sensor 1, any light which is incident on any one of the lens elements 13 along any given direction of incidence converges through the corresponding optical filter 12a, 12b, 12c, ... 12i onto a corresponding one of the optical detector regions 11a, 11b, 11c, ... 11i of the corresponding subarray 11, wherein the corresponding one of the optical detector regions 11a, 11b, 11c, ... 11i depends on the given direction of incidence. For example, light incident on any one of the lens elements 13 along a direction of incidence which is parallel to the optical axis 20 of the multispectral optical sensor 1, as represented by the solid rays shown in Fig. 1, is focused by the lens element 13 to the central optical detector region 11e of the corresponding subarray 11 through the corresponding optical filter 12a, 12b, 12c, ... 12i.
Similarly, light incident on any one of the lens elements 13 along a direction of incidence which is oblique to the optical axis 20 of the multispectral optical sensor 1, as represented by the dashed and dotted-dashed rays shown in Fig. 1, is focused by the lens element 13 to one of the peripheral optical detector regions 11a, 11b, 11c, 11d, 11f, 11g, 11h, 11i of the corresponding subarray 11 through the corresponding optical filter 12a, 12b, 12c, ... 12i, which depends on the particular direction of incidence.
Fig. 2 shows a schematic of the substrate portion 10 of the exemplary embodiment of a multispectral optical sensor of Fig. 1. As illustrated, the substrate portion 10 comprises twelve subarrays 11 that are arranged in a rectangular 3x4 (or 4x3) array, wherein each of the subarrays 11 comprises nine optical detector regions 11a, 11b, 11c, ... 11i arranged in a square 3x3 array. Therein, the 3x3 subarrays 11 define the image resolution of the multispectral sensor 1, while the 3x4 array defines its spectral resolution. However, compared to conventional multispectral sensors that do not comprise means for parallax compensation, the spectral sensor 1 according to the improved concept, and particularly of this embodiment, equips the corner subarrays 11 each with an optical filter 12a having an identical spectral passband with respect to each other, while the remaining optical filters 12b, 12c, ... 12i have passbands that are distinct from each other and from those of the corner subarrays 11. Thus, the multispectral optical sensor according to this embodiment comprises nine different spectral channels and a number of four compensation channels with a global image resolution of 3x3. As mentioned in the context of Fig. 1, higher image and spectral resolutions can be easily achieved by increasing the number of subarrays 11 in the array and the number of optical detector regions 11a, 11b, 11c, ... 11i, respectively.
The compensation channels having the identical optical filters 12a are arranged at corner positions of the array as illustrated, thus resulting in a maximal spacing of the compensation channels. As spatial shifts due to a parallax error increase radially with increasing distance from the center of the optical sensor 1 (cf. optical axis 20 in Fig. 1), arranging the compensation channels maximally distant from each other enables a maximum sensitivity to said spatial shifts, thus leading to an efficient means to detect parallax errors and providing the basis for a compensation process.
Fig. 3 illustrates a ray optics scheme of an object captured by the multispectral image sensor 1 in its far field, in which the distance d to said object is significantly larger, e.g. by several orders of magnitude, than a pitch of the array of subarrays 11 or a distance of the outermost subarray 11 from the optical axis 20, here indicated by the arrow labelled a.
In this embodiment, the multispectral optical sensor 1 comprises an array having four rows of spectral channels, with each spectral channel having a subarray 11 of optical detector regions 11a, ... 11i, wherein the latter form a 100x100 square array as the subarray 11. Each spectral channel further comprises a lens element 13 and an optical filter 12, in this case disposed on a surface of the lens element 13. For example, a square subarray 11 of 100x100 pixels has a side length of 0.25 mm, thus a pixel pitch of 2.5 µm. A pitch of the subarrays with respect to each other can correspondingly be in the order of 0.25 to 0.5 mm, for instance. A focal length f of the lens elements is in the order of 2 mm, for instance.
As illustrated by the darkened region of the subarray representing the image field, for objects in the far field the image field of each spectral channel is concentric to the corresponding lens element 13. This means that a spatial shift Δx of a certain feature, here indicated as an imaged arrow, is zero or nearly zero for all spectral channels, particularly for the outermost spectral channels. In other words, the parallax error in the far field is nonexistent or at least negligible.
Fig. 4, on the other hand, illustrates a ray optics scheme of an object captured by the multispectral image sensor 1 in its near field, meaning that a distance d of the object or scene is in the same order of magnitude as the pitch of the array of subarrays 11. In other words, in the near field the distance d is comparable to a distance of the subarrays 11 from the optical axis 20, here indicated by the arrows labelled a₁ and a₂. Following the principles of ray optics, it can be derived that an angular parallax is to be expected at an angle αᵢ with tan αᵢ = aᵢ/d. Therein, aᵢ denotes the radial distance of the respective spectral channel from the optical axis 20, d the distance of the object to be imaged, and f the focal length of the lens elements 13. The resulting spatial shift of the channel with radial distance aᵢ, in distance units, is Δxᵢ = (aᵢ/d)·f.
Thus, as illustrated, even the spectral channels arranged in immediate proximity to the optical axis 20 experience significant spatial shifts of the image region, Δx₂ and Δx₃, while the spatial shifts of the outermost spectral channels, Δx₁ and Δx₄, show an even larger deviation from zero. In consequence, the image fields of all spectral channels are shifted and are no longer concentric to the lens elements 13. Thus, for the typical spectral reconstruction, significant errors are to be expected if the parallax error is not compensated for beforehand. This can even lead to complete failure of the spectral reconstruction.
Figures 5 and 6 illustrate the image regions as highlighted squares with respect to the lens elements 13, indicated as circles, and the optical filters 12a, 12b, ... 12m for the cases of far-field and near-field imaging of Figs. 3 and 4, respectively. In Figs. 5 and 6, the subarrays 11 of optical detector regions are congruent with the square-shaped optical filters 12a, 12b, ... 12m. With reference to Fig. 5, all image fields in case of far-field imaging are concentric with the lens elements 13, meaning that a parallax error is nonexistent or at least negligible, as also described in the context of Fig. 3.
However, with reference to Fig. 6 and in analogy to Fig. 4, in case of near-field imaging the image fields experience radial shifts relative to a center of the array on which the optical axis 20 is located. This means that the image fields and the associated lens elements 13 show a significant spatial misalignment, as indicated in the figure. It is to be noted that, at least for short object distances d, this is even true for the innermost spectral channels closest to the optical axis 20 penetrating the sensor in the center of the array (here: the spectral channels with optical filters 12e, 12f, 12i, 12j). The further a spectral channel is arranged away from the optical axis, the stronger the spatial shift due to the parallax error. The four corner spectral channels, here designed as the compensation channels having the identical optical filters 12a, show a significant spatial shift in both x and y direction. Hence, any attempt to perform spectral reconstruction by comparing the signals of pixels located at the same position within the subarray 11 across all spectral channels will lead to a strongly falsified result. Thus, a compensation mechanism is required that enables spectral reconstruction not only in the far field but also in the near field and at intermediate object distances d.
Figure 7 emphasizes the requirement for parallax compensation by illustrating an exemplary spatial shift Δx = (a/d)·f for typical optical sensor dimensions described above, i.e. a focal length f = 2 mm and a distance of the spectral channel from the optical axis of a = 0.425 mm. It can be extracted from this behavior that, for typical pixel sizes in the order of 2-3 µm and a corresponding pixel pitch within the subarray 11, a distance d to an object of less than half a meter is tantamount to a spatial shift Δx of a given feature of several pixels, hence rendering a spectral reconstruction performed based on a pixel-by-pixel comparison of the different spectral channels difficult or even impossible. On the other hand, for any detected parallax spatial shift, a distance to the object or scene can be estimated via the relation d = (a/Δx)·f. With this, a segmentation of different structure distances is also possible if the spatial shifts of different structural features are evaluated.
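The forward relation Δx = (a/d)·f and its inverse d = (a/Δx)·f can be illustrated numerically. The following minimal Python sketch assumes the example dimensions above (f = 2 mm, a = 0.425 mm, 2.5 µm pixel pitch); the function names are hypothetical.

```python
def spatial_shift_mm(a_mm, d_mm, f_mm):
    """Parallax shift (a/d) * f of a channel at radius a, object at distance d."""
    return a_mm / d_mm * f_mm

def distance_mm(a_mm, shift_mm, f_mm):
    """Inverse relation (a/shift) * f: object distance from a measured shift."""
    return a_mm / shift_mm * f_mm

# example dimensions: f = 2 mm, outermost channel at a = 0.425 mm, 2.5 µm pitch
f_mm, a_mm, pitch_mm = 2.0, 0.425, 0.0025
shift = spatial_shift_mm(a_mm, 100.0, f_mm)  # object at 10 cm
# shift / pitch_mm is roughly 3.4 pixels, so pixel-by-pixel reconstruction fails
# the round trip recovers the distance: distance_mm(a_mm, shift, f_mm) is 100 mm
```

The round trip also illustrates the distance-estimation use of the compensation channels: a shift measured in pixels, multiplied by the pixel pitch, directly yields an object distance estimate.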
Figures 8 to 10 illustrate various embodiments realizing a core idea of the multispectral optical sensor 1 according to the improved concept. As already indicated in the previous figures, in contrast to conventional sensors, a multispectral optical sensor according to this disclosure realizes at least three so-called compensation channels, which can be regarded as regular spectral channels that, however, have an optical filter 12a with identical spectral response, i.e. identical transmission behavior. Thus, compared to conventional multispectral optical sensors, the spectral resolution is slightly decreased, as the compensation channels, neglecting any parallax, record the same spectral component of an object or scene to be imaged.
Figure 8 shows a first embodiment of an optical filter configuration of a 4x4 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept. In this embodiment, the corner subarrays 11 of the array are equipped with optical filters 12a, which have identical spectral responses, thus forming four identical compensation channels. This means that all four corner spectral channels would record identical photo signals if they were exposed to the very same incident light. The remaining spectral channels in this embodiment are characterized by twelve distinct optical filters 12b, 12c, ... 12m that differ in terms of their spectral response from each other and from the compensation channels.
For example, the optical filters 12a, 12b, ... 12m each have optical passbands characterized by a respective center wavelength and bandwidth, wherein the optical passbands are engineered such that at least one optical filter 12a, 12b, ... 12m is transmissive for any given optical wavelength selected from a predefined range. For example, the optical filters 12a, 12b, ... 12m cover the visible domain, e.g. a wavelength range between 400 nm and 700 nm, such that any wavelength within this range is transmitted by at least one of the optical filters 12a, 12b, ... 12m. For optimal spectral coverage, the passbands of the optical filters 12a, 12b, ... 12m can at least partially overlap each other. The passbands of the optical filters 12a of the compensation channels can be located in or towards a center of the wavelength range, which is covered by all optical filters 12a, 12b, ... 12m. For example, the optical filters 12a, 12b, ... 12m are characterized by passbands of 25-50 nm bandwidth, wherein a passband of the optical filters 12a of the compensation channels is transmissive for green light at around 550 nm, for instance.
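A passband layout of this kind can be sketched and checked in code. The following Python fragment is illustrative only (function names are hypothetical); it assumes thirteen equally spaced spectral responses between 400 nm and 700 nm, i.e. the twelve distinct filters plus the shared compensation filter of the 4x4 embodiment.

```python
def filter_centers_nm(lo_nm, hi_nm, n):
    """n equally spaced passband centre wavelengths spanning [lo, hi]."""
    step = (hi_nm - lo_nm) / (n - 1)
    return [lo_nm + i * step for i in range(n)]

def covers_range(centers, bandwidth_nm, lo_nm, hi_nm):
    """True if every wavelength in [lo, hi] falls in at least one passband."""
    reach = lo_nm
    for c in sorted(centers):
        a, b = c - bandwidth_nm / 2, c + bandwidth_nm / 2
        if a > reach:
            return False  # a spectral gap is left uncovered
        reach = max(reach, b)
    return reach >= hi_nm
```

Under these assumptions the centres fall 25 nm apart, the middle centre lands at 550 nm (matching the green compensation passband), and a 50 nm bandwidth covers the range without gaps, whereas a 20 nm bandwidth would not.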
The spectral channels with the identical optical filters 12a are arranged maximally distant from each other and from the center of the multispectral optical sensor 1, i.e. from the center of the array. This is due to the fact that the parallax error is maximal at these locations, as it increases radially from the center of the array. Hence, the maximum spatial shift due to the parallax error in both dimensions x and y can be determined at these locations and used to interpolate the spatial shifts of the remaining non-compensation channels, which are not located at corners of the array in this embodiment.
Fig. 9 shows a second embodiment of an optical filter configuration of a 4x4 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept. In this embodiment, likewise comprising a total of 16 channels, three of these channels are designed as the compensation channels, wherein the identical optical filters 12a are arranged maximally distant from each other in a manner in which each row and column of the array comprises at most one compensation channel, i.e. at most one of the identical filters 12a. For example, the compensation channels are arranged such that they form endpoints of the equilateral imaginary triangle indicated in the figure. Compared to the embodiment of Figure 8, this embodiment features an additional distinct spectral channel with an optical filter 12m, thus increasing the spectral resolution of the sensor. With an arrangement as shown, the three compensation channels suffice to reliably detect spatial shifts due to the parallax error in both dimensions x and y. Therein, the signal in one of the compensation channels can be used to verify spatial shifts determined by the respective other two.
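The row-and-column constraint of this embodiment can be expressed as a simple check. The Python sketch below is illustrative only; the function name and the (row, column) coordinate convention are assumptions.

```python
def valid_compensation_layout(cells, n_rows, n_cols):
    """Check the constraint of this embodiment: at least three compensation
    channels, all inside the array, with at most one per row and per column."""
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return (len(cells) >= 3
            and all(0 <= r < n_rows and 0 <= c < n_cols for r, c in cells)
            and len(set(rows)) == len(rows)
            and len(set(cols)) == len(cols))
```

For instance, placing the three identical filters at (0, 0), (1, 3) and (3, 1) of a 4x4 array satisfies the constraint, while two compensation channels in the same row do not.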
Figure 10 shows a third embodiment of an optical filter configuration of a 3x3 array of subarrays 11 of a multispectral optical sensor 1 according to the improved concept. As in Fig. 9, the compensation channels are arranged maximally distant from each other such that the parallax error can be reliably detected using the signals from these three channels. As in the embodiments of Figs. 8 and 9, the remaining channels feature distinct optical filters 12b, 12c, ... 12g for achieving an optimal spectral resolution.
The embodiments of the spectral channel arrangements, and particularly of the compensation channels, of Figs. 8 to 10 merely represent exemplary embodiments for illustrating the improved concept. It is obvious that alternative arrangements with larger arrays, a larger number of compensation channels, different locations of the compensation channels, etc. are likewise possible and fulfill the improved concept.
Figure 11 shows an electronic device 100, e.g. a smartphone, comprising a multispectral optical camera system 101 having a multispectral optical sensor 1 according to the improved concept. The optical camera system 101 can further comprise a camera module 3 for imaging purposes. The optional camera module 3 has a known spatial relationship relative to the multispectral optical sensor 1. The multispectral optical camera system 101 is coupled to a processing resource 2 of the electronic device 100, which is configured to receive data from the multispectral optical sensor 1 and the image sensor (not shown) of the camera 3. In other words, the multispectral optical sensor 1 and the processing resource 2 are configured for communication with one another. The multispectral optical camera system 101 can further comprise a TOF sensor for determining a distance to an object or scene to be captured, e.g. for determining whether a parallax compensation process is necessary, or for serving as additional input for a parallax compensation process.
The processing resource 2 is configured to perform spectral reconstruction based on signals received from the multispectral optical sensor 1, wherein the spectral reconstruction comprises a parallax compensation process. To this end, the processing resource 2, after an exposure phase of the multispectral optical sensor 1, is configured to read out electrical signals from the spectral channels of the multispectral optical sensor 1, wherein the electrical signals are generated by the subarrays 11 of optical detector regions 11a-11i in response to incident electromagnetic radiation. Each of the optical detector regions 11a-11i of each subarray 11 generates an individual electrical photo signal.
In a second step, the processing resource 2 is configured to identify a common structural pattern in the electrical signals from each of the compensation channels and to determine a spatial shift of this common structural pattern between the compensation channels in both dimensions x and y. The processing resource 2 then calculates from the determined spatial shift an interpolated spatial shift for each of the remaining spectral channels that are non-compensation channels. The processing resource 2 corrects the signals of the spectral channels, e.g. by assigning each of the detector regions 11a-11i a modified position within the subarray 11 such that the common structural feature is located at the same modified position across all spectral channels. Finally, the processing resource 2 performs spectral reconstruction of the parallax-corrected electrical signals from the spectral channels, e.g. from the non-compensation channels and at least one of the compensation channels.
In other words, the processing resource 2 evaluates the information of each compensation channel, e.g. the corner channels of a 4x4 array, in order to calculate the geometrical shift of each spectral channel position and to interpolate each spectral channel image to a nominal position. With this, the interpolated, geometrically adjusted spectral images can be used for a distance- and parallax-compensated spectral reconstruction. By comparing the spectrally similar images from the compensation channels, the geometrical shift can be accurately determined and calculated for all spectral channels.
The embodiments of the multispectral optical sensor 1 and the parallax compensation method disclosed herein have been discussed for the purpose of familiarizing the reader with novel aspects of the idea. Although preferred embodiments have been shown and described, changes, modifications, equivalents and substitutions of the disclosed concepts may be made by one having skill in the art without unnecessarily departing from the scope of the claims.
It will be appreciated that the disclosure is not limited to the disclosed embodiments and to what has been particularly shown and described hereinabove. Rather, features recited in separate dependent claims or in the description may advantageously be combined. Furthermore, the scope of the disclosure includes those variations and modifications which will be apparent to those skilled in the art and fall within the scope of the appended claims.
The term " comprising" , insofar it was used in the claims or in the description, does not exclude other elements or steps of a corresponding feature or procedure . In case that the terms " a" or " an" were used in conj unction with features , they do not exclude a plurality of such features . Moreover, any reference signs in the claims should not be construed as limiting the scope .
References
1 multispectral optical sensor
2 processing resource
3 camera module
10 substrate portion
11 subarray
11a ... 11i optical detector region
12a ... 12n optical filter
13 lens element
14 spacer
15 optical substrate
16 aperture
20 optical axis
100 electronic device
101 multispectral optical camera system
a array pitch
d distance
f focal length
Δx spatial shift

Claims
1. A multispectral optical sensor (1) comprising:
- a substrate portion (10) having an array of subarrays (11) of optical detector regions (11a-11i);
- a plurality of lens elements (13); and
- a plurality of optical filters (12a-12n) ;
- wherein the subarrays (11) , the lens elements (13) and the optical filters (12a-12n) form a plurality of spectral channels, with each spectral channel including a lens element (13) , an optical filter (12a-12n) and a subarray (11) of optical detector regions (11a-11i);
- wherein at least three of the plurality of spectral channels are compensation channels that are characterized by :
- having an optical filter (12a) with identical spectral transmission characteristics;
- comprising a corresponding plurality of optical detector regions (lla-lli) ; and
- each subarray (11) having an identical relative spatial arrangement of the optical detector regions (lla-lli) .
2. The multispectral optical sensor (1) according to claim 1, wherein the compensation channels are not directly adjacent to each other within the array.
3. The multispectral optical sensor (1) according to claim 1 or 2, wherein the compensation channels are arranged maximally distant from each other within the array.
4. The multispectral optical sensor (1) according to one of claims 1 to 3, wherein the compensation channels are arranged at equal distance from each other.
5. The multispectral optical sensor (1) according to one of claims 1 to 4, wherein each of the spectral channels comprises a corresponding plurality of optical detector regions (11a-11i), and wherein each subarray (11) of optical detector regions (11a-11i) has the same relative spatial arrangement of optical detector regions (11a-11i) as each of the other subarrays (11) of optical detector regions (11a-11i).
6. The multispectral optical sensor (1) according to one of claims 1 to 5, wherein each of the spectral channels other than the compensation channels is characterized by an optical filter (12b-12n) having a distinct spectral transmission characteristic that is different from the optical filter (12a) of the compensation channels and from that of the other spectral channels.
7. The multispectral optical sensor (1) according to one of claims 1 to 6, wherein the optical filters (12) of the compensation channels are optical bandpass filters that transmit a wavelength range that is substantially centered between a minimum and a maximum transmission wavelength across all optical filters (12) .
8. The multispectral optical sensor (1) according to one of claims 1 to 7, wherein the optical filters (12) are arranged between the lens elements (13) and the optical detector regions (11a-11i), in particular the optical filters (12) are disposed or formed on a front surface of the substrate portion (10).
9. The multispectral optical sensor (1) according to one of claims 1 to 7, wherein the lens elements (13) are arranged between the optical filters (12) and the optical detector regions (11a-11i).
10. The multispectral optical sensor (1) according to one of claims 1 to 9, wherein the plurality of lens elements (13) forms a micro lens array, MLA, or a micro Fresnel lens array.
11. The multispectral optical sensor (1) according to one of claims 1 to 10, wherein the optical filters (12) are transmissive in the visible domain.
12. The multispectral optical sensor (1) according to one of claims 1 to 11, wherein the optical filters (12) are transmissive in the infrared domain, in particular in the SWIR domain.
13. The multispectral optical sensor (1) according to one of claims 1 to 12, further comprising a plurality of apertures (16), wherein each aperture (16) is aligned with a corresponding lens element (13), a corresponding optical filter (12) and a corresponding subarray (11) of optical detector regions (11a-11i).
14. The multispectral optical sensor (1) according to one of claims 1 to 13, wherein the array is a 4x4 2D array; and the compensation channels are formed from the subarrays (11) of optical detector regions (11a-11i) located at corner positions of the array.
15. The multispectral optical sensor (1) according to one of claims 1 to 13, wherein the array is a 2D array formed from the subarrays (11) of optical detector regions (11a-11i) arranged in rows and columns of the array; and the compensation channels are arranged such that each row and column of the array comprises at most one compensation channel.
16. A multispectral optical camera system (101) , comprising: a multispectral optical sensor (1) according to one of claims 1 to 15; and a processing resource (2) ; wherein the multispectral optical sensor (1) and the processing resource (2) are configured for communication with one another; and wherein the processing resource (2) is configured to perform a parallax compensation process, comprising the steps of:
- reading out electrical signals from the spectral channels of the multispectral optical sensor (1), wherein the electrical signals are generated by the subarrays (11) of optical detector regions (11a-11i) in response to incident electromagnetic radiation;
- identifying a common structural pattern in the electrical signals from each of the compensation channels ;
- determining a parallax of the common structural pattern between the compensation channels;
- calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels;
- correcting the electrical signals from the compensation channels for the determined parallax;
- correcting the electrical signals from the spectral channels other than the compensation channels for the interpolated parallax; and
- performing spectral reconstruction of the electrical signals from all spectral channels.
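The compensation process of claim 16 can be illustrated with a minimal sketch (illustrative only, not part of the claimed subject matter; all function and variable names are hypothetical). It estimates the shift of a common structural pattern between the compensation channels via FFT-based cross-correlation, fits the shift as an affine function of lens position, and de-shifts every channel before spectral reconstruction:

```python
import numpy as np

def channel_shift(ref, ch):
    # Estimate the integer-pixel parallax of `ch` relative to `ref`
    # via the peak of their FFT-based 2D cross-correlation.
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(ch)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped FFT indices to signed shifts.
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

def compensate(channels, comp_idx, positions):
    # channels  : equally sized 2D arrays, one image per spectral channel
    # comp_idx  : indices of the (at least three) compensation channels
    # positions : (row, col) lens positions within the array; the parallax
    #             is interpolated as an affine function of lens position
    ref = channels[comp_idx[0]]
    shifts = np.array([channel_shift(ref, channels[i]) for i in comp_idx], float)
    design = np.array([[*positions[i], 1.0] for i in comp_idx])
    coef, *_ = np.linalg.lstsq(design, shifts, rcond=None)  # affine fit
    corrected = []
    for i, ch in enumerate(channels):
        dy, dx = np.array([*positions[i], 1.0]) @ coef
        corrected.append(np.roll(ch, (-round(dy), -round(dx)), axis=(0, 1)))
    return corrected
```

With exactly three compensation channels the affine fit is fully determined, which is one plausible reading of why the claim requires at least three of them; more than three would overdetermine the fit and average out noise.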
17. The multispectral optical camera system (101) according to claim 16, wherein the processing resource (2) is further configured to calculate, from the determined parallax of the compensation channels, a distance to an object from which the electromagnetic radiation is received.
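The distance calculation of claim 17 matches the classic triangulation relation suggested by the reference-sign list (a array pitch, f focal length, Δx spatial shift): d = f·a/Δx. A minimal sketch, with all names and example units hypothetical:

```python
def object_distance(focal_length_mm, array_pitch_mm, parallax_px, pixel_pitch_mm):
    # d = f * a / Δx : triangulation with baseline `a` (the lens array
    # pitch), focal length f, and parallax Δx converted from pixels to mm.
    shift_mm = parallax_px * pixel_pitch_mm
    return focal_length_mm * array_pitch_mm / shift_mm
```

For instance, a 2 mm focal length, 1 mm array pitch and a 4-pixel parallax at 5 µm pixel pitch would place the object at roughly 100 mm.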
18. The multispectral optical camera system (101) according to claim 16 or 17, further comprising a time-of-flight, TOF, sensor that is configured to determine a distance between the multispectral optical camera system and an object from which the electromagnetic radiation is received; and wherein the interpolated parallax for each of the remaining spectral channels is calculated based on the determined parallax from the compensation channels and the determined distance.
19. An electronic device (100) comprising at least one of: a multispectral optical sensor (1) according to one of claims 1 to 15; or a multispectral optical camera system (101) according to one of claims 16 to 18.
20. A parallax compensation method, the method comprising:
- capturing electromagnetic radiation using a multispectral optical sensor (1) according to one of claims 1 to 15;
- reading out electrical signals from the spectral channels of the multispectral optical sensor (1), wherein the electrical signals are generated by the subarrays (11) of optical detector regions (11a-11i) in response to the incident electromagnetic radiation;
- identifying a structural pattern in the electrical signals from each of the compensation channels;
- determining a parallax of the structural pattern between the compensation channels;
- calculating from the determined parallax an interpolated parallax for each of the remaining spectral channels;
- correcting the electrical signals from the compensation channels for the determined parallax;
- correcting the electrical signals from the remaining spectral channels for the interpolated parallax; and
- performing spectral reconstruction of the electrical signals from all spectral channels.
PCT/EP2023/055542 2022-05-20 2023-03-06 Multispectral optical sensor, camera system and parallax compensation method WO2023222278A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022112741 2022-05-20
DE102022112741.9 2022-05-20

Publications (1)

Publication Number Publication Date
WO2023222278A1 (en) 2023-11-23

Family

ID=85556736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/055542 WO2023222278A1 (en) 2022-05-20 2023-03-06 Multispectral optical sensor, camera system and parallax compensation method

Country Status (1)

Country Link
WO (1) WO2023222278A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050134698A1 (en) * 2003-12-18 2005-06-23 Schroeder Dale W. Color image sensor with imaging elements imaging on respective regions of sensor elements
WO2009026064A2 (en) * 2007-08-21 2009-02-26 Micron Technology, Inc. Multi-array sensor with integrated sub-array for parallax detection and photometer functionality
US20120268634A1 (en) * 2011-04-20 2012-10-25 Canon Kabushiki Kaisha Image sensor and image capturing apparatus
US20150136954A1 (en) * 2013-11-12 2015-05-21 EO Vista, LLC Apparatus and Methods for Hyperspectral Imaging with Parallax Measurement
WO2018075964A1 (en) * 2016-10-21 2018-04-26 Rebellion Photonics, Inc. Mobile gas and chemical imaging camera



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23709961

Country of ref document: EP

Kind code of ref document: A1