CN116183021A - Illuminant correction in an imaging system - Google Patents

Illuminant correction in an imaging system

Info

Publication number
CN116183021A
CN116183021A (application CN202211483115.1A)
Authority
CN
China
Prior art keywords
scene
spectral
image
sensor
filter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211483115.1A
Other languages
Chinese (zh)
Inventor
J. Borremans
R. Lieten
W. Van der Tempel
M. De Bock
J. Raczkowski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spectra Corp
Original Assignee
Spectra Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/051,166 (published as US20230082539A1)
Application filed by Spectra Corp filed Critical Spectra Corp
Publication of CN116183021A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0205Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2803Investigating the spectrum using photoelectric array detector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2803Investigating the spectrum using photoelectric array detector
    • G01J2003/2806Array and filter array
    • G01J2003/2809Array and correcting filter
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • G01J2003/2826Multispectral imaging, e.g. filter imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J3/26Generating the spectrum; Monochromators using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/30Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36Investigating two or more bands of a spectrum by separate detectors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J3/51Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors using colour filters
    • G01J3/513Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors using colour filters having fixed filter-detector pairs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal

Abstract

A method includes imaging a scene spectrum received at a time T1 using a spectral imager comprising a plurality of spectral sensors, wherein the spectral sensors include spectral filters overlaying one or more first optical sensors and the sensing ranges of the plurality of spectral sensors collectively span a series of wavelengths, and outputting information representative of a T1 scene spectral image to a processing module. The method continues with imaging the scene at a time T2 using an image sensor, wherein the image sensor comprises a plurality of second optical sensors, and outputting information representative of a T2 scene image to the processing module, wherein the spatial resolution of the scene image is higher than the spatial resolution of the spectral image. The method continues with generating a combined spectral image based on the information representative of the T1 scene spectral image and the information representative of the T2 scene image, and correcting the illuminant for one or more spatial regions of the scene based on the combined spectral image to generate a corrected spectral image.

Description

Illuminant correction in an imaging system
Technical Field
The present invention relates generally to digital imaging, and more particularly to compensating for light source distortion using a spectral sensor with an interference-based filter.
Background
Digital imaging has had a profound impact on the quality and usability of imaging technology. At the same time, the expectations of camera consumers have become higher and higher, especially for cameras embedded in modern smartphones. For example, automatic white balancing has improved the quality of camera imaging by compensating for the distorting effects of various light sources on the camera output.
A spectroscopic assembly, which operates by detecting and/or acquiring incident light in a plurality of wavelength ranges, may be used to provide spectral information that aids automatic white balancing. Interference-based filters, such as Fabry-Perot filters, have proven to provide information useful for improving automatic white balance in camera systems when used with spectral sensors.
Drawings
FIG. 1 provides a top-down illustration of an exemplary optical sensor with a filter overlaid thereon in accordance with the present invention.
Fig. 2 provides a side view of adjacent Fabry-Perot filters with different cavity thicknesses for an image sensor according to the present invention.
Fig. 3 provides a side view of a pair of Bragg stacked mirrors in accordance with the present invention.
Fig. 4 provides an illustration of an interference filter and a Near Infrared (NIR) filter combined to filter wavelengths in the infrared and visible spectrum in accordance with the present invention.
Fig. 5 provides a top-down illustration of a filter mosaic pattern for a spectral sensor according to the present invention.
Fig. 6 provides another top-down illustration of a filter mosaic pattern for a spectral sensor according to the present invention.
Fig. 7 provides another top-down illustration of a filter mosaic pattern for a spectral sensor according to the present invention.
Fig. 8 provides a top-down illustration of an image sensor with a standard RGB mosaic pattern according to the present invention, wherein one of the sensors is replaced by a spectral filter element.
Fig. 9 provides a cross-section of adjacent Fabry-Perot filters overlaid by an optical fiber plate according to the present invention.
Fig. 10 provides a cross-section of adjacent Fabry-Perot filters over a light pipe according to the present invention.
Fig. 11 provides a cross-section of adjacent Fabry-Perot filters having a light shield that isolates the adjacent filters from crosstalk in accordance with the present invention.
Fig. 12 provides a cross-section of adjacent Fabry-Perot filters having trenches for isolating the adjacent filters from crosstalk in accordance with the present invention.
Fig. 13 provides a top-down illustration of a filter array with a shielding grid according to the invention.
Fig. 14 provides a cross-section of adjacent Fabry-Perot filters with isolation space between adjacent optical sensors in accordance with the present invention.
Fig. 15 provides an illustration of a filter structure according to the present invention that mirrors similar filter bands in adjacent filter mosaics.
Fig. 16 provides a graphical representation of the color matching function of a CIE XYZ standard observer according to the present invention.
Fig. 17 provides a top-down illustration of the CIE/XYZ mosaic structure in a Bayer pattern according to the present invention.
Fig. 18A provides a cross-section of adjacent Fabry-Perot filters overlaid by an optical angle element in accordance with the invention.
Fig. 18B shows a single optical device according to the invention positioned over a sub-array of a filter array.
Fig. 18C shows 3 optical devices according to the invention located over different sub-arrays of a larger filter array.
Fig. 18D provides a cross section of an interference filter sub-array with associated optics in accordance with the present invention.
Fig. 19A illustrates an imaging system incorporating a high resolution imager and a low resolution imager in accordance with the invention.
Fig. 19B shows an imaging system incorporating a high resolution imager and two low resolution imagers in accordance with the invention.
Fig. 20A provides a top-down illustration of a pixel array with adjacent filter mosaics on a sensor according to the present invention.
FIG. 20B provides a top-down illustration of a sensor system having an image sensor and a spectral sensor in accordance with the present invention;
FIG. 20C is a flow chart illustrating an exemplary method for providing a high resolution spectral image of a scene by an imaging system in accordance with the present invention;
FIG. 20D is a flow chart illustrating an exemplary method for providing differential images with and without active illumination in accordance with the present invention;
FIG. 20E is a flowchart illustrating an exemplary method for determining skin parameters in accordance with the present invention;
FIG. 20F is a flow chart illustrating an exemplary method for detecting and classifying skin distortions in accordance with the present invention;
FIG. 20G is a flowchart illustrating an exemplary method for dividing skin types for skin treatment by an imaging system in accordance with the present invention;
fig. 21A provides a block diagram of an imaging system incorporating a high resolution imager and a low resolution spectral sensor in accordance with the present invention.
Fig. 21B provides a block diagram of an imaging system incorporating a high resolution imager and a low resolution imager in accordance with the invention.
Fig. 22 is a flowchart illustrating an exemplary method for correcting optical distortion in accordance with the present invention.
Fig. 23 is a flowchart illustrating another exemplary method for correcting optical distortion in accordance with the present invention.
Fig. 24 provides a top-down view of an optical sensor system using an optical sensor/detector comprising nano-semiconductor material in accordance with the present invention.
Fig. 25 provides a cross-section of adjacent Fabry-Perot filters overlaid by an optical angle element in accordance with the invention.
Fig. 26 shows a scenario with one or more light sources according to the present invention.
Fig. 27A is a flowchart illustrating an exemplary method for collecting light source information from a digital image of a scene in accordance with the present invention.
Fig. 27B is another flow chart illustrating an exemplary method for collecting light source information from a digital image of a scene in accordance with the present invention.
Fig. 28 is a flow chart illustrating an exemplary method for compensating for ambient light flicker in a scene captured by a digital imaging system in accordance with the present invention.
Fig. 29A shows the individual spectral responses of two spectral sensors (pixels) with adjacent center wavelengths according to the present invention.
Fig. 29B shows the combined spectral response of the two spectral sensors according to the invention.
Fig. 29C shows a pair of adjacent interference filters each associated with an optical sensor in accordance with the present invention.
Fig. 29D shows a pair of adjacent interference filters associated with a single optical sensor in accordance with the present invention.
Fig. 29E shows a pair of interference filters, one on top of the other, associated with a single optical sensor.
Detailed Description
In various embodiments, a spectral image sensor is combined with a spectral filter, such as an interference-based filter, to provide spectral information about the scene and/or the light source. In some embodiments, spectral imaging of the scene may be performed on its own, and in other embodiments, spectral imaging of the scene may be combined with high resolution imaging, either in a single imager or in separate imagers whose outputs are combined after the images are collected. In still other embodiments, the interference-based filter may be implemented using a Fabry-Perot filter integrated with a spectral image sensor (such as a CMOS-based sensor) to provide a small-scale spectral image sensor system. In some embodiments, the small-scale spectral imaging system may be suitable for use in applications requiring white balance correction. Examples of applications include, but are not limited to, smart mobile phones, high resolution cameras, video cameras, security cameras, calibration systems, inspection systems, and certain industrial applications.
Compensating for light source distortion (sometimes referred to as "white point balancing") is an essential part of rendering an image with a camera. Without white point balancing, the image sensor would not accurately represent the expected chromaticity of the recorded scene or object. Various light sources distort the chromaticity of objects in the field of view of the image sensor. For example, incandescent lamps, fluorescent lamps, and Light Emitting Diodes (LEDs) each distort the light that is "seen" by the image sensor. Other light sources, such as sodium street lamps, distort the output of the image sensor so severely that most colors are barely distinguishable.
The need for white balance compensation has driven steady development, culminating in automatic white balance (AWB), which enables a photographer to compensate for color defects in the image sensor output caused by the light source. In one example, RGB optical sensors (semiconductor devices containing three types of pixels with peak sensitivity in the red, green, and blue portions of the visible spectrum) have been used to provide a reference for automatic white balance. The combination of the red, green, and blue wavelengths of an RGB sensor appears "white" to the viewer; thus, in a scene containing one or more substantially white objects, the RGB sensor may use a white object as a reference point for adjusting the processing of every other color in the scene. AWB has since evolved from combining the outputs of a camera's RGB sensors into a white balance reference to the use of dedicated multichannel spectral sensors. The accuracy of these multichannel spectral sensors improves as more channels are distributed over the visible spectrum; however, in each case an imager with a multichannel spectral sensor is limited to a single average reference spectrum for AWB of a given scene. Thus, in cases where there are multiple light sources, or where the scene is dominated by a single object, the image sensor can only compensate for the "average" illumination of that particular scene.
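To make that limitation concrete, the following is a minimal sketch of conventional single-reference AWB in the gray-world style: one gain per channel is estimated from the scene average and applied everywhere, so regions lit by different sources all receive the same correction. The function name and the gray-world estimator are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def global_awb(rgb_image: np.ndarray) -> np.ndarray:
    """Single-reference AWB sketch (gray-world assumption).

    rgb_image: float array of shape (H, W, 3), values in [0, 1].
    A single average gain per channel is applied to the whole frame,
    regardless of how many light sources are present in the scene.
    """
    avg = rgb_image.reshape(-1, 3).mean(axis=0)   # scene-average response per channel
    gains = avg.mean() / np.maximum(avg, 1e-6)    # scale so the scene average becomes neutral
    return np.clip(rgb_image * gains, 0.0, 1.0)
```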
Fig. 1 provides a top-down illustration of a spectral sensor having filters that provide 9 spectral bands in a 3 x 3 pattern spanning an imager array. In this example, Fabry-Perot filters with different center wavelengths are patterned into a mosaic structure that repeats across the array. In other embodiments, the 3 x 3 filter pattern may be replaced with other patterns (such as a 2 x 2 pattern, a 4 x 4 pattern, a 5 x 5 pattern, or a 3 x 4 pattern, etc.), as dictated by resolution and/or manufacturing requirements. In one example, a 3 x 3 filter pattern provides 9 different cavity thicknesses, which are then repeated across the exemplary sensor array. In the example of fig. 1, each of the 9 filter thicknesses (illustrated as filters 20A-20I) is repeated 12 times across the 12 x 9 array of optical pixels on the sensor 10.
In the sensor system based on fig. 1, the optical pixels for the sensor 10 are arranged on an integrated circuit with sets of interference filters fabricated on top of the optical pixels. In one example, a set of nine (9) interference filters 20A-20I are arranged in a mosaic pattern, each interference filter configured to pass light of a different wavelength range. In one example, each set of interference filters is aligned with at least one set of optical sensors such that each set of optical sensors is adapted to sense a localized bandpass response having 9 channels. The set of optical sensors and filter arrangements are then repeated across the array such that the optical sensor array is capable of providing a plurality of measurement spectra that are spatially separated across different regions of the image sensor. As used herein, a single optical sensor corresponds to a pixel (pixel = minimum addressable element), where the pixel is a photodiode. Thus, "optical sensor," "optical pixel," and "pixel" are used interchangeably.
In one example, the image sensor of fig. 1 is adapted to provide optical distortion information for different regions of the image sensor, allowing white balance correction to be extended to each of those regions. In one example of an implementation, a sensor system for imaging a scene may include multiple optical sensors on an integrated circuit, and multiple sets of interference filters, such as filter elements 20A-20I of fig. 1. In one example, each set of interference filters of the plurality of sets of interference filters may include a plurality of interference filters arranged in a pattern, wherein each of the plurality of filters is configured to pass light of a different wavelength range. In one example, each set of interference filters of the plurality of interference filters is associated with a spatial region of the scene, and thus a spectral response may be determined for each spatial region of the scene.
In one example of the embodiment referring to fig. 1, one set of interference filters of the plurality of sets of interference filters may be spatially separated from other interference filters of the plurality of sets of interference filters, and in another example, each set of interference filters of the plurality of sets of interference filters may be randomly spaced between the plurality of optical sensors of sensor 10.
Fig. 2 provides a cross-section of adjacent Fabry-Perot filter stacks (filters) having different cavity thicknesses for an image sensor, such as the image sensor of fig. 1. As shown, the center wavelength of each Fabry-Perot filter is determined to first order by the cavity thickness between its upper and lower mirrors. In this example, adjacent filters 20A-20F provide sensor outputs for 6 channels. Between the filters 20A to 20F and the sensor 10, suppression filters 30A to 30C are provided to block stray light outside the desired wavelength range of the associated interference filter. In some cases, a Fabry-Perot filter may pass wavelengths, such as harmonic wavelengths or wavelengths outside the effective range of the (Bragg) mirrors, that would negatively impact the desired wavelength response of the filter. Thus, the rejection filter may act as a bandpass filter, rejecting wavelengths outside the bandpass range. In one example, a single rejection filter may provide sufficient bandpass rejection for two or more Fabry-Perot filters. In another example, a suppression filter may be disposed over an associated Fabry-Perot filter to suppress light outside of a desired wavelength range before that light can pass through the Fabry-Perot filter. In yet another example, an additional interference filter, such as a Fabry-Perot filter, may be disposed between the one or more suppression filters and the sensor 10. In this example, filters 20A through 20F overlie one or more suppression filters, with additional interference filters located below the one or more suppression filters.
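The first-order relation between cavity thickness and passband center described above can be written as lambda_m = 2 * n_eff * L / m for interference order m. The small sketch below, with an assumed effective cavity index, shows why a given cavity also passes higher-order harmonics that the suppression filters must block.

```python
def fabry_perot_center_wavelengths(cavity_nm: float, n_eff: float = 1.0, max_order: int = 3):
    """Passband centers of an idealized Fabry-Perot cavity.

    lambda_m = 2 * n_eff * L / m for order m = 1, 2, ... .  A 275 nm cavity
    with n_eff = 1.0, for example, passes roughly 550 nm in first order but
    also 275 nm in second order, which is one reason a rejection (bandpass)
    filter is paired with each interference filter.
    """
    return [2.0 * n_eff * cavity_nm / m for m in range(1, max_order + 1)]
```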
In one example, the suppression filter may comprise an organic material and may be applied using a spin-on process. In another example, the suppression filter may comprise a plasmonic interference filter applied by, for example, a lithographic process. In another example, the suppression filter may be a colloid or quantum dot based filter. Other examples of suppression filters include combinations of organic materials and/or plasmonic filters. And in yet another example, the suppression filter may comprise one or more interference filters alone or in combination with an organic material and/or plasmonic filter. In one example, a plurality of suppression filters may be arranged in a pattern under a mosaic of filter elements, wherein each suppression filter of the plurality of suppression filters is configured to substantially suppress light of a predetermined wavelength.
In a specific example of an embodiment, a set of interference filters is arranged in a pattern that also includes a plurality of organic filters, and in another example, the pattern includes a plurality of non-interference filters, wherein the non-interference filters are selected from the group consisting of organic filters, plasmonic filters, or suitable alternatives.
In a related example, the rejection filter may comprise a Bragg stacked mirror. In the example shown in fig. 3, the Bragg stacked mirrors act as suppression filters for the filters 20A and 20B, while acting as Bragg stacked mirrors for the Fabry-Perot filters 20C and 20D. In yet another example, the one or more suppression filters may include multiple thin dielectric material layers deposited and patterned, for example, using a thin film deposition process and/or a photolithographic process. Thus, the patterning process may consist of a photolithographic process (for defining the spatial locations of the filters) in combination with etching or lift-off techniques (for locally removing the deposited filter layers). A specific etch stop layer may be deposited in the filter stack to control the etching process such that the optical layers in the filter stack are locally removed. In one example, filters 20A and 20B may be protected from being etched away when filter material is removed from other locations, using an etch stop layer that does not affect optical performance. An etch stop may be used in defining both the bandpass filter and the rejection filter.
In a specific example of an embodiment, one or more of the plurality of suppression filters is another interference filter. In this example, the other interference filter is one of the plurality of interference filters. In another example, the other interference filter is simultaneously configured to pass light of a particular wavelength range and to reject light for the other optical sensor and interference filter pair.
Fig. 4 provides an illustration of an interference filter for filtering visible light combined with a Near Infrared (NIR) filter for filtering wavelengths in the infrared spectrum. In one example, one or more NIR filters may be composed of organic materials while the interference filter comprises a Fabry-Perot filter, enabling measurement of light wavelengths across the visible and infrared spectra. In the example of fig. 4, filters 50A-50C may be any of Fabry-Perot filters, organic filters, or any other acceptable substitute.
In one example, a non-CMOS based optical sensor may be used to extend the spectral range of the spectral sensor to infrared wavelengths. For example, optical sensors based on colloids or quantum dots can be used to collect infrared light, for example in the short-wave infrared range. In an example of a quantum dot based optical sensor, the optical sensor may be optimized by tuning the quantum dot size such that a predetermined wavelength is selected such that the optical sensor provides an infrared filtering channel. In another example, the sensor system may include multiple sets of optical sensors, wherein each set of optical sensors is arranged in a pattern including at least one optical sensor that is respectively larger in size than at least one other optical sensor of the set of optical sensors.
Fig. 5 provides a top-down illustration of a filter mosaic pattern for a spectral sensor comprising large filter elements. In this example, the 6 filter mosaic includes standard filter elements 20B, 20C, 20D, and 20E, with a single filter element 22 occupying the space of 4 standard filter elements. In one example, where some filter response requirements dictate increasing light capture, such as when a wavelength range requires a filter with reduced transmission characteristics, the larger filter element 22 may provide a 6-channel filter response. In a specific example, a set of interference filters may further comprise a pattern arrangement of interference filters having a size that is respectively larger than at least one other interference filter of the set of interference filters.
Fig. 6 provides a top-down illustration of another filter mosaic pattern for a spectral sensor including filter elements forming a larger rectangular shape. In this example, large filter element 24A and large filter element 24B are included in a filter mosaic having 16 standard filter elements (such as filter elements 20A-20D). In one example, where some filter response requirements dictate increasing light capture, such as with reference to fig. 5, including a larger filter element may provide a 19-channel filter response. In one example, the spectral filter mosaic may include interference filters that are each larger in size than at least one other interference filter of the set of interference filters and/or are in an elongated rectangular shape.
Fig. 7 provides a top-down illustration of a filter mosaic pattern for a spectral sensor in which the filter elements form a tapered loop around a central filter element. In this example, the smaller filter element 26D is surrounded by the larger filter element 26C, which is surrounded by the even larger filter element 26A, all of which are surrounded by the large filter element 26B. In one example, where some filter response requirements dictate increasing light capture, such as with reference to fig. 5, progressively larger filter elements may provide a 4-channel filter response. In an exemplary spectral filter mosaic, one or more of the interference filters are each sized larger than at least one other interference filter of the set of interference filters and/or are adapted to form a loop around the other interference filters of the set of interference filters.
Fig. 8 provides a top-down illustration of an image sensor with a standard RGB mosaic pattern, wherein one of the sensors is replaced by a spectral filter element. In this example, the pixel sensors 20A, 20B, and 20C form a 2 x 2 mosaic pattern that includes the filter 32A. In one example, a standard RGB mosaic pattern is repeated across the sensor 10, where each 2 x 2 RGB mosaic includes a spectral filter element, such as filter elements 32B and 32C, of a multi-band spectral sensor. For example, the sensor 10 of fig. 8 is an 8 x 8 sensor array with a 4 x 4 arrangement of RGB mosaics comprising a 4 x 4 arrangement of spectral sensors. Thus, in the example of fig. 8, a standard 16-mosaic RGB array may include 16 spectral sensor channels for the sensor 10. In one example, the RGB and spectral sensor combination may be repeated across the spatial regions of the sensor 10 to provide a local spectral response for a large image sensor.
In one example of an implementation, a sensor system may include multiple sets of optical sensors on an integrated circuit, where each set of optical sensors includes multiple optical sensors arranged in a pattern. In this example, each of the one or more sets of interference filters includes a plurality of interference filters, each interference filter being located on top of an optical sensor of the plurality of sets of optical sensors and each interference filter of the one set of interference filters being configured to pass light of a different wavelength range. In a specific example, the pattern for the set of optical sensors includes 4 segments to form a 2 x 2 matrix, with each of the red, green, and blue channel sensors and the spectral channel sensor located in one of the 4 segments.
In a specific example of an embodiment, the pattern of red, green and blue channel sensors is a 2 x 2 pattern, while the pattern of spectral sensors uses a repetition rate of N, where N > 2 and the number of different spectral sensors is greater than 1. In another example, each color channel filter element and/or spectral channel filter for the sensor system covers more than one optical sensor in the pattern. In yet another example, the filter pattern includes a set of color filters intended for color imaging (such as red, green, blue, luminance, transparent, etc., as found in any modern imager) and at least one set of spectral filter elements.
In one example, the different spectral filters of several patterns together form a low resolution spectral image of the scene, while the color filters of the patterns form a high resolution color image of the scene. In a related example, the low resolution spectral response is used to determine white balance requirements for different spatial regions of the scene.
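One way such a low-resolution spectral response could be turned into per-region white balance is sketched below: each region's spectrum is projected onto an estimated illuminant, per-region gains are computed, and the gains are upsampled onto the high-resolution color image. The projection matrix spec_to_rgb and the nearest-neighbour upsampling are illustrative assumptions; a real system would use a calibrated illuminant-estimation step.

```python
import numpy as np

def region_white_balance(hires_rgb: np.ndarray, lowres_spec: np.ndarray,
                         spec_to_rgb: np.ndarray) -> np.ndarray:
    """Per-region white balance from a low-resolution spectral image.

    hires_rgb:   (H, W, 3) high-resolution color image, values in [0, 1].
    lowres_spec: (h, w, C) one C-channel spectrum per spatial region.
    spec_to_rgb: (C, 3) assumed calibration matrix mapping a spectrum to an
                 illuminant estimate in RGB.
    """
    illum = lowres_spec @ spec_to_rgb                        # (h, w, 3) illuminant per region
    gains = illum.mean(axis=-1, keepdims=True) / np.maximum(illum, 1e-6)
    H, W, _ = hires_rgb.shape
    ys = (np.arange(H) * gains.shape[0]) // H                # nearest-neighbour upsampling
    xs = (np.arange(W) * gains.shape[1]) // W
    return np.clip(hires_rgb * gains[ys][:, xs], 0.0, 1.0)
```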
In a specific example of an implementation, each interference filter of a set of interference filters is randomly associated with a spectral channel sensor, and in another example, the number of interference filters in each set of interference filters differs based on the spatial location of the set of interference filters in the sensor system. In yet another related example, each set of interference filters and/or the position of each interference filter in the spectral imager is based on a pseudo-random pattern.
Fig. 9 provides a cross-section of adjacent fabry- perot filters 20A and 20B overlaid by an optical fiber plate 60. Referring back to fig. 2, light passing through a filter (such as filter 20A of fig. 2) at a particular angle may be filtered by the particular filter while being detected by an optical sensor associated with an adjacent filter. In a specific example, filter 20A is configured to pass light of a particular wavelength, however, when the angle of incidence of the light passing through filter 20A is sufficiently oblique, the light may propagate through integrated circuit back end 40 and be detected with an optical sensor associated with filter 20B. Light of an undesired wavelength propagating through an adjacent interference filter is commonly referred to as "crosstalk". Crosstalk has a negative impact on the spectral response quality of the filter mosaic and thus on the quality of the optical distortion correction. Thus, it is desirable to eliminate or at least attenuate the effects of crosstalk.
The fiber optic plate 60 of fig. 9 is an optical device that is made up of a bundle of micron-sized optical fibers. When used as lenses on filters 20A and 20B, the light or image transmitted through the fiber optic plate is collimated to reduce the angle of incidence through the filter (the angle between the light rays incident on the surface and the line normal to the surface at the point of incidence) to substantially reduce unwanted crosstalk. Unlike a general optical lens, a focusing distance is not required when a fiber plate such as the fiber plate 60 is used, and thus it is compatible with a compact optical device.
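The benefit of collimation can be seen from the standard first-order angle dependence of a Fabry-Perot passband, lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)^2): oblique rays are blue-shifted and can also land on a neighbouring pixel. The sketch below uses an assumed effective cavity index; it is not taken from this disclosure.

```python
import math

def tilted_center_wavelength(lambda_0_nm: float, aoi_deg: float, n_eff: float = 1.7) -> float:
    """Blue shift of a Fabry-Perot passband with angle of incidence (first order).

    lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)^2), where n_eff is
    the effective refractive index of the cavity.  Reducing the angle of
    incidence (e.g. with a fiber optic plate) keeps the passband close to its
    design wavelength and limits light spilling onto adjacent sensors.
    """
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return lambda_0_nm * math.sqrt(1.0 - s * s)
```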
Fig. 10 provides another cross-section of adjacent fabry- perot filters 20A and 20B over light pipe 64. In the example of fig. 10, light that passes through a filter with an excessive angle of incidence is redirected by the sidewalls of light pipe 64 to the optical sensor associated with the filter. In a specific example, when the angle of incidence of the light passing through filter 20A is sufficiently large, it will reflect from the side wall of light pipe 64 and be detected by an optical sensor associated with filter 20A. In one example, the angle of the side wall of the light pipe can be adjusted to provide maximum attenuation while minimizing absorption of the desired wavelength. In one example, the light pipe 64 may be constructed of a variety of materials, where the light pipe itself is a material having relatively high light transmittance, and the interstitial material is an opaque or translucent material. In another example, the sidewalls of the light pipe 64 may include a relatively high reflectivity material coated or deposited thereon.
Fig. 11 provides another cross-section of adjacent fabry- perot filters 20A and 20B with a light shield 68 isolating the adjacent filters 20A and 20B from crosstalk. In the example of fig. 11, light passing through filter 20A is deflected or blocked by light shield 68 when the angle of incidence through the filter is too large. In one particular example, when the angle of incidence of light passing through filter 20A is sufficiently large, it will reflect off of the sides of mask 68 or be completely blocked, and thus cross-talk to filter 20B will be eliminated and/or attenuated. In one example, the light shield 68 may be constructed of a variety of materials, including opaque or translucent materials. In another example, the mask 68 may be composed of a metal, such as Al or AlSi deposited in trenches formed and/or etched in the integrated circuit back end 40 prior to adding and/or suppressing the filter. In one specific example of this embodiment, metal is deposited on the surface of the integrated circuit back end 40 where the trench has been formed, and then removed from the region outside the trench using a subtractive process (such as chemical mechanical polishing and/or dry etching using a photolithographic process). In another example, the depth and width of the mask 68 may be adjusted to provide attenuation at a particular angle of incidence, so that more or less crosstalk attenuation is performed as desired.
Fig. 12 provides another cross-section of adjacent fabry- perot filters 20A and 20B having trenches 66 for isolating the adjacent filters 20A and 20B from crosstalk. In the example of fig. 12, light passing through filter 20A is deflected or blocked by trench 66 when the angle of incidence through the filter is too large. In one particular example, when the angle of incidence of light passing through filter 20A is sufficiently large, it will reflect or be completely blocked from the sides of trench 66, and thus cross-talk to filter 20B will be eliminated and/or attenuated. In one example, trenches 66 are formed and/or etched in integrated circuit back-end 40 before adding filters and/or suppressing filters using a photolithographic process. In one example, the trench 66 may be filled with another material or left as a void, with light being reflected or refracted at the sidewalls of the trench 66. In another example, the depth and width of the grooves 66 may be adjusted to provide attenuation at a particular angle of incidence, so that more or less crosstalk attenuation is performed as desired.
Fig. 13 provides a top-down illustration of a filter array with a shielding grid 110 for attenuating crosstalk between the filter and optical sensor pairs. In the example of fig. 13, incident light on filters 20A, 20D, 20E, etc. is blocked at the shielding grid 110 to provide a buffer between the filters such that the filters are at least partially isolated from each other. In one example, the shielding grid 110 may be an opaque or translucent material, or any other sufficiently absorbing material, deposited or lithographically defined at the edges of the filters 20A, 20D, 20E, etc. In another example, the shielding grid 110 may be composed of a reflective material (such as Al and/or AlSi). In one example, the shielding grid 110 may be disposed above or below the filters 20A, 20D, 20E, etc.
In certain embodiments, an image sensor (such as sensor 10 in fig. 9-13) may be configured to provide an ineffective space or gap between a single optical sensor and/or an optical sensor assembly of the image sensor. Dead space may provide isolation between optical sensors to reduce cross-talk between optical sensors. In the related example shown in fig. 14, intermediate element 36 is located below the intersection of adjacent filters 20A and 20B and between photosensitive elements 34. In one example, the intermediate element 36 is a dead space between the optical sensors of the image sensor. In another example, the intermediate element 36 and the photosensitive element 34 are both located in the dead space between the optical sensors of the image sensor. In a specific example of an implementation, crosstalk may be measured using one or more responses from photosensitive element 34, and in a related example, the filter response to the measured crosstalk may be corrected using one or more responses from photosensitive element 34.
Referring to fig. 1, the repeating mosaic pattern necessarily maximizes the number of transitions between filter bands (where filters configured to pass light of the same wavelength range belong to the same filter band). Fig. 15 provides an illustration of a filter structure that mirrors similar filter bands in adjacent filter mosaics to reduce the number of transitions from one filter band to another. In this example, the patterns of the 4 three-filter mosaics 1 to 4 are modified so that the filters 20A are adjacent to each other. In one example, crosstalk is reduced relative to a typical repeating pattern because the number of transitions is reduced.
In one specific example of an implementation, an exemplary sensor system having 4 sets of interference filters includes multiple sets of interference filters, each including multiple interference filters arranged in a pattern, wherein the pattern of each of the 4 sets of interference filters is modified such that the 4 interference filters configured to pass light within the same wavelength range are adjacent to each other at four points (quadpoints). In another specific example of the embodiment, 2 sets of interference filters of the plurality of sets of interference filters include a plurality of interference filters arranged in a pattern, wherein the pattern of each of the 2 sets of interference filters is modified such that the 2 interference filters configured to pass light in the same wavelength range are adjacent to each other around a center line between the 2 sets of interference filters.
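A hypothetical sketch of such a mirrored layout is shown below: every other copy of the base mosaic is flipped so that filters of the same band meet at tile boundaries and quad points, as in fig. 15. The tile contents and helper name are illustrative only.

```python
import numpy as np

def mirrored_mosaic(tile: np.ndarray, reps_y: int, reps_x: int) -> np.ndarray:
    """Tile a filter mosaic so identical bands meet at tile boundaries.

    Instead of repeating `tile` directly (which maximizes band-to-band
    transitions), every other copy is mirrored horizontally and/or vertically,
    so filters of the same band become adjacent across shared edges and corners.
    """
    rows = []
    for ry in range(reps_y):
        row = []
        for rx in range(reps_x):
            t = tile
            if rx % 2:
                t = t[:, ::-1]   # mirror left-right on odd columns
            if ry % 2:
                t = t[::-1, :]   # mirror top-bottom on odd rows
            row.append(t)
        rows.append(np.concatenate(row, axis=1))
    return np.concatenate(rows, axis=0)

# Example: mirrored_mosaic(np.array([["A", "B"], ["C", "D"]]), 2, 2) brings the
# four "D" filters of the 2 x 2 tiles together at the shared corner (quad point).
```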
In one embodiment, the sensor system includes a plurality of optical sensors, one or more of which are used for auto-focusing. In one specific example of an implementation, one set of interference filters of the plurality of sets of interference filters is adapted to position a particular one of the plurality of interference filters on top of one or more optical sensors for auto-focusing.
In another embodiment, a sensor system includes a plurality of optical sensors and a plurality of sets of interference filters disposed on opposite sides of an integrated circuit. In this example, the opposite side of the integrated circuit is the side opposite the wiring. In one example, the sensor system includes a back-illuminated image sensor. Back-illuminated sensors, also known as backside illumination (BSI or BI) sensors, use a novel arrangement of the imaging elements on the back side of the integrated circuit containing the image sensor in order to increase the amount of light captured and thereby improve low-light performance. The increased light capture arises, at least in part, because in a front-illuminated arrangement the matrix of individual picture elements and their wiring reflects some of the light, so the sensor 10 would receive only the remaining incident light; the reflection reduces the signal that can be captured.
FIG. 16 provides a graphical representation of the color matching functions of the CIE XYZ standard observer (source: https://en.wikipedia.org/wiki/CIE_1931_color_space). The color matching functions may be considered the spectral sensitivity curves of three linear photodetectors that produce the CIE tristimulus values X, Y and Z, where Y is luminance, Z is quasi-equal to blue (the S-cone response), and X is a mix of response curves chosen to be non-negative. In one embodiment, the sensor system of figs. 1-16 may include at least some of the plurality of interference filters in a set of interference filters adapted to provide absolute color measurements (such as the CIE tristimulus values X, Y and Z) when paired with an image sensor including a plurality of optical sensors. In one example, the absolute color measurement is a measurement that includes both luminance and chrominance.
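For illustration, the tristimulus values can be obtained by projecting a sampled spectrum onto the three standard-observer curves. The sketch below assumes the color matching functions have already been resampled at the spectral channels' wavelengths; the sampling step and names are illustrative.

```python
import numpy as np

def spectrum_to_xyz(spectrum: np.ndarray, cmf: np.ndarray, delta_lambda_nm: float = 1.0) -> np.ndarray:
    """Project a sampled spectrum onto the CIE XYZ color matching functions.

    spectrum: (N,) spectral power measured at N wavelength samples.
    cmf:      (N, 3) x-bar, y-bar, z-bar standard-observer curves sampled at
              the same wavelengths (the curves of fig. 16).
    Returns the unnormalized tristimulus values (X, Y, Z); Y carries luminance,
    so the result contains both luminance and chrominance information.
    """
    return delta_lambda_nm * (spectrum @ cmf)   # Riemann-sum integration over wavelength
```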
Fig. 17 provides a top-down illustration of the CIE/XYZ mosaic structure in a bayer pattern. In this example, the interference filters 20A-20D are patterned to form true color sensors. The bayer pattern (sometimes referred to as bayer mosaic or bayer filter mosaic) is an array for arranging color filters on a square grid of an optical sensor.
Fig. 18A provides a cross-section of adjacent fabry-perot filters 20A through 20F overlaid by an optical element 80. In one example, the optical element is associated with an array of one or more filters 20A-20F. Fig. 18B and 18C illustrate the incorporation of optics on a sub-array (stripe) of a multi-spectral array. In fig. 18B, a single optic 130 is located on a sub-array of filter array 120 or band (of filters 1 through 16), while in fig. 18C, each of the 3 optics 140, 142, and 144 is located over a different repeating sub-array. For example, filter sub-array 124 includes filters 1 through 9 (band 1), while filter sub-array 122 includes filters 10 through 13 (band 3), and filter sub-array 126 includes filters 14 through 16 (band 2) of a larger array. In one specific example of an implementation, a sensor system includes a plurality of optics over a plurality of optical sensors, wherein each lens of the plurality of optics is associated with one or more sets of interference filters, which are themselves associated with one of the plurality of optics. In one example, the optics include a lens and a low-pass optical element. In one example, the low pass optical element of the optic is a diffuser, and in another example, the low pass optical element is located at a predetermined distance from the plurality of sets of interference filters to produce a blurred image of a predetermined blur size on the plurality of optical sensors. In different examples, 2 or more of the plurality of optics (such as the 3 optics shown in fig. 18C) overlap a portion of the larger array such that each of the 2 or more optics covers a portion of the larger array. In another specific example, the optical element 80 may comprise an angle element, wherein the angle element is configured to select an input angle at which light propagates to the one or more sensors. In yet another specific example, the optical element 80 may comprise an optical lens configured to rotate or tilt. Examples include optical image stabilization, lens rotation to change the polarity of the propagating light, and/or another mechanical lens motion.
Fig. 18D is a cross section of an interference filter sub-array with associated optics. In one example of implementation and operation, the system includes multiple optical sensors on an integrated circuit with multiple sets of interference filters (such as filter banks 184A and 184B). In this example, a set of interference filters (such as filter banks 184A and 184B) are configured to pass light within a predefined spectral range, with each of the plurality of interference filters configured to pass light within a different wavelength range. In one example, the system includes one or more optical elements (such as lenses 176A-176D), where each optical element is associated with at least one set of interference filters (such as filter banks 184A and 184B) to provide an optics and interference filter bank pair. In another example of an implementation, some of the one or more sets of interference filters are fabry-perot filters.
In one example of an implementation, the one or more optical elements include a filter (such as filter 178 in fig. 18D) and lenses (such as lenses 176C and 176D) to focus the image onto a set of pixels below filter bank 184B. In one example, filters 178A and 178B are suppression filters adapted to suppress unwanted out-of-band light. In another example, the one or more optical elements include more than one lens element (such as lenses 176C and/or 176D). In one example, the baffle 174 is configured to support the lenses 176A-176D while isolating light incident on pixels below a given filter bank. In this example, each optical element and interference filter pair contains a sub-imager having pixels located below a filter bank, where multiple sub-imagers are also configured to provide spectral information for a given scene in different spectral ranges. In yet another example, the optical device is an all-optical system.
In one example of implementation and operation, a first optical element and interference filter pair is configured to pass light in the Ultraviolet (UV) spectrum, a second optical element and interference filter pair is configured to pass light in the Infrared (IR) spectrum, and a third optical element and interference filter pair is configured to pass light in the visible spectrum. In another example of another embodiment, some of the plurality of optical sensors are not associated with any type of filter, allowing for a full color response.
In another example of an implementation, a suppression filter associated with an optical element is integrated on an integrated circuit using semiconductor processing techniques. In another example, wafer-level optics (such as microlenses) are used to fabricate some or all of the plurality of optical elements.
In one specific example of an implementation, the lens may be configured to defocus to produce a blurred image having a predetermined blur size and then focus at the plurality of optical sensors to produce a focused image. In a related example, the focused image is a high resolution color image and the blurred image is a low resolution color balanced image. In another related example, the blurred image is used to provide a representative spectral response of the scene, wherein the representative spectral response includes spectral responses of a plurality of spatial regions of the scene. In yet another example of an embodiment, an optical lens is focused to form a high resolution color image using a color sensor of an imager and defocused to form a low resolution white balance image using a spectral sensor. Exemplary optical lenses include compound lenses, fresnel lenses, multifocal fresnel lenses, molded lens arrays, etc., and may be mechanically and/or electronically focused. The lenses may be integrated on the silicon wafer during manufacture or may be coated and/or assembled on the finished image sensor. In one example, defocusing of the optical lens may be done automatically when an image is taken, or manually if the user selects a white balance capture mode as needed or desired.
Fig. 19A shows an imaging system incorporating a high resolution imager and a low resolution imager, while fig. 19B shows an imaging system incorporating a high resolution imager and two low resolution imagers. In this example, the spectral sensor 170 is configured to provide a low resolution spectral image of a scene, while the image sensor 172 is configured to provide a high resolution image of the same scene. In this example, the response from the spectral sensor 170 may be used to provide color balancing of the spatial region of the scene imaged with the image sensor 172. The imaging system may include one or more processors for using spectral responses from different spatial regions of a scene to process color balance of the same scene imaged with the image sensor 172.
In one example of an implementation, a sensor system includes a first group of optical sensors associated with a plurality of sets of interference filters, wherein a set of interference filters includes a plurality of interference filters arranged in a pattern. In one example, each of the plurality of filters is configured to pass light of a different wavelength range, and each set of the plurality of interference filters is associated with a spatial region of the scene. In this example, the optical sensors of the second group are configured to output an image; and the one or more processors generate spectral responses for the plurality of spatial regions of the scene from the first group of optical sensors, and the image is output by the second group of optical sensors.
In one example, a spectral bandpass response is extracted from a set of filters using a demosaicing process. The demosaicing process may be enabled using one or more processors that use algorithms or digital image processing to reconstruct bandpass responses from optical sensors associated with individual filters of a set of filters. In an example of an optical sensor having two groups interspersed therein, spectral information may be retrieved from a subset of filters in the interspersed groups or arrays using a demosaicing process.
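A very simple stand-in for that demosaicing step is sketched below: for one filter band, known samples are kept and missing samples are filled with a local average, producing a full-resolution plane per band. The neighbourhood size and function name are assumptions; practical pipelines use more sophisticated interpolation.

```python
import numpy as np

def demosaic_channel(raw: np.ndarray, mask: np.ndarray, k: int = 5) -> np.ndarray:
    """Reconstruct one bandpass channel from a mosaiced readout.

    raw:  (H, W) raw sensor frame.
    mask: (H, W) boolean array marking pixels covered by the filter of interest.
    k:    neighbourhood size, roughly the mosaic repeat period.
    Missing samples are filled with the mean of known samples in a k x k window.
    """
    values = np.where(mask, raw, 0.0).astype(float)
    weights = mask.astype(float)
    pad = k // 2
    pv, pw = np.pad(values, pad), np.pad(weights, pad)
    acc = np.zeros_like(values)
    norm = np.zeros_like(values)
    for dy in range(k):
        for dx in range(k):
            acc += pv[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]
            norm += pw[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]
    filled = acc / np.maximum(norm, 1e-9)
    return np.where(mask, raw.astype(float), filled)
```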
In one example of an implementation, the optical sensors of the second group are configured to produce a higher resolution image, while the optical sensors of the first group provide a lower resolution spectral response. The one or more processors modify a high resolution image of the scene based on the spectral responses of the included spatial regions using the low resolution spectral responses of at least some of the spatial regions of the scene. In one example, the modification of the high resolution image includes a color correction of the included spatial region of the image of the scene.
In another example of an implementation, the one or more processors utilize a spectral response of a spatial region of the scene to classify one or more materials in the scene. In an exemplary application, a low spatial resolution but high spectral resolution sensor image is combined with a high spatial resolution but low spectral resolution sensor image. In another embodiment, the optical sensors of the first group including low resolution spectral sensors provide spectral information of objects in the optical sensors of the second group including high resolution sensors. The spectral information may include information sufficient to determine a characteristic of the object, such as a material composition. The spectral information may further help identify the object type. Exemplary applications may include, for example, skin sensing, water or oxygen detection, food analysis, quality inspection, plant analysis, and drone monitoring.
In one example of an implementation, the first group of optical sensors and the second group of optical sensors are adjacent to each other, and in another example, the first group of optical sensors are adapted for use when in contact with one or more objects of the scene, while the second group of optical sensors are configured not to be in contact with the one or more objects. In another example, the first group of optical sensors is located on a different image sensor than the second group of optical sensors. In yet another example, the first group of optical sensors and the second group of optical sensors are located on a common image sensor, wherein each sensor of the first group of optical sensors is distributed among the optical sensors of the second group of optical sensors. In related examples, the sensor system includes one or more individual optical sensors of the first group of optical sensors and a plurality of optical sensors of the second group of optical sensors, wherein each of the optical sensors of the first group is associated with the plurality of optical sensors of the second group of optical sensors.
In one specific example of an implementation, the one or more processors are to approximate an output of one or more optical sensors of the second group of optical sensors from an output of the first group of optical sensors to produce an approximated output. Also, in yet another example, the approximated output is one of a red, green, or blue sensor output. In yet another example, the output of an optical sensor of the second group that is missing from the mosaic pattern of a subset of the second group of optical sensors is replaced with the approximated output derived from the optical sensors of the first group. In one example, an optical sensor may be missing because, for example, its position in the mosaic pattern has been given over to an optical sensor used for spectral sensing.
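As a toy illustration of how such an approximation might be formed, the sketch below projects the narrow-band readings of a spectral pixel onto assumed red, green, and blue sensitivities; the band centres and weights are purely illustrative and are not calibrated values from the disclosure.

```python
import numpy as np

# Band centres reported by the spectral pixel (nm) -- illustrative values only.
band_centres = np.array([450.0, 500.0, 550.0, 600.0, 650.0])

# Assumed relative sensitivity of the missing R, G and B channels at those
# band centres (rows: R, G, B).  A real system would use the measured colour
# filter responses of the image sensor.
rgb_weights = np.array([
    [0.02, 0.05, 0.20, 0.60, 0.90],   # red
    [0.10, 0.45, 0.90, 0.40, 0.10],   # green
    [0.85, 0.55, 0.15, 0.05, 0.02],   # blue
])

def approximate_rgb(spectral_values):
    """Approximate the R, G, B outputs that a mosaic pixel replaced by a
    spectral sensing element would have produced, as a weighted sum of
    the narrow-band spectral readings."""
    s = np.asarray(spectral_values, dtype=float)
    rgb = rgb_weights @ s
    return rgb / rgb_weights.sum(axis=1)          # normalise the weights

print(approximate_rgb([0.2, 0.4, 0.8, 0.6, 0.3]))
```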
In other examples, additional functionality is combined with the imaging system of fig. 19A and 19B. For example, the imaging system may be adapted to collect information to generate a three-dimensional map of at least a portion of the image, wherein, for example, the three-dimensional map may be used to determine approximate locations and interactions of objects in the scene. Still other examples include adjusting the imaging system to classify material associated with one or more objects in the scene. Materials may be classified based on illumination, chromaticity, or other phenomena.
In a related example, one or more 3D sensors adapted to output data associated with a scene may be included in an imaging system, wherein the imaging system is further adapted to generate a three-dimensional map based on the output data. In additional examples, the one or more 3D sensors include a 3D point sensor and/or a 3D image sensor.
In other examples of additional functions of the imaging system in connection with fig. 19A and 19B, the imaging system is adapted to collect time-of-flight information of one or more objects in a scene, wherein the time-of-flight information is used to determine an approximate location of the one or more objects in the scene. In an example of implementation, the time-of-flight information is determined by modulating an illumination signal and detecting its return from one or more objects in the scene. In another example, the time-of-flight information is used to further modify the image output by the high resolution sensor, wherein the image output is further modified to correct for optical distortion, e.g., based on distance from the sensor system. In this example, the imaging system may include various lenses that may benefit from particular optical distortion corrections. Examples include, but are not limited to, wide-angle lenses, ultra wide lenses, fish-eye lenses, telephoto lenses, and others.
Fig. 20A provides a top-down illustration of pixel array 172 with adjacent filter mosaics 182 on sensor 10. The pixel array 172 may be a standard pixel array in which the filter mosaic 182 contains spectral sensors. In one example, N filter mosaics 182 may enclose the pixel array 172, where N is any of 2 to 4.
Fig. 20B provides a top-down illustration of a sensor system having an image sensor and a spectral sensor. In one example, the image sensor 182 is a typical high resolution Red Green Blue (RGB) sensor, and the spectral sensor 192, which has relatively low spatial resolution, is adjacent to the high resolution image sensor 182 such that the field of view of the spectral sensor 192 substantially overlaps that of the high resolution image sensor 182, so that both sample the same scene and/or object. In one example, the high resolution image sensor 182 may comprise any of a variety of conceivable image sensors, such as image sensors based on a Y channel approximately correlated to perceived intensity and U and V channels that provide color information (YUV). In another example, the image sensor 182 is based on four channels: cyan, magenta, yellow, and black (CMYK). In yet another example, the image sensor 182 may be adapted to detect light outside the visible spectrum. In an alternative example, the image sensor 182 is adapted to provide relative light intensity for the sampled scene without separating the color wavelengths.
In one example, the spectral sensor 192 is comprised of a plurality of interference filters (such as fabry-perot filters) in a repeating mosaic pattern, wherein the plurality of interference filters N are associated with a plurality of image sensor pixels P such that each interference filter is associated with a plurality of image sensor pixels of the image sensor 182. In one specific example of implementation and operation, the spectral sensor 192 uses a sensor pattern of individual sensor elements that is substantially identical to the sensor pattern of the image sensor 182, with each interference filter N of the spectral sensor 192 being associated with a plurality of individual sensor elements. In this example, the spatial resolution of the spectral sensor 192 is lower than the spatial resolution of the image sensor 182 due to P > N. In an alternative example, each interference filter N of the spectral sensor 192 is associated with a single sensor element, and both the spatial resolution of the spectral sensor 192 and the absolute resolution of the spectral sensor 192 are lower than the resolution of the image sensor 182.
Fig. 20C is a flowchart illustrating an exemplary method for providing a high resolution spectral image of a scene by an imaging system. The method includes step 600, in which a received spectrum is sampled using a spectral sensor (e.g., spectral sensor 192 in fig. 20B). At step 612, at least a portion of the scene is imaged using an image sensor (e.g., image sensor 182 in fig. 20B), and at step 614, the output of the spectral image sample is combined with the output of the image sensor to produce a high resolution spectral image. In one example, the two images are correlated or calibrated so that they are mapped to the same scene. In a related example, any parallax between the sensor images is corrected. In an alternative example, the scene is imaged using the image sensor and then sampled using the spectral sensor. In one example, one or more modules of one or more processors may be used to combine the image sensor output with the spectral sensor output. In one example, the spectral sensor 192 is comprised of a plurality of interference filters (such as fabry-perot filters) in a repeating mosaic pattern, wherein the plurality of interference filters N are associated with a plurality of image sensor pixels P such that each interference filter is associated with a plurality of image sensor pixels of the image sensor 182. In one specific example, the spatial resolution of the spectral sensor is less than or equal to the spatial resolution of the image sensor.
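One possible way to picture the combination of step 614 is sketched below in Python; it assumes the spectral and image outputs are already registered to the same scene with an integer resolution ratio and simply upsamples the coarse spectral cube onto the image grid, a stand-in for whatever fusion the system actually performs.

```python
import numpy as np

def fuse_spectral_and_image(rgb, spectral_cube, scale):
    """Combine a high resolution RGB image with a low resolution spectral
    cube sampled over the same (already registered) scene.

    rgb           : (H, W, 3) image from the image sensor.
    spectral_cube : (H // scale, W // scale, n_bands) spectral samples.
    scale         : integer ratio between the two spatial resolutions.
    Returns an (H, W, 3 + n_bands) array, i.e. a 'high resolution spectral
    image' in which each pixel carries both its RGB value and the
    (nearest-neighbour upsampled) spectral response of its region.
    """
    up = np.repeat(np.repeat(spectral_cube, scale, axis=0), scale, axis=1)
    return np.concatenate([rgb, up], axis=2)

rgb = np.random.rand(8, 8, 3)
cube = np.random.rand(2, 2, 16)      # 16 spectral bands at 1/4 resolution
print(fuse_spectral_and_image(rgb, cube, 4).shape)   # (8, 8, 19)
```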
The method continues at step 616 with correcting the output of the image sensor using the spectral image. In one example, the output correction may be accomplished using one or more modules of one or more processors. In one example, the output correction may include normalizing (neutralizing) the illumination such that the corrected high resolution spectral image is substantially neutral with respect to the corresponding illuminant or "white point," or referenced to a standardized illuminant (e.g., CIE D65). The method continues at step 618 with classifying the scene or object using the corrected high resolution spectral image. In one example, the scene or object may be any skin and/or tissue discussed herein. In another example, the scene or object may be any other surface. In related examples, classification may be accomplished using neural networks with or without inference engines. In one example of an embodiment, specific skin parameters such as blood, oxygen, or water content may be determined to monitor or predict the status of a wound or abscess. In one example, a processing unit or neural network may be trained based on a priori knowledge of a scene or object (e.g., a face). In one example, corrected high resolution spectral images may be collected over a period of time for purposes such as monitoring the healing of a wound or abscess. In another example, the image sensor output may be used to provide a profile of the wound, while the spectral sensor output may be used to qualitatively evaluate the area within the wound profile. In one example, the qualitative assessment of the wound may include further information about the wound and/or the area near the wound, such as the presence of various substances and/or the concentrations of those substances.
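A minimal sketch of the illuminant-normalization part of step 616 is given below, assuming a per-region illuminant estimate (e.g., derived from the spectral sensor response) is available in the sensor's RGB space; the simple diagonal (von Kries-style) scaling shown here is only one of many possible corrections.

```python
import numpy as np

def neutralize_region(rgb_region, illuminant_rgb, target_rgb=(1.0, 1.0, 1.0)):
    """Diagonal (von Kries style) correction of one spatial region.

    rgb_region     : (H, W, 3) pixels of the region.
    illuminant_rgb : estimated illuminant of that region, e.g. derived from
                     the spectral sensor response (assumed to be available).
    target_rgb     : white point to map the illuminant to; (1, 1, 1) gives a
                     neutral image, or use the RGB of a standardized
                     illuminant such as CIE D65 rendered through the sensor.
    """
    gains = np.asarray(target_rgb) / np.asarray(illuminant_rgb)
    return np.clip(rgb_region * gains, 0.0, 1.0)

region = np.random.rand(4, 4, 3)
corrected = neutralize_region(region, illuminant_rgb=(0.9, 1.0, 0.7))
print(corrected.shape)   # (4, 4, 3)
```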
Fig. 20D is a flow chart illustrating an exemplary method for providing a differential image with and without active illumination. The method includes step 800, in which a scene is illuminated using one or more active light sources. At step 812, the spectrum of the scene is sampled while the scene is in an active illumination state. At step 814, the one or more active light sources are removed, and at step 816, the spectrum of the scene is sampled when the scene is not in an active illumination state (i.e., is in an inactive illumination state). In one example, when not in the active illumination state, the scene is illuminated by a natural light source. The method continues at step 818, wherein the outputs of the spectral sensor during active and inactive illumination are used to provide a scene differential sampled spectrum. At step 820, the illuminant for the scene is spectrally corrected using the differential sampling.
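The differential sampling of steps 812-820 can be pictured with the following sketch, which assumes the emission spectrum of the active source is known and that the two samples are aligned; the division-based reflectance estimate is an illustrative simplification rather than the disclosed correction.

```python
import numpy as np

def differential_spectrum(active, ambient, flash_spectrum):
    """Flash / no-flash differential sampling.

    active         : spectral sample with the active source on
                     (known flash + uncontrolled ambient light).
    ambient        : spectral sample with the active source off.
    flash_spectrum : known emission spectrum of the active source.
    The difference isolates the scene as lit only by the known source;
    dividing by that source's spectrum gives a rough scene reflectance,
    from which the uncontrolled ambient illuminant can be estimated.
    """
    diff = np.asarray(active, float) - np.asarray(ambient, float)
    reflectance = diff / np.maximum(np.asarray(flash_spectrum, float), 1e-9)
    ambient_illuminant = np.asarray(ambient, float) / np.maximum(reflectance, 1e-9)
    return reflectance, ambient_illuminant

refl, illum = differential_spectrum(
    active=[0.9, 1.1, 1.0], ambient=[0.4, 0.5, 0.6], flash_spectrum=[1.0, 1.0, 1.0])
print(refl, illum)
```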
In one example, the active illumination has predetermined properties, including one or more known spectral profiles, spectral radiance or brightness and polarization, such that the sampled image of the scene may be corrected for the known properties. In one example, the predetermined attribute may be used to obtain an image of the scene independent of any other uncontrolled lighting in the scene. In another example, a controlled and known image of the luminary brightness is sampled to obtain a measured illuminance. In a related example, the resulting illuminance represents a brightness value from which CIE LAB color coordinates can be calculated (by making measurements with the use of controlled illuminants). In yet another example, a distance between the object and the sensor is calculated using a depth sensor to determine the brightness value. In another example, depth information is retrieved from autofocus data provided by one or more spectral or image sensors. In another example, a controlled and known polarization illuminant image is sampled to obtain polarization information for a scene. In a related example, polarization information of at least a portion of a scene is measured using one or more polarizing elements (filters). In yet another example, the polarization information is used to determine a type of reflection, such as specular and diffuse, of an object in the scene.
In one example of operation and implementation, the active illumination may be a flash source, such as the flash provided in a typical camera system. In a related example, the camera system may be adapted to automatically sample the scene after (or before) the flash is activated, such that the method of fig. 20D is relatively transparent to the user. In one example, the properties of the one or more active light sources are substantially known, so that the scene is sampled under both known (active illumination) and uncontrolled (non-active illumination) spectra. In one example, the scene differential sampled spectrum can be used to substantially remove the uncontrolled illuminant from the uncontrolled sample image. In one example, with the illuminant corrected, the sampled spectrum of the scene may be used to provide a stable spectral image.
Fig. 20E is a flowchart illustrating an exemplary method for determining skin parameters. The method begins at step 620, where a spectrum of light propagating from an area of skin is sampled using a spectrum sensor. At step 622, at least a portion of the skin region is imaged using an image sensor (e.g., image sensor 182 in fig. 20B), and at step 624, the output of the spectral image sample is combined with the output of the image sensor to produce a high resolution spectral image. In an alternative example, the scene is imaged using an image sensor and then sampled using a spectral sensor. In one example, one or more modules of one or more processors may be used to combine the image sensor output with the spectral sensor output.
The method continues at step 626 with correcting the output of the image sensor using the spectral image. In one example, the output correction may be accomplished using one or more modules of one or more processors. In one example, the output correction may include normalizing (neutralizing) the illuminant such that the corrected high resolution spectral image is substantially neutral. In one example, the skin sample is illuminated using an illumination source of a predetermined wavelength, and in another example, the illumination source is natural light. In another example, the wavelength and intensity of the illumination source are determined prior to sampling the propagated spectrum in order to compensate for non-ideal illumination of the skin region. The skin region may be all or part of a spatial region imaged using the mobile device image sensor. The method continues at step 628, where the propagated spectrum is compared to a reference spectrum. In one example, the reference spectrum is predetermined based on data previously collected from the skin region. In another example, the reference spectrum is based on empirical data and/or crowd-sourced data.
The method continues at step 630 by determining relative absorbance at one or more detection wavelengths based on a comparison between the propagated spectrum and a reference spectrum. In one example, the detection wavelength is a wavelength associated with a particular skin and/or tissue parameter (e.g., skin hydration and/or sebum). In one example of operation, the treatment device is adapted to determine a skin parameter percentage (%), such as a hydration percentage (%) and/or a sebum percentage (%), based on the relative absorption at the detection wavelength.
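A hedged sketch of the absorbance comparison in steps 628-630 follows: relative absorbance is computed per band as -log10(I/I_ref), and a percentage is read off between two assumed calibration end points. The detection wavelengths and end points shown are illustrative only and are not values from the disclosure.

```python
import numpy as np

# Illustrative detection wavelengths (nm); a real system would use the
# parameter-specific bands of its skin/tissue model.
detection_wavelengths = {"hydration": 1450, "sebum": 1730}

def relative_absorbance(propagated, reference):
    """Relative absorbance A = -log10(I / I_ref) per spectral band."""
    i = np.asarray(propagated, dtype=float)
    i_ref = np.asarray(reference, dtype=float)
    return -np.log10(np.maximum(i / np.maximum(i_ref, 1e-9), 1e-9))

def parameter_percent(absorbance, a_min, a_max):
    """Map an absorbance value to a 0-100 % level between two assumed
    calibration end points (a_min -> 0 %, a_max -> 100 %)."""
    return float(np.clip((absorbance - a_min) / (a_max - a_min), 0.0, 1.0) * 100.0)

a = relative_absorbance(propagated=[0.42], reference=[0.80])
print(parameter_percent(a[0], a_min=0.1, a_max=0.6))   # roughly 36 %
```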
At an optional step, the determined skin parameter percentage (%) may be output for display on a mobile device (e.g., a smart phone), where the mobile device displays the percentage as a level indicator for the spatial region of the scene or object imaged by the image sensor. For example, for a larger skin region, a level indicator of one or more skin parameters may be displayed within each of a plurality of spatial regions of the scene or object image. In another example, one or more spatial regions of the scene or object image may include a potential skin aberration, in which case a comparison indicator of the potential skin aberration against one or more skin parameters of unaffected skin is displayed. In this example, the comparison indicator may provide diagnostic information relative to the potential skin aberration.
Fig. 20F is a flow chart illustrating an exemplary method for detecting and classifying skin aberrations. The method begins at step 822 with illuminating an area of skin using one or more active light sources. At step 824, the spectrum of the skin region is sampled while the scene is in an active illumination state. At step 826, the one or more active light sources are removed (deactivated), and at step 828, the spectrum of the skin region is sampled when the scene is not in an active illumination state (i.e., is in an inactive illumination state). In one example, the scene is illuminated by a natural light source while the active light source is deactivated. The method continues at step 830, where the outputs of the spectral sensor during active and inactive illumination are used to provide a scene differential sampled spectrum. At step 832, the illuminant for the scene is spectrally corrected using the differential sampling.
In one example of operation and implementation, the active illumination may be a flash source, such as the flash provided in a typical camera system. In a related example, the camera system may be adapted to automatically sample the scene after (or before) the flash is activated, such that the method of fig. 20F is relatively transparent to the user. In one example, the properties of the one or more active light sources are substantially known, so that the scene is sampled under both known (active illumination) and uncontrolled (non-active illumination) spectra. In one example, the scene differential sampled spectrum can be used to substantially remove the uncontrolled illuminant from the uncontrolled sample image. In one example, with the illuminant corrected, the sampled spectrum of the scene may be used to provide a stable spectral image.
The method continues at step 834 by comparing the corrected spectral information of one or more spatial regions of the skin region to a reference spectrum. In one example, the reference spectrum is based on a spectrum previously collected from the spatial region. The method continues at step 836 with classifying the spatial region based on the reference spectrum. In one example, the classification is further based on changes in the one or more spatial regions as compared to a previously collected spectrum. In another example, the classification is based on a comparison to a known and/or predetermined spectrum, wherein the known and/or predetermined spectrum is associated with one or more skin conditions and/or diseases. The known and/or predetermined spectra may be stored locally or retrieved from an external database. In one specific example, the classification is determined using a trained neural network and/or a cognitive computing engine (either of which may be hosted locally on the spectral sensor/mobile device or accessed by the mobile device through a network).
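As a simple stand-in for the trained neural network or cognitive computing engine mentioned above, the sketch below classifies a spatial region by its spectral angle to a small set of reference spectra; the reference values and labels are invented for the example.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two spectra; a small angle means
    the spectral shapes are similar regardless of overall brightness."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_region(region_spectrum, reference_spectra):
    """Assign the region to the reference whose spectrum it most resembles
    (a crude substitute for the trained classifier the text mentions)."""
    return min(reference_spectra,
               key=lambda name: spectral_angle(region_spectrum, reference_spectra[name]))

references = {
    "healthy": [0.30, 0.45, 0.60, 0.55],
    "aberration": [0.20, 0.25, 0.30, 0.50],
}
print(classify_region([0.28, 0.44, 0.58, 0.56], references))   # "healthy"
```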
The method continues at step 838 where the processor determines whether the spatial region classification indicates a disease, skin condition, or other aberration, and when the classification indicates a disease, skin condition, or other aberration, the processor generates an alert and/or recommends taking a proposed action for the disease, skin condition, or other aberration at step 840. In one example, the classification may include an indication of a disease or skin condition, such that the processor determines whether to generate and transmit an alert or recommend action based on the indication. If the spatial region classification does not indicate a problem, the method will return to step 822. For example, skin aberrations may include healthy and malignant nevi, skin melanoma, psoriasis, basal skin cancer, and almost any other skin disease.
In another example of implementation and operation, a first propagated spectrum is used as a reference spectrum, and a second propagated spectrum is compared to the first propagated spectrum, thereby classifying one or more spatial regions of skin. For example, the first propagated spectrum may be from a skin region of known healthy skin and the second propagated spectrum may be from a skin region having one or more potential skin aberrations. In another example, the first propagated spectrum may be from an earlier observation of the same skin region. In yet another related example, the first propagated spectrum may be from a skin region of known healthy skin and then used to calibrate spectrophotometric parameters for subsequent multiple parameter measurements. Also, in yet another example, the first propagated spectrum may be from an area of skin having a skin aberration (e.g., a wound), diseased or infected skin, and the second propagated spectrum is used to determine changes in the skin aberration, where the changes may be used to indicate healing, deterioration, etc. of the aberration (e.g., infection).
In one related example of operation, the classification may include using a first propagated spectrum as a reference spectrum, where the first propagated spectrum is from a known healthy skin region, and using a second propagated spectrum to determine changes in specific skin parameters, such as skin color or other skin spectral differences, and to classify skin aberrations or other skin features. For example, the difference between a measurement of healthy skin and a measurement of a potentially problematic skin area can, at least to some extent, aid in identifying a problematic skin nevus or potential skin melanoma.
In one example, the classification or suggested measures may be determined based at least in part on one or more of a specific skin type of the user, genetic information related to the user, hair color, eye color, and based at least in part on a change over time or a single sample. In one example, the collected classification information may be shared with a crowdsourcing database for training a neural network and/or a cognitive computing engine.
In one example, the method of fig. 20F may be initiated by a user when particularly needed or automatically performed when imaging an area of skin. In a related example, the method may be implemented in the form of a background operation or triggered after a predetermined period of time has elapsed. In another example, the body surface includes at least a portion of a user's eye, wherein the processing device is adapted to determine a Near Infrared (NIR) spectrum of the eye. In one example, the NIR spectrum may be used to assist in biometric analysis of the user in addition to obtaining normal visible information through an iris reader.
In one specific example of the implementation and operation associated with fig. 20E and 20F, the spectral sensor may be used in combination with other diagnostic mechanisms for determining health parameters. In one example, a contact lens (or any other device configured to maintain physical contact) incorporating a passive sensor (e.g., a hydrogel) for glucose detection may be worn, where the passive sensor is adapted to produce a color change in response to detected glucose that can be measured spectroscopically. In one example, a user may evaluate glucose levels by taking a spectral image of the eye. In one example, the spectrally estimated glucose level may then be correlated with the user's blood glucose level. In one example, a mobile device camera may be used to provide the spectral image, and in another example, an eye-facing camera may be installed in smart glasses for manual or semi-continuous monitoring of glucose levels. In one example, other health parameters such as lactate levels may be assessed.
Fig. 20G is a flowchart illustrating an exemplary method for classifying skin type for skin treatment by an imaging system (e.g., the imaging system of fig. 20B). The method begins at step 842, where an area of skin is illuminated using one or more active light sources. At step 844, the spectrum of the skin region is sampled while the scene is in an active illumination state. At step 846, the one or more active light sources are removed (deactivated), and at step 848, the spectrum of the skin region is sampled when the scene is not in an active illumination state (i.e., is in an inactive illumination state). In one example, the scene is illuminated by a natural light source while the active light source is deactivated. The method continues at step 850, where the outputs of the spectral sensor during active and inactive illumination are used to provide a scene differential sampled spectrum. At step 852, the illuminant for the scene is corrected using the differential sampled spectrum.
The method continues at step 854 with determining, using the processing device, a skin type of the skin via the one or more processing modules. In one example, as described in more detail below, the skin type may be determined as a measure of parameters such as melanin content and skin color within the area of skin.
The method continues at step 856 with outputting skin type information to the user and/or third party resources using the one or more processing modules. In one example, the skin type information may be displayed on an associated mobile device, and in yet another example, the skin type information may take the form of a reference identification code, such as a numeric code or another simple identifier, for use by the user. For example, the skin type information may be displayed as a base skin tone, with letters and numbers indicating gradations within the range of the base skin tone. For example, the base skin tone may be identified as "fair", "light", "medium" or "dark", with gradations indicated by the numerals 1-5. The skin type information may also include a skin undertone, such as cool, warm, or neutral, within a range of base skin types. Other options for the display of skin type information include bar codes or other code representations that can be used to match the skin type information to a reference source. In related examples, the skin type information may include additional skin factors such as hydration level, dryness, roughness, oiliness, and desquamation, and combinations thereof.
The method then continues at step 858, where the user may choose to perform a skin treatment and/or be provided with advice via third party resources based on the skin type information. The skin type information can be used, for example, to select skin care products, cosmetics, and moisturizing creams. In one example, the skin treatment may include one or more of a certain type, brand, and dose of cosmetics, sunscreen of a particular Sun Protection Factor (SPF), and/or clothing for the hair and/or skin. When the method of fig. 20G is applied to made-up and/or otherwise treated skin, the skin type information may also be used to make changes to the makeup and/or other treatments to correct the cosmetic application. In one example, the skin type information may be used to provide a recommended skin treatment method, and after the skin treatment method is applied, the effectiveness of the skin treatment used may be assessed and/or corrective measures provided by a second scan or analysis.
In one particular example, various skin parameters and levels, such as skin type, skin color, hydration, oiliness, and melanin concentration, may be determined in multiple skin "zones". The zone-based skin parameters may be used to adjust and/or optimize moisturizing creams, sunscreens, and cosmetics for each different skin region. In related examples, skin parameters such as skin color, hydration level, melanin concentration, etc., may be used to identify healthy and unhealthy skin segments (where unhealthy skin segments may have been infected) or the skin being healed. The skin parameters of one or more healthy zones may be used as a reference for determining the severity of infection, etc., and/or monitoring the skin healing process. In another example, the unhealthy skin segment may include a skin segment having a skin nevus or suspected melanoma. In this example, skin parameters of one or more healthy zones may be used as a reference to classify skin moles and/or identify melanoma.
Fig. 21A provides a block diagram of an imaging system incorporating a high resolution imager 204, a low resolution spectral sensor 214, and a processor 224 adapted to combine the outputs of the high resolution imager 204 and the low resolution spectral sensor 214. In an exemplary embodiment, the camera system includes an image sensor, such as the high resolution imager 204, that itself includes multiple sets of optical sensors, where each set of optical sensors includes multiple optical sensors arranged in a pattern. The camera also includes a spectral image sensor that itself includes a plurality of optical sensors and a plurality of sets of interference filters, each set of interference filters arranged in a pattern. Each of the plurality of filters is configured to pass light of a different wavelength range, and each set of the plurality of interference filters is associated with an area of a scene captured by the camera system. The camera system also includes one or more processors adapted to generate a spectral response for the plurality of regions of the scene from the spectral image sensor and combine the spectral response with the image output by the image sensor to generate a modified image.
Fig. 21B provides a block diagram of an imaging system incorporating a high resolution imager 200, a low resolution imager 210, and a processor 220 adapted to produce an image output 230. In an exemplary embodiment, the camera system includes an image sensor, such as the high resolution imager 200, that itself includes multiple sets of optical sensors, where each set of optical sensors includes multiple optical sensors arranged in a pattern. The camera also includes a spectral image sensor that itself includes a plurality of optical sensors and a plurality of sets of interference filters, each set of interference filters arranged in a pattern. Each of the plurality of filters is configured to pass light of a different wavelength range, and each set of the plurality of interference filters is associated with an area of a scene captured by the camera system. The camera system also includes one or more processors adapted to generate a spectral response for the plurality of regions of the scene from the spectral image sensor and combine the spectral response with the image output by the image sensor to generate a modified image.
In the embodiment of the camera system of fig. 21B, the modified image may be an image with at least partial correction for optical distortion in at least some of the plurality of regions of the image output. In another example, the camera system is adapted to determine a type of optical distortion of at least some of the plurality of regions of the image output. Examples of such distortion types include, but are not limited to, distortion caused by natural light and by various artificial light sources. The camera system may be further adapted to determine the frequency or duty cycle of a light source, such as whether a fluorescent light source is operating at 50Hz or 60Hz.
In another embodiment, the camera system of fig. 21B may be adapted to modify the image output for display on one or more of a liquid crystal display, an organic light emitting diode display, a quantum dot display (QD-OLED), or a plasma display based on the spectral response. In yet another embodiment, the camera system may be adapted to modify the image output based on the spectral response for a display lacking an internal light source or a display with a weak or selectable internal light source. Examples include displays used in high light environments such as outdoors or in locations where light is sufficient such as in a venue or an office environment with sufficient artificial light. In one example, a modified image may be provided to a display that does not have a light source, but reflects light from outside the screen to optimize display quality. For example, the spectral response may be used to adjust the liquid crystal display to reflect the corrected color image.
In various embodiments, the camera system of fig. 21 may be adapted to determine an illumination direction of one or more objects in a scene and correct or otherwise enhance the image based on the illumination direction. Embodiments include determining a type of optical distortion for at least some of the spatial regions of the image output and collecting information to generate a three-dimensional map of at least a portion of the image. In this example, any of the illumination direction, the type of light distortion, and the three-dimensional map may be used in combination to produce a further corrected and/or enhanced image. Still other embodiments include determining a type of illumination of one or more objects in the scene, and correcting the image of the light distortion based on the type of illumination, wherein the type of illumination may include one or more of white LED illumination, color LED illumination, phosphor source illumination, halogen light source illumination, and various types of natural (sun) illumination.
In other embodiments, the camera system of fig. 21 may include one or more intelligent agents that are capable of implementing cognitive functions such that the intelligent agents are used, at least in part, to generate modified/corrected images. In another example where the camera system of fig. 21 includes one or more intelligent agents, where the intelligent agents are capable of implementing cognitive functions, the intelligent agents may be used to determine lighting directions, generate a three-dimensional map, and/or determine light distortion types, for example.
FIG. 22 is a flow chart illustrating an exemplary method of correcting optical distortion in a scene by an imaging system. The method includes step 300, wherein sample spectra are received from a local spectral sensor set for the scene, and continued at step 310, wherein an average spectral response for each sample spectrum is determined. The method continues to: at step 320, the imaging system collects image data from the image sensor for the scene, and at step 330, the imaging system uses the average spectral response for the local regions to correct the image data for each respective local region of the scene.
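The flow of fig. 22 might be sketched roughly as follows, assuming a region map linking image pixels to the local spectral sensor sets and an assumed projection from spectral bands to the sensor's RGB channels; the grey-world style gains are an illustrative choice, not the disclosed correction.

```python
import numpy as np

def correct_by_local_spectra(image, region_spectra, region_map, band_to_rgb):
    """Average the sampled spectra of each local region (step 310), turn the
    average into per-channel gains, and correct the image data of that
    region (step 330).

    image          : (H, W, 3) image data from the image sensor.
    region_spectra : dict region_id -> list of sampled spectra (arrays).
    region_map     : (H, W) array of region ids.
    band_to_rgb    : (3, n_bands) assumed projection from spectral bands to
                     the sensor's R, G, B channels.
    """
    out = image.astype(float).copy()
    for rid, spectra in region_spectra.items():
        mean_spectrum = np.mean(np.asarray(spectra, float), axis=0)   # average spectral response
        illum_rgb = band_to_rgb @ mean_spectrum                       # estimated local illuminant
        gains = illum_rgb.mean() / np.maximum(illum_rgb, 1e-9)        # grey-world style gains
        mask = region_map == rid
        out[mask] = np.clip(out[mask] * gains, 0.0, 1.0)
    return out

img = np.random.rand(4, 4, 3)
rmap = np.zeros((4, 4), int)
rmap[:, 2:] = 1
spectra = {0: [[0.2, 0.4, 0.6]], 1: [[0.5, 0.5, 0.5]]}
print(correct_by_local_spectra(img, spectra, rmap, np.eye(3)).shape)   # (4, 4, 3)
```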
FIG. 23 is a flow chart illustrating an exemplary method for modifying and/or correcting an image of a scene with an imaging system. The method includes step 400, where a sample spectrum is received from a local spectrum sensor set for the scene, and continues at step 410, where the illumination type of each local area of the scene is determined. The method continues at step 420, where the imaging system generates a 3D map of the scene, and continues at step 430, where the illumination directions of one or more objects in the scene are determined. The method continues to: at step 440, the imaging system collects image data from the image sensor for the scene, and at step 450, the imaging system corrects the image data for each respective local area of the scene based on the illumination type, illumination direction, and 3D map to produce a corrected image.
Fig. 24 provides a top-down view of an optical sensor system using an optical sensor/detector comprising nano-semiconductor material. Detectors based on nano-semiconductor materials, such as thin film quantum dot photodiodes, can be fabricated using narrow bandgap thin films compatible with conventional semiconductor processes. In one example of an implementation, the optical sensor system 10 incorporates thin film quantum dots 120 of different sizes to provide a spectral response across a predetermined spectrum, where the granularity and spectral bandwidth of the thin film are determined by the number and size of the quantum dots. The quantum dots may be, but are not limited to, epitaxial quantum dots and/or colloidal quantum dots.
In one specific example of implementation and operation, the sensor system includes a plurality of nanoscale semiconductor sensors configured to sense different bands of light on an integrated circuit. In one example, the sensor system may be limited to optical sensors comprising nanoscale semiconductors. In another example, the sensor system may include a fabry-perot filter associated with a CMOS optical sensor. The nanoscale semiconductor elements may include one or more of quantum dots, colloidal nanoparticles, CdSe nanocrystals, ZnS nanocrystals, and the like. In one specific example of an implementation, nanoscale semiconductor elements may be implemented with different "dot" sizes, where the dot size determines the wavelength of the spectral response of a given nanoscale element. In this example, various dot sizes are distributed across the sensor system to provide a spectrum of a given bandwidth and granularity.
Fig. 25 provides a cross-section of adjacent fabry-perot filters overlaid by an optical angle element in accordance with the invention. In one example, the optical element 130 is associated with one or more filter arrays 20A-20F. In one specific example of an implementation, the optical element 130 may comprise an optical lens configured to rotate or tilt. Examples include optical image stabilization, rotation of the optical element to change the polarization of the propagating light, and/or another mechanical lens motion.
In another example of implementation and operation, the optical element 130 of fig. 25 includes a plurality of integrated polarizing elements (filters). In one example, a combination of polarizing filters enables sensor system 10 to distinguish between polarized light and unpolarized light. In another example, a combination of polarizing elements may enable the sensor system to separate light into different polarizations. In one example, the polarization information may be used to detect illuminant space/direction information and/or information about reflectivity from objects in a scene or image. In another example, polarization information may be used to detect glare and/or reflection in the imaged image.
In another specific example of implementation and operation, sensor system 10 includes a plurality of optical sensors on an integrated circuit, with optical system 80 disposed on top of sensor system 10. In this example, the optical system may be used to select a particular input angle for light incident on the sensor system 10, and in another example, the input angle may be used to determine the position of the light source in the imaged scene. In one example, the optical system 80 comprises a single optical element, and in another example, it comprises a plurality of optical elements.
In one specific example of implementation and operation, the sensor system 10 includes a plurality of optical sensors on an integrated circuit, wherein one or more optical elements 80 are located on top of at least some of the plurality of optical sensors, wherein the one or more optical elements 80 are configured to select an input angle for light incident to the sensor system. In another example, the processor may be configured to determine a direction of light collected by one or more of the plurality of optical sensors based on the selected input angle. In one example, the determined light direction may be used to inform a white balance modification or correction of the scene or object being imaged, where the white balance modification or correction is performed by a processor associated with the sensor system 10, or alternatively, where the determined light direction is provided to an external processing system for white balance modification or correction.
In another specific example of implementation and operation, the optical element 80 is common to all optical sensors in the sensor system 10, and in another example, the optical element is common to only a portion of the optical sensors in the sensor system 10.
There are several options for the optical element 130. In one example, the optical element 130 comprises an optical lens, and in another example, the optical element 130 comprises one or more masks positioned near the optical sensors, wherein the masks comprise light shields having different lateral offsets for at least some of the optical sensors of the sensor system 10. In this example, each mask is configured to allow light at some angles of incidence while masking light at other angles of incidence. The mask may be a single line of various materials, such as metal or another opaque material, or it may contain a grating configured to provide shielding for the array of optical sensors.
In another specific example of implementation and operation, the optical element 130 is an optical microlens; examples include, but are not limited to, fresnel lenses and/or molded lens arrays. In another specific example, the optical elements 130 include mechanical elements such that they may be rotated and/or tilted. In this example, the optical element 130 may be part of an optical image stabilization system for a camera incorporating the sensor system. In another specific example of implementation and operation, the optical elements 130 are microlenses, wherein each microlens is adapted to select an input angle for one or some of the optical sensors in the sensor system 10. In yet another specific example of implementation and operation, the optical element 130 is a polarizing filter.
Fig. 26 shows a scene with one or more light sources. In a scene, light sources may illuminate the scene from behind (such as light source 140) and from the front and/or below (such as light source 142). In one example, white balance information may be included in the digital imaging data, allowing the white balance information to be used in post-processing of images or video. In one particular example, to provide realistic illumination of an object added to an image, such as in an augmented reality system, the illumination of the object may be adjusted to match the illumination of the scene. In an example similar to fig. 25, an object (in this example, a plastic shark) may be added to a pre-existing image, and localized white balance information from the pre-existing image may be used to illuminate the shark in a manner consistent with the pre-existing image. In an augmented reality application, the illumination and/or shading of one or more objects may be adjusted to make the objects appear more realistic in the scene. In addition, directional information about the light (such as mentioned in connection with fig. 25) may be used to inform the illumination of objects added to the pre-existing scene.
In one specific example of implementation and operation, a sensor system is used to collect spectral information, such as white balance information, from a scene. The sensor system may include a plurality of optical sensors having a plurality of sets of interference filters. In one example, one set of the plurality of sets of interference filters may be arranged in a pattern, wherein each of the plurality of filters is configured to pass light of a different wavelength range, wherein each set of interference filters is associated with a spatial region of the scene. In one example of an implementation, the sensor system may include one or more processors adapted to provide a spectral response based on the outputs from the plurality of optical sensors and determine a spatial region of the scene that may represent the light source based on the spectral response from each of the plurality of spatial regions of the scene.
In one example, the one or more processors may be adapted to identify a spatial region of the scene that represents a source (and intensity) of light used to illuminate one or more objects added after the acquisition of the digital image of the scene. In a related example, information associated with a spatial region of a scene representing a light source may be embedded in a digital image, provided as an appendix to the digital image and/or provided as a supplemental data file.
Fig. 27A is a flow chart illustrating an exemplary method for collecting digital images of a scene. The method begins at step 460, where spectra received from a scene are collected from sets of spectral sensors that are spatially separated from each other. In one example, each spectral sensor group is associated with an image sensor. In an alternative example, each spectral sensor group is associated with a spatial region of the scene that corresponds to a complementary spatial region of the image sensor. The method continues at step 464 by classifying the light features of the spectrum for each spatial region of the image sensor. The method then continues at step 462, where the received spectra are used to determine whether one or more sets of spectral sensors indicate a light source in the scene. When no light source is detected, the white balance of the scene is adjusted using normal automatic white balance (AWB) at step 468. In one example, the relative intensities of the light sources may be determined, and in another example, when a light source is not located within the scene, the location and/or intensity of the light source outside the scene may be determined based on reflections from objects within the scene. In another example, the light source position may be used to determine the regions of the imaged scene and/or the objects in the scene that are in shadow. At optional step 466, a spatial region associated with the light source is identified. The method then continues at step 470 by providing the digital image with light source information including the spatial region of the light source and the classification type. In one example, the light source information may be used to adjust the white balance and/or color homography of a spatial region of the scene.
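One way to picture steps 462-466 is the sketch below, which flags a region as containing a light source when its spectrum is bright enough and then labels it with the best-matching entry from a small illuminant library; both the intensity threshold and the library spectra are invented for the example.

```python
import numpy as np

def find_light_source_regions(region_spectra, illuminant_library, intensity_threshold):
    """For each spatial region, decide whether its spectrum looks like a direct
    view of a light source and, if so, which illuminant class it matches best
    (correlation against a small illustrative library)."""
    hits = {}
    for rid, spectrum in region_spectra.items():
        s = np.asarray(spectrum, float)
        if s.max() < intensity_threshold:          # not bright enough to be a source
            continue
        best = max(illuminant_library,
                   key=lambda name: np.corrcoef(s, illuminant_library[name])[0, 1])
        hits[rid] = best
    return hits                                     # e.g. {7: "fluorescent"}

library = {
    "daylight":     np.array([0.8, 0.9, 1.0, 0.9, 0.8]),
    "fluorescent":  np.array([0.2, 1.0, 0.3, 0.9, 0.2]),
    "incandescent": np.array([0.2, 0.4, 0.6, 0.8, 1.0]),
}
regions = {3: [0.1, 0.2, 0.2, 0.2, 0.1], 7: [0.3, 1.4, 0.5, 1.2, 0.3]}
print(find_light_source_regions(regions, library, intensity_threshold=1.0))
```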
Fig. 27B is another flow chart illustrating an exemplary method for collecting digital images of a scene. The method begins at step 480, where spectra received from a scene are collected from sets of spatially separated spectral sensors. In one example, each spectral sensor group is associated with an image sensor. In an alternative example, each spectral sensor group is associated with a spatial region of the scene that corresponds to a complementary spatial region of the image sensor. The method then continues at step 482, where the received spectra are used to determine whether one or more groups of spectral sensors indicate a light source in the scene. When no light source is detected, the white balance of the scene is adjusted using normal automatic white balance (AWB) at step 488. The method continues at step 484 with classifying the light features of the spectrum for each spatial region of the image sensor. In one example, the relative intensities of the light sources may be determined, and in another example, when a light source is not located within the scene, the location and/or intensity of the light source outside the scene may be determined based on reflections from objects within the scene. In another example, the light source position may be used to determine the regions of the imaged scene and/or the objects in the scene that are in shadow. At optional step 486, a spatial region associated with the light source is identified. The method then continues at step 490 by providing the digital image with light source information including the spatial regions and classification types of the light sources.
In a specific example of implementation and operation, the light information provided in steps 470 and 490 of fig. 27A and 27B, respectively, may be used to assist in post-processing of the digital image. In another example, the light information may be used to adjust the illumination of objects added to the digital image in post-processing. For example, once the spatial region and associated light sources are classified, this information may be provided in addition to the captured image, so that it may be used in post-processing when an object is subsequently added to the captured image. Referring to the example shown in fig. 26, plastic sharks may be added to the captured image of the room, with the light information used in post-processing to correctly place the light intensity and color on the plastic sharks. In yet another example, the light information may be used to adjust the illumination of a scene captured in the digital image to provide a desired effect on the scene, such as improving the aesthetic of the scene or artificially changing the aesthetic of the scene. In one example of an implementation, the light information provided in steps 470 and 490 of fig. 27A and 27B, respectively, may be used to adjust the illumination of the scene to provide a more natural effect. In alternative embodiments, the light information provided in steps 470 and 490 of fig. 27A and 27B, respectively, may be used to adjust the illumination of the scene to enhance or refine the special effect.
In one specific example of implementation and operation, light information may be provided at the time of video imaging of a scene or object so that the captured video can be substantially corrected in post-processing. In one example, each frame of captured video may include at least some light information. In another example, light information may be provided intermittently (as opposed to frame-by-frame) with the video imaging data, such that the captured video may be corrected frame-by-frame by interpolating the light information missing from frames without light information data. In yet another example, during video encoding of an imaged scene, light source information such as classification and/or intensity may be linked to an object or scene portion so that the light information need not be updated until the scene or object moves or changes, thereby enabling improved compression and/or reduced computational complexity for the captured video or image. In yet another specific example, the video capture system may be adapted to include light information only when the user switches the feature on, such that light information, such as light classification and/or intensity, is not processed and/or captured when the switch is off.
In one example, the camera system is adapted to determine a type of optical distortion of at least some of the plurality of regions of the image output. Examples of such distortion types include, but are not limited to, distortion caused by natural light and by various artificial light sources. The camera system may further be adapted to determine the frequency or duty cycle of a direct light source and/or an ambient light source, such as whether a fluorescent light source is operating at 50Hz or 60Hz. In yet another example, the camera system may be further adapted to lock a negative feedback compensation loop to the frequency or duty cycle and/or phase of the light source, and then attenuate and/or cancel the resulting light source flicker. In one example, an operational amplifier may be used to compensate for frequency effects by modifying the open-loop output of the amplifier or the gain and phase properties of its feedback network, or both, to compensate for conditions that result in oscillations. In one example, a locked negative feedback compensation loop for flicker interference may be provided to a plurality (or all) of the affected pixels of the camera system, thereby avoiding saturation of those pixels due to flicker interference.
FIG. 28 is a flow chart illustrating an exemplary method for compensating for ambient light flicker in a scene captured by a digital imaging system. The method begins at step 500, where received spectra from a scene are received from multiple sets of localized spectrum sensors. At step 510, the illumination type of the scene is determined, and at step 520, the digital imaging system determines whether ambient light flicker is detected in the scene. If no light flicker is detected, the digital imaging system may adjust the white balance of the scene at step 550, and when light flicker is detected, the digital imaging system may lock the frequency and phase of the flicker source using a negative feedback loop at step 530 to compensate for light flicker at step 540.
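A rough sketch of the flicker-detection part of this flow (steps 510-520), assuming a short time series of scene brightness samples is available, is shown below; it simply looks for energy at the 100 Hz or 120 Hz mains-flicker frequencies, and the locking and compensation loop of steps 530-540 is not modeled.

```python
import numpy as np

def detect_flicker(brightness, sample_rate_hz, candidates=(100.0, 120.0)):
    """Look for mains flicker (100 Hz for 50 Hz mains, 120 Hz for 60 Hz) in a
    short time series of scene brightness samples and return the dominant
    candidate frequency, or None if neither stands out."""
    x = np.asarray(brightness, float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    best, best_power = None, 0.0
    for f in candidates:
        power = spectrum[np.argmin(np.abs(freqs - f))]
        if power > best_power:
            best, best_power = f, power
    return best if best_power > 3.0 * spectrum.mean() else None

t = np.arange(0, 0.5, 1.0 / 1000.0)                    # 0.5 s at 1 kHz
samples = 1.0 + 0.2 * np.sin(2 * np.pi * 100.0 * t)    # synthetic 100 Hz flicker
print(detect_flicker(samples, sample_rate_hz=1000.0))  # 100.0
```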
Fig. 29A shows the individual spectral responses of two spectral sensors (pixels) having adjacent center wavelengths. A spectral sensor such as a fabry-perot filter based spectral sensor typically provides its spectral response in a narrow band (such as 10 nm), while a broader band of spectral response (such as 20 nm or 30 nm) is sometimes desired. Fig. 29B shows the combined spectral response of the two spectral sensors of fig. 29A, effectively doubling the spectral response bandwidth. In one example of implementation and operation, a sensor system includes an optical sensor array and a plurality of spectral filters arranged in an array in proximity to the optical sensor array. In one example, the spectral filters are interference filters, such as fabry-perot filters, plasma interference filters, and organic filters. In one example, each optical sensor is associated with one or more spectral filters of the plurality of spectral filters, wherein each spectral filter of the plurality of spectral filters is configured to pass light of a selected wavelength range. In one example, electronic circuitry is coupled to the optical sensors such that the outputs of two or more optical sensors may be combined.
Fig. 29C shows a pair of adjacent interference filters (1) each associated with an optical sensor. In one example, the interference filter (1) is configured to pass wavelengths in adjacent ranges of the spectrum. In one example, the outputs of adjacent sensors (2) are combined to produce a combined output with a wider spectral response bandwidth, as shown in fig. 29B. In one example, the outputs of adjacent sensors (2) may be combined using dedicated electronic circuitry, or in another example, may be combined by one or more modules of a computing device.
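The combination of adjacent sensor outputs shown in fig. 29B and 29C could be as simple as the sketch below, which sums the readings behind paired adjacent filters to emulate a wider-band response; the pairing and values are illustrative only.

```python
import numpy as np

def combine_adjacent_bands(sensor_outputs, pairs):
    """Sum the outputs of optical sensors behind adjacent interference filters
    to emulate a single wider-band response, e.g. two ~10 nm bands combined
    into one ~20 nm band.

    sensor_outputs : 1-D array, one value per narrow-band sensor.
    pairs          : list of (i, j) index pairs to combine.
    """
    s = np.asarray(sensor_outputs, float)
    return np.array([s[i] + s[j] for i, j in pairs])

narrow = np.array([0.12, 0.15, 0.30, 0.28, 0.05, 0.07])
print(combine_adjacent_bands(narrow, pairs=[(0, 1), (2, 3), (4, 5)]))
```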
Fig. 29D shows a pair of adjacent interference filters (1) associated with a single optical sensor (2). In one example, incident light passing through the two adjacent filters (1) is detected by the single optical sensor (2). In one example of implementation and operation, a sensor system includes an optical sensor array and a plurality of spectral filters arranged in an array in proximity to the optical sensor array. In one example, the spectral filters are interference filters, such as fabry-perot filters, plasma interference filters, and organic filters. In one particular example, the optical sensor is associated with two or more spectral filters of the plurality of spectral filters, wherein each spectral filter of the plurality of spectral filters is configured to pass light of a selected wavelength range. In this example, the two or more of the plurality of spectral filters are configured to pass light in substantially adjacent wavelength ranges, such that the sensor (2) effectively receives twice the wavelength range of either interference filter (1) alone.
Fig. 29E shows a pair of interference filters (1) placed one on top of the other, the pair of interference filters being associated with a single optical sensor (2). In one example of implementation and operation, a sensor system includes an optical sensor array and a plurality of spectral filters arranged in an array in proximity to the optical sensor array. In one example, the spectral filter is an interference filter, such as a fabry-perot filter or a plasma interference filter. In one particular example, the first spectral filter array is arranged in an array above the optical sensor array, wherein each optical sensor in the optical sensor array is associated with at least one spectral filter in the first spectral filter array, and each spectral filter is configured to pass light of a selected wavelength range. In one example, the second spectral filter array is arranged in an array on top of the first spectral filter array to produce pairs of spectral filters, wherein each spectral filter in the second spectral filter array is configured to pass light of a selected wavelength range, and wherein each spectral filter pair of the plurality of spectral filter pairs comprises two or more spectral filters that together are configured to pass light in substantially adjacent wavelength ranges. In one example, each filter pair is configured to pass light to a single optical sensor.
As described above, using Automatic White Balance (AWB) correction to compensate for light source distortion enables the image sensor to more accurately represent the expected chromaticity of a recorded scene or object. In one example of implementation and operation, when the imager receives input for AWB correction, uniform AWB correction may be enhanced by spatially blurring and/or scrambling the scene. Blurred images may provide more uniform color detection for demosaicing a given set of spectral filter responses.
In one example of implementation and operation, a sensor system for imaging a scene may include a plurality of optical sensors on an integrated circuit together with a plurality of sets of interference filters, wherein each set of interference filters includes interference filters arranged in a pattern. Each interference filter is configured to pass light of a different wavelength range, and each set of interference filters is associated with a spatial region of the scene. In one example, a lens system is configured on top of the plurality of optical sensors, wherein the lens system is adapted to produce a blurred image having a substantially predetermined blur size at the plurality of optical sensors. In one example, the lens system is configured to defocus to produce the blurred image having the substantially predetermined blur size and to focus to produce a substantially focused image at the plurality of optical sensors. In one particular example, the lens system comprises a plurality of elements and is configured to defocus by adjusting a single element of the plurality of elements without adjusting the other elements.
In another specific example, the lens system may be adapted to introduce spherical aberration and/or other aberrations to increase blur of the scene for AWB correction purposes. In yet another specific example, the lens system may include a large field of view (FOV) and a low chief ray angle. The large field of view enables a given imager to detect additional light and capture a wide scene, while the low chief ray angle reduces the angle of incidence of light onto a spectral filter (such as an interference-based filter). Fig. 7 shows a lens system with an inverse telecentric design. The inverse telecentric design provides a large field of view and a low chief ray angle, so that by adjusting elements of the design the image can be blurred for AWB correction or brought into focus for high spatial resolution image capture. Telecentric lenses are known to provide orthographic projection, giving the same magnification at all object distances; an object that is too close may still be out of focus, but its blurred image is the same size as the correctly focused image. In an inverse telecentric lens system, one or more elements of the telecentric lens system are inverted, resulting in a more uniform color response across the spectral filters.
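A minimal sketch of the processing side of this idea, assuming a gray-world estimator, a three-channel image, and a simple box blur (none of which are prescribed by this disclosure): blur the scene to an approximately predetermined blur size, then derive per-channel white-balance gains from the blurred, more color-uniform image.

```python
# Hedged sketch: estimate AWB gains from a deliberately blurred scene.
# The box blur stands in for the optical defocus/aberration described
# above; the gray-world estimator and RGB input are assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def awb_gains_from_blurred(image, blur_size=25):
    """image: float array of shape (H, W, 3); returns per-channel gains."""
    blurred = np.stack(
        [uniform_filter(image[..., c], size=blur_size) for c in range(3)],
        axis=-1,
    )
    # Gray-world assumption: the blurred scene should average to neutral gray.
    channel_means = blurred.reshape(-1, 3).mean(axis=0)
    return channel_means.mean() / channel_means

# Example usage with a synthetic scene under a warm (reddish) illuminant.
rng = np.random.default_rng(0)
scene = rng.random((120, 160, 3)) * np.array([1.2, 1.0, 0.7])
print("estimated AWB gains:", awb_gains_from_blurred(scene))
```

In a camera pipeline the same gains would then be applied to the focused, full-resolution capture rather than to the blurred frame.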
Electronic product manufacturers are increasingly placing image sensors underneath the display in smartphones, tablet computers, and other camera-equipped mobile devices. When the image sensor is positioned below the display, spectral recoloring of the image may result, at least in part because the active color emitted by the display corrupts the image. A spectral sensor may be implemented under the display to mitigate the effects of the display on the imager located under the display, while also providing input for Automatic White Balance (AWB). In one specific example of implementation and operation, a sensor system for imaging a scene includes a first optical sensor array and a plurality of sets of interference filters associated with the first optical sensor array. Each of the plurality of sets of interference filters includes a plurality of interference filters arranged in a pattern, wherein each of the plurality of interference filters is configured to pass light of a different wavelength range. Each set of the plurality of interference filters is associated with a spatial region of the scene. In one example, the sensor system includes a second optical sensor array configured to output an image and a processor having one or more modules adapted to generate spectral responses for a plurality of spatial regions of the scene from the first optical sensor array and from the image output by the second optical sensor array. In this example, the display is located on top of the first and second optical sensor arrays.
Spectral sensors are typically used to improve the signal from an image sensor located below the display. Although in the given example the spectral sensor and the image sensor are presented as separate entities, the hyperspectral camera can use the same optical sensor for both functions (spectral measurement and imaging).
In one specific example of operation, the first optical sensor array and the second optical sensor array are adjacent to each other under the display. In this example, the spectral response provided by the first optical sensor array may be used to correct for optical distortion and other artifacts of the scene imaged by the second optical sensor array. In another example, the second optical sensor array outputs a monochrome image, and the output from the first optical sensor array can be used to provide color information of the monochrome image.
In another example, a portion of the optical sensors from the first optical sensor array may be used to correct for interference of the display with the image produced by the second optical sensor array, and another portion of the optical sensors from the first optical sensor array may be used to provide color information for Automatic White Balance (AWB). In a related example, display interference may be corrected using an optical sensor from a first array associated with an interference filter configured to pass light in a particular wavelength range, while Automatic White Balance (AWB) is corrected using an optical sensor from a first array associated with an interference filter configured to pass light in other wavelength ranges. In one example, the processor may be further adapted to detect a change in display colorimetry over time based on the output from the display and/or the spectral responses of the plurality of spatial regions.
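As one assumed way of combining the two arrays described above (not a procedure specified in this disclosure), the following Python sketch uses coarse per-region color from the spectral-sensor array to supply chromaticity for the higher-resolution monochrome image from the second array; the region grid size and nearest-neighbor upsampling are illustrative choices.

```python
# Hedged sketch: colorize a monochrome under-display image using coarse
# per-region color derived from the spectral-sensor array. The grid size
# and nearest-neighbor upsampling are illustrative assumptions.
import numpy as np

def colorize_monochrome(mono, region_rgb):
    """mono: (H, W) luminance image; region_rgb: (h, w, 3) per-region color."""
    H, W = mono.shape
    h, w, _ = region_rgb.shape
    # Nearest-neighbor upsample of the coarse color regions to full resolution.
    rows = np.arange(H) * h // H
    cols = np.arange(W) * w // W
    chroma = region_rgb[rows][:, cols]                      # (H, W, 3)
    # Normalize each region's color and modulate it with the monochrome detail.
    norm = chroma / (chroma.sum(axis=-1, keepdims=True) + 1e-9)
    return norm * mono[..., None] * 3.0

# Example usage with synthetic data.
mono = np.random.default_rng(1).random((240, 320))
coarse = np.random.default_rng(2).random((12, 16, 3))
print(colorize_monochrome(mono, coarse).shape)              # (240, 320, 3)
```

Under the same assumptions, a similar per-region lookup could subtract a known display-emission contribution from each region before the AWB step.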
It should be noted that terms such as bit stream, signal sequence, and the like (or equivalents thereof) as may be used herein have been interchangeably used to describe digital information whose content corresponds to any of a number of desired types (e.g., data, video, voice, text, graphics, audio, and the like, any of which are commonly referred to as "data").
As may be used herein, the terms "substantially" and "about" provide an industry-accepted tolerance for the correlation between their respective terms and/or items. For some industries, the industry-accepted tolerance is less than one percent; for other industries, the industry-accepted tolerance is 10 percent or more. Other examples of industry-accepted tolerances range from less than one percent to fifty percent. Industry-accepted tolerances correspond to, but are not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, thermal noise, dimensions, signaling errors, dropped packets, temperatures, pressures, material compositions, and/or performance metrics. Within an industry, tolerance variances from an accepted tolerance may be more or less than a percentage level (e.g., a dimension tolerance of less than +/-1%). Some relatedness between items may range from a difference of less than a percentage level to a few percent. Other relatedness between items may range from a difference of a few percent to a magnitude of differences.
As may be used herein, the terms "configured to," "operatively coupled to," "coupled to," and/or "coupled to" include direct coupling between items and/or indirect coupling between items via intermediate items (e.g., items including, but not limited to, components, elements, circuits, and/or modules), where, as an example of indirect coupling, the intermediate items do not modify the information of the signal, but may adjust its current level, voltage level, and/or power level. As may be further used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct coupling and indirect coupling between two items in the same manner as "coupled to".
As may be even further used herein, the terms "configured to," "operable to," "coupled to," or "operatively coupled to" indicate that an item includes one or more of power connections, inputs, outputs, etc., to perform, when activated, one or more of its corresponding functions, and may further include inferred coupling to one or more other items. As may still further be used herein, the term "associated with" includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
As may be used herein, the term "advantageously compares" indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, an advantageous comparison may be obtained when the desired relationship is that signal 1 has a larger amplitude than signal 2, when the amplitude of signal 1 is greater than the amplitude of signal 2, or when the amplitude of signal 2 is less than the amplitude of signal 1. As may be used herein, the term "disadvantageously compare" indicates that a comparison between two or more items, signals, etc., does not provide a desired relationship.
As may be used herein, one or more claims may include, in a specific form of the generic phrasing "at least one of a, b, and c" or "at least one of a, b, or c," more or fewer elements than "a," "b," and "c." In either phrasing, the phrases are to be interpreted identically. In particular, "at least one of a, b, and c" is equivalent to "at least one of a, b, or c" and shall mean a, b, and/or c. For example, it means: "a" only, "b" only, "c" only, "a" and "b," "a" and "c," "b" and "c," and/or "a," "b," and "c."
As also used herein, the terms "processing module," "processing circuit," "processor," "processing circuitry," and/or "processing unit" may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, microcontroller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module, processing circuit, processing circuitry, and/or processing unit may be, or may further include, memory and/or an integrated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of another processing module, processing circuit, processing circuitry, and/or processing unit. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. It should be noted that if the processing module, processing circuit, processing circuitry, and/or processing unit includes more than one processing device, the processing devices may be centrally located (e.g., directly coupled together via a wired and/or wireless bus structure) or may be distributed (e.g., cloud computing via indirect coupling through a local area network and/or a wide area network). It is further noted that if the processing module, processing circuit, processing circuitry, and/or processing unit implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. It is still further noted that the memory element may store, and the processing module, processing circuit, processing circuitry, and/or processing unit may execute, hard-coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in one or more of the figures. Such a memory device or memory element can be included in an article of manufacture.
One or more embodiments have been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed; any such alternate boundaries or sequences are thus within the scope and spirit of the claims. Further, the boundaries of these functional building blocks have been arbitrarily defined for convenience of description, and alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality.
To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claims. One of average skill in the art will further appreciate that the functional building blocks and other illustrative blocks, modules, and components herein may be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software, and the like, or any combination thereof.
Further, the flow diagrams may include "start" and/or "continue" indications. The "start" and "continue" indications reflect that the presented steps may optionally be incorporated into or otherwise used in conjunction with one or more other routines. Further, the flow diagrams may include an "end" and/or "continue" indication. The "end" and/or "continue" indications reflect that the presented steps may be incorporated into or otherwise used in conjunction with one or more other routines as described and illustrated or alternatively. In such cases, "start" indicates the start of the first step presented, and may precede other activities not specifically shown. Further, the "continue" indication reflects that the presented step may be performed multiple times and/or may be superseded by other activities not specifically shown. Further, while the flow diagrams indicate a particular order of steps, other orders are equally possible provided that the causal relationship principle is maintained.
One or more embodiments are used herein to describe one or more aspects, one or more features, one or more concepts, and/or one or more examples. Physical embodiments of devices, articles, machines, and/or processes may include one or more aspects, features, concepts, examples, etc. described with reference to one or more embodiments discussed herein. Furthermore, from one figure to another, embodiments may include the same or similar named functions, steps, modules, etc., which may use the same or different reference numbers, and thus may be the same or similar functions, steps, modules, etc., or may be different functions, steps, modules, etc.
In the figures of any of the drawings presented herein, signals to, from, and/or between elements may be analog or digital, continuous or discrete time, and single ended or differential, unless specifically stated to the contrary. For example, if the signal path is shown as a single ended path, it also represents a differential signal path. Similarly, if the signal path is shown as a differential path, it also represents a single ended signal path. Although one or more particular architectures are described herein, other architectures using one or more data buses, direct connections between elements, and/or indirect coupling between other elements as recognized by one of ordinary skill in the art may likewise be implemented.
The term "module" is used to describe one or more embodiments. The modules implement one or more functions via a device such as a processor or other processing device or other hardware, which may include or operate in association with a memory storing operating instructions. The modules may operate independently and/or in conjunction with software and/or firmware. As also used herein, a module may contain one or more sub-modules, each of which may be one or more modules.
As further used herein, a computer-readable memory includes one or more memory elements. The memory element may be a separate memory device, a plurality of memory devices, or a set of memory locations within a memory device. Such memory devices may be read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. The memory device may be in the form of solid state memory, hard drive memory, cloud memory, thumb drive, server memory, computing device memory, and/or other physical media for storing digital information.
Although specific combinations of features and functions have been explicitly described herein for various functions of one or more embodiments, other combinations of features and functions are also possible. The present disclosure is not limited by the specific examples disclosed herein, and these other combinations are expressly incorporated.

Claims (21)

1. A method for imaging a scene, the method comprising:
imaging the received time T1 scene spectrum using a spectral imager, wherein the spectral imager comprises a plurality of spectral sensors, wherein the spectral sensors comprise a spectral filter overlaying one or more first optical sensors, wherein each spectral sensor has a sensing range that falls within a predetermined optical wavelength range, wherein the sensing ranges of the plurality of spectral sensors collectively comprise a range of wavelengths;
outputting, by the spectral imager, information representative of a T1 scene spectral image to one or more processing modules of one or more computing devices via one or more interfaces;
imaging a scene at time T2 using an image sensor, wherein the image sensor comprises a plurality of second optical sensors;
outputting, by the image sensor, information representative of a T2 scene image to one or more processing modules of one or more computing devices via one or more interfaces, wherein a spatial resolution of the scene image is higher than a spatial resolution of the spectral image;
generating, by the one or more processing modules of the one or more computing devices, a combined spectral image based on the information representative of the T1 scene spectral image and the information representative of the T2 scene image; and
correcting, by the one or more processing modules of the one or more computing devices, illuminants for one or more spatial regions of the scene based on the combined spectral image to produce a corrected spectral image.
2. The method of claim 1, wherein the spatial resolution of the combined spectral image is higher than the spatial resolution of the spectral image.
3. The method of claim 1, wherein at least one of the one or more computing devices is a trained artificial intelligence engine.
4. The method of claim 1, wherein at least one of the one or more computing devices is a trained machine learning network, wherein the scene is of a predetermined type, wherein the predetermined type is at least one of skin, a body part, an object, and a portion of a plant.
5. The method of claim 4, wherein the one or more attributes of the skin region include at least one of the presence of skin disease, wound healing, skin tone of the skin region, and the presence of melanoma.
6. The method of claim 1, wherein the spectral sensors are associated with P first optical sensors and N second optical sensors, wherein n+.p.
7. The method of claim 1, wherein the time T1=T2.
8. The method of claim 1, further comprising:
illuminating the scene by an illumination source at time T1.
9. The method of claim 1, further comprising:
illuminating a scene by an illumination source at a time T3, wherein T3 is a time between T1 and T2;
sampling the received time T3 scene spectrum using the spectral imager;
outputting, by the spectral imager, information representative of the received T3 scene spectrum to the one or more processing modules via one or more interfaces; and
determining, by the one or more processing modules, a difference between the received time T1 scene spectrum and the received time T3 scene spectrum, wherein the combined spectral image is based at least in part on the difference between the received time T1 scene spectrum and the received time T3 scene spectrum.
10. The method of claim 1, further comprising:
classifying one or more attributes of the scene based on the corrected spectral image.
11. The method of claim 10, wherein the classifying is based on a comparison between the corrected spectral image and a reference spectral image.
12. The method of claim 11, wherein the scene is a skin region, wherein the reference spectral image is at least one of a previous skin spectral image, an image from a database, and an output of an inference engine.
13. A method for imaging a scene, comprising:
imaging the received time T1 scene spectrum using a spectral imager, wherein the spectral imager comprises a plurality of spectral sensors, wherein the spectral sensors comprise a spectral filter overlaying one or more optical sensors, wherein each spectral sensor has a sensing range that falls within a predetermined optical wavelength range, wherein the sensing ranges of the plurality of spectral sensors collectively comprise a range of wavelengths;
outputting, by the spectral imager, information representative of a T1 scene spectral image to one or more processing modules of one or more computing devices via one or more interfaces;
illuminating the scene by an illumination source at time T2;
imaging the received time T2 scene spectrum using the spectral imager;
outputting, by the spectral imager, information representative of a T2 scene spectral image to one or more processing modules of one or more computing devices via one or more interfaces;
determining, by the one or more processing modules, a difference between the received time T1 scene spectrum and the received time T2 scene spectrum; and
determining an illuminant for the scene at time T1 based on a difference between the received time T1 scene spectrum and the received time T2 scene spectrum.
14. The method of claim 13, further comprising:
imaging at least a portion of the scene at time T3 using an image sensor to produce an uncorrected image, wherein the image sensor comprises an array of other optical sensors; and
correcting, by the one or more modules of the one or more processors, the illuminant for one or more spatial regions of the uncorrected image, based on the T1 scene spectral image and the T2 scene spectral image, to produce a corrected scene.
15. The method of claim 14, wherein the time T2=T3.
16. The method of claim 14, wherein the plurality of spectral sensors comprises P spectral filters and N optical sensors, wherein each spectral filter is associated with an optical sensor, wherein n+.p.
17. The method of claim 14, further comprising:
illuminating the scene by the illumination source at time T3.
18. The method of claim 14, wherein the scene is a skin area.
19. The method of claim 14, wherein the step of determining an illuminant for the scene at time T1 comprises: classifying one or more attributes of the scene based on a difference between the received time T1 scene spectrum and the received time T2 scene spectrum.
20. The method of claim 14, wherein the step of determining an illuminant for the scene at time T1 comprises: classifying one or more attributes of the scene based on a difference between the received time T1 scene spectrum and a reference spectrum.
21. The method of claim 20, wherein the scene is a skin region, wherein the reference spectral image is at least one of a previous skin spectral image, an image from a database, and an output of an inference engine.
CN202211483115.1A 2021-11-26 2022-11-24 Illuminant correction in an imaging system Pending CN116183021A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163264599P 2021-11-26 2021-11-26
US63/264,599 2021-11-26
US18/051,166 2022-10-31
US18/051,166 US20230082539A1 (en) 2020-07-01 2022-10-31 Illuminant correction in an imaging system

Publications (1)

Publication Number Publication Date
CN116183021A true CN116183021A (en) 2023-05-30

Family

ID=86317169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211483115.1A Pending CN116183021A (en) 2021-11-26 2022-11-24 Illuminant correction in an imaging system

Country Status (2)

Country Link
CN (1) CN116183021A (en)
DE (1) DE102022131079A1 (en)

Also Published As

Publication number Publication date
DE102022131079A1 (en) 2023-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination