EP2703792A1 - Method of controlling the resolution of a hyperspectral image - Google Patents

Method of controlling the resolution of a hyperspectral image

Info

Publication number
EP2703792A1
Authority
EP
European Patent Office
Prior art keywords
hyperspectral
subpixels
image
window
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13182230.6A
Other languages
German (de)
English (en)
French (fr)
Inventor
Eric Daniel Buehler
Stefano Angelo Mario Lassini
Benjamin Thomas Occhipinti
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GE Aviation Systems LLC
Original Assignee
GE Aviation Systems LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GE Aviation Systems LLC
Publication of EP2703792A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 - Investigating the spectrum
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 - Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 - Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 - Investigating the spectrum
    • G01J3/2823 - Imaging spectrometer
    • G01J2003/2826 - Multispectral imaging, e.g. filter imaging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G06T2207/10036 - Multispectral image; Hyperspectral image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20021 - Dividing image into blocks, subimages or windows
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20216 - Image averaging

Definitions

  • Hyperspectral imagery (HSI) devices are a class of spectrometers that record energy in many discrete spectral bands, or colors, simultaneously at a multitude of spatial picture elements, called pixels, on an image sensor.
  • Standard broadband imagers record one value at each pixel for all the detected incident energy across a wide spectrum, and create an image in two spatial dimensions from a two-dimensional array of detectors.
  • HSI devices differ from standard broadband imagers by creating an image with an additional spectral dimension. Each HSI pixel may have ten to hundreds of wavelength values recorded.
  • One aspect of the invention relates to a method of controlling the resolution of a hyperspectral image from an image sensor having pixels and at least one filter that defines subpixels within each pixel.
  • The method includes defining a window on the image sensor with an array of rows and columns of subpixels; weighting the subpixels within the window, based upon one or more predefined parameters of the hyperspectral image, to establish a value for a weighted average for the array for the predefined parameters; shifting the window by a predefined number of rows or columns, wherein the predefined number is less than the number of respective rows or columns in the array; repeating the weighting and shifting steps for all possible windows on the image sensor; and processing the hyperspectral image based on the weighted averages.
  • embodiments described herein may include a computer program product comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon.
  • Machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor.
  • machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of machine-executable instructions or data structures and that can be accessed by a general purpose or special purpose computer or other machine with a processor.
  • Machine-executable instructions comprise, for example, instructions and data, which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
  • Embodiments will be described in the general context of method steps that may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example, in the form of program modules executed by machines in networked environments.
  • Program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the method disclosed herein.
  • The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
  • Embodiments may be practiced in a networked environment using logical connections to one or more remote computers having processors.
  • Logical connections may include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation.
  • Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the internet and may use a wide variety of different communication protocols.
  • Those skilled in the art will appreciate that such network computing environments will typically encompass many types of computer system configurations, including personal computers, hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like.
  • Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communication network.
  • program modules may be located in both local and remote memory storage devices.
  • An exemplary system for implementing all or portions of the exemplary embodiments might include a general purpose computing device in the form of a computer, including a processing unit, a system memory, and a system bus that couples various system components, including the system memory, to the processing unit.
  • the system memory may include read only memory (ROM) and random access memory (RAM).
  • the computer may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD-ROM or other optical media.
  • the drives and their associated machine-readable media provide nonvolatile storage of machine-executable instructions, data structures, program modules and other data for the computer.
  • Beneficial effects of the method disclosed in the embodiments include improving the spatial resolution of an image from existing information, which correlates directly with improving a sensor's range, especially when object detection and tracking methods are used in conjunction with the sensor.
  • the method improves on existing spectral composite image techniques by more efficiently using the data being collected by the system.
  • This technique can be used on any system that generates composite imagery from spectral cube arrays. Implementation of this technique on spectral cube array systems will improve the spatial resolution of the sensor by a factor equal to the spectral cube dimension.
  • FIG. 1 is a representative illustration demonstrating the known sampling of a panchromatic representation of a hyperspectral image 14 from a hyperspectral data cube 12.
  • a panchromatic representation of a hyperspectral image 14 formed from a hyperspectral data cube 12 is the broadband image formed by combining the data from all of the spectral bands collected from the pixels of an HSI device into a single array.
  • A hyperspectral data cube 12 consists of the array of values from the pixels on the image sensor of an HSI device, where each pixel of the hyperspectral data cube 12, such as 22, consists of an array of subpixels, numbering nine in this embodiment and identified as nos. 1-9, such as 16, 18, 20.
  • Each subpixel such as 16, 18, 20 in the pixel 22 represents a specific spectral band and is contained within a single pixel 22. It will be understood that there may be greater or fewer subpixels in each pixel, depending on a specific application.
  • The spectral bands represented by the subpixels may be the nine spectral bands of the hyperspectral data cube 12, the 16 spectral bands of an HSI device such as the SOC710 from Surface Optics Corporation, or the three distinct spectral bands corresponding to the colors red, blue and green of the well-known Bayer filter, although other spectral band filter configurations are possible depending upon the requirements of a specific implementation, including configurations where the spectral bands represented by the subpixels are the same for multiple subpixels in a pixel.
  • the hyperspectral data cube 12 is converted into a panchromatic representation of a hyperspectral image 14 by a conversion method 24 that combines the values of all the subpixels, nos. 1-9 in this embodiment, of a given hyperspectral pixel 22 into a single panchromatic pixel 28 value.
  • the nine subpixels in the pixel 22 in this embodiment each contain values representing intensity at a unique wavelength, numbered one through nine, and are combined through the conversion method 24.
  • the conversion method 24 used to combine all of the subpixel values of each hyperspectral pixel 22 is repeated for all hyperspectral pixels in the hyperspectral data cube 12 where, in this embodiment, each pixel is centered at the subpixel corresponding to wavelength five (16, 26, 44, 46, 48, 50, 52, 54) and consists of the subpixels corresponding to wavelength five and the eight nearest subpixels.
  • One conversion method 24 used to form a single panchromatic pixel 28 value is to average the values of all nine of the subpixels of a hyperspectral pixel 22; a minimal code sketch of this per-pixel averaging is given after this list.
  • the hyperspectral image 14 will have the same resolution or number of pixels as the hyperspectral data cube 12.
  • Each pixel of the hyperspectral data cube 12 centered on the subpixel corresponding to wavelength five (16, 26, 27, 44, 46, 48, 50, 52, 54) will be sampled to generate a single panchromatic pixel such as 28 and 30.
  • FIG. 2 is a flowchart demonstrating an upsampling method 80 of controlling the resolution of a hyperspectral image derived from a hyperspectral data cube according to an embodiment of the invention.
  • each panchromatic pixel of the hyperspectral image may be generated by a sampled subset of subpixels of a hyperspectral data cube without regard to the boundaries of a pixel of the hyperspectral data cube.
  • the method 80 may consist of the steps of defining a window 82, weighting the subpixel values in the window 86 for a given window location, assigning a weighted average to an output hyperspectral image 88 and repeatedly weighting and shifting windowed subpixels by iterating through the set of window locations starting at a first window location 84 and incrementing through successive window locations 90 until all window locations have been processed 92.
  • the first step of the upsampling method 80 may be to define a window 82, also known in image processing as a mask or template, to determine which subpixels of a hyperspectral data cube may be used to calculate a value for a resulting pixel of a hyperspectral image.
  • the window may be the same size as a pixel in terms of the number and location of included subpixels. However, any size window may be used depending upon the implementation.
  • A window may include more or fewer subpixels than are contained in a pixel, and it may be an entirely different shape than the shape of the set of subpixels that define a pixel. For example, a window consisting of a rectangular arrangement of subpixels may be applied to a hyperspectral data cube consisting of pixels with a square arrangement of subpixels.
  • the window may be applied to a first window location 84, followed by weighting the subpixel values in the window 86.
  • An equal weighting may be applied that effects an averaging operation, i.e. all weights w_p are set to the value 1/P, where P is the number of subpixels in the window.
  • the weighting may be used to modify the relative contributions of each subpixel to the value of the resulting pixel where the set of weights may be based upon a predefined parameter.
  • the predefined parameter may be selected to advantageously display features of the subject of the resulting image.
  • Other weights may be applied, and although the set of all weights in a window may typically sum to 1, the invention is not limited to that constraint.
  • A preferred set of weights is derived to compensate for the frequency response of solar radiation and the further attenuation of selected wavelengths due to absorption bands in Earth's atmosphere.
  • the weighting scheme may be designed to advantageously display features of the subject of the resulting image.
  • the values of the weighted, windowed subpixels may be combined into a single weighted average of the subpixels.
  • The weighted average may then be assigned to a pixel in the resulting hyperspectral image 88 at a location corresponding to the current window location.
  • the process of weighting subpixel values in a window 86 and assigning the weighted average to an output pixel in a resulting hyperspectral image 88 may be repeated for all possible window locations until all locations have been processed 92. Serially, this process may be equivalent to shifting the window a predefined number of rows or columns. In a preferred embodiment of the invention, the window may be shifted by one subpixel in either the vertical or horizontal direction but other shifts may be used to control the resolution of the resulting hyperspectral image. The weighting process may then be repeated to form the next pixel in the hyperspectral image using the new windowed neighborhood of subpixels. In another embodiment of the invention, simultaneous windows may be instantiated such that they are the set of windows shifted by one subpixel in either the vertical or horizontal directions and all resulting output pixels of the hyperspectral image may generated simultaneously.
  • Figure 3 is a representative illustration demonstrating the upsampling 100 of a first pixel 128 of a hyperspectral image 114 derived from a hyperspectral data cube 12 according to the method of Fig. 2 .
  • the conversion method 124 demonstrating the upsampling 100 of a first pixel 128 of a hyperspectral image 114 derived from a hyperspectral data cube 12 may use a window 22 to determine which subpixels of a hyperspectral data cube 12 may be used to calculate a value for a resulting pixel 128 of the hyperspectral image 114.
  • the window 22 may then be shifted by a predefined number of rows or columns.
  • the window may be shifted by one subpixel in either the vertical or horizontal direction but other shifts may be used to control the resolution of the resulting hyperspectral image 114.
  • Figures 4-6 are representative illustrations demonstrating the upsampling of pixels 226, 328, 430 of the hyperspectral image 114 by use of repeated shifting of the window 22 with single subpixel shifts.
  • Figure 4 is a representative illustration demonstrating the upsampling 200 of a second pixel 226 of a hyperspectral image 114 derived from a hyperspectral data cube 12 in accord with the embodiment of Fig. 3 .
  • the conversion method 224 demonstrating the upsampling 200 of a second pixel 226 of a hyperspectral image 114 derived from a hyperspectral data cube 12 may use a window 22 shifted one column right from the window location in Figure 3 to determine which subpixels centered at subpixel 216 of a hyperspectral data cube 12 may be used to calculate a value for a resulting second pixel 226 of the hyperspectral image 114.
  • Figure 5 is a representative illustration demonstrating the upsampling 300 of a third pixel 328 of a hyperspectral image 114 derived from a hyperspectral data cube 12 in accord with the embodiment of Figs. 3 and 4 .
  • the conversion method 324 demonstrating the upsampling 300 of a third pixel 328 of a hyperspectral image 114 derived from a hyperspectral data cube 12 may use a window 22 shifted one column right from the window location in Figure 4 to determine which subpixels of a hyperspectral data cube 12 may be used to calculate a value for a resulting third pixel 328 of the hyperspectral image 114.
  • Figure 6 is a representative illustration demonstrating the upsampling 400 of a fourth pixel 430 of a hyperspectral image 114 derived from a hyperspectral data cube 12 in accord with the embodiment of Figs. 3-5 .
  • the conversion method 424 demonstrating the upsampling 400 of a fourth pixel 430 of a hyperspectral image 114 derived from a hyperspectral data cube 12 may use a window 22 shifted one row down from the window location in Figure 3 to determine which subpixels of a hyperspectral data cube 12 may be used by the conversion method 424 to calculate a value for a resulting fourth pixel 430 of the hyperspectral image 114.
  • Figure 7 is a representative illustration demonstrating a complete upsampling 500 of a hyperspectral image 114 derived from a hyperspectral data cube 12 according to the method of Fig. 2 .
  • the panchromatic representation of the hyperspectral image 114 may be formed by repeating the weighting and shifting steps for all possible arrays of the subpixels of the hyperspectral data cube 12.
  • the conversion method 524 demonstrating the upsampling 500 to create a hyperspectral image 114 derived from a hyperspectral data cube 12 may repeatedly use a window that is shifted by one subpixel at a time to determine which subpixels of a hyperspectral data cube 12 may be used to calculate a value for each resulting pixel of the hyperspectral image 114, as shown in Figs 3-6 .
  • the resulting pixels 128, 226, 328, 430 from the weighted, windowed average shown in Figures 3-6 are shown as placed in the final hyperspectral image 114 relative to all of the weighted, windowed averages used to construct all of the pixels for the hyperspectral image 114.
  • a panchromatic representation of a hyperspectral image 114 formed from a hyperspectral data cube 12 using an embodiment of the resolution-controlling method of the present invention may establish a one-to-one mapping between the hyperspectral data cube subpixels and the hyperspectral image pixels with the possible exception of boundary hyperspectral data cube subpixels such as 518, 520.
  • The boundary hyperspectral data cube subpixels 518, 520 may not have an analogue in the resulting hyperspectral image 114, establishing a final resolution for the hyperspectral image of (M-2) x (N-2) pixels, where M and N are the number of hyperspectral data cube subpixels in a row and column respectively, effectively improving the spatial resolution of the sensor by a factor equal to the spectral cube dimension.
  • Using a single weighting scheme may result in an image that is displayed as grayscale.
  • Multiple different weighting schemes may be employed simultaneously to generate distinct channels of a resulting image.
  • the display of a panchromatic representation of a hyperspectral image 114 may have multiple color channels.
  • The resulting channels, typically expressed as RGB, may represent the actual visible color in which the subject of the image appears, or a false color image whereby the weighted panchromatic representation of the hyperspectral image is imbued with visible color characteristics that may not represent the actual color characteristics of the subject of the image but are nonetheless visually appealing to a viewer; a sketch of generating such channels from multiple weighting schemes is given after this list.
  • An important distinction between the method of the present invention and the well-known techniques of demosaicing is that the techniques commonly used in demosaicing interpolate surrounding pixels to estimate the red, green and blue subpixel values that are inherently missing in each subpixel.
  • The method of the current invention does not attempt to interpolate missing pixel values by analysis of surrounding subpixels but instead changes the center of the RGB pixel that is used for display. By iterating over each of the color channels, the method may perform an interpolation of the RGB pixel value by continually redefining the color filter array pattern definition of a Bayer filter without trying to interpolate the missing pixel values.
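
The per-pixel conversion of FIG. 1 described above can be summarized in a short sketch. This is a minimal illustration only, assuming the hyperspectral data cube is held as a two-dimensional array of subpixel intensities with a 3 x 3 block of subpixels per pixel; the function name and array layout are illustrative and are not taken from the patent.

    import numpy as np

    def panchromatic_baseline(subpixels, block=3):
        # FIG. 1 style conversion: each non-overlapping block x block group of
        # subpixels (one hyperspectral pixel) is averaged into one panchromatic value.
        m, n = subpixels.shape
        assert m % block == 0 and n % block == 0, "subpixel array must tile exactly into pixels"
        # Reshape so each pixel's subpixels sit on axes 1 and 3, then average them away.
        tiles = subpixels.reshape(m // block, block, n // block, block)
        return tiles.mean(axis=(1, 3))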
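
The upsampling method 80 of FIG. 2 can likewise be sketched. This is a schematic reading of the flowchart rather than the patented implementation: it assumes a two-dimensional subpixel array, a weight array the same shape as the window, single-subpixel shifts, and weights that sum to 1, so a 3 x 3 window on an M x N subpixel array yields (M-2) x (N-2) output pixels.

    import numpy as np

    def upsample_panchromatic(subpixels, weights):
        # Slide the window one subpixel at a time over the data cube, form the
        # weighted average of the covered subpixels, and assign it to the output
        # pixel at the corresponding location (steps 82-92 of FIG. 2).
        wm, wn = weights.shape
        m, n = subpixels.shape
        out = np.empty((m - wm + 1, n - wn + 1), dtype=float)
        for r in range(out.shape[0]):        # shift the window down one row at a time
            for c in range(out.shape[1]):    # shift the window right one column at a time
                window = subpixels[r:r + wm, c:c + wn]
                out[r, c] = np.sum(window * weights)  # weighted average of the windowed subpixels
        return out

    # Equal weights of 1/P (here P = 9) reproduce a plain averaging operation:
    # pan = upsample_panchromatic(cube_2d, np.full((3, 3), 1.0 / 9.0))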
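
Generating color channels from several weighting schemes, as described above, might look like the following. It reuses the upsample_panchromatic sketch above; the example 3 x 3 weight arrays are purely hypothetical and are not the solar or atmospheric compensation weights mentioned in the description.

    import numpy as np

    def multichannel_image(subpixels, channel_weights):
        # One weighting scheme per output channel; each scheme yields one grayscale
        # plane, and the planes together form an RGB or false color image.
        return {name: upsample_panchromatic(subpixels, w)
                for name, w in channel_weights.items()}

    # Purely illustrative schemes (hypothetical band groupings, not from the patent):
    rgb_schemes = {
        "R": np.eye(3) / 3.0,              # emphasize one diagonal of spectral bands
        "G": np.full((3, 3), 1.0 / 9.0),   # equal weighting of all nine bands
        "B": np.fliplr(np.eye(3)) / 3.0,   # emphasize the other diagonal of bands
    }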

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Spectrometry And Color Measurement (AREA)
EP13182230.6A 2012-08-29 2013-08-29 Method of controlling the resolution of a hyperspectral image Withdrawn EP2703792A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/598,029 US8797431B2 (en) 2012-08-29 2012-08-29 Method of controlling the resolution of a hyperspectral image

Publications (1)

Publication Number Publication Date
EP2703792A1 (en) 2014-03-05

Family

ID=49054431

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13182230.6A Withdrawn EP2703792A1 (en) 2012-08-29 2013-08-29 Method of controlling the resolution of a hyperspectral image

Country Status (6)

Country Link
US (1) US8797431B2 (en)
EP (1) EP2703792A1 (en)
JP (1) JP6291189B2 (ja)
CN (1) CN103679758B (zh)
BR (1) BR102013020692A2 (pt)
CA (1) CA2824231A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014125804A1 (ja) * 2013-02-13 2014-08-21 Panasonic Corporation Multispectral imaging device and multispectral imaging method
US20220404531A1 (en) * 2021-06-17 2022-12-22 Drs Network & Imaging Systems, Llc Method and system for fabrication and use of a spectral basis filter
CN114383668B (zh) * 2022-03-24 2022-05-24 Beihang University Flow field measurement device and method based on a variable background
CN115060367B (zh) * 2022-06-13 2023-04-21 East China Normal University Whole-slide data cube acquisition method based on a microscopic hyperspectral imaging platform

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060188147A1 (en) * 2005-02-24 2006-08-24 Rai Barinder S Method and apparatus applying digital image filtering to color filter array data
WO2008070542A1 (en) * 2006-12-01 2008-06-12 Harris Corporation Spatial and spectral calibration of a panchromatic, multispectral image pair

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6847737B1 (en) * 1998-03-13 2005-01-25 University Of Houston System Methods for performing DAF data filtering and padding
US6639665B2 (en) * 2001-02-09 2003-10-28 Institute For Technology Development Multispectral imaging system for contaminant detection
US7231084B2 (en) * 2002-09-27 2007-06-12 Motorola, Inc. Color data image acquistion and processing
US7751594B2 (en) * 2003-04-04 2010-07-06 Lumidigm, Inc. White-light spectral biometric sensors
CA2547359C (en) 2003-11-26 2012-11-27 Florida Environmental Research Institute, Inc. Spectral imaging system
US7242478B1 (en) * 2003-12-05 2007-07-10 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
US7979209B2 (en) 2005-04-15 2011-07-12 Mississippi State University Research And Technology Corporation Temporal mapping and analysis
US20070058050A1 (en) * 2005-09-12 2007-03-15 Manuel Innocent Reconstructing a full color image from an image encoded by a bayer pattern
EP2174176A1 (en) * 2007-07-17 2010-04-14 Explay Ltd. Coherent imaging method of laser projection and apparatus thereof
JP5305992B2 (ja) * 2009-03-10 2013-10-02 Canon Inc. Image processing apparatus and image processing method
US8351031B2 (en) * 2009-06-05 2013-01-08 Spectral Sciences, Inc. Single-shot spectral imager
US8611603B2 (en) * 2012-02-14 2013-12-17 The United States Of America As Represented By The Secretary Of The Army Method and apparatus for object tracking via hyperspectral imagery

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060188147A1 (en) * 2005-02-24 2006-08-24 Rai Barinder S Method and apparatus applying digital image filtering to color filter array data
WO2008070542A1 (en) * 2006-12-01 2008-06-12 Harris Corporation Spatial and spectral calibration of a panchromatic, multispectral image pair

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ALTUNBASAK Y ET AL: "Demosaicking: color filter array interpolation [exploring the imaging process and the correlations among three color planes in single-chip digital cameras]", IEEE SIGNAL PROCESSING MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 22, no. 1, 1 January 2005 (2005-01-01), pages 44 - 54, XP011128684, ISSN: 1053-5888, DOI: 10.1109/MSP.2005.1407714 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2782161C1 (ru) * 2021-12-07 2022-10-21 Федеральное государственное казенное военное образовательное учреждение высшего образования "Военный учебно-научный центр Военно-воздушных сил "Военно-воздушная академия имени профессора Н.Е. Жуковского и Ю.А. Гагарина" (г. Воронеж) Министерства обороны Российской Федерации Method for increasing the spatial resolution of hyperspectral images

Also Published As

Publication number Publication date
CA2824231A1 (en) 2014-02-28
JP2014049126A (ja) 2014-03-17
US8797431B2 (en) 2014-08-05
JP6291189B2 (ja) 2018-03-14
BR102013020692A2 (pt) 2016-03-15
CN103679758B (zh) 2018-01-02
US20140063298A1 (en) 2014-03-06
CN103679758A (zh) 2014-03-26

Similar Documents

Publication Publication Date Title
Abraham et al. The Gemini Deep Deep Survey. I. Introduction to the survey, catalogs, and composite spectra
US6970597B1 (en) Method of defining coefficients for use in interpolating pixel values
US7373019B2 (en) System and method for providing multi-sensor super-resolution
Broadwater et al. Hybrid detectors for subpixel targets
US9922407B2 (en) Analysis of a multispectral image
EP2703792A1 (en) Method of controlling the resolution of a hyperspectral image
IL226166A (en) A system and method for reducing the dimensions of spectral figures
US20190139189A1 (en) Image remosaicing
US10089536B2 (en) Analysis of a multispectral image
US9336570B2 (en) Demosaicking system and method for color array based multi-spectral sensors
US9898953B2 (en) Offset method and equipment of RGBW panel subpixel
EP3975045A1 (en) Enhanced training method and apparatus for image recognition model
EP2929503A1 (de) Method and device for generating an improved color image with a sensor with a color filter
CN105096286A (zh) Remote sensing image fusion method and device
DE102012221667A1 (de) Device and method for processing remote sensing data
CN108288256A (zh) Multispectral mosaic image restoration method
CN109829872B (zh) Multi-temporal, multi-source remote sensing image fusion method for inland water remote sensing
CN104170377A (zh) Image processing device, imaging device, and image processing program
CN105719274A (zh) Edge detection system and method
Zhu et al. HCNNet: A hybrid convolutional neural network for spatiotemporal image fusion
CN110719447A (zh) Image sensor with a multi-channel narrow-band color filter array
EP2517172B1 (en) Filter setup learning for binary sensor
CN106470335B (zh) Image processing method and image display method based on subpixel sampling
EP3002731A2 (en) Determing color information using a binary sensor
CN113723228B (zh) Method, apparatus, electronic device and storage medium for determining the proportion of surface types

Legal Events

Date Code Title Description
AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140905

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180821

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220301