WO2023282069A1 - Signal processing device and signal processing method - Google Patents
Signal processing device and signal processing method
- Publication number
- WO2023282069A1 (PCT/JP2022/025013)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- information
- wavelength
- image data
- mask
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/135—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Definitions
- the present disclosure relates to a signal processing device and a signal processing method.
- A hyperspectral image contains spectral information of a large number of wavelength bands, each of which is a narrow band, for example ten bands or more.
- Cameras that acquire images in such multiple wavelength bands include “hyperspectral cameras” and “multispectral cameras”. These cameras are used in various fields such as food inspection, biopsy, drug development, and mineral composition analysis.
- Patent Document 1 discloses a compressed sensing hyperspectral camera.
- Compressed sensing is a technique that restores more data than observed data by assuming that the data distribution of the observed object is sparse in a certain space (for example, frequency space). The estimation operation assuming the sparsity of the observed object is called "sparse reconstruction”.
- The hyperspectral camera disclosed in Patent Document 1 acquires monochrome images through an array of filters whose spectral transmittance has maxima at multiple wavelengths. The imaging device restores a hyperspectral image from the monochrome image by calculation based on sparse reconstruction.
- Non-Patent Document 1 discloses an example of a snapshot-type hyperspectral imaging device suitable for observing the spectrum of fluorescence emitted from a phosphor.
- the present disclosure provides techniques for reducing the load of reconstruction operations by efficiently generating images of the required wavelength bands.
- a method is a computer-implemented signal processing method.
- The method includes: obtaining compressed image data including two-dimensional image information of an object, obtained by compressing hyperspectral information in a target wavelength range; obtaining reference spectral data including information on one or more spectra associated with the object; and generating, from the compressed image data, a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands determined based on the reference spectral data.
- A method according to another aspect generates mask data used for recovering spectral image data for each wavelength band from compressed image data including two-dimensional image information of a subject, obtained by compressing hyperspectral information in a target wavelength range.
- a method is a computer-implemented signal processing method.
- The method includes: obtaining compressed image data including two-dimensional image information of an object, obtained by compressing hyperspectral information in a target wavelength range; causing a display to present a graphical user interface for designating restoration conditions used to generate, from the compressed image data, a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands; acquiring reference spectral data including information on one or more spectra associated with the object; and causing images based on the reference spectral data to be displayed together on the display.
- Computer-readable recording media include non-volatile recording media such as CD-ROMs (Compact Disc-Read Only Memory).
- a device may consist of one or more devices. When the device is composed of two or more devices, the two or more devices may be arranged in one device, or may be divided and arranged in two or more separate devices. As used herein and in the claims, a "device" can mean not only one device, but also a system of multiple devices.
- FIG. 1A is a diagram schematically showing a configuration example of an imaging system.
- FIG. 1B is a diagram schematically showing another configuration example of the imaging system.
- FIG. 1C is a diagram schematically showing still another configuration example of the imaging system.
- FIG. 1D is a diagram schematically showing still another configuration example of the imaging system.
- FIG. 2A is a diagram schematically showing an example of a filter array.
- FIG. 2B is a diagram showing an example of a spatial distribution of light transmittance in each of a plurality of wavelength bands included in the target wavelength band.
- FIG. 2C is a diagram showing an example of spectral transmittance of area A1 included in the filter array shown in FIG. 2A.
- FIG. 2D is a diagram showing an example of spectral transmittance of area A2 included in the filter array shown in FIG. 2A.
- FIG. 3A is a diagram for explaining an example of the relationship between a target wavelength band and multiple wavelength bands included therein.
- FIG. 3B is a diagram for explaining another example of the relationship between a target wavelength band and multiple wavelength bands included therein.
- FIG. 4A is a diagram for explaining spectral transmittance characteristics in a certain region of the filter array.
- FIG. 4B is a diagram showing the result of averaging the spectral transmittance shown in FIG. 4A for each wavelength band.
- FIG. 5 is a block diagram showing a configuration example of a system for reducing the load of arithmetic processing.
- FIG. 6 is a diagram showing a modification of the system of FIG. 5.
- FIG. 7 is a diagram showing an example of mask data before conversion stored in memory.
- FIG. 8A is a diagram showing examples of spectra of four types of samples that can be included in the subject.
- FIG. 8B is a diagram showing an example of bands to be synthesized.
- FIG. 9 is a flowchart illustrating an example of mask data conversion processing.
- FIG. 10 is a diagram for explaining an example of a method of synthesizing mask information of a plurality of bands and converting it into new mask information.
- FIG. 11A is a diagram showing examples of reference spectra of four types of samples that can be included in the object.
- FIG. 11B is a diagram showing an example of band synthesis.
- FIG. 12 is a block diagram showing a configuration example of a system for labeling.
- FIG. 13A is a diagram showing examples of reference spectra of four types of samples that can be included in the subject.
- FIG. 13B is a diagram showing an example of band synthesis.
- FIG. 14 is a diagram showing an example of a graphical user interface (GUI).
- FIG. 15A is a first diagram for explaining a method of excluding a specific band from restoration targets.
- FIG. 15B is a second diagram for explaining the method of excluding a specific band from restoration targets.
- FIG. 16 is a flow chart showing an example of the operation of the signal processing circuit when excluding a particular band from restoration.
- FIG. 17 is a flow chart showing an example of the operation when reference spectrum data is translated into mask data conversion conditions and sent to the signal processing circuit.
- FIG. 18 is a diagram schematically showing the system configuration of the fourth embodiment.
- FIG. 19 is a diagram showing an example of the relationship between excitation light and fluorescence spectra and excluded bands.
- FIG. 20A is a diagram showing absorption spectra of five types of fluorescent dyes.
- FIG. 20B is a diagram showing fluorescence spectra of five types of fluorescent dyes.
- FIG. 21A is a diagram showing the relationship between the excitation wavelength and the absorption spectrum of the fluorescent dye in the first imaging.
- FIG. 21B is a diagram showing an example of the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- FIG. 22A is a diagram showing the relationship between the excitation wavelength and the absorption spectrum of the fluorescent dye in the second imaging.
- FIG. 22B is a diagram showing an example of the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- FIG. 23A is a diagram showing the relationship between the excitation wavelength and the absorption spectrum of the fluorescent dye in the third imaging.
- FIG. 23B is a diagram showing the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- All or part of a circuit, unit, device, member, or section, or all or part of a functional block in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large-scale integration).
- An LSI or IC may be integrated on one chip, or may be configured by combining a plurality of chips.
- functional blocks other than memory elements may be integrated into one chip.
- LSIs or ICs may be called system LSIs, VLSIs (very large scale integration), or ULSIs (ultra large scale integration) depending on the degree of integration.
- a Field Programmable Gate Array (FPGA), which is programmed after the LSI is manufactured, or a reconfigurable logic device that can reconfigure the connection relationships inside the LSI or set up the circuit partitions inside the LSI can also be used for the same purpose.
- circuits, units, devices, members or parts can be executed by software processing.
- The software is recorded on one or more non-transitory storage media such as a ROM, an optical disc, or a hard disk drive; when the software is executed by a processor, the functions specified in the software are performed by the processor and peripheral devices.
- a system or apparatus may comprise one or more non-transitory storage media on which software is recorded, a processing unit, and required hardware devices such as interfaces.
- Sparsity is the property that the elements that characterize the observation target exist sparsely in a certain space (for example, frequency space). Sparsity is ubiquitous in nature. By using sparsity, it becomes possible to efficiently observe necessary information.
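As a concrete illustration of sparsity in frequency space, the following sketch (a hypothetical signal, not taken from the disclosure) shows that a signal built from a few sinusoids is dense in the spatial domain yet concentrates nearly all of its energy in a handful of Fourier coefficients:

```python
import numpy as np

# A signal that is dense in the sample domain but sparse in frequency space:
# a mixture of two sinusoids, a toy stand-in for a sparsely structured scene.
n = 512
t = np.arange(n) / n
signal = 1.0 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 17 * t)

coeffs = np.fft.rfft(signal)
energy = np.abs(coeffs) ** 2

# Fraction of total energy captured by the 10 largest coefficients.
top10 = np.sort(energy)[::-1][:10].sum() / energy.sum()
print(f"energy in 10 largest of {len(energy)} coefficients: {top10:.4f}")
```

Because essentially all of the energy sits in a few coefficients, storing or estimating only those coefficients suffices, which is what makes efficient observation possible.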
- a sensing technique that uses sparsity is called compressed sensing. By using compressed sensing, highly efficient devices and systems can be constructed.
- An example of applying compressed sensing to a hyperspectral camera is disclosed in Patent Document 1. The hyperspectral camera disclosed in Patent Document 1 can capture a high-resolution, multi-wavelength moving image with high wavelength resolution in a single shot.
- An imaging device that utilizes compressed sensing comprises an array of optical filters having random light transmission properties with respect to space and/or wavelength.
- An array of such optical filters is sometimes referred to as a "coded mask” or “coded element.”
- the coded mask is arranged on the optical path of light incident on the image sensor, and transmits light incident from the subject with light transmission characteristics that differ depending on the area. This process with the encoding mask is called "encoding".
- a light image encoded by the encoding mask is captured by an image sensor.
- An image produced by imaging with a coded mask is referred to herein as a "compressed image”.
- Mask data indicating light transmission characteristics of the coded mask are recorded in advance in a storage device.
- a processing circuit in the imaging device performs restoration processing based on the compressed image and the mask data.
- The restoration process produces a restored image that has more wavelength information than the compressed image.
- the mask data is, for example, information indicating the spatial distribution of the spectral transmittance of the coded mask.
- the restoration process includes an estimation operation that assumes the sparsity of the imaging target.
- The operation performed in the sparse reconstruction can be, as disclosed in Patent Document 1 for example, an estimation of the data by minimizing an evaluation function that incorporates a regularization term such as the discrete cosine transform (DCT), the wavelet transform, the Fourier transform, or total variation (TV).
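For illustration, such an estimation by minimization can be sketched with iterative soft-thresholding (ISTA). This is a simplified stand-in, not the disclosed algorithm: all sizes are hypothetical, and the regularizers named above (DCT, wavelet, Fourier, TV) are replaced by plain L1 sparsity for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse-reconstruction setup: recover a sparse vector x from y = H @ x,
# where H stands in for the mask matrix (hypothetical sizes).
n, m = 200, 80
x_true = np.zeros(n)
x_true[[10, 50, 90, 130, 170]] = [1.5, -2.0, 1.0, -1.2, 0.8]
H = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))
y = H @ x_true

def ista(y, H, tau=0.01, iters=1000):
    """Minimize (1/2)||y - H x||^2 + tau * ||x||_1 by soft-thresholding."""
    L = np.linalg.norm(H, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(H.shape[1])
    for _ in range(iters):
        g = x + H.T @ (y - H @ x) / L      # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - tau / L, 0.0)  # shrinkage
    return x

x_hat = ista(y, H)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

The 80 measurements are far fewer than the 200 unknowns, yet the sparsity assumption lets the minimization recover the signal, which is the essence of the compressed-sensing restoration described above.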
- Such an estimation operation is a high-load operation using mask data having a size corresponding to the product of the number of pixels of the image sensor and the number of wavelength bands. Therefore, a processing circuit with high computing power is required. If the time required for such high-load calculations is longer than the exposure time during shooting, the calculation time will limit the operation speed (for example, frame rate) of the camera.
- In some applications, such as fluorescence observation and absorption spectrum observation, the spectrum assumed for the subject is known in advance (see, for example, Non-Patent Document 1).
- When the spectrum expected in the object is known, it is possible to reduce the amount of computation by appropriately editing or reducing the mask data used in the restoration process.
- the mask data can be edited or reduced, for example, based on wavelength band information required in processing such as analysis or classification performed after image restoration.
- the present disclosure discloses a method of reducing the computational load by referring to information on the known spectrum when the spectrum that the subject may have is known.
- a known spectrum assumed for each substance contained in a subject, that is, an object of observation is called a "reference spectrum”.
- Data indicating reference spectra of one or more substances that an observation target may have are collectively referred to as "reference spectrum data”.
- A signal processing method according to one aspect of the present disclosure includes: obtaining compressed image data including two-dimensional image information of an object, obtained by compressing hyperspectral information in a target wavelength range; obtaining reference spectral data including information on one or more spectra; and generating, from the compressed image data, a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands determined based on the reference spectral data.
- “Hyperspectral information in the target wavelength range” means information on the spatial distribution of luminance for each of a plurality of wavelength bands included in the target wavelength range. “Compressing hyperspectral information” means compressing the information on the spatial distribution of luminance in the multiple wavelength bands into one monochrome two-dimensional image using an encoding element such as the filter array described later.
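The compression can be written as g(x, y) = Σₖ Tₖ(x, y) · fₖ(x, y): each sensor pixel records a transmittance-weighted sum of the band images. A minimal sketch of this forward model, with hypothetical sensor size and band count:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: an 8x8-pixel sensor and 10 wavelength bands.
H, W, K = 8, 8, 10

# f: the hyperspectral cube (one luminance image per wavelength band).
f = rng.uniform(0.0, 1.0, (K, H, W))

# T: the filter array's transmittance per band and pixel, random with
# respect to space and wavelength as in a coded mask.
T = rng.uniform(0.0, 1.0, (K, H, W))

# The compressed (encoded) monochrome image: each pixel is a
# transmittance-weighted sum over all bands.
g = (T * f).sum(axis=0)

print(g.shape)   # one 2-D image holding the compressed spectral information
```

The K·H·W values of the cube are thus folded into a single H·W image; restoration is the inverse problem of estimating f from g and T.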
- a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands determined based on the reference spectral data are restored from the compressed image data. Therefore, the load of arithmetic processing can be reduced compared to the case of restoring two-dimensional image data corresponding to all wavelength bands included in the target wavelength band.
- the one or more spectra may be associated with one or more substances assumed to be contained in the subject.
- data defining the correspondence between substances and spectra may be recorded in advance in a storage medium such as a memory.
- Each of the plurality of designated wavelength bands may include a peak wavelength of the spectrum associated with a corresponding one of the one or more substances. This allows each designated wavelength band to correspond to one substance, facilitating classification based on the reconstructed image.
- the reference spectral data may include information on multiple spectra associated with multiple types of substances assumed to be contained in the subject.
- the plurality of designated wavelength bands may include a first designated wavelength band having no overlap among the plurality of spectra and a second designated wavelength band having overlap among the plurality of spectra. This can facilitate classification based on the reconstructed image even if there is spectral overlap of two or more substances.
- That a certain designated wavelength band “does not have overlap between a plurality of spectra” means that, in the designated wavelength band, only one of the plurality of spectra has significant intensity and the other spectra have no significant intensity.
- That a designated wavelength band “has overlap between spectra” means that two or more spectra have significant intensity in that band. Whether a spectrum has “significant intensity” can be determined, for example, based on the value obtained by integrating its intensity from the lower-limit wavelength to the upper-limit wavelength of the designated band. Let the signal S be the maximum of the integrated values of the plurality of spectra, and the noise N be the sum of the other integrated values. If the S/N ratio exceeds a threshold (for example 1, 2, or 3), the designated wavelength band can be judged to have no overlap between the spectra; conversely, if the S/N ratio is at or below the threshold, it can be judged to have overlap.
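A sketch of this S/N judgment, using two hypothetical Gaussian reference spectra (all wavelengths, widths, and the threshold value are illustrative assumptions):

```python
import numpy as np

def band_has_overlap(spectra, wavelengths, lo, hi, threshold=2.0):
    """Judge whether the designated band [lo, hi] has overlap between spectra.

    spectra: (num_substances, num_wavelengths) intensities sampled on a
    uniform wavelength grid. Each spectrum is integrated over the band (a
    plain sum; the constant grid spacing cancels in the ratio); S is the
    largest integral, N the sum of the rest, and overlap is declared when
    S/N is at or below the threshold.
    """
    sel = (wavelengths >= lo) & (wavelengths <= hi)
    integrals = spectra[:, sel].sum(axis=1)
    s = integrals.max()
    n = integrals.sum() - s
    return bool(n > 0 and s / n <= threshold)

# Two hypothetical Gaussian reference spectra peaking at 520 nm and 600 nm.
wl = np.linspace(400.0, 700.0, 301)
spec = np.stack([np.exp(-(((wl - 520.0) / 15.0) ** 2)),
                 np.exp(-(((wl - 600.0) / 15.0) ** 2))])

print(band_has_overlap(spec, wl, 510.0, 530.0))  # only the 520 nm spectrum emits here
print(band_has_overlap(spec, wl, 550.0, 570.0))  # both tails contribute
```

The 510–530 nm band is dominated by a single spectrum (high S/N, no overlap), while in 550–570 nm the two tails contribute equally (S/N near 1, overlap).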
- the compressed image data can be generated using an image sensor and a filter array including a plurality of types of optical filters with different spectral transmittances.
- the method may further include obtaining mask data reflecting the spatial distribution of the spectral transmittance.
- the plurality of two-dimensional image data may be generated based on the compressed image data and the mask data.
- the mask data may include mask matrix information having elements according to the spatial distribution of the transmittance of the filter array for each of a plurality of unit bands included in the target wavelength band.
- The method may further include: generating synthesized mask information by synthesizing the mask matrix information corresponding to a non-designated wavelength band different from the designated wavelength bands in the target wavelength range; and generating composite image data for the non-designated wavelength band based on the compressed image data and the synthesized mask information. As a result, the amount of data is reduced by synthesizing the mask matrix information for the non-designated wavelength band, so the load of arithmetic processing can be further reduced.
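One way to realize this synthesis is sketched below: the mask planes of the non-designated bands are collapsed into a single plane, so the restoration solves for far fewer unknowns. All sizes and the choice of designated bands are hypothetical, and summation of transmittances is one reasonable convention for combining bands, not the only one.

```python
import numpy as np

rng = np.random.default_rng(2)

K, H, W = 16, 4, 4                        # hypothetical unit bands, sensor size
mask = rng.uniform(0.0, 1.0, (K, H, W))   # per-band transmittance of the array

designated = [3, 4, 9, 10]                # bands tied to reference-spectrum peaks
others = [k for k in range(K) if k not in designated]

# Keep the designated bands' mask planes as-is; collapse all non-designated
# bands into a single synthesized plane (their summed transmittance).
synthesized = mask[others].sum(axis=0, keepdims=True)
reduced_mask = np.concatenate([mask[designated], synthesized], axis=0)

# The restoration now solves for 4 designated band images plus 1 composite
# image instead of 16 band images.
print(mask.shape, "->", reduced_mask.shape)
```

The composite image recovered with the synthesized plane aggregates the non-designated bands, which is acceptable precisely because those bands are of low importance for the subsequent analysis.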
- Generating the plurality of two-dimensional image data may include generating and outputting the plurality of two-dimensional image data corresponding to the designated wavelength bands without generating, from the compressed image data, image data corresponding to a non-designated wavelength band different from the designated wavelength bands in the target wavelength range. In other words, the method need not include generating, from the compressed image data, image data corresponding to the non-designated wavelength band. As a result, restoration processing is not performed for non-designated wavelength bands of low importance, so the computation load can be further reduced.
- the plurality of designated wavelength bands may be determined based on the intensity of the one or more spectra indicated by the reference spectral data or a differential value of the intensity.
- each designated wavelength band may be determined to include a peak wavelength at which the intensity of the corresponding spectrum peaks.
- each designated wavelength band may be determined by avoiding wavelength regions in which the absolute value of the derivative of the intensity of the corresponding spectrum is close to zero. Such a method allows determination of designated wavelength bands containing characteristic portions of the spectrum, facilitating processing such as classification after reconstruction.
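A sketch of the peak-based determination described above, using the same two hypothetical Gaussian reference spectra; the half-width of each band is an assumed design parameter, not something prescribed by the text:

```python
import numpy as np

def designated_bands(spectra, wavelengths, half_width=10.0):
    """Pick one designated band per reference spectrum, centred on its peak.

    Returns (lo, hi) wavelength pairs. half_width (in nm) is an assumed
    design parameter controlling how much of the peak each band covers.
    """
    bands = []
    for s in spectra:
        peak = float(wavelengths[np.argmax(s)])   # wavelength of maximum intensity
        bands.append((peak - half_width, peak + half_width))
    return bands

# Hypothetical Gaussian reference spectra peaking at 520 nm and 600 nm.
wl = np.linspace(400.0, 700.0, 301)
spec = np.stack([np.exp(-(((wl - 520.0) / 15.0) ** 2)),
                 np.exp(-(((wl - 600.0) / 15.0) ** 2))])

print(designated_bands(spec, wl))   # [(510.0, 530.0), (590.0, 610.0)]
```

Avoiding flat regions could be added by masking out wavelengths where the absolute derivative of the intensity falls below a tolerance before selecting the band edges.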
- the reference spectral data may include information on fluorescence spectra of one or more substances assumed to be contained in the subject. Thereby, restoration processing can be performed with an appropriate band configuration based on the fluorescence spectrum of the fluorescent substance.
- the reference spectrum data may include information on absorption spectra of one or more substances assumed to be contained in the subject. Thereby, restoration processing can be performed with an appropriate band configuration based on the absorption spectrum of the substance.
- the method further includes causing the display to display a graphical user interface (GUI) for allowing a user to specify the one or more spectra or one or more substances associated with the one or more spectra.
- the reference spectral data may be obtained in response to the specified one or more spectra or the specified one or more substances.
- restoration processing can be performed with a band configuration corresponding to the spectrum or material designated by the user on the GUI.
- According to another aspect, a method generates mask data used to restore spectral image data for each wavelength band from compressed image data containing two-dimensional image information of an object obtained by compressing hyperspectral information in a target wavelength range.
- The method includes: obtaining first mask data for restoring first spectral image data corresponding to a first wavelength band group in the target wavelength range; obtaining reference spectral data including information on at least one spectrum; determining, based on the reference spectral data, one or more designated wavelength ranges included in the target wavelength range; and generating, based on the first mask data, second mask data for restoring second spectral image data corresponding to a second wavelength band group in the one or more designated wavelength ranges.
- the first wavelength band group may be a set of all or some of the wavelength bands included in the target wavelength range.
- the first wavelength band group may be a set of a plurality of unit wavelength bands with narrow bandwidths included in the target wavelength range.
- the second wavelength band group may be a set of all or part of the wavelength bands included in the designated wavelength band.
- the second wavelength band group may be a set of multiple unit wavelength bands included in the specified wavelength band.
- Each of the first wavelength band group and the second wavelength band group may be a set of synthetic bands obtained by synthesizing two or more unit wavelength bands. When such band synthesis is performed, mask data conversion processing is performed in accordance with the manner of band synthesis.
- According to the above method, second mask data is generated for restoring the second spectral image data corresponding to the second wavelength band group in the designated wavelength range determined based on the reference spectral data. Since the data size of the second mask data is smaller than that of the first mask data, restoration using the second mask data can be performed efficiently.
- The above method of generating the second mask data may be performed by a device that restores spectral image data for each wavelength band from compressed image data based on the second mask data, or by another apparatus connected to that device.
- the compressed image data can be generated using an image sensor and a filter array including a plurality of types of optical filters with different spectral transmittances.
- the first mask data and the second mask data may be data reflecting a spatial distribution of spectral transmittance of the filter array.
- the first mask data may include first mask information indicating the spatial distribution of the spectral transmittance corresponding to the first wavelength band group.
- the second mask data may include second mask information indicating the spatial distribution of the spectral transmittance corresponding to the second wavelength band group.
- the second mask data may further include third mask information obtained by synthesizing a plurality of pieces of information.
- Each of the plurality of pieces of information indicates the spatial distribution of the spectral transmittance in a corresponding wavelength band included in a non-designated wavelength range other than the designated wavelength range in the target wavelength range.
- the second mask data may not include information on the spatial distribution of the spectral transmittance in the corresponding wavelength band included in the non-designated wavelength range other than the designated wavelength range.
- In another aspect, a signal processing method includes: obtaining compressed image data including two-dimensional image information of an object obtained by compressing hyperspectral information in a target wavelength range; obtaining reference spectral data including information on one or more spectra; and causing a display to display both a graphical user interface for specifying restoration conditions for generating, from the compressed image data, a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands, and an image based on the reference spectral data.
- the user can specify restoration conditions for generating a plurality of two-dimensional image data corresponding to a plurality of designated wavelength bands. Thereby, it is possible to efficiently generate a plurality of two-dimensional image data corresponding to a desired designated wavelength band.
- the present disclosure also includes a computer program for causing a computer to execute each of the above methods.
- the present disclosure also includes a signal processing apparatus comprising a processor for performing each of the above methods and a memory storing a computer program executed by the processor.
- FIG. 1A is a diagram schematically showing a configuration example of an imaging system.
- This system includes an imaging device 100 and a processing device 200 .
- the imaging device 100 has a configuration similar to that of the imaging device disclosed in Patent Document 1.
- the imaging device 100 includes an optical system 140 , a filter array 110 and an image sensor 160 .
- Optical system 140 and filter array 110 are arranged on the optical path of light incident from object 70, which is a subject.
- Filter array 110 in the example of FIG. 1A is positioned between optical system 140 and image sensor 160 .
- FIG. 1A illustrates an apple as an example of the object 70 .
- Object 70 is not limited to an apple, and may be any object.
- the image sensor 160 generates data of the compressed image 10 in which information of multiple wavelength bands is compressed as a two-dimensional monochrome image.
- the processing device 200 can generate image data for each of a plurality of wavelength bands included in a predetermined target wavelength range.
- The plurality of image data corresponding one-to-one to the generated plurality of wavelength bands may be referred to as a "hyperspectral (HS) datacube" or "hyperspectral image data".
- Here, the number of wavelength bands N is an integer equal to or greater than 4.
- In the following description, the plurality of image data corresponding to the plurality of wavelength bands to be generated are referred to as a restored image 20W1, a restored image 20W2, ..., and a restored image 20WN, and may be collectively referred to as the "hyperspectral image 20".
- the data of the restored image of each wavelength band is sometimes called “spectral image data” or simply “spectral image”.
- In this specification, image data or signals, that is, a set of data or signals representing the pixel values of the pixels included in an image, may be simply referred to as an "image".
- the "target wavelength range” refers to the wavelength range determined by the upper limit wavelength and lower limit wavelength of the wavelength components included in the spectral image output by the system.
- the wavelength range of interest may correspond to a wavelength range of light detectable by a photodetector, such as an image sensor, in the system.
- the wavelength range of interest may be 400-700 nm.
- the combined wavelength range of 400-500 nm and 600-700 nm detectable by the photodetector may correspond to the wavelength range of interest.
- the filter array 110 in this embodiment is an array of a plurality of translucent filters arranged in rows and columns.
- the multiple filters include multiple types of filters having different spectral transmittances, ie, wavelength dependencies of light transmittances.
- the filter array 110 functions as the coding mask described above, modulates the intensity of incident light for each wavelength, and outputs the modulated light.
- the filter array 110 is arranged near or directly above the image sensor 160 .
- Here, "near" means close enough that the image of light from the optical system 140 is formed on the surface of the filter array 110 in a reasonably clear state.
- “Directly above” means that they are so close to each other that there is almost no gap. Filter array 110 and image sensor 160 may be integrated.
- the optical system 140 includes at least one lens. Although optical system 140 is shown as a single lens in FIG. 1A, optical system 140 may be a combination of multiple lenses. Optical system 140 forms an image on the imaging surface of image sensor 160 via filter array 110 .
- FIGS. 1B to 1D are diagrams showing configuration examples of the imaging device 100 in which the filter array 110 is arranged away from the image sensor 160.
- In the example of FIG. 1B, the filter array 110 is positioned between the optical system 140 and the image sensor 160, at a distance from the image sensor 160.
- In the example of FIG. 1C, the filter array 110 is positioned between the object 70 and the optical system 140.
- In the example of FIG. 1D, the imaging device 100 comprises two optical systems 140A and 140B, with the filter array 110 positioned therebetween.
- an optical system including one or more lenses may be arranged between filter array 110 and image sensor 160 .
- the image sensor 160 is a monochrome photodetector device having a plurality of two-dimensionally arranged photodetector elements (also referred to as "pixels" in this specification).
- the image sensor 160 can be, for example, a CCD (Charge-Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or an infrared array sensor.
- the photodetector includes, for example, a photodiode.
- Image sensor 160 does not necessarily have to be a monochrome type sensor. For example, a color type sensor may be used.
- A color type sensor includes, for its pixels, filters that transmit red light, filters that transmit green light, and filters that transmit blue light.
- Using a color type sensor increases the amount of information about wavelengths and can improve the accuracy of reconstruction of the hyperspectral image 20.
- the wavelength range to be acquired may be arbitrarily determined, and is not limited to the visible wavelength range, and may be the ultraviolet, near-infrared, mid-infrared, or far-infrared wavelength ranges.
- The processing device 200 may be a computer comprising one or more processors and one or more storage media such as memory.
- The processing device 200 generates data of the restored image 20W1, the restored image 20W2, ..., and the restored image 20WN.
- FIG. 2A is a diagram schematically showing an example of the filter array 110.
- The filter array 110 has a plurality of regions arranged two-dimensionally. In this specification, each of the plurality of regions may be referred to as a "cell".
- An optical filter having an individually set spectral transmittance is arranged in each region.
- the spectral transmittance is represented by a function T( ⁇ ), where ⁇ is the wavelength of incident light.
- the spectral transmittance T( ⁇ ) can take a value of 0 or more and 1 or less.
- the filter array 110 has 48 rectangular regions arranged in 6 rows and 8 columns. This is only an example and in actual applications more areas may be provided. The number may be about the same as the number of pixels of the image sensor 160, for example. The number of filters included in the filter array 110 is determined depending on the application, for example, within the range of tens to tens of millions.
- FIG. 2B is a diagram showing an example of the spatial distribution of the light transmittance of each of the wavelength band W1, the wavelength band W2, ..., and the wavelength band WN.
- the difference in shading in each region represents the difference in transmittance.
- a lighter area has a higher transmittance, and a darker area has a lower transmittance.
- the spatial distribution of light transmittance differs depending on the wavelength band.
- FIGS. 2C and 2D are diagrams respectively showing examples of spectral transmittance of area A1 and area A2 included in filter array 110 shown in FIG. 2A.
- the spectral transmittance of the area A1 and the spectral transmittance of the area A2 are different from each other.
- the spectral transmittance of filter array 110 differs depending on the region. However, it is not necessary that all regions have different spectral transmittances.
- Filter array 110 includes two or more filters having different spectral transmittances.
- the number of spectral transmittance patterns in the plurality of regions included in the filter array 110 can be equal to or greater than the number N of wavelength bands included in the wavelength range of interest.
- the filter array 110 may be designed such that more than half of the regions have different spectral transmittances.
- the target wavelength band W can be set in various ranges depending on the application.
- the target wavelength range W can be, for example, a visible light wavelength range from about 400 nm to about 700 nm, a near-infrared wavelength range from about 700 nm to about 2500 nm, or a near-ultraviolet wavelength range from about 10 nm to about 400 nm.
- the target wavelength range W may be a wavelength range such as mid-infrared or far-infrared.
- the wavelength range used is not limited to the visible light range.
- the term “light” refers to radiation in general, including not only visible light but also infrared rays and ultraviolet rays.
- N is an arbitrary integer of 4 or more, and each wavelength band obtained by equally dividing the target wavelength range W into N is defined as the wavelength band W1, the wavelength band W2, ..., and the wavelength band WN.
- a plurality of wavelength bands included in the target wavelength band W may be set arbitrarily. For example, different wavelength bands may have different bandwidths. There may be gaps or overlaps between adjacent wavelength bands. In the example shown in FIG. 3B, the wavelength bands have different bandwidths and there is a gap between two adjacent wavelength bands. Thus, how to determine a plurality of wavelength bands is arbitrary.
- FIG. 4A is a diagram for explaining spectral transmittance characteristics in a certain region of the filter array 110.
- the spectral transmittance has multiple maxima (that is, maxima P1 to P5) and multiple minima for wavelengths within the wavelength range W of interest.
- normalization is performed so that the maximum value of the light transmittance within the target wavelength range W is 1 and the minimum value is 0.
- the spectral transmittance has maximum values in wavelength bands such as the wavelength band W2 and the wavelength band WN-1.
- the spectral transmittance of each region can be designed to have a maximum value in at least two of the wavelength bands W1 to WN.
- maximum value P1, maximum value P3, maximum value P4, and maximum value P5 are greater than or equal to 0.5.
- The filter array 110 transmits many components of the incident light in certain wavelength bands and transmits fewer components in other wavelength bands. For example, for light in k wavelength bands out of the N wavelength bands, the transmittance can be greater than 0.5, and for light in the remaining N-k wavelength bands, the transmittance can be less than 0.5. Here, k is an integer satisfying 2 ≤ k < N. If the incident light is white light that evenly includes all wavelength components of visible light, the filter array 110 modulates the incident light, for each region, into light having a plurality of discrete intensity peaks with respect to wavelength, and superimposes and outputs these multi-wavelength components.
- FIG. 4B is a diagram showing, as an example, the result of averaging the spectral transmittance shown in FIG. 4A for each of the wavelength bands W1, W2, ..., WN.
- the averaged transmittance is obtained by integrating the spectral transmittance T( ⁇ ) for each wavelength band and dividing by the bandwidth of that wavelength band.
- the transmittance value averaged for each wavelength band is defined as the transmittance in that wavelength band.
- the transmittance is remarkably high in the three wavelength regions having the maximum values P1, P3 and P5. In particular, the transmittance exceeds 0.8 in the two wavelength regions having the maximum values P3 and P5.
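The band-averaging described above can be sketched with a toy transmittance curve. On a uniform wavelength grid, integrating T(λ) over a band and dividing by the bandwidth reduces to taking the mean of the samples in that band; the transmittance function and the number of bands here are arbitrary assumptions.

```python
import numpy as np

# Toy spectral transmittance T(λ) of one region, sampled on a uniform grid.
wl = np.linspace(400.0, 700.0, 601)        # nm, 0.5 nm steps
T = 0.5 + 0.5 * np.sin(wl / 20.0)          # values stay within [0, 1]

# Equal-width bands W1..WN over the target wavelength range.
N = 10
edges = np.linspace(wl[0], wl[-1], N + 1)

# Average T(λ) over each band; on a uniform grid this equals the sample mean.
band_T = np.array([T[(wl >= edges[i]) & (wl < edges[i + 1])].mean()
                   for i in range(N)])
```

The resulting `band_T` plays the role of the per-band transmittance values plotted in FIG. 4B.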
- a grayscale transmittance distribution is assumed in which the transmittance of each region can take any value between 0 and 1 inclusive.
- a binary-scale transmittance distribution may be employed in which the transmittance of each region can take either a value of approximately 0 or approximately 1.
- In that case, each region transmits most of the light in at least two of the plurality of wavelength bands included in the target wavelength range and transmits little of the light in the remaining wavelength bands. Here, "most" refers to approximately 80% or more.
- Some of the cells may be replaced with transparent regions.
- A transparent region transmits light in all of the wavelength bands W1 to WN included in the target wavelength range W with a similarly high transmittance, e.g., a transmittance of 80% or more.
- The plurality of transparent regions may be arranged, for example, in a checkerboard pattern. That is, in the two directions in which the plurality of regions in the filter array 110 are arranged, the regions whose light transmittance differs depending on the wavelength and the transparent regions can be arranged alternately.
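The checkerboard arrangement can be illustrated with a toy binary-scale mask. The array size, band count, and the choice of 1.0 for "high transmittance" are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
rows, cols, n_bands = 6, 8, 10

# Random binary transmittance (0 or 1) per cell and per band.
mask = rng.integers(0, 2, size=(rows, cols, n_bands)).astype(float)

# Checkerboard: make every other cell a transparent region that passes
# all bands with high transmittance (here 1.0).
yy, xx = np.indices((rows, cols))
transparent = (yy + xx) % 2 == 0
mask[transparent, :] = 1.0
```

Cells with wavelength-dependent transmittance and transparent cells then alternate along both array directions, as described above.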
- Such data indicating the spatial distribution of the spectral transmittance of the filter array 110 is obtained in advance based on design data or actual measurement calibration, and stored in a storage medium included in the processing device 200. This data is used for arithmetic processing to be described later.
- the filter array 110 can be constructed using, for example, a multilayer film, an organic material, a diffraction grating structure, or a microstructure containing metal.
- As the multilayer film, for example, a dielectric multilayer film or a multilayer film containing a metal layer can be used.
- each cell is formed so that at least one of the thickness, material, and stacking order of each multilayer film is different. Thereby, different spectral characteristics can be realized depending on the cell.
- With a multilayer film, sharp rises and falls in spectral transmittance can be realized.
- a structure using an organic material can be realized by differentiating the pigment or dye contained in each cell or by laminating different materials.
- a configuration using a diffraction grating structure can be realized by providing diffraction structures with different diffraction pitches or depths for each cell.
- A configuration using a microstructure containing metal can utilize spectral dispersion due to the plasmon effect.
- the processing device 200 reconstructs the hyperspectral image 20 based on the compressed image 10 output from the image sensor 160 and the spatial distribution characteristics of transmittance for each wavelength of the filter array 110 .
- the generated hyperspectral image 20 includes multiple images.
- The plurality of images correspond to a plurality of wavelength bands, and the number of wavelength bands is greater than the three acquired by an ordinary color camera (that is, the wavelength bands of red light, green light, and blue light).
- the number of wavelength bands may be on the order of 4 to 100, for example. This number of wavelength regions is referred to as the "number of bands". Depending on the application, the number of bands may exceed 100.
- Let f denote the data of the hyperspectral image 20 to be obtained. If the number of bands is N, f is data that contains image data f1 corresponding to wavelength band W1, image data f2 corresponding to wavelength band W2, ..., and image data fN corresponding to wavelength band WN. Here, as shown in FIGS. 1A to 1D, the horizontal direction of the image is the x direction, and the vertical direction of the image is the y direction. If the number of pixels in the x direction of the image data to be obtained is m and the number of pixels in the y direction is n, each of the image data f1, f2, ..., fN is two-dimensional data with n × m pixels. Therefore, the data f is three-dimensional data having n × m × N elements.
- This three-dimensional data is called “hyperspectral image data” or “hyperspectral datacube”.
- the number of elements of the data g of the compressed image 10 obtained by being encoded and multiplexed by the filter array 110 is n ⁇ m.
- Data g can be represented by the following equation (1): g = Hf (1)
- Here, each of f1, f2, ..., fN is data having n × m elements. Therefore, the vector on the right side is a one-dimensional vector with n × m × N rows and 1 column.
- The compressed image 10 is likewise represented as a one-dimensional vector g with n × m rows and 1 column for the calculation.
- The matrix H represents a transformation that encodes and intensity-modulates each component f1, f2, ..., fN of the vector f with encoding information that differs for each wavelength band, and adds the results. Therefore, H is a matrix with n × m rows and n × m × N columns.
- the mask information may be interpreted as matrix H in equation (1).
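The relationship g = Hf of equation (1) and the stated dimensions of H can be checked numerically on a tiny instance. The per-band masks and the cube below are random placeholders; each row of H holds the mask values of one pixel, so the matrix product reproduces the per-pixel wavelength-wise modulation and summation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, N = 4, 5, 3                        # tiny example: 4 x 5 pixels, 3 bands

masks = rng.random((N, n, m))            # per-band transmittance of the filter array
f_cube = rng.random((N, n, m))           # hyperspectral cube: one image per band

# Compressed image: modulate each band by its mask, then sum over bands.
g_img = np.sum(masks * f_cube, axis=0)   # shape (n, m)

# Same operation in matrix form g = Hf.
H = np.hstack([np.diag(masks[k].ravel()) for k in range(N)])   # (n*m) x (n*m*N)
f_vec = np.concatenate([f_cube[k].ravel() for k in range(N)])  # (n*m*N,)
g_vec = H @ f_vec
```

Because each band block of H is diagonal, H @ f_vec and the per-pixel sum over bands give the same compressed image.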
- The processing device 200 utilizes the redundancy of the images included in the data f and obtains the solution using a compressed sensing technique. Specifically, the desired data f is estimated by solving the following equation (2): f' = argmin_f { ||g - Hf||^2 + τφ(f) } (2)
- f' represents the estimated data of f.
- The first term in parentheses in the above formula represents the amount of deviation between the estimation result Hf and the acquired data g, i.e., the so-called residual term.
- the second term in parentheses is the regularization or stabilization term. Equation (2) means finding f that minimizes the sum of the first and second terms.
- a function in parentheses in Equation (2) is called an evaluation function. The processing device 200 can converge the solution by recursive iterative calculation and calculate f that minimizes the evaluation function as the final solution f'.
- the first term in parentheses in formula (2) means an operation for obtaining the sum of squares of the difference between the acquired data g and Hf obtained by transforming f in the estimation process using the matrix H.
- The second term, φ(f), is a constraint on the regularization of f, and is a function that reflects the sparse information of the estimated data. This function has the effect of smoothing or stabilizing the estimated data.
- the regularization term may be represented by, for example, the Discrete Cosine Transform (DCT), Wavelet Transform, Fourier Transform, or Total Variation (TV) of f. For example, when the total variation is used, it is possible to acquire stable estimated data that suppresses the influence of noise in the observed data g.
- the sparsity of the object 70 in the space of each regularization term depends on the texture of the object 70 .
- A regularization term may be chosen that makes the texture of the object 70 sparser in the space of the regularization term.
- multiple regularization terms may be included in the operation.
- τ is a weighting factor. The larger the weighting factor τ, the larger the reduction amount of redundant data and the higher the compression rate. The smaller the weighting factor τ, the weaker the convergence to the solution.
- The weighting factor τ is set to an appropriate value with which f converges to some extent and does not become over-compressed.
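One common way to solve a minimization of the form in equation (2) is iterative shrinkage-thresholding (ISTA), sketched below with φ(f) = ||f||₁ as the regularization term. The patent does not prescribe this particular solver; the matrix, the data, and the weighting factor here are toy values.

```python
import numpy as np

def ista(H, g, tau=0.01, n_iter=500):
    """Minimize 0.5 * ||g - H f||^2 + tau * ||f||_1 by iterative
    shrinkage-thresholding (one choice of solver for an equation-(2)-style
    objective)."""
    step = 1.0 / (np.linalg.norm(H, 2) ** 2)   # step <= 1/L, L = ||H||_2^2
    f = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = H.T @ (H @ f - g)               # gradient of the residual term
        z = f - step * grad
        f = np.sign(z) * np.maximum(np.abs(z) - step * tau, 0.0)  # soft-threshold
    return f

rng = np.random.default_rng(0)
H = rng.random((20, 60))                       # toy compressed-sensing matrix
f_true = np.zeros(60)
f_true[[3, 17, 42]] = 1.0                      # sparse ground truth
g = H @ f_true
f_est = ista(H, g, tau=0.001, n_iter=2000)
```

With a sufficiently small step size the evaluation function decreases monotonically; in practice accelerated variants (e.g. FISTA or ADMM) and regularizers such as total variation are used for image cubes.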
- the image encoded by the filter array 110 is acquired in a blurred state on the imaging surface of the image sensor 160.
- Therefore, the hyperspectral image 20 can be reconstructed by storing the blur information in advance and reflecting it in the matrix H described above.
- blur information is represented by a point spread function (PSF).
- PSF is a function that defines the degree of spread of a point image to peripheral pixels. For example, when a point image corresponding to one pixel in an image spreads over a region of k ⁇ k pixels around that pixel due to blurring, the PSF is a coefficient group that indicates the effect on the pixel value of each pixel in that region.
- the hyperspectral image 20 can be reconstructed by reflecting the influence of blurring of the encoding pattern by the PSF on the matrix H.
- The position where the filter array 110 is placed is arbitrary, but a position can be selected at which the encoding pattern of the filter array 110 does not become so diffuse that it disappears.
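The PSF-based correction can be illustrated by blurring a mask pattern with a small k × k kernel; using the blurred pattern when building H is one way to reflect the blur information in the matrix H. The 3 × 3 kernel weights below are assumed values.

```python
import numpy as np

def blur(img, psf):
    """Direct 'same'-size 2D filtering of img with a k x k PSF
    (correlation; identical to convolution for a symmetric PSF)."""
    k = psf.shape[0]
    pad = k // 2
    padded = np.pad(img, pad, mode="constant")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += psf[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

# 3x3 PSF: a point spreads slightly into its 8 neighbours (weights sum to 1).
psf = np.array([[0.025, 0.1, 0.025],
                [0.1,   0.5, 0.1],
                [0.025, 0.1, 0.025]])

# Folding the blur into the model means using the blurred mask pattern
# (per band) when building the matrix H.
mask = np.eye(5)                         # toy single-band mask pattern
blurred_mask = blur(mask, psf)
```

Each element of `blurred_mask` mixes the coefficients of neighbouring cells according to the PSF, which is exactly the effect the text describes reflecting in H.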
- each image of a plurality of wavelength bands can be restored from the compressed image 10 acquired by the image sensor 160 .
- However, this restoration processing imposes a high computational load, and the processing device 200 is required to have high computing power.
- FIG. 5 is a block diagram showing a configuration example of a system for using reference spectral data to reduce the computational load.
- the system includes an imaging device 100 , a processing device 200 , a display device 300 and an input user interface (UI) 400 .
- the processing device 200 is an example of a signal processing device in the present disclosure.
- the imaging device 100 includes an image sensor 160 and a control circuit 150 that controls the image sensor 160 .
- the imaging device 100 also includes a filter array 110 and at least one optical system 140, as shown in FIGS. 1A-1D.
- Image sensor 160 acquires a compressed image, which is a monochrome image based on light whose intensity has been modulated region by region by filter array 110 .
- Information of a plurality of wavelength bands included in the target wavelength band is superimposed on the data of each pixel of the compressed image. Therefore, it can be said that this compressed image is obtained by compressing the hyperspectral information within the target wavelength range as two-dimensional image information.
- In this specification, data representing a compressed image is referred to as "compressed image data".
- the processing device 200 includes a signal processing circuit 250 and a memory 210 such as RAM and ROM.
- Signal processing circuit 250 may be an integrated circuit comprising a processor such as a CPU or GPU.
- the signal processing circuit 250 performs restoration processing based on the compressed image data output from the image sensor 160 .
- the memory 210 stores computer programs executed by the processor included in the signal processing circuit 250 , various data referenced by the signal processing circuit 250 , and various data generated by the signal processing circuit 250 .
- the memory 210 stores mask data reflecting the spatial distribution of the spectral transmittance of the filter array 110 in the imaging device 100 .
- The mask data is data including information representing the matrix H in the above equations (1) and (2), or information for deriving that matrix (hereinafter sometimes referred to as "mask matrix information").
- The mask matrix information may be information in a matrix format having elements corresponding to the spatial distribution of the transmittance of the filter array 110 for each of a plurality of unit bands included in the target wavelength range, or information from which such a matrix can be derived.
- the mask data is created in advance and stored in memory 210 .
- the display device 300 includes an image processing circuit 320 and a display 330 .
- the image processing circuit 320 performs necessary processing on the image restored by the signal processing circuit 250 and displays it on the display 330 .
- Display 330 can be any display such as, for example, a liquid crystal or organic LED (OLED).
- the input UI 400 includes hardware and software for setting various conditions such as imaging conditions.
- the input UI 400 may include input devices such as a keyboard and mouse.
- the input UI 400 may be realized by a device capable of both input and output, such as a touch screen. In that case, the touch screen may also function as the display 330 .
- Imaging conditions may include conditions such as resolution, gain, and exposure time.
- the input imaging conditions are sent to the control circuit 150 of the imaging apparatus 100 .
- the control circuit 150 causes the image sensor 160 to perform imaging according to imaging conditions.
- the memory 410 stores spectrum data.
- the spectral data includes spectral information assumed for one or more substances that may be contained in the subject. Spectral data are prepared in advance for each substance and recorded in the memory 410 .
- the memory 410 may be an external memory or may be built into the imaging device 100 . Spectral data may be obtained by downloading over a network such as the Internet, for example.
- The user can select specific spectral data as reference spectral data. For example, when the user selects a particular material or substance on the input UI 400, the spectral data corresponding to that material or substance may be determined as the reference spectral data. When the reference spectral data is determined by the user's operation, it is sent to the signal processing circuit 250.
- the signal processing circuit 250 determines conditions for synthesizing mask data based on the reference spectral data.
- The conditions for synthesizing mask data are conditions for determining a plurality of designated wavelength bands for which restoration processing is performed. In other words, the signal processing circuit 250 determines, based on the reference spectral data, the plurality of designated wavelength bands on which restoration processing is performed. A wavelength range formed by the designated wavelength bands is referred to as a "designated wavelength range".
- the signal processing circuit 250 may automatically determine synthesis conditions based on the reference spectrum data, or may determine synthesis conditions according to conditions specified by the user using the input UI 400 .
- the synthesizing condition defines which of the multiple unit bands included in the target wavelength band should be synthesized and treated as one band.
- Each of the plurality of component bands is a narrow wavelength band included in the target wavelength band.
- a plurality of unit bands included in relatively less important wavelength ranges can be synthesized as one band.
- a plurality of unit bands included in a wavelength range assumed to best represent the characteristics of individual substances may be synthesized as one band.
- a synthesized relatively broad band may be referred to as a "synthetic band" in the following description.
- image data corresponding to the composite band may be called composite image data.
- the synthesizing conditions may include information on wavelength regions in which restoration processing is not performed. For example, it is possible to reduce the computational load by not performing restoration processing on the component bands included in the wavelength region of low importance in observation.
- the signal processing circuit 250 converts the mask data into smaller size mask data based on the determined synthesis condition and the mask data stored in the memory 210 .
- In the following description, the mask data before conversion is called "first mask data", and the mask data after conversion is called "second mask data".
- the first mask data can be used to reconstruct first spectral image data corresponding to a first group of wavelength bands in the wavelength range of interest.
- the first wavelength band group may be, for example, a set of multiple component bands included in the target wavelength band.
- the first spectral image data may be data containing image information of each component band included in the first wavelength band group.
- the second mask data can be used to reconstruct second spectral image data corresponding to a second group of wavelength bands in one or more designated wavelength ranges.
- the second wavelength band group may be a set of multiple component bands included in the specified wavelength band.
- the second spectral image data may be data containing image information of each component band included in the specified wavelength range.
- the first mask data may include first mask information indicating the spatial distribution of spectral transmittance corresponding to the first wavelength band group in filter array 110 .
- the second mask data may include second mask information indicating the spatial distribution of spectral transmittance corresponding to the second wavelength band group in filter array 110 .
- the second mask data may further include third mask information obtained by synthesizing multiple pieces of information corresponding to one or more non-designated wavelength ranges other than the designated wavelength range in the first mask data.
- each of the plurality of pieces of information synthesized into the third mask information can indicate the spatial distribution of spectral transmittance in the corresponding unit band included in the non-designated wavelength range.
- the third mask information can be said to be synthesized mask information obtained by synthesizing the mask matrix information corresponding to the unit bands included in the non-designated wavelength range in the first mask data.
- the size-compressed second mask data is generated by synthesizing the information of a plurality of component bands included in the non-designated wavelength range in the first mask data.
- the signal processing circuit 250 generates a plurality of two-dimensional image data corresponding to a plurality of specified wavelength bands based on the compressed image data and the second mask information in the second mask data.
- the signal processing circuit 250 further generates one or more pieces of synthesized image data corresponding to the one or more non-designated wavelength bands based on the compressed image data and the third mask information (that is, the synthesized mask information) in the second mask data.
- the signal processing circuit 250 compresses the size of the first mask data by processing such as averaging matrix elements corresponding to multiple unit bands to be synthesized.
- the signal processing circuit 250 uses the converted second mask data and the compressed image data output from the imaging device 100 to perform a restoration operation corresponding to the above equation (2). Thereby, the signal processing circuit 250 generates a restored image (that is, a spectral image) for each of the synthesized bands.
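As a rough illustration of this restoration step, the sketch below recovers per-band images from a compressed image by solving g = Hf with ordinary least squares; the actual system may use the regularized reconstruction of equation (2), and all array shapes and values here are invented for illustration.

```python
import numpy as np

# Minimal sketch of the restoration step: recover stacked per-band images f
# from a compressed image g using a mask matrix H. This is a stand-in for
# equation (2); a real system may use a regularized iterative solver.
rng = np.random.default_rng(0)

n_pixels = 16          # flattened image size (e.g. a 4x4 sensor)
n_bands = 6            # number of bands after synthesis

# H: one n_pixels-wide column block per band, encoding the filter-array
# transmittance for that band.
H = rng.uniform(0.1, 1.0, size=(n_pixels, n_pixels * n_bands))

f_true = rng.uniform(0.0, 1.0, size=n_pixels * n_bands)  # band images, stacked
g = H @ f_true                                           # compressed image

# Least-squares restoration (the system is underdetermined, so lstsq returns
# the minimum-norm solution).
f_rest, *_ = np.linalg.lstsq(H, g, rcond=None)

# The restored vector reproduces the observation even though it is not unique.
residual = float(np.linalg.norm(H @ f_rest - g))
```

Because there are fewer measurements than unknowns, the least-squares solution is only one of many vectors consistent with g; this is exactly why regularization (as in equation (2)) is used in practice.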
- the signal processing circuit 250 sends the generated restored image data to the image processing circuit 320 .
- the image processing circuit 320 draws the reconstructed image of each band after synthesis on the display 330 .
- the image processing circuit 320 performs processing such as, for example, determining the arrangement within the screen, linking each restored image with band information, or coloring corresponding to the wavelength, and then causes the display 330 to display the restored image.
- the signal processing circuit 250 determines band synthesis conditions based on the reference spectral data, but the present disclosure is not limited to such a form.
- FIG. 6 shows a variant of the system of FIG.
- a processor 420 is provided that determines the conversion conditions for mask data based on the reference spectral data output from the input UI 400 .
- Processor 420 determines band synthesis conditions based on the reference spectrum data, and sends the information to signal processing circuit 250 as mask data conversion conditions.
- the signal processing circuit 250 reads the necessary unit band information from the memory 210 according to the supplied mask data conversion conditions.
- the signal processing circuit 250 constructs size-compressed mask data from the read information and generates a restored image using the mask data. Such a configuration can further reduce the computation load of the processing device 200 .
- a modification such as that shown in FIG. 6 can be similarly used in various subsequent embodiments.
- FIG. 7 shows an example of first mask data before conversion stored in the memory 210 .
- the first mask data in this example includes information indicating the spatial distribution of transmittance of filter array 110 for each of a plurality of unit bands included in the target wavelength band.
- the first mask data in this example includes mask information for each of a large number of unit bands divided by 1 nm, and information on mask information acquisition conditions. Each unit band is specified by a lower wavelength limit and an upper wavelength limit.
- Mask information includes information on a mask image and a background image.
- the individual mask images shown in FIG. 7 are acquired by imaging a certain background through filter array 110 with image sensor 120 .
- suppose the plurality of component bands are the 1st component band, …, the kth component band, …, the Nth component band.
- in this case, equation (1) can be rewritten in a block form in which the matrix is divided into submatrices corresponding to the individual component bands.
- each of these submatrices may be treated as a diagonal matrix when the crosstalk between pixel (p, q) and pixel (r, s) of the image sensor 160 at the time the end user images the object 70 can be regarded as the same (1 ≤ p, r ≤ n, 1 ≤ q, s ≤ m, pixel (p, q) ≠ pixel (r, s)). Whether or not this crosstalk condition is satisfied may be determined in consideration of the imaging environment, including the optical lens used for imaging, and in consideration of whether the image quality of each restored image achieves the end user's purpose.
- if the data output by the image sensor when imaging the background (that is, the background image data) is denoted fk′, equation (4) follows, where fk′(i, j) is the pixel value of pixel (i, j) in the background image and hk(i, j) may be the pixel value of pixel (i, j) in the mask image, with 1 ≤ i ≤ n and 1 ≤ j ≤ m.
- Information on acquisition conditions includes information on exposure time and gain. Information about the acquisition conditions may not be included in the mask data. Note that in the example of FIG. 7, the information of the mask image and the background image is recorded for each of a plurality of unit bands with a width of 1 nm. The width of each unit band is not limited to 1 nm, and can be determined to any value. Moreover, when the uniformity of the background image is high, the mask data does not have to include the background image information. For example, in a configuration in which the image sensor 120 and the filter array 110 are integrated so as to face each other closely, the mask data may not include background image information because the mask information substantially matches the mask image.
- the mask data is, for example, data that defines the matrix H in the above equation (2).
- the format of the mask data may vary depending on system configuration.
- the mask data shown in FIG. 7 includes information for calculating the spatial distribution of spectral transmittance of filter array 110 .
- the mask data is not limited to such a format, and may be data directly indicating the spatial distribution of the spectral transmittance of the filter array 110 .
- the mask data may include a sequence of values obtained by dividing each pixel value of the mask image shown in FIG. 7 by the corresponding pixel value of the background image.
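One hypothetical in-memory layout combining the elements described above (per-unit-band mask and background images, acquisition conditions, and the derived transmittance ratio) is sketched below; all field names and array sizes are invented for illustration and are not taken from the patent.

```python
import numpy as np

# Hypothetical in-memory layout of the first mask data: per-unit-band
# mask and background images plus acquisition conditions. All field
# names and array sizes are invented for illustration.
H_PIX, W_PIX = 4, 4                      # tiny sensor for illustration

rng = np.random.default_rng(1)

def make_unit_band(lower_nm, upper_nm):
    return {
        "lower_nm": lower_nm,            # lower wavelength limit of the unit band
        "upper_nm": upper_nm,            # upper wavelength limit of the unit band
        "mask_image": rng.uniform(0.0, 1.0, (H_PIX, W_PIX)),
        "background_image": rng.uniform(0.9, 1.0, (H_PIX, W_PIX)),
    }

first_mask_data = {
    "acquisition": {"exposure_ms": 10.0, "gain": 1.0},
    # 1-nm-wide unit bands covering a 400-410 nm slice of the target range
    "unit_bands": [make_unit_band(400 + k, 401 + k) for k in range(10)],
}

# Transmittance distribution of one unit band = mask image / background image.
band0 = first_mask_data["unit_bands"][0]
transmittance0 = band0["mask_image"] / band0["background_image"]
```

As the text notes, the stored data could instead hold the precomputed ratio (mask image divided by background image) directly, in which case the `background_image` entries would be unnecessary.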
- the first mask data including information as shown in FIG. 7 is converted into second mask data having a smaller size. Examples of methods for transforming mask data are described in more detail below.
- Determination of bands to be synthesized may be performed by manual input by the user, or may be performed automatically based on reference spectral data.
- the input UI 400 has a function that allows the user to select the band to be synthesized.
- the input UI 400 may have a function of allowing the user to select not only the band to be synthesized, but also the wavelength range to be subjected to the restoration calculation, or the restoration condition such as the wavelength resolution.
- spectra corresponding to individual substances, such as phosphors, can be displayed on the input UI 400 as reference spectra.
- the user can select a specific combination of spectra from the displayed spectra, or select a specific combination of substances from a list.
- the user can further select bands to be synthesized from the spectrum of the selected substance on the UI. In this way, both the reference spectrum data and the UI for allowing the user to select the restoration conditions for the restoration calculation may be displayed on the display. This allows the user to more efficiently generate an image of the desired wavelength band.
- the contribution of each band to analysis or classification can be automatically estimated from the type or combination of selected substances.
- the signal processing circuitry 250 may synthesize bands whose contribution to analysis or classification is less than a threshold value and treat them as one band.
- the magnitude of the contribution can be determined, for example, from the signal intensity in the spectrum, the wavelength derivative of the signal intensity, or prior learning. For example, if the signal intensity of the spectrum of the substance contained in the subject is found to be less than the threshold value in a certain band, it can be determined that the contribution of the band to the result of analysis or classification is small.
- Pre-learning is learning based on statistical methods such as principal component analysis or regression analysis.
- as an example of pre-learning, a method can be considered in which a database recording the "wavelength range used for classification" for each individual substance is referred to, and the wavelength regions of the reference spectrum that do not fall under the "wavelength range used for classification" are determined to be "wavelength regions with low contribution".
- FIGS. 8A and 8B are diagrams for explaining a method of estimating a band with a small contribution to calculation based on the reference spectrum and determining band synthesis conditions.
- FIG. 8A shows examples of spectra of four types of samples 31, 32, 33, and 34 that may be included in the object.
- FIG. 8B shows an example of bands to be synthesized. Given reference spectrum data as shown in FIG. 8A, bands with low signal intensities of all reference spectra can be synthesized as shown in FIG. 8B. By such processing, a restored image can be acquired with sufficient wavelength resolution for bands with relatively high signal intensity in the reference spectrum, and the amount of calculation can be reduced without affecting the accuracy of analysis or classification.
- the wavelength band that contributes significantly to the analysis or classification is selected as the "designated wavelength band.”
- wavelength bands that contribute less to the analysis or classification are selected as "non-designated wavelength bands.”
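The rule sketched in FIGS. 8A and 8B, in which a unit band is marked as low-contribution when every reference spectrum stays below a signal-intensity threshold there and consecutive low-contribution bands are merged, can be expressed as follows; the Gaussian spectra, threshold, and wavelength grid are invented for illustration.

```python
import numpy as np

# A unit band is "low contribution" when all reference spectra stay below a
# threshold there; consecutive low-contribution bands are grouped so they can
# be synthesized into one band (cf. FIGS. 8A and 8B).
wavelengths = np.arange(400, 500)           # 100 unit bands, 1 nm each

def gaussian(center, width):
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

# Four hypothetical reference spectra standing in for samples 31-34.
ref_spectra = np.stack([
    gaussian(410, 4), gaussian(425, 4), gaussian(470, 4), gaussian(485, 4),
])

threshold = 0.05
low_contribution = np.all(ref_spectra < threshold, axis=0)

def group_runs(mask):
    """Group consecutive True indices into (start, end) index runs."""
    runs, start = [], None
    for i, flag in enumerate(mask):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(mask) - 1))
    return runs

bands_to_merge = group_runs(low_contribution)
```

Each `(start, end)` run covers unit bands in which every reference spectrum is weak; these are candidates for synthesis into a single band or for exclusion from the restoration calculation.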
- FIG. 9 is a flowchart showing an example of mask data conversion processing executed by the signal processing circuit 250 shown in FIG.
- the signal processing circuit 250 acquires compressed image data generated by the imaging device 100 .
- the signal processing circuit 250 acquires reference spectral data from the input UI 400 .
- the signal processing circuit 250 determines whether or not to perform mask data conversion processing based on the reference spectrum data. If determined necessary, the signal processing circuit 250 transforms the mask data based on the reference spectrum data. A decision as to whether or not to perform mask data conversion processing can be made based on whether or not there is a band that is estimated to have a small contribution in analysis or classification, as described above.
- the signal processing circuit 250 determines whether conversion of the mask data is necessary based on the user's input. If there is a band to be synthesized, the process proceeds to step S104, and the signal processing circuit 250 performs conversion processing. If there are no bands to be synthesized, the conversion process is skipped. In subsequent step S105, the signal processing circuit 250 performs the restoration operation shown in the above-described equation (2) based on the compressed image and the mask data to generate a restored image for each band. If the mask data has been converted in step S104, the signal processing circuit 250 performs restoration processing using the converted mask data. The restored image is sent to the display device 300 and displayed on the display 330 after necessary processing by the image processing circuit 320 . Acquisition of the compressed image in step S101 may be performed at any timing before step S105.
- FIG. 10 is a diagram for explaining an example of a method of synthesizing mask information of a plurality of bands and converting it into new mask information.
- mask information for component bands #1 to #20 is stored in memory 210 in advance as mask information before conversion.
- the synthesis process is not performed for the component bands #1 to #5, and the synthesis process is performed for the component bands #6 to #20.
- the transmittance distribution of filter array 110 is calculated by dividing the value of each region in the mask image by the value of the corresponding region in the background image.
- the data of each mask image stored in the memory 210 is called “unit mask image data”
- the data of each background image stored is called "unit background image data".
- the synthesized transmittance distribution is obtained by dividing the data obtained by summing the unit mask image data of bands #6 to #20 for each pixel by the data obtained by summing the unit background image data of bands #6 to #20 for each pixel.
- data obtained by summing or averaging mask image data of bands #6 to #20 may be used as synthesized mask data of bands #6 to #20.
- by this conversion, the matrix H′ is converted into the matrix H″, where, for example, h″(1,1) = (h6(1,1) + … + h20(1,1)) / 15.
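The synthesis of bands #6 to #20 described above can be sketched as follows, with random arrays standing in for the stored unit mask image data and unit background image data; sizes are illustrative.

```python
import numpy as np

# Bands #1-#5 keep their own mask information; bands #6-#20 are merged into
# one synthesized band. The merged transmittance is the pixel-wise sum of the
# unit mask images divided by the pixel-wise sum of the unit background images.
rng = np.random.default_rng(2)
n_bands, h, w = 20, 4, 4

mask_images = rng.uniform(0.0, 1.0, (n_bands, h, w))        # unit mask image data
background_images = rng.uniform(0.8, 1.0, (n_bands, h, w))  # unit background image data

keep = slice(0, 5)       # bands #1-#5 (0-indexed 0..4)
merge = slice(5, 20)     # bands #6-#20

merged_transmittance = (mask_images[merge].sum(axis=0)
                        / background_images[merge].sum(axis=0))

second_mask = np.concatenate([
    mask_images[keep] / background_images[keep],   # per-band transmittance
    merged_transmittance[None, ...],               # one synthesized band
])
```

Equivalently, at the level of the mask matrix, the elements of the merged columns can be averaged, as in h″(1,1) = (h6(1,1) + … + h20(1,1)) / 15.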
- the mask data conversion processing by synthesis may be performed in the environment used by the end user, or may be performed at a manufacturing site such as a factory that manufactures the system or device.
- the second mask data after conversion is stored in the memory 210 in place of or in addition to the first mask data before conversion during the manufacturing process.
- the signal processing circuit 250 can perform the restoration process using the pre-stored converted mask data in response to the user's input. This makes it possible to further reduce the processing load.
- FIGS. 11A and 11B are diagrams for explaining band synthesis processing in this embodiment.
- FIG. 11A shows examples of reference spectra of four types of samples 31, 32, 33, and 34 that can be included in the object.
- FIG. 11B shows an example of band synthesis.
- as shown in FIG. 11A, when the overlap between reference spectra is small, it is effective to perform band synthesis in accordance with the peaks of each spectrum. As a result, it is possible to create a situation in which a substance corresponding to approximately one spectrum appears in each restored image.
- when band synthesis is performed as shown in FIG. 11B, substantially only sample 31 has signal intensity in the image corresponding to band #1. Therefore, the restored image corresponding to band #1 can be handled as it is as the classified image of sample 31.
- similarly, the restored image of band #2 can be treated as the classified image of sample 32, the restored image of band #3 as the classified image of sample 33, and the restored image of band #4 as the classified image of sample 34.
- in this case, no separate classification processing needs to be performed after the restoration processing. As a result, the burden of signal processing can be greatly reduced.
- the amount of overlap between reference spectra can be determined, for example, by the following method. For example, consider the wavelength range from wavelength ⁇ 1 to ⁇ 2. When the values of each reference spectrum are integrated from ⁇ 1 to ⁇ 2, the integrated value of the reference spectrum with the largest integrated value is defined as signal S, and the sum of the integrated values of the other reference spectra is defined as noise N.
- the amount of overlap between reference spectra can be judged by the value of the signal-to-noise ratio (S/N ratio), which is the signal S divided by the noise N. For example, when the S/N ratio is lower than 1, it can be determined that the overlap is large and that the reference spectra overlap each other.
- when the S/N ratio is 1 or more, it can be determined that the overlap is small and that the reference spectra do not overlap. Alternatively, it may be determined that the reference spectra do not overlap when the S/N ratio is 2 or more, and that they overlap when the S/N ratio is less than 2.
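The S/N overlap measure just described can be sketched as follows; the two Gaussian reference spectra and the integration windows are invented for illustration.

```python
import numpy as np

# Overlap measure: integrate each reference spectrum over [lam1, lam2];
# S is the largest integral, N the sum of the rest, and an S/N below the
# chosen threshold means the spectra overlap in that window.
wavelengths = np.linspace(400.0, 500.0, 101)    # uniform 1 nm grid

def gaussian(center, width):
    return np.exp(-((wavelengths - center) ** 2) / (2.0 * width ** 2))

ref_spectra = np.stack([gaussian(420, 5), gaussian(480, 5)])

def overlap_sn_ratio(spectra, lam1, lam2):
    sel = (wavelengths >= lam1) & (wavelengths <= lam2)
    # rectangle-rule integral on a uniform grid (spacing cancels in the ratio)
    integrals = spectra[:, sel].sum(axis=1)
    s = integrals.max()
    n = integrals.sum() - s
    return float("inf") if n == 0 else float(s / n)

# Window centered on one well-separated peak: large S/N, little overlap.
sn_isolated = overlap_sn_ratio(ref_spectra, 410, 430)
# Window spanning both peaks: the integrals are comparable, so S/N is near 1.
sn_shared = overlap_sn_ratio(ref_spectra, 400, 500)
```

With a threshold of 1 (or 2, per the alternative rule), the first window would be judged non-overlapping and the second overlapping.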
- the signal processing circuit 250 in this example determines a plurality of designated wavelength bands so that the reference spectra do not overlap within each designated wavelength band, and synthesizes the plurality of component bands included in each designated wavelength band into one band.
- Each of the plurality of designated wavelength bands in this example includes a spectral peak wavelength associated with a corresponding one of the plurality of substances.
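The peak-aligned band determination described above can be sketched as follows; the reference spectra and the half-width chosen for each designated band are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Place one designated band around each spectral peak so that each restored
# image corresponds to roughly one substance (cf. FIG. 11), then check that
# the designated bands do not overlap.
wavelengths = np.arange(400, 500)

def gaussian(center, width):
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

# Four hypothetical, well-separated reference spectra.
ref_spectra = np.stack([gaussian(c, 3) for c in (415, 440, 465, 490)])

half_width = 8  # nm on each side of the peak; an illustrative choice

designated_bands = []
for spectrum in ref_spectra:
    peak = int(wavelengths[np.argmax(spectrum)])
    designated_bands.append((peak - half_width, peak + half_width))

# Non-overlap check between consecutive designated bands.
non_overlapping = all(designated_bands[i][1] < designated_bands[i + 1][0]
                      for i in range(len(designated_bands) - 1))
```

When `non_overlapping` fails, a window could be narrowed, or the band could be treated like the overlapping "second designated wavelength bands" discussed later.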
- each reference spectrum may be displayed on the input UI 400 so that the user himself/herself can determine the range of bands to be synthesized by performing an operation such as moving the band edge. Displaying the S/N ratio on the screen may assist the user's decision.
- the signal processing circuit 250 in this example generates compressed second mask data by processing such as averaging a plurality of elements corresponding to each designated wavelength band in the first mask data. Based on the compressed image data and the second mask data, the signal processing circuit 250 generates image data corresponding to each designated wavelength band by performing a calculation corresponding to the above equation (2). By using the compressed second mask data, the load of the restoration calculation can be greatly reduced.
- observation can be performed under such conditions that the restored image is divided into an excitation light image and a fluorescence image.
- the content input on the input UI 400 can also be used for labeling the restored image.
- labeling refers to assigning a known substance name or a classification code to a restored image corresponding to a band whose signal intensity is biased toward a certain region, or to a region within such a restored image.
- FIG. 12 is a block diagram showing a configuration example of a system for labeling.
- the input UI 400 determines the labeling conditions corresponding to the user-selected substance or spectrum and sends that information to the memory 310 of the display device 300 .
- the image processing circuit 320 of the display device 300 labels each restored image as necessary according to the sent labeling conditions, and causes the display 330 to display the labeled restored image.
- band synthesis is performed such that a restored image of a certain band X is an excitation light image and a restored image of another band Y is a fluorescence image.
- the input UI 400 can specify that band X corresponds to excitation light and band Y corresponds to a specific phosphor based on the band synthesis conditions determined from the reference spectral data.
- the input UI 400 can determine, for example, labeling conditions for assigning a phosphor name or a code for classification to a reconstructed image of band Y or a specific region of the reconstructed image. Bands determined to be in the wavelength range of excitation light may be assigned a name or classification code, such as "excitation light band.” As for the content of labeling, information such as a name input by the user through the input UI 400 may be used for labeling. Labeling may be performed automatically based on known physical property information.
- FIGS. 13A and 13B are diagrams showing examples of reference spectra of four samples 31 to 34 with large spectral overlap.
- the overlap between reference spectra is large as in this example, it is not possible to synthesize bands such that the restored image and the classified image match.
- the overlap can be reduced for some spectra.
- Bands and samples with no spectral overlap or little overlap can be classified at the time of restoration.
- the plurality of designated wavelength bands may include one or more first designated wavelength bands in which the plurality of spectra do not overlap, and one or more second designated wavelength bands in which the plurality of spectra overlap. In the example of FIG. 13B, band #1, band #3, and band #4 correspond to the first designated wavelength bands, and band #X and band #Y correspond to the second designated wavelength bands.
- classification is possible at the same time as reconstruction.
- FIG. 14 is a diagram showing an example of a graphical user interface (GUI) that enables operations performed in this embodiment.
- the GUI displays a compressed image, a restored image, reference spectra and band synthesis information, and a table showing the correspondence between bands and samples (i.e., materials). Indications relating to the restoration of hyperspectral images, indications relating to their analysis, and/or the restored hyperspectral images themselves may be added or removed as desired.
- the overlap between reference spectra is small, it is possible to synthesize bands such that there is a one-to-one correspondence between band numbers and classifications.
- in the display area for "reference spectrum and band synthesis information", the result of band synthesis as shown in FIG. 10, or the result of band synthesis in which some bands are excluded from the restoration targets, can be displayed as well.
- the spectrum of the observation target is known, such as in fluorescence observation, depending on the combination of the observation targets, a band in which the signal intensity is expected to be zero or very small may occur within the target wavelength band. If the signal intensity of a certain band included in the wavelength band of interest is zero or very small, there is no or very little restoration error caused by excluding that band from the object of the restoration calculation.
- FIGS. 15A and 15B are diagrams for explaining a method of excluding specific bands from restoration targets based on reference spectral data. If four samples 31, 32, 33, and 34 with spectra as shown in FIG. 15A are assumed to be present in the imaging region, then there is a band, between the peaks of samples 32 and 33, in which no sample has signal intensity and which is therefore expected to output a dark image. Such bands, in which no signal intensity is expected, can be excluded from reconstruction as shown in FIG. 15B. By reducing the number of bands to be restored, the load of signal processing can be reduced.
- FIG. 16 is a flow chart showing an example of the operation of the signal processing circuit 250 when excluding a specific band from restoration.
- the flowchart shown in FIG. 16 replaces the specific band synthesizing process (steps S103 and S104) in the flowchart shown in FIG.
- in step S203, the signal processing circuit 250 determines whether or not to delete the information of a specific band from the mask data. Based on the reference spectrum data, if there is a band for which an image with no signal intensity is expected to be output for all assumed samples, the signal processing circuit 250 proceeds to step S204 and deletes the information of that band from the mask data. If there is no such band, step S204 is omitted. By such an operation, it is possible to reduce the amount of computation in the restoration calculation of the subsequent step S105 and shorten the calculation time.
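Deleting a band's information from the mask data, as in step S204, might look like the following sketch; the matrix layout (one column block of the mask matrix per band) and all sizes are assumptions for illustration.

```python
import numpy as np

# Delete the mask information of a band that is expected to carry no signal,
# so the subsequent restoration calculation handles fewer unknowns.
rng = np.random.default_rng(3)
n_pixels, n_bands = 12, 5
excluded_band = 2                       # band expected to be completely dark

# Assumed layout: one n_pixels-wide column block of the mask matrix per band.
H = rng.uniform(0.1, 1.0, (n_pixels, n_pixels * n_bands))

keep = [b for b in range(n_bands) if b != excluded_band]
cols = np.concatenate([np.arange(b * n_pixels, (b + 1) * n_pixels)
                       for b in keep])
H_reduced = H[:, cols]                  # mask data without the excluded band

unknowns_before = H.shape[1]
unknowns_after = H_reduced.shape[1]
```

Dropping one of five bands removes one fifth of the unknowns, which shrinks the restoration problem and shortens the calculation accordingly.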
- FIG. 17 is a flowchart showing an example of the operation when reference spectrum data is translated into conversion conditions for mask data and sent to the signal processing circuit 250, as in the example shown in FIG.
- steps S102, S103, and S104 in the flowchart of FIG. 9 are replaced with steps S302 and S303.
- in step S302, the signal processing circuit 250 acquires the mask data conversion conditions, that is, the band information used for restoration, from the processor 420. Based on this information, the specific bands that are not used for reconstruction can be determined.
- in step S303, the signal processing circuit 250 acquires mask data for bands other than the specific band from the memory 210. That is, the mask data for the specific band to be deleted is not read.
- reference spectral data may not be stored in memory 410 .
- the name of the material or substance input or selected on the input UI 400 may be associated with the conversion condition.
- Such linkage allows processor 420 or signal processing circuitry 250 to determine bands to exclude without reference to reference spectral data.
- labeling information such as the substance name corresponding to each spectrum may be extracted from the reference spectral data stored in the memory 410, and each restored image may be labeled using the labeling information.
- the present embodiment relates to a system for fluorescence observation.
- FIG. 18 is a diagram schematically showing the configuration of the system of this embodiment.
- the system comprises a light source 610 , optics 620 and a detector 630 .
- Optical system 620 includes an interference filter 621 , a dichroic mirror 622 , an objective lens 623 and a long pass filter 624 .
- Light source 610 may include a laser emitting element that emits excitation light.
- the detector 630 comprises the imager 100 described above. A sample 80 containing fluorescent material is illuminated with excitation light and the emitted fluorescence is detected by detector 630 .
- the excitation light emitted from the light source 610 is incident on the dichroic mirror 622 after passing through an interference filter 621 that selectively transmits light of a specific wavelength that excites the fluorescent material.
- the dichroic mirror 622 reflects light in a certain wavelength range including the wavelength of the excitation light and transmits light in other wavelength ranges.
- the excitation light reflected by the dichroic mirror 622 enters the sample 80 through the objective lens 623 .
- the sample 80 that receives the excitation light emits fluorescence.
- the fluorescence is transmitted through dichroic mirror 622 and longpass filter 624 and detected by detector 630 . Part of the excitation light incident on the sample 80 is reflected.
- the excitation light reflected by the sample 80 is reflected by the dichroic mirror 622 , but part of it is transmitted through the dichroic mirror toward the detector 630 .
- the excitation light usually has an intensity several orders of magnitude higher than that of the fluorescence, and if it enters the detector 630 it may interfere with the observation of the fluorescence.
- a long-pass filter 624 is arranged in front of the detector 630 and the optical system 620 is constructed so that the excitation light does not enter the detector 630 .
- the reason why the long-pass filter 624 is used is that in fluorescence observation, excitation light has higher energy than fluorescence, that is, has a shorter wavelength.
- the detector 630 can be, for example, a hyperspectral camera including the imaging device 100 and the processing device 200 in the second embodiment. As described above, the detector 630 performs restoration processing based on mask data from which unnecessary band information corresponding to the excitation light is removed.
- FIG. 19 is a diagram showing an example of the relationship between excitation light and fluorescence spectra and excluded bands.
- the sample 80 contains multiple types of fluorescent materials. The spectra of fluorescence emitted from those fluorescent materials are different from each other. Since the excitation light energy is higher than the fluorescence energy, the excitation light wavelength is shorter than the fluorescence wavelength.
- the wavelength range of the actually incident light is narrower than the target wavelength range that the detector 630 can detect.
- the optical system 620 is constructed so that the dichroic mirror 622 and the long-pass filter 624 do not transmit short wavelength light in order to cut the excitation light.
- long-pass filter 624 and dichroic mirror 622 are selected according to the observed fluorescence and the wavelength of excitation light used. Therefore, with the configuration of the present embodiment, which allows selection of any band and excluding it from the restoration calculation, it is possible to perform observation with the minimum required arithmetic processing according to the observation target.
- the FISH (fluorescent in-situ Hybridization) method is a method in which a probe with a gene sequence complementary to a specific gene sequence is labeled with a fluorescent dye, and the hybridized site or chromosome is identified by fluorescence.
- in the m-FISH (multicolor FISH) method, multiple probes labeled with different fluorescent dyes are used simultaneously.
- the m-FISH method is used to test some cancers such as leukemia and congenital genetic abnormalities.
- Canbio's m-FISH probe is designed using five types of fluorescent dyes so that the attachment ratios of the five dyes differ for each chromosome number of human and mouse. Therefore, by staining a set of chromosomes with this probe, the number of each chromosome can be identified.
- in the absence of the translocations that cause cancer and congenital genetic abnormalities, each chromosome exhibits a single fluorescence spectrum as a whole. However, when a translocation occurs, the fluorescence spectrum differs depending on the part of the chromosome. This property can be used to detect translocations.
- FIG. 20A shows absorption spectra of five types of fluorescent dyes (Cy3, Cy3.5, Cy5, FITC, and DEAC) used for fluorescent labeling for the m-FISH method by Canbio.
- FIG. 20B shows the fluorescence spectra of each of these fluorochromes. As is evident from FIG. 20A, there is no single wavelength that can induce fluorescence of all fluorochromes simultaneously. Therefore, the distribution of five types of fluorescent dyes can be identified by the following procedure.
- FIG. 21A is a diagram showing the relationship between excitation wavelengths and absorption spectra of fluorescent dyes. Solid curves show the absorption spectra of two dyes (DEAC and FITC) in which fluorescence is induced. The dashed curves show the absorption spectra of three dyes (Cy3, Cy3.5, Cy5) for which no fluorescence is induced due to insufficient absorption at the excitation wavelength.
- FIG. 21B is a diagram showing an example of the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- the wavelength of the excitation light is set at 405 nanometers (nm).
- the cutoff wavelengths of dichroic mirror 622 and long-pass filter 624 are set to 450 nm. That is, light with a wavelength of 450 nm or more is incident on the detector 630 .
- the wavelength range shorter than 450 nm is the cutoff wavelength range.
- Let the 405 nm wavelength of the excitation light be the first wavelength.
- The fluorescence of the dyes FITC and DEAC is induced when they are irradiated with excitation light of 405 nm. Since the dichroic mirror 622 and the long-pass filter 624 block light in the wavelength range below 450 nm, light in this wavelength range does not enter the detector 630. Moreover, since the absorption bands of the fluorescent dyes Cy3, Cy3.5, and Cy5 lie at wavelengths longer than the excitation wavelength, no fluorescence is induced from these dyes. Therefore, for example, there is no fluorescence at wavelengths of 650 nm or more, and the output image there is completely dark.
- FIG. 22A is a diagram showing the relationship between the excitation wavelength and the absorption spectrum of the fluorescent dye in the second imaging.
- the solid curve shows the absorption spectrum of the fluorescence-induced dye (Cy5).
- the dashed curves show the absorption spectra of the remaining four dyes for which no fluorescence is induced due to insufficient absorption at the excitation wavelength.
- FIG. 22B is a diagram showing an example of the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- the wavelength of the excitation light is set to the second wavelength of 633 nm, and the cutoff wavelengths of the dichroic mirror 622 and the long-pass filter 624 are set to 650 nm. That is, light with a wavelength of 650 nm or more is incident on the detector 630 .
- FIG. 23A is a diagram showing the relationship between the excitation wavelength and the absorption spectrum of the fluorescent dye in the third imaging.
- the solid curve shows the absorption spectra of the two dyes (Cy3, Cy3.5) in which fluorescence is induced.
- the dashed and dotted curves show the absorption spectra of the remaining three dyes for which no fluorescence is induced due to insufficient absorption at the excitation wavelength.
- FIG. 23B is a diagram showing the relationship between each fluorescence spectrum, cutoff wavelength region, and restoration band.
- the wavelength of the excitation light is set to the third wavelength of 532 nm, and the cutoff wavelength of the dichroic mirror 622 and the long-pass filter 624 is set to 550 nm. That is, light with a wavelength of 550 nm or more is incident on the detector 630 .
- The restoration bands are set as follows.
- First band: wavelengths from 550 nm to 575 nm. This band contains the fluorescence of Cy3 and Cy3.5 as main components and may be slightly mixed with the fluorescence of FITC.
- Second band: wavelengths from 575 nm to 625 nm. This band contains the fluorescence of Cy3 and Cy3.5 as main components and may be slightly mixed with the fluorescence of FITC.
- Third band: wavelengths from 625 nm to 650 nm. This band contains the fluorescence of Cy3 and Cy3.5 as main components and may be slightly mixed with the fluorescence of FITC and Cy5.
- Fourth band: wavelengths from 650 nm to 800 nm. This band contains the fluorescence of Cy3 and Cy3.5 as main components and may be slightly mixed with the fluorescence of Cy5.
- The distribution of FITC was identified in STEP 1, and the distribution of Cy5 was identified in STEP 2.
- The fluorescence of Cy3 and Cy3.5 is contained in all of the first to fourth bands.
- The fluorescence intensity ratio of these dyes in each band is known from their emission spectra. Therefore, the distributions of Cy3 and Cy3.5 can be obtained by solving simultaneous equations for the band intensities, or by finding, through simulation, the dye distribution that reproduces the imaging result.
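The simultaneous-equation approach in the last bullet can be sketched with a toy unmixing computation. The intensity-ratio matrix below is an assumption for illustration only, not data from the actual m-FISH dyes:

```python
import numpy as np

# Hypothetical per-band emission fractions of Cy3 (first column) and
# Cy3.5 (second column) in the four restoration bands (rows). Real
# values would be read off the measured emission spectra.
R = np.array([
    [0.30, 0.10],
    [0.45, 0.35],
    [0.15, 0.30],
    [0.10, 0.25],
])

# Band intensities at one pixel, synthesized here from known
# abundances [2.0, 1.0] so the recovery can be checked.
b = R @ np.array([2.0, 1.0])

# Solve the overdetermined system R x = b in the least-squares sense.
x, *_ = np.linalg.lstsq(R, b, rcond=None)
print(np.round(x, 6))
```

With four band equations and two unknowns, the system is overdetermined, which is why a least-squares solve (rather than direct inversion) is the natural tool.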
- A computer-implemented signal processing method comprising: performing a first process based on a first instruction; performing a second process based on a second instruction; and performing a third process based on a third instruction.
- The first process includes (a-1) receiving a plurality of first pixel values, wherein the image sensor outputs the plurality of first pixel values in response to first light from a filter array, the first light corresponding to second light from a first subject incident on the filter array, and (a-2) generating, based on a first matrix and the plurality of first pixel values, a plurality of pixel values I(11) of an image corresponding to a first wavelength range of the first subject, through a plurality of pixel values I(1p) of an image corresponding to a p-th wavelength range of the first subject.
- The plurality of first pixel values correspond to a plurality of first pixels arranged in m rows and n columns. The first matrix is (A1 A2 ... Ap) and includes a plurality of first submatrices A1, A2, ..., Ap. The plurality of first submatrices include a q-th submatrix Aq and also include a plurality of second submatrices, namely an r-th submatrix Ar through an (r+s)-th submatrix A(r+s).
- The second process includes (b-1) receiving a plurality of second pixel values, wherein the image sensor outputs the plurality of second pixel values in response to third light from the filter array, the third light corresponding to fourth light from a second subject incident on the filter array, and (b-2) generating, based on the q-th submatrix Aq and the plurality of second pixel values, a plurality of pixel values I(2q) of an image corresponding to the q-th wavelength range of the second subject.
- The third process includes (c-1) receiving a plurality of third pixel values, wherein the image sensor outputs the plurality of third pixel values in response to fifth light from the filter array, the fifth light corresponding to sixth light from a third subject incident on the filter array, (c-2) generating, based on the q-th submatrix Aq and the plurality of third pixel values, a plurality of pixel values I(3q) of an image corresponding to the q-th wavelength range of the third subject, and (c-3) generating, based on the plurality of third pixel values and a second matrix generated based on the plurality of second submatrices, a plurality of pixel values of an image I3c corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
- Here p, q, r, and s are natural numbers, with q < r or (r+s) < q, 1 ≤ q ≤ p, 1 ≤ r ≤ p, and 1 ≤ r+s ≤ p.
- Formulas (1) and (2) are written as n ⁇ m, but here n ⁇ m in formulas (1) and (2) may be rewritten as m ⁇ n.
- Each of the first instruction, the second instruction, and the third instruction may be given by the user using the input UI 400.
- The first instruction is an instruction to generate the images corresponding to the first through p-th wavelength ranges of the first subject.
- The second instruction includes an instruction to generate an image corresponding to the q-th wavelength range of the second subject, and instructions not to generate images corresponding to the r-th through (r+s)-th wavelength ranges of the second subject.
- The third instruction includes an instruction to generate an image corresponding to the q-th wavelength range of the third subject, an instruction to generate a composite image corresponding to the r-th through (r+s)-th wavelength ranges of the third subject, and instructions not to generate individual images corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
- the first process includes processes (a-1) and (a-2).
- Signal processor 250 receives a plurality of first pixel values from image sensor 160 .
- the second light from the first subject enters the filter array 110 .
- filter array 110 outputs the first light.
- Image sensor 160 outputs a plurality of first pixel values in response to the first light from filter array 110 .
- The plurality of first pixel values, when written in matrix form with m rows and n columns, can be expressed as follows.
- (i, j) may be regarded as corresponding to the position of a pixel in the image, with 1 ≤ i ≤ m and 1 ≤ j ≤ n.
- The plurality of first pixel values, when written as a column vector with m × n rows and 1 column, can be expressed as follows.
- The signal processing circuit 250 generates, based on the first matrix recorded in the memory 210 and the plurality of first pixel values, the pixel values I(11) of the image corresponding to the first wavelength range of the first subject, through the pixel values I(1p) of the image corresponding to the p-th wavelength range of the first subject. This generation method has already been explained using equations (1) and (2).
- Formulas (1) and (2) are written with n × m, but here n × m in formulas (1) and (2) is rewritten as m × n and applied.
- The first matrix is the matrix H with m × n rows and m × n × N columns shown in equation (1).
- The matrix H is referred to as H1, and is composed of the submatrices A1, ..., Ap.
- Each of the submatrices A1, ..., Ap may be a diagonal matrix.
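As a rough sketch of how diagonal submatrices compose the matrix H and relate the compressed measurement to the per-band pixel values (toy sizes and random transmittances, purely illustrative; a real reconstruction would add a sparsity prior):

```python
import numpy as np

m_n, p = 6, 3                       # m*n pixels and p bands (toy sizes)
rng = np.random.default_rng(0)

# Each submatrix Ak is diagonal; here its diagonal holds an assumed
# per-pixel mask transmittance for band k.
subs = [np.diag(rng.random(m_n)) for _ in range(p)]
H = np.hstack(subs)                 # H = (A1 A2 ... Ap), shape m*n x (m*n*p)

f = rng.random(m_n * p)             # stacked per-band pixel values
g = H @ f                           # compressed measurement

# The system is underdetermined, so we only check that a pseudoinverse
# solution reproduces the measurement g exactly.
f_hat = np.linalg.pinv(H) @ g
assert np.allclose(H @ f_hat, g)
```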
- The pixel values I(11) of the image corresponding to the first wavelength range of the first subject, through the pixel values I(1p) of the image corresponding to the p-th wavelength range of the first subject, when written in matrix form with m rows and n columns, can be expressed as follows.
- (i, j) may be regarded as corresponding to the position of a pixel in the image, with 1 ≤ i ≤ m and 1 ≤ j ≤ n.
- These pixel values, when written as column vectors with m × n rows and 1 column, can be expressed as follows.
- The plurality of first submatrices A1, A2, ..., Ap include the q-th submatrix Aq.
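The two equivalent layouts described above (an m-row, n-column matrix versus an m × n-row column vector) amount to a row-major reshape; a minimal sketch with assumed toy dimensions:

```python
import numpy as np

m, n = 3, 4                                        # assumed sensor dimensions
g = np.arange(m * n, dtype=float).reshape(m, n)    # pixel values g(i, j)

# m x n matrix form -> (m*n) x 1 column-vector form (row-major order):
# the 1-based element (i, j) lands in row (i - 1) * n + j.
g_vec = g.reshape(m * n, 1)

i, j = 2, 3                                        # 1-based example indices
assert g_vec[(i - 1) * n + (j - 1), 0] == g[i - 1, j - 1]
```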
- the second process includes processes (b-1) and (b-2).
- Signal processor 250 receives a plurality of second pixel values from image sensor 160 .
- a fourth light from the second object enters the filter array 110 .
- filter array 110 outputs the third light.
- Image sensor 160 outputs a plurality of second pixel values corresponding to the third light from filter array 110 .
- The plurality of second pixel values, when written in matrix form with m rows and n columns, can be expressed as follows.
- (i, j) may be regarded as corresponding to the position of a pixel in the image, with 1 ≤ i ≤ m and 1 ≤ j ≤ n.
- The plurality of second pixel values, when written as a column vector with m × n rows and 1 column, can be expressed as follows.
- The signal processing circuit 250 generates, based on the q-th submatrix Aq recorded in the memory 210 and the plurality of second pixel values, the pixel values I(2q) of the image corresponding to the q-th wavelength range of the second subject. The signal processing circuit 250 does not generate, based on the plurality of second submatrices and the plurality of second pixel values, the pixel values I(2r) through I(2(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the second subject.
- For example, the signal processing circuit 250 may perform the following processing.
- The signal processing circuit 250 deletes the r-th submatrix Ar through the (r+s)-th submatrix A(r+s) from the first matrix H1 to generate a matrix H2 with n × m rows and n × m × (p − (s + 1)) columns.
- The signal processing circuit 250 then (i) generates the pixel values I(21) through I(2(r−1)) of the images corresponding to the first through (r−1)-th wavelength ranges of the second subject, (ii) does not generate the pixel values I(2r) through I(2(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges, and (iii) generates the pixel values I(2(r+s+1)) through I(2p) of the images corresponding to the (r+s+1)-th through p-th wavelength ranges of the second subject.
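Dropping the submatrices Ar through A(r+s) from H1 can be sketched as follows (toy sizes; dense random blocks stand in for the real mask submatrices):

```python
import numpy as np

mn, p = 6, 5                      # toy sizes: m*n pixels, p bands
rng = np.random.default_rng(1)
subs = [rng.random((mn, mn)) for _ in range(p)]   # A1 ... Ap
H1 = np.hstack(subs)              # first matrix, mn x (mn*p)

r, s = 2, 1                       # skip bands r .. r+s (1-based)
kept = [A for k, A in enumerate(subs, start=1) if not (r <= k <= r + s)]
H2 = np.hstack(kept)              # matrix H2 after deleting s+1 blocks

assert H1.shape == (mn, mn * p)
assert H2.shape == (mn, mn * (p - (s + 1)))
```

Because s + 1 of the p submatrix blocks are removed, H2 has mn × (p − (s + 1)) columns, which is what makes the reconstruction for the skipped bands cheaper.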
- Formulas (1) and (2) are written as n ⁇ m, but here n ⁇ m in formulas (1) and (2) is rewritten as m ⁇ n and applied.
- the third process includes processes (c-1), (c-2), and (c-3).
- Signal processor 250 receives a plurality of third pixel values from image sensor 160 .
- the sixth light from the third object enters filter array 110 .
- filter array 110 outputs the fifth light.
- Image sensor 160 outputs a plurality of third pixel values corresponding to the fifth light from filter array 110 .
- The plurality of third pixel values, when written in matrix form with m rows and n columns, can be expressed as follows.
- (i, j) may be regarded as corresponding to the position of a pixel in the image, with 1 ≤ i ≤ m and 1 ≤ j ≤ n.
- The plurality of third pixel values, when written as a column vector with m × n rows and 1 column, can be expressed as follows.
- The signal processing circuit 250 generates, based on the q-th submatrix Aq recorded in the memory 210 and the plurality of third pixel values, the pixel values I(3q) of the image corresponding to the q-th wavelength range of the third subject.
- The signal processing circuit 250 generates a second matrix based on the plurality of second submatrices stored in the memory 210.
- The signal processing circuit 250 generates, based on the generated second matrix and the plurality of third pixel values, the pixel values I3c of the image corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
- The signal processing circuit 250 does not generate, based on the plurality of second submatrices and the plurality of third pixel values, the pixel values I(3r) through I(3(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
- For example, the signal processing circuit 250 may perform the following processing.
- The signal processing circuit 250 generates the second matrix H3 based on the r-th submatrix Ar through the (r+s)-th submatrix A(r+s).
- The signal processing circuit 250 generates a third matrix H4 based on the first matrix H1 and the second matrix H3.
- The third matrix H4 is a matrix with n × m rows and n × m × (p − s) columns.
- Formulas (1) and (2) are written with n × m, but here n × m in formulas (1) and (2) may be rewritten as m × n.
- The pixel values I3c of the image corresponding to the r-th through (r+s)-th wavelength ranges of the third subject, when written in matrix form with m rows and n columns, can be expressed as follows.
- (i, j) may be regarded as corresponding to the position of a pixel in the image, with 1 ≤ i ≤ m and 1 ≤ j ≤ n.
- The pixel values I3c, when written as a column vector with m × n rows and 1 column, can be expressed as follows.
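The construction of H3 and H4 can be sketched with toy matrices. Merging the dropped submatrices by summation is an assumption made here for illustration, not the patent's exact definition of the second matrix; the point is only the resulting shapes:

```python
import numpy as np

mn, p = 6, 5                      # toy sizes: m*n pixels, p bands
rng = np.random.default_rng(2)
subs = [rng.random((mn, mn)) for _ in range(p)]   # A1 ... Ap
r, s = 2, 1                       # bands r .. r+s to be merged

# One plausible way to build the "second matrix" H3 from Ar ... A(r+s):
# sum them into a single composite-band block (an assumption).
H3 = sum(subs[r - 1 : r + s])

kept = [A for k, A in enumerate(subs, start=1) if not (r <= k <= r + s)]
H4 = np.hstack(kept + [H3])       # "third matrix": kept blocks of H1 plus H3

assert H3.shape == (mn, mn)
assert H4.shape == (mn, mn * (p - s))
```

Replacing s + 1 individual band blocks with one composite block leaves p − s blocks in total, matching the stated n × m × (p − s) column count of H4.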
- compressed images may be generated by imaging in a different manner than imaging with a filter array that includes multiple optical filters.
- For example, the image sensor 160 may be processed so that its light-receiving characteristics differ for each pixel, and a compressed image may thereby be generated. That is, instead of the filter array 110 encoding the light incident on the image sensor, the image sensor itself may be given the capability to encode the incident light and produce a compressed image. In this case, the mask data corresponds to the light-receiving characteristics of the image sensor.
- Alternatively, the compressed image may be generated by an imaging device whose optical system 140 has optical characteristics that vary spatially and with wavelength, thereby encoding the incident light.
- In that case, the mask data is information corresponding to the optical characteristics of optical elements such as metalenses.
- the technology of the present disclosure is not limited to fluorescence observation, and can be applied to other applications where the spectrum of the observation target is known.
- The technique of the present disclosure can be applied to various uses such as observation of absorption spectra, observation of blackbody radiation (e.g., temperature estimation), and estimation of light sources (e.g., LEDs, halogen lamps).
- the technology of the present disclosure is useful, for example, for cameras and measurement equipment that acquire multi-wavelength images.
- the technology of the present disclosure can also be applied to, for example, fluorescence observation, absorption spectrum observation, biological/medical/beauty sensing, food foreign matter/residual pesticide inspection system, remote sensing system, and in-vehicle sensing system.
Abstract
Description
Before describing embodiments of the present disclosure, an overview is given of image reconstruction processing based on sparsity, and of the synthesis and editing of the mask data used during reconstruction.
<1. Imaging System>
First, a configuration example of the imaging system used in exemplary embodiments of the present disclosure is described.
Through the above processing, an image for each of a plurality of wavelength bands can be reconstructed from the compressed image 10 acquired by the image sensor 160. However, reconstructing the images of all wavelength bands included in the target wavelength range requires a computation using a matrix whose number of elements equals the product of the number of pixels of the image sensor 160 and the number of wavelength bands. The load of this computation is high, and the processing device 200 is required to have high computing power.
...,
h'(n×m, n×m) = (h6(n×m, n×m) + ... + h20(n×m, n×m)) / 15
may be used.
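The band-synthesis formula above averages the unit-band mask values; a minimal sketch with toy, randomly generated mask vectors (the sizes are assumptions for illustration):

```python
import numpy as np

nm = 8                              # toy number of mask elements (n*m)
rng = np.random.default_rng(3)

# Per-band mask data h_k for unit bands 6 through 20 (15 bands),
# flattened to vectors for simplicity.
h = {k: rng.random(nm) for k in range(6, 21)}

# Composite mask for the synthesized band, mirroring
# h' = (h6 + ... + h20) / 15 in the text.
h_comp = sum(h[k] for k in range(6, 21)) / 15

assert h_comp.shape == (nm,)
assert np.allclose(h_comp, np.mean([h[k] for k in range(6, 21)], axis=0))
```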
Next, a second embodiment is described. In Embodiment 1, the amount of computation in the reconstruction processing is reduced by synthesizing, into a single band, a plurality of unit bands with low contribution to the analysis or classification. In contrast, in the present embodiment, when the overlap between a plurality of reference spectra is considered to be small, the signal processing circuit 250 performs band synthesis such that the reconstructed image directly serves as a classification image. This further reduces the signal processing load.
Next, a third embodiment of the present disclosure is described. In the present embodiment, the load of the reconstruction computation is further reduced by excluding, from the reconstruction targets, bands considered to be of low importance among the bands included in the target wavelength range.
Next, a fourth embodiment of the present disclosure is described. The present embodiment relates to a system for performing fluorescence observation.
An example in which the method according to an embodiment of the present disclosure is applied to the m-FISH method, a technique of fluorescence observation, is described.
The first imaging is described with reference to FIGS. 21A and 21B. FIG. 21A is a diagram showing the relationship between the excitation wavelength and the absorption spectra of the fluorescent dyes. The solid curves show the absorption spectra of the two dyes (DEAC and FITC) in which fluorescence is induced. The dotted curves show the absorption spectra of the three dyes (Cy3, Cy3.5, and Cy5) in which fluorescence is not induced because sufficient absorption does not occur at the excitation wavelength. FIG. 21B is a diagram showing an example of the relationship between each fluorescence spectrum, the cutoff wavelength range, and the restoration bands.
The second imaging is described with reference to FIGS. 22A and 22B. FIG. 22A is a diagram showing the relationship between the excitation wavelength and the absorption spectra of the fluorescent dyes in the second imaging. The solid curve shows the absorption spectrum of the dye (Cy5) in which fluorescence is induced. The dotted curves show the absorption spectra of the remaining four dyes in which fluorescence is not induced because sufficient absorption does not occur at the excitation wavelength. FIG. 22B is a diagram showing an example of the relationship between each fluorescence spectrum, the cutoff wavelength range, and the restoration bands.
The third imaging is described with reference to FIGS. 23A and 23B. FIG. 23A is a diagram showing the relationship between the excitation wavelength and the absorption spectra of the fluorescent dyes in the third imaging. The solid curves show the absorption spectra of the two dyes (Cy3 and Cy3.5) in which fluorescence is induced. The dashed and dotted curves show the absorption spectra of the remaining three dyes in which fluorescence is not induced because sufficient absorption does not occur at the excitation wavelength. FIG. 23B is a diagram showing the relationship between each fluorescence spectrum, the cutoff wavelength range, and the restoration bands.
- First band: wavelengths from 550 nm to 575 nm.
- Second band: wavelengths from 575 nm to 625 nm.
- Third band: wavelengths from 625 nm to 650 nm.
- Fourth band: wavelengths from 650 nm to 800 nm. This band contains the fluorescence of Cy3 and Cy3.5 as main components and may be slightly mixed with the fluorescence of Cy5.
A modification of the embodiments of the present disclosure may be as follows.
Performing a first process based on a first instruction,
performing a second process based on a second instruction, and
performing a third process based on a third instruction,
wherein the first process includes:
(a-1) receiving a plurality of first pixel values,
wherein an image sensor outputs the plurality of first pixel values in response to first light from a filter array, the first light corresponding to second light from a first subject incident on the filter array; and
(a-2) generating, based on a first matrix and the plurality of first pixel values, a plurality of pixel values I(11) of an image corresponding to a first wavelength range of the first subject, through a plurality of pixel values I(1p) of an image corresponding to a p-th wavelength range of the first subject,
wherein the plurality of first pixel values correspond to a plurality of first pixels arranged in m rows and n columns, the first matrix is (A1 A2 ... Ap), the first matrix includes a plurality of first submatrices, the plurality of first submatrices are A1, A2, ..., Ap, the plurality of first submatrices include a q-th submatrix Aq, the plurality of first submatrices include a plurality of second submatrices, the plurality of second submatrices are an r-th submatrix Ar through an (r+s)-th submatrix A(r+s), and p, q, r, and s are natural numbers with q < r or (r+s) < q, 1 ≤ q ≤ p, 1 ≤ r ≤ p, and 1 ≤ r+s ≤ p;
the second process includes:
(b-1) receiving a plurality of second pixel values,
wherein the image sensor outputs the plurality of second pixel values in response to third light from the filter array, the third light corresponding to fourth light from a second subject incident on the filter array; and
(b-2) generating, based on the q-th submatrix Aq and the plurality of second pixel values, a plurality of pixel values I(2q) of an image corresponding to the q-th wavelength range of the second subject,
wherein the plurality of second pixel values correspond to a plurality of second pixels arranged in m rows and n columns, and
a plurality of pixel values I(2r) through I(2(r+s)) of images corresponding to the r-th through (r+s)-th wavelength ranges of the second subject are not generated based on the plurality of second submatrices and the plurality of second pixel values;
the third process includes:
(c-1) receiving a plurality of third pixel values,
wherein the image sensor outputs the plurality of third pixel values in response to fifth light from the filter array, the fifth light corresponding to sixth light from a third subject incident on the filter array;
(c-2) generating, based on the q-th submatrix Aq and the plurality of third pixel values, a plurality of pixel values I(3q) of an image corresponding to the q-th wavelength range of the third subject; and
(c-3) generating, based on the plurality of third pixel values and a second matrix generated based on the plurality of second submatrices, a plurality of pixel values of an image I3c corresponding to the r-th through (r+s)-th wavelength ranges of the third subject,
wherein the plurality of third pixel values correspond to a plurality of third pixels arranged in m rows and n columns, and
a plurality of pixel values I(3r) through I(3(r+s)) of images corresponding to the r-th through (r+s)-th wavelength ranges of the third subject are not generated based on the plurality of second submatrices and the plurality of third pixel values.
A signal processing method.
Here, p, q, r, and s are natural numbers, with q < r or (r+s) < q, 1 ≤ q ≤ p, 1 ≤ r ≤ p, and 1 ≤ r+s ≤ p.
The signal processing device 250 receives the plurality of first pixel values from the image sensor 160. The second light from the first subject is incident on the filter array 110. In response to this incidence, the filter array 110 outputs the first light. The image sensor 160 outputs the plurality of first pixel values in response to the first light from the filter array 110.
The signal processing circuit 250 generates, based on the first matrix recorded in the memory 210 and the plurality of first pixel values, the pixel values I(11) of the image corresponding to the first wavelength range of the first subject, through the pixel values I(1p) of the image corresponding to the p-th wavelength range of the first subject. This generation method has already been explained using equations (1) and (2).
The signal processing device 250 receives the plurality of second pixel values from the image sensor 160. The fourth light from the second subject is incident on the filter array 110. In response to this incidence, the filter array 110 outputs the third light. The image sensor 160 outputs the plurality of second pixel values in response to the third light from the filter array 110.
The signal processing circuit 250 generates, based on the q-th submatrix Aq recorded in the memory 210 and the plurality of second pixel values, the pixel values I(2q) of the image corresponding to the q-th wavelength range of the second subject. The signal processing circuit 250 does not generate, based on the plurality of second submatrices and the plurality of second pixel values, the pixel values I(2r) through I(2(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the second subject.
(i) The circuit generates the pixel values I(21) through I(2(r−1)) of the images corresponding to the first through (r−1)-th wavelength ranges of the second subject,
(ii) does not generate the pixel values I(2r) through I(2(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the second subject, and
(iii) generates the pixel values I(2(r+s+1)) through I(2p) of the images corresponding to the (r+s+1)-th through p-th wavelength ranges of the second subject.
The signal processing device 250 receives the plurality of third pixel values from the image sensor 160. The sixth light from the third subject is incident on the filter array 110. In response to this incidence, the filter array 110 outputs the fifth light. The image sensor 160 outputs the plurality of third pixel values in response to the fifth light from the filter array 110.
The signal processing circuit 250 generates, based on the q-th submatrix Aq recorded in the memory 210 and the plurality of third pixel values, the pixel values I(3q) of the image corresponding to the q-th wavelength range of the third subject.
The signal processing circuit 250 generates a second matrix based on the plurality of second submatrices recorded in the memory 210. The signal processing circuit 250 generates, based on the generated second matrix and the plurality of third pixel values, the pixel values I3c of the image corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
The signal processing circuit 250 does not generate, based on the plurality of second submatrices and the plurality of third pixel values, the pixel values I(3r) through I(3(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the third subject.
(i) The circuit generates the pixel values I(31) through I(3(r−1)) of the images corresponding to the first through (r−1)-th wavelength ranges of the third subject,
(ii) does not generate the pixel values I(3r) through I(3(r+s)) of the images corresponding to the r-th through (r+s)-th wavelength ranges of the third subject,
(iii) generates the pixel values I3c of the image corresponding to the r-th through (r+s)-th wavelength ranges of the third subject, and
(iv) generates the pixel values I(3(r+s+1)) through I(3p) of the images corresponding to the (r+s+1)-th through p-th wavelength ranges of the third subject.
In the present disclosure, the compressed image may be generated by imaging in a manner different from imaging with a filter array including a plurality of optical filters.
The present disclosure is not limited to Embodiments 1 to 4, the working examples, and the modifications. Forms obtained by applying various modifications conceivable to those skilled in the art to the above embodiments, working examples, and modifications, as well as forms constructed by combining components of different embodiments, working examples, and/or modifications, may also be included within the scope of the present disclosure, as long as they do not depart from the spirit of the present disclosure.
20 restored image
31, 32, 33, 34 samples
70 object
80 specimen
100 imaging device
110 filter array
140 optical system
150 control circuit
160 image sensor
200 processing device
210 memory
250 signal processing circuit
300 display device
320 image processing circuit
330 display
400 input UI
410 memory
420 processor
610 light source
620 optical system
621 interference filter
622 dichroic mirror
623 objective lens
624 long-pass filter
630 detector
Claims (22)
- A signal processing method executed by a computer, the method comprising: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands determined based on the reference spectrum data.
- The method according to claim 1, wherein the one or more spectra are associated with one or more types of substances assumed to be contained in the subject.
- The method according to claim 1 or 2, wherein each of the plurality of designated wavelength bands includes a peak wavelength of the spectrum associated with a corresponding one of the one or more types of substances.
- The method according to claim 1 or 2, wherein the reference spectrum data includes information on a plurality of spectra associated with a plurality of types of substances assumed to be contained in the subject, and the plurality of designated wavelength bands include a first designated wavelength band in which there is no overlap among the plurality of spectra and a second designated wavelength band in which there is overlap among the plurality of spectra.
- The method according to any one of claims 1 to 4, wherein the compressed image data is generated using an image sensor and a filter array including a plurality of types of optical filters having mutually different spectral transmittances, the method further comprises acquiring mask data reflecting a spatial distribution of the spectral transmittances, and the plurality of pieces of two-dimensional image data are generated based on the compressed image data and the mask data.
- The method according to claim 5, wherein the mask data includes mask matrix information having elements corresponding to the spatial distribution of the transmittance of the filter array for each of a plurality of unit bands included in the target wavelength range, and the method further comprises: generating synthesized mask information by synthesizing the mask matrix information corresponding to a non-designated wavelength band in the target wavelength range that differs from the designated wavelength bands; and generating, based on the compressed image data and the synthesized mask information, synthesized image data for the non-designated wavelength band.
- The method according to claim 5, wherein generating the plurality of pieces of two-dimensional image data includes generating and outputting the plurality of pieces of two-dimensional image data corresponding to the plurality of designated wavelength bands without generating, from the compressed image data, image data corresponding to a non-designated wavelength band in the target wavelength range that differs from the designated wavelength bands.
- The method according to any one of claims 1 to 7, wherein the plurality of designated wavelength bands are determined based on intensities of the one or more spectra indicated by the reference spectrum data, or on differential values of the intensities.
- The method according to any one of claims 1 to 8, wherein the reference spectrum data includes information on fluorescence spectra of one or more substances assumed to be contained in the subject.
- The method according to any one of claims 1 to 8, wherein the reference spectrum data includes information on absorption spectra of one or more substances assumed to be contained in the subject.
- The method according to any one of claims 1 to 10, further comprising causing a display to show a graphical user interface for allowing a user to designate the one or more spectra, or one or more types of substances associated with the one or more spectra, wherein the reference spectrum data is acquired according to the designated one or more spectra or the designated one or more types of substances.
- A method of generating mask data used for reconstructing spectral image data for each wavelength band from compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range, the method comprising: acquiring first mask data for reconstructing first spectral image data corresponding to a first group of wavelength bands in the target wavelength range; acquiring reference spectrum data including information on at least one spectrum; determining, based on the reference spectrum data, one or more designated wavelength ranges included in the target wavelength range; and generating, based on the first mask data, second mask data for reconstructing second spectral image data corresponding to a second group of wavelength bands in the one or more designated wavelength ranges.
- The method according to claim 12, wherein the compressed image data is generated using an image sensor and a filter array including a plurality of types of optical filters having mutually different spectral transmittances, the first mask data and the second mask data are data in which the spatial distribution of the spectral transmittance of the filter array is reflected, the first mask data includes first mask information indicating the spatial distribution of the spectral transmittance corresponding to the first group of wavelength bands, and the second mask data includes second mask information indicating the spatial distribution of the spectral transmittance corresponding to the second group of wavelength bands.
- The method according to claim 13, wherein the second mask data further includes third mask information obtained by synthesizing a plurality of pieces of information, and each of the plurality of pieces of information indicates the spatial distribution of the spectral transmittance in a corresponding wavelength band included in a non-designated wavelength range of the target wavelength range other than the designated wavelength ranges.
- The method according to claim 13, wherein the second mask data does not include information on the spatial distribution of the spectral transmittance in wavelength bands included in a non-designated wavelength range other than the designated wavelength ranges.
- A signal processing method executed by a computer, the method comprising: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and causing a display to show, together, a graphical user interface for designating reconstruction conditions for generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands, and an image based on the reference spectrum data.
- A signal processing device comprising a processor and a memory storing a computer program executed by the processor, wherein the computer program causes the processor to execute: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands determined based on the reference spectrum data.
- A signal processing device comprising a processor and a memory storing a computer program executed by the processor, wherein the computer program causes the processor to execute: acquiring first mask data for reconstructing, from compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range, first spectral image data corresponding to a first group of wavelength bands in the target wavelength range; acquiring reference spectrum data including information on at least one spectrum; determining, based on the reference spectrum data, one or more designated wavelength ranges included in the target wavelength range; and generating, based on the first mask data, second mask data for reconstructing second spectral image data corresponding to a second group of wavelength bands in the one or more designated wavelength ranges.
- A signal processing device comprising a processor and a memory storing a computer program executed by the processor, wherein the computer program causes the processor to execute: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and causing a display to show, together, a graphical user interface for designating reconstruction conditions for generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands, and an image based on the reference spectrum data.
- A computer program causing a computer to execute: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands that are included in the target wavelength range and determined based on the reference spectrum data.
- A computer program causing a computer to execute: acquiring first mask data for reconstructing, from compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range, first spectral image data corresponding to a first group of wavelength bands in the target wavelength range; acquiring reference spectrum data including information on at least one spectrum; determining, based on the reference spectrum data, one or more designated wavelength ranges included in the target wavelength range; and generating, based on the first mask data, second mask data for reconstructing second spectral image data corresponding to a second group of wavelength bands in the one or more designated wavelength ranges.
- A computer program causing a computer to execute: acquiring compressed image data including two-dimensional image information of a subject obtained by compressing hyperspectral information in a target wavelength range; acquiring reference spectrum data including information on one or more spectra associated with the subject; and causing a display to show, together, a graphical user interface for designating reconstruction conditions for generating, from the compressed image data, a plurality of pieces of two-dimensional image data corresponding to a plurality of designated wavelength bands, and an image based on the reference spectrum data.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280040739.8A CN117460947A (zh) | 2021-07-06 | 2022-06-23 | 信号处理装置及信号处理方法 |
JP2023533523A JPWO2023282069A1 (ja) | 2021-07-06 | 2022-06-23 | |
US18/540,962 US20240119645A1 (en) | 2021-07-06 | 2023-12-15 | Signal processing apparatus and signal processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021112254 | 2021-07-06 | ||
JP2021-112254 | 2021-07-06 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/540,962 Continuation US20240119645A1 (en) | 2021-07-06 | 2023-12-15 | Signal processing apparatus and signal processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023282069A1 true WO2023282069A1 (ja) | 2023-01-12 |
Family
ID=84800279
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/025013 WO2023282069A1 (ja) | 2021-07-06 | 2022-06-23 | 信号処理装置および信号処理方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240119645A1 (ja) |
JP (1) | JPWO2023282069A1 (ja) |
CN (1) | CN117460947A (ja) |
WO (1) | WO2023282069A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116704064A (zh) * | 2023-06-08 | 2023-09-05 | 深圳市中科云驰环境科技有限公司 | 基于高光谱的污水成像方法、系统、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009159068A (ja) * | 2007-12-25 | 2009-07-16 | Ricoh Co Ltd | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP2016156801A (ja) * | 2014-11-19 | 2016-09-01 | パナソニックIpマネジメント株式会社 | 撮像装置および分光システム |
JP2017015694A (ja) * | 2015-07-02 | 2017-01-19 | パナソニックIpマネジメント株式会社 | 撮像装置 |
JP2018098641A (ja) * | 2016-12-13 | 2018-06-21 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、画像処理方法、プログラム、および電子機器 |
WO2021119363A2 (en) * | 2019-12-10 | 2021-06-17 | Agnetix, Inc. | Multisensory imaging methods and apparatus for controlled environment horticulture using irradiators and cameras and/or sensors |
- 2022-06-23: CN application 202280040739.8 filed (published as CN117460947A; pending)
- 2022-06-23: JP application 2023-533523 filed (published as JPWO2023282069A1; pending)
- 2022-06-23: PCT application PCT/JP2022/025013 filed (WO2023282069A1)
- 2023-12-15: US application 18/540,962 filed (published as US20240119645A1; pending)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116704064A (zh) * | 2023-06-08 | 2023-09-05 | 深圳市中科云驰环境科技有限公司 | 基于高光谱的污水成像方法、系统、电子设备及存储介质 |
CN116704064B (zh) * | 2023-06-08 | 2024-02-23 | 深圳市中科云驰环境科技有限公司 | 基于高光谱的污水成像方法、系统、电子设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
US20240119645A1 (en) | 2024-04-11 |
JPWO2023282069A1 (ja) | 2023-01-12 |
CN117460947A (zh) | 2024-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6952277B2 (ja) | 撮像装置および分光システム | |
EP3832359A1 (en) | Method and device for imaging of lensless hyperspectral image | |
EP3172698A1 (en) | Compact multifunctional system for imaging spectroscopy | |
JP5165732B2 (ja) | マルチスペクトル画像処理方法、画像処理装置、及び画像処理システム | |
JP2016130727A (ja) | 撮像装置 | |
WO2021246192A1 (ja) | 信号処理方法、信号処理装置、および撮像システム | |
US9031304B2 (en) | Image processing system | |
EP4130693A1 (en) | Signal processing method, signal processing device, and image-capturing system | |
JP4599520B2 (ja) | マルチスペクトル画像処理方法 | |
US20240119645A1 (en) | Signal processing apparatus and signal processing method | |
CN113711133A (zh) | 基于深度学习的彩色全息显微镜的系统和方法 | |
WO2022163421A1 (ja) | 検査対象に含まれる異物を検出する方法および装置 | |
EP4092397A1 (en) | Optical filter array, optical detection device, and optical detection system | |
US20240037798A1 (en) | Image processing apparatus, imaging system, and method for estimating error in reconstructed images | |
Kotwal et al. | An optimization-based approach to fusion of hyperspectral images | |
Singh et al. | Multi-exposure microscopic image fusion-based detail enhancement algorithm | |
Rahman et al. | Multisensor fusion and enhancement using the Retinex image enhancement algorithm | |
Rice et al. | Hyperspectral image projector applications | |
Sawyer et al. | Towards a simulation framework to maximize the resolution of biomedical hyperspectral imaging | |
WO2023106143A1 (ja) | 分光画像を生成するシステムに用いられる装置およびフィルタアレイ、分光画像を生成するシステム、ならびにフィルタアレイの製造方法 | |
WO2023106142A1 (ja) | 信号処理方法、プログラム、およびシステム | |
Wei et al. | Rapid hyperspectral imaging system via sub-sampling coding | |
JP2024020922A (ja) | 復元画像の評価方法および撮像システム、 | |
WO2023286613A1 (ja) | フィルタアレイ、光検出装置、および光検出システム | |
Zhang et al. | Sub-pixel dispersion model for coded aperture snapshot spectral imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22837486 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280040739.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023533523 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22837486 Country of ref document: EP Kind code of ref document: A1 |