WO2020213419A1 - Imaging device, signal processing device, signal processing method, and signal processing program - Google Patents

Imaging device, signal processing device, signal processing method, and signal processing program

Info

Publication number
WO2020213419A1
WO2020213419A1 (PCT/JP2020/015200)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
optical
light
wavelength band
optical system
Prior art date
Application number
PCT/JP2020/015200
Other languages
English (en)
Japanese (ja)
Inventor
小野 修司
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Publication of WO2020213419A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 Optical elements other than lenses
    • G02B 5/20 Filters
    • G02B 5/22 Absorbing filters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/12 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infrared radiation

Definitions

  • the present invention relates to an imaging device, a signal processing device, a signal processing method, and a signal processing program, and more particularly to an imaging device, a signal processing device, a signal processing method, and a signal processing program that acquire images in a plurality of wavelength bands.
  • A multispectral camera (also referred to as a multiband camera, etc.) is known as a device for capturing images in a plurality of wavelength bands.
  • the image obtained by the multispectral camera is called a multispectral image.
  • In Patent Document 1, a device has been proposed that captures a multispectral image with one image sensor by dividing the pupil portion of the optical system, arranging optical filters having different spectral transmittances in the respective regions, and placing a microlens array having a light-ray separation function in front of the image sensor.
  • However, ray separation by the microlens array is not always perfect, so the device of Patent Document 1 has a problem that light leaks into adjacent pixels and so-called crosstalk (interference) occurs.
  • To address this, Patent Document 2 proposes subjecting the signal (pixel signal) obtained from each pixel to predetermined signal processing to eliminate the influence of the interference.
  • In Patent Document 3, a technique has been proposed for endoscopic imaging in which the subject is irradiated with three narrow-band lights corresponding to the respective wavelength bands of the R, G, and B pixels of the image sensor, and the resulting signals are processed to obtain three narrow-band signals.
  • However, the technique of Patent Document 3 can be used only in a special environment free from ambient light, such as imaging the inside of a body cavity, and cannot be used in ordinary imaging affected by ambient light. In addition, a dedicated light source must be prepared separately, which makes the system large-scale.
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide an imaging device, a signal processing device, a signal processing method, and a signal processing program capable of acquiring a high-quality multispectral image.
  • the means for solving the above problems are as follows.
  • An imaging device comprising: an image sensor including a plurality of pixel blocks each composed of n pixels equipped with optical filters having different spectral transmittances, n being an integer satisfying 2 ≤ n; an optical system having m optical regions whose transmitted-light wavelength bands differ from one another, m being an integer satisfying 2 ≤ m ≤ n; a storage unit that stores a coefficient group composed of a matrix A of m rows and n columns in which each element is represented by aij, i being an integer satisfying 1 ≤ i ≤ m and j being an integer satisfying 1 ≤ j ≤ n; and an arithmetic unit that, using the coefficient group obtained from the storage unit, calculates from the n pixel signals x1, x2, ..., xn obtained from each pixel block of the image sensor the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system, by the following formula.
  • The matrix A is obtained by calculating the inverse matrix of the matrix whose elements are the ratios at which light in each wavelength band passing through each optical region of the optical system is received by each pixel of each pixel block of the image sensor.
  • Each optical region of the optical system transmits a wavelength band whose bandwidth is 40 nm or less.
  • The wavelength band of light transmitted through each optical region of the optical system is set in a region where the amount of change in the spectral sensitivity curve of the optical filter is equal to or less than a threshold value (in the imaging device according to any one of (1) to (4) above).
  • The wavelength band of light transmitted through each optical region of the optical system is set in a region where the spectral sensitivity curve of the optical filter has one inflection point or less (in the imaging device according to any one of (1) to (5) above).
  • each optical region of the optical system is provided with a bandpass filter that transmits light in a specific wavelength band.
  • The bandpass filters provided in the respective optical regions of the optical system can be individually replaced; the storage unit stores a coefficient group for each combination of bandpass filters; and the arithmetic unit acquires from the storage unit the coefficient group corresponding to the combination of bandpass filters mounted in the optical regions of the optical system and calculates the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system.
  • A signal processing device that processes signals obtained from the image sensor of an imaging device including an image sensor having a plurality of pixel blocks each composed of n pixels equipped with optical filters having different spectral transmittances, n being an integer satisfying 2 ≤ n, and an optical system having m optical regions whose transmitted-light wavelength bands differ from one another, m being an integer satisfying 2 ≤ m ≤ n. The signal processing device includes a storage unit that stores a coefficient group composed of a matrix A of m rows and n columns in which each element is represented by aij, i being an integer satisfying 1 ≤ i ≤ m and j being an integer satisfying 1 ≤ j ≤ n, and an arithmetic unit that uses the coefficient group to calculate, from the n pixel signals x1, x2, ..., xn obtained from each pixel block of the image sensor, the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system.
  • A signal processing method for processing signals obtained from the image sensor of an imaging device including an image sensor having a plurality of pixel blocks each composed of n pixels equipped with optical filters having different spectral transmittances, n being an integer satisfying 2 ≤ n, and an optical system having m optical regions whose transmitted-light wavelength bands differ from one another, m being an integer satisfying 2 ≤ m ≤ n. The method includes a step of acquiring the n pixel signals x1, x2, ..., xn obtained from each pixel block of the image sensor, and a step of calculating, by the following equation using the matrix A of m rows and n columns in which each element is represented by aij, the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system.
  • A signal processing program for processing signals obtained from the image sensor of an imaging device including an image sensor having a plurality of pixel blocks each composed of n pixels equipped with optical filters having different spectral transmittances, n being an integer satisfying 2 ≤ n, and an optical system having m optical regions whose transmitted-light wavelength bands differ from one another, m being an integer satisfying 2 ≤ m ≤ n. The program causes a computer to realize a function of acquiring the n pixel signals x1, x2, ..., xn obtained from each pixel block of the image sensor, and a function of calculating, by the following equation using the matrix A of m rows and n columns in which each element is represented by aij, the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system.
  • According to the present invention, a high-quality multispectral image can be acquired.
  • FIG. 1 shows the schematic configuration of the imaging device of this embodiment.
  • Front view of the pupil division filter.
  • Graph showing an example of the spectral characteristics of the bandpass filters provided in the respective optical regions.
  • Block diagram showing the schematic configuration of the signal processing device.
  • Graph for obtaining the sensitivity of each pixel when light having a wavelength of 550 nm is incident on the optical system.
  • Table showing the ratio at which light in each wavelength band is received by each pixel.
  • Conceptual diagram of the operation of the imaging device of this embodiment.
  • Diagram showing another example of division of the optical region.
  • FIG. 1 is a diagram showing a schematic configuration of an imaging device according to the present embodiment.
  • the image pickup apparatus 1 includes an optical system 10, an image sensor 100, and a signal processing apparatus 200.
  • The optical system 10 has a lens configuration according to the application of the imaging apparatus 1, and a reflector, a prism, and the like are combined with the lenses as needed.
  • the lenses and the like constituting the optical system 10 are simplified and shown by one lens 12.
  • The optical system 10 is focused by moving all or part of the lens group back and forth along the optical axis L. The amount of light incident on the image sensor 100 is adjusted by a diaphragm (not shown) provided in the optical path.
  • the optical system 10 further has a pupil dividing filter 14.
  • the pupil division filter 14 is provided at or near the pupil position of the optical system 10, and divides the pupil portion of the optical system 10 into four optical regions.
  • FIG. 2 is a front view of the pupil division filter.
  • the pupil division filter 14 has four optical regions S1, S2, S3, and S4 divided into four equal parts in the circumferential direction.
  • In the following, the four optical regions are distinguished by referring to the optical region of reference numeral S1 as the first optical region S1, the optical region of reference numeral S2 as the second optical region S2, the optical region of reference numeral S3 as the third optical region S3, and the optical region of reference numeral S4 as the fourth optical region S4.
  • Bandpass filters having different transmission wavelength bands are provided in each of the optical regions S1 to S4.
  • the first optical region S1 is provided with a first bandpass filter BPF1 that transmits light in the first wavelength band ⁇ 1.
  • the second optical region S2 is provided with a second bandpass filter BPF2 that transmits light in the second wavelength band ⁇ 2.
  • the third optical region S3 is provided with a third bandpass filter BPF3 that transmits light in the third wavelength band ⁇ 3.
  • the fourth optical region S4 is provided with a fourth bandpass filter BPF4 that transmits light in the fourth wavelength band ⁇ 4.
  • FIG. 3 is a graph showing an example of the spectral characteristics of the bandpass filter provided in each optical region.
  • the horizontal axis represents the wavelength (nm) and the vertical axis represents the transmittance (%).
  • the light in the first wavelength band ⁇ 1 transmitted by the first optical region S1 is light having a center wavelength of 550 [nm].
  • the light in the second wavelength band ⁇ 2 transmitted by the second optical region S2 is light having a center wavelength of 660 [nm].
  • the light in the third wavelength band ⁇ 3 transmitted by the third optical region S3 is light having a center wavelength of 735 [nm].
  • the light in the fourth wavelength band ⁇ 4 transmitted by the fourth optical region S4 is light having a center wavelength of 790 [nm].
  • In each region, the bandwidth (half-value width) of the transmitted wavelength band is 40 nm or less, that is, a so-called narrow band.
  • The wavelength bands transmitted through the respective optical regions S1 to S4 of the pupil division filter 14 are the wavelength bands of the images acquired as the multispectral image. That is, with the spectral characteristics shown in FIG. 3, the multispectral image consists of an image of the first wavelength band λ1 with a center wavelength of 550 nm, an image of the second wavelength band λ2 with a center wavelength of 660 nm, an image of the third wavelength band λ3 with a center wavelength of 735 nm, and an image of the fourth wavelength band λ4 with a center wavelength of 790 nm.
  • By passing through the pupil division filter 14, the light incident on the optical system 10 is separated into light of the first wavelength band λ1 with a center wavelength of 550 nm, light of the second wavelength band λ2 with a center wavelength of 660 nm, light of the third wavelength band λ3 with a center wavelength of 735 nm, and light of the fourth wavelength band λ4 with a center wavelength of 790 nm, each passing through the corresponding optical region S1 to S4 and being incident on the image sensor 100.
  • the image sensor 100 is composed of, for example, a solid-state image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • FIG. 4 is a diagram showing a schematic configuration of the pixel arrangement of the image sensor.
  • the image sensor 100 has a plurality of pixels on its light receiving surface.
  • the pixels are regularly arranged at a constant pitch along the horizontal direction (x direction) and the vertical direction (y direction).
  • Each pixel includes a photodiode (light receiving element), an optical filter, and a condenser lens.
  • the condensing lens (so-called microlens) condenses the light from the optical system 10 on the photodiode.
  • the optical filter selectively transmits light in a predetermined wavelength band.
  • Each pixel is provided with any one of four optical filters having different spectral transmittances.
  • The four optical filters consist of an R filter that transmits light in the red (R) wavelength band, a G filter that transmits light in the green (G) wavelength band, a B filter that transmits light in the blue (B) wavelength band, and an NIR filter that transmits light in the near-infrared (NIR) wavelength band.
  • the pixel represented by the reference numeral R is a pixel provided with an R filter.
  • the pixel represented by the reference numeral G is a pixel provided with a G filter.
  • the pixel represented by the reference numeral B is a pixel provided with a B filter.
  • the pixel represented by the reference numeral NIR is a pixel provided with a NIR filter.
  • In the following, a pixel provided with an R filter is referred to as an R pixel, a pixel provided with a G filter as a G pixel, a pixel provided with a B filter as a B pixel, and a pixel provided with an NIR filter as an NIR pixel.
  • R pixels, G pixels, B pixels and NIR pixels are regularly arranged.
  • One pixel block PB (x, y) is composed of an R pixel, a G pixel, a B pixel, and an NIR pixel, and the pixel blocks PB (x, y) are regularly arranged along the horizontal direction (x direction) and the vertical direction (y direction). The image sensor 100 therefore includes a plurality of pixel blocks PB (x, y). FIG. 4 shows an example in which one pixel block is a 2 × 2 arrangement (2 pixels horizontally by 2 pixels vertically), with a B pixel in the upper left, a G pixel in the upper right, an R pixel in the lower left, and an NIR pixel in the lower right.
  • (x, y) in the reference numeral PB (x, y) represents the position of the pixel block on the xy plane.
  • FIG. 5 is a diagram showing an example of the spectral sensitivity characteristics of the R pixel, G pixel, B pixel, and NIR pixel.
  • the horizontal axis represents the wavelength (nm) and the vertical axis represents the spectral sensitivity.
  • reference numeral Cr indicates a spectral sensitivity curve of R pixel
  • reference numeral Cg indicates a spectral sensitivity curve of G pixel
  • reference numeral Cb indicates a spectral sensitivity curve of B pixel
  • reference numeral Cnir indicates a spectral sensitivity curve of NIR pixel.
  • the R pixel has a characteristic of receiving light in the red (R) wavelength band with high sensitivity.
  • the G pixel has a characteristic of receiving light in the green (G) wavelength band with high sensitivity.
  • the B pixel has a characteristic of receiving light in the blue (B) wavelength band with high sensitivity.
  • NIR pixels have the property of receiving light in the near infrared (NIR) wavelength band with high sensitivity.
  • the signal processing device 200 processes the signal output from the image sensor 100 to generate a multispectral image.
  • In the present embodiment, images (image data) of four wavelength bands are generated: an image of the first wavelength band λ1 with a center wavelength of 550 nm, an image of the second wavelength band λ2 with a center wavelength of 660 nm, an image of the third wavelength band λ3 with a center wavelength of 735 nm, and an image of the fourth wavelength band λ4 with a center wavelength of 790 nm.
  • FIG. 6 is a block diagram showing a schematic configuration of a signal processing device.
  • the signal processing device 200 includes an analog signal processing unit 200A, a multispectral image generation unit 200B, a coefficient storage unit 200C, and an output unit 200D.
  • The analog signal processing unit 200A takes in the analog pixel signal output from each pixel of the image sensor 100, performs predetermined signal processing (for example, correlated double sampling and amplification), converts the result into a digital signal, and outputs it.
  • the multispectral image generation unit 200B takes in the digital pixel signal output from the analog signal processing unit 200A and performs predetermined signal processing to generate a multispectral image.
  • This image is an image of four wavelength bands corresponding to the four wavelength bands ⁇ 1 to ⁇ 4 separated by the pupil division filter 14.
  • the image of each wavelength band is generated by obtaining the signal value (pixel signal) of each pixel constituting each image by the following equation (1) using the matrix A.
  • X1 is a pixel signal at the pixel position p (x, y) of the image in the first wavelength band ⁇ 1
  • X2 is a pixel signal at the pixel position p (x, y) of the image in the second wavelength band ⁇ 2.
  • X3 is a pixel signal at the pixel position p (x, y) of the image in the third wavelength band ⁇ 3
  • X4 is a pixel signal at the pixel position p (x, y) of the image in the fourth wavelength band λ4.
  • xb is a pixel signal obtained by the B pixel of the pixel block PB (x, y).
  • xg is a pixel signal obtained by the G pixel of the pixel block PB (x, y).
  • xr is a pixel signal obtained by the R pixel of the pixel block PB (x, y).
  • xnir is a pixel signal obtained by NIR pixels of the pixel block PB (x, y).
  • the pixel position p (x, y) corresponds one-to-one with the position of the pixel block PB (x, y). Therefore, for example, the pixel position p (1,1) corresponds to the position of the pixel block PB (1,1).
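  • As an illustrative sketch only (not part of the patent disclosure; NumPy and the function and variable names are assumptions), the calculation of equation (1) for one pixel block can be written as follows.

```python
import numpy as np

def apply_equation_1(A: np.ndarray, xb: float, xg: float, xr: float, xnir: float):
    """Compute the band signals X1..X4 at pixel position p(x, y) from the
    pixel signals of pixel block PB(x, y), using the 4x4 coefficient matrix A."""
    x = np.array([xb, xg, xr, xnir])
    X1, X2, X3, X4 = A @ x   # equation (1): X = A * x
    return X1, X2, X3, X4
```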
  • the optical system 10 of the image pickup apparatus 1 of the present embodiment is provided with a pupil division filter 14. Therefore, light in each wavelength band separated by the pupil division filter 14 is incident on each pixel of the image sensor 100.
  • In the present embodiment, the four optical regions S1 to S4 of the pupil division filter 14 separate the light into four wavelength bands, so light in these four wavelength bands is incident on each pixel.
  • the ratio of light in each wavelength band separated by the pupil division filter 14 received by each pixel of the image sensor 100 is uniquely determined in relation to the optical filter provided in each pixel.
  • For the B pixel, let the ratio at which it receives light in the first wavelength band λ1 be b11, the ratio for the second wavelength band λ2 be b12, the ratio for the third wavelength band λ3 be b13, and the ratio for the fourth wavelength band λ4 be b14. Then the relationship of the following equation (2) holds between the pixel signal xb output from the B pixel and the pixel signals X1, X2, X3, X4 of the corresponding pixels of the images in the respective wavelength bands (the symbol "*" in the equations denotes multiplication).
  • Similarly, for the G pixel, let the corresponding ratios be b21, b22, b23, and b24. Then the following equation (3) holds between the pixel signal xg output from the G pixel and the pixel signals X1, X2, X3, X4 of the corresponding pixels of the images in the respective wavelength bands.
  • For the R pixel, let the corresponding ratios be b31, b32, b33, and b34. Then the following equation (4) holds between the pixel signal xr output from the R pixel and the pixel signals X1, X2, X3, X4 of the corresponding pixels of the images in the respective wavelength bands.
  • For the NIR pixel, let the corresponding ratios be b41, b42, b43, and b44. Then the following equation (5) holds between the pixel signal xnir output from the NIR pixel and the pixel signals X1, X2, X3, X4 of the corresponding pixels of the images in the respective wavelength bands.
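  • The formula images of equations (2) to (6) are not reproduced on this page; from the definitions above they can be reconstructed as follows (bij being the ratio at which the pixel of row i receives light of band j):

```latex
x_b     = b_{11}X_1 + b_{12}X_2 + b_{13}X_3 + b_{14}X_4 \quad (2)
x_g     = b_{21}X_1 + b_{22}X_2 + b_{23}X_3 + b_{24}X_4 \quad (3)
x_r     = b_{31}X_1 + b_{32}X_2 + b_{33}X_3 + b_{34}X_4 \quad (4)
x_{nir} = b_{41}X_1 + b_{42}X_2 + b_{43}X_3 + b_{44}X_4 \quad (5)

\begin{pmatrix} x_b \\ x_g \\ x_r \\ x_{nir} \end{pmatrix}
= \underbrace{\begin{pmatrix}
b_{11} & b_{12} & b_{13} & b_{14} \\
b_{21} & b_{22} & b_{23} & b_{24} \\
b_{31} & b_{32} & b_{33} & b_{34} \\
b_{41} & b_{42} & b_{43} & b_{44}
\end{pmatrix}}_{B}
\begin{pmatrix} X_1 \\ X_2 \\ X_3 \\ X_4 \end{pmatrix} \quad (6)
```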
  • X1 to X4, which are the solutions of the simultaneous equations composed of the above equations (2) to (5), are calculated by multiplying both sides of the above equation (6) by the inverse matrix B⁻¹ of the matrix B, that is, by the following formula (7).
  • the pixel signals X1 to X4 of each pixel constituting the image in each wavelength region can be calculated from the four pixel signals xb, xg, xr, and xnir of each pixel block.
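  • As a minimal sketch (NumPy assumed; not the patent's implementation), equation (7) amounts to solving the linear system of equations (2) to (5) for each pixel block:

```python
import numpy as np

def band_signals(B: np.ndarray, x: np.ndarray) -> np.ndarray:
    """B: 4x4 matrix of ratios bij (rows: B, G, R, NIR pixels; columns: bands
    λ1..λ4). x: pixel signals (xb, xg, xr, xnir) of one pixel block.
    Returns X1..X4, i.e. B^-1 @ x as in equation (7)."""
    return np.linalg.solve(B, x)
```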
  • the ratio of light received in each wavelength band in each of the B pixel, G pixel, R pixel, and NIR pixel is determined as follows.
  • FIG. 7 is a graph for obtaining the sensitivity of each pixel when light having a wavelength of 550 [nm] is incident on the optical system.
  • As shown in FIG. 7, the sensitivity of each pixel when light having a wavelength of 550 nm (the center wavelength of the first wavelength band λ1) is incident on the optical system 10 is 7.00 for the B pixel, 32.00 for the G pixel, 6.00 for the R pixel, and 1.70 for the NIR pixel.
  • FIG. 8 is a graph for obtaining the sensitivity of each pixel when light having a wavelength of 660 [nm] is incident on the optical system.
  • As shown in FIG. 8, the sensitivity of each pixel when light having a wavelength of 660 nm (the center wavelength of the second wavelength band λ2) is incident on the optical system 10 is 1.00 for the B pixel, 2.50 for the G pixel, 19.50 for the R pixel, and 5.00 for the NIR pixel.
  • FIG. 9 is a graph for obtaining the sensitivity of each pixel when light having a wavelength of 735 [nm] is incident on the optical system.
  • As shown in FIG. 9, the sensitivity of each pixel when light having a wavelength of 735 nm (the center wavelength of the third wavelength band λ3) is incident on the optical system 10 is 3.00 for the B pixel, 6.00 for the G pixel, 14.50 for the R pixel, and 9.00 for the NIR pixel.
  • FIG. 10 is a graph for obtaining the sensitivity of each pixel when light having a wavelength of 790 [nm] is incident on the optical system.
  • As shown in FIG. 10, the sensitivity of each pixel when light having a wavelength of 790 nm (the center wavelength of the fourth wavelength band λ4) is incident on the optical system 10 is 8.00 for the B pixel, 12.00 for the G pixel, 16.00 for the R pixel, and 27.00 for the NIR pixel.
  • The amount of incident light at each wavelength can be regarded as the pixel signal obtained when light of that wavelength is received.
  • Let the pixel signal when light with a wavelength of 550 nm is received be X1, the pixel signal when light with a wavelength of 660 nm is received be X2, the pixel signal when light with a wavelength of 735 nm is received be X3, and the pixel signal when light with a wavelength of 790 nm is received be X4.
  • From the B pixel, the sum of the products of pixel signal and sensitivity at each wavelength, that is, 7.00 * X1 for 550 nm, 1.00 * X2 for 660 nm, 3.00 * X3 for 735 nm, and 8.00 * X4 for 790 nm, is output as the pixel signal: xb = 7.00 * X1 + 1.00 * X2 + 3.00 * X3 + 8.00 * X4.
  • Similarly, from the G pixel, the sum of 32.00 * X1, 2.50 * X2, 6.00 * X3, and 12.00 * X4 is output as the pixel signal: xg = 32.00 * X1 + 2.50 * X2 + 6.00 * X3 + 12.00 * X4.
  • From the R pixel, the sum of 6.00 * X1, 19.50 * X2, 14.50 * X3, and 16.00 * X4 is output as the pixel signal: xr = 6.00 * X1 + 19.50 * X2 + 14.50 * X3 + 16.00 * X4.
  • From the NIR pixel, the sum of 1.70 * X1, 5.00 * X2, 9.00 * X3, and 27.00 * X4 is output as the pixel signal: xnir = 1.70 * X1 + 5.00 * X2 + 9.00 * X3 + 27.00 * X4. Since the light in each wavelength band transmitted through the optical regions S1 to S4 of the optical system 10 is narrow-band light, when light of the first wavelength band λ1 (center wavelength 550 nm), light of the second wavelength band λ2 (center wavelength 660 nm), light of the third wavelength band λ3 (center wavelength 735 nm), and light of the fourth wavelength band λ4 (center wavelength 790 nm) are incident, these relationships can be regarded as the ratios at which the light of each wavelength band is received by each pixel.
  • FIG. 11 is a table showing the ratio of light in each wavelength band received by each pixel.
  • In the B pixel, light in the first wavelength band λ1 (center wavelength 550 nm) is received at a ratio of 7.00, light in the second wavelength band λ2 (center wavelength 660 nm) at 1.00, light in the third wavelength band λ3 (center wavelength 735 nm) at 3.00, and light in the fourth wavelength band λ4 (center wavelength 790 nm) at 8.00.
  • In the G pixel, light in λ1 is received at a ratio of 32.00, light in λ2 at 2.50, light in λ3 at 6.00, and light in λ4 at 12.00.
  • In the R pixel, light in λ1 is received at a ratio of 6.00, light in λ2 at 19.50, light in λ3 at 14.50, and light in λ4 at 16.00.
  • In the NIR pixel, light in λ1 is received at a ratio of 1.70, light in λ2 at 5.00, light in λ3 at 9.00, and light in λ4 at 27.00.
  • The inverse matrix B⁻¹ of the matrix B is as follows.
  • Accordingly, the formula for calculating X1 to X4 is as shown in the following formula (8).
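  • The numerical values of B⁻¹ are not reproduced in this text, but they can be checked with a short script (illustrative only; NumPy assumed) using the ratios of FIG. 11:

```python
import numpy as np

# Rows: B, G, R, NIR pixels; columns: bands λ1 (550 nm) .. λ4 (790 nm) (FIG. 11).
B = np.array([
    [ 7.00,  1.00,  3.00,  8.00],
    [32.00,  2.50,  6.00, 12.00],
    [ 6.00, 19.50, 14.50, 16.00],
    [ 1.70,  5.00,  9.00, 27.00],
])
A = np.linalg.inv(B)               # coefficient group of equation (8)

# Round-trip check: mix arbitrary band signals with B, then unmix with A.
X_true = np.array([0.8, 0.3, 0.5, 0.1])
x = B @ X_true                     # xb, xg, xr, xnir observed at the pixel block
assert np.allclose(A @ x, X_true)  # equation (8) recovers X1..X4, removing crosstalk
```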
  • the coefficient storage unit 200C stores each element aij of the matrix A of 4 rows and 4 columns as a coefficient group.
  • the coefficient storage unit 200C is an example of a storage unit.
  • The multispectral image generation unit 200B acquires the coefficient group from the coefficient storage unit 200C and sets the arithmetic expression. Based on the set arithmetic expression, it calculates the pixel signals X1, X2, X3, and X4 of each wavelength band and generates the image of each wavelength band. That is, from the pixel signals xb, xg, xr, and xnir of the pixels (B, G, R, and NIR pixels) obtained from each pixel block PB (x, y), the pixel signals X1 to X4 of the corresponding pixels of the images in the respective wavelength bands are calculated based on the above equation (8), and the images of the respective wavelength bands are generated.
  • the multispectral image generation unit 200B is an example of a calculation unit.
  • the image of each wavelength band generated by the multispectral image generation unit 200B is output from the output unit 200D to the outside.
  • the image output from the output unit 200D is stored in a storage device (not shown) as needed. In addition, it is displayed on a display (not shown) as needed.
  • FIG. 12 is a conceptual diagram of the operation (signal processing method) of the image pickup apparatus of the present embodiment.
  • The light from the subject O incident on the optical system 10 is separated by the pupil division filter 14 provided in the optical system 10 into light of four wavelength bands, namely the first wavelength band λ1 with a center wavelength of 550 nm, the second wavelength band λ2 with a center wavelength of 660 nm, the third wavelength band λ3 with a center wavelength of 735 nm, and the fourth wavelength band λ4 with a center wavelength of 790 nm, and is incident on each pixel (B pixel, G pixel, R pixel, and NIR pixel) of the image sensor 100.
  • Each pixel (B pixel, G pixel, R pixel, and NIR pixel) of the image sensor 100 outputs signals (pixel signals) xb, xg, xr, and xnir according to the amount of light received to the signal processing device 200.
  • The signal processing device 200 acquires the pixel signals of each pixel output from the image sensor 100 (the step of acquiring the n pixel signals x1, x2, ..., xn obtained from each pixel block of the image sensor).
  • The signal processing device 200 then generates an image (image data) of each wavelength band from the acquired pixel signals xb, xg, xr, and xnir. Specifically, using equation (8), the pixel signals X1, X2, X3, and X4 of each pixel constituting the images in the respective wavelength bands are calculated from the pixel signals xb, xg, xr, and xnir of the pixel block at the corresponding pixel position (the step of calculating the m pixel signals X1, X2, ..., Xm corresponding to the wavelength bands of light passing through the respective optical regions of the optical system), and the images of the respective wavelength bands are generated.
  • Equation (8) is set by reading out the coefficients (the elements aij of the matrix A) from the coefficient storage unit 200C.
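  • A vectorized sketch of this step (assumptions: NumPy and the 2 × 2 block layout of FIG. 4 with B upper left, G upper right, R lower left, NIR lower right; not the patent's implementation):

```python
import numpy as np

def generate_band_images(raw: np.ndarray, A: np.ndarray) -> np.ndarray:
    """raw: mosaic frame of shape (H, W) with H, W even; A: 4x4 coefficient matrix.
    Returns an array of shape (4, H//2, W//2) holding the λ1..λ4 band images."""
    xb   = raw[0::2, 0::2]   # B pixels   (upper left of each block)
    xg   = raw[0::2, 1::2]   # G pixels   (upper right)
    xr   = raw[1::2, 0::2]   # R pixels   (lower left)
    xnir = raw[1::2, 1::2]   # NIR pixels (lower right)
    x = np.stack([xb, xg, xr, xnir])          # shape (4, H//2, W//2)
    return np.tensordot(A, x, axes=(1, 0))    # equation (8) applied per block
```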
  • As described above, the imaging apparatus 1 of the present embodiment can acquire images in four wavelength bands at a time with one optical system 10 and one image sensor 100, and the acquired images are of high quality. That is, as long as the light is correctly separated in the optical system, the image of each wavelength band can be reproduced by signal processing, so a high-quality multispectral image is obtained. Since the bandpass filters provided in the optical regions S1 to S4 of the pupil division filter 14 can separate light of the desired wavelength bands with high accuracy, the light separation itself is easy. Therefore, even images of an extremely narrow band with a bandwidth of 10 to 20 nm can be acquired with high quality.
  • In addition, since the pupil division filter 14 has an independent transmission wavelength band in each of the optical regions S1 to S4, the transmission wavelength bands can be changed easily, and the filter can be produced easily and at low cost.
  • Furthermore, the amount of transmitted light in each of the optical regions S1 to S4 can be controlled independently, so the amount of light incident on the image sensor 100 can be made uniform across the wavelength bands. This improves the accuracy of the signal processing (the calculation of the pixel signals of each wavelength band) and makes it possible to generate a high-quality multispectral image.
  • The optical system is configured to separate the incident light into light of m wavelength bands and make it incident on the image sensor.
  • the optical system includes m optical regions having different wavelength bands of transmitted light. The wavelength band transmitted by each optical region is the wavelength band of the image to be acquired.
  • one pixel block is composed of m or more pixels, and each pixel is provided with an optical filter having different spectral transmittances. Assuming that the number of pixels constituting one pixel block is n, n is an integer satisfying 2 ⁇ m ⁇ n.
  • the pixels constituting one pixel block may have at least m or more pixels of different optical filters. That is, as long as the number of pixels of different optical filters is m or more, one pixel block may include pixels of the same optical filter.
  • In an imaging device composed of an optical system having m optical regions with different transmitted-light wavelength bands and an image sensor whose pixel blocks each have n pixels (each provided with an optical filter of different spectral transmittance), let the n pixel signals obtained from each pixel block of the image sensor be x1, x2, ..., xn. Then the pixel signals X1, X2, ..., Xm of the m wavelength bands (the wavelength bands of light transmitted through the respective optical regions of the optical system) are calculated by the following equation (9), using the matrix A of m rows and n columns whose elements are represented by aij, where i is an integer satisfying 1 ≤ i ≤ m and j is an integer satisfying 1 ≤ j ≤ n.
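  • The formula image of equation (9) is not reproduced in this text; from the description it is the general form (reconstruction):

```latex
\begin{pmatrix} X_1 \\ X_2 \\ \vdots \\ X_m \end{pmatrix}
= A \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix},
\qquad A = (a_{ij}) \in \mathbb{R}^{m \times n},
\qquad X_i = \sum_{j=1}^{n} a_{ij}\, x_j \quad (9)
```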
  • The matrix A is obtained by calculating the inverse matrix B⁻¹ of the matrix B whose elements are the ratios at which light in each wavelength band transmitted through each optical region of the optical system is received by each pixel of each pixel block of the image sensor. Each element of the matrix B, that is, each of these ratios, is determined as described in the above embodiment.
  • The matrix B must be an invertible (regular) matrix, that is, a matrix for which an inverse exists. Therefore, the wavelength bands selected for the optical regions of the optical system (the transmission wavelength bands of the bandpass filters provided in the pupil division filter) and the optical filters provided in the image sensor must be chosen so that the matrix B is a regular matrix.
  • When the matrix B is not a square matrix (when n ≠ m, that is, m < n), the transmission wavelength bands of the optical regions of the optical system and the optical filters are set so that a generalized inverse matrix (pseudo-inverse matrix) of the matrix B exists.
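  • In that non-square case, the coefficient group can be computed with a Moore-Penrose pseudo-inverse; a minimal NumPy sketch (illustrative only):

```python
import numpy as np

def coefficient_group(B: np.ndarray) -> np.ndarray:
    """B: (n, m) matrix of ratios (n pixels per block, m optical regions, m <= n).
    Returns A = pinv(B) of shape (m, n); X = A @ x is the least-squares solution."""
    return np.linalg.pinv(B)
```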
  • The configuration can be simplified by making the number n of pixels constituting one pixel block equal to the number m of optical regions of the optical system.
  • arithmetic processing can be simplified and speeded up.
  • the bandwidth of each wavelength band transmitted in each optical region of the optical system is set according to the application of the imaging apparatus, but is preferably 40 [nm] or less. For example, when it is used in an environment where a sufficient amount of light can be obtained, it can be set to 20 [nm] or 10 [nm]. By narrowing the band, the separation of light rays can be improved and a higher quality multispectral image can be generated.
  • the combination of optical filters provided in each pixel of the image sensor is not limited to the example of the above embodiment.
  • An optical filter that transmits light in another wavelength band can also be used in combination.
  • However, the optical filters provided in the pixels of the image sensor and the wavelength bands of light transmitted through the optical regions of the optical system are required to satisfy the following condition: it must be possible to set the matrix B, whose elements are the ratios at which light in each wavelength band transmitted through each optical region of the optical system is received by each pixel of each pixel block of the image sensor, and to calculate its inverse matrix B⁻¹. If this condition is satisfied, the above equation (1) can be set, and an image of each wavelength band can be generated from the output of the image sensor.
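  • A simple way to verify this condition for a candidate combination of optical filters and bandpass filters (illustrative sketch only; the conditioning threshold is an arbitrary assumption, not from the patent):

```python
import numpy as np

def filters_are_usable(B: np.ndarray, cond_limit: float = 1e3) -> bool:
    """True if B has full column rank (an inverse or pseudo-inverse exists) and
    is not too ill-conditioned for a stable calculation of equation (1)."""
    full_rank = np.linalg.matrix_rank(B) == B.shape[1]
    return full_rank and np.linalg.cond(B) < cond_limit
```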
  • It is preferable to set each wavelength band transmitted in each optical region of the optical system so as to avoid regions where the spectral sensitivity curves of the optical filters change significantly (regions where the change is steep). That is, it is preferable to set each band in a region where the amount of change of the spectral sensitivity curve of the optical filter is equal to or less than a threshold (so that, within the wavelength band transmitted in each optical region, the amount of change of the spectral sensitivity curve is equal to or less than the threshold).
  • Likewise, it is preferable to set each wavelength band transmitted in each optical region of the optical system in a region where the fluctuation of the spectral sensitivity curve is small, that is, in a region containing at most one inflection point (preferably, the number of inflection points within the wavelength band transmitted in each optical region is one or less).
  • The pupil division filter 14 may be detachably attached to the lens barrel, with the bandpass filters BPF1 to BPF4 provided in the optical regions S1 to S4 being individually replaceable, so that the transmission wavelength band of each of the optical regions S1 to S4 of the optical system 10 can be changed individually. For example, the filter frame that holds the bandpass filters BPF1 to BPF4 of the optical regions S1 to S4 is held detachably with respect to the lens barrel, and the bandpass filters BPF1 to BPF4 are held detachably with respect to the filter frame so that they can be exchanged individually.
  • In this case, the matrix A is obtained for each combination of bandpass filters constituting the pupil division filter 14, and the corresponding coefficient groups are stored in the coefficient storage unit 200C. The multispectral image generation unit 200B, which is the calculation unit, acquires the coefficient group corresponding to the mounted combination of bandpass filters from the coefficient storage unit 200C and generates the image of each wavelength band. The information on the combination of bandpass filters attached to the optical system may be input manually by the user, or may be acquired automatically through communication between the signal processing device 200 and the optical system 10.
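  • An illustrative sketch of such a coefficient store (the names and keys are assumptions; the matrix values for the 550/660/735/790 nm combination are those of FIG. 11):

```python
import numpy as np

# One measured mixing matrix B per bandpass-filter combination mounted in S1-S4.
measured_B = {
    ("550nm", "660nm", "735nm", "790nm"): np.array([
        [ 7.00,  1.00,  3.00,  8.00],
        [32.00,  2.50,  6.00, 12.00],
        [ 6.00, 19.50, 14.50, 16.00],
        [ 1.70,  5.00,  9.00, 27.00],
    ]),
}
# Coefficient groups A = B^-1, as held by the coefficient storage unit 200C.
coefficient_store = {combo: np.linalg.inv(B) for combo, B in measured_B.items()}

def coefficients_for(mounted: tuple) -> np.ndarray:
    """Return the coefficient group for the filter combination currently mounted,
    as reported by user input or by lens-to-body communication."""
    return coefficient_store[mounted]
```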
  • In the above embodiment, the so-called pupil portion is divided into four equal parts in the circumferential direction to form the optical regions S1 to S4, but the manner of dividing the optical region is not limited to this.
  • FIG. 13 is a diagram showing another example of division of the optical region (another example of the pupil division filter).
  • The figure shows an example in which the pupil division filter 14 is configured by a filter frame 14A provided with four openings 14a1 to 14a4 and the bandpass filters BPF1 to BPF4 mounted in the respective openings 14a1 to 14a4.
  • Each of the openings 14a1 to 14a4 has a circular shape and is arranged at regular intervals (90 ° intervals) in the circumferential direction.
  • the openings 14a1 to 14a4 form the optical regions S1 to S4.
  • By making the bandpass filters BPF1 to BPF4 mounted in the openings 14a1 to 14a4 interchangeable, the combination of wavelength bands transmitted in the optical regions can be changed arbitrarily.
  • the pupil portion can also be divided into squares or concentric circles.
  • the function of the multispectral image generation unit 200B (calculation unit) in the signal processing device 200 can be realized by using various processors.
  • the various processors include, for example, a CPU (Central Processing Unit), which is a general-purpose processor that executes software (a program (signal processing program in the present invention)) to realize various functions.
  • The above-mentioned processors also include a GPU (Graphics Processing Unit), which is a processor specialized in image processing, and programmable logic devices (PLDs), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture.
  • the above-mentioned various processors also include a dedicated electric circuit, which is a processor having a circuit configuration specially designed for executing a specific process such as an ASIC (Application Specific Integrated Circuit).
  • Each part may be realized by one processor, or by a plurality of processors of the same or different types (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU). A plurality of functions may also be realized by one processor. As a first example of configuring a plurality of functions with one processor, as typified by a computer such as a server, one processor is configured by a combination of one or more CPUs and software, and this processor realizes the plurality of functions.
  • As a second example, as typified by a System on Chip (SoC), there is a form of using a processor that realizes the functions of the entire system with a single IC chip.
  • various functions are configured by using one or more of the above-mentioned various processors as a hardware structure.
  • the hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • These electric circuits may be circuits that realize the above functions by using logical OR, logical AND, logical NOT, exclusive OR, and logical operations combining these.
  • When the processor executes software (a program), the processor-readable (computer-readable) code of the software to be executed is stored in a non-transitory recording medium such as a ROM (Read Only Memory), and the processor refers to the software. The software stored in the non-transitory recording medium includes a program for executing image input, analysis, display control, and the like. The code may be recorded on a non-transitory recording medium such as various magneto-optical recording devices or semiconductor memories instead of the ROM.
  • When processing is performed using software, for example, a RAM (Random Access Memory) is used as a temporary storage area, and data stored in, for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory) can also be referred to.
  • the coefficient storage unit 200C of the signal processing device 200 can be realized by, for example, a memory such as a ROM (Read-only Memory) or an EEPROM (Electrically Erasable Programmable Read-only Memory).
  • the image pickup device can also be configured as an interchangeable lens type image pickup device in which the optical system can be exchanged.
  • Since the matrix A is uniquely determined for each lens (optical system), a matrix A is prepared for each lens and its coefficient group is stored in the coefficient storage unit. When a lens is exchanged, the coefficient group of the matrix A corresponding to the mounted lens is read out from the coefficient storage unit, the arithmetic processing is executed, and the image of each optical region (each wavelength band) is generated.
  • Reference numerals: 1 imaging device; 10 optical system; 12 lens; 200B multispectral image generation unit; 200C coefficient storage unit; 200D output unit; Cb spectral sensitivity curve of the B pixel; Cg spectral sensitivity curve of the G pixel; Cr spectral sensitivity curve of the R pixel; Cnir spectral sensitivity curve of the NIR pixel

Abstract

The invention relates to an imaging device, and to a signal processing device, method, and program, capable of acquiring a high-quality multispectral image. The imaging device comprises: an image sensor (100) including a plurality of pixel blocks each composed of n pixels having optical filters with different spectral transmittances, n being an integer satisfying 2 ≤ n; an optical system (10) having m optical regions whose transmitted-light wavelength bands differ from one another, m being an integer satisfying 2 ≤ m ≤ n; and a signal processing device (200) that, from the n pixel signals (x1, x2, ..., xn) obtained from each pixel block of the image sensor (100), calculates the m pixel signals (X1, X2, ..., Xm) corresponding to the wavelength bands of light passing through the respective optical regions of the optical system (10) and generates a multispectral image.
PCT/JP2020/015200 2019-04-15 2020-04-02 Imaging device, signal processing device, signal processing method, and signal processing program WO2020213419A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019076916 2019-04-15
JP2019-076916 2019-04-15

Publications (1)

Publication Number Publication Date
WO2020213419A1 true WO2020213419A1 (fr) 2020-10-22

Family

ID=72837770

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/015200 WO2020213419A1 (fr) 2019-04-15 2020-04-02 Imaging device, signal processing device, signal processing method, and signal processing program

Country Status (1)

Country Link
WO (1) WO2020213419A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007536064A (ja) * 2004-05-10 2007-12-13 アバゴ・テクノロジーズ・ジェネラル・アイピー(シンガポール)プライベート・リミテッド Pupil detection method and system
JP2010043979A (ja) * 2008-08-13 2010-02-25 Yuichi Kamata Spectroscopic image measurement device
WO2012098599A1 (fr) * 2011-01-17 2012-07-26 パナソニック株式会社 Imaging device
JP2017032537A (ja) * 2015-08-03 2017-02-09 采▲ぎょく▼科技股▲ふん▼有限公司VisEra Technologies Company Limited Spectrum measurement device

Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20791463; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 20791463; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)