WO2015133130A1 - Video capture device, signal separation device, and video capture method - Google Patents

Video capture device, signal separation device, and video capture method

Info

Publication number
WO2015133130A1
WO2015133130A1 (PCT/JP2015/001121)
Authority
WO
WIPO (PCT)
Prior art keywords
video data
video
infrared
signal
color
Prior art date
Application number
PCT/JP2015/001121
Other languages
English (en)
Japanese (ja)
Inventor
塚田 正人
ナンシー マクガイア キンバリー
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to JP2016506139A priority Critical patent/JPWO2015133130A1/ja
Priority to US15/122,034 priority patent/US10334185B2/en
Publication of WO2015133130A1 publication Critical patent/WO2015133130A1/fr

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from visible and infrared light wavelengths
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00: Optical elements other than lenses
    • G02B5/20: Filters
    • G02B5/208: Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/84: Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843: Demosaicing, e.g. interpolating colour pixel values
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10: Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11: Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00: Details of colour television systems
    • H04N2209/04: Picture signal generators
    • H04N2209/041: Picture signal generators using solid-state devices
    • H04N2209/042: Picture signal generators using solid-state devices having a single pick-up sensor
    • H04N2209/047: Picture signal generators using solid-state devices having a single pick-up sensor using multispectral pick-up elements

Definitions

  • the present invention relates to video imaging technology, and more particularly to video imaging technology related to video processing in the visible light region and near infrared region.
  • In a photographing apparatus such as a digital camera or a video camera, the image sensor usually has a three-color optical filter of red (R), green (G), and blue (B). Light incident on the camera is decomposed by this three-color optical filter and converted into a video signal by the image sensor to generate RGB video data.
  • Since the image sensor used in the photographing apparatus is a silicon sensor, it has sensitivity to light from the visible light region through the near-infrared region.
  • near-infrared light that adversely affects color reproduction is removed by a near-infrared cut filter.
  • This is because the R, G, and B three-color optical filters ensure transmittance in the visible wavelength band that each filter is responsible for, but their light transmission characteristics in the near-infrared region outside the visible light region are in some cases not considered.
  • FIG. 1 is an example showing the spectral transmittance of an RGB three-color optical filter.
  • Each color filter is expected to transmit light in its own band: the B filter at about 400 nm to 500 nm, the G filter at about 500 nm to 600 nm, and the R filter at about 600 nm to 700 nm.
  • However, each filter also has the characteristic of transmitting light of 700 nm or more, which lies outside the visible light region, that is, near-infrared light.
  • the spectral sensitivity characteristic of an image sensor using a photodiode generally employed in a color image input device such as a digital camera or a video camera has sensitivity even in a wavelength region of 700 nm or more. If the three-color optical filter having the spectral characteristics shown in FIG. 1 is simply applied to the image sensor as it is, a problem arises from the viewpoint of color reproducibility.
  • the color matching function of the XYZ color system relating to human color perception is expressed as shown in FIG.
  • In human color perception, sensitivity is zero for light of 700 nm or more, so light having power in the wavelength region of 700 nm or more does not affect the perceived color, which is a psychophysical quantity.
  • Therefore, an IR (Infra-Red) cut filter is used that has a spectral transmittance, as shown in FIG., that eliminates the influence of near-infrared light of 700 nm or more.
  • An IR cut filter is incorporated in the optical system of the color image input apparatus to block near-infrared light from entering the three-color optical filter and the image sensor. In this way, only light having power in the visible wavelength region is input to the three-color optical filter, the light dispersed by the three-color optical filter is input to the image sensor, and an RGB signal is generated.
  • Non-Patent Document 1 proposes a method that uses two cameras, one for capturing RGB images and one for capturing IR images.
  • Non-Patent Document 2 shows spectral sensitivity characteristics of R, G, B, and IR optical filters.
  • In the near-infrared region, the spectral sensitivity characteristics of the R, G, and B color filters have essentially the same shape as that of the IR filter. In order to achieve high color reproducibility during daytime shooting, it is therefore necessary to remove the influence of the near-infrared light contained in R, G, and B.
  • In Non-Patent Document 2, the IR signal obtained through the IR filter is used to remove the influence of the near-infrared light contained in R, G, and B, and corrected R, G, and B signals are generated. For night shooting, all of the R, G, B, and IR signals are used.
  • Patent Document 1 proposes an imaging device that generates NIR signals using R, G, and B three-color optical filters that transmit near-infrared light together with a special photosensor that detects near-infrared light (NIR). For example, light transmitted through the R filter becomes R + NIR and enters the photosensor. This photosensor includes a visible-light sensor unit that detects R at a shallow position and a non-visible-light sensor unit that detects NIR at a deeper position along the incident direction of the light. The same applies to G and B.
  • Non-Patent Document 3 shows an example of a demosaicing processing method, which will be described later in order to explain the present embodiment.
  • Non-Patent Document 4 shows a method using Gradient Interpolation described later.
  • Non-Patent Document 5 shows an example of a demosaicing processing technique.
  • Patent Document 2 discloses an imaging apparatus that can eliminate the influence of unnecessary wavelength region components such as infrared light without using an infrared light filter.
  • The method of Non-Patent Document 1 can generate high-resolution RGB and NIR images by using two cameras, but it is difficult to make the image input device compact, and the cost is high. Even if the same method is incorporated into a single apparatus, the problem is hard to solve, because separate RGB and NIR optical paths and two image sensors are still required.
  • Non-Patent Document 2 and Patent Document 1 rely on special image sensors for generating a near-infrared image. In other words, the required image sensor depends on a dedicated semiconductor manufacturing process, is difficult to obtain, and is at present more expensive than a normal image sensor.
  • The main purpose of the present invention is to solve these problems and to provide a video imaging technique capable of easily processing video in the visible light region and the near-infrared region while utilizing the configuration of a general video imaging device.
  • One aspect of the present invention that solves the above problems is a video imaging device comprising a video data acquisition unit that acquires video data including a periodic pattern produced by near-infrared light, and a video processing unit that acquires a color signal of the visible light component and a near-infrared signal from the video data based on the periodic pattern.
  • Another aspect of the present invention for solving the above problems comprises video data acquisition means for acquiring video data including a periodic pattern produced by near-infrared light, and video processing means for acquiring a color signal of the visible light component and a near-infrared signal from the video data based on the periodic pattern.
  • Another aspect of the present invention for solving the above problems is a video imaging method of acquiring video data including a periodic pattern produced by near-infrared light, and acquiring a color signal of the visible light component and a near-infrared signal from the video data based on the periodic pattern.
  • According to the present invention, it is possible to easily perform video processing in the visible light region and the near-infrared region while utilizing the configuration of a general photographing apparatus.
  • Brief description of the drawings: the spectral transmittance of an RGB three-color optical filter (FIG. 1); the color matching functions of the XYZ color system related to human color perception; a spectral intensity distribution of color light shown as a reference; an example of the spectral transmittance of an IR cut filter; a schematic configuration of the optical system of a color image input device (FIG. 5); a schematic configuration of an imaging device using a four-color optical filter; a functional block diagram of the video processing apparatus in the first embodiment (FIG. 7); and a comparative example showing the intensity of a signal in two-dimensional Fourier space (FIG. 8).
  • FIG. 7 is a functional block diagram of the video processing device 4 in the first embodiment of the present invention.
  • The video processing device 4 includes a video data acquisition unit 141, a first color signal acquisition unit 142, a Fourier transform unit 143, a periodic pattern detection unit 144, a peak removal unit 145, a second color signal acquisition unit 146, and a near-infrared signal acquisition unit 147.
  • the video data acquisition unit 141 acquires video data from the external means 140.
  • the video data includes a plurality of color signals. Further, the video data includes a periodic pattern by near infrared light.
  • the first color signal acquisition unit 142 acquires one color signal (first color signal) from the video data.
  • the first color signal is a signal obtained by adding a near infrared signal to the original color signal (second color signal).
  • the Fourier transform unit 143 transforms the first color signal into a signal in the frequency axis space by two-dimensional Fourier transform.
  • the space on the frequency axis converted by the two-dimensional Fourier transform may be referred to as a two-dimensional Fourier space.
  • the first color signal includes a periodic pattern by near infrared light. As a result, a peak due to near-infrared light occurs at a specific frequency in the two-dimensional Fourier space.
  • the periodic pattern detection unit 144 detects a peak at a specific frequency formed by a near infrared light component.
  • the peak removing unit 145 removes a peak at a specific frequency from the first color signal in the two-dimensional Fourier space.
  • the information after the removal is the second color signal in the two-dimensional Fourier space.
  • the second color signal acquisition unit 146 acquires the converted second color signal by performing an inverse Fourier transform on the information from which the peak has been removed. That is, the second color signal is a color signal based only on a visible light component that does not include near-infrared light.
  • the near-infrared signal acquisition unit 147 acquires a near-infrared signal by subtracting the second color signal from the first color signal. Further, a near infrared signal is acquired for each of a plurality of color signals, and NIR video data is generated.
  • In a Fourier transform, a signal on the time axis is converted into a signal on the frequency axis; likewise, a signal on a spatial coordinate axis is converted into a signal on the wave-number axis.
  • A period usually refers to a repeating interval in time; here the same notion is applied to a repeating interval in space, so a pattern that repeats at a fixed spatial interval appears as a peak at a specific spatial frequency, as illustrated by the sketch below.
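  • The following short sketch (not part of the original disclosure; the image size, pattern pitch, and amplitude are arbitrary assumptions) illustrates the point numerically: adding a spatially periodic component to an image produces isolated peaks at the corresponding wave numbers of its two-dimensional discrete Fourier transform.

```python
import numpy as np

# A minimal illustration, assuming an arbitrary 256x256 "visible" image and an
# additive near-infrared component that repeats every 8 pixels in x and in y.
rng = np.random.default_rng(0)
visible = rng.random((256, 256))                      # stand-in for a natural scene
y, x = np.mgrid[0:256, 0:256]
nir_pattern = 0.3 * (np.cos(2 * np.pi * x / 8) + np.cos(2 * np.pi * y / 8))

observed = visible + nir_pattern                      # first color signal (visible + NIR)

spectrum = np.fft.fftshift(np.fft.fft2(observed))
magnitude = np.abs(spectrum)
magnitude[128, 128] = 0.0                             # ignore the DC term

# The periodic component appears as sharp peaks at +/-32 cycles per image
# width/height (256 / 8 = 32), far above the broadband natural-scene content.
peaks = np.argwhere(magnitude > 20 * np.median(magnitude))
print([(int(r) - 128, int(c) - 128) for r, c in peaks])   # -> (0, +/-32) and (+/-32, 0)
```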
  • the video data acquisition unit 141 acquires video data for all pixels.
  • R, G, B color information is set for all pixels. Furthermore, near-infrared light components are added to the R, G, and B color information in a specific frequency pattern.
  • NIR video data is extracted and generated from video data in which R, G, and B color information is set for all pixels by the following filtering process. This will be described in detail below.
  • the first color signal acquisition unit 142 selects one color signal (first color signal) from the plurality of color signals (R, G, B). For example, the first color signal acquisition unit 142 selects the first color signal I G_NIR related to the G component.
  • the Fourier transform unit 143 transforms the first color signal into a two-dimensional Fourier space.
  • FIG. 8 is a diagram showing a comparative example of Fourier transform.
  • FIG. 8 shows the intensity of a signal in a two-dimensional Fourier space by obtaining video data of a certain natural scene and applying a two-dimensional discrete Fourier transform to the first color signal related to the G component.
  • The acquired video data was taken using a normal IR cut filter, so the signal intensity at each frequency shown in FIG. 8 is based only on the visible light component and contains no near-infrared component. That is, no periodic pattern can be observed.
  • FIG. 9 shows an example of Fourier transform.
  • the image data of the same scene as in FIG. 8 is acquired, and the two-dimensional discrete Fourier transform is applied to the first color signal related to the G component to express the intensity of the signal in the two-dimensional Fourier space.
  • the video data includes a periodic pattern using near infrared light.
  • a peak due to near-infrared light is generated at a specific frequency in the two-dimensional Fourier space shown in FIG. That is, the intensity of the signal at each frequency is due to the visible light component and the near-infrared light component.
  • the signal intensity in the Fourier space shown in FIGS. 8 and 9 is different depending on whether or not it includes near-infrared light formed in a periodic pattern.
  • A periodic pattern (peaks) can be confirmed in the intensity of the signal in the Fourier space shown in FIG. 9, and it can be seen that these peaks are due to the near-infrared light component. One possible way to locate such peaks automatically is sketched below.
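  • The publication determines these peak frequencies by calibration or by modeling (described later). Purely as an illustration, and not as the method of the publication, they could also be located directly from the Fourier magnitude, for example by comparing each coefficient against a smoothed local background; the function name and threshold below are hypothetical.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_periodic_peaks(channel, ratio=8.0):
    """Return (row, col) offsets from the spectrum centre where sharp peaks occur.

    channel : 2-D array holding one color signal that may contain a periodic
              near-infrared pattern (e.g. I_G_NIR).
    ratio   : how many times larger than the local background a coefficient
              must be to count as a peak (an arbitrary illustrative value).
    """
    spectrum = np.fft.fftshift(np.fft.fft2(channel))
    magnitude = np.abs(spectrum)

    centre = np.array(magnitude.shape) // 2
    magnitude[tuple(centre)] = 0.0                 # ignore the DC term

    background = median_filter(magnitude, size=9)  # smooth broadband scene content
    peaks = np.argwhere(magnitude > ratio * (background + 1e-9))
    return [tuple(p - centre) for p in peaks]
```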
  • the peak removing unit 145 removes the intensity of the signal at a specific frequency at which a peak due to near infrared light occurs in Fourier space.
  • After the peaks are removed, an inverse Fourier transform is performed to convert the result back into video data.
  • For the peak removal, a Butterworth filter is used; in its transfer function, ω represents the frequency and ω_c represents the frequency at which a peak is to be removed.
  • This transfer function is applied only to the specific frequencies at which peaks occur in Fourier space (that is, filtering is performed only for those frequencies), so the peaks can be removed while suppressing discontinuities in signal intensity between those frequencies and their surroundings.
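  • The publication's filter expression is not reproduced in this text, so the band-reject form, the order n, and the notch width below are assumptions; the sketch only illustrates how a smooth Butterworth-style notch could be applied around each detected peak frequency in the two-dimensional Fourier space.

```python
import numpy as np

def butterworth_notch(shape, peaks, width=3.0, order=2):
    """Build a multiplicative mask for an fftshift-ed 2-D spectrum.

    shape : spectrum shape (rows, cols).
    peaks : iterable of (row_offset, col_offset) from the spectrum centre,
            e.g. the output of the hypothetical detect_periodic_peaks() above.
    width : notch radius in frequency bins (assumed value).
    order : Butterworth order n (assumed value).

    The mask is ~1 everywhere except near each peak (and its mirror image,
    since the spectrum of a real signal is conjugate-symmetric), where it
    falls smoothly toward 0, avoiding hard discontinuities.
    """
    rows, cols = shape
    v, u = np.mgrid[0:rows, 0:cols]
    v = v - rows // 2
    u = u - cols // 2

    mask = np.ones(shape)
    for dr, dc in peaks:
        for sr, sc in ((dr, dc), (-dr, -dc)):       # peak and its mirror
            dist = np.hypot(v - sr, u - sc)
            mask *= 1.0 - 1.0 / (1.0 + (dist / width) ** (2 * order))
    return mask
```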
  • the second color signal acquisition unit 146 performs an inverse Fourier transform on the result of removing the signal intensity at a specific frequency where a peak due to near-infrared light is generated by the Butterworth filter.
  • As a result, video data composed of the color signal I_G, based only on the visible light component and not including near-infrared light, is obtained.
  • The near-infrared signal acquisition unit 147 obtains the video data I_NIR_G, consisting of NIR only, by subtracting the video data I_G from the video data I_G_NIR of the G signal that includes NIR, as shown in equation (2).
  • I_NIR_G = I_G_NIR - I_G  (2)
  • the first color signal related to the G component has been described above.
  • By applying the same processing to the first color signal related to the R component, the second color signal I_R, based only on the visible light component and not including near-infrared light, and the video data I_NIR_R, based only on NIR, are obtained.
  • Likewise, for the B component, the second color signal I_B, based only on the visible light component and not including near-infrared light, and the video data I_NIR_B, based only on NIR, are obtained.
  • In this way, video data I_R, I_G, and I_B containing only the R, G, and B color signals, and video data I_NIR_R, I_NIR_G, and I_NIR_B containing only NIR, are obtained.
  • That is, R, G, B, and NIR video data I_R, I_G, I_B, and I_NIR are generated from video data including a periodic pattern of near-infrared light.
  • As described above, video data in the visible light region and the near-infrared region can be acquired by processing video data that includes a periodic pattern produced by near-infrared light; the per-channel processing is summarized in the sketch below.
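  • Combining the steps above, and purely as an illustrative summary (this is not code from the publication, and it reuses the hypothetical helpers detect_periodic_peaks and butterworth_notch sketched earlier), the per-channel separation of the first embodiment could look like this:

```python
import numpy as np

def separate_channel(first_color_signal):
    """Split one color channel (e.g. I_G_NIR) into visible and NIR parts.

    Returns (second_color_signal, nir_signal), i.e. (I_G, I_NIR_G) for the
    G channel, following units 142-147 of the first embodiment.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(first_color_signal))      # unit 143
    peaks = detect_periodic_peaks(first_color_signal)                # unit 144
    spectrum *= butterworth_notch(spectrum.shape, peaks)             # unit 145
    visible = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))      # unit 146
    nir = first_color_signal - visible                               # unit 147, eq. (2)
    return visible, nir

def separate_all(video_rgb_nir):
    """video_rgb_nir : dict {'R','G','B'} -> 2-D arrays with NIR mixed in."""
    visible, nir = {}, {}
    for name, channel in video_rgb_nir.items():
        visible[name], nir[name] = separate_channel(channel)
    # One possible way (an assumption) to form a single NIR image: average the
    # per-channel NIR estimates I_NIR_R, I_NIR_G, I_NIR_B.
    nir_image = np.mean(np.stack(list(nir.values())), axis=0)
    return visible, nir_image
```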
  • FIG. 10 is a schematic configuration diagram of a video imaging apparatus 100 according to the second embodiment of the present invention.
  • the video photographing apparatus 100 includes a code-type IR cut filter 1, an optical filter 2, a photo sensor 3, and a video processing unit 4A.
  • the camera lens may be a normal camera lens.
  • The optical filter 2 and the photosensor 3 may be an optical filter and a photosensor of the kind currently in general use in color image input devices (or video photographing devices). That is, the spectral characteristics of the optical filter 2 are the same as those shown in FIG. 1.
  • FIG. 11 is a diagram showing an outline of the optical filter 2 and the photosensor 3.
  • The arrangement of the three colors R, G, and B in the optical filter 2 shown in FIG. 11 is called a Bayer arrangement.
  • One color of R, G, and B is assigned to one pixel of the photosensor 3 so as to correspond to R, G, and B in the optical filter 2.
  • FIG. 12 is an outline of the code type IR cut filter 1.
  • the code-type IR cut filter 1 is a filter provided with two patterns, a portion that cuts near infrared light (NIR) (infrared cut portion 11) and a portion that transmits light (infrared transmission portion 12). That is, the code type means binary values of transmission and cut.
  • The infrared transmission sections 12 are arranged regularly in the code-type IR cut filter 1 so as to maintain periodicity, for example along the vertical direction, the horizontal direction, an oblique direction, or in concentric rings. Note that their size and shape do not necessarily match the size and shape of one pixel of the photosensor. A small sketch of such a mask follows.
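  • Purely for illustration (the grid layout, pitch, and opening size below are arbitrary assumptions rather than values from the publication), such a binary transmission/cut pattern could be generated as follows:

```python
import numpy as np

def coded_ir_mask(rows, cols, pitch=8, opening=2):
    """Binary mask for a code-type IR cut filter.

    1 = infrared transmission section 12 (NIR passes),
    0 = infrared cut section 11 (NIR blocked).
    Openings of size `opening` x `opening` repeat every `pitch` pixels in both
    directions, which is what produces the periodic NIR pattern on the sensor.
    """
    y, x = np.mgrid[0:rows, 0:cols]
    return (((y % pitch) < opening) & ((x % pitch) < opening)).astype(np.uint8)

mask = coded_ir_mask(16, 16)
print(mask)   # a 16x16 excerpt: 2x2 openings on an 8-pixel grid
```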
  • The code-type IR cut filter 1 is provided on the front side of the optical filter 2 in the light traveling direction. As a result, diffraction of near-infrared light occurs (described later).
  • the light that has passed through the code-type IR cut filter 1 and the optical filter 2 is converted into three-color signals of R, G, and B by the photosensor 3 and output as video data.
  • These code-type IR cut filter 1, optical filter 2 and photosensor 3 form video data to which a periodic pattern by near infrared light is added.
  • the video processing unit 4A generates video data composed of four color signals of R, G, B, and NIR based on video data composed of three color signals of R, G, and B.
  • Detailed processing contents are the same as those of the video processing device 4 of the first embodiment.
  • The light that enters the image capturing device 100 through the camera lens is split by the code-type IR cut filter 1 into light from which near-infrared light has been cut and light that still includes near-infrared light.
  • the two types of split light are incident on a photosensor 3 in which an R, G, B Bayer array type optical filter 2 is incorporated.
  • FIG. 13 shows the behavior of near-infrared light when the near-infrared light passes through the infrared transmission part 12 in the code-type IR cut filter 1.
  • Near-infrared light transmitted through the infrared transmission part 12 in the code-type IR cut filter is diffused by diffraction and enters the optical filter 2 and the photosensor 3.
  • The light incident on the optical filter 2 therefore consists of visible light and of near-infrared light diffused by the infrared transmission parts 12 of the code-type IR cut filter 1.
  • R, G, and B signals including near-infrared light components are generated in the photosensor 3 from light (visible light and near-infrared light) transmitted through each filter in the optical filter 2.
  • the component of near infrared light is expressed as NIR.
  • The video processing unit 4A applies a demosaicing process and a filtering process to the video data I_RGB_NIR, composed of the three color signals R, G, and B including NIR, output from the photosensor 3, and generates four channels of video data: R, G, B, and NIR. At this point, a periodic pattern produced by near-infrared light has been added to the acquired video data I_RGB_NIR.
  • the video processing unit 4A obtains video data (R, G, B color information) of a pixel having a coordinate value (1, 1).
  • the R, G, and B color information at this point includes NIR information.
  • For simplicity, the color information is written here simply as R, G, and B.
  • Since the pixel at coordinate (1, 1) corresponds to R, the video processing unit 4A calculates the missing G and B values, for example, by interpolating from the color information of the peripheral pixels as follows.
  • G(1,1) = (G(2,1) + G(1,2)) / 2
  • B(1,1) = B(2,2)
  • the video processing unit 4A acquires video data (R, G, B color information) of the pixel having the coordinate value (1, 2). Since the pixel having the coordinate value (1, 2) corresponds to G, the video processing unit 4A directly acquires the G value.
  • G(1,2) = G(1,2)
  • the video processing unit 4A similarly calculates the R value and B value that do not exist in the pixel having the coordinate value (1, 2) by interpolating from the color information of the peripheral pixels.
  • R(1,2) = R(1,1)
  • B(1,2) = B(2,2)
  • the video processing unit 4A repeats the above processing to acquire video data (R, G, B color information) for all pixels.
  • the demosaicing process is not limited to the above method, and various methods shown in Non-Patent Documents 3 to 5 may be used. As described above, video data in which R, G, and B color information is set for all pixels is obtained.
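  • As a minimal sketch of the demosaicing step (assuming, as in the example above, an RGGB Bayer layout with R in the first position; boundary handling uses a simple neighbourhood average, and the more advanced methods of Non-Patent Documents 3 to 5 are not reproduced):

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(raw):
    """Very small bilinear demosaic for an RGGB Bayer mosaic.

    raw : 2-D array from the photosensor; R at (even, even), G at (even, odd)
          and (odd, even), B at (odd, odd), counting rows/columns from 0.
    Returns an (H, W, 3) array with R, G, B values set for every pixel; each
    value still contains the NIR contribution that passed the coded filter.
    """
    h, w = raw.shape
    y, x = np.mgrid[0:h, 0:w]
    site = {
        0: (y % 2 == 0) & (x % 2 == 0),     # R sites
        1: (y % 2) != (x % 2),              # G sites
        2: (y % 2 == 1) & (x % 2 == 1),     # B sites
    }
    kernel = np.ones((3, 3))
    rgb = np.zeros((h, w, 3))
    for c, mask in site.items():
        plane = np.where(mask, raw, 0.0)
        num = convolve2d(plane, kernel, mode="same")
        den = convolve2d(mask.astype(float), kernel, mode="same")
        # Each missing value becomes the average of the same-color samples in
        # its 3x3 neighbourhood, e.g. G(1,1) = (G(2,1) + G(1,2)) / 2 at a corner.
        rgb[..., c] = num / np.maximum(den, 1.0)
        rgb[mask, c] = raw[mask]            # keep measured samples untouched
    return rgb
```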
  • the video processing unit 4A extracts and generates NIR video data from video data in which R, G, and B color information is set for all pixels by filtering processing. Detailed operation is equivalent to the operation of the first embodiment. An operation specific to the second embodiment will be described.
  • Near-infrared light that has passed through the infrared transmission parts 12 of the code-type IR cut filter 1 is spread by the diffraction that occurs as it passes through them, and then irradiates the optical filter 2 and the photosensor 3.
  • If this diffraction produced a complicated pattern, its influence on the pixels of the photosensor 3 would also be complicated, and a method for extracting only the NIR component from the R, G, and B color signals containing it would likewise be complicated.
  • In the present embodiment, however, the infrared transmission parts 12 of the code-type IR cut filter 1 are arranged in a pattern that maintains regular periodicity. As a result, although the near-infrared light transmitted through them is diffracted, it forms a regular periodic pattern on the optical filter 2 and the photosensor 3.
  • The visible light component, on the other hand, is not affected by the infrared transmission parts 12 of the code-type IR cut filter 1; the information of the photographed scene directly irradiates the optical filter 2 and the photosensor 3 and is formed into video data composed of the three color signals R, G, and B. That is, the component added to the video data composed of the three R, G, and B color signals with a specific frequency pattern is the near-infrared light component.
  • the peak due to the near infrared light is removed in the Fourier space using the characteristic that the peak occurs at a specific frequency formed by the near infrared light component.
  • Because the infrared transmission parts 12 of the code-type IR cut filter 1 are arranged in a pattern that maintains regular periodicity, the specific frequencies at which peaks occur in Fourier space are determined by the size, shape, and layout of the infrared transmission sections 12 of the code-type IR cut filter 1 and by the distance between the code-type IR cut filter 1 and the photosensor 3. These frequencies can therefore easily be determined in advance by calibration.
  • Alternatively, the video processing unit 4A can use a model of the light diffraction phenomenon to estimate in advance the image of near-infrared light formed on the sensor surface, apply a Fourier transform to it, and thereby determine the specific frequencies at which peaks occur in Fourier space.
  • For this modeling, the Rayleigh-Sommerfeld diffraction integral can be used. Here, U_2 represents the wave intensity distribution on the plane at distance z_1 from the plane U_1 of the code-type IR cut filter 1, that is, the intensity distribution on the image plane; λ is the wavelength and k is the wave number; r_1 is the distance from the axis of the hole in the code-type IR cut filter 1, and r_2 is the distance on the image plane. By using this model of the light diffraction phenomenon, the image formed by near-infrared light on the sensor surface can be obtained.
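  • As an illustration of this modeling step (a sketch only: the wavelength, pixel pitch, and filter-to-sensor distance below are arbitrary assumptions, and the direct double sum is far slower than a practical implementation would be), the first Rayleigh-Sommerfeld integral can be evaluated numerically for a small coded aperture such as the hypothetical coded_ir_mask() above:

```python
import numpy as np

# Illustrative only: wavelength, sampling pitch, and propagation distance are
# arbitrary assumptions, not values from the publication.
WAVELENGTH = 850e-9            # near-infrared, 850 nm
PIXEL = 3e-6                   # 3 um sampling on both planes
Z1 = 200e-6                    # filter-to-sensor distance z_1

def rayleigh_sommerfeld(u1, wavelength=WAVELENGTH, dx=PIXEL, z=Z1):
    """Directly evaluate the first Rayleigh-Sommerfeld diffraction integral.

    u1 : complex field just behind the code-type IR cut filter (e.g. the
         binary mask from coded_ir_mask() illuminated by a plane NIR wave).
    Returns the complex field U_2 on the sensor plane at distance z.
    This O(N^4) double sum is only meant for small grids.
    """
    k = 2 * np.pi / wavelength
    n, m = u1.shape
    ys = (np.arange(n) - n / 2) * dx
    xs = (np.arange(m) - m / 2) * dx
    u2 = np.zeros_like(u1, dtype=complex)
    for i, y2 in enumerate(ys):
        for j, x2 in enumerate(xs):
            r = np.sqrt((y2 - ys[:, None]) ** 2 + (x2 - xs[None, :]) ** 2 + z ** 2)
            u2[i, j] = np.sum(u1 * z / (1j * wavelength)
                              * np.exp(1j * k * r) / r ** 2) * dx * dx
    return u2

# The peaks of the 2-D Fourier transform of |U_2|^2 then give the frequencies
# to be removed (or stored in the code information memory 5).
```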
  • the video imaging apparatus 100 of the present embodiment is configured by adding a code type IR cut filter 1 to an optical filter 2 and a photosensor 3, which are the configuration of a general imaging apparatus (see FIG. 5).
  • the code type IR cut filter 1 is obtained by adding a simple improvement to a general IR cut filter and has a simple configuration. That is, video processing in the visible light region and the near infrared region can be performed only by adding a simple configuration to the configuration similar to the related art.
  • The imaging apparatus 100 can therefore also be expected to have few failures.
  • In that case, I_R_NIR can be regarded as the video data I_R of the R signal, and I_B_NIR can be regarded as the video data I_B of the B signal; the video data I_G of the G signal is obtained as described above.
  • FIG. 15 is a schematic configuration diagram of a video photographing apparatus 100B according to another embodiment.
  • the video imaging device 100B has a configuration in which a code information memory 5 is added to the video imaging device 100 shown in FIG.
  • the code information memory 5 is connected to the video processing unit 4B. Since the code-type IR cut filter 1, the optical filter 2, and the photosensor 3 are the same as those of the video photographing device 100, only the code information memory 5 and the video processing unit 4B will be described here.
  • the code information memory 5 records the frequency component at the peak in the Fourier space that represents the periodic pattern formed by near infrared light.
  • A pattern having periodicity is formed on the photosensor 3 by the diffraction that occurs when near-infrared light passes through the infrared transmission parts 12, which are arranged with a regularly periodic pattern in the code-type IR cut filter 1.
  • the pattern is recorded in the video data as a color signal.
  • This periodic pattern is determined by the thickness of the code-type IR cut filter 1, the wavelength of the near-infrared light, and the distance between the code-type IR cut filter 1 and the photosensor 3. The specific frequencies to be recorded in the code information memory 5 can therefore be obtained by performing calibration in advance; alternatively, a model of the light diffraction phenomenon may be used. A possible calibration sketch follows.
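  • The publication does not detail the calibration procedure itself; one plausible sketch (all function names and the file format are hypothetical) is to capture a strongly NIR-lit calibration frame through the coded filter, locate its Fourier peaks with the hypothetical detect_periodic_peaks() above, and store their coordinates as the contents of the code information memory 5:

```python
import json
import numpy as np

def calibrate_code_information(calibration_frame, path="code_information.json"):
    """Estimate and persist the peak frequencies produced by the coded filter.

    calibration_frame : one color channel captured through the code-type IR
                        cut filter of a scene with strong NIR content (e.g. a
                        uniformly NIR-illuminated flat target).
    The detected (row, col) frequency offsets play the role of the contents of
    the code information memory 5.
    """
    peaks = detect_periodic_peaks(calibration_frame)
    with open(path, "w") as f:
        json.dump([list(map(int, p)) for p in peaks], f)
    return peaks

def load_code_information(path="code_information.json"):
    """Read back the stored peak frequencies for use by the video processing unit."""
    with open(path) as f:
        return [tuple(p) for p in json.load(f)]
```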
  • The video processing unit 4B applies demosaicing to the video data I_RGB composed of the R, G, and B color signals in the same manner as the video processing unit 4A, generating video data in which R, G, and B color information is set for all pixels.
  • the video processing unit 4B applies a filtering process to the video data composed of the three color signals R, G, and B generated by the demosaicing process, so that the four-channel video of R, G, B, and NIR Generate data.
  • In doing so, the video processing unit 4B uses the frequencies recorded in the code information memory 5, at which the near-infrared light has peaks in Fourier space, to generate the video data I_RGB composed of the R, G, and B color signals and the video data I_NIR consisting of NIR only.
  • This embodiment has the same configuration as that of the second embodiment, and the same effect can be obtained. Furthermore, according to the present embodiment, it is easy to remove peaks due to near infrared light in Fourier space.
  • In the embodiments above, the present invention is applied to a video photographing device that separates colors using a Bayer-array optical filter, but the present invention is not limited to this.
  • the present invention can be applied to a three-plate type video photographing apparatus.
  • FIG. 16 is a schematic configuration diagram of the video imaging apparatus 101 according to the embodiment of the present invention.
  • a video photographing apparatus 101 includes a code-type IR cut filter 1, a prism (color separation unit) 21, sensors 31 to 33, and a video processing unit 41.
  • the camera lens may be a normal camera lens.
  • a prism and a photo sensor that are currently generally used in a three-plate image capturing apparatus may be used.
  • the code IR cut filter 1 is the same as that used in the second and third embodiments.
  • The code-type infrared cut filter 1 is provided on the front side, in the light traveling direction, of at least one of the sensors 31 to 33.
  • FIG. 16 shows a configuration in which the code-type infrared cut filter 1 is provided in association with the R-compatible sensor 31 as an example. As a result, diffraction of near-infrared light occurs.
  • For the remaining sensors, a normal infrared cut filter may be installed.
  • the near-infrared light is not included in the G and B lights dispersed by the prism 21.
  • the light incident on the image capturing device 101 through the camera lens is decomposed by the prism 21 into R, G, and B light having different wavelength bands.
  • the light corresponding to R enters the sensor 31, the light corresponding to G enters the sensor 32, and the light corresponding to B enters the sensor 33.
  • Near-infrared light is diffracted by the infrared transmission parts 12 of the code-type IR cut filter 1 and, together with the light corresponding to R, is converted by the sensor 31 into R video data that includes NIR. That is, the code-type IR cut filter 1, the prism 21, and the sensor 31 form video data to which a periodic pattern based on near-infrared light is added.
  • the video processing unit 41 receives R video data including NIR output from the sensor 31 as input, and generates video data based only on R based on Expression (1).
  • the video processing unit 41 subtracts video data based only on R from R video data including NIR to generate video data including only NIR. Thereby, the video processing unit 41 acquires video data (R, NIR) for all pixels.
  • the video processing unit 41 acquires video data (G) for all pixels based on the video data output from the sensor 32.
  • the video processing unit 41 further acquires video data (B) for all the pixels based on the video data output from the sensor 33.
  • the video processing unit 41 acquires video data (R, G, B, NIR) for all pixels.
  • As described above, the video photographing apparatus 101 of the present embodiment is obtained by adding a code-type IR cut filter 1 to the prism 21 and the sensors 31 to 33, which constitute a general three-plate photographing apparatus.
  • The code-type IR cut filter 1 is a general IR cut filter with a simple modification and has a simple configuration. That is, merely by adding a simple component to a configuration similar to the related art, video processing in the visible light region and the near-infrared region becomes possible, and reductions in production cost and in failures can be expected.
  • In the embodiments above, the present invention is applied to a video photographing device that separates colors using a Bayer-array optical filter, but the present invention is not limited to this.
  • the present invention can be applied to a video photographing apparatus having a stacked sensor.
  • FIG. 17 is a schematic configuration diagram of the video photographing apparatus 102 in the embodiment of the present invention.
  • The video photographing apparatus 102 includes a code-type IR cut filter 1, a stacked sensor in which sensors 34 to 36 are laminated, and a video processing unit 42.
  • the camera lens may be a normal camera lens.
  • The sensors 34 to 36 may be a stacked sensor of the kind currently in general use in stacked-sensor video imaging devices.
  • the stacked sensor is stacked in the order of the sensors 34, 35, and 36 with respect to the light traveling direction.
  • the sensor 34 has sensitivity in the B wavelength band
  • the sensor 35 has sensitivity in the G wavelength band
  • the sensor 36 has sensitivity in the R wavelength band.
  • the code-type IR cut filter 1 is the same as that used in the second to third embodiments.
  • The code-type infrared cut filter 1 is provided at a position on the front side, in the light traveling direction, of the stacked sensor.
  • the light incident on the image capturing device 102 through the camera lens includes R, G, B, and NIR light having different wavelength bands.
  • Light corresponding to B is converted into a signal by the sensor 34
  • light corresponding to G is converted into a signal by the sensor 35
  • light corresponding to R and NIR is converted into a signal by the sensor 36.
  • Near-infrared light is diffracted by the infrared transmission parts 12 of the code-type IR cut filter 1 and, together with the light corresponding to R, is converted by the sensor 36 into R video data that includes NIR.
  • the code-type IR cut filter 1 and the sensor 36 form video data to which a periodic pattern by near infrared light is added.
  • the video processing unit 42 receives R video data including NIR output from the sensor 36 as input, and generates video data based only on R based on Expression (1).
  • the video processing unit 42 subtracts the video data based only on R from the R video data including NIR to generate video data including only NIR. Thereby, the video processing unit 42 acquires video data (R, NIR) for all pixels.
  • the video processing unit 42 acquires video data (G) for all pixels based on the video data output from the sensor 35.
  • The video processing unit 42 acquires video data (B) for all the pixels based on the video data output from the sensor 34.
  • the video processing unit 42 acquires video data (R, G, B, NIR) for all pixels.
  • As described above, the video photographing apparatus 102 of the present embodiment is obtained by adding a code-type IR cut filter 1 to the sensors 34 to 36, which constitute a general stacked-sensor video photographing apparatus.
  • The code-type IR cut filter 1 is a general IR cut filter with a simple modification and has a simple configuration. That is, merely by adding a simple component to a configuration similar to the related art, video processing in the visible light region and the near-infrared region becomes possible, and reductions in production cost and in failures can be expected.
  • FIG. 18 is a functional block diagram of the signal separation device 4C in the embodiment of the present invention.
  • the signal separation device 4C includes a video data acquisition unit 141, a first color signal acquisition unit 142, a Fourier transform unit 143, a periodic pattern detection unit 144, a peak removal unit 145, and a near infrared signal acquisition unit 148.
  • the video data acquisition unit 141 acquires video data from the external means 140.
  • the video data includes a plurality of color signals. Further, the video data includes a periodic pattern by near infrared light.
  • the first color signal acquisition unit 142 acquires one color signal (first color signal) from the video data.
  • the first color signal is a signal obtained by adding a near infrared signal to the original color signal (second color signal).
  • the Fourier transform unit 143 transforms the first color signal into a two-dimensional Fourier space.
  • the first color signal includes a periodic pattern by near infrared light.
  • a peak due to near-infrared light occurs at a specific frequency in the two-dimensional Fourier space.
  • the periodic pattern detection unit 144 detects a peak at a specific frequency formed by a near infrared light component.
  • the peak removing unit 145 removes a peak at a specific frequency from the first color signal in the two-dimensional Fourier space.
  • the information after the removal is the second color signal in the two-dimensional Fourier space.
  • the near-infrared signal acquisition unit 148 acquires the second color signal by performing inverse Fourier transform on the information from which the peak has been removed. That is, the second color signal is a color signal based only on a visible light component that does not include near-infrared light. Next, the near infrared signal acquisition unit 148 acquires the near infrared signal by subtracting the second color signal from the first color signal.
  • the signal separation device 4C acquires a near-infrared signal for each of a plurality of color signals, and extracts NIR data. As described above, according to the sixth embodiment, it is possible to obtain only the near infrared light signal from the video data including a plurality of color signals.
  • a seventh embodiment including the above embodiment will be described with reference to FIG.
  • a video imaging apparatus 200 according to the seventh embodiment includes a video data acquisition unit 210 and a video processing unit 220.
  • the video data acquisition unit 210 acquires video data including a periodic pattern using near infrared light.
  • the video processing unit 220 acquires a color signal of a visible light component and a near-infrared signal from video data based on a periodic pattern.
  • the video data acquisition unit 210 corresponds to the video data acquisition unit 140 in the first embodiment.
  • the video processing unit 220 corresponds to the video processing device 4.
  • R, G, and B have been described as color channels.
  • different color channels such as C (cyan), M (magenta), and Y (yellow) can be similarly realized.
  • a general video imaging apparatus has a basic configuration including a near-infrared cut filter, an optical filter, and a photosensor (see FIG. 5). Near infrared light is removed by the near infrared cut filter. On the other hand, photosensors are inherently sensitive to the near-infrared region, but have not made full use of their capabilities.
  • the present inventor paid attention to the sensitivity in the near-infrared region of a photosensor that has not been effectively used until now. Furthermore, we considered performing image processing in the visible light region and the near infrared region while utilizing the configuration of a general image capturing device.
  • each unit may be configured by hardware, or may be realized by a computer.
  • the same functions and operations as described above are realized by a processor that operates according to a program stored in a program memory. Further, only some functions may be realized by a computer program.
  • A video imaging device comprising: video data acquisition means 140 for acquiring video data including a periodic pattern produced by near-infrared light; and video processing units 4, 4A, 4B, 41, 42 for obtaining a color signal of the visible light component and a near-infrared signal from the video data based on the periodic pattern.
  • The video imaging device above, wherein the video processing units 4, 4A, 4B, 41, 42: obtain a first color signal for each of a plurality of colors from the video data; convert the first color signal into a two-dimensional Fourier space; obtain a second color signal, which is the color signal of the visible light component, by removing the peaks of the periodic pattern due to the near-infrared light in the two-dimensional Fourier space; and obtain a near-infrared signal by subtracting the second color signal from the first color signal.
  • The video imaging device above, wherein the video processing units 4, 4A, 4B, 41, 42 include: a video data acquisition unit 141 that inputs video data including a periodic pattern produced by near-infrared light; a first color signal acquisition unit 142 that acquires first color signals of a plurality of colors from the video data; a Fourier transform unit 143 that converts the first color signals into a two-dimensional Fourier space; a periodic pattern detection unit 144 that detects, in the two-dimensional Fourier space, the peaks of the periodic pattern due to the near-infrared light; a peak removal unit 145 that removes those peaks in the two-dimensional Fourier space; a second color signal acquisition unit 146 that obtains a second color signal, the color signal of the visible light component, by applying an inverse Fourier transform to the information from which the peaks have been removed; and a near-infrared signal acquisition unit 147 that acquires a near-infrared signal by subtracting the second color signal from the first color signal.
  • The video imaging device above, further comprising a code information memory 5 that stores frequency information, in the two-dimensional Fourier space, of the periodically generated peaks of the near-infrared light, wherein the video processing unit 4B removes the peaks of the periodic pattern based on the frequency information stored in the code information memory.
  • The video imaging device above, wherein the video data acquisition means 140 includes a code-type infrared cut filter 1 having infrared cut sections 11 that cut near-infrared light and infrared transmission sections 12 that transmit near-infrared light, the infrared transmission sections 12 being provided at positions having regular periodicity in the code-type infrared cut filter 1.
  • The video imaging device above, wherein the video data acquisition means 140 includes: an optical filter 2 that splits incident light into a plurality of colors; a photosensor 3 that converts the plurality of colors of light dispersed by the optical filter 2 into video data as a video signal; and a code-type infrared cut filter 1, provided on the front side of the optical filter 2 in the light traveling direction or between the optical filter 2 and the photosensor 3, having infrared cut sections 11 that cut near-infrared light and infrared transmission sections 12 that transmit near-infrared light; the infrared transmission sections 12 being provided at positions having regular periodicity in the code-type infrared cut filter 1.
  • The video imaging device above, wherein the video data acquisition means 140 includes: a color separation unit 21 that separates incident light into a plurality of lights having different wavelength bands; photosensors 31 to 33, provided respectively for the plurality of lights separated by the color separation unit 21, that convert the plurality of lights into data as video signals; and a code-type infrared cut filter 1, provided for at least one of the separated lights, having an infrared cut section 11 that cuts near-infrared light and infrared transmission sections 12 that transmit the near-infrared light; the infrared transmission sections 12 being provided at positions having regular periodicity in the code-type infrared cut filter 1.
  • The video imaging device above, wherein the video data acquisition means 140 includes: a plurality of stacked sensors 34 to 36, each of which converts light of a different wavelength band into data as a video signal; and a code-type infrared cut filter 1 having an infrared cut section 11 that cuts near-infrared light and infrared transmission sections 12 that transmit the near-infrared light; the infrared transmission sections 12 being provided at positions having regular periodicity in the code-type infrared cut filter 1.
  • A signal separation device comprising: video data acquisition means for acquiring video data including a periodic pattern produced by near-infrared light; and video processing means for acquiring a near-infrared signal from the video data based on the periodic pattern.
  • A video imaging method of acquiring video data including a periodic pattern produced by near-infrared light, and acquiring a color signal of the visible light component and a near-infrared signal from the video data based on the periodic pattern.
  • the present invention can be applied to a video photographing device such as a digital camera or a video camera.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The invention provides a video capture device that enables simple video processing in the visible light region and the near-infrared region while using the configuration of a general-purpose video capture device. Such a video capture device comprises: video data acquisition means configured to obtain video data exhibiting a periodic pattern of near-infrared light; and video processing means configured to obtain a color signal of a visible light component and a near-infrared signal from the video data on the basis of the periodic pattern.
PCT/JP2015/001121 2014-03-06 2015-03-03 Dispositif de capture vidéo, dispositif de séparation de signal et procédé de capture vidéo WO2015133130A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2016506139A JPWO2015133130A1 (ja) 2014-03-06 2015-03-03 映像撮影装置、信号分離装置および映像撮影方法
US15/122,034 US10334185B2 (en) 2014-03-06 2015-03-03 Image capturing device, signal separation device, and image capturing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-044447 2014-03-06
JP2014044447 2014-03-06

Publications (1)

Publication Number Publication Date
WO2015133130A1 true WO2015133130A1 (fr) 2015-09-11

Family

ID=54054946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/001121 WO2015133130A1 (fr) 2014-03-06 2015-03-03 Dispositif de capture vidéo, dispositif de séparation de signal et procédé de capture vidéo

Country Status (3)

Country Link
US (1) US10334185B2 (fr)
JP (1) JPWO2015133130A1 (fr)
WO (1) WO2015133130A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017047080A1 (fr) * 2015-09-18 2017-03-23 日本電気株式会社 Dispositif de traitement d'image, dispositif d'imagerie, procédé de traitement d'image, et support d'enregistrement de programme
WO2017222021A1 (fr) * 2016-06-24 2017-12-28 日本電気株式会社 Dispositif de traitement d'images, système de traitement d'images, procédé de traitement d'images et support d'enregistrement de programme

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6404923B2 (ja) * 2014-06-24 2018-10-17 マクセル株式会社 撮像センサおよび撮像装置
CN108701365B (zh) * 2017-08-29 2022-05-31 广东虚拟现实科技有限公司 光点识别方法、装置以及系统
CN117314791B (zh) * 2023-11-27 2024-02-20 长春理工大学 基于巴特沃斯函数拟合的红外图像冷反射噪声矫正方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001069519A (ja) * 1999-08-30 2001-03-16 Sony Corp 固体撮像装置
JP2005006066A (ja) * 2003-06-12 2005-01-06 Acutelogic Corp 固体撮像素子用カラーフィルタおよびこれを用いたカラー撮像装置
JP2007184805A (ja) * 2006-01-10 2007-07-19 Toyota Central Res & Dev Lab Inc カラー画像再生装置
JP2012080553A (ja) * 2011-11-07 2012-04-19 Sony Corp 半導体装置および撮像装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5121445A (en) * 1986-07-01 1992-06-09 Konica Corporation Method and apparatus for reading image
CN1971927B (zh) 2005-07-21 2012-07-18 索尼株式会社 物理信息获取方法、物理信息获取装置和半导体器件
JP5389902B2 (ja) * 2008-04-28 2014-01-15 セールスフォース ドット コム インコーポレイティッド ウェブサイト及びそのコンテンツの作成及び管理のためのオブジェクト指向のシステム
JP2011199798A (ja) * 2010-03-24 2011-10-06 Sony Corp 物理情報取得装置、固体撮像装置、物理情報取得方法
JP2011243862A (ja) 2010-05-20 2011-12-01 Sony Corp 撮像デバイス及び撮像装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001069519A (ja) * 1999-08-30 2001-03-16 Sony Corp 固体撮像装置
JP2005006066A (ja) * 2003-06-12 2005-01-06 Acutelogic Corp 固体撮像素子用カラーフィルタおよびこれを用いたカラー撮像装置
JP2007184805A (ja) * 2006-01-10 2007-07-19 Toyota Central Res & Dev Lab Inc カラー画像再生装置
JP2012080553A (ja) * 2011-11-07 2012-04-19 Sony Corp 半導体装置および撮像装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017047080A1 (fr) * 2015-09-18 2017-03-23 日本電気株式会社 Dispositif de traitement d'image, dispositif d'imagerie, procédé de traitement d'image, et support d'enregistrement de programme
US10440292B2 (en) 2015-09-18 2019-10-08 Nec Corporation Color signal and near-infrared signal generated by using pattern information defining intensity-corresponding pattern
WO2017222021A1 (fr) * 2016-06-24 2017-12-28 日本電気株式会社 Dispositif de traitement d'images, système de traitement d'images, procédé de traitement d'images et support d'enregistrement de programme
US10863115B2 (en) 2016-06-24 2020-12-08 Nec Corporation Generation of visible and near-infrared images based on estimated incident light spectral characteristics and image capturing device spectral sensitivity characteristics

Also Published As

Publication number Publication date
JPWO2015133130A1 (ja) 2017-04-06
US20170019614A1 (en) 2017-01-19
US10334185B2 (en) 2019-06-25

Similar Documents

Publication Publication Date Title
US20190349537A1 (en) Apparatus and method for combining images
US9025871B2 (en) Image processing apparatus and method of providing high sensitive color images
JP6582987B2 (ja) 映像撮影装置、映像撮影方法、符号型赤外カットフィルタ、および符号型特定色カットフィルタ
US8614746B2 (en) Image processing apparatus and method of noise reduction
WO2015133130A1 (fr) Dispositif de capture vidéo, dispositif de séparation de signal et procédé de capture vidéo
US20070153099A1 (en) Image signal processing apparatus, imaging apparatus, image signal processing method and computer program thereof
JP2004228662A (ja) 撮像装置
JP5186517B2 (ja) 撮像装置
JP5943393B2 (ja) 撮像装置
US10931895B2 (en) Image processing method, image processing device, and storage medium
GB2456492A (en) Image processing method
US20160323563A1 (en) Method for compensating for color differences between different images of a same scene
JPWO2017222021A1 (ja) 画像処理装置、画像処理システム、画像処理方法及びプログラム
JP5454156B2 (ja) 画像処理装置、撮像装置及び画像処理プログラム
US10593717B2 (en) Image processing apparatus, image processing method, and imaging apparatus
Sadeghipoor et al. Demultiplexing visible and near-infrared information in single-sensor multispectral imaging
JP6794989B2 (ja) 映像処理装置、撮影装置、映像処理方法及びプログラム
US8866972B1 (en) Method for transmitting spectrum information
US9894336B2 (en) Color imaging using a monochromatic digital camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15758208

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016506139

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15122034

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15758208

Country of ref document: EP

Kind code of ref document: A1