WO2013027340A1 - Dispositif d'imagerie - Google Patents

Imaging device (Dispositif d'imagerie)

Info

Publication number
WO2013027340A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
visible light
light exposure
infrared light
image sensor
Prior art date
Application number
PCT/JP2012/004917
Other languages
English (en)
Japanese (ja)
Inventor
藤井 俊哉
信一 寺西
圭一 森
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2013027340A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith

Definitions

  • the present invention relates to an imaging apparatus capable of realizing visible light imaging and infrared light imaging with a single image sensor.
  • An image in which each pixel value represents the distance to the subject is called a distance image.
  • As methods for obtaining a distance image, a stereo method, a pattern irradiation method, a TOF (time-of-flight) method, and the like are known.
  • A conventional image pickup apparatus realizes color imaging by background light and distance imaging by infrared light with a single CCD (charge-coupled device) image sensor. To this end, it uses an interline-transfer CCD image sensor having a planar array of RGB light receiving units, which receive red (R), green (G), and blue (B) light, the three primary colors of visible light, and accumulate signal charges, and IR (infrared) light receiving units, which receive infrared light and accumulate signal charges.
  • The sensor is provided with a vertical overflow drain (VOD) that sweeps out the signal charges of all light receiving units and a lateral overflow drain (LOD) that sweeps out only the signal charge of the IR light receiving units at high speed.
  • The LOD includes a sweep gate connected to the IR light receiving unit and a drain region into which the signal charge of the IR light receiving unit is swept through the sweep gate.
  • The LOD is operated after the VOD, and then the signal charges are read out from all light receiving units, whereby the visible light exposure time and the infrared light exposure time are controlled independently (see Patent Documents 1 and 2).
  • the optical filter provided in the RGB light receiving unit has a two-layer structure of each of the three primary color filters and an infrared light cut filter.
  • the optical filter provided in the IR light receiving unit has a two-layer structure of an infrared light transmission / visible light cut filter and a transparent film for flattening the two-layer structure filter of the RGB light receiving unit.
  • In this prior art, IR pulse light is irradiated onto the imaging target space every other frame scanning period, while visible light and infrared light are imaged every frame scanning period.
  • A visible light image is generated for each frame scanning period, and the IR image signal obtained during non-irradiation of the IR pulse is subtracted from the IR image signal obtained during IR pulse irradiation.
  • In this way, a distance image excluding the influence of the IR component in the background light is generated every frame scanning period (see Patent Document 2).
  • However, the conventional technology has the problem that basic performance such as sensitivity, saturation, and smear must be sacrificed.
  • An object of the present invention is to provide an imaging apparatus capable of realizing high-quality color image imaging using background light and high-precision distance image imaging using infrared light with a single inexpensive image sensor.
  • To achieve this object, an imaging apparatus according to the present invention comprises irradiation means for irradiating a subject with infrared light, a two-band optical filter that receives incident light from the subject and has spectral transmission characteristics that transmit a visible wavelength region and a wavelength region near the irradiated infrared light, and an image sensor that converts an image formed through the two-band optical filter into an electrical signal.
  • The image sensor includes a first pixel that receives infrared light passed through the two-band optical filter and performs infrared light exposure, a second pixel that receives visible light passed through the two-band optical filter and performs visible light exposure, and discharge means for simultaneously discharging the signal charges of the first and second pixels, and the infrared light exposure of the first pixel and the visible light exposure of the second pixel are performed in a time-division manner.
  • In the following, the image sensor receiving light is referred to as "light reception", and converting the light incident on the image sensor into an electrical signal is referred to as "exposure".
  • According to this configuration, the signal charges of all pixels can be discharged at once by a single discharge means.
  • In addition, since the two-band optical filter and single-layer per-pixel filters are used in combination, optical mixing of visible light and infrared light can be minimized. It is therefore possible to provide an imaging apparatus that realizes high-quality color imaging using background light and high-precision distance imaging using infrared light with a single inexpensive image sensor.
  • FIG. 1 is a block diagram illustrating the configuration of an imaging apparatus according to a first embodiment of the present invention. FIG. 2 is a diagram showing the spectral transmittance of the two-band optical filter in FIG. 1. FIG. 3 is a schematic configuration diagram of the CCD image sensor in FIG. 1. FIG. 4 is a diagram showing the arrangement of color filters arranged on the PD 21 in FIG. 3.
  • FIG. 5 is a timing diagram illustrating the operation of the imaging apparatus in FIG. 1. FIG. 6 is a block diagram showing the configuration of an imaging apparatus according to a second embodiment of the present invention. FIG. 7 is a circuit diagram showing the configuration of a pixel cell in the MOS image sensor in FIG. 6.
  • FIG. 8 is a schematic configuration diagram of the MOS image sensor in FIG. 6.
  • FIG. 9 is a timing diagram illustrating the operation of the imaging apparatus in FIG. 6. FIG. 10 is a block diagram showing the configuration of an imaging apparatus according to a third embodiment of the present invention. FIG. 11 is a schematic configuration diagram of the CCD image sensor in FIG. 10. FIG. 12 is a timing diagram illustrating the operation of the imaging apparatus in FIG. 10.
  • FIG. 1 shows a configuration of an imaging apparatus according to the first embodiment of the present invention.
  • reference numeral 200 denotes a subject
  • 201 denotes background light illumination.
  • The imaging apparatus of FIG. 1 includes an infrared laser 202, a rough surface filter 203, an optical lens 204, a two-band optical filter 205, a CCD image sensor 210, an analog front end (AFE) 220, and the like.
  • the ISP 230 includes a camera preprocessing unit 240, a color camera processing unit / distance image calculation processing unit 250, a central processing unit (CPU) 260, and a timing generator (TG) 270.
  • the camera preprocessing unit 240 includes a subtraction circuit 241 and an AE (automatic exposure) detection circuit 242.
  • The infrared laser 202 emits infrared light with a wavelength of 850 nm. The emitted infrared light passes through the rough surface filter 203, so that the subject 200 is irradiated with infrared light forming a large number of dots based on a coherent random speckle pattern, which is used for obtaining a distance image.
  • the infrared light reflected from the subject 200 forms an image on the CCD image sensor 210 via the optical lens 204 and the two-band optical filter 205.
  • On the other hand, reflected light from the subject 200 due to the background light illumination 201 alone likewise forms an image on the same CCD image sensor 210 via the same optical lens 204 and two-band optical filter 205.
  • FIG. 2 is a diagram showing the spectral transmittance of the two-band optical filter 205 in FIG.
  • the two-band optical filter 205 is an optical filter having spectral transmission characteristics that transmits a visible light wavelength region and a wavelength region near 850 nm.
  • the CCD image sensor 210 converts the formed image into an electrical signal.
  • a signal output from the CCD image sensor 210 is converted into a digital signal via the AFE 220.
  • the digitized signal is subjected to processing such as flaw correction and noise removal in the camera preprocessing unit 240 in the ISP 230.
  • the input signal is given to the subtraction circuit 241 and the AE detection circuit 242.
  • The color camera processing / distance image calculation processing unit 250 outputs the result of camera processing such as edge enhancement and color adjustment as a color image, and also collates the coherent random speckle pattern in the infrared imaging result with a reference pattern so that the distance can be calculated and output as a distance image.
  • the CPU 260 controls the entire system of the imaging apparatus according to the register settings of the blocks 240, 250, and 270.
  • The CPU 260 subjects the output of the AE detection circuit 242 to time-axis filtering and the like, independently calculates the visible light exposure time and the infrared light exposure time of the CCD image sensor 210, and conveys this information to the TG 270.
  • the TG 270 generates a horizontal drive pulse to be directly supplied to the CCD image sensor 210, supplies a substrate discharge pulse and a vertical drive pulse to the CCD image sensor 210 via the VDR 221, and generates a drive signal to the infrared laser 202.
  • the substrate discharge pulse is a pulse for discharging the signal charge of the pixel, and is generated at a timing at which the shutter speed commanded by the CPU 260 is obtained.
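  • As a rough illustration of how such a discharge-pulse timing follows from a commanded shutter speed, the Python sketch below computes the line at which the ΦSub pulse train should stop so that the exposure window closes at the all-pixel readout; the 60 fps rate comes from this description, but the line count per frame and the readout line are assumptions for illustration only.
```python
# Minimal sketch (not from the patent text): choosing when to stop applying the
# substrate discharge pulse phi_Sub so that the commanded exposure time elapses
# before the charge readout at the end of the frame. The 60 fps rate is taken
# from this description; the line counts are illustrative assumptions.

FRAME_PERIOD_S = 1.0 / 60.0               # 60 fps vertical sync
LINES_PER_FRAME = 525                     # assumed
LINE_PERIOD_S = FRAME_PERIOD_S / LINES_PER_FRAME

def shutter_pulse_stop_line(commanded_exposure_s: float,
                            readout_line: int = LINES_PER_FRAME - 1) -> int:
    """Return the line at which phi_Sub application should end.

    While phi_Sub is applied, pixel charges are swept to the substrate via the
    VOD, so exposure effectively starts when the pulse train stops and ends at
    the all-pixel readout; the commanded shutter speed fixes that gap.
    """
    exposure_lines = round(commanded_exposure_s / LINE_PERIOD_S)
    return max(readout_line - exposure_lines, 0)  # cannot expose longer than a frame

# e.g. a 1/240 s infrared exposure inside a 1/60 s frame
print(shutter_pulse_stop_line(1.0 / 240.0))
```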
  • FIG. 3 is a schematic configuration diagram of the CCD image sensor 210 in FIG.
  • The CCD image sensor 210 shown in FIG. 3 is a progressive-scan, interline-transfer image sensor and includes photodiodes (PD) 21, a vertical transfer unit 22, a horizontal transfer unit 23, a charge detection unit 24, and a VOD 25.
  • the PDs 21 are arranged in a matrix and each constitute a pixel.
  • the vertical transfer unit 22 has, for example, a configuration of four gates V1, V2, V3, and V4 per pixel, and is driven by four-phase pulses ⁇ V1, ⁇ V2, ⁇ V3, and ⁇ V4. Of these, ⁇ V1 is an all-pixel readout pulse.
  • the signal charge readout gate of the PD 21 is configured to also serve as the V1 gate of the vertical transfer unit 22.
  • the horizontal transfer unit 23 has, for example, a two-phase (H1 and H2) configuration. Further, when the substrate discharge pulse ⁇ Sub is given, the signal charges of all the pixels are discharged to the substrate all at once via the VOD 25.
  • In FIG. 3 the VOD 25 is drawn adjacent to the PD 21 in the same plane, but in reality the VOD 25 adjoins the PD 21 in the bulk direction (the depth direction of the semiconductor substrate).
  • FIG. 4 shows an arrangement of color filters arranged on the PD 21 in FIG.
  • the pixel unit has a 2 ⁇ 2 unit array composed of a total of four pixels of R pixel, G pixel, B pixel, and IR pixel.
  • So-called RGB color filters are arranged in the R pixel, G pixel, and B pixel.
  • a visible light cut filter is disposed in the IR pixel.
  • the color filter array is not limited to this example, and when m and n are each an integer of 2 or more, the pixel portion may have an m ⁇ n unit array.
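  • As an aside, the pixel bookkeeping implied by such a 2 x 2 RGB-IR unit array can be shown in a short Python sketch; since the figure itself is not reproduced here, the particular positions of the R, G, B, and IR pixels within the unit cell are assumptions for illustration only.
```python
import numpy as np

# Minimal sketch: split a raw mosaic from a 2x2 RGB-IR color filter array into
# its four subsampled planes. The assignment of R, G, B and IR to the four
# positions of the unit cell is assumed for illustration only.
def split_rgbir_mosaic(raw: np.ndarray):
    """raw: (H, W) array with H and W even, tiled with a 2x2 RGB-IR unit."""
    r  = raw[0::2, 0::2]   # assumed: top-left of each unit cell
    g  = raw[0::2, 1::2]   # assumed: top-right
    b  = raw[1::2, 0::2]   # assumed: bottom-left
    ir = raw[1::2, 1::2]   # assumed: bottom-right
    return r, g, b, ir

raw = np.arange(8 * 8, dtype=np.uint16).reshape(8, 8)
r, g, b, ir = split_rgbir_mosaic(raw)
print(r.shape, ir.shape)   # each plane is (4, 4): quarter resolution
```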
  • FIG. 5 is a timing chart showing the operation of the imaging apparatus of FIG.
  • In FIG. 5, the rate of the vertical synchronization pulse VD is 60 frames per second (60 fps); imaging with the irradiated infrared light is performed in odd frames, and visible light imaging with background light is performed in even frames.
  • First, the signal charges of all pixels are discharged by applying the substrate discharge pulse ΦSub; the infrared light exposure time 11 starts when the application of ΦSub ends, and at the same time the infrared light irradiation time 13 by the infrared laser 202 starts.
  • Since the background light illumination 201 is continuously lit, the reflected light from the subject 200 contains both infrared light and visible light. However, owing to the two-band optical filter 205 and the visible light cut filter of the IR pixel, only reflected light in the vicinity of 850 nm is received and photoelectrically converted by the IR pixel, and the signal charge is accumulated in the IR pixel during the infrared light exposure time 11.
  • Next, the signal charges of all pixels including the IR pixels are read out to the vertical transfer unit 22, and the infrared light exposure time 11 ends. At the same time, the infrared light irradiation time 13 by the infrared laser 202 ends.
  • an IR pixel signal is output as an infrared light exposure signal at the image sensor output timing 14 shown in FIG. 5 via the vertical transfer unit 22, the horizontal transfer unit 23, and the charge detection unit 24, and is processed in the subsequent stage.
  • Next, the signal charges of all pixels are discharged by applying the substrate discharge pulse ΦSub at a timing that yields the necessary exposure time, and the visible light exposure time 12 starts when the application of ΦSub ends.
  • the reflected light from the subject 200 does not include the irradiated infrared light component but includes only the background light component.
  • the background light may contain some IR components in addition to the visible light component.
  • the RGB color filter in the RGB pixel has a slight transmission characteristic with respect to the IR component.
  • Through the two-band optical filter 205 and the R color filter, the R pixel receives the R component of the visible light together with a slight IR component, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 12.
  • Through the two-band optical filter 205 and the G color filter, the G pixel receives the G component of the visible light together with a slight IR component, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 12.
  • Through the two-band optical filter 205 and the B color filter, the B pixel receives the B component of the visible light together with a slight IR component, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 12.
  • Through the two-band optical filter 205 and the visible light cut filter, the IR pixel receives only the slight IR component contained in the background light, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 12.
  • Then, at the image sensor output timing 15 shown in FIG. 5, the signals of all pixels, including the RGB pixels and the IR pixels, are output as visible light exposure signals via the vertical transfer unit 22, the horizontal transfer unit 23, and the charge detection unit 24.
  • This operation of the CCD image sensor 210 is repeated, and the infrared light exposure and the visible light exposure are performed alternately in a time-division manner so that the infrared light exposure time 11 and the visible light exposure time 12 each appear every other frame.
  • the AE operation based on the output signal of the CCD image sensor 210 will be described.
  • the pixel signal obtained at the infrared light exposure time 11 of the odd frame is output at the timing 14 in the even frame.
  • Only the output signal of the IR pixels is detected by the AE detection circuit 242, and the detection output is read by the CPU 260.
  • the CPU 260 determines the shutter speed of the next odd frame through the time filter using only the detection result of the IR pixel signal output from the CCD image sensor 210 for each even frame.
  • the pixel signal obtained at the visible light exposure time 12 of the even frame is output at the timing 15 in the odd frame.
  • the output signal of the RGB pixel is detected by the AE detection circuit 242 and the output is read by the CPU 260.
  • the CPU 260 determines the shutter speed of the next even frame through the time filter using only the detection result of the RGB pixel signal output from the CCD image sensor 210 for each odd frame.
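  • A minimal sketch of this dual AE control is given below; the target level, smoothing constant, and exposure limits are assumptions rather than values from this description, and the only behavior carried over from the text is that the IR and visible shutter speeds are filtered over time and updated independently from their respective pixel detections.
```python
# Minimal sketch (assumed parameters): two independent AE loops, one fed by the
# IR-pixel detection of the infrared-exposure frames, one by the RGB-pixel
# detection of the visible-exposure frames, each smoothed with a simple
# first-order time filter before the next shutter speed is issued.

class AeLoop:
    def __init__(self, initial_exposure_s: float,
                 target_level: float = 0.5,       # assumed mid-scale target
                 smoothing: float = 0.25,         # assumed time-filter coefficient
                 min_s: float = 1e-5, max_s: float = 1.0 / 60.0):
        self.exposure_s = initial_exposure_s
        self.target = target_level
        self.alpha = smoothing
        self.min_s, self.max_s = min_s, max_s

    def update(self, detected_level: float) -> float:
        """detected_level: mean pixel level in [0, 1] from the AE detection circuit."""
        if detected_level <= 0:
            desired = self.max_s
        else:
            desired = self.exposure_s * self.target / detected_level
        # time filter: move only part of the way toward the desired exposure
        self.exposure_s += self.alpha * (desired - self.exposure_s)
        self.exposure_s = min(max(self.exposure_s, self.min_s), self.max_s)
        return self.exposure_s

ir_ae, visible_ae = AeLoop(1.0 / 240.0), AeLoop(1.0 / 120.0)
next_ir_exposure = ir_ae.update(detected_level=0.8)            # from IR pixels
next_visible_exposure = visible_ae.update(detected_level=0.3)  # from RGB pixels
```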
  • The RGB pixel signals obtained as reflected light from the subject 200 during the visible light exposure time 12 are subjected to white balance processing, contour enhancement processing, and the like by the color camera processing / distance image calculation processing unit 250 and are output as a color image.
  • The IR pixel signal obtained as reflected light of the coherent random speckle pattern from the subject 200 during the infrared light exposure time 11 is collated with a reference speckle pattern stored in advance; from the resulting address shift, the color camera processing / distance image calculation processing unit 250 calculates the distance, converts it into a distance image, and outputs it.
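  • The collation step can be pictured with the sketch below: a window of the captured IR image is matched against the stored reference pattern, and the best-matching address shift is mapped to a distance. The matching window size and the triangulation constants (focal length in pixels, baseline) are assumptions for illustration; the description only states that the distance is calculated from the address shift against a reference speckle pattern.
```python
import numpy as np

# Minimal sketch (assumed geometry): estimate the horizontal address shift of a
# speckle window against a stored reference pattern, then convert the shift to
# a distance with a structured-light style triangulation relation
#     distance = focal_px * baseline_m / shift_px
# The two constants below are placeholders, not values from this disclosure.
FOCAL_PX = 580.0
BASELINE_M = 0.075

def address_shift(ir_window: np.ndarray, reference_row: np.ndarray,
                  max_shift: int = 64) -> int:
    """Return the shift (in pixels) minimizing the sum of absolute differences."""
    w = ir_window.size
    best_shift, best_cost = 0, np.inf
    for s in range(max_shift):
        cost = np.abs(ir_window - reference_row[s:s + w]).sum()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def shift_to_distance_m(shift_px: int) -> float:
    return float("inf") if shift_px == 0 else FOCAL_PX * BASELINE_M / shift_px

reference_row = np.random.default_rng(0).random(256)
ir_window = reference_row[17:17 + 32]   # pretend the scene shifted the pattern by 17 px
print(shift_to_distance_m(address_shift(ir_window, reference_row)))
```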
  • As described above, the signal charges of all pixels can be discharged at once by the VOD 25.
  • Since the infrared light exposure and the visible light exposure are performed in a time-division manner, the two-band optical filter 205 can be used in combination with the single-layer per-pixel filters on the CCD image sensor 210, and optical mixing of visible light and infrared light can be minimized. Therefore, high-quality color imaging using background light and high-accuracy distance imaging using infrared light can be realized with an inexpensive system.
  • Furthermore, the output of the AE detection circuit 242 is divided into an infrared light component and a visible light component, and each is AE-controlled independently. Therefore, even though the respective light sources differ, the infrared light exposure time 11 of the IR pixels and the visible light exposure time 12 of the RGB pixels, which differ greatly from each other, can each be controlled to its optimum exposure time, as if a visible camera and a distance camera were operating simultaneously, and both the image quality of visible imaging and the accuracy of distance imaging can be further improved.
  • During the period from the application timing of the substrate discharge pulse ΦSub, which determines the start of the visible light exposure time 12, until the readout of the signal charges of all pixels, which determines its end, the infrared light irradiation by the infrared laser 202 is stopped.
  • Therefore, power consumption and heat generation can be reduced by shortening the infrared light irradiation time.
  • In this embodiment, the infrared light exposure time 11 and the infrared light irradiation time 13 are made to coincide, but there is no particular problem as long as the infrared light irradiation time 13 includes the infrared light exposure time 11 and does not include the visible light exposure time 12.
  • The IR pixel signal obtained in the infrared light exposure time 11 and the IR pixel signal obtained in the visible light exposure time 12 are corrected to the level of a common exposure time, taking the respective exposure times into account, and the result of the subtraction performed by the subtraction circuit 241 is input to the color camera processing / distance image calculation processing unit 250.
  • Likewise, the signal of each RGB pixel in the visible light exposure time 12 and the IR pixel signal in the visible light exposure time 12 are corrected to the same transmittance level, taking into account the sensitivity due to the infrared light transmittance near 850 nm as the overall characteristic of the two-band optical filter 205 and the filter arranged on each pixel, and the result of subtracting the corrected IR pixel output from the output of each RGB pixel in the subtraction circuit 241 is input to the color camera processing / distance image calculation processing unit 250.
  • In this way, the noise due to the IR component of the background light near 850 nm can be suppressed, and higher color reproducibility and image quality can be realized.
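  • The two subtractions described above can be summarized in the short sketch below; the exposure-time scaling and the per-color IR sensitivity ratios are written as free parameters, since their actual values depend on the filters used and are not given in this description.
```python
import numpy as np

# Minimal sketch of the subtraction idea: the IR-pixel plane captured during the
# visible exposure sees only background IR, so a scaled copy of it can be
# subtracted (a) from the IR plane of the infrared exposure, after matching the
# two exposure times, and (b) from each RGB plane of the same visible exposure,
# after matching each color pixel's IR sensitivity. All coefficients here are
# illustrative assumptions, not values from this description.
def suppress_background_ir(ir_exposure_ir: np.ndarray,   # IR pixels, exposure time 11
                           visible_ir: np.ndarray,       # IR pixels, exposure time 12
                           visible_rgb: dict,            # {"R": ..., "G": ..., "B": ...}
                           t_ir: float, t_vis: float,    # the two exposure times
                           ir_sensitivity=None):
    if ir_sensitivity is None:
        # assumed relative sensitivity of each color pixel to ~850 nm light,
        # expressed against the IR pixel's sensitivity
        ir_sensitivity = {"R": 0.9, "G": 0.8, "B": 0.7}
    speckle = ir_exposure_ir - visible_ir * (t_ir / t_vis)
    rgb_clean = {c: visible_rgb[c] - ir_sensitivity[c] * visible_ir
                 for c in ("R", "G", "B")}
    return (np.clip(speckle, 0, None),
            {c: np.clip(v, 0, None) for c, v in rgb_clean.items()})
```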
  • FIG. 6 shows a configuration of an imaging apparatus according to the second embodiment of the present invention.
  • The imaging apparatus of FIG. 6 is obtained by replacing the CCD image sensor 210 in FIG. 1 with a progressive-scan MOS (metal-oxide-semiconductor) image sensor 211 equipped with a global shutter function and a global reset function.
  • FIG. 7 is a circuit diagram showing a configuration of the pixel cell 101 in the MOS image sensor 211 in FIG.
  • n is an integer from 1 to 4.
  • Each pixel cell 101 includes a PD serving as the photosensitive portion, a shutter transistor M5 that resets (discharges) the signal charge accumulated in the PD when the shutter signal TXSn is set to H, a floating diffusion (FD) having a storage capacitance, a read transistor M1 that reads the signal charge accumulated in the PD out to the FD when the transfer signal TXn is set to H, a reset transistor M2 that resets the signal charge read out to the FD when the reset signal RSn is set to H, a source follower transistor M3 whose gate is connected to the FD, and a line selection transistor M4 that connects the source follower transistor M3 to the vertical signal line when the selection signal SELn is set to H.
  • The drains of the reset transistor M2, the source follower transistor M3, and the shutter transistor M5 are connected to the pixel power supply VDD.
  • FIG. 8 is a schematic configuration diagram of the MOS image sensor 211 in FIG. 6 in which the pixel cells 101 in FIG. 7 are two-dimensionally arranged.
  • The MOS image sensor 211 in FIG. 8 includes a pixel portion 102 in which the pixel cells 101 are arranged two-dimensionally in 4 rows and 4 columns, an FPN removing unit 103 that removes, for each column, the fixed pattern noise (FPN) caused by variations in transistor threshold voltage, a horizontal selection unit 104 that sequentially selects the output signals of the FPN removing unit 103, and a differential amplifier 105 that amplifies the output signal of the horizontal selection unit 104.
  • the pixel unit 102 has a small size of 4 rows and 4 columns for convenience of explanation.
  • Each column of the FPN removing unit 103 consists of a signal level sample transistor M11 that receives the sample hold signal SHS, a reset level sample transistor M12 that receives the sample hold signal SHN, a signal level capacitor C11, and a reset level capacitor C12. Each column of the horizontal selection unit 104 includes a first column selection transistor M21 interposed between the signal level capacitor C11 and the first horizontal signal line 107, and a second column selection transistor M22 interposed between the reset level capacitor C12 and the second horizontal signal line 108. Each column of the horizontal selection unit 104 is sequentially selected by signals H1 to H4 from a horizontal scanning unit (not shown). The differential amplifier 105 amplifies the potential difference between the first horizontal signal line 107 and the second horizontal signal line 108.
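  • In signal terms, this column circuit amounts to per-column correlated double sampling: the reset level sampled on C12 is subtracted from the signal level sampled on C11 by the differential amplifier 105, so column-wise threshold-voltage offsets cancel. The tiny sketch below, with made-up offset values, illustrates that arithmetic.
```python
import numpy as np

# Minimal sketch of the column-wise offset cancellation performed by the FPN
# removing unit 103 and the differential amplifier 105: the sampled reset level
# is subtracted from the sampled signal level, so per-column offsets drop out.
# The offset and scene values below are made up for illustration.
rng = np.random.default_rng(1)

column_offset = rng.normal(0.0, 0.05, size=4)    # threshold-voltage variation per column
scene = rng.random((4, 4))                       # "true" pixel signal, 4 x 4 pixel portion

signal_level = scene + column_offset             # what is sampled onto C11 (SHS)
reset_level = np.tile(column_offset, (4, 1))     # what is sampled onto C12 (SHN)
fpn_removed = signal_level - reset_level         # differential amplifier output

print(np.allclose(fpn_removed, scene))           # True: the column FPN cancels
```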
  • R, G, B, and IR color filters shown in FIG. 4 are arranged in each pixel as in the first embodiment.
  • the CPU 260 in FIG. 6 controls the entire system of the imaging apparatus according to the register settings of the blocks 240, 250, and 270.
  • The CPU 260 subjects the output of the AE detection circuit 242 to time-axis filtering and the like, independently calculates the visible light exposure time and the infrared light exposure time of the MOS image sensor 211, and transmits this information directly to the MOS image sensor 211 by communication means such as serial transfer; the various drive pulses are then generated inside the MOS image sensor 211.
  • The pulse timing of the shutter signals TXS1 to TXS4, which discharge the signal charge of each PD, is generated so as to achieve the shutter speed commanded by the CPU 260, so that natural AE control can be performed in both visible light exposure and infrared light exposure.
  • FIG. 9 is a timing chart showing the operation of the imaging apparatus of FIG.
  • In FIG. 9, the rate of the vertical synchronization pulse VD is 60 frames per second (60 fps); imaging with the irradiated infrared light is performed in odd frames, and visible light imaging with background light is performed in even frames.
  • First, the pulse of the shutter signal TXSn is applied to discharge the signal charges of the PDs; when the pulse application ends, the infrared light exposure time 131 starts, and at the same time the infrared light irradiation time 133 by the infrared laser 202 starts.
  • Since the background light illumination 201 is continuously lit, the reflected light from the subject 200 includes both infrared light and visible light.
  • However, owing to the two-band optical filter 205 and the visible light cut filter of the IR pixel, only reflected light in the vicinity of 850 nm is received and photoelectrically converted by the IR pixel, and the signal charge is accumulated in the IR pixel during the infrared light exposure time 131.
  • The signal charge is converted into a signal voltage by the FD, and by the pulse of the selection signal SEL1 this voltage is read out to the vertical signal line 106 through the source follower transistor M3; the IR pixel signal is then output through the FPN removing unit 103, and the processing moves to the subsequent stage. Subsequently, the remaining lines are selected in turn by the pulses of the selection signals SEL2, SEL3, and SEL4. As a result, the IR pixel signals for one screen are output as the infrared light exposure signal at the image sensor output timing 134 shown in FIG. 9. At this time the RGB pixel signals are also output, but they are not processed in the subsequent stage.
  • Next, at a timing that yields the necessary exposure time, the shutter signal is applied to turn on the shutter transistors M5, which collectively discharge the signal charges accumulated in all the PDs; the visible light exposure time 132 starts when the pulse application ends.
  • the reflected light from the subject 200 does not include the irradiated infrared light component but includes only the background light component.
  • the background light may contain some IR components in addition to the visible light component.
  • the RGB color filter in the RGB pixel has a slight transmission characteristic with respect to the IR component.
  • the RGB pixels are sensitive to some IR component derived from background light in addition to each filtered visible light component and accumulate signal charge during the visible light exposure time 132.
  • Through the two-band optical filter 205 and the visible light cut filter, the IR pixel receives only the slight IR component contained in the background light, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 132.
  • The signal charge is converted into a signal voltage by the FD, and by the pulse of the selection signal SEL1 this voltage is read out to the vertical signal line 106 through the source follower transistor M3; all pixel signals are output through the FPN removing unit 103, and the processing moves to the subsequent stage. Subsequently, the remaining lines are selected in turn by the pulses of the selection signals SEL2, SEL3, and SEL4. As a result, at the image sensor output timing 135 shown in FIG. 9, all pixel signals for one screen, including the RGB pixels and the IR pixels, are output as visible light exposure signals.
  • This operation of the MOS image sensor 211 is repeated, and the infrared light exposure and the visible light exposure are performed alternately in a time-division manner so that the infrared light exposure time 131 and the visible light exposure time 132 each appear every other frame.
  • the AE operation based on the output signal of the MOS image sensor 211 will be described.
  • the pixel signal obtained at the infrared light exposure time 131 of the odd frame is output at the timing 134 in the even frame.
  • Only the output signal of the IR pixels is detected by the AE detection circuit 242, and the detection output is read by the CPU 260.
  • the CPU 260 determines the shutter speed of the next odd frame through the time filter using only the detection result of the IR pixel signal output from the MOS image sensor 211 for each even frame.
  • the pixel signal obtained at the visible light exposure time 132 of the even frame is output at the timing 135 in the odd frame.
  • the output signal of the RGB pixel is detected by the AE detection circuit 242 and the output is read by the CPU 260.
  • the CPU 260 determines the shutter speed of the next even frame through the time filter using only the detection result of the RGB pixel signal output from the MOS image sensor 211 for each odd frame.
  • pixel signal charges can be discharged by the shutter transistor M5 for all the pixels at once.
  • Since the infrared light exposure and the visible light exposure are performed in a time-division manner, the two-band optical filter 205 can be used in combination with the single-layer per-pixel filters on the MOS image sensor 211, and optical mixing of visible light and infrared light can be minimized. Therefore, high-quality color imaging using background light and high-accuracy distance imaging using infrared light can be realized with an inexpensive system.
  • the output of the AE detection circuit 242 is divided into infrared light and visible light, and each is independently controlled by AE.
  • Therefore, even though the respective light sources differ, the infrared light exposure time 131 of the IR pixels and the visible light exposure time 132 of the RGB pixels, which differ greatly from each other, can each be controlled to its optimum exposure time, as if a visible camera and a distance camera were operating simultaneously, and both the image quality of visible imaging and the accuracy of distance imaging can be further improved.
  • In this embodiment, the infrared light exposure time 131 and the infrared light irradiation time 133 are made to coincide, but there is no particular problem as long as the infrared light irradiation time 133 includes the infrared light exposure time 131 and does not include the visible light exposure time 132.
  • As in the first embodiment, the IR pixel signal obtained in the infrared light exposure time 131 and the IR pixel signal obtained in the visible light exposure time 132 are corrected to the level of a common exposure time, taking the respective exposure times into account, and the result of the subtraction performed by the subtraction circuit 241 is input to the color camera processing / distance image calculation processing unit 250.
  • Likewise, the signal of each RGB pixel in the visible light exposure time 132 and the IR pixel signal in the visible light exposure time 132 are corrected to the same transmittance level, taking into account the sensitivity due to the infrared light transmittance near 850 nm as the overall characteristic of the two-band optical filter 205 and the filter arranged on each pixel, and the result of subtracting the corrected IR pixel output from the output of each RGB pixel in the subtraction circuit 241 is input to the color camera processing / distance image calculation processing unit 250.
  • In this way, the noise due to the IR component of the background light near 850 nm can be suppressed, and higher color reproducibility and image quality can be realized.
  • FIG. 10 shows a configuration of an imaging apparatus according to the third embodiment of the present invention.
  • the imaging device of FIG. 10 is obtained by replacing the CCD image sensor 210 in FIG. 1 with a CCD image sensor 212 in which signal charge readout gates of IR pixels and RGB signal charge readout gates are wired independently of each other.
  • the third embodiment will be described with emphasis on differences from the first embodiment.
  • FIG. 11 is a schematic configuration diagram of the CCD image sensor 212 in FIG.
  • The CCD image sensor 212 shown in FIG. 11 is a progressive-scan, interline-transfer image sensor and includes PDs 71, a vertical transfer unit 72, a horizontal transfer unit 73, a charge detection unit 74, and a VOD 75.
  • the PDs 71 are arranged in a matrix and constitute pixels. Each pixel is provided with R, G, B, and IR color filters as in the first embodiment.
  • The vertical transfer unit 72 has, for example, a configuration of four gates V1, V2, V3, and V4 per pixel, and is driven by four-phase pulses ΦV1, ΦV2, ΦV3, and ΦV4.
  • ⁇ V1 is an RGB pixel readout pulse.
  • the RGB pixel has a configuration in which the signal charge readout gate also serves as the V1 gate of the vertical transfer unit 72.
  • the gate of the vertical transfer unit 72 that also serves as the IR pixel readout gate is newly provided as a V5 gate independent of the V1 gate.
  • ⁇ V5 is an IR pixel readout pulse.
  • the horizontal transfer unit 73 has, for example, a two-phase (H1 and H2) configuration. Further, when the substrate discharge pulse ⁇ Sub is given, the signal charges of all the pixels are discharged to the substrate all at once via the VOD 75.
  • FIG. 12 is a timing chart showing the operation of the imaging apparatus of FIG.
  • In FIG. 12, the rate of the vertical synchronization pulse VD is 60 fps; imaging with the irradiated infrared light is performed in the first half of each frame, and visible light imaging with background light is performed in the second half.
  • the timing of the IR pixel readout pulse ⁇ V5 can be arbitrarily set for each horizontal period.
  • the substrate discharge pulse ⁇ Sub discharges signal charges of all the pixels, and when the application of the substrate discharge pulse ⁇ Sub ends, the infrared light exposure time 81 starts, and at the same time, the infrared light irradiation time by the infrared laser 202 83 starts.
  • Since the background light illumination 201 is continuously lit, the reflected light from the subject 200 contains both infrared light and visible light. However, owing to the two-band optical filter 205 and the visible light cut filter of the IR pixel, only reflected light in the vicinity of 850 nm is received and photoelectrically converted by the IR pixel, and the signal charge is accumulated in the IR pixel during the infrared light exposure time 81.
  • Next, by applying the IR pixel readout pulse ΦV5, the signal charge of the IR pixels is read out to the vertical transfer unit 72, and the infrared light exposure time 81 ends.
  • At the same time, the infrared light irradiation time 83 by the infrared laser 202 ends.
  • The IR pixel signals are output as the infrared light exposure signal at the image sensor output timing 84 shown in FIG. 12 via the vertical transfer unit 72, the horizontal transfer unit 73, and the charge detection unit 74, and the processing moves to the subsequent stage.
  • Next, the substrate discharge pulse ΦSub is applied at a timing that yields the necessary exposure time to discharge the pixel signal charges, and the visible light exposure time 82 starts when the application of ΦSub ends.
  • the reflected light from the subject 200 does not include the irradiated infrared light component but includes only the background light component.
  • the background light may contain some IR components in addition to the visible light component.
  • the RGB color filter in the RGB pixel has a slight transmission characteristic with respect to the IR component.
  • The RGB pixels are sensitive to a slight IR component derived from the background light in addition to their respective filtered visible light components, and accumulate signal charge during the visible light exposure time 82.
  • Through the two-band optical filter 205 and the visible light cut filter, the IR pixel receives only the slight IR component contained in the background light, performs photoelectric conversion, and accumulates signal charge during the visible light exposure time 82.
  • the signal charges of the RGB pixels are read out to the vertical transfer unit 72, and the visible light exposure time 82 ends.
  • RGB pixel signals are output as visible light exposure signals at the image sensor output timing 85 shown in FIG. 12 via the vertical transfer unit 72, the horizontal transfer unit 73, and the charge detection unit 74.
  • At this time the IR pixel signals are still being output; however, by performing the vertical transfer so that the RGB pixel signals are placed in the empty packets of the vertical transfer unit 72, the signal charges of the R, G, B, and IR pixels can all be output independently, without mixing in the vertical transfer unit 72, with a shift between the timings 84 and 85.
  • This operation of the CCD image sensor 212 is repeated in units of frames, and the infrared light exposure and the visible light exposure are performed alternately in a time-division manner so that the infrared light exposure time 81 and the visible light exposure time 82 both appear within one frame.
  • the AE operation based on the output signal of the CCD image sensor 212 will be described.
  • The pixel signals obtained in the infrared light exposure time 81 are output over a period of about 1 V (one vertical scanning period) starting from the middle of the frame.
  • the output signal of the IR pixel is detected by the AE detection circuit 242 and the output is read by the CPU 260.
  • the CPU 260 uses only the detection result of the IR pixel signal to determine the shutter speed of the infrared light exposure time 81 of the next frame through the time filter.
  • the pixel signal obtained at the visible light exposure time 82 is output from the top of the frame.
  • the output signal of the RGB pixel is detected by the AE detection circuit 242 and the output is read by the CPU 260.
  • the CPU 260 determines the shutter speed for the next visible light exposure time 82 through the time filter using only the detection result of the RGB pixel signal output from the CCD image sensor 212 for each frame.
  • the pixel signal charge can be discharged by the VOD 75 for all the pixels at once.
  • Since the infrared light exposure and the visible light exposure are performed in a time-division manner, the two-band optical filter 205 can be used in combination with the single-layer per-pixel filters on the CCD image sensor 212, and optical mixing of visible light and infrared light can be minimized. Therefore, high-quality color imaging using background light and high-accuracy distance imaging using infrared light can be realized with an inexpensive system.
  • The output of the AE detection circuit 242 is divided into an infrared light component and a visible light component, and each is AE-controlled independently. Therefore, even though the respective light sources differ, the infrared light exposure time 81 of the IR pixels and the visible light exposure time 82 of the RGB pixels, which differ greatly from each other, can each be controlled to its optimum exposure time, as if a visible camera and a distance camera were operating simultaneously, and both the image quality of visible imaging and the accuracy of distance imaging can be further improved.
  • the signal charge readout gate of the IR pixel and the signal charge readout gate of the RGB pixel are wired independently of each other, and the time division between the infrared light exposure and the visible light exposure is completed within one frame.
  • the application of the signal charge readout pulse ⁇ V5 ends the infrared light exposure time 81
  • the application of the signal charge readout pulse ⁇ V1 of the RGB pixels ends the visible light exposure time 82. Therefore, the frame rate of the CCD image sensor 212 Both a visible image and a distance image can be obtained at the same frame rate, and the system can be greatly speeded up.
  • During the period from the application timing of the substrate discharge pulse ΦSub, which determines the start of the visible light exposure time 82, until the readout of the signal charges of the RGB pixels, which determines its end, the infrared light irradiation by the infrared laser 202 is stopped. Therefore, power consumption and heat generation can be reduced by shortening the infrared light irradiation time.
  • In this embodiment, the infrared light exposure time 81 and the infrared light irradiation time 83 are made to coincide, but there is no particular problem as long as the infrared light irradiation time 83 includes the infrared light exposure time 81 and does not include the visible light exposure time 82.
  • this embodiment can also be applied to a MOS image sensor having a global shutter function and a global reset function by making the row selection wiring of the IR pixel and the RGB pixel independent.
  • Since the present invention realizes distance imaging using an infrared light irradiation pattern and color imaging using background light in a single system, it can be used in imaging devices such as game machines for gesture authentication and in signage fields such as person authentication.

Abstract

In order to capture a range image, an infrared laser (202) creates a dot pattern by emitting infrared light onto a subject (200) through a rough surface filter (203). Incident light from the subject (200) passes through a two-band optical filter (205) that transmits the visible light wavelength region and the wavelength region near the emitted infrared light, and forms an image on a CCD image sensor (210). The CCD image sensor (210) comprises a first pixel for infrared light exposure, a second pixel for visible light exposure, and an overflow drain for simultaneously discharging the signal charges of the first and second pixels, and performs the infrared light exposure of the first pixel and the visible light exposure of the second pixel in a time-division manner.
PCT/JP2012/004917 2011-08-24 2012-08-02 Dispositif d'imagerie WO2013027340A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-182393 2011-08-24
JP2011182393A JP2014207493A (ja) 2011-08-24 2011-08-24 撮像装置

Publications (1)

Publication Number Publication Date
WO2013027340A1 true WO2013027340A1 (fr) 2013-02-28

Family

ID=47746113

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/004917 WO2013027340A1 (fr) 2011-08-24 2012-08-02 Dispositif d'imagerie

Country Status (2)

Country Link
JP (1) JP2014207493A (fr)
WO (1) WO2013027340A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI675907B (zh) 2015-01-21 2019-11-01 日商Jsr股份有限公司 固體攝像裝置
JP6631243B2 (ja) * 2015-01-30 2020-01-15 Jsr株式会社 固体撮像装置及び光学フィルタ
JP2016162946A (ja) * 2015-03-04 2016-09-05 Jsr株式会社 固体撮像装置
US11076115B2 (en) * 2015-04-14 2021-07-27 Sony Corporation Solid-state imaging apparatus, imaging system, and distance measurement method
JP6727840B2 (ja) 2016-02-19 2020-07-22 ソニーモバイルコミュニケーションズ株式会社 撮像装置、撮像制御方法及びコンピュータプログラム
JP2020072299A (ja) * 2018-10-29 2020-05-07 三星電子株式会社Samsung Electronics Co.,Ltd. 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
CN110505376B (zh) * 2019-05-31 2021-04-30 杭州海康威视数字技术股份有限公司 图像采集装置及方法
JP7259660B2 (ja) * 2019-09-10 2023-04-18 株式会社デンソー イメージレジストレーション装置、画像生成システム及びイメージレジストレーションプログラム
WO2023100613A1 (fr) * 2021-12-02 2023-06-08 パナソニックIpマネジメント株式会社 Dispositif d'imagerie et système de caméra
WO2023145782A1 (fr) * 2022-01-28 2023-08-03 国立大学法人静岡大学 Dispositif d'imagerie à semi-conducteurs

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60177572U (ja) * 1984-05-01 1985-11-26 シャープ株式会社 固体撮像装置
JP2007214832A (ja) * 2006-02-09 2007-08-23 Sony Corp 固体撮像装置
JP2011149901A (ja) * 2010-01-25 2011-08-04 Rohm Co Ltd 受光装置およびモバイル機器

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019144261A (ja) * 2013-06-06 2019-08-29 ヘプタゴン・マイクロ・オプティクス・プライベート・リミテッドHeptagon Micro Optics Pte. Ltd. 撮像システムおよびそれを動作させる方法
WO2015008383A1 (fr) * 2013-07-19 2015-01-22 日立マクセル株式会社 Dispositif d'imagerie
WO2015011869A1 (fr) * 2013-07-23 2015-01-29 パナソニックIpマネジメント株式会社 Imageur et procédé de commande de ce dernier
JPWO2015011869A1 (ja) * 2013-07-23 2017-03-02 パナソニックIpマネジメント株式会社 固体撮像装置、撮像装置及びその駆動方法
US9736438B2 (en) 2013-07-23 2017-08-15 Panasonic Intellectual Property Management Co., Ltd. Solid state imaging device and imaging device and driving method thereof
JP2015095772A (ja) * 2013-11-12 2015-05-18 キヤノン株式会社 固体撮像装置および撮像システム
KR20150090778A (ko) * 2014-01-29 2015-08-06 엘지이노텍 주식회사 깊이 정보 추출 장치 및 방법
KR102400992B1 (ko) 2014-01-29 2022-05-23 엘지이노텍 주식회사 깊이 정보 추출 장치 및 방법
KR20210090134A (ko) * 2014-01-29 2021-07-19 엘지이노텍 주식회사 깊이 정보 추출 장치 및 방법
KR102277309B1 (ko) * 2014-01-29 2021-07-14 엘지이노텍 주식회사 깊이 정보 추출 장치 및 방법
JP2017506740A (ja) * 2014-01-29 2017-03-09 エルジー イノテック カンパニー リミテッド 深さ情報抽出装置および方法
US10690484B2 (en) 2014-01-29 2020-06-23 Lg Innotek Co., Ltd. Depth information extracting device and method
US10491791B2 (en) 2014-08-08 2019-11-26 Sony Corporation Imaging apparatus and image sensor
WO2016021312A1 (fr) * 2014-08-08 2016-02-11 ソニー株式会社 Dispositif de capture d'image et élément de capture d'image
CN106664378A (zh) * 2014-08-20 2017-05-10 松下知识产权经营株式会社 固体摄像装置以及相机
WO2016027397A1 (fr) * 2014-08-20 2016-02-25 パナソニックIpマネジメント株式会社 Appareil de capture d'image à semi-conducteurs et caméra
US9699394B2 (en) 2015-03-09 2017-07-04 Microsoft Technology Licensing, Llc Filter arrangement for image sensor
WO2016144499A1 (fr) * 2015-03-09 2016-09-15 Microsoft Technology Licensing, Llc Agencement de filtre pour capteur d'image
US10560638B2 (en) 2015-11-17 2020-02-11 Sony Semiconductor Solutions Corporation Imaging apparatus and imaging method
US11972635B2 (en) 2017-01-06 2024-04-30 Intel Corporation Integrated image sensor and display pixel
US10958847B2 (en) 2017-01-20 2021-03-23 Sony Semiconductor Solutions Corporation Imaging device, image processing method, and image processing system
WO2018135315A1 (fr) * 2017-01-20 2018-07-26 ソニーセミコンダクタソリューションズ株式会社 Dispositif de capture d'images, procédé de traitement d'images et système de traitement d'images
CN113574408A (zh) * 2019-03-27 2021-10-29 松下知识产权经营株式会社 固体摄像装置
CN113785561A (zh) * 2019-08-01 2021-12-10 松下知识产权经营株式会社 摄像装置
CN113785561B (zh) * 2019-08-01 2023-12-19 松下知识产权经营株式会社 摄像装置
WO2021199761A1 (fr) 2020-03-31 2021-10-07 株式会社フジクラ Dispositif arithmétique optique et procédé de fabrication pour dispositif arithmétique optique
US11917888B2 (en) 2020-05-04 2024-02-27 Intel Corporation In-display sensors and viewing angle adjustment microassemblies
CN113923386A (zh) * 2020-07-10 2022-01-11 广州印芯半导体技术有限公司 动态视觉传感器
CN113923386B (zh) * 2020-07-10 2023-10-27 广州印芯半导体技术有限公司 动态视觉传感器
CN113225485A (zh) * 2021-03-19 2021-08-06 浙江大华技术股份有限公司 图像采集组件、融合方法、电子设备及存储介质

Also Published As

Publication number Publication date
JP2014207493A (ja) 2014-10-30

Similar Documents

Publication Publication Date Title
WO2013027340A1 (fr) Dispositif d'imagerie
JP6145826B2 (ja) 撮像装置及びその駆動方法
JP4396684B2 (ja) 固体撮像装置の製造方法
US9071781B2 (en) Image capturing apparatus and defective pixel detection method
JP4957413B2 (ja) 固体撮像素子及びこれを用いた撮像装置
JP6442710B2 (ja) 固体撮像装置、撮像装置及びその駆動方法
JP6664122B2 (ja) 固体撮像装置及びカメラ
US20100309340A1 (en) Image sensor having global and rolling shutter processes for respective sets of pixels of a pixel array
JP2016052041A (ja) 固体撮像素子及びその信号処理方法、並びに電子機器
JP7099446B2 (ja) 固体撮像装置および電子機器
US9794497B2 (en) Solid-state imaging device controlling read-out of signals from pixels in first and second areas
WO2011158567A1 (fr) Elément de capture d'image à semi-conducteurs et appareil photo numérique
JPWO2013172205A1 (ja) 撮像装置および撮像方法、電子機器、並びにプログラム
JP2004180045A (ja) 固体撮像装置及びその信号読み出し方法
JP2008042298A (ja) 固体撮像装置
JP2016090785A (ja) 撮像装置及びその制御方法
WO2019078333A1 (fr) Dispositif d'imagerie, procédé de commande d'exposition, programme, et élément d'imagerie
JP5899653B2 (ja) 撮像装置
JP5299002B2 (ja) 撮像装置
JP2010081286A (ja) 撮影装置
JP6641135B2 (ja) 撮像素子及び撮像装置
JP7071061B2 (ja) 撮像装置及びその制御方法
JP2017098790A (ja) 撮像装置及びその制御方法、プログラム、記憶媒体
CN114650343A (zh) 一种图像传感器及成像装置
JP2015177257A (ja) 固体撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12825213

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12825213

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP