WO2023100547A1 - Imaging device and electronic apparatus - Google Patents

Imaging device and electronic apparatus

Info

Publication number
WO2023100547A1
Authority
WO
WIPO (PCT)
Prior art keywords
transistor
photoelectric conversion
imaging device
conversion unit
capacitive element
Prior art date
Application number
PCT/JP2022/040062
Other languages
English (en)
Japanese (ja)
Inventor
健市 奥村
信哉 谷村
武裕 大谷
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社
Publication of WO2023100547A1

Classifications

    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 - Devices controlled by radiation
    • H01L27/146 - Imager structures
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 - Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/70 - SSIS architectures; Circuits associated therewith
    • H04N25/76 - Addressed sensors, e.g. MOS or CMOS sensors
    • H04N5/00 - Details of television systems
    • H04N5/30 - Transforming light or analogous information into electric information
    • H04N5/33 - Transforming infrared radiation

Definitions

  • the present disclosure relates to imaging devices and electronic devices.
  • Such an imaging device has a pixel array section in which a plurality of pixels are arranged in a matrix. Each pixel in the pixel array section is provided with a photoelectric conversion section that photoelectrically converts infrared rays.
  • a III-V group semiconductor such as InGaAs (indium gallium arsenide) can be used for this photoelectric conversion unit.
  • However, the image quality may deteriorate due to voltage fluctuations occurring between pixels. Accordingly, the present disclosure provides an imaging device and an electronic device capable of suppressing such deterioration in image quality.
  • the imaging device of the present disclosure includes a plurality of pixels arranged in a matrix.
  • Each of the plurality of pixels includes a photoelectric conversion unit that photoelectrically converts incident light, a discharge transistor that discharges the charge photoelectrically converted by the photoelectric conversion unit, and a transfer transistor that transfers the charge photoelectrically converted by the photoelectric conversion unit.
  • Each of the plurality of pixels further includes a floating diffusion that temporarily holds the charge transferred from the transfer transistor, a reset transistor that resets the potential of the floating diffusion, a first capacitive element having one end connected to the photoelectric conversion unit, the discharge transistor, and the transfer transistor and the other end connected to a power supply line to which a reset voltage is applied for discharging the charge from the photoelectric conversion unit to the discharge transistor, and a second capacitive element having one end connected to the floating diffusion and the other end connected to the power supply line in common with the other end of the first capacitive element.
  • the photoelectric conversion part may be a photodiode containing InGaAs.
  • a constant voltage may be applied to the anode of the photodiode, and the cathode of the photodiode may be connected to the one end of the first capacitive element.
  • the constant voltage may be variable according to the reset voltage.
  • the discharge transistor, the transfer transistor, and the reset transistor may be P-channel MOS transistors.
  • a source of the discharge transistor is connected to the power supply line, a drain of the discharge transistor is connected to the one end of the first capacitive element; a source of the transfer transistor is connected to the floating diffusion, a drain of the transfer transistor is connected to the one end of the first capacitive element; A source of the reset transistor may be connected to the power supply line, and a drain of the reset transistor may be connected to the floating diffusion.
  • a light-receiving substrate provided with the photoelectric conversion unit; a drive substrate provided with the discharge transistor, the transfer transistor, the floating diffusion, the reset transistor, the first capacitive element, and the second capacitive element,
  • the light-receiving substrate and the driving substrate may be laminated and bonded to each other.
  • the plurality of pixels may be driven by a DDS (Double Data Sampling) method and a global shutter method.
  • the plurality of pixels may be driven by a rolling shutter method.
  • the electronic device of the present disclosure includes an imaging device having a plurality of pixels arranged in a matrix.
  • Each of the plurality of pixels includes a photoelectric conversion unit that photoelectrically converts incident light, a discharge transistor that discharges the charge photoelectrically converted by the photoelectric conversion unit, a transfer transistor that transfers the charge photoelectrically converted by the photoelectric conversion unit, a floating diffusion that temporarily holds the charge transferred from the transfer transistor, a reset transistor that resets the potential of the floating diffusion, a first capacitive element having one end connected to the photoelectric conversion unit, the discharge transistor, and the transfer transistor and the other end connected to a power supply line to which a reset voltage is applied for discharging the charge from the photoelectric conversion unit to the discharge transistor, and a second capacitive element having one end connected to the floating diffusion and the other end connected to the power supply line in common with the other end of the first capacitive element.
  • FIG. 1 is a block diagram showing a schematic configuration of an imaging device according to a first embodiment.
  • FIG. 2 is a diagram showing an example of the circuit configuration of a pixel.
  • FIG. 3 is a perspective view showing a structural example of the imaging device.
  • FIG. 4 is an example of a cross-sectional view of the imaging device according to the first embodiment.
  • FIG. 5 is a circuit diagram of a pixel of an imaging device according to a comparative example.
  • FIG. 6 is a timing chart for explaining the operation of pixels according to the comparative example.
  • FIG. 7 is a timing chart for explaining the operation of pixels according to the first embodiment.
  • FIG. 8 is a block diagram showing a schematic configuration of an electronic device according to a second embodiment.
  • FIG. 9 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 10 is an explanatory diagram showing an example of installation positions of a vehicle exterior information detection unit and an imaging unit.
  • FIG. 1 is a block diagram showing a schematic configuration of an imaging device according to the first embodiment.
  • the imaging device 1 according to this embodiment is, for example, an infrared image sensor, and has sensitivity to light with a wavelength of 800 nm or more, for example.
  • the imaging device 1 includes a pixel array section 10 .
  • a plurality of pixels 11 including photoelectric conversion elements are two-dimensionally arranged in a matrix.
  • the circuit configuration of the pixel 11 will be described with reference to FIG.
  • FIG. 2 is a diagram showing an example of the circuit configuration of the pixel 11.
  • The pixel 11 has, as shown in FIG. 2, a pixel circuit 14 that photoelectrically converts incident light, and a readout circuit 15 that outputs a pixel signal based on the charge output from the pixel circuit 14.
  • the pixel circuit 14 has a photodiode PD, a transfer transistor TRG, a floating diffusion FD, an ejection transistor OFG, a first capacitive element Csn, and a second capacitive element Cfd.
  • the transfer transistor TRG and the discharge transistor OFG are P-channel MOS (Metal Oxide Semiconductor) transistors.
  • the photodiode PD corresponds to an example of the "photoelectric conversion unit" of the present disclosure.
  • the photodiode PD is a photoelectric conversion unit that absorbs incident light of a predetermined wavelength (for example, light of an infrared wavelength of 900 nm to 1700 nm) and generates signal charges.
  • the photodiode PD includes, for example, compound semiconductors such as III-V semiconductors.
  • Group III-V semiconductors used in the photodiode PD include, for example, InGaP, InAlP, InGaAs, InAlAs, and compound semiconductors having a chalcopyrite structure.
  • a compound semiconductor with a chalcopyrite structure is a material that provides a high light absorption coefficient and high sensitivity over a wide wavelength range, and is preferably used as an n-type semiconductor material for photoelectric conversion.
  • the photodiode PD may contain amorphous silicon (Si), germanium (Ge), a quantum dot photoelectric conversion film, an organic photoelectric conversion film, etc., in addition to the compound semiconductors described above.
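  • As a side note not found in the patent text, the quoted wavelength range can be related to photon energy: a photodiode material absorbs photons whose energy exceeds its bandgap, so the long-wavelength end of the range corresponds roughly to the bandgap energy. A minimal sketch of that conversion:

```python
# Illustrative sketch (not from the patent): photon energy E = h*c / lambda for the
# detection range quoted above (900 nm to 1700 nm).

H = 6.62607015e-34    # Planck constant [J*s]
C = 2.99792458e8      # speed of light [m/s]
EV = 1.602176634e-19  # one electron volt [J]

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a wavelength given in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

for wl in (900, 1700):
    print(f"{wl} nm -> {photon_energy_ev(wl):.2f} eV")
# 1700 nm corresponds to roughly 0.73 eV, consistent with the bandgap of
# lattice-matched InGaAs (about 0.74 eV), which is why InGaAs suits SWIR sensing.
```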
  • the cathode of the photodiode PD is connected to the drain of the transfer transistor TRG, the drain of the discharge transistor OFG, and one end of the first capacitive element Csn, respectively.
  • An anode of the photodiode PD is connected to a power supply line to which a constant voltage Vtop is applied.
  • the constant voltage Vtop applied to the anode of the photodiode PD is variable by the voltage generation circuit 70 according to the reset voltage for initializing the charge of the photodiode PD.
  • the transfer transistor TRG has a source connected to the floating diffusion FD and a gate connected to the pixel drive line 12 .
  • the transfer transistor TRG transfers charges held at the node SN (first capacitive element Csn) to the floating diffusion FD according to a control signal applied to its gate.
  • the floating diffusion FD is a floating diffusion region that temporarily holds charges transferred from the node SN via the transfer transistor TRG.
  • the floating diffusion FD is connected to the input end of the readout circuit 15 and one end of the second capacitive element Cfd.
  • The discharge transistor OFG has a source connected to the power supply line to which the voltage VDD1 is applied, and a gate connected to the pixel drive line 12. The voltage VDD1 is lower than the constant voltage Vtop.
  • the discharge transistor OFG initializes (resets) the charge of the node SN according to the control signal applied to its gate.
  • the first capacitive element Csn and the second capacitive element Cfd can be formed, for example, by a gate electrode of a P-channel MOS transistor and a drain electrode or a source electrode facing the gate electrode via a gate oxide film.
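  • As a rough illustration (the dimensions below are assumptions, not values from the patent), such a MOS-type capacitor can be estimated to first order as a parallel-plate capacitor, C = ε0·εr·A/t_ox:

```python
# Illustrative sketch (all dimensions are assumptions): first-order parallel-plate
# estimate of a MOS capacitor formed by a gate electrode facing a source/drain
# electrode across the gate oxide.

EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def mos_cap_fF(area_um2: float, t_ox_nm: float, eps_r: float = 3.9) -> float:
    """Parallel-plate capacitance in femtofarads (eps_r defaults to SiO2)."""
    return EPS0 * eps_r * (area_um2 * 1e-12) / (t_ox_nm * 1e-9) * 1e15

# Example: a 1 um x 1 um gate over 5 nm of oxide yields several femtofarads.
print(f"capacitance estimate: {mos_cap_fF(area_um2=1.0, t_ox_nm=5.0):.2f} fF")
```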
  • the readout circuit 15 has, for example, a reset transistor RST, a selection transistor SEL, and an amplification transistor AMP.
  • the reset transistor RST is a P-channel MOS transistor.
  • In the reset transistor RST, the drain (the input terminal of the readout circuit 15) is connected to the floating diffusion FD and the gate of the amplification transistor AMP, the source is connected to the power supply line to which the voltage VDD1 is applied, and the gate is connected to the pixel drive line 12.
  • the reset transistor RST initializes (resets) the potential of the floating diffusion FD to a predetermined potential. When the reset transistor RST is turned on, the potential of the floating diffusion FD is reset to the voltage VDD1.
  • In the amplification transistor AMP, the source is connected to the drain of the selection transistor SEL, the gate is connected to the drain of the reset transistor RST, and the drain is connected to the power supply line to which the voltage VDD3 is applied.
  • the amplification transistor AMP generates a voltage signal corresponding to the amount of light received in the pixel 11 as a pixel signal.
  • the amplification transistor AMP constitutes a source follower type amplifier, and outputs a pixel signal having a voltage corresponding to the level of charge generated in the photodiode PD.
  • the amplification transistor AMP amplifies the potential of the floating diffusion FD when the selection transistor SEL is turned on, and outputs a voltage corresponding to the potential to the vertical signal line 13 .
  • In the selection transistor SEL, the drain is connected to the source of the amplification transistor AMP, the source (the output terminal of the readout circuit 15) is connected to the vertical signal line 13, and the gate is connected to the pixel drive line 12.
  • the selection transistor SEL controls the output timing of the pixel signal from the readout circuit 15 .
  • the amplification transistor AMP generates a voltage signal corresponding to the level of the charge held in the floating diffusion FD as a pixel signal.
  • the selection transistor SEL may be provided between the power supply line to which the voltage VDD3 is applied and the amplification transistor AMP.
  • the drain of the reset transistor RST is connected to the power supply line and the drain of the select transistor SEL.
  • a source of the selection transistor SEL is connected to a drain of the amplification transistor AMP, and a gate of the selection transistor SEL is connected to the pixel drive line 12 .
  • the source of the amplification transistor AMP (the output terminal of the readout circuit 15) is connected to the vertical signal line 13, and the gate of the amplification transistor AMP is connected to the source of the reset transistor RST.
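  • A minimal sketch of the charge-to-voltage conversion and source-follower readout described above (gain, threshold, capacitance, and charge values are assumptions; the sign of the swing depends on the collected carrier type):

```python
# Illustrative sketch (parameter values are assumptions, not from the patent):
# charge-to-voltage conversion at the floating diffusion followed by a
# source-follower readout onto the vertical signal line.

Q_E = 1.602176634e-19  # elementary charge [C]

def fd_voltage_swing(n_carriers: int, c_fd_fF: float) -> float:
    """Magnitude of the floating-diffusion voltage change for a given signal charge [V]."""
    return n_carriers * Q_E / (c_fd_fF * 1e-15)

def source_follower_out(v_fd: float, gain: float = 0.85, v_gs: float = 0.6) -> float:
    """Source-follower output: roughly a gate-source level shift and a gain just below 1."""
    return gain * v_fd - v_gs

v_fd_reset = 2.8  # assumed reset level of the floating diffusion [V]
v_fd_signal = v_fd_reset - fd_voltage_swing(5000, c_fd_fF=5.0)  # electrons assumed
print(f"reset sample : {source_follower_out(v_fd_reset):.3f} V")
print(f"signal sample: {source_follower_out(v_fd_signal):.3f} V")
```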
  • FIG. 3 is a perspective view showing a structural example of the imaging device 1.
  • The imaging device 1 includes, for example, a light receiving substrate 100 and a driving substrate 200 as shown in FIG. 3. Specifically, the imaging device 1 has a three-dimensional structure in which the light receiving substrate 100 and the driving substrate 200 are bonded together.
  • the light receiving substrate 100 is a substrate in which a plurality of photodiodes PD are arranged in a matrix on a silicon substrate.
  • the upper surface of the light-receiving substrate 100 (the surface opposite to the drive substrate 200) serves as a light-receiving surface 100A.
  • the drive substrate 200 is a substrate in which a pixel signal generation circuit region 200A and a peripheral circuit region 200B are arranged on a silicon substrate.
  • a plurality of pixel signal generation circuits 45 are formed in a matrix in the pixel signal generation circuit region 200A.
  • Each pixel signal generation circuit 45 is a circuit of the pixel 11 excluding the photodiode PD.
  • a logic circuit for processing pixel signals is formed in the peripheral circuit region 200B.
  • In the peripheral circuit region 200B, the vertical drive circuit 20, the horizontal drive circuit 30, the horizontal selection circuit 40, the system control circuit 50, the film voltage control section 60, and the voltage generation circuit 70 shown in FIG. 1 are formed. That is, the imaging device 1 includes the pixel array section 10, the vertical drive circuit 20, the horizontal drive circuit 30, the horizontal selection circuit 40, the system control circuit 50, the film voltage control section 60, and the voltage generation circuit 70.
  • the logic circuit outputs a digital pixel signal for each pixel 11 to the outside.
  • Based on a master clock, the system control circuit 50 generates clock signals, control signals, and the like that serve as operational references for the vertical drive circuit 20, the horizontal drive circuit 30, the horizontal selection circuit 40, the film voltage control section 60, and the like, and transmits them to the vertical drive circuit 20, the horizontal selection circuit 40, the film voltage control section 60, and the like.
  • the vertical drive circuit 20 is composed of, for example, a shift register, and controls row scanning of the plurality of pixels 11 via the plurality of pixel drive lines 12 .
  • the horizontal selection circuit 40 is, for example, a circuit provided with an ADC 40a and a switch element 40b for each pixel column (or vertical signal line 13) of the pixel array section 10.
  • The ADC 40a performs analog-to-digital (AD) conversion on the analog pixel signal output from the pixel array section 10.
  • the ADC 40a is capable of varying the analog range, and sets the analog range based on the range set value input from the outside.
  • the vertical signal line 13 is connected to the input end of the ADC 40a, and the switch element 40b is connected to the output end of the ADC 40a.
  • the horizontal drive circuit 30 is composed of, for example, a shift register, and drives the switch elements 40b of the horizontal selection circuit 40 in order. By sequentially driving the switch elements 40b by the horizontal driving circuit 30, pixel signals transmitted through the vertical signal lines 13 are sequentially output to the horizontal signal line 40c and input to a DSP circuit or the like.
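  • A minimal sketch of this column-parallel conversion followed by horizontal scanning (function names, resolution, and analog range are assumptions):

```python
# Illustrative sketch (not from the patent text): each column ADC converts in parallel,
# then the horizontal drive circuit closes the column switches one by one so the codes
# appear sequentially on the horizontal signal line.

from typing import List, Tuple

def column_adc(v: float, v_min: float = 0.0, v_max: float = 3.0, bits: int = 12) -> int:
    """Quantize a column voltage into a digital code within a settable analog range."""
    v = min(max(v, v_min), v_max)
    return round((v - v_min) / (v_max - v_min) * ((1 << bits) - 1))

def read_row(column_voltages: List[float]) -> List[Tuple[int, int]]:
    codes = [column_adc(v) for v in column_voltages]        # parallel conversion per column
    return [(col, code) for col, code in enumerate(codes)]  # sequential horizontal scan

print(read_row([0.41, 0.40, 0.43, 1.95]))
```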
  • the film voltage control section 60 controls the film voltage applied to each photodiode PD based on the pixel signal obtained from the pixel 11 .
  • the membrane voltage controller 60 outputs a control signal for controlling the membrane voltage to the voltage generation circuit 70 .
  • the voltage generation circuit 70 generates an analog voltage (for example, constant voltage Vtop) based on the control signal input from the film voltage control section 60, and applies it to each photodiode PD via the power supply line. That is, the film voltage control unit 60 and the voltage generation circuit 70 apply a film voltage based on the pixel signal obtained from the pixel 11 to each photodiode PD, thereby controlling the image quality of the image data obtained from the pixel signal.
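  • The patent does not spell out the control law, but the feedback idea can be sketched as a simple proportional adjustment of the film voltage toward a target signal level (all names and numbers below are assumptions):

```python
# Illustrative sketch (the control law, gain, and target are assumptions): proportional
# feedback that nudges the film voltage Vtop according to the mean pixel signal.

def update_vtop(vtop: float, mean_signal: float, target: float, gain: float = 0.01) -> float:
    """Return an adjusted film voltage based on the deviation of the mean pixel signal."""
    return vtop + gain * (target - mean_signal)

vtop = 3.3          # assumed starting film voltage [V]
target_level = 512  # assumed target mean code
for mean_code in (430, 465, 498, 510):  # mean pixel codes over successive frames
    vtop = update_vtop(vtop, mean_code, target_level)
    print(f"mean={mean_code:4d} -> Vtop={vtop:.3f} V")
```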
  • FIG. 4 is an example of a cross-sectional view of the imaging device 1.
  • the light receiving substrate 100 has an n-type semiconductor film 21 .
  • the n-type semiconductor film 21 is formed over the entire surface of the pixel array section 10, and is made of, for example, the material described above as the material used for the photodiode PD. In the following description, other configurations will be described on the assumption that the material of the n-type semiconductor film 21 is InGaAs.
  • the light-receiving substrate 100 has a p-type semiconductor layer 22 in contact with the surface of the n-type semiconductor film 21 on the drive substrate 200 side for each pixel 11 .
  • Each p-type semiconductor layer 22 is made of a high-concentration p-type semiconductor, such as p-type InGaAs.
  • the p-type semiconductor layer 22 functions as an electrode of the photodiode PD.
  • a predetermined voltage VDD1 is applied to the p-type semiconductor layer 22 via the on-state discharge transistor OFG, or the voltage VDD1 is applied via the on-state transfer transistor TRG and reset transistor RST.
  • the light receiving substrate 100 also has an n-type semiconductor layer 23 that separates the p-type semiconductor layers 22 from each other.
  • the n-type semiconductor layer 23 is formed in the same layer as each p-type semiconductor layer 22, and is made of n-type InP, for example.
  • the light receiving substrate 100 has an n-type semiconductor layer 24 in contact with the surface of the n-type semiconductor film 21 on the side of the light receiving surface 100A.
  • the n-type semiconductor layer 24 is made of an n-type semiconductor having a higher concentration than the n-type semiconductor film 21, and is made of, for example, n-type InGaAs, n-type InP, or n-type InAlAs.
  • the n-type semiconductor layer 24 functions as a barrier layer that prevents backflow of charges generated in the n-type semiconductor film 21 .
  • the light receiving substrate 100 further has an antireflection film 25 in contact with the surface of the n-type semiconductor layer 24 on the side of the light receiving surface 100A.
  • the antireflection film 25 is made of, for example, silicon nitride (SiN), hafnium oxide (HfO2), aluminum oxide (Al2O3), zirconium oxide (ZrO2), tantalum oxide (Ta2O5), titanium oxide (TiO2), etc.
  • the n-type semiconductor layer 24 also functions as an upper anode electrode among the electrodes sandwiching the n-type semiconductor film 21 from above and below. A predetermined constant voltage Vtop is applied to the anode electrode.
  • the light receiving substrate 100 further has a color filter 26 and an on-chip lens 27 on the antireflection film 25 .
  • The color filter 26 is composed of a plurality of filters 26R that selectively transmit red light, a plurality of filters 26G that selectively transmit green light, and a plurality of filters 26B that selectively transmit blue light.
  • The plurality of filters 26R, 26G, and 26B are provided one for each pixel 11, and are arranged, for example, in a Bayer array within a plane parallel to the light receiving surface 100A. In FIG. 4, the pixel 11 provided with the filter 26R is denoted as 11R, the pixel 11 provided with the filter 26G as 11G, and the pixel 11 provided with the filter 26B as 11B. Note that the color filter 26 may be omitted if necessary.
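  • A minimal sketch of the per-pixel filter assignment in such a Bayer array (the RGGB phase chosen here is an assumption; the patent only states that the filters are arranged in a Bayer array):

```python
# Illustrative sketch: which filter (26R, 26G or 26B) covers pixel (row, col)
# for an assumed RGGB Bayer phase.

def bayer_filter(row: int, col: int) -> str:
    if row % 2 == 0:
        return "26R" if col % 2 == 0 else "26G"
    return "26G" if col % 2 == 0 else "26B"

for r in range(4):
    print(" ".join(bayer_filter(r, c) for c in range(4)))
```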
  • the light receiving substrate 100 further has a passivation layer 28 and an insulating layer 29 below the p-type semiconductor layer 22 and the n-type semiconductor layer 23 .
  • the light receiving substrate 100 further has a connection electrode 31 penetrating the passivation layer 28 and in contact with the p-type semiconductor layer 22 , and a bump electrode 32 penetrating the insulating layer 29 and contacting the connection electrode 31 .
  • One set of connection electrode 31 and bump electrode 32 is provided for each pixel 11 .
  • the bump electrode 32 is bonded to the connection layer 43 of the drive substrate 200 and electrically connected to the connection layer 43 .
  • the bump electrode 32 is bonded to the connection layer 43 of the drive substrate 200, for example, when the light receiving substrate 100 and the drive substrate 200 are bonded together.
  • the driving substrate 200 includes a support substrate 41 and an interlayer insulating layer 42.
  • the support substrate 41 is, for example, a silicon substrate.
  • The interlayer insulating layer 42 is provided between the supporting substrate 41 and the insulating layer 29 (light receiving substrate 100).
  • a plurality of connection layers 43 , a plurality of readout electrodes 44 , a plurality of pixel signal generation circuits 45 and a plurality of wirings 46 are provided in the interlayer insulating layer 42 , for example, in order from the position closer to the light receiving substrate 100 .
  • a plurality of sets of connection layers 43 , readout electrodes 44 , pixel signal generation circuits 45 and wirings 46 are provided for each pixel 11 .
  • The plurality of pixel signal generation circuits 45 in the interlayer insulating layer 42 are provided, for example, in an ROIC (Read Out IC) for reading out charges from each photodiode PD.
  • the logic circuit described above is provided at a portion of the interlayer insulating layer 42 corresponding to the peripheral circuit region 200B.
  • FIG. 5 is a circuit diagram of pixels of an imaging device according to a comparative example.
  • the same components as those of the pixel 11 described above are denoted by the same reference numerals, and detailed description thereof is omitted.
  • In the pixel of the comparative example, the other end of the first capacitive element Csn and the other end of the second capacitive element Cfd are connected to a power supply line to which the voltage VDD2 is applied.
  • Voltage VDD2 is lower than voltage VDD1.
  • FIG. 6 is a timing chart for explaining the operation of the pixel 111 according to the comparative example.
  • FIG. 6 is a timing chart for the case where the imaging device is driven by a global shutter method, which exposes all the pixels 111 in the pixel array section 10 simultaneously, and by a DDS (Double Data Sampling) method having a P-phase period and a D-phase period.
  • FIG. 6 shows the gate voltage VOFG of the discharge transistor OFG, the gate voltage VRST of the reset transistor RST, the gate voltage VTRG of the transfer transistor TRG, the voltage VDR of the power supply line from the power supply VDD to the discharge transistor OFG, the voltage VSN of the node SN, and the waveform of the voltage VFD of the floating diffusion FD.
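  • As a general note (not specific wording from the patent), a DDS readout forms the pixel value as the difference between the sample taken in one phase (signal level) and the sample taken in the other phase (reset level), so any offset common to both samples of a pixel drops out; which phase is sampled first depends on the drive sequence. A minimal sketch:

```python
# Illustrative sketch: the digital-domain difference of the two samples of one pixel
# removes offsets that are common to both phases.

def dds_output(v_d_phase: float, v_p_phase: float) -> float:
    return v_d_phase - v_p_phase

common_offset = 0.03          # e.g. a per-pixel offset present in both phases [V]
v_p = 2.80 + common_offset    # P-phase sample
v_d = 2.55 + common_offset    # D-phase sample
print(f"DDS result: {dds_output(v_d, v_p):+.3f} V")  # the common offset cancels
```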
  • the reset transistor RST is on because the gate voltage VRST is at low level. This resets the potential of the floating diffusion FD to the voltage VDD1.
  • the gate voltage VOFG and the gate voltage VTRG are at high level, so both the discharge transistor OFG and the transfer transistor TRG are in the off state.
  • the gate voltage VTRG is at the low level, so the transfer transistors of all rows are turned on simultaneously. As a result, charges photoelectrically converted by the photodiode PD are transferred to the floating diffusion FD.
  • the gate voltage VOFG and the gate voltage VRST are at high level, so both the discharge transistor OFG and the reset transistor RST are in the off state.
  • Due to the wiring resistance R described above, the voltage VDR<n/2> of the pixels 111 arranged in the central region of the pixel array section 10 may become higher than the voltage VDR<0> of the pixels 111 arranged in the peripheral region separated from the central region in the V direction.
  • Similarly, the voltage VSN<n/2> of the pixels 111 arranged in the central region becomes higher than the voltage VSN<0> of the pixels 111 arranged in the peripheral region.
  • As a result, regarding the voltage VFD indicating the signal amount of the pixel 111 generated in the dark, the voltage VFD<n/2> of the pixels 111 arranged in the central region becomes higher than the voltage VFD<0> of the pixels 111 arranged in the peripheral region. This causes V shading.
  • FIG. 7 is a timing chart for explaining the operation of the pixel 11 according to this embodiment.
  • FIG. 7 is a timing chart in the case of driving the imaging device 1 according to the present embodiment for all the pixels 11 by the global shutter method and the DDS method.
  • the gate voltage VRST is at the low level during the period from time t1 to time t2, so the reset transistor RST is on. This resets the potential of the floating diffusion FD to the voltage VDD1.
  • the gate voltage VTRG is at the low level, so the transfer transistor is in the ON state. As a result, charges photoelectrically converted by the photodiode PD are transferred to the floating diffusion FD.
  • Also in this embodiment, the voltage VDR<n/2> of the pixels 11 arranged in the central region of the pixel array section 10 may become higher than the voltage VDR<0> of the pixels 11 arranged in the peripheral region separated from the central region in the V direction.
  • In the pixel 11 according to the present embodiment, however, the other end of the first capacitive element Csn and the other end of the second capacitive element Cfd are commonly connected to the power supply line to which the voltage VDD1 is applied. That is, the reset voltage (voltage VDD1) and the opposing voltage of the first capacitive element Csn and the second capacitive element Cfd are shared. Thereby, as shown in FIG. 7, the noise component of the reset voltage fluctuation in the dark is canceled.
  • Nevertheless, there is no voltage variation between the voltage VSN<n/2> of the pixels 11 arranged in the central region of the pixel array section 10 and the voltage VSN<0> of the pixels 11 arranged in the peripheral region. Therefore, regarding the voltage VFD in the dark, the voltage fluctuation between the voltage VFD<n/2> of the pixels 11 arranged in the central region and the voltage VFD<0> of the pixels 11 arranged in the peripheral region is also eliminated. As a result, V shading is suppressed, and according to this embodiment, deterioration in image quality can be suppressed.
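  • The following toy numerical model (an idealization added here for illustration, not the patent's own analysis; the resistance, current, and capacitance values are assumptions) shows how tying the far plates of Csn and Cfd to the same line that supplies the reset voltage removes the position-dependent dark offset that remains when the plates are tied to a separate constant supply:

```python
# Toy model: each floating node (SN, FD) is assumed to be coupled only to its
# capacitor's far plate, so after its switch opens it tracks that plate one-to-one;
# the local VDD1 level of a pixel is VDD1 minus an IR drop proportional to the
# row's wiring resistance and the instantaneous line current.

VDD1 = 3.0               # nominal reset supply [V] (assumed)
VDD2 = 2.0               # separate constant supply of the comparative example [V] (assumed)
CSN, CFD = 5e-15, 5e-15  # assumed capacitances of Csn and Cfd [F]

def local_vdd1(r_wire: float, line_current: float) -> float:
    """Local value of the VDD1 line at a pixel, i.e. VDD1 minus the IR drop."""
    return VDD1 - r_wire * line_current

def dark_dds_output(r_wire: float, shared_reference: bool) -> float:
    """Dark-level DDS output (D-phase minus P-phase sample) of one pixel."""
    i_global = 1e-3  # line current while all pixels reset/discharge at once (large)
    i_read = 1e-4    # line current during P-phase and D-phase sampling (small, steady)
    # Far-plate voltage of Csn/Cfd as a function of the instantaneous line current:
    ref = (lambda i: local_vdd1(r_wire, i)) if shared_reference else (lambda i: VDD2)

    v_sn = local_vdd1(r_wire, i_global)        # SN reset level (global operation)
    v_sn += ref(i_read) - ref(i_global)        # floating SN tracks its far plate
    v_fd = local_vdd1(r_wire, i_read)          # FD reset level just before readout
    v_fd_p = v_fd                              # P-phase sample
    v_merge = (CSN * v_sn + CFD * v_fd) / (CSN + CFD)  # transfer = charge sharing (dark)
    v_fd_d = v_merge                           # D-phase sample
    return v_fd_d - v_fd_p

for r in (0.0, 50.0):  # peripheral row (no extra wiring resistance) vs. central row
    sep = dark_dds_output(r, shared_reference=False) * 1e3
    shr = dark_dds_output(r, shared_reference=True) * 1e3
    print(f"R={r:5.1f} ohm  plates at VDD2: {sep:+7.3f} mV  plates at VDD1: {shr:+7.3f} mV")
```

  • In this idealized model the central row shows a dark offset of a few tens of millivolts only when the capacitor plates are referenced to the separate supply, mirroring the contrast between FIG. 6 and FIG. 7.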
  • each pixel 11 is driven by the global shutter method, but may be driven by a rolling shutter method in which the exposure timing differs for each row of the pixel array section 10 .
  • the imaging device 1 according to this embodiment has a laminated structure in which the light receiving substrate 100 made of InGaAs and the driving substrate 200 made of silicon are bonded together.
  • However, the imaging device 1 is not limited to a laminated structure, and may have a single-layer structure in which the circuit elements of the pixels 11 are provided on one substrate.
  • FIG. 8 is a block diagram showing a schematic configuration of an electronic device according to the second embodiment.
  • the electronic device 3 shown in FIG. 8 is, for example, an imaging device such as a digital still camera or a video camera, or an electronic device such as a mobile terminal device such as a smart phone or a tablet terminal.
  • the electronic device 3 includes, for example, an imaging device 140, an optical system 141, a shutter device 142, a DSP circuit 143, a frame memory 144, a display section 145, a storage section 146, an operation section 147 and a power supply section 148.
  • The imaging device 140, the shutter device 142, the DSP circuit 143, the frame memory 144, the display section 145, the storage section 146, the operation section 147, and the power supply section 148 are interconnected via a bus line 149.
  • the imaging device 140 corresponds to the imaging device 1 described in the first embodiment, and outputs image data (digital values) according to incident light.
  • the optical system 141 includes one or more lenses, guides light (incident light) from a subject to the imaging device 1 , and forms an image on the light receiving surface of the imaging device 1 .
  • the shutter device 142 is arranged between the optical system 141 and the imaging device 140 and controls the light irradiation period and the light shielding period for the imaging device 140 .
  • the DSP circuit 143 is a signal processing circuit that processes image data (digital values) output from the imaging device 140 .
  • the frame memory 144 temporarily holds the image data processed by the DSP circuit 143 on a frame-by-frame basis.
  • the display unit 145 is, for example, a panel type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays moving images or still images captured by the imaging device 140 .
  • the storage unit 146 records image data of moving images or still images captured by the imaging device 140 in a recording medium such as a semiconductor memory or a hard disk.
  • the operation unit 147 issues operation commands for various functions of the electronic device 3 in accordance with user's operations.
  • The power supply unit 148 appropriately supplies various types of power serving as operating power to the imaging device 140, the shutter device 142, the DSP circuit 143, the frame memory 144, the display unit 145, the storage unit 146, and the operation unit 147.
  • the operation unit 147 transmits an imaging command to the imaging device 140 .
  • the imaging device 140 performs various settings (for example, the image quality adjustment described above). Subsequently, the image capturing device 140 performs image capturing using a predetermined image capturing method.
  • the imaging device 140 outputs image data obtained by imaging to the DSP circuit 143 .
  • the image data is data for all pixels of pixel signals generated based on the charges temporarily held in the floating diffusion FD.
  • the DSP circuit 143 performs predetermined signal processing (for example, noise reduction processing) based on the image data input from the imaging device 140 .
  • the DSP circuit 143 causes the frame memory 144 to hold the image data subjected to the predetermined signal processing, and the frame memory 144 causes the storage unit 146 to store the image data. In this manner, imaging is performed in the electronic device 3 .
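  • The data flow described above can be sketched as follows (function names are placeholders, not APIs from the patent):

```python
# Illustrative sketch of the flow: imaging command -> capture -> DSP processing
# -> frame memory -> storage.

def capture_frame():                       # stand-in for the imaging device 140
    return [[100, 102], [98, 101]]         # "data for all pixels"

def noise_reduction(frame):                # stand-in for the DSP circuit 143
    flat = [v for row in frame for v in row]
    mean = sum(flat) / len(flat)
    return [[round((v + mean) / 2) for v in row] for row in frame]  # toy smoothing

frame_memory, storage = [], []             # stand-ins for 144 and 146

def on_imaging_command():                  # triggered by the operation unit 147
    processed = noise_reduction(capture_frame())
    frame_memory.append(processed)         # held frame by frame
    storage.append(frame_memory[-1])       # then recorded

on_imaging_command()
print(storage)
```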
  • In the imaging device 140 as well, the reset voltage (voltage VDD1) and the counter voltage of the first capacitive element Csn and the second capacitive element Cfd are shared. This cancels the noise component of the reset voltage fluctuation in the dark. Therefore, voltage fluctuations of pixel signals between the pixels 11 in the pixel array section 10 are eliminated. As a result, V shading is suppressed, so that deterioration of image quality can be avoided.
  • the technology (the present technology) according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 9 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • vehicle control system 12000 includes drive system control unit 12010 , body system control unit 12020 , vehicle exterior information detection unit 12030 , vehicle interior information detection unit 12040 , and integrated control unit 12050 .
  • a microcomputer 12051 , an audio/image output unit 12052 , and an in-vehicle network I/F (Interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050 .
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a brake device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
  • body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
  • the vehicle exterior information detection unit 12030 is connected with an imaging section 12031 .
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • The vehicle exterior information detection unit 12030 may perform, based on the received image, processing for detecting objects such as people, vehicles, obstacles, signs, or characters on the road surface, or processing for detecting the distance to such objects.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
  • The driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or determine whether the driver is dozing off, based on the detection information input from the driver state detection unit 12041.
  • The microcomputer 12051 can calculate control target values for the driving force generator, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or shock mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, and vehicle lane deviation warning.
  • the microcomputer 12051 controls the driving force generator, the steering mechanism, the braking device, etc. based on the information about the vehicle surroundings acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, so that the driver's Cooperative control can be performed for the purpose of autonomous driving, etc., in which vehicles autonomously travel without depending on operation.
  • The microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can control the headlamps according to the position of the preceding vehicle or the oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control aimed at anti-glare, such as switching from high beam to low beam.
  • The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062 and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 10 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the imaging unit 12031 has imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, side mirrors, rear bumper, back door, and windshield of the vehicle 12100, for example.
  • An image pickup unit 12101 provided in the front nose and an image pickup unit 12105 provided above the windshield in the passenger compartment mainly acquire images in front of the vehicle 12100 .
  • Imaging units 12102 and 12103 provided in the side mirrors mainly acquire side images of the vehicle 12100 .
  • An imaging unit 12104 provided in the rear bumper or back door mainly acquires an image behind the vehicle 12100 .
  • the imaging unit 12105 provided above the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 10 shows an example of the imaging range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging devices, or may be an imaging device having pixels for phase difference detection.
  • For example, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative velocity with respect to the vehicle 12100), and can thereby extract, as the preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that runs at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance from the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automatic driving in which the vehicle runs autonomously without relying on the operation of the driver.
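  • A minimal sketch of that extraction logic (the data layout and thresholds are assumptions, not the system's actual interfaces):

```python
# Illustrative sketch: pick, as the preceding vehicle, the closest three-dimensional
# object on the own traveling path that moves in roughly the same direction as the
# vehicle at or above a speed threshold.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrackedObject:
    distance_m: float      # current distance
    speed_kmh: float       # speed derived from the change of distance over time
    same_direction: bool   # heading within a tolerance of the own heading
    on_own_path: bool      # lies on the traveling path of the vehicle

def extract_preceding_vehicle(objects: List[TrackedObject],
                              min_speed_kmh: float = 0.0) -> Optional[TrackedObject]:
    candidates = [o for o in objects
                  if o.on_own_path and o.same_direction and o.speed_kmh >= min_speed_kmh]
    return min(candidates, key=lambda o: o.distance_m, default=None)

objs = [TrackedObject(35.0, 48.0, True, True),
        TrackedObject(12.0, 0.0, False, False),   # oncoming or off-path object
        TrackedObject(60.0, 52.0, True, True)]
print(extract_preceding_vehicle(objs))            # -> the object 35 m ahead
```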
  • For example, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into motorcycles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 judges the collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can perform driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 and the display unit 12062, or by performing forced deceleration and avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not the pedestrian exists in the captured images of the imaging units 12101 to 12104 .
  • Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 serving as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
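  • The pattern-matching step can be sketched, very loosely, as comparing a candidate patch against a silhouette template with normalized cross-correlation (a real ADAS pipeline is far more involved; the template and threshold below are assumptions):

```python
# Illustrative sketch: normalized cross-correlation between a small binary silhouette
# template and an image patch of the same size, thresholded to a yes/no decision.

import numpy as np

def normalized_correlation(patch: np.ndarray, template: np.ndarray) -> float:
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom else 0.0

def looks_like_pedestrian(patch: np.ndarray, template: np.ndarray,
                          threshold: float = 0.7) -> bool:
    return normalized_correlation(patch, template) >= threshold

template = np.array([[0, 1, 0],
                     [1, 1, 1],
                     [0, 1, 0],
                     [0, 1, 0]])   # crude upright-figure silhouette
candidate = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0],
                      [1, 1, 0]])
print(looks_like_pedestrian(candidate, template))
```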
  • The audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to, for example, the imaging units 7910, 7912, 7914, 7916, and 7918 and the vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, and 7930 among the configurations described above.
  • the technology according to the present disclosure can contribute to improving the accuracy of the vehicle control system.
  • Note that the present technology can also take the following configurations.
  • (1) An imaging device comprising a plurality of pixels arranged in a matrix, each of the plurality of pixels including: a photoelectric conversion unit that photoelectrically converts incident light; a discharge transistor that discharges the charge photoelectrically converted by the photoelectric conversion unit; a transfer transistor that transfers the charge photoelectrically converted by the photoelectric conversion unit; a floating diffusion that temporarily holds the charge transferred from the transfer transistor; a reset transistor that resets the potential of the floating diffusion; a first capacitive element having one end connected to the photoelectric conversion unit, the discharge transistor, and the transfer transistor, and the other end connected to a power supply line to which a reset voltage is applied for discharging the charge from the photoelectric conversion unit to the discharge transistor; and a second capacitive element having one end connected to the floating diffusion and the other end commonly connected to the power supply line and the other end of the first capacitive element.
  • (2) The imaging device according to (1), wherein the photoelectric conversion unit is a photodiode containing InGaAs.
  • (3) The imaging device according to (2), wherein a constant voltage is applied to the anode of the photodiode, and the cathode of the photodiode is connected to the one end of the first capacitive element.
  • (4) The imaging device according to (2) or (3), wherein the constant voltage is variable according to the reset voltage.
  • (5) The imaging device according to any one of (1) to (4), wherein the discharge transistor, the transfer transistor, and the reset transistor are P-channel MOS transistors.
  • (6) The imaging device according to (5), wherein a source of the discharge transistor is connected to the power supply line, a drain of the discharge transistor is connected to the one end of the first capacitive element, a source of the transfer transistor is connected to the floating diffusion, a drain of the transfer transistor is connected to the one end of the first capacitive element, a source of the reset transistor is connected to the power supply line, and a drain of the reset transistor is connected to the floating diffusion.
  • (7) The imaging device according to any one of (1) to (6), further comprising a light-receiving substrate provided with the photoelectric conversion unit and a drive substrate provided with the discharge transistor, the transfer transistor, the floating diffusion, the reset transistor, the first capacitive element, and the second capacitive element, wherein the light-receiving substrate and the drive substrate are laminated and bonded to each other.
  • (8) The imaging device according to any one of (1) to (7), wherein the plurality of pixels are driven by a DDS (Double Data Sampling) method and a global shutter method.
  • An electronic device comprising an imaging device having a plurality of pixels arranged in a matrix, each of the plurality of pixels including: a photoelectric conversion unit that photoelectrically converts incident light; a discharge transistor that discharges the charge photoelectrically converted by the photoelectric conversion unit; a transfer transistor that transfers the charge photoelectrically converted by the photoelectric conversion unit; a floating diffusion that temporarily holds the charge transferred from the transfer transistor; a reset transistor that resets the potential of the floating diffusion; a first capacitive element having one end connected to the photoelectric conversion unit, the discharge transistor, and the transfer transistor, and the other end connected to a power supply line to which a reset voltage is applied for discharging the charge from the photoelectric conversion unit to the discharge transistor; and a second capacitive element having one end connected to the floating diffusion and the other end commonly connected to the power supply line and the other end of the first capacitive element.
  • 1: imaging device, 3: electronic device, 11: pixel, 100: light receiving substrate, 200: driving substrate, PD: photodiode, OFG: discharge transistor, TRG: transfer transistor, FD: floating diffusion, RST: reset transistor, Csn: first capacitive element, Cfd: second capacitive element

Abstract

The problem addressed by the present invention is to provide an imaging device capable of suppressing deterioration in image quality. The solution according to the invention is an imaging device provided with a plurality of pixels arranged in a matrix. Each of the plurality of pixels is provided with: a photoelectric conversion unit for photoelectrically converting incident light; a discharge transistor for discharging the charges photoelectrically converted by the photoelectric conversion unit; a transfer transistor for transferring the charges photoelectrically converted by the photoelectric conversion unit; a floating diffusion for temporarily holding the charges transferred from the transfer transistor; a reset transistor for resetting the potential of the floating diffusion; a first capacitive element, one end of which is connected to the photoelectric conversion unit, the discharge transistor, and the transfer transistor, and the other end of which is connected to a power supply line to which a reset voltage is applied for discharging the charges from the photoelectric conversion unit to the discharge transistor; and a second capacitive element, one end of which is connected to the floating diffusion, and the other end of which is connected to the power supply line in common with the other end of the first capacitive element.
PCT/JP2022/040062 2021-12-03 2022-10-27 Dispositif d'imagerie et appareil électronique WO2023100547A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021197002A JP2023082958A (ja) 2021-12-03 2021-12-03 撮像装置および電子機器
JP2021-197002 2021-12-03

Publications (1)

Publication Number Publication Date
WO2023100547A1 true WO2023100547A1 (fr) 2023-06-08

Family

ID=86611843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/040062 WO2023100547A1 (fr) 2021-12-03 2022-10-27 Dispositif d'imagerie et appareil électronique

Country Status (2)

Country Link
JP (1) JP2023082958A (fr)
WO (1) WO2023100547A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140054446A1 (en) * 2012-08-23 2014-02-27 Rambus Inc. Binary Pixel Circuit Architecture
JP2019530321A (ja) * 2016-09-08 2019-10-17 ジーブイビービー ホールディングス エス.エイ.アール.エル. 交差画素相互接続型cmosイメージセンサの動的画素管理のためのシステム及び方法
WO2021106402A1 (fr) * 2019-11-29 2021-06-03 ソニーセミコンダクタソリューションズ株式会社 Dispositif à semi-conducteur, élément de capture d'image et équipement électronique

Also Published As

Publication number Publication date
JP2023082958A (ja) 2023-06-15

Similar Documents

Publication Publication Date Title
US20210313362A1 (en) Solid-state imaging device and electronic apparatus
TWI820078B (zh) 固體攝像元件
US11582416B2 (en) Solid-state image sensor, imaging device, and method of controlling solid-state image sensor
US11252367B2 (en) Solid-stage image sensor, imaging device, and method of controlling solid-state image sensor
WO2020059553A1 (fr) Dispositif d'imagerie à semi-conducteur et appareil électronique
KR20210006273A (ko) 고체 촬상 소자 및 촬상 장치
WO2020195825A1 (fr) Dispositif d'imagerie et appareil électronique
JP2019186738A (ja) 撮像装置
WO2020129657A1 (fr) Capteur et procédé de commande
US11516418B2 (en) Solid-state imaging apparatus
US11503240B2 (en) Solid-state image pickup element, electronic apparatus, and method of controlling solid-state image pickup element
WO2020045142A1 (fr) Dispositif d'imagerie et instrument électronique
KR20210093859A (ko) 고체 촬상 장치 및 전자 기기
US11563913B2 (en) Solid-state imaging element and imaging device
WO2023100547A1 (fr) Dispositif d'imagerie et appareil électronique
US11438534B2 (en) Solid-state imaging device and electronic apparatus
WO2023127110A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2022209649A1 (fr) Système d'imagerie et dispositif d'imagerie
US20220103775A1 (en) Imaging device
WO2024042864A1 (fr) Dispositif d'imagerie
WO2023013178A1 (fr) Dispositif d'imagerie à semi-conducteurs et appareil électronique
WO2024095630A1 (fr) Dispositif d'imagerie
WO2023188868A1 (fr) Capteur linéaire
WO2024042862A1 (fr) Dispositif d'imagerie
WO2023127512A1 (fr) Dispositif d'imagerie et appareil électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22900980

Country of ref document: EP

Kind code of ref document: A1