WO2021234423A1 - Capacitive structures for imaging devices and imaging apparatuses - Google Patents


Info

Publication number
WO2021234423A1
Authority
WO
WIPO (PCT)
Prior art keywords
photoelectric conversion
imaging device
conversion region
floating diffusion
transfer transistor
Application number
PCT/IB2020/000397
Other languages
French (fr)
Inventor
Frederick Brady
Sungin HWANG
Michiel TIMMERMANS
Original Assignee
Sony Semiconductor Solutions Corporation
Application filed by Sony Semiconductor Solutions Corporation filed Critical Sony Semiconductor Solutions Corporation
Priority to PCT/IB2020/000397 priority Critical patent/WO2021234423A1/en
Publication of WO2021234423A1 publication Critical patent/WO2021234423A1/en


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4816: Constructional features, e.g. arrangements of optical elements of receivers alone
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S7/483: Details of pulse systems
    • G01S7/486: Receivers
    • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S7/491: Details of non-pulse systems
    • G01S7/4912: Receivers
    • G01S7/4915: Time delay measurement, e.g. operational details for pixel components; Phase measurement

Definitions

  • Example embodiments are directed to capacitive structures for imaging devices and imaging apparatuses, and methods for operating the same.
  • Imaging devices are used in many applications, including depth sensing for object tracking, environment rendering, etc. Imaging devices used for depth sensing may employ time-of-flight (ToF) principles to detect a distance to an object or objects within a scene.
  • a ToF depth sensor includes a light source and an imaging device including a plurality of pixels for sensing reflected light.
  • the light source emits light (e.g., infrared light) toward an object or objects in the scene, and the pixels detect the light reflected from the object or objects.
  • the elapsed time between the initial emission of the light and receipt of the reflected light by each pixel may correspond to a distance from the object or objects.
  • Direct ToF imaging devices may measure the elapsed time itself to calculate the distance while indirect ToF imaging devices may measure the phase delay between the emitted light and the reflected light and translate the phase delay into a distance.
  • the depth values of the pixels are then used by the imaging device to determine a distance to the object or objects, which may be used to create a three dimensional scene of the captured object or objects.
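  • As a short worked illustration (not part of the original text), the direct ToF relationship between round-trip time and distance is:

\[
d = \frac{c\,\Delta t}{2}, \qquad \text{e.g. } \Delta t = 10\ \text{ns} \;\Rightarrow\; d = \frac{(3\times 10^{8}\ \text{m/s})(10\times 10^{-9}\ \text{s})}{2} = 1.5\ \text{m}.
\]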
  • Example embodiments relate to imaging devices, imaging apparatuses, and methods thereof that improve quantum efficiency and/or increase charge saturation levels.
  • an imaging device includes a first photoelectric conversion region, a first transfer transistor coupled to the first photoelectric conversion region, a second transfer transistor coupled to the first photoelectric conversion region, and a capacitive structure.
  • the capacitive structure includes a first conductive structure overlapping the first photoelectric conversion region, a second conductive structure overlapping the first photoelectric conversion region, and an insulating material between the first conductive structure and the second conductive structure.
  • the first conductive structure includes a portion of the first photoelectric conversion region.
  • the second conductive structure includes metal.
  • the second conductive structure includes polysilicon.
  • At least one of the first conductive structure, the second conductive structure, and the insulating material have an uneven surface (e.g., a plurality of structures arranged at regular intervals).
  • the second conductive structure overlaps a majority of the first photoelectric conversion region in a plan view.
  • the imaging device includes an isolation structure surrounding the capacitive structure, the first photoelectric conversion region, the first transfer transistor, and the second transfer transistor in the plan view.
  • the imaging device includes an overflow transistor coupled to the first photoelectric conversion region.
  • the second conductive structure includes a first conductor and a second conductor spaced apart from the first conductor.
  • the imaging device includes a first floating diffusion region coupled to the first transfer transistor, a second floating diffusion region, a third transfer transistor coupled between the first floating diffusion region and the second floating diffusion region, a third floating diffusion region coupled to the second transfer transistor, a fourth floating diffusion region, and a fourth transfer transistor coupled between the third floating diffusion region and the fourth floating diffusion region.
  • the imaging device includes a first wiring electrically connecting the first conductor to the second floating diffusion region, and a second wiring electrically connecting the second conductor to the fourth floating diffusion region.
  • the imaging device includes an isolation structure that surrounds the first floating diffusion region, the second floating diffusion region, the third floating diffusion region, the fourth floating diffusion region, the first transfer transistor, and the second transfer transistor.
  • the first photoelectric conversion region is disposed in a semiconductor substrate, and the isolation structure extends from a first surface of the semiconductor substrate toward a second surface of the semiconductor substrate. In a plan view, an edge of the second conductive structure is parallel with an edge of the first photoelectric conversion region.
  • the first conductive structure includes an anode of the first photoelectric conversion region.
  • an imaging device includes a first photoelectric conversion region including a first surface and a second surface opposite the first surface.
  • the first surface is a light incident surface.
  • the imaging device includes a first transfer transistor coupled to the first photoelectric conversion region, a first floating diffusion region coupled to the first transfer transistor, a second transfer transistor coupled to the first photoelectric conversion region, a second floating diffusion region coupled to the second transfer transistor, and a structure including a portion of the first photoelectric conversion region at the second surface, and an insulating material on the portion of the first photoelectric conversion region and further away from the first surface than the portion of the first photoelectric conversion region.
  • the portion of the first photoelectric conversion region and the insulating material have uneven surfaces.
  • the imaging device includes a conductive structure on the insulating material.
  • the uneven surfaces are triangular shaped in a cross sectional view.
  • the imaging device includes an isolation structure surrounding the conductive structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in the plan view.
  • an imaging device includes a first photoelectric conversion region disposed in a semiconductor substrate, a first transfer transistor coupled to the first photoelectric conversion region, a second transfer transistor coupled to the first photoelectric conversion region, and a capacitive structure overlapping the first photoelectric conversion region.
  • the capacitive structure includes a first conductor overlapping the first photoelectric conversion region, a second conductor overlapping the first photoelectric conversion region, a third conductor spaced apart from and electrically isolated from the second conductor; and an insulating material sandwiched between the first conductor and the second and third conductors.
  • Fig. 1 is a block diagram of an imaging device according to at least one example embodiment
  • FIG. 2 illustrates an example schematic of a pixel from Fig. 1 according to at least one example embodiment
  • Fig. 3 illustrates a layout of a pixel in a plan view according to at least one example embodiment
  • Fig. 4 illustrates a cross sectional view of a pixel according to at least one example embodiment
  • Fig. 5 illustrates a cross sectional view of a pixel according to at least one example embodiment
  • Fig. 6 illustrates a cross sectional view of a pixel according to at least one example embodiment
  • Fig. 7 illustrates a cross sectional view of a pixel according to at least one example embodiment
  • Fig. 8 illustrates an example schematic of a pixel according to at least one example embodiment
  • Fig. 9 is a timing chart for explaining a detection method according to at least one example embodiment
  • Fig. 10 is a timing chart for detection and readout signals according to at least one example embodiment
  • Fig. 11 is a block diagram of a ranging module (or ranging device) according to at least one example embodiment.
  • Fig. 12 is a diagram illustrating use examples of an imaging device according to at least one example embodiment.
  • Fig. 1 is a block diagram of an imaging device according to at least one example embodiment.
  • the imaging device 1 shown in Fig. 1 may be an image sensor of a rear surface irradiation type, and is provided, for example, in an imaging apparatus having a ranging function (or distance measuring function).
  • the imaging device 1 has a pixel array unit (or pixel array or pixel section) 20 formed on a semiconductor substrate (not shown) and a peripheral circuit integrated on the same semiconductor substrate as the pixel array unit 20.
  • the peripheral circuit includes, for example, a tap driving unit (or tap driver) 21 (which may be horizontally or vertically arranged), a vertical driving unit (or vertical driver) 22, a column processing unit (or column processing circuit) 23, a horizontal driving unit (or horizontal driver) 24, and a system control unit (or system controller) 25.
  • the imaging device 1 is further provided with a signal processing unit (or signal processor) 31 and a data storage unit (or data storage or memory) 32.
  • the signal processing unit 31 and the data storage unit 32 may be mounted on the same substrate as the imaging device 1 or may be disposed on a substrate separate from the imaging device 1 in the imaging apparatus.
  • the pixel array unit 20 has a configuration in which pixels 51 that generate charge corresponding to a received light amount and output a signal corresponding to the charge are two-dimensionally disposed in a matrix shape of a row direction and a column direction. That is, the pixel array unit 20 has a plurality of pixels 51 that perform photoelectric conversion on incident light and output a signal corresponding to charge obtained as a result.
  • the row direction refers to an arrangement direction of the pixels 51 in a horizontal direction
  • the column direction refers to the arrangement direction of the pixels 51 in a vertical direction.
  • the row direction is a horizontal direction in the figure
  • the column direction is a vertical direction in the figure.
  • the pixel 51 receives light incident from the external environment, for example, infrared light, performs photoelectric conversion on the received light, and outputs a pixel signal according to charge obtained as a result.
  • the pixel 51 has a first tap TA that detects charge obtained by the photoelectric conversion by applying a predetermined voltage (first voltage) as signals GDA/C, and a second tap TB that detects charge obtained by the photoelectric conversion by applying a predetermined voltage (second voltage) as signals GDB/D.
  • the tap driving unit 21 supplies signals GDA/C to the first tap TA of each of the pixels 51 of the pixel array unit 20 through a predetermined voltage supply line 30, and supplies the signals GDB/D to the second tap TB thereof through the predetermined voltage supply line 30. Therefore, two voltage supply lines 30 including the voltage supply line 30 that transmits the voltage GDA/C and the voltage supply line 30 that transmits the voltage GDB/D are wired to one pixel column of the pixel array unit 20.
  • In the pixel array unit 20, with respect to the pixel array of the matrix shape, a pixel drive line 28 is wired along a row direction for each pixel row, and two vertical signal lines 29 are wired along a column direction for each pixel column.
  • the pixel drive line 28 transmits a drive signal for driving when reading a signal from the pixel.
  • although Fig. 1 shows one wire for the pixel drive line 28, the pixel drive line 28 is not limited to one.
  • One end of the pixel drive line 28 is connected to an output end corresponding to each row of the vertical driving unit 22.
  • the vertical driving unit 22 includes a shift register, an address decoder, or the like.
  • the vertical driving unit 22 drives each pixel of all pixels of the pixel array unit 20 at the same time, or in row units, or the like. That is, the vertical driving unit 22 includes a driving unit that controls operation of each pixel of the pixel array unit 20, together with the system control unit 25 that controls the vertical driving unit 22.
  • the signals output from each pixel 51 of a pixel row in response to drive control by the vertical driving unit 22 are input to the column processing unit 23 through the vertical signal line 29.
  • the column processing unit 23 performs a predetermined signal process on the pixel signal output from each pixel 51 through the vertical signal line 29 and temporarily holds the pixel signal after the signal process.
  • the column processing unit 23 performs a noise removal process, an analog to digital (AD) conversion process, and the like as the signal process.
  • the horizontal driving unit 24 includes a shift register, an address decoder, or the like, and sequentially selects unit circuits corresponding to pixel columns of the column processing unit 23.
  • the column processing unit 23 sequentially outputs the pixel signals obtained through the signal process for each unit circuit, by a selective scan by the horizontal driving unit 24.
  • the system control unit 25 includes a timing generator or the like that generates various timing signals and performs drive control on the tap driving unit 21, the vertical driving unit 22, the column processing unit 23, the horizontal driving unit 24, and the like, on the basis of the various generated timing signals.
  • the signal processing unit 31 has at least a calculation process function and performs various signal processes such as a calculation process on the basis of the pixel signal output from the column processing unit 23.
  • the data storage unit 32 temporarily stores data necessary for the signal processing in the signal processing unit 31.
  • the signal processing unit 31 may control overall functions of the imaging device 1. For example, the tap driving unit 21, the vertical driving unit 22, the column processing unit 23, the horizontal driving unit 24, the system control unit 25, and the data storage unit 32 may be under control of the signal processing unit 31.
  • Fig. 2 illustrates an example schematic of a pixel 51 from Fig. 1.
  • the pixel 51 includes a photoelectric conversion region PD, such as a photodiode or other light sensor, transfer transistors TG0 and TG1, floating diffusion regions FD0 and FD1, reset transistors RST0 and RST1, amplification transistors AMP0 and AMP1, and selection transistors SEL0 and SEL1.
  • the pixel 51 may further include an overflow transistor OFG, transfer transistors FDG0 and FDG1, floating diffusion regions FD2 and FD3, and a capacitive structure including capacitors CAP0 and CAP1.
  • the pixel 51 may be driven according to tap driving signals GDA/C and GDB/D applied to gates of transfer transistors TG0/TG1, reset signal RSTDRAIN, overflow signal
  • Figs. 9 and 10 and related discussion below set forth additional details for driving the pixel 51.
  • the transfer transistors TG0 and TG1 are coupled to the photoelectric conversion region PD and have gates (or taps) that receive tap driving signals GDA/GDC (abbreviated as GDA/C) and GDB/GDD (abbreviated as GDB/D), where the last letters A, B, C, and D represent different phases of a tap driving signal relative to a phase of a modulated signal from a light source.
  • tap driving signals GDA/GDC may refer to two signals GDA and GDC, where signal GDA has a 0 degree phase shift from a light source signal, and where signal GDC has a 180 degree phase shift from the light source signal.
  • tap driving signals GDB/D may refer to two signals GDB and GDD, where signal GDB has a 90 degree phase shift from the light source signal, and where signal GDD has a 270 degree phase shift from the light source signal.
  • the tap driving signals GDA/C and GDB/D may be applied in a manner that allows for depth information to be captured in a desired number of frames (e.g., one frame, two frames, four frames, etc.).
  • Fig. 2 further illustrates that capacitor CAP0 is electrically connected between an anode AN and floating diffusion region FD2, and that capacitor CAP1 is electrically connected between the anode AN and floating diffusion region FD3. Additional details of the capacitors CAP0/CAP1 and the anode AN are discussed in more detail below with reference to Figs. 3-8.
  • the transfer transistors FDG0/FDG1 and floating diffusions FD2/FD3 are included to expand the charge capacity of the pixel 51, if desired. However, these elements may be omitted or not used, if desired.
  • the overflow transistor OFG is included to transfer overflow charge from the photoelectric conversion region PD, but may be omitted or unused if desired.
  • Fig. 3 illustrates a layout of a pixel 51 from Figs. 1 and 2 in a plan view according to at least one example embodiment.
  • the photoelectric conversion region PD is in a central region of the pixel 51 and surrounded by transistors.
  • amplification transistor AMP0 and selection transistor SEL0 may be located at one side of the photoelectric conversion region PD and coupled to floating diffusion region FD0 while amplification transistor AMP1 and selection transistor SEL1 may be located on an opposite side of the photoelectric conversion region PD and coupled to floating diffusion region FD1.
  • transistors AMP0 and SEL0 are aligned with one another in a vertical direction
  • transistors AMP1 and SEL1 are aligned with one another in the vertical direction.
  • Transistors RST0 and FDG0, and transistors RST1 and FDG1 have similar alignments.
  • the photoelectric conversion region PD has eight sides and the transfer transistors TG0 and TG1 are located at a same end of but on different sides of the photoelectric conversion region PD.
  • the overflow transistor OFG is located at an opposite end of the photoelectric conversion region. As shown, at least portions of transistors TG0, TG1, and OFG may overlap parts of the photoelectric conversion region PD.
  • Fig. 3 further illustrates a conductive structure including conductors CD0/CD1 that overlap the photoelectric conversion region PD.
  • the conductive structures CD0 and CD1 are cathodes of the capacitors CAP0 and CAP1.
  • the conductive structure including conductors CD0 and CD1 may overlap a majority of the photoelectric conversion region PD.
  • the conductors CD0 and CD1 are spaced apart and electrically isolated from one another (e.g., by an insulating material).
  • unlabeled transistor structures in Fig. 3 may correspond to transistors of neighboring pixels 51.
  • the pixel 51 may include one or more wiring layers (e.g., made of metal) to make electrical connections between the photoelectric conversion region PD, the transistors, and other elements of an imaging device.
  • Fig. 3 illustrates portions of a wiring layer M1, which includes conductive portions that couple the conductor CD0 to floating diffusion region FD2 and that couple conductor CD1 to floating diffusion region FD3.
  • Alternatively, M1 may couple conductors CD0 and CD1 to floating diffusion regions FD0 and FD1, respectively.
  • Example embodiments are not limited to the layout and shapes illustrated in Fig. 3, and the layout and element shapes in Fig. 3 may be varied if desired.
  • Fig. 4 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment.
  • the cross sectional view in Fig. 4 is taken along line III-III in Fig. 3.
  • the pixel 51 may include a substrate SUB.
  • the substrate SUB may be a semiconductor substrate with the photoelectric conversion region PD disposed therein.
  • the photoelectric conversion region PD is a photodiode.
  • photoelectric conversion region PD may include a portion of a first conductivity type
  • the pixel 51 includes a gate G0 of transfer transistor TG0 and a gate G1 of transfer transistor TG1, both disposed in the substrate SUB and coupled between the photoelectric conversion region PD and respective floating diffusion regions FD0 and FD1.
  • the gates G0 and G1 may correspond to taps TA and TB, respectively, in Fig. 1, and include a conductor, such as metal, polysilicon, and/or the like.
  • the gates G0 and G1 may penetrate the photoelectric conversion region PD, or alternatively, be arranged in proximity to the photoelectric conversion region PD.
  • the floating diffusion regions FD0/FD1 may include doped portions (e.g., N type portions) of the substrate SUB.
  • a gate of the overflow transistor OFG may have a same or similar structure to the gates G0 and G1 (e.g., a vertical gate structure).
  • the pixel 51 further includes a capacitive structure CS comprised of capacitors CAP0 and CAP1.
  • the capacitive structure CS includes a conductive structure comprised of conductors CD0/CD1, insulating material INS1, and a conductive structure comprised of the anode AN.
  • Capacitor CAP0 includes the anode AN, the insulating material INS, and conductor CD0 while capacitor CAP1 includes the anode AN, the insulating material INS, and conductor CD1.
  • the capacitors CAPO and CAPl may be used to increase a charge capacity of the floating diffusion regions FD0/FD1 of the pixel 51.
  • conductors CD0/CD1 include metal, for example, reflective metal. Conductors CD0/CD1 formed of reflective metal may improve quantum efficiency of the pixel 51 because light incident to the conductors CD0/CD1 is reflected back toward the photoelectric conversion region PD.
  • conductors CD0/CD1 include polysilicon, which may increase a capacitance value of the overall capacitive structure CS compared to similarly sized metal conductors CD0/CD1.
  • conductors CD0 and CD1 are spaced apart and electrically insulated from one another by, for example, an insulating material (not explicitly shown in Fig. 4).
  • the anode AN may include a conductor, such as doped semiconductor material, for example, a doped portion of the substrate SUB separate from the photoelectric conversion region PD.
  • the anode AN may be doped with N-type or P-type impurities depending on design preference.
  • the insulating material INS may include silicon oxide, silicon dioxide, and/or other suitable insulator. As shown in Fig. 4, the insulating material INS may span across an entire surface of the substrate SUB. Alternatively, the insulating material INS spans across a partial portion of the entire surface of the substrate SUB.
  • the capacitive structure CS may be formed to have a desired capacitance value and/or reflectance that is set based on empirical evidence and/or design preference.
  • the pixel 51 may further include one or more intermediate layers IL and a microlens LENS on the one or more intermediate layers IL.
  • the one or more intermediate layers IL may include a light filter (e.g., color filter, an infrared filter, etc.) and/or other elements desired for the pixel 51.
  • Fig. 5 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment.
  • the cross sectional view in Fig. 5 is taken along line III-III in Fig. 3.
  • Fig. 5 includes many of the same elements as Fig. 4, and as such, a description of these elements will not be repeated.
  • Fig. 5 further includes an isolation structure IS1 at a periphery of the pixel 51 to reflect and/or absorb light to effectively block light from entering neighboring pixels 51.
  • the isolation structure IS1 may include an insulation material, such as an oxide. Additionally or alternatively, the isolation structure IS1 may include a light blocking material, such as tungsten, aluminum, and/or the like.
  • the insulating material INS and the isolation structure IS1 both include an oxide, such as silicon oxide, silicon dioxide, etc.
  • a thickness of the isolation structure IS1 may be greater than a thickness of the insulating material INS.
  • the isolation structure IS1 is formed from one surface of the substrate SUB to an opposite surface of the substrate SUB.
  • example embodiments are not limited thereto, and the isolation structure IS1 may be formed to a desired depth within the substrate SUB, or the isolation structure IS1 may be formed through the substrate SUB and into the one or more intermediate layers IL if desired.
  • Fig. 6 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment.
  • the cross sectional view in Fig. 6 is taken along line III-III in Fig. 3.
  • Fig. 6 includes many of the same elements as Figs. 4 and 5, and as such, a description of these elements will not be repeated.
  • Fig. 6 illustrates uneven surfaces, for example, structures that have triangular cross sectional shapes arranged periodically or at desired intervals, for the anode AN and/or the capacitors CAP0/CAP1.
  • the capacitive structure CS includes the anode AN with an uneven surface, conductors CD0/CD1 with uneven surfaces, and an insulating material INS1 sandwiched between the uneven surface of the anode AN and the uneven surfaces of the conductors CD0/CD1.
  • the insulating material INS1 may include the same material as the insulating material INS, for example, silicon oxide, silicon dioxide, etc.
  • a thickness of the insulating material INS may be greater than a thickness of the insulating material INS1.
  • the uneven surfaces of the anode AN and conductors CD0/CD1 may include substantially triangularly shaped structures arranged at a substantially same pitch.
  • conductors CD0 and CD1 in Fig. 6 are still spaced apart and electrically isolated from one another.
  • example embodiments are not limited to both of the anode AN and the conductors CD0/CD1 having uneven surfaces.
  • the anode AN may have an uneven surface while the conductors CD0/CD1 have flat surfaces.
  • the anode AN may have flat surfaces while the conductors CD0/CD1 have uneven surfaces.
  • both surfaces of the anode AN may be uneven and both surfaces of the conductors CD0/CD1 may be uneven.
  • the insulating material INS1 may conform to the surfaces of the anode AN and conductors CD0/CD1.
  • the capacitive structure CS of Fig. 6 may have an increased effective surface area compared to the capacitive structure CS of Figs. 4 and 5, which may result in an increased capacitance. Additionally, the uneven surfaces of the anode AN, the insulating material INS1, and/or the conductors CD0/CD1 may diffract photons, thereby increasing photon paths to increase quantum efficiency. In at least one example embodiment, the conductors CD0/CD1 may be omitted from the pixel 51 if increased capacitance is not desired.
  • a surface of the substrate SUB closest to the microlens LENS and/or at least one of the one or more intermediate layers IL may include a same or similar uneven surface, for example, that covers an entire surface of the photoelectric conversion region PD.
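  • Although the patent text gives no numbers, the capacitance benefit of the corrugated plates can be sketched with the parallel-plate relation below; the triangular-corrugation area factor is a purely geometric estimate for grooves of height h and pitch p, not a value taken from the application:

\[
C \;\approx\; \frac{\varepsilon_0 \varepsilon_r\, A_{\text{eff}}}{d_{\text{ins}}}, \qquad
A_{\text{eff}} \;\approx\; A_{\text{flat}} \sqrt{1 + \left(\frac{2h}{p}\right)^{2}},
\]

  so for the same footprint and insulator thickness, the larger effective facing area of the uneven anode AN, insulating material INS1, and conductors CD0/CD1 yields a larger capacitance.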
  • Fig. 7 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment. As in Figs. 4-6, the cross sectional view in Fig. 7 is taken along line III-III in Fig. 3. Fig. 7 includes many of the same elements as Fig. 4, and as such, a description of these elements will not be repeated. Compared to Fig. 4, the anode AN of Fig. 7 includes a portion of the photoelectric conversion region PD that abuts the insulating material INS.
  • Fig. 8 illustrates an example schematic of a pixel 51 according to at least one example embodiment.
  • Fig. 8 includes many of the same elements as Fig. 3, and as such, a description of these elements will not be repeated.
  • Fig. 8 further includes an isolation structure IS2 that surrounds elements disposed in the substrate SUB.
  • the isolation structure IS2 may surround the floating diffusion regions FD0, FD1, FD2, and FD3, and the transfer transistors TG0 and TG1.
  • the isolation structure IS2 may be formed through the substrate SUB, or to a desired depth within the substrate SUB.
  • the transistors AMP0/AMP1 and SEL0/SEL1 may be formed in a wiring layer that is on the insulating material INS1, and as such, these elements are not surrounded by the isolation structure IS2.
  • the isolation structure IS2 may exist with or without the isolation structure IS1 from Fig. 6, and the isolation structure IS1 may exist without isolation structure IS2.
  • example embodiments shown and described with reference to Figs. 1-8 may be combined in any desired manner according to design preferences.
  • various elements from the pixel designs shown in Figs. 2-8 may be combined with another if desired.
  • various elements from the pixel designs shown in Figs. 2-8 may be omitted or unused according to design preferences.
  • one or more of the uneven surfaces depicted and described with reference to Fig. 6 may be included with the structure(s) of Fig. 4 and/or Fig. 7.
  • the isolation structure IS2 in Fig. 8 may be applied to one or more of the pixel designs in Figs. 4-7 according to design preferences.
  • an imaging device 1 includes a pixel 51.
  • Imaging device 1 includes a pixel 51, a first transfer transistor TG0 coupled to a first photoelectric conversion region PD, and a second transfer transistor TG1 coupled to the photoelectric conversion region PD.
  • the pixel 51 further includes a capacitive structure CS including a first conductive structure including anode AN overlapping the first photoelectric conversion region PD, and a second conductive structure including conductors CD0/CD1 overlapping the first photoelectric conversion region PD.
  • the capacitive structure CS includes an insulating material INS1 between the first conductive structure AN and the second conductive structure CD0/CD1.
  • the first conductive structure AN includes a portion of the first photoelectric conversion region PD (see Fig. 7).
  • second conductive structure CD0/CD1 may include metal and/or polysilicon.
  • at least one of the first conductive structure AN, the second conductive structure CD0/CD1, and the insulating material INS1 may have an uneven surface, for example, with structures arranged at desired intervals.
  • the second conductive structure CD0/CD1 overlaps a majority of the first photoelectric conversion region PD in a plan view (see Fig. 3, for example).
  • the imaging device 1 includes an isolation structure IS1 surrounding the capacitive structure CS, the first photoelectric conversion region PD, the first transfer transistor TG0, and the second transfer transistor TG1 in the plan view.
  • the imaging device 1 may further include an overflow transistor OFG coupled to the first photoelectric conversion region PD.
  • the second conductive structure includes a first conductor CD0 and a second conductor CD1 spaced apart from the first conductor CD0.
  • the imaging device 1 may include a first floating diffusion region FD0 coupled to the first transfer transistor TG0, a second floating diffusion region FD2, and a third transfer transistor FDG0 coupled between the first floating diffusion region FD0 and the second floating diffusion region FD2.
  • the imaging device 1 includes a third floating diffusion region FD1 coupled to the second transfer transistor TG1, a fourth floating diffusion region FD3, and a fourth transfer transistor FDG1 coupled between the third floating diffusion region FD1 and the fourth floating diffusion region FD3.
  • the imaging device 1 may include a first wiring (a first portion of M1) electrically connecting the first conductor CD0 to the second floating diffusion region FD2, and a second wiring (a second portion of M1) electrically connecting the second conductor CD1 to the fourth floating diffusion region FD3.
  • the imaging device 1 includes an isolation structure IS2 that surrounds the first floating diffusion region FD0, the second floating diffusion region FD2, the third floating diffusion region FD1, the fourth floating diffusion region FD3, the first transfer transistor TG0, and the second transfer transistor TG1.
  • the first photoelectric conversion region PD is disposed in a semiconductor substrate SUB, and the isolation structure IS2 extends from a first surface of the semiconductor substrate SUB toward a second surface of the semiconductor substrate SUB.
  • one or more edges of the second conductive structure CD0/CD1 are parallel with one or more edges of the first photoelectric conversion region PD (see Fig. 3).
  • the first conductive structure includes the anode AN of the first photoelectric conversion region PD.
  • an imaging device 1 includes a first photoelectric conversion region PD disposed in a substrate SUB including a first surface and a second surface opposite the first surface. The first surface is a light incident surface of the substrate SUB.
  • the imaging device 1 includes a first transfer transistor TG0 coupled to the first photoelectric conversion region, a first floating diffusion region FD0 coupled to the first transfer transistor TG0, a second transfer transistor TG1 coupled to the first photoelectric conversion region PD, and a second floating diffusion region FD1 coupled to the second transfer transistor TG1.
  • the imaging device 1 includes a structure.
  • the structure includes a portion AN of the first photoelectric conversion region PD at the second surface of the substrate SUB, and an insulating material INS1 on the portion AN of the first photoelectric conversion region PD and further away from the first surface than the portion AN of the first photoelectric conversion region PD.
  • the portion AN of the first photoelectric conversion region PD and the insulating material INS1 have uneven surfaces, for example, structures arranged at desired intervals.
  • the imaging device 1 further includes a conductive structure CD0/CD1 on the insulating material INS1. The uneven surfaces are triangular shaped in a cross sectional view.
  • the imaging device further includes an isolation structure IS1 surrounding the structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in the plan view.
  • an imaging device 1 includes a first photoelectric conversion region PD disposed in a semiconductor substrate SUB, a first transfer transistor TG0 coupled to the first photoelectric conversion region PD, a second transfer transistor TG1 coupled to the first photoelectric conversion region PD, and a capacitive structure CS overlapping the first photoelectric conversion region PD.
  • the capacitive structure CS includes a first conductor AN overlapping the first photoelectric conversion region PD, a second conductor CD0 overlapping the first photoelectric conversion region PD, a third conductor CD1 spaced apart from and electrically isolated from the second conductor CD0, and an insulating material INS1 sandwiched between the first conductor AN and the second and third conductors CD0/CD1.
  • Fig. 9 is a timing chart for explaining a detection method according to at least one example embodiment.
  • Fig. 10 is a timing chart for detection and readout signals according to at least one example embodiment.
  • the imaging device 1 detects received light at the first tap TA with four detection signals.
  • the four detection signals include a first signal having a same phase (that is, Phase0) as that of the emitted light, a second signal having a phase shifted by 90 degrees (Phase90) from the emitted light, a third signal having a phase shifted by 180 degrees (Phase180) from the emitted light, and a fourth signal having a phase shifted by 270 degrees (Phase270) from the emitted light.
  • signals Phase0, Phase90, Phase180, and Phase270 may correspond to signals GDA, GDB, GDC, and GDD, respectively, in Fig. 2.
  • signal values detected at Phase0, Phase90, Phase180, and Phase270 of the first tap TA are q0A, q1A, q2A, and q3A, respectively.
  • The distance to the object is calculated, for example, by Equation (1):
  • C is the speed of light
  • DT is the time delay
  • fmod is the modulation frequency of the emitted light
  • φ0 to φ3 are the signal values q0A, q1A, q2A, and q3A detected with signals Phase0, Phase90, Phase180, and Phase270, respectively
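  • The body of Equation (1) does not survive in this extracted text. A standard four-phase indirect ToF form that is consistent with the variables defined above, offered here only as a reconstruction rather than the exact equation of the application, is:

\[
d \;=\; \frac{C \cdot DT}{2}, \qquad
DT \;=\; \frac{Q}{2\pi f_{mod}}, \qquad
Q \;=\; \arctan\!\left(\frac{\varphi_1 - \varphi_3}{\varphi_0 - \varphi_2}\right),
\]

  where the exact sign convention of the arctangent arguments depends on how the four phases are defined.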
  • the signal values q0A, q1A, q2A, and q3A are obtained and the phase shift amount Q corresponding to the time DT is obtained by the distribution ratio
  • the signal values q0A, q1A, q2A, and q3A are obtained at respective timings for the four phases, and then the distribution ratio of each signal is obtained to calculate the phase shift amount Q.
  • a charge corresponding to the signal value q3A of the phase Phase270 (φ3) advanced by 270 degrees from the irradiation light is accumulated, and then, in time t7 to t8, the signal processing unit 31 reads the accumulated charge (Read) and calculates the signal value q3A.
  • the signal processing unit 31 obtains the distance to the object on the basis of the obtained signal values q0A, q1A, q2A, and q3A of the four phases.
  • Example embodiments are not limited to the above method for obtaining signal values q0A, q1A, q2A, and q3A.
  • the imaging device 1 may accumulate the charge of each of the four phases in one block of time, and then read the accumulated signal values q0A, q1A, q2A, and q3A in a subsequent block of time.
  • these methods result in depth information being captured in four frames (one frame for each signal value).
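  • As a rough, non-authoritative sketch of how the per-pixel computation above could look in software (the function name, the NumPy dependency, and the arctangent convention are assumptions for illustration, not taken from the application):

```python
# Hypothetical sketch of the four-phase indirect-ToF depth calculation.
# Variable names mirror the signal values q0A..q3A in the text; the
# arctangent convention is one common choice and may differ from Equation (1).
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def depth_from_four_phases(q0a, q1a, q2a, q3a, fmod=20e6):
    """Return per-pixel distance [m] from samples at 0/90/180/270 degree phases."""
    # Phase shift of the reflected light relative to the emitted light.
    phase = np.arctan2(q1a - q3a, q0a - q2a)
    phase = np.mod(phase, 2.0 * np.pi)       # wrap into [0, 2*pi)
    delay = phase / (2.0 * np.pi * fmod)     # time delay DT [s]
    return C * delay / 2.0                   # halve for the round trip

# Usage with a synthetic 1.0 rad phase shift (amplitude 100, offset 50):
theta = 1.0
q = [100.0 * np.cos(theta - k * np.pi / 2.0) + 50.0 for k in range(4)]
print(depth_from_four_phases(*q))  # ~1.19 m at fmod = 20 MHz
```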
  • depth information may be captured in fewer frames if desired by utilizing both taps TA and TB in each pixel 51 and applying detection signals accordingly.
  • depth information may be captured in a single frame by applying two of the detection signals to taps TA and TB of each pixel in a 2x2 group of pixels (e.g., Phase0 to tap TA and Phase180 to tap TB of pixel 1, Phase90 to tap TA and Phase270 to tap TB of pixel 2, Phase270 to tap TA and Phase90 to tap TB of pixel 3, and Phase180 to tap TA and Phase0 to tap TB of pixel 4).
  • depth information may be captured in two frames if desired by applying detection signals to both taps TA and TB of each pixel to capture signal values q0A and q3A in a first frame and signal values q1A and q2A in a second frame.
  • the phase of the pulsed light source may be shifted compared to the first frame, for example, by 90 degrees or 270 degrees.
  • each second frame has a phase of the detection signals (in relation to the light source) adjusted for the taps of each pixel compared to the first frame.
  • a pixel 51 in the first frame may receive detection signals having phases of 0 degrees and 180 degrees at respective taps (e.g., Phase0 at tap TA and Phase180 at tap TB) of the pixel 51 while in the second frame the pixel 51 may receive detection signals having phases of 180 degrees and 0 degrees at the respective taps (e.g., Phase180 at tap TA and Phase0 at tap TB).
  • the phase of the pulsed light source in the second frame may be shifted 180 degrees from the first frame.
  • the same detection signals applied to the taps TA and TB in the first frame may be applied to the taps TA and TB in the second frame (i.e., the detection signals applied in the second frame are not shifted compared to the first frame).
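  • The tap/phase assignments described above can be summarized in small lookup tables; the sketch below is illustrative only (the dictionary layout and names are assumptions, while the phase values follow the examples given in the text):

```python
# Hypothetical lookup tables for the tap/phase assignments described above.
# Pixel indices 1-4 refer to positions in a 2x2 group; names are illustrative only.

# One-frame capture: all four phases are measured at once across a 2x2 group.
SINGLE_FRAME_ASSIGNMENT = {
    1: {"TA": "Phase0",   "TB": "Phase180"},
    2: {"TA": "Phase90",  "TB": "Phase270"},
    3: {"TA": "Phase270", "TB": "Phase90"},
    4: {"TA": "Phase180", "TB": "Phase0"},
}

# Two-frame capture: each pixel uses both taps, and the tap phases (or the
# light-source phase) are swapped between the first and second frame.
TWO_FRAME_ASSIGNMENT = {
    "frame_1": {"TA": "Phase0",   "TB": "Phase180"},
    "frame_2": {"TA": "Phase180", "TB": "Phase0"},
}
```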
  • Fig. 11 is a block diagram illustrating a ranging module (or ranging device) that outputs distance measurement information using an imaging device 1 according to at least one example embodiment.
  • the ranging module 5000 includes a light emitting unit (or light source) 5011, a light emission control unit (or controller) 5012, and a light receiving unit (or imaging device) 5013.
  • the light emitting unit 5011 has a light source that emits light having a predetermined wavelength, and irradiates the object with irradiation light of which brightness periodically changes.
  • the light emitting unit 5011 has a light emitting diode that emits infrared light having a wavelength in a range of 780 nm to 1000 nm as a light source, and generates the irradiation light in synchronization with a light emission control signal CLKp of a rectangular wave supplied from the light emission control unit 5012.
  • the light emission control signal CLKp is not limited to the rectangular wave as long as the control signal CLKp is a periodic signal.
  • the light emission control signal CLKp may be a sine wave.
  • the light emission control unit 5012 supplies the light emission control signal CLKp to the light emitting unit 5011 and the light receiving unit 5013 and controls an irradiation timing of the irradiation light.
  • a frequency of the light emission control signal CLKp is, for example, 20 megahertz (MHz). Note that, the frequency of the light emission control signal CLKp is not limited to 20 megahertz (MHz), and may be 5 megahertz (MHz) or other desired frequency.
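  • Not stated in this passage, but a general property of continuous-wave indirect ToF: the modulation frequency fixes the maximum unambiguous range, because the measured phase wraps every modulation period:

\[
d_{\max} \;=\; \frac{c}{2 f_{mod}} \;=\; \frac{3\times 10^{8}\ \text{m/s}}{2 \times 20\ \text{MHz}} = 7.5\ \text{m at 20 MHz}, \qquad 30\ \text{m at 5 MHz},
\]

  which is one reason a lower frequency such as 5 MHz may be preferred when longer ranges must be measured without ambiguity.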
  • the light receiving unit 5013 receives reflected light reflected from the object, calculates the distance information for each pixel according to a light reception result, generates a depth image in which the distance to the object is represented by a gradation value for each pixel, and outputs the depth image.
  • the above-described imaging device 1 is used for the light receiving unit 5013, and for example, the imaging device 1 serving as the light receiving unit 5013 calculates the distance information for each pixel 51 from a signal intensity on the basis of the light emission control signal CLKp.
  • the imaging device 1 is incorporated as the light receiving unit 5013 of the ranging module 5000 that obtains and outputs the information associated with the distance to the subject by the indirect ToF method.
  • Fig. 12 is a diagram illustrating use examples of an imaging device 1 according to at least one example embodiment.
  • the above-described imaging device 1 can be used in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X- rays as described below.
  • the imaging device 1 may be included in apparatuses such as a digital still camera and a portable device with a camera function which capture images, apparatuses for traffic such as an in-vehicle sensor that captures images of a vehicle to enable automatic stopping, recognition of a driver state, measuring distance, and the like.
  • the imaging device 1 may be included in apparatuses for home appliances such as a TV, a refrigerator, and an air-conditioner in order to photograph a gesture of a user and to perform an apparatus operation in accordance with the gesture.
  • the imaging device 1 may be included in apparatuses for medical or health care such as an endoscope and an apparatus that performs angiography through reception of infrared light.
  • the imaging device 1 may be included in apparatuses for security such as a security monitoring camera and a personal authentication camera.
  • the imaging device 1 may be included in an apparatus for beauty such as a skin measuring device that photographs skin.
  • the imaging device 1 may be included in apparatuses for sports such as an action camera, a wearable camera for sports, and the like.
  • the imaging device 1 may be included in apparatuses for agriculture such as a camera for monitoring a state of a farm or crop.
  • any processing devices, control units, processing units, etc. discussed above may correspond to one or many computer processing devices, such as a Field Programmable Gate Array (FPGA), an Application-Specific Integrated Circuit (ASIC), any other type of Integrated Circuit (IC) chip, a collection of IC chips, a microcontroller, a collection of microcontrollers, a microprocessor, a Central Processing Unit (CPU), a digital signal processor (DSP), or a plurality of microprocessors that are configured to execute the instruction sets stored in memory.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware, any of which may generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • the computer readable media may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS).
  • These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • An imaging device comprising: a first photoelectric conversion region; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; and a capacitive structure including: a first conductive structure overlapping the first photoelectric conversion region; a second conductive structure overlapping the first photoelectric conversion region; and an insulating material between the first conductive structure and the second conductive structure.
  • the imaging device of one or more of (1) to (6) further comprising: an isolation structure surrounding the capacitive structure, the first photoelectric conversion region, the first transfer transistor, and the second transfer transistor in the plan view.
  • An imaging device comprising: a first photoelectric conversion region including a first surface and a second surface opposite the first surface, wherein the first surface is a light incident surface; a first transfer transistor coupled to the first photoelectric conversion region; a first floating diffusion region coupled to the first transfer transistor; a second transfer transistor coupled to the first photoelectric conversion region; a second floating diffusion region coupled to the second transfer transistor; and a structure including: a portion of the first photoelectric conversion region at the second surface; and an insulating material on the portion of the first photoelectric conversion region and further away from the first surface than the portion of the first photoelectric conversion region, wherein the portion of the first photoelectric conversion region and the insulating material have uneven surfaces.
  • An imaging device comprising: a first photoelectric conversion region disposed in a semiconductor substrate; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; a capacitive structure overlapping the first photoelectric conversion region and including: a first conductor overlapping the first photoelectric conversion region; a second conductor overlapping the first photoelectric conversion region; a third conductor spaced apart from and electrically isolated from the second conductor; and an insulating material sandwiched between the first conductor and the second and third conductors.

Abstract

An imaging device includes a first photoelectric conversion region, a first transfer transistor coupled to the first photoelectric conversion region, a second transfer transistor coupled to the first photoelectric conversion region, and a capacitive structure. The capacitive structure includes a first conductive structure overlapping the first photoelectric conversion region, a second conductive structure overlapping the first photoelectric conversion region, and an insulating material between the first conductive structure and the second conductive structure.

Description

CAPACITIVE STRUCTURES FOR IMAGING DEVICES AND IMAGING APPARATUSES
FIELD
[0001] Example embodiments are directed to capacitive structures for imaging devices and imaging apparatuses, and methods for operating the same.
BACKGROUND
[0002] Imaging devices are used in many applications, including depth sensing for object tracking, environment rendering, etc. Imaging devices used for depth sensing may employ time-of-flight (ToF) principles to detect a distance to an object or objects within a scene.
In general, a ToF depth sensor includes a light source and an imaging device including a plurality of pixels for sensing reflected light. In operation, the light source emits light (e.g., infrared light) toward an object or objects in the scene, and the pixels detect the light reflected from the object or objects. The elapsed time between the initial emission of the light and receipt of the reflected light by each pixel may correspond to a distance from the object or objects. Direct ToF imaging devices may measure the elapsed time itself to calculate the distance while indirect ToF imaging devices may measure the phase delay between the emitted light and the reflected light and translate the phase delay into a distance. The depth values of the pixels are then used by the imaging device to determine a distance to the object or objects, which may be used to create a three dimensional scene of the captured object or objects.
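By way of illustration only (not part of any embodiment described herein), the following sketch shows the basic arithmetic behind the two approaches; the function names and numeric values are assumptions chosen for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def direct_tof_distance(elapsed_time_s: float) -> float:
    # Direct ToF: the emitted light travels to the object and back, so halve the round trip.
    return C * elapsed_time_s / 2.0

def indirect_tof_distance(phase_delay_rad: float, f_mod_hz: float) -> float:
    # Indirect ToF: translate the measured phase delay into a time delay, then a distance.
    time_delay_s = phase_delay_rad / (2.0 * math.pi * f_mod_hz)
    return C * time_delay_s / 2.0

# A 2.5 ns round trip corresponds to roughly 0.37 m.
print(direct_tof_distance(2.5e-9))
# A phase delay of pi/2 at a 20 MHz modulation frequency corresponds to roughly 1.87 m.
print(indirect_tof_distance(math.pi / 2, 20e6))
```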
SUMMARY
[0003] Example embodiments relate to imaging devices, imaging apparatuses, and methods thereof that improve quantum efficiency and/or increase charge saturation levels.
[0004] According to at least one example embodiment, an imaging device includes a first photoelectric conversion region, a first transfer transistor coupled to the first photoelectric conversion region, a second transfer transistor coupled to the first photoelectric conversion region, and a capacitive structure. The capacitive structure includes a first conductive structure overlapping the first photoelectric conversion region, a second conductive structure overlapping the first photoelectric conversion region, and an insulating material between the first conductive structure and the second conductive structure. The first conductive structure includes a portion of the first photoelectric conversion region. The second conductive structure includes metal. Alternatively, the second conductive structure includes polysilicon. At least one of the first conductive structure, the second conductive structure, and the insulating material has an uneven surface (e.g., a plurality of structures arranged at regular intervals). The second conductive structure overlaps a majority of the first photoelectric conversion region in a plan view. The imaging device includes an isolation structure surrounding the capacitive structure, the first photoelectric conversion region, the first transfer transistor, and the second transfer transistor in the plan view. The imaging device includes an overflow transistor coupled to the first photoelectric conversion region. The second conductive structure includes a first conductor and a second conductor spaced apart from the first conductor. The imaging device includes a first floating diffusion region coupled to the first transfer transistor, a second floating diffusion region, a third transfer transistor coupled between the first floating diffusion region and the second floating diffusion region, a third floating diffusion region coupled to the second transfer transistor, a fourth floating diffusion region, and a fourth transfer transistor coupled between the third floating diffusion region and the fourth floating diffusion region. The imaging device includes a first wiring electrically connecting the first conductor to the second floating diffusion region, and a second wiring electrically connecting the second conductor to the fourth floating diffusion region. The imaging device includes an isolation structure that surrounds the first floating diffusion region, the second floating diffusion region, the third floating diffusion region, the fourth floating diffusion region, the first transfer transistor, and the second transfer transistor. The first photoelectric conversion region is disposed in a semiconductor substrate, and the isolation structure extends from a first surface of the semiconductor substrate toward a second surface of the semiconductor substrate. In a plan view, an edge of the second conductive structure is parallel with an edge of the first photoelectric conversion region. The first conductive structure includes an anode of the first photoelectric conversion region.
[0005] According to at least one example embodiment, an imaging device includes a first photoelectric conversion region including a first surface and a second surface opposite the first surface. The first surface is a light incident surface. The imaging device includes a first transfer transistor coupled to the first photoelectric conversion region, a first floating diffusion region coupled to the first transfer transistor, a second transfer transistor coupled to the first photoelectric conversion region, a second floating diffusion region coupled to the second transfer transistor, and a structure including a portion of the first photoelectric conversion region at the second surface, and an insulating material on the portion of the first photoelectric conversion region and further away from the first surface than the portion of the first photoelectric conversion region. The portion of the first photoelectric conversion region and the insulating material have uneven surfaces. The imaging device includes a conductive structure on the insulating material. The uneven surfaces are triangular shaped in a cross sectional view. The imaging device includes an isolation structure surrounding the conductive structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in the plan view.
[0006] According to at least one example embodiment, an imaging device includes a first photoelectric conversion region disposed in a semiconductor substrate, a first transfer transistor coupled to the first photoelectric conversion region, a second transfer transistor coupled to the first photoelectric conversion region, and a capacitive structure overlapping the first photoelectric conversion region. The capacitive structure includes a first conductor overlapping the first photoelectric conversion region, a second conductor overlapping the first photoelectric conversion region, a third conductor spaced apart from and electrically isolated from the second conductor; and an insulating material sandwiched between the first conductor and the second and third conductors.
BRIEF DESCRIPTION OF THE FIGURES
[0007] Fig. 1 is a block diagram of an imaging device according to at least one example embodiment;
[0008] Fig. 2 illustrates an example schematic of a pixel from Fig. 1 according to at least one example embodiment;
[0009] Fig. 3 illustrates a layout of a pixel in a plan view according to at least one example embodiment;
[0010] Fig. 4 illustrates a cross sectional view of a pixel according to at least one example embodiment;
[0011] Fig. 5 illustrates a cross sectional view of a pixel according to at least one example embodiment;
[0012] Fig. 6 illustrates a cross sectional view of a pixel according to at least one example embodiment;
[0013] Fig. 7 illustrates a cross sectional view of a pixel according to at least one example embodiment;
[0014] Fig. 8 illustrates an example schematic of a pixel according to at least one example embodiment;
[0015] Fig. 9 is a timing chart for explaining a detection method according to at least one example embodiment;
[0016] Fig. 10 is a timing chart for detection and readout signals according to at least one example embodiment;
[0017] Fig. 11 is a block diagram of a ranging module (or ranging device) according to at least one example embodiment; and
[0018] Fig. 12 is a diagram illustrating use examples of an imaging device according to at least one example embodiment.
DETAILED DESCRIPTION
[0019] Fig. 1 is a block diagram of an imaging device according to at least one example embodiment.
[0020] The imaging device 1 shown in Fig. 1 may be an imaging sensor of a rear surface irradiation type, and is provided, for example, in an imaging apparatus having a ranging function (or distance measuring function).
[0021] The imaging device 1 has a pixel array unit (or pixel array or pixel section) 20 formed on a semiconductor substrate (not shown) and a peripheral circuit integrated on the same semiconductor substrate as the pixel array unit 20. The peripheral circuit includes, for example, a tap driving unit (or tap driver) 21 (which may be horizontally or vertically arranged), a vertical driving unit (or vertical driver) 22, a column processing unit (or column processing circuit) 23, a horizontal driving unit (or horizontal driver) 24, and a system control unit (or system controller) 25.
[0022] The imaging device 1 is further provided with a signal processing unit (or signal processor) 31 and a data storage unit (or data storage or memory) 32. Note that the signal processing unit 31 and the data storage unit 32 may be mounted on the same substrate as the imaging device 1 or may be disposed on a substrate separate from the imaging device 1 in the imaging apparatus.
[0023] The pixel array unit 20 has a configuration in which pixels 51 that generate charge corresponding to a received light amount and output a signal corresponding to the charge are two-dimensionally disposed in a matrix shape of a row direction and a column direction. That is, the pixel array unit 20 has a plurality of pixels 51 that perform photoelectric conversion on incident light and output a signal corresponding to charge obtained as a result. Here, the row direction refers to an arrangement direction of the pixels 51 in a horizontal direction, and the column direction refers to the arrangement direction of the pixels 51 in a vertical direction. The row direction is a horizontal direction in the figure, and the column direction is a vertical direction in the figure.
[0024] The pixel 51 receives light incident from the external environment, for example, infrared light, performs photoelectric conversion on the received light, and outputs a pixel signal according to charge obtained as a result. The pixel 51 has a first tap TA that detects charge obtained by the photoelectric conversion by applying a predetermined voltage (first voltage) as signals GDA/C, and a second tap TB that detects charge obtained by the photoelectric conversion by applying a predetermined voltage (second voltage) as signals GDB/D. Although two taps are shown, more or fewer taps may be included if desired.
[0025] The tap driving unit 21 supplies signals GDA/C to the first tap TA of each of the pixels 51 of the pixel array unit 20 through a predetermined voltage supply line 30, and supplies the signals GDB/D to the second tap TB thereof through the predetermined voltage supply line 30. Therefore, two voltage supply lines 30 including the voltage supply line 30 that transmits the voltage GDA/C and the voltage supply line 30 that transmits the voltage GDB/D are wired to one pixel column of the pixel array unit 20.
[0026] In the pixel array unit 20, with respect to the pixel array of the matrix shape, a pixel drive line 28 is wired along a row direction for each pixel row, and two vertical signal lines 29 are wired along a column direction for each pixel column. However, there may be more than two vertical signal lines if desired. For example, the pixel drive line 28 transmits a drive signal for driving when reading a signal from the pixel. Note that, although Fig. 1 shows one wire for the pixel drive line 28, the pixel drive line 28 is not limited to one. One end of the pixel drive line 28 is connected to an output end corresponding to each row of the vertical driving unit 22.
[0027] The vertical driving unit 22 includes a shift register, an address decoder, or the like. The vertical driving unit 22 drives each pixel of all pixels of the pixel array unit 20 at the same time, or in row units, or the like. That is, the vertical driving unit 22 includes a driving unit that controls operation of each pixel of the pixel array unit 20, together with the system control unit 25 that controls the vertical driving unit 22.
[0028] The signals output from each pixel 51 of a pixel row in response to drive control by the vertical driving unit 22 are input to the column processing unit 23 through the vertical signal line 29. The column processing unit 23 performs a predetermined signal process on the pixel signal output from each pixel 51 through the vertical signal line 29 and temporarily holds the pixel signal after the signal process. [0029] Specifically, the column processing unit 23 performs a noise removal process, an analog to digital (AD) conversion process, and the like as the signal process.
[0030] The horizontal driving unit 24 includes a shift register, an address decoder, or the like, and sequentially selects unit circuits corresponding to pixel columns of the column processing unit 23. The column processing unit 23 sequentially outputs the pixel signals obtained through the signal process for each unit circuit, by a selective scan by the horizontal driving unit 24.
[0031] The system control unit 25 includes a timing generator or the like that generates various timing signals and performs drive control on the tap driving unit 21, the vertical driving unit 22, the column processing unit 23, the horizontal driving unit 24, and the like, on the basis of the various generated timing signals.
[0032] The signal processing unit 31 has at least a calculation process function and performs various signal processes such as a calculation process on the basis of the pixel signal output from the column processing unit 23. The data storage unit 32 temporarily stores data necessary for the signal processing in the signal processing unit 31. The signal processing unit 31 may control overall functions of the imaging device 1. For example, the tap driving unit 21, the vertical driving unit 22, the column processing unit 23, the horizontal driving unit 24, the system control unit 25, and the data storage unit 32 may be under control of the signal processing unit 31.
[0033] Fig. 2 illustrates an example schematic of a pixel 51 from Fig. 1. The pixel 51 includes a photoelectric conversion region PD, such as a photodiode or other light sensor, transfer transistors TG0 and TG1, floating diffusion regions FD0 and FD1, reset transistors RST0 and RST1, amplification transistors AMP0 and AMP1, and selection transistors SEL0 and SEL1. The pixel 51 may further include an overflow transistor OFG, transfer transistors FDG0 and FDG1, floating diffusion regions FD2 and FD3, and a capacitive structure including capacitors CAP0 and CAP1.
[0034] The pixel 51 may be driven according to tap driving signals GDA/C and GDB/D applied to gates of transfer transistors TG0/TG1, reset signal RSTDRAIN, overflow signal
OFGn, power supply signal VDD, selection signal SELn, and vertical selection signals
VSL0 and VSL1. These signals are provided by various elements from Fig. 1, for example, the tap driver 21, vertical driver 22, system controller 25, etc. Figs. 9 and 10 and related discussion below set forth additional details for driving the pixel 51.
[0035] As shown in Fig. 2, the transfer transistors TG0 and TG1 are coupled to the photoelectric conversion region PD and have gates (or taps) that receive tap driving signals GDA/GDC (abbreviated as GDA/C) and GDB/GDD (abbreviated as GDB/D), where the last letters A, B, C, and D represent different phases of a tap driving signal relative to a phase of a modulated signal from a light source. For example, tap driving signals GDA/GDC may refer to two signals GDA and GDC, where signal GDA has a 0 degree phase shift from a light source signal, and where signal GDC has a 180 degree phase shift from the light source signal. Similarly, tap driving signals GDB/D may refer to two signals GDB and GDD, where signal GDB has a 90 degree phase shift from the light source signal, and where signal GDD has a 270 degree phase shift from the light source signal. The tap driving signals GDA/C and GDB/D may be applied in a manner that allows for depth information to be captured in a desired number of frames (e.g., one frame, two frames, four frames, etc.).
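The relationship between these tap driving signals and the light source modulation can be illustrated with the following sketch, which generates four square-wave gate signals shifted by 0, 90, 180, and 270 degrees. It is a simplified illustration only; the actual gate waveforms, voltage levels, modulation frequency, and timing are implementation specific and are not specified by this description.

```python
import numpy as np

def demodulation_gate(t: np.ndarray, f_mod: float, phase_deg: float) -> np.ndarray:
    # Square-wave gate signal shifted by phase_deg relative to the light-source modulation.
    phase = np.deg2rad(phase_deg)
    return (np.sin(2.0 * np.pi * f_mod * t - phase) >= 0.0).astype(float)

f_mod = 20e6                                              # assumed 20 MHz modulation frequency
t = np.linspace(0.0, 2.0 / f_mod, 1000, endpoint=False)   # two modulation periods
gda = demodulation_gate(t, f_mod, 0.0)    # GDA: in phase with the light source
gdb = demodulation_gate(t, f_mod, 90.0)   # GDB: shifted by 90 degrees
gdc = demodulation_gate(t, f_mod, 180.0)  # GDC: shifted by 180 degrees
gdd = demodulation_gate(t, f_mod, 270.0)  # GDD: shifted by 270 degrees
```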
[0036] Fig. 2 further illustrates that capacitor CAP0 is electrically connected between an anode AN and floating diffusion region FD2, and that capacitor CAP1 is electrically connected between the anode AN and floating diffusion region FD3. Additional details of the capacitors CAP0/CAP1 and the anode AN are discussed in more detail below with reference to Figs. 3-8.
[0037] It should be appreciated that the transfer transistors FDG0/FDG1 and floating diffusions FD2/FD3 are included to expand the charge capacity of the pixel 51, if desired. However, these elements may be omitted or not used, if desired. The overflow transistor OFG is included to transfer overflow charge from the photoelectric conversion region PD, but may be omitted or unused if desired.
[0038] Fig. 3 illustrates a layout of a pixel 51 from Figs. 1 and 2 in a plan view according to at least one example embodiment. As shown in Fig. 3, the photoelectric conversion region PD is in a central region of the pixel 51 and surrounded by transistors. With reference to Figs. 2 and 3, amplification transistor AMP0 and selection transistor SEL0 may be located at one side of the photoelectric conversion region PD and coupled to floating diffusion region FD0 while amplification transistor AMP1 and selection transistor SEL1 may be located on an opposite side of the photoelectric conversion region PD and coupled to floating diffusion region FD1. As shown, transistors AMP0 and SEL0 are aligned with one another in a vertical direction, and transistors AMP1 and SEL1 are aligned with one another in the vertical direction. Transistors RST0 and FDG0, and transistors RST1 and FDG1 have similar alignments. In the example of Fig. 3, the photoelectric conversion region PD has eight sides and the transfer transistors TG0 and
TG1 are located at a same end of but on different sides of the photoelectric conversion region PD. Meanwhile, the overflow transistor OFG is located at an opposite end of the photoelectric conversion region. As shown, at least portions of transistors TG0, TG1, and OFG may overlap parts of the photoelectric conversion region PD.
[0039] Fig. 3 further illustrates a conductive structure including conductors CD0/CD1 that overlap the photoelectric conversion region PD. For example, the conductive structures CD0 and CD1 are cathodes of the capacitors CAP0 and CAP1. The conductive structure including conductors CD0 and CD1 may overlap a majority of the photoelectric conversion region PD. In addition, the conductors CD0 and CD1 are spaced apart and electrically isolated from one another (e.g., by an insulating material).
[0040] Here, it should be appreciated that unlabeled transistor structures in Fig. 3 may correspond to transistors of neighboring pixels 51.
[0041] The pixel 51 may include one or more wiring layers (e.g., made of metal) to make electrical connections between the photoelectric conversion region PD, the transistors, and other elements of an imaging device. Fig. 3 illustrates portions of a wiring layer M1, which includes conductive portions that couple the conductor CD0 to floating diffusion region FD2 and that couple conductor CD1 to floating diffusion region FD3. In the event that floating diffusion regions FD2/FD3 and transfer transistors FDG0 and FDG1 are omitted, then M1 may couple conductors CD0 and CD1 to floating diffusion regions FD0 and FD1, respectively.
[0042] Example embodiments are not limited to the layout and shapes illustrated in Fig. 3, and the layout and element shapes in Fig. 3 may be varied if desired.
[0043] Fig. 4 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment. The cross sectional view in Fig. 4 is taken along line III-III in Fig.
3. As shown in Fig. 4, the pixel 51 may include a substrate SUB. The substrate SUB may be a semiconductor substrate with the photoelectric conversion region PD disposed therein. For example, the photoelectric conversion region PD is a photodiode. In this case, photoelectric conversion region PD may include a portion of a first conductivity type
(e.g., P type) and a portion of a second conductivity type (e.g., N type) to create a PN junction that converts received light into electric charge. The pixel 51 includes a gate G0 of transfer transistor TG0 and a gate G1 of transfer transistor TG1, both disposed in the substrate SUB and coupled between the photoelectric conversion region PD and respective floating diffusion regions FD0 and FD1. The gates G0 and G1 may correspond to taps TA and TB, respectively, in Fig. 1, and include a conductor, such as metal, polysilicon, and/or the like. The gates G0 and G1 may penetrate the photoelectric conversion region PD, or alternatively, be arranged in proximity to the photoelectric conversion region PD. The floating diffusion regions FD0/FD1 may include doped portions (e.g., N type portions) of the substrate SUB. Although not explicitly shown, it should be understood that a gate of the overflow transistor OFG may have a same or similar structure to the gates G0 and G1 (e.g., a vertical gate structure).
[0044] The pixel 51 further includes a capacitive structure CS comprised of capacitors CAP0 and CAP1. The capacitive structure CS includes a conductive structure comprised of conductors CD0/CD1, insulating material INS1, and a conductive structure comprised of the anode AN. Capacitor CAP0 includes the anode AN, the insulating material INS, and conductor CD0 while capacitor CAP1 includes the anode AN, the insulating material INS, and conductor CD1. The capacitors CAP0 and CAP1 may be used to increase a charge capacity of the floating diffusion regions FD0/FD1 of the pixel 51.
[0045] According to at least one example embodiment, conductors CD0/CD1 include metal, for example, reflective metal. Conductors CD0/CD1 formed of reflective metal may improve quantum efficiency of the pixel 51 because light incident to the conductors CD0/CD1 is reflected back toward the photoelectric conversion region PD. In at least one example embodiment, conductors CD0/CD1 include polysilicon, which may increase a capacitance value of the overall capacitive structure CS compared to similarly sized metal conductors CD0/CD1. As noted above, conductors CD0 and CD1 are spaced apart and electrically insulated from one another by, for example, an insulating material (not explicitly shown in Fig. 4).
[0046] The anode AN may include a conductor, such as doped semiconductor material, for example, a doped portion of the substrate SUB separate from the photoelectric conversion region PD. The anode AN may be doped with N-type or P-type impurities depending on design preference. The insulating material INS may include silicon oxide, silicon dioxide, and/or other suitable insulator. As shown in Fig. 4, the insulating material INS may span across an entire surface of the substrate SUB. Alternatively, the insulating material INS spans across a partial portion of the entire surface of the substrate SUB.
[0047] In any event, the capacitive structure CS may be formed to have a desired capacitance value and/or reflectance that is set based on empirical evidence and/or design preference.
[0048] The pixel 51 may further include one or more intermediate layers IL and a microlens LENS on the one or more intermediate layers IL. The one or more intermediate layers IL may include a light filter (e.g., color filter, an infrared filter, etc.) and/or other elements desired for the pixel 51.
[0049] Fig. 5 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment. As in Fig. 4, the cross sectional view in Fig. 5 is taken along line III-III in Fig. 3. Fig. 5 includes many of the same elements as Fig. 4, and as such, a description of these elements will not be repeated. Compared to Fig. 4, Fig. 5 further includes an isolation structure IS1 at a periphery of the pixel 51 to reflect and/or absorb light to effectively block light from entering neighboring pixels 51. The isolation structure IS1 may include an insulation material, such as an oxide. Additionally or alternatively, the isolation structure IS1 may include a light blocking material, such as tungsten, aluminum, and/or the like. According to at least one example embodiment, the insulating material INS and the isolation structure IS1 both include an oxide, such as silicon oxide, silicon dioxide, etc. A thickness of the isolation structure IS1 may be greater than a thickness of the insulating material INS. In the example of Fig. 5, the isolation structure IS1 is formed from one surface of the substrate SUB to an opposite surface of the substrate SUB. However, example embodiments are not limited thereto, and the isolation structure IS1 may be formed to a desired depth within the substrate SUB, or the isolation structure IS1 may be formed through the substrate SUB and into the one or more intermediate layers IL if desired.
[0050] Fig. 6 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment. As in Fig. 5, the cross sectional view in Fig. 6 is taken along line III-III in Fig. 3. Fig. 6 includes many of the same elements as Figs. 4 and 5, and as such, a description of these elements will not be repeated. Compared to Fig. 5, Fig. 6 illustrates uneven surfaces, for example, structures that have triangular cross sectional shapes arranged periodically or at desired intervals, for the anode AN and/or the capacitors CAP0/CAP1. In the example of Fig. 6, the capacitive structure CS includes the anode AN with an uneven surface, conductors CD0/CD1 with uneven surfaces, and an insulating material INS1 sandwiched between the uneven surface of the anode AN and the uneven surfaces of the conductors CD0/CD1. The insulating material INS1 may include the same material as the insulating material INS, for example, silicon oxide, silicon dioxide, etc. A thickness of the insulating material INS may be greater than a thickness of the insulating material INS1.
[0051] The uneven surfaces of the anode AN and conductors CD0/CD1 may include substantially triangularly shaped structures arranged at a substantially same pitch.
[0052] As in Figs. 4 and 5, conductors CD0 and CD1 in Fig. 6 are still spaced apart and electrically isolated from one another. In addition, example embodiments are not limited to both of the anode AN and the conductors CD0/CD1 having uneven surfaces. For example, the anode AN may have an uneven surface while the conductors CD0/CD1 have flat surfaces. In another example, the anode AN may have flat surfaces while the conductors CD0/CD1 have uneven surfaces. In addition, both surfaces of the anode AN may be uneven and both surfaces of the conductors CD0/CD1 may be uneven. In any event, the insulating material INS1 may conform to the surfaces of the anode AN and conductors CD0/CD1.
[0053] The capacitive structure CS of Fig. 6 may have an increased effective surface area compared to the capacitive structure CS of Figs. 4 and 5, which may result in an increased capacitance. Additionally, the uneven surfaces of the anode AN, the insulating material INS1, and/or the conductors CD0/CD1 may diffract photons, thereby increasing photon paths to increase quantum efficiency. In at least one example embodiment, the conductors CD0/CD1 may be omitted from the pixel 51 if increased capacitance is not desired. Additionally or alternatively, a surface of the substrate SUB closest to the microlens LENS and/or at least one of the one or more intermediate layers IL may include a same or similar uneven surface, for example, that covers an entire surface of the photoelectric conversion region PD.
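A rough sense of how such corrugation can affect capacitance may be obtained from a first-order parallel-plate estimate, as in the sketch below. All numeric values (electrode footprint, dielectric thickness, permittivity, ridge angle) are placeholders chosen for illustration and are not taken from any embodiment.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2: float, dielectric_thickness_m: float, eps_r: float) -> float:
    # First-order parallel-plate estimate C = eps0 * eps_r * A / d (fringing fields ignored).
    return EPS0 * eps_r * area_m2 / dielectric_thickness_m

def triangular_corrugation_factor(face_tilt_deg: float) -> float:
    # A surface folded into triangular ridges whose faces are tilted by face_tilt_deg
    # from the flat plane has its developed area increased by 1 / cos(face_tilt_deg).
    return 1.0 / math.cos(math.radians(face_tilt_deg))

# Placeholder numbers, not taken from any embodiment:
flat_area = (3e-6) ** 2                               # 3 um x 3 um electrode footprint
c_flat = plate_capacitance(flat_area, 10e-9, 3.9)     # SiO2-like dielectric, 10 nm thick
c_corrugated = c_flat * triangular_corrugation_factor(45.0)  # about 41% more capacitance
print(c_flat, c_corrugated)
```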
[0054] Fig. 7 illustrates a cross sectional view of a pixel 51 according to at least one example embodiment. As in Figs. 4-6, the cross sectional view in Fig. 7 is taken along line III-III in Fig. 3. Fig. 7 includes many of the same elements as Fig. 4, and as such, a description of these elements will not be repeated. Compared to Fig. 4, the anode AN of Fig. 7 includes a portion of the photoelectric conversion region PD that abuts the insulating material INS.
[0055] Fig. 8 illustrates an example schematic of a pixel 51 according to at least one example embodiment. Fig. 8 includes many of the same elements as Fig. 3, and as such, a description of these elements will not be repeated. Compared to Fig. 3, Fig. 8 further includes an isolation structure IS2 that surrounds elements disposed in the substrate SUB.
For example, the isolation structure IS2 may surround the floating diffusion regions FD0,
FD1, FD2, FD3, the transfer transistors TG0/TG1, the photoelectric conversion region PD, and the overflow transistor OFG (where all of or portions of these elements are disposed in the substrate SUB in a same or similar manner as that shown in Figs. 4-7). Similar to the isolation structure IS1, the isolation structure IS2 may be formed through the substrate SUB, or to a desired depth within the substrate SUB. Here, it should be appreciated that the transistors AMP0/AMP1 and SEL0/SEL1 may be formed in a wiring layer that is on the insulating material INS1, and as such, these elements are not surrounded by the isolation structure IS2. In addition, the isolation structure IS2 may exist with or without the isolation structure IS1 from Fig. 6, and the isolation structure IS1 may exist without isolation structure IS2.
[0056] Here, it should be appreciated that example embodiments shown and described with reference to Figs. 1-8 may be combined in any desired manner according to design preferences. For example, various elements from the pixel designs shown in Figs. 2-8 may be combined with another if desired. In addition, various elements from the pixel designs shown in Figs. 2-8 may be omitted or unused according to design preferences. For example, one or more of the uneven surfaces depicted and described with reference to Fig. 6 may be included with the structure(s) of Fig. 4 and/or Fig. 7. In other examples, the isolation structure IS2 in Fig. 8 may be applied to one or more of the pixel designs in Figs. 4-7 according to design preferences.
[0057] With reference to Figs. 1-8, an imaging device 1 includes a pixel 51, a first transfer transistor TG0 coupled to a first photoelectric conversion region PD, and a second transfer transistor TG1 coupled to the first photoelectric conversion region PD. The pixel 51 further includes a capacitive structure CS including a first conductive structure including anode AN overlapping the first photoelectric conversion region PD, and a second conductive structure including conductors CD0/CD1 overlapping the first photoelectric conversion region PD. The capacitive structure CS includes an insulating material INS1 between the first conductive structure AN and the second conductive structure CD0/CD1. According to at least one example embodiment, the first conductive structure AN includes a portion of the first photoelectric conversion region PD (see Fig. 7). As noted above, the second conductive structure CD0/CD1 may include metal and/or polysilicon. As shown and described with reference to Fig. 6, at least one of the first conductive structure AN, the second conductive structure CD0/CD1, and the insulating material INS1 has an uneven surface, for example, with structures arranged at desired intervals. The second conductive structure CD0/CD1 overlaps a majority of the first photoelectric conversion region PD in a plan view (see Fig. 3, for example). The imaging device 1 includes an isolation structure IS1 surrounding the capacitive structure CS, the first photoelectric conversion region PD, the first transfer transistor TG0, and the second transfer transistor TG1 in the plan view. The imaging device 1 may further include an overflow transistor OFG coupled to the first photoelectric conversion region PD.
[0058] The second conductive structure includes a first conductor CD0 and a second conductor CD1 spaced apart from the first conductor CD0. The imaging device 1 may include a first floating diffusion region FD0 coupled to the first transfer transistor TG0, a second floating diffusion region FD2, and a third transfer transistor FDG0 coupled between the first floating diffusion region FD0 and the second floating diffusion region FD2. The imaging device 1 includes a third floating diffusion region FD1 coupled to the second transfer transistor TG1, a fourth floating diffusion region FD3, and a fourth transfer transistor FDG1 coupled between the third floating diffusion region FD1 and the fourth floating diffusion region FD3.
[0059] The imaging device 1 may include a first wiring (a first portion of M1) electrically connecting the first conductor CD0 to the second floating diffusion region FD2, and a second wiring (a second portion of M1) electrically connecting the second conductor CD1 to the fourth floating diffusion region FD3. The imaging device 1 includes an isolation structure IS2 that surrounds the first floating diffusion region FD0, the second floating diffusion region FD2, the third floating diffusion region FD1, the fourth floating diffusion region FD3, the first transfer transistor TG0, and the second transfer transistor TG1. The first photoelectric conversion region PD is disposed in a semiconductor substrate SUB, and the isolation structure IS2 extends from a first surface of the semiconductor substrate SUB toward a second surface of the semiconductor substrate SUB. In a plan view, one or more edges of the second conductive structure CD0/CD1 are parallel with one or more edges of the first photoelectric conversion region PD (see Fig. 3). The first conductive structure includes the anode AN of the first photoelectric conversion region PD.
[0060] In view of Figs. 1-8, an imaging device 1 includes a first photoelectric conversion region PD disposed in a substrate SUB including a first surface and a second surface opposite the first surface. The first surface is a light incident surface of the substrate SUB. The imaging device 1 includes a first transfer transistor TG0 coupled to the first photoelectric conversion region, a first floating diffusion region FD0 coupled to the first transfer transistor TG0, a second transfer transistor TG1 coupled to the first photoelectric conversion region PD, and a second floating diffusion region FD1 coupled to the second transfer transistor TG1. The imaging device 1 includes a structure. The structure includes a portion AN of the first photoelectric conversion region PD at the second surface of the substrate SUB, and an insulating material INS1 on the portion AN of the first photoelectric conversion region PD and further away from the first surface than the portion AN of the first photoelectric conversion region PD. The portion AN of the first photoelectric conversion region PD and the insulating material INS1 have uneven surfaces, for example, structures arranged at desired intervals. The imaging device 1 further includes a conductive structure CD0/CD1 on the insulating material INS1. The uneven surfaces are triangular shaped in a cross sectional view. The imaging device further includes an isolation structure IS1 surrounding the structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in the plan view.
[0061] In view of Figs. 1-8, an imaging device 1 includes a first photoelectric conversion region PD disposed in a semiconductor substrate SUB, a first transfer transistor TG0 coupled to the first photoelectric conversion region PD, a second transfer transistor TG1 coupled to the first photoelectric conversion region PD, and a capacitive structure CS overlapping the first photoelectric conversion region PD. The capacitive structure CS includes a first conductor AN overlapping the first photoelectric conversion region PD, a second conductor CD0 overlapping the first photoelectric conversion region PD, a third conductor CD1 spaced apart from and electrically isolated from the second conductor CD0, and an insulating material INS1 sandwiched between the first conductor AN and the second and third conductors CD0/CD1.
[0062] Fig. 9 is a timing chart for explaining a detection method according to at least one example embodiment. Fig. 10 is a timing chart for detection and readout signals according to at least one example embodiment.
[0063] With reference to Figs. 9 and 10, a light source emits light as irradiation light that is modulated (1 period = 2T) so as to repeat irradiation on/off at an irradiation time T, and the imaging device 1 receives light reflected from an object according to a time delay DT based on the distance to the object.
[0064] In a four phase method, the imaging device 1 detects received light at the first tap TA (corresponding to gate G0 in Figs. 4-7) and/or the second tap TB (corresponding to gate G1 in Figs. 4-7) according to four detection signals that are based on a phase of the emitted light signal. In this example, the imaging device 1 detects received light at the first tap TA with four detection signals. The four detection signals include a first signal having a same phase (that is, Phase0) as that of the emitted light, a second signal having a phase shifted by 90 degrees (Phase90) from the emitted light, a third signal having a phase shifted by 180 degrees (Phase180) from the emitted light, and a fourth signal having a phase shifted by 270 degrees (Phase270) from the emitted light. Here, it should be appreciated that signals Phase0, Phase90, Phase180, and Phase270 may correspond to signals GDA, GDB, GDC, and GDD, respectively, in Fig. 2.
[0065] In this four phase method, it is assumed that signal values detected at Phase0, Phase90, Phase180, and Phase270 of the first tap TA are q0A, q1A, q2A, and q3A, respectively.
[0066] It is possible to detect a phase shift amount Q corresponding to the time delay DT by a distribution ratio between the signal values q0A, q1A, q2A, and q3A. That is, since the time delay DT is obtained on the basis of the phase shift amount Q, the distance to the object can be obtained from the time delay DT.
[0067] The distance to the object is calculated, for example, by Equation (1):
distance = (C / 2) · DT = (C / (4π · fmod)) · arctan((φ1 − φ3) / (φ0 − φ2)) (1)
[0068] Here, C is the speed of light, DT is the time delay, fmod is the modulation frequency of the emitted light, and φ0 to φ3 are the signal values q0A, q1A, q2A, and q3A detected with signals Phase0, Phase90, Phase180, and Phase270, respectively.
[0069] As described above, in a case in which the signal values q0A, q1A, q2A, and q3A are obtained and the phase shift amount Q corresponding to the time delay DT is obtained by the distribution ratio, the signal values q0A, q1A, q2A, and q3A are obtained at respective timings for the four phases, and then the distribution ratio of each signal is obtained to calculate the phase shift amount Q.
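One common way to evaluate the phase shift amount Q and the resulting distance from the four signal values is sketched below. The sign convention of the arctangent and any calibration terms are assumptions made for illustration; they depend on the particular sensor and drive scheme and are not specified by this description.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def four_phase_distance(q0a: float, q1a: float, q2a: float, q3a: float, f_mod: float) -> float:
    # Phase shift amount Q from the distribution ratio of the four samples (one sign
    # convention; the actual convention and any calibration offsets are sensor specific).
    phase = math.atan2(q1a - q3a, q0a - q2a)
    if phase < 0.0:
        phase += 2.0 * math.pi                     # keep the phase in [0, 2*pi)
    time_delay = phase / (2.0 * math.pi * f_mod)   # DT
    return C * time_delay / 2.0

# Arbitrary sample values and a 20 MHz modulation frequency give roughly 1.87 m.
print(four_phase_distance(150.0, 180.0, 150.0, 60.0, 20e6))
```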
[0070] As shown by the uppermost waveforms of Fig. 10, in a period of time t0 to t1, voltages of 0 V and 1.5 V are alternately applied to the first tap TA, so that the voltages have reverse phases, and a charge corresponding to the signal value q0A of the same phase Phase0 (φ0) as the emitted light is accumulated. Then, as shown by the lowermost waveform of Fig. 10, in time t1 to t2, the signal processing unit 31 reads the accumulated charge (Read) and calculates the signal value q0A. The voltages may correspond to certain ones of the signals GDA/B/C/D, for example, signals GDA and GDB.
[0071] Similarly, in a period of time t2 to t3, a charge corresponding to the signal value q1A of the phase Phase90 (φ1) advanced by 90 degrees from the irradiation light is accumulated, and then, in time t3 to t4, the signal processing unit 31 reads the accumulated charge (Read) and calculates the signal value q1A.
[0072] In addition, in a period of time t4 to t5, a charge corresponding to the signal value q2A of the phase Phase180 (φ2) advanced by 180 degrees from the irradiation light is accumulated, and then, in time t5 to t6, the signal processing unit 31 reads the accumulated charge (Read) and calculates the signal value q2A.
[0073] Furthermore, in a period of time t6 to t7, a charge corresponding to the signal value q3A of the phase Phase270 (φ3) advanced by 270 degrees from the irradiation light is accumulated, and then, in time t7 to t8, the signal processing unit 31 reads the accumulated charge (Read) and calculates the signal value q3A. The signal processing unit 31 obtains the distance to the object on the basis of the obtained signal values q0A, q1A, q2A, and q3A of the four phases.
[0074] When calculating the signal values of the four phases, four charge accumulation periods (one for each phase) and four signal read periods (one for each phase) are necessary before a distance measurement is possible.
[0075] Example embodiments are not limited to the above method for obtaining signal values q0A, q1A, q2A, and q3A. In at least one example embodiment, the imaging device 1 may accumulate the charge of each of the four phases in one block of time, and then read the accumulated signal values q0A, q1A, q2A, and q3A in a subsequent block of time. Here, it should be appreciated that these methods result in depth information being captured in four frames (one frame for each signal value).
[0076] However, depth information may be captured in fewer frames if desired by utilizing both taps TA and TB in each pixel 51 and applying detection signals accordingly. For example, depth information may be captured in a single frame by applying two of the detection signals to taps TA and TB of each pixel in a 2x2 group of pixels (e.g., Phase0 to tap TA and Phase180 to tap TB of pixel 1, Phase90 to tap TA and Phase270 to tap TB of pixel 2, Phase270 to tap TA and Phase90 to tap TB of pixel 3, and Phase180 to tap TA and Phase0 to tap TB of pixel 4). In addition, depth information may be captured in two frames if desired by applying detection signals to both taps TA and TB of each pixel to capture signal values q0A and q3A in a first frame and signal values q1A and q2A in a second frame. For the second frame, the phase of the pulsed light source may be shifted compared to the first frame, for example, by 90 degrees or 270 degrees. Thus, each second frame has a phase of the detection signals (in relation to the light source) adjusted for the taps of each pixel compared to the first frame. For example, a pixel 51 in the first frame may receive detection signals having phases of 0 degrees and 180 degrees at respective taps (e.g., Phase0 at tap TA and Phase180 at tap TB) of the pixel 51 while in the second frame the pixel 51 may receive detection signals having phases of 180 degrees and 0 degrees at the respective taps (e.g., Phase180 at tap TA and Phase0 at tap TB). In at least one example embodiment, the phase of the pulsed light source in the second frame may be shifted 180 degrees from the first frame. In this case, the same detection signals applied to the taps TA and TB in the first frame may be applied to the taps TA and TB in the second frame (i.e., the detection signals applied in the second frame are not shifted compared to the first frame).
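The single-frame phase assignment described above for a 2x2 group of pixels can be summarized as in the following sketch; the pixel labels and data layout are illustrative assumptions only.

```python
# Detection-signal phases applied to the two taps of each pixel in a 2x2 group for the
# single-frame example described above. Pixel keys and the dictionary layout are
# assumptions made purely for illustration.
single_frame_assignment = {
    "pixel_1": {"TA": "Phase0",   "TB": "Phase180"},
    "pixel_2": {"TA": "Phase90",  "TB": "Phase270"},
    "pixel_3": {"TA": "Phase270", "TB": "Phase90"},
    "pixel_4": {"TA": "Phase180", "TB": "Phase0"},
}

for pixel, taps in single_frame_assignment.items():
    print(pixel, taps["TA"], taps["TB"])
```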
[0077] Fig. 11 is a block diagram illustrating a ranging module (or ranging device) that outputs distance measurement information using an imaging device 1 according to at least one example embodiment.
[0078] The ranging module 5000 includes a light emitting unit (or light source) 5011, a light emission control unit (or controller) 5012, and a light receiving unit (or imaging device) 5013.
[0079] The light emitting unit 5011 has a light source that emits light having a predetermined wavelength, and irradiates the object with irradiation light of which brightness periodically changes. For example, the light emitting unit 5011 has a light emitting diode that emits infrared light having a wavelength in a range of 780 nm to 1000 nm as a light source, and generates the irradiation light in synchronization with a light emission control signal CLKp of a rectangular wave supplied from the light emission control unit 5012.
[0080] Note that, the light emission control signal CLKp is not limited to the rectangular wave as long as the control signal CLKp is a periodic signal. For example, the light emission control signal CLKp may be a sine wave.
[0081] The light emission control unit 5012 supplies the light emission control signal CLKp to the light emitting unit 5011 and the light receiving unit 5013 and controls an irradiation timing of the irradiation light. A frequency of the light emission control signal CLKp is, for example, 20 megahertz (MHz). Note that, the frequency of the light emission control signal CLKp is not limited to 20 megahertz (MHz), and may be 5 megahertz (MHz) or other desired frequency.
[0082] The light receiving unit 5013 receives light reflected from the object, calculates the distance information for each pixel according to a light reception result, generates a depth image in which the distance to the object is represented by a gradation value for each pixel, and outputs the depth image.
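As an aside, the choice of modulation frequency fixes the maximum unambiguous range of an indirect ToF measurement, since one full cycle of phase corresponds to one modulation period of round-trip delay. The sketch below illustrates this relationship for the example frequencies mentioned above; it is not a specification of the ranging module 5000.

```python
C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(f_mod_hz: float) -> float:
    # One full 2*pi of phase corresponds to a round trip of one modulation period,
    # so the maximum unambiguous distance is c / (2 * f_mod).
    return C / (2.0 * f_mod_hz)

print(unambiguous_range(20e6))  # about 7.5 m at 20 MHz
print(unambiguous_range(5e6))   # about 30 m at 5 MHz
```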
[0083] The above-described imaging device 1 is used for the light receiving unit 5013, and for example, the imaging device 1 serving as the light receiving unit 5013 calculates the distance information for each pixel 51 from a signal intensity on the basis of the light emission control signal CLKp.
[0084] As described above, the imaging device 1 is incorporated as the light receiving unit 5013 of the ranging module 5000 that obtains and outputs the information associated with the distance to the subject by the indirect ToF method.
[0085] Fig. 12 is a diagram illustrating use examples of an imaging device 1 according to at least one example embodiment.
[0086] For example, the above-described imaging device 1 (image sensor) can be used in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays as described below. The imaging device 1 may be included in apparatuses such as a digital still camera and a portable device with a camera function which capture images, apparatuses for traffic such as an in-vehicle sensor that captures images of a vehicle to enable automatic stopping, recognition of a driver state, measuring distance, and the like. The imaging device 1 may be included in apparatuses for home appliances such as a TV, a refrigerator, and an air-conditioner in order to photograph a gesture of a user and to perform an apparatus operation in accordance with the gesture. The imaging device 1 may be included in apparatuses for medical or health care such as an endoscope and an apparatus that performs angiography through reception of infrared light. The imaging device 1 may be included in apparatuses for security such as a security monitoring camera and a personal authentication camera. The imaging device 1 may be included in an apparatus for beauty such as a skin measuring device that photographs skin. The imaging device 1 may be included in apparatuses for sports such as an action camera, a wearable camera for sports, and the like. The imaging device 1 may be included in apparatuses for agriculture such as a camera for monitoring a state of a farm or crop.
[0087] In view of the above, it should be appreciated that example embodiments provide imaging devices that improve quantum efficiency and/or increase charge saturation levels.
[0088] Any processing devices, control units, processing units, etc. discussed above may correspond to one or many computer processing devices, such as a Field Programmable Gate Array (FPGA), an Application-Specific Integrated Circuit (ASIC), any other type of Integrated Circuit (IC) chip, a collection of IC chips, a microcontroller, a collection of microcontrollers, a microprocessor, a Central Processing Unit (CPU), a digital signal processor (DSP), or a plurality of microprocessors that are configured to execute the instruction sets stored in memory.
[0089] As will be appreciated by one skilled in the art, aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
[0090] Any combination of one or more computer readable media may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0091] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0092] Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS). [0093] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatuses (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0094] These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0095] As used herein, the phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
[0096] The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
[0097] The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure. [0098] Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter. [0099] Example embodiments may be configured according to the following:
(1) An imaging device, comprising: a first photoelectric conversion region; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; and a capacitive structure including: a first conductive structure overlapping the first photoelectric conversion region; a second conductive structure overlapping the first photoelectric conversion region; and an insulating material between the first conductive structure and the second conductive structure.
(2) The imaging device of (1), wherein the first conductive structure includes a portion of the first photoelectric conversion region.
(3) The imaging device of one or more of (1) to (2), wherein the second conductive structure includes metal.
(4) The imaging device of one or more of (1) to (3), wherein the second conductive structure includes polysilicon.
(5) The imaging device of one or more of (1) to (4), wherein at least one of the first conductive structure, the second conductive structure, and the insulating material has a plurality of structures arranged at regular intervals.
(6) The imaging device of one or more of (1) to (5), wherein the second conductive structure overlaps a majority of the first photoelectric conversion region in a plan view.
(7) The imaging device of one or more of (1) to (6), further comprising: an isolation structure surrounding the capacitive structure, the first photoelectric conversion region, the first transfer transistor, and the second transfer transistor in the plan view.
(8) The imaging device of one or more of (1) to (7), further comprising: an overflow transistor coupled to the first photoelectric conversion region.
(9) The imaging device of one or more of (1) to (8), wherein the second conductive structure includes a first conductor and a second conductor spaced apart from the first conductor.
(10) The imaging device of one or more of (1) to (9), further comprising: a first floating diffusion region coupled to the first transfer transistor; a second floating diffusion region; a third transfer transistor coupled between the first floating diffusion region and the second floating diffusion region; a third floating diffusion region coupled to the second transfer transistor; a fourth floating diffusion region; and a fourth transfer transistor coupled between the third floating diffusion region and the fourth floating diffusion region.
(11) The imaging device of one or more of (1) to (10), further comprising: a first wiring electrically connecting the first conductor to the second floating diffusion region; and a second wiring electrically connecting the second conductor to the fourth floating diffusion region.
(12) The imaging device of one or more of (1) to (11), further comprising: an isolation structure that surrounds the first floating diffusion region, the second floating diffusion region, the third floating diffusion region, the fourth floating diffusion region, the first transfer transistor, and the second transfer transistor.
(13) The imaging device of one or more of (1) to (12), wherein the first photoelectric conversion region is disposed in a semiconductor substrate, and wherein the isolation structure extends from a first surface of the semiconductor substrate toward a second surface of the semiconductor substrate.
(14) The imaging device of one or more of (1) to (13), wherein, in a plan view, an edge of the second conductive structure is parallel with an edge of the first photoelectric conversion region.
(15) The imaging device of one or more of (1) to (14), wherein the first conductive structure includes an anode of the first photoelectric conversion region.
(16) An imaging device, comprising: a first photoelectric conversion region including a first surface and a second surface opposite the first surface, wherein the first surface is a light incident surface; a first transfer transistor coupled to the first photoelectric conversion region; a first floating diffusion region coupled to the first transfer transistor; a second transfer transistor coupled to the first photoelectric conversion region; a second floating diffusion region coupled to the second transfer transistor; and a structure including: a portion of the first photoelectric conversion region at the second surface; and an insulating material on the portion of the first photoelectric conversion region and further away from the first surface than the portion of the first photoelectric conversion region, wherein the portion of the first photoelectric conversion region and the insulating material have uneven surfaces.
(17) The imaging device of one or more of (1) to (16), further comprising: a conductive structure on the insulating material.
(18) The imaging device of one or more of (1) to (17), wherein the uneven surfaces are triangular shaped in a cross sectional view and are arranged at regular intervals.
(19) The imaging device of one or more of (1) to (18), further comprising: an isolation structure surrounding the conductive structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in the plan view.
(20) An imaging device, comprising: a first photoelectric conversion region disposed in a semiconductor substrate; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; and a capacitive structure overlapping the first photoelectric conversion region and including: a first conductor overlapping the first photoelectric conversion region; a second conductor overlapping the first photoelectric conversion region; a third conductor spaced apart from and electrically isolated from the second conductor; and an insulating material sandwiched between the first conductor and the second and third conductors.
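[Editorial illustration, not part of the disclosure or claims] Embodiments (9) to (11) describe a pixel in which the two conductors of the capacitive structure are wired to additional floating diffusion regions reached through third and fourth transfer transistors, which suggests extra capacitance that can be switched onto each readout node. The following is a minimal sketch of why that is useful, under the usual charge-to-voltage assumptions at a floating diffusion node; the function names and numeric values are hypothetical and not taken from the disclosure.

```python
# Editorial illustration only; hypothetical values and names, not from the disclosure.
# Models the charge-to-voltage conversion at a floating diffusion (FD) node with and
# without an extra capacitor (such as the capacitive structure of embodiments (9)-(11))
# switched onto the node through a transfer transistor.

ELEMENTARY_CHARGE_C = 1.602e-19  # charge of one electron, in coulombs


def conversion_gain_uV_per_e(c_fd_fF: float, c_extra_fF: float = 0.0) -> float:
    """Conversion gain (microvolts per electron) for the total node capacitance."""
    c_total_F = (c_fd_fF + c_extra_fF) * 1e-15
    return ELEMENTARY_CHARGE_C / c_total_F * 1e6


def full_well_electrons(v_swing_V: float, c_fd_fF: float, c_extra_fF: float = 0.0) -> float:
    """Electrons that fit within the usable FD voltage swing before saturation."""
    c_total_F = (c_fd_fF + c_extra_fF) * 1e-15
    return v_swing_V * c_total_F / ELEMENTARY_CHARGE_C


if __name__ == "__main__":
    # Hypothetical numbers: 2 fF intrinsic FD capacitance, 10 fF added by the
    # capacitive structure, 1 V usable voltage swing on the FD node.
    for label, c_extra_fF in (("FD alone", 0.0), ("FD + capacitive structure", 10.0)):
        cg = conversion_gain_uV_per_e(2.0, c_extra_fF)
        fw = full_well_electrons(1.0, 2.0, c_extra_fF)
        print(f"{label}: {cg:.1f} uV/e-, full well ~ {fw:.0f} e-")
```

With these hypothetical values, switching the structure onto the node lowers the conversion gain from roughly 80 uV per electron to about 13 uV per electron while raising the storable charge about six-fold, which is the usual motivation for coupling a pixel-level capacitor to a floating diffusion (dual conversion gain and extended full well).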

Claims

It is claimed:
1. An imaging device, comprising: a first photoelectric conversion region; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; and a capacitive structure including: a first conductive structure overlapping the first photoelectric conversion region; a second conductive structure overlapping the first photoelectric conversion region; and an insulating material between the first conductive structure and the second conductive structure.
2. The imaging device of claim 1, wherein the first conductive structure includes a portion of the first photoelectric conversion region.
3. The imaging device of claim 1, wherein the second conductive structure includes metal.
4. The imaging device of claim 1, wherein the second conductive structure includes polysilicon.
5. The imaging device of claim 1, wherein at least one of the first conductive structure, the second conductive structure, and the insulating material has a plurality of structures arranged at regular intervals.
6. The imaging device of claim 1, wherein the second conductive structure overlaps a majority of the first photoelectric conversion region in a plan view.
7. The imaging device of claim 1, further comprising: an isolation structure surrounding the capacitive structure, the first photoelectric conversion region, the first transfer transistor, and the second transfer transistor in a plan view.
8. The imaging device of claim 7, further comprising: an overflow transistor coupled to the first photoelectric conversion region.
9. The imaging device of claim 1, wherein the second conductive structure includes a first conductor and a second conductor spaced apart from the first conductor.
10. The imaging device of claim 9, further comprising: a first floating diffusion region coupled to the first transfer transistor; a second floating diffusion region; a third transfer transistor coupled between the first floating diffusion region and the second floating diffusion region; a third floating diffusion region coupled to the second transfer transistor; a fourth floating diffusion region; and a fourth transfer transistor coupled between the third floating diffusion region and the fourth floating diffusion region.
11. The imaging device of claim 10, further comprising: a first wiring electrically connecting the first conductor to the second floating diffusion region; and a second wiring electrically connecting the second conductor to the fourth floating diffusion region.
12. The imaging device of claim 10, further comprising: an isolation structure that surrounds the first floating diffusion region, the second floating diffusion region, the third floating diffusion region, the fourth floating diffusion region, the first transfer transistor, and the second transfer transistor.
13. The imaging device of claim 12, wherein the first photoelectric conversion region is disposed in a semiconductor substrate, and wherein the isolation structure extends from a first surface of the semiconductor substrate toward a second surface of the semiconductor substrate.
14. The imaging device of claim 1, wherein, in a plan view, an edge of the second conductive structure is parallel with an edge of the first photoelectric conversion region.
15. The imaging device of claim 1, wherein the first conductive structure includes an anode of the first photoelectric conversion region.
16. An imaging device, comprising: a first photoelectric conversion region including a first surface and a second surface opposite the first surface, wherein the first surface is a light incident surface; a first transfer transistor coupled to the first photoelectric conversion region; a first floating diffusion region coupled to the first transfer transistor; a second transfer transistor coupled to the first photoelectric conversion region; a second floating diffusion region coupled to the second transfer transistor; and a structure including: a portion of the first photoelectric conversion region at the second surface; and an insulating material on the portion of the first photoelectric conversion region and further away from the first surface than the portion of the first photoelectric conversion region, wherein the portion of the first photoelectric conversion region and the insulating material have uneven surfaces.
17. The imaging device of claim 16, further comprising: a conductive structure on the insulating material.
18. The imaging device of claim 16, wherein the uneven surfaces are triangular shaped in a cross sectional view and are arranged at regular intervals.
19. The imaging device of claim 17, further comprising: an isolation structure surrounding the conductive structure, the first photoelectric conversion region, the first floating diffusion region, the second floating diffusion region, the first transfer transistor, and the second transfer transistor in a plan view.
20. An imaging device, comprising: a first photoelectric conversion region disposed in a semiconductor substrate; a first transfer transistor coupled to the first photoelectric conversion region; a second transfer transistor coupled to the first photoelectric conversion region; and a capacitive structure overlapping the first photoelectric conversion region and including: a first conductor overlapping the first photoelectric conversion region; a second conductor overlapping the first photoelectric conversion region; a third conductor spaced apart from and electrically isolated from the second conductor; and an insulating material sandwiched between the first conductor and the second and third conductors.
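[Editorial illustration, not part of the claims] Claims 16 to 18 recite a capacitor formed by a portion of the photoelectric conversion region, an insulating material, and a conductive structure, with the facing surfaces corrugated into triangles arranged at regular intervals. Under a simple parallel-plate approximation, such corrugation increases capacitance per unit footprint roughly in proportion to the extra facing surface area. The sketch below estimates that enhancement for a hypothetical one-dimensional (groove-like) triangular profile; the names and dimensions are illustrative only.

```python
# Editorial illustration only; hypothetical geometry, not from the disclosure.
import math


def triangular_area_enhancement(period_um: float, height_um: float) -> float:
    """Ratio of corrugated surface length to flat length for a symmetric triangular
    (sawtooth) cross-section of the given period and peak height: each period of
    length p contributes two facets of length sqrt((p/2)^2 + h^2)."""
    p, h = period_um, height_um
    return 2.0 * math.sqrt((p / 2.0) ** 2 + h ** 2) / p


def corrugated_capacitance_fF(flat_cap_fF: float, period_um: float, height_um: float) -> float:
    """Parallel-plate estimate: capacitance scales with facing area, so a flat-plate
    capacitance is multiplied by the area-enhancement factor of the corrugation."""
    return flat_cap_fF * triangular_area_enhancement(period_um, height_um)


if __name__ == "__main__":
    # Hypothetical geometry: 0.4 um period, 0.3 um peak height triangles.
    factor = triangular_area_enhancement(0.4, 0.3)
    print(f"area enhancement ~ {factor:.2f}x")                        # ~ 1.80x
    print(f"2 fF flat plate -> {corrugated_capacitance_fF(2.0, 0.4, 0.3):.2f} fF")
```

The parallel-plate scaling is only a first-order estimate; fringing fields and the depletion behavior of the semiconductor plate would change the numbers, but the qualitative point, that regularly spaced uneven surfaces raise capacitance within the same pixel footprint, holds.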

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2020/000397 WO2021234423A1 (en) 2020-05-21 2020-05-21 Capacitive structures for imaging devices and imaging apparatuses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2020/000397 WO2021234423A1 (en) 2020-05-21 2020-05-21 Capacitive structures for imaging devices and imaging apparatuses

Publications (1)

Publication Number Publication Date
WO2021234423A1 (en)

Family

ID=71078539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/000397 WO2021234423A1 (en) 2020-05-21 2020-05-21 Capacitive structures for imaging devices and imaging apparatuses

Country Status (1)

Country Link
WO (1) WO2021234423A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190122918A1 (en) * 2016-03-31 2019-04-25 Sony Corporation Imaging device, method of manufacturing imaging device, and electronic device
US20190148448A1 (en) * 2017-11-13 2019-05-16 SK Hynix Inc. Image sensor
EP3598501A2 (en) * 2018-07-17 2020-01-22 Brillnics Inc. Solid-state imaging device, method for fabricating solid-state imaging device, and electronic apparatus
EP3598499A2 (en) * 2018-07-18 2020-01-22 Sony Semiconductor Solutions Corporation Light receiving element, ranging module, and electronic apparatus

Similar Documents

Publication Publication Date Title
CN109643722B (en) Sensor chip and electronic device
US8537218B2 (en) Distance image sensor and method for generating image signal by time-of-flight method
JP6879919B2 (en) Manufacturing method of solid-state image sensor, electronic device, and solid-state image sensor
US20180302597A1 (en) Solid-state image capturing device and electronic device
US8520104B2 (en) Image sensor devices having dual-gated charge storage regions therein
TWI721031B (en) Image sensing device
US11741622B2 (en) Imaging devices and multiple camera interference rejection
CN107665897A (en) Light sensing device and optical detection system
JP2020516056A (en) Electromagnetic radiation detector
US10833207B2 (en) Photo-detection device, photo-detection system, and mobile apparatus
US20190319154A1 (en) Photo-detection device, photo-detection system, and mobile apparatus
US20180149752A1 (en) Imaging apparatus and imaging control method
US20220238577A1 (en) Imaging devices with multi-phase gated time-of-flight pixels
WO2021234423A1 (en) Capacitive structures for imaging devices and imaging apparatuses
US11955494B2 (en) Power supply contact sharing for imaging devices
US20220216253A1 (en) Capacitance matched metal wirings in dual conversion gain pixels
US20220247952A1 (en) Imaging devices with gated time-of-flight pixels with fast charge transfer
US20220217289A1 (en) Dual mode imaging devices
US20220260716A1 (en) Imaging devices for capturing color and depth information
US20220238579A1 (en) Simultaneous capture of multiple phases for imaging devices
JP2022082557A (en) Photodetector and electronic equipment
JP2012211835A (en) Radiographic image detector
US20120248505A1 (en) Light receiving device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20731939

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20731939

Country of ref document: EP

Kind code of ref document: A1