WO2021172176A1 - Solid-state imaging device and electronic apparatus - Google Patents

Solid-state imaging device and electronic apparatus

Info

Publication number
WO2021172176A1
Authority
WO
WIPO (PCT)
Prior art keywords
solid-state image sensor
trench
semiconductor substrate
Prior art date
Application number
PCT/JP2021/006245
Other languages
English (en)
Japanese (ja)
Inventor
吉田 慎一 (Shinichi Yoshida)
Original Assignee
ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Publication of WO2021172176A1

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/70 Manufacture or treatment of devices consisting of a plurality of solid state components formed in or on a common substrate or of parts thereof; Manufacture of integrated circuit devices or of parts thereof
    • H01L21/71 Manufacture of specific parts of devices defined in group H01L21/70
    • H01L21/76 Making of isolation regions between components
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/70 Manufacture or treatment of devices consisting of a plurality of solid state components formed in or on a common substrate or of parts thereof; Manufacture of integrated circuit devices or of parts thereof
    • H01L21/71 Manufacture of specific parts of devices defined in group H01L21/70
    • H01L21/76 Making of isolation regions between components
    • H01L21/761 PN junctions
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Definitions

  • This technology relates to solid-state image sensors and electronic devices.
  • CMOS (Complementary Metal Oxide Semiconductor)
  • CCD (Charge Coupled Device)
  • Patent Documents 1 and 2 propose a solid-state image sensor that employs a trench (Deep Trench Isolation) formed between two adjacent pixels.
  • However, the solid-state image sensors of Patent Documents 1 and 2 may not be able to further improve the quality and reliability of the solid-state image sensor.
  • This technology was made in view of such a situation, and its main purpose is to provide a solid-state image sensor capable of further improving the quality and reliability of the solid-state image sensor, and an electronic device equipped with the solid-state image sensor.
  • the present inventors have succeeded in further improving the quality and reliability of the solid-state image sensor, and have completed the present technology.
  • a solid-state image sensor in which the PN junction region is formed by self-alignment.
  • the width of the P-type region arranged on the side wall of the trench and the width of the P-type region arranged on the bottom surface of the trench (the upper part of the trench in the drawings) may be substantially the same.
  • a pixel transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side may be provided.
  • the pixel transistor may be arranged above the trench.
  • the solid-state image sensor may be provided with a floating diffusion formed on the surface of the semiconductor substrate opposite to the light incident side.
  • the floating diffusion may be arranged above the trench.
  • the floating diffusion may be shared by a plurality of the pixels.
  • a transfer gate and a floating diffusion formed on the surface of the semiconductor substrate on the opposite side of the light incident side may be provided.
  • the transfer gate and the floating diffusion may be located above the trench.
  • the transfer gate may be formed for each of the pixels.
  • the floating diffusion may be shared by a plurality of the pixels.
  • the floating diffusion may be surrounded by a plurality of the transfer gates.
  • the photoelectric conversion unit may be embedded below the semiconductor substrate.
  • a vertical gate for transferring the signal charge generated by the photoelectric conversion unit by photoelectric conversion may be formed above the semiconductor substrate.
  • An amplification transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side may be provided.
  • the amplification transistor may be arranged above the trench.
  • the amplification transistor may be shared by a plurality of the pixels.
  • a reset transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side may be provided.
  • the reset transistor may be arranged above the trench.
  • the reset transistor may be shared by a plurality of the pixels.
  • a selection transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side may be provided.
  • the selection transistor may be arranged above the trench.
  • the selection transistor may be shared by a plurality of the pixels.
  • this technology provides an electronic device equipped with a solid-state image sensor according to this technology.
  • FIG. 11 is a diagram showing usage examples of the solid-state image sensors of the first to third embodiments to which the present technology is applied. FIG. 12 is a functional block diagram of an example of the electronic device according to the fourth embodiment to which the present technology is applied. FIG. 13 is a diagram showing an example of the schematic configuration of an endoscopic surgery system. FIG. 14 is a block diagram showing an example of the functional configuration of a camera head and a CCU. FIG. 15 is a block diagram showing an example of the schematic configuration of a vehicle control system. FIG. 16 is an explanatory diagram showing an example of the installation positions of a vehicle exterior information detection unit and an image pickup unit.
  • In a back-illuminated solid-state image sensor, a trench (RDTI, Reverse Deep Trench Isolation) is generally formed in the back-side process for pixel separation.
  • the trench formed in the back surface process can suppress the leakage of obliquely incident light to adjacent pixels.
  • the electrical separation of each pixel is realized by forming a P + region by implantation.
  • a pinning film is formed on the side wall of the trench and the surface on the back surface side (the surface on the light incident side) of the semiconductor substrate (silicon substrate).
  • The pixels are separated by a trench formed from the back surface side, and a PN junction is formed on the side wall of the trench by using a conformal doping process such as solid phase diffusion or plasma doping; this structure realizes an increase in the saturation charge amount (Qs).
  • In one known structure, the side wall of the trench and the back surface (the surface on the light incident side) of the semiconductor substrate (silicon substrate) are pinned by a boron-doped P+ region; in another, they are pinned by a film having a negative fixed charge (pinning film).
  • The FD (Floating Diffusion), the pixel transistor, and the like are arranged in one pixel. That is, an FD (Floating Diffusion) and a pixel transistor are arranged for each pixel, and the FD (Floating Diffusion) and the pixel transistor (for example, an amplification transistor, a reset transistor, a selection transistor, etc.) are not shared among pixels.
  • Since the trench formed in the back-side process cannot be heat-treated at a high temperature after the FEOL process, dry-etching damage at the time of forming the trench may remain, and white spots and dark current may be degraded. Furthermore, since the Boron I/I layer (boron ion implantation layer, i.e., the layer into which boron is implanted) around the RDTI (the RDTI side-wall pinning) expands in the lateral direction, the pixel volume is reduced and the saturation charge amount (Qs) may decrease, especially in fine pixels.
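The Qs loss from lateral expansion of the side-wall pinning layer can be illustrated with a back-of-the-envelope sketch. The numbers below (pixel pitch, pinning-layer width, and the assumption that Qs scales with photodiode footprint) are placeholders for illustration only, not values from this publication:

```python
# Hypothetical numbers for illustration only; real values depend on the process.
def pd_volume_fraction(pitch_um: float, pinning_width_um: float) -> float:
    """Fraction of a square pixel footprint left for the photodiode when a
    pinning layer of the given width lines both side walls of the trench."""
    active = pitch_um - 2.0 * pinning_width_um
    if active <= 0:
        return 0.0
    return (active / pitch_um) ** 2  # both lateral axes shrink

# The same 0.1 um layer costs far more area in a fine pixel than in a large one.
for pitch in (2.0, 1.0, 0.7):
    frac = pd_volume_fraction(pitch, 0.1)
    print(f"pitch {pitch} um -> PD area fraction {frac:.2f}")
```

This makes concrete why the Qs penalty of a laterally expanding boron layer is worst "especially in fine pixels": the lost border width is fixed while the pixel shrinks.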
  • This technology was made in view of the above circumstances. According to this technology, the saturation charge amount (Qs) can be improved by forming a PN junction on the side wall of the trench, and white spots and/or dark current can be reduced because DTI (trench) formation damage is recovered by heat treatment during FEOL formation. Further, according to the present technology, noise can be improved by expanding the area of the amplification transistor (Amp. transistor) through sharing of the FD and well contacts.
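The link between amplification-transistor area and noise can be sketched with the standard MOSFET flicker-noise model, S_v(f) = K / (Cox · W · L · f), where the gate-referred 1/f noise power is inversely proportional to gate area. The constants below are arbitrary placeholders, not device parameters from this publication:

```python
# Standard 1/f (flicker) noise model sketch: S_v(f) = K / (Cox * W * L * f).
# K and cox are illustrative placeholder constants.
def flicker_noise_psd(K: float, cox: float, W: float, L: float, f: float) -> float:
    """Gate-referred 1/f noise power spectral density [V^2/Hz]."""
    return K / (cox * W * L * f)

small = flicker_noise_psd(1e-25, 5e-3, 0.5e-6, 0.3e-6, 1e3)
large = flicker_noise_psd(1e-25, 5e-3, 1.0e-6, 0.6e-6, 1e3)  # 2x W, 2x L
print(large / small)  # quadrupling the gate area cuts the PSD to one quarter
```

This is why freeing substrate area for the amplification transistor (as the shared-FD layout above allows) directly improves noise characteristics.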
  • First Embodiment (Example 1 of the solid-state image sensor): The solid-state image sensor of the first embodiment (Example 1 of the solid-state image sensor) according to the present technology will be described with reference to FIGS. 1 to 6.
  • FIG. 1 is a diagram showing a configuration example of the solid-state image sensor of the first embodiment according to the present technology, and more specifically, is a diagram showing the solid-state image sensor 101 of the first embodiment according to the present technology.
  • a non-penetrating trench 1 is formed in the solid-state image sensor 101 from the back surface side (light incident side, lower side of FIG. 1) of the semiconductor substrate 15.
  • the solid-state image sensor 101 has a structure in which a PN junction is formed in a self-aligned manner on the side wall of the non-penetrating trench 1 (in FIG. 1, the left and right walls of the non-penetrating trench 1).
  • The elements (floating diffusion (FD) 40, pixel transistor 50, etc.) and the non-penetrating trench 1 are arranged vertically (in the vertical direction in FIG. 1).
  • the signal charge photoelectrically converted by the photoelectric conversion unit (photodiode (PD, Photo Diode) 30) is transmitted in the vertical direction to the floating diffusion (FD) 40 via the transfer transistor (pixel transistor 50) (FIG. 1).
  • a vertical gate (VG (Vertical Gate)) 41 is formed to perform transfer in the vertical direction.
  • By arranging each element formed in the FEOL process (the FD 40, the pixel transistor 50, and the well contact) and the non-penetrating trench 1 in the vertical direction, the FD 40 and the well contact can be shared by a plurality of pixels and the pixel transistors can be arranged above the trench 1, so that the area of the semiconductor substrate (Si substrate) can be used effectively. As a result, the area of the amplification transistor (Amp. transistor) can be expanded as much as possible, so that good noise characteristics can be realized.
  • the solid-state image sensor 101 has a PD (photodiode) 30 which is a photoelectric conversion element of each pixel formed inside the Si substrate 15.
  • A P-type layer 64 is formed on the light incident side of the PD 30 (the lower side, back surface side in the drawing), a film 70 having a negative fixed charge is formed on the lower layer of the P-type layer 64, and a flattening film 13 is formed under the film 70.
  • a light-shielding film 14 is formed on the flattening film 13.
  • the light-shielding film 14 is provided to prevent light from leaking to adjacent pixels, and is formed between adjacent PDs 30.
  • the light-shielding film 14 is made of, for example, a metal material such as W (tungsten).
  • An OCL (on-chip lens) 11 is formed on the flattening film 13 (lower side in FIG. 1) and on the back surface side of the Si substrate 15 to collect the incident light on the PD 30.
  • The OCL 11 can be formed of an inorganic material; for example, SiN, SiO, or SiOxNy (where 0 < x ≤ 1 and 0 < y ≤ 1) can be used.
  • a cover glass or a transparent plate such as a resin may be adhered on the OCL 11. Further, a color filter 12 is formed between the OCL 11 and the flattening film 13.
  • the color filter 12 is provided with a plurality of color filters for each pixel, and the colors of the color filters can be configured to be arranged according to, for example, a Bayer arrangement.
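The Bayer arrangement mentioned above can be sketched as a 2×2 color tile repeated over the pixel array, one filter color per pixel. A minimal illustration (the R/G/G/B tile orientation is one common convention; actual sensors may use a rotated variant):

```python
# Minimal sketch of a Bayer arrangement: a 2x2 tile (R, G / G, B) repeated
# across the pixel array, one color filter per pixel.
def bayer_color(row: int, col: int) -> str:
    tile = [["R", "G"],
            ["G", "B"]]
    return tile[row % 2][col % 2]

# Print the filter layout of a small 4x4 pixel block.
for r in range(4):
    print(" ".join(bayer_color(r, c) for c in range(4)))
```

Green appears twice per tile, matching the eye's higher sensitivity to green; a demosaicing step later interpolates the two missing colors at each pixel.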
  • A wiring layer 16 is formed on the front surface side of the semiconductor substrate (Si substrate) 15 (the side opposite to the light incident side, the upper side of the drawing), and a pixel transistor 50 is formed at the interface between the wiring layer 16 and the Si substrate 15.
  • a wiring layer 17 is formed on the upper side of the wiring layer 16, and a logic substrate 18 is formed on the upper side of the wiring layer 17.
  • The Si substrate serving as the pixel substrate (sensor substrate) and the circuit board (for example, the logic substrate 18) are electrically bonded and joined by the Cu-Cu junction 19 via the junction P (consisting of the junction surface P1 of the wiring layer 16 and the junction surface P2 of the wiring layer 17).
  • the trench 1 is formed between the pixels.
  • the trench 1 is formed between adjacent pixels so as not to penetrate the Si substrate 15 upward from the back surface side of the Si substrate 15 (in the vertical direction in the drawing, the direction from the back surface to the front surface).
  • the trench 1 also functions as a light-shielding wall between pixels so that unnecessary light does not leak to adjacent pixels.
  • a P-type region (P-type impurity region) 60 is formed between the PD 30 and the trench 1 (between the left and right directions in FIG. 1) along the side wall of the trench 1. Further, the P-type region (P-type impurity region) 60 is also formed along the bottom surface of the trench 1.
  • the P-type region (P-type impurity region) 60 may be referred to as a P-type solid phase diffusion layer.
  • the PD30 is composed of an N-type region. Photoelectric conversion is performed in some or all of these N-type regions.
  • A side wall film 61 made of, for example, SiO2 is formed on the inner wall of the trench 1, and a filler 62 made of, for example, polysilicon is embedded inside the side wall film 61.
  • SiN may be adopted instead of the SiO2 adopted for the side wall film 61.
  • Doped polysilicon or a light-shielding metal may be used instead of the polysilicon used for the filler 62.
  • FIGS. 2 and 3 are diagrams showing a configuration example of the solid-state image sensor of the first embodiment according to the present technology.
  • FIG. 2 is a plan layout view of the solid-state image sensor 123 of the first embodiment according to the present technology, viewed from the front surface side (the side opposite to the light incident side).
  • FIG. 3A is a cross-sectional view of the solid-state image sensor 123 (123A) taken along the line A-A' shown in FIG. 2.
  • FIG. 3B is a cross-sectional view of the solid-state image sensor 123 (123B) taken along the line B-B' shown in FIG. 2.
  • The non-penetrating trench 1 is formed in a grid pattern between adjacent pixels so as to surround each pixel (one pixel is denoted as pixel GS in FIG. 2).
  • A reset transistor (RST Tr.) 51, transfer gates (TG) (transfer transistors) 52 and 53, a selection transistor (SEL Tr.) 54, a well contact (VSS) 59, amplifier (AMP) transistors AMP1 and AMP2, a floating diffusion (FD) 40, and wiring 80 are arranged above the non-penetrating trench 1.
  • A photoelectric conversion unit (photodiode (PD)) 30 is embedded in the semiconductor substrate 15 for each pixel, on the far side of the paper surface of FIG. 2.
  • The floating diffusion (FD) 40 is formed so as to be shared by four pixels GS (2 × 2 pixels), and the amplifier (AMP) transistors AMP1 and AMP2 are also formed so as to be shared by four pixels GS (2 × 2 pixels).
  • The number of shared pixels is not limited to 4 pixels (2 × 2 pixels); it may be, for example, 8 pixels (2 × 4 pixels), 9 pixels (3 × 3 pixels), or 16 pixels (4 × 4 pixels).
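The sharing ratios described in this layout (one transfer gate per pixel; the FD and the two AMP transistors shared by four pixels; RST and SEL shared by two pixels) can be tallied to see how sharing frees substrate area. This is illustrative bookkeeping under those stated ratios, not a figure quoted from the publication:

```python
from fractions import Fraction

# Effective transistor count per pixel under the sharing described for this
# embodiment (illustrative bookkeeping only).
def transistors_per_pixel(amp_count=2, amp_share=4, rst_share=2, sel_share=2):
    return (Fraction(1, 1)                    # one transfer gate per pixel
            + Fraction(amp_count, amp_share)  # AMP1/AMP2 shared by 4 pixels
            + Fraction(1, rst_share)          # reset transistor shared by 2
            + Fraction(1, sel_share))         # selection transistor shared by 2

print(transistors_per_pixel())  # 5/2, i.e. 2.5 effective transistors per pixel
```

Compared with an unshared layout (one TG, RST, SEL, and AMP per pixel, i.e. 4 transistors), the saved area can be given to the amplification transistors, which is the noise benefit the text describes.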
  • The reset transistor (RST Tr.) 51 and the selection transistor (SEL Tr.) 54 are formed so as to be shared by two pixels, and the transfer gates (TG) (transfer transistors) 52 and 53 are formed for each pixel.
  • Reset transistor (RST Tr.) 51, transfer gate (TG) (transfer transistor) 52 and 53, selection transistor (SELTr.) 54, floating diffusion (FD) 40 and wiring 80 are arranged.
  • Each of the reset transistor (RST Tr.) 51, the transfer gates (TG) (transfer transistors) 52 and 53, and the selection transistor (SEL Tr.) 54 is connected to the wiring 80.
  • The signal charge photoelectrically converted by the photoelectric conversion unit (photodiode (PD) 30) is transferred (for each pixel) to the corresponding one of the transfer gates (transfer transistors) 52 and 53.
  • the transfer gates 52 and 53 read out the electric charge generated by the PD 30 and transfer it to the FD 40.
  • the FD 40 holds the charge read from the PD 30.
  • the reset transistor 51 resets the potential of the FD 40 by discharging the electric charge stored in the FD 40 to the drain (constant voltage source Vdd).
  • the amplifier (AMP) transistors AMP1 and AMP2 output a pixel signal according to the potential of the FD40.
  • The amplifier (AMP) transistors AMP1 and AMP2 constitute a source follower circuit with a load MOS serving as a constant current source connected via a vertical signal line, and a pixel signal indicating a level corresponding to the charge stored in the FD 40 is output from the amplifier (AMP) transistors AMP1 and AMP2 to the column processing unit via the selection transistor 54 and the vertical signal line.
  • the selection transistor 54 outputs the pixel signal of the pixel to the column processing unit via the vertical signal line.
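The readout sequence just described (reset the FD, transfer the photo-generated charge through the transfer gate, read the FD potential through the source follower) can be modeled as a toy simulation. The FD capacitance and source-follower gain below are illustrative placeholders, not device values from this publication:

```python
# Toy model of the readout sequence described above. All constants are
# illustrative placeholders.
Q_E = 1.602e-19   # elementary charge [C]

class PixelModel:
    def __init__(self, cfd_farad=1.0e-15, sf_gain=0.85):
        self.cfd = cfd_farad      # floating diffusion capacitance
        self.sf_gain = sf_gain    # source-follower (AMP transistor) gain
        self.fd_charge = 0.0

    def reset(self):
        """Reset transistor drains the FD charge to the constant voltage source."""
        self.fd_charge = 0.0

    def transfer(self, electrons: int):
        """Transfer gate moves PD charge onto the FD."""
        self.fd_charge += electrons * Q_E

    def read(self) -> float:
        """Selection transistor puts the SF output on the vertical signal line."""
        return self.sf_gain * self.fd_charge / self.cfd

px = PixelModel()
px.reset()
px.transfer(1000)             # 1000 photoelectrons
print(f"{px.read() * 1e3:.1f} mV on the vertical signal line")
```

Reading the FD once after reset and once after transfer, then subtracting, is the usual correlated double sampling step performed downstream in the column processing unit.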
  • Referring to FIGS. 4 to 6, a method for manufacturing the solid-state image sensor of the first embodiment (Example 1 of the solid-state image sensor) according to the present technology will be described.
  • FIGS. 4 to 6 are diagrams for explaining the method of manufacturing the solid-state image sensor 106 of the first embodiment (Example 1 of the solid-state image sensor) according to the present technology.
  • FIG. 4 is a diagram for explaining the manufacturing method of the solid-state image sensor 106 in the DTI (Deep Trench Isolation) step, the solid phase diffusion step, the Bond (bonding) step, and the FEOL (Front End Of Line) step, and FIG. 5 is likewise a diagram for explaining the manufacturing method of the solid-state image sensor 106 of the first embodiment.
  • FIG. 6 is a diagram for explaining the method of manufacturing the solid-state image sensor 106 of the first embodiment in the REOL (Reverse End Of Line) step and the Custom step, and is at the same time a diagram showing the solid-state image sensor 106 of the first embodiment according to the present technology.
  • (Trench formation step) As shown in FIG. 4(a), the trench 1 is formed upward from the back surface of the semiconductor (Si) substrate 15, using lithography and dry etching processes, so as not to penetrate the semiconductor (Si) substrate 15.
  • (Trench embedding step) From FIG. 4(b) to FIG. 4(c), the inside of the trench 1 is filled and flattened.
  • Examples of the embedded structure of the trench 1 include insulating films such as a silicon oxide film and a silicon nitride film, and materials such as polysilicon (Poly-Si) and tungsten (W).
  • FIG. 4(c) shows a laminated structure of the silicon oxide film 61 and polysilicon (Poly-Si) 62.
  • (Thinning step) The semiconductor (Si) substrate 15 on which the trench 1 is formed is thinned so that the thinned surface is located above the bottom surface of the trench 1.
  • (FEOL step) After thinning the wafer, as shown in FIG. 4(d) (FIG. 5(a)), pixel implantation is performed in the FEOL process to form the pixel transistor 50 and the photoelectric conversion unit (photodiode (PD)) 30, that is, to form each of a vertical gate (VG (Vertical Gate)) 41, the pixel transistor 50, the floating diffusion (FD) 40, the photoelectric conversion unit 30, and a well contact.
  • (BEOL step) As shown in FIG. 5(b), after the FEOL step, the BEOL step of forming the connection wiring that constitutes a circuit with elements such as the pixel transistor 50 is executed.
  • The logic circuit elements (Logic Tr.) are integrated on the logic substrate 18, which is a substrate different from the semiconductor substrate 15 constituting the pixels. Then, by Cu-Cu (copper-copper) bonding 19, the semiconductor substrate 15 constituting the pixels and the logic substrate 18 are electrically connected via the wiring layer 16 on the semiconductor substrate 15 side and the wiring layer 17 on the logic substrate 18 side.
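The ordering of the steps described in this section carries the key point of the technology: the trench (DTI) and solid phase diffusion come before the FEOL step, so the FEOL heat treatment can anneal out the dry-etch damage of trench formation. A simple ordered-step sketch (step names paraphrased from this section; the grouping of bonding with thinning is an assumption for readability):

```python
# Sketch of the manufacturing order described in this section. The essential
# point is that DTI formation precedes FEOL, so FEOL heat treatment can
# recover the dry-etching damage of trench formation.
PROCESS_FLOW = [
    "DTI (trench formation by lithography + dry etching)",
    "solid phase diffusion (P-type region on trench side wall)",
    "trench embedding and flattening",
    "bonding and wafer thinning",
    "FEOL (pixel transistors, PD, FD, well contact)",
    "BEOL (connection wiring)",
    "Cu-Cu bonding to logic substrate",
    "REOL (back-side steps)",
    "custom (color filter, on-chip lens)",
]

dti = next(i for i, s in enumerate(PROCESS_FLOW) if s.startswith("DTI"))
feol = next(i for i, s in enumerate(PROCESS_FLOW) if s.startswith("FEOL"))
assert dti < feol, "trench must exist before the FEOL anneal can heal it"
print("flow order OK:", len(PROCESS_FLOW), "steps")
```

In the conventional flow criticized earlier, the RDTI is cut in the back-side process after FEOL, which is exactly why its etch damage cannot be annealed at high temperature.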
  • Second Embodiment (Example 2 of the solid-state image sensor): The solid-state image sensor of the second embodiment (Example 2 of the solid-state image sensor) according to the present technology will be described with reference to FIGS. 7 and 8.
  • FIGS. 7 and 8 are diagrams for explaining the method of manufacturing the solid-state image sensor 108 of the second embodiment (Example 2 of the solid-state image sensor) according to the present technology, in the REOL step.
  • FIG. 8 is at the same time a diagram showing the solid-state image sensor 108 of the second embodiment according to the present technology.
  • The method for manufacturing the solid-state image sensor 108 executes the same steps as the method for manufacturing the solid-state image sensor of the first embodiment (Example 1 of the solid-state image sensor) according to the present technology, up to the step of bonding the wafer to the logic substrate 18. In FIG. 9A, the solid-state image sensor 106 is shown for comparison (for reference).
  • After removing the film embedded in the trench 1 (for example, the silicon oxide film 61 and the polysilicon (Poly-Si) 62) while leaving the P-type region 60 (for example, the region containing BSG (boron silicate glass)), the trench 1 is re-embedded with a film (pinning film) 70 having a negative fixed charge.
  • The film (pinning film) 70 having a negative fixed charge is an insulating film that induces a negative charge on the silicon (Si) surface (including the side wall of the trench 1), and is, for example, a high dielectric constant insulating film of HfO2, Al2O3, or the like.
  • Re-embedding is then performed with a light-shielding metal or an insulating film such as a silicon oxide film so that it is placed on the film (pinning film) 70 having a negative fixed charge.
  • Examples of the light-shielding metal material include tungsten (W) and aluminum (Al).
  • Here, tungsten (W) 63 is embedded in the trench 1.
  • Finally, a custom step is executed to form the color filter 12, the on-chip lens (OCL (On Chip Lens)) 11, and the like, and the solid-state image sensor 108 is manufactured.
  • Unless there is a particular technical contradiction, the contents described for the solid-state image sensor of the second embodiment according to the present technology can be applied to the solid-state image sensor of the first embodiment according to the present technology described above and to the solid-state image sensor according to the third embodiment of the present technology described later.
  • Third Embodiment (Example 3 of the solid-state image sensor): FIGS. 9 and 10 are diagrams for explaining the method of manufacturing the solid-state image sensor 110 of the third embodiment (Example 3 of the solid-state image sensor) according to the present technology.
  • FIG. 9 is a diagram for explaining the method of manufacturing the solid-state image sensor 110 of the third embodiment according to the present technology in the REOL step.
  • FIG. 10 is a diagram for explaining the manufacturing method and is at the same time a diagram showing the solid-state image sensor 110 of the third embodiment according to the present technology.
  • The method for manufacturing the solid-state image sensor 110 executes the same steps as the method for manufacturing the solid-state image sensor of the first embodiment (Example 1 of the solid-state image sensor) according to the present technology, up to the step of wafer bonding with the logic substrate 18. However, in the method for manufacturing the solid-state image sensor 110, the solid phase diffusion step shown in FIG. 4B is not executed (the P-type region 60 is not formed). In FIG. 9A, the solid-state image sensor 106 is shown for comparison (for reference).
  • After removing the film embedded in the trench 1 (for example, the silicon oxide film 61 and the polysilicon (Poly-Si) 62), the trench 1 is re-embedded with a film (pinning film) 70 having a negative fixed charge.
  • The film (pinning film) 70 having a negative fixed charge is an insulating film that induces a negative charge on the silicon (Si) surface (including the side wall of the trench 1), and is, for example, a high dielectric constant insulating film of HfO2, Al2O3, or the like.
  • Re-embedding is then performed with a light-shielding metal or an insulating film such as a silicon oxide film so that it is placed on the film (pinning film) 70 having a negative fixed charge.
  • Examples of the light-shielding metal material include tungsten (W) and aluminum (Al).
  • Here, tungsten (W) 63 is embedded in the trench 1.
  • Because the film (pinning film) 70 having a negative fixed charge is formed, a P-type region (P-type impurity region) 60-1 is formed on the side wall and the bottom surface of the trench 1.
  • Finally, a custom step is executed to form the color filter 12, the on-chip lens (OCL (On Chip Lens)) 11, and the like, and the solid-state image sensor 110 is manufactured.
  • Unless there is a particular technical contradiction, the contents described for the solid-state image sensor of the third embodiment can be applied to the solid-state image sensors of the first and second embodiments of the present technology described above.
  • The electronic device of the fourth embodiment according to the present technology is an electronic device equipped with the solid-state image sensor of any one of the first to third embodiments according to the present technology.
  • FIG. 11 is a diagram showing an example of using the solid-state image sensor of the first to third embodiments according to the present technology as an image sensor.
  • The solid-state image sensors of the first to third embodiments described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays, as described below. That is, as shown in FIG. 11, the solid-state image sensor of any one of the first to third embodiments can be used in devices (for example, the electronic device of the fourth embodiment described above) in fields such as appreciation, transportation, home appliances, medical care/healthcare, security, beauty, sports, and agriculture.
  • the first to third implementations are applied to devices for taking images to be used for appreciation, such as digital cameras, smartphones, and mobile phones with a camera function.
  • the solid-state imaging device of any one of the embodiments can be used.
  • in-vehicle sensors that photograph the front, rear, surroundings, inside of a vehicle, etc., and monitor traveling vehicles and roads for safe driving such as automatic stop and recognition of the driver's condition.
  • the solid-state imaging device of any one of the first to third embodiments is used as a device used for traffic such as a surveillance camera and a distance measuring sensor for measuring distance between vehicles. be able to.
  • devices used in home appliances such as television receivers, refrigerators, and air conditioners in order to photograph a user's gesture and operate the device according to the gesture.
  • the solid-state imaging device of any one of the third embodiments can be used.
  • the first to third implementations are applied to devices used for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light.
  • the solid-state imaging device of any one of the embodiments can be used.
  • a device used for security such as a surveillance camera for crime prevention and a camera for personal authentication is used as a solid body of any one of the first to third embodiments.
  • An image sensor can be used.
  • a skin measuring device for photographing the skin for example, a microscope for photographing the scalp, and other devices used for cosmetology are equipped with any one of the first to third embodiments.
  • a solid-state imaging device of the form can be used.
  • a solid-state image sensor In the field of sports, for example, a solid-state image sensor according to any one of the first to third embodiments is used as a device used for sports such as an action camera or a wearable camera for sports applications. Can be used.
  • In the field of agriculture, the solid-state image sensor of any one of the first to third embodiments can be used in devices such as cameras for monitoring the condition of fields and crops.
  • The solid-state image sensor of any one of the first to third embodiments described above can be applied, as the solid-state image sensor 101M, to all types of electronic devices having an image pickup function, for example, camera systems such as digital still cameras and video cameras, and mobile phones having an image pickup function.
  • FIG. 12 shows a schematic configuration of the electronic device 102 (camera) as an example.
  • The electronic device 102 is, for example, a camera capable of capturing still images or moving images, and has a solid-state image sensor 101M, an optical system (optical lens) 310, a shutter device 311, a drive unit 313 that drives the solid-state image sensor 101M and the shutter device 311, and a signal processing unit 312.
  • the optical system 310 guides the image light (incident light) from the subject to the pixel portion 101a of the solid-state image sensor 101M.
  • the optical system 310 may be composed of a plurality of optical lenses.
  • the shutter device 311 controls the light irradiation period and the light blocking period of the solid-state image sensor 101M.
  • the drive unit 313 controls the transfer operation of the solid-state image sensor 101M and the shutter operation of the shutter device 311.
  • the signal processing unit 312 performs various signal processing on the signal output from the solid-state image sensor 101M.
  • the video signal Dout after signal processing is stored in a storage medium such as a memory, or is output to a monitor or the like.
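As an illustrative aside, the capture flow of FIG. 12 can be sketched as follows: the drive unit opens the shutter for a light irradiation period, the sensor accumulates charge, and the signal processing stage produces the output signal Dout. The function name, the linear charge-accumulation model, and all numeric values are assumptions for illustration only, not part of the disclosure.

```python
# Illustrative sketch only: capture_frame, full_well, and the linear sensor
# model are assumptions, not part of the patent disclosure.

def capture_frame(scene_irradiance, irradiation_ms, gain=1.0, full_well=1000.0):
    """Accumulate charge per pixel during the light irradiation period set by
    the shutter device, clip at the full-well capacity, then apply gain and
    normalize to [0, 1] in the signal processing stage."""
    raw = [min(e * irradiation_ms, full_well) for e in scene_irradiance]
    return [min(p * gain / full_well, 1.0) for p in raw]

# Three pixels of differing irradiance; the brightest one saturates.
dout = capture_frame([2.0, 5.0, 120.0], irradiation_ms=10)
```

Shortening the irradiation period in this model trades sensitivity for headroom, which is the knob the shutter device 311 controls.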
  • FIG. 13 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
  • FIG. 13 illustrates a surgeon (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
  • The endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 equipped with various devices for endoscopic surgery.
  • the endoscope 11100 is composed of a lens barrel 11101 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100; the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 11201.
  • The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing, such as development processing (demosaic processing), on the image signal in order to display an image based on the image signal.
  • the display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of, for example, a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 11100 when photographing an operating part or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
  • The treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for cauterization and incision of tissue, sealing of blood vessels, and the like.
  • The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and securing the operator's working space.
  • The recorder 11207 is a device capable of recording various information related to the surgery.
  • the printer 11208 is a device capable of printing various information related to surgery in various formats such as texts, images, and graphs.
  • The light source device 11203, which supplies irradiation light to the endoscope 11100 when photographing the surgical site, can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • When a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
  • In this case, the observation target can also be irradiated with laser light from each of the RGB laser light sources in a time-division manner, and the drive of the image sensor of the camera head 11102 can be controlled in synchronization with the irradiation timing so that images corresponding to each of R, G, and B are captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
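The time-division capture described above can be sketched as follows: three monochrome frames, each captured while only one of the R, G, and B laser light sources illuminates the target, are stacked into a single color image. This is a minimal sketch; the function name and array shapes are illustrative assumptions.

```python
import numpy as np

def compose_color(frames_rgb):
    """Stack three time-division monochrome frames (captured under R, G, and
    B laser illumination, respectively) into one H x W x 3 color image."""
    r, g, b = frames_rgb
    return np.stack([np.asarray(r), np.asarray(g), np.asarray(b)], axis=-1)

# Three successive sub-frames of a 2 x 2 sensor (values are illustrative).
color = compose_color([np.zeros((2, 2)), np.full((2, 2), 0.5), np.ones((2, 2))])
```

Because every pixel sees all three illuminants in turn, no on-chip color filter is needed, at the cost of a three-fold frame-time penalty.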
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the change of the light intensity to acquire images in a time-division manner and synthesizing those images, a so-called high-dynamic-range image free of blocked-up shadows and blown-out highlights can be generated.
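One simple way to synthesize such time-divided frames, sketched below under the assumption of a linear sensor and two illumination levels, is to use the brighter frame except where it saturates and fall back to the dimmer frame scaled by the intensity ratio. The function name, weights, and threshold are illustrative assumptions, not the disclosed method.

```python
import numpy as np

def fuse_hdr(short_frame, long_frame, w_short, w_long, sat=0.95):
    """Estimate scene radiance from two frames taken at different light
    intensities: use the brighter frame except where it saturates, falling
    back to the dimmer frame scaled by its relative weight."""
    short = np.asarray(short_frame, dtype=float)
    long_ = np.asarray(long_frame, dtype=float)
    radiance_short = short / w_short   # normalize each frame by its weight
    radiance_long = long_ / w_long     # (relative light intensity x time)
    return np.where(long_ < sat, radiance_long, radiance_short)

# Pixel 0 is unsaturated in the bright frame; pixel 1 saturates there.
hdr = fuse_hdr([0.1, 0.5], [0.4, 1.0], w_short=1.0, w_long=4.0)
```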
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which light in a narrower band than the irradiation light at the time of normal observation (that is, white light) is emitted by utilizing the wavelength dependence of light absorption in body tissue, and a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is thereby photographed with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be capable of supplying narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 14 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG. 13.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • The CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image pickup unit 11402 is composed of an image pickup element.
  • the image sensor constituting the image pickup unit 11402 may be one (so-called single plate type) or a plurality (so-called multi-plate type).
  • each image pickup element may generate an image signal corresponding to each of RGB, and a color image may be obtained by synthesizing them.
  • The imaging unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye, respectively, corresponding to 3D (three-dimensional) display.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
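The depth that the 3D display lets the operator grasp comes from the disparity between the right-eye and left-eye images. As a hedged aside, under an assumed pinhole stereo model (not part of the disclosure), depth relates to disparity as follows; all parameter values are illustrative.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Pinhole stereo model: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("no depth for zero or negative disparity")
    return focal_px * baseline_mm / disparity_px

# A feature 10 px apart between the right- and left-eye images, with an
# assumed 500 px focal length and 4 mm baseline, lies 200 mm away.
depth_mm = depth_from_disparity(10.0, focal_px=500.0, baseline_mm=4.0)
```

Smaller disparities map to larger depths, which is why depth resolution degrades with distance in any stereo endoscope.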
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the imaging unit 11402 does not necessarily have to be provided on the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • The drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU11201.
  • the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • The control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • The imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • In the latter case, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
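As an illustration of the AE function mentioned above, automatic exposure can be realized as a feedback loop that nudges the exposure time until the mean frame brightness reaches a target level. This is a minimal sketch under assumed parameters (target level, gain constant, exposure limits), not the disclosed implementation.

```python
def auto_exposure_step(exposure_ms, frame_mean, target=0.5, k=0.5,
                       min_ms=0.1, max_ms=33.0):
    """One AE feedback step: scale the exposure toward the target mean
    brightness level, clamped to the supported exposure range."""
    error = target - frame_mean
    new_exp = exposure_ms * (1.0 + k * error / target)
    return min(max(new_exp, min_ms), max_ms)
```

A dark frame (mean below target) lengthens the next exposure; a bright frame shortens it; the clamp keeps the loop within the sensor's frame-time budget.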
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • Image signals and control signals can be transmitted by telecommunications, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
  • The control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and to the display of the captured image obtained by that imaging. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • The control unit 11413 causes the display device 11202 to display a captured image showing the surgical site and the like based on the image signal processed by the image processing unit 11412.
  • At that time, the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific biological site, bleeding, mist at the time of using the energy treatment tool 11112, and the like.
  • When causing the display device 11202 to display the image of the surgical site, the control unit 11413 may superimpose and display various kinds of surgery support information on the image by using the recognition result. By superimposing the surgery support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
  • The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable compatible with communication of electric signals, an optical fiber compatible with optical communication, or a composite cable thereof.
  • Here, in the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the above is an example of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to the endoscope 11100, the camera head 11102 (imaging unit 11402), and the like among the configurations described above.
  • Specifically, the solid-state image sensor of the present disclosure can be applied to the imaging unit 11402.
  • In the above, the endoscopic surgery system has been described as an example, but the technique according to the present disclosure may also be applied to other systems, for example, a microscopic surgery system.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 15 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • In FIG. 15, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are shown as the functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • For example, the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as persons, vehicles, obstacles, signs, and characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate the control target value of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching the high beam to the low beam.
  • the audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 16 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 16 shows an example of the photographing range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
  • Based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative velocity with respect to the vehicle 12100). Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured in front of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
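The distance and relative-velocity computation described above, and the resulting follow-up driving decision, can be sketched as follows. The function names, thresholds, and the three-way command are illustrative assumptions only, not the disclosed control law.

```python
def relative_velocity(prev_dist_m, curr_dist_m, dt_s):
    """Temporal change of the measured distance; negative while closing in."""
    return (curr_dist_m - prev_dist_m) / dt_s

def follow_command(distance_m, rel_vel_mps, target_gap_m):
    """Pick a longitudinal command that keeps the preset inter-vehicle gap."""
    if distance_m < target_gap_m and rel_vel_mps <= 0:
        return "brake"        # gap too small and still closing
    if distance_m > target_gap_m and rel_vel_mps >= 0:
        return "accelerate"   # gap opening beyond the target (follow-up start)
    return "hold"
```

A real controller would smooth these decisions over time; the sketch only shows how distance information and its temporal change feed the command.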
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
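One common way to quantify such a collision risk, sketched here purely for illustration, is time-to-collision (TTC): distance divided by closing speed, with warning and intervention thresholds. The thresholds and action names are assumptions, not values from the disclosure.

```python
def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact at the current closing speed; infinite if the
    obstacle is not getting closer."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def assistance_action(ttc_s, warn_ttc=3.0, brake_ttc=1.5):
    """Map the collision-risk measure to a driving-support action."""
    if ttc_s <= brake_ttc:
        return "forced_deceleration"  # via the drive system control unit 12010
    if ttc_s <= warn_ttc:
        return "warn_driver"          # via audio speaker 12061 / display 12062
    return "none"
```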
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • The recognition of a pedestrian is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a square contour line for emphasizing the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
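The two-step recognition described above (feature-point extraction followed by pattern matching against a pedestrian outline) can be reduced to a toy sketch: score a candidate by the fraction of template contour points found among its extracted feature points. Everything here, including the tiny template, is an illustrative assumption.

```python
def match_score(candidate_points, template_points):
    """Fraction of template contour points found among the candidate's
    extracted feature points."""
    candidate = set(candidate_points)
    hits = sum(1 for p in template_points if p in candidate)
    return hits / len(template_points)

def is_pedestrian(feature_points, template_points, threshold=0.8):
    """Pattern matching step: accept when enough contour points line up."""
    return match_score(feature_points, template_points) >= threshold

TEMPLATE = [(0, 0), (0, 1), (1, 1), (1, 2)]  # toy outline of a pedestrian
```

A production system would match at many scales and positions with a tolerance for point displacement; the sketch only conveys the two-procedure structure.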
  • the above is an example of a vehicle control system to which the technology according to the present disclosure (the present technology) can be applied.
  • the technique according to the present disclosure can be applied to, for example, the imaging unit 12031 among the configurations described above.
  • the solid-state image sensor of the present disclosure can be applied to the image pickup unit 12031.
  • the present technology is not limited to the above-described embodiments, the above-mentioned usage examples, and the above-mentioned application examples, and various changes can be made without departing from the gist of the present technology.
  • the present technology can also have the following configurations.
  • a pixel transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side is provided.
  • a floating diffusion formed on the surface of the semiconductor substrate opposite to the light incident side is provided.
  • a transfer gate and a floating diffusion formed on the surface of the semiconductor substrate on the opposite side of the light incident side are provided. The transfer gate and the floating diffusion are located above the trench.
  • the transfer gate is formed for each of the pixels.
  • the floating diffusion is shared by the plurality of pixels.
  • the solid-state image sensor according to any one of [1] to [5], wherein the floating diffusion is surrounded by a plurality of the transfer gates.
  • The photoelectric conversion unit is embedded in a lower portion of the semiconductor substrate, and a vertical gate for transferring the signal charge generated by the photoelectric conversion of the photoelectric conversion unit is formed in an upper portion of the semiconductor substrate; the solid-state image sensor according to any one of [1] to [6].
  • [8] An amplification transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side is provided.
  • The amplification transistor is arranged above the trench, and the amplification transistor is shared by a plurality of the pixels; the solid-state image sensor according to any one of [1] to [7].
  • a reset transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side is provided.
  • The reset transistor is arranged above the trench, and the reset transistor is shared by a plurality of the pixels; the solid-state image sensor according to any one of [1] to [8].
  • a selection transistor formed on the surface of the semiconductor substrate on the opposite side of the light incident side is provided.
  • The selection transistor is arranged above the trench, and the selection transistor is shared by a plurality of the pixels; the solid-state image sensor according to any one of [1] to [9].
  • Filler (light-shielding metal, tungsten (W)), 64 ... P layer (P type pinning layer), 70 ... A film having a negative fixed charge, 101, 106, 108, 110, 123, 123A, 123B ... Solid-state image sensor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Signal Processing (AREA)
  • Manufacturing & Machinery (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Element Separation (AREA)

Abstract

The present invention relates to a solid-state imaging device with which the quality and reliability of the solid-state imaging device can be further improved. Provided is a solid-state imaging device comprising: a semiconductor substrate on which a photoelectric conversion unit that performs photoelectric conversion is formed for each pixel; a trench formed between the pixels and recessed upward from the back surface of the semiconductor substrate, which is the light incident side, without penetrating the semiconductor substrate; and a PN junction region composed of a P-type region and an N-type region arranged along the peripheral portion of the trench, the PN junction region being formed by self-alignment.
PCT/JP2021/006245 2020-02-28 2021-02-19 Dispositif d'imagerie à semi-conducteur et appareil électronique WO2021172176A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-033222 2020-02-28
JP2020033222A JP2021136380A (ja) 2020-02-28 2020-02-28 固体撮像装置及び電子機器

Publications (1)

Publication Number Publication Date
WO2021172176A1 true WO2021172176A1 (fr) 2021-09-02

Family

ID=77490931

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/006245 WO2021172176A1 (fr) 2020-02-28 2021-02-19 Solid-state imaging device and electronic apparatus

Country Status (2)

Country Link
JP (1) JP2021136380A (fr)
WO (1) WO2021172176A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010225818A (ja) * 2009-03-23 2010-10-07 Toshiba Corp Solid-state imaging device and manufacturing method thereof
WO2018139279A1 (fr) * 2017-01-30 2018-08-02 Sony Semiconductor Solutions Corporation Solid-state imaging element and electronic device
WO2019093151A1 (fr) * 2017-11-09 2019-05-16 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
JP2019145544A (ja) * 2018-02-16 2019-08-29 Sony Semiconductor Solutions Corporation Imaging element
WO2019220945A1 (fr) * 2018-05-18 2019-11-21 Sony Semiconductor Solutions Corporation Imaging element and electronic device
WO2019240207A1 (fr) * 2018-06-15 2019-12-19 Sony Semiconductor Solutions Corporation Imaging device, method for manufacturing same, and electronic apparatus

Also Published As

Publication number Publication date
JP2021136380A (ja) 2021-09-13

Similar Documents

Publication Publication Date Title
US11961862B2 (en) Solid-state imaging element and electronic apparatus
US11398514B2 (en) Solid-state image pickup device, manufacturing method therefor, and electronic apparatus
WO2019220945A1 (fr) Imaging element and electronic device
WO2021015009A1 (fr) Solid-state imaging device and electronic apparatus
WO2021124975A1 (fr) Solid-state imaging device and electronic instrument
JP7187440B2 (ja) Solid-state imaging element, electronic device, and manufacturing method
WO2020079945A1 (fr) Solid-state imaging device and electronic apparatus
US20220246653A1 (en) Solid-state imaging element and solid-state imaging element manufacturing method
WO2019239754A1 (fr) Solid-state imaging element, method for manufacturing solid-state imaging element, and electronic device
WO2019181466A1 (fr) Imaging element and electronic device
WO2019188131A1 (fr) Semiconductor device and method for manufacturing semiconductor device
WO2022130791A1 (fr) Solid-state imaging device and electronic device
US20220254823A1 (en) Imaging device
WO2021172176A1 (fr) Solid-state imaging device and electronic apparatus
WO2021187151A1 (fr) Imaging element and semiconductor chip
WO2022137864A1 (fr) Imaging device and electronic apparatus
WO2022145190A1 (fr) Solid-state imaging device and electronic apparatus
WO2023149187A1 (fr) Vertical transistor, light detection device, and electronic apparatus
WO2021140958A1 (fr) Imaging element, method for manufacturing same, and electronic device
WO2021186911A1 (fr) Imaging device and electronic apparatus
WO2022270039A1 (fr) Solid-state imaging device
WO2024057805A1 (fr) Imaging element and electronic device
TWI834644B (zh) Imaging element and electronic apparatus
WO2022259855A1 (fr) Semiconductor device, method for manufacturing same, and electronic apparatus
US20220344390A1 (en) Organic cis image sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21759982

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21759982

Country of ref document: EP

Kind code of ref document: A1