WO2024057470A1 - Photodetection device, method for producing same, and electronic apparatus - Google Patents


Info

Publication number
WO2024057470A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2022/034521
Other languages
English (en)
Japanese (ja)
Inventor
浩之 服部
Original Assignee
Sony Semiconductor Solutions Corporation
Application filed by Sony Semiconductor Solutions Corporation
Priority to PCT/JP2022/034521
Publication of WO2024057470A1


Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith

Definitions

  • The present disclosure relates to a photodetection device, a method of manufacturing the same, and an electronic device, and particularly to a photodetection device, a method of manufacturing the same, and an electronic device that can realize a photoelectric conversion unit with high quantum efficiency in the infrared region.
  • In silicon, the absorption wavelength band extends only to about 1 um, and the absorption coefficient decreases toward the near-infrared region. Therefore, in order to increase the quantum efficiency at wavelengths of 900 nm and above, which silicon cannot achieve, replacing silicon (Si) with germanium (Ge) or silicon germanium (SiGe) in the photodiode composition is being considered.
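To make the silicon limitation concrete, the fraction of light absorbed in a photodiode of thickness d follows the Beer-Lambert relation 1 − exp(−αd). The sketch below is illustrative only: the absorption coefficients are rough, hypothetical order-of-magnitude values (not figures from this disclosure), chosen to show why a thin Si layer absorbs little near-infrared light while a Ge-rich alloy absorbs most of it.

```python
import math

def absorbed_fraction(alpha_per_cm: float, thickness_um: float) -> float:
    """Beer-Lambert fraction of light absorbed in a layer of given thickness."""
    alpha_per_um = alpha_per_cm * 1e-4  # 1 cm = 1e4 um
    return 1.0 - math.exp(-alpha_per_um * thickness_um)

# Hypothetical absorption coefficients near 940 nm (orders of magnitude only):
alpha_si = 1e2    # ~100 /cm, silicon is weakly absorbing here
alpha_sige = 1e4  # assumed value for a strongly absorbing Ge-rich alloy

for name, alpha in [("Si", alpha_si), ("SiGe", alpha_sige)]:
    # A 3 um layer absorbs ~3% with the Si value, ~95% with the SiGe value
    print(name, round(absorbed_fraction(alpha, thickness_um=3.0), 3))
```

The contrast in absorbed fraction at the same layer thickness is what motivates the SiGe photodiode described below.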
  • By changing the Ge composition, the absorption wavelength band can be extended up to about 1.9 um, and by designing the composition according to the purpose, a sensor with a high absorption coefficient in the infrared region can be manufactured.
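The relation between Ge composition and the absorption band edge can be sketched numerically. The function below uses a crude linear interpolation of the room-temperature indirect bandgap between Si (about 1.12 eV) and Ge (about 0.66 eV), ignoring the bowing of the real Si(1-x)Ge(x) gap, so it is a trend illustration rather than a design formula; its endpoints reproduce the cutoffs mentioned above (about 1.1 um for Si, about 1.9 um for Ge).

```python
def sige_cutoff_um(x: float) -> float:
    """Approximate absorption cutoff wavelength (um) of Si(1-x)Ge(x).

    Crude linear interpolation of the indirect bandgap between Si (1.12 eV)
    and Ge (0.66 eV); real alloys show bandgap bowing, so this is only an
    illustrative trend, not a design formula.
    """
    eg_ev = (1.0 - x) * 1.12 + x * 0.66  # interpolated bandgap (eV)
    return 1.2398 / eg_ev                # lambda(um) = hc/E ~= 1.2398/E(eV)

for x in (0.0, 0.3, 0.8, 1.0):
    print(f"x = {x:.1f}: cutoff ~ {sige_cutoff_um(x):.2f} um")
```

Raising the Ge fraction x monotonically pushes the cutoff toward longer wavelengths, which is the design freedom the paragraph above refers to.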
  • Patent Document 1 discloses a front-illuminated sensor structure provided with this SiGe stress relaxation layer.
  • In the sensor structure of Patent Document 1, a high-concentration P-type layer, formed by implanting a high-concentration P-type impurity into the SiGe layer, is provided to suppress dark current, so a process of implanting high-concentration P-type impurities is required. Furthermore, it is desirable to remove the SiGe stress relaxation layer itself from the viewpoint of suppressing dark current.
  • the present disclosure has been made in view of this situation, and is intended to make it possible to realize a photoelectric conversion unit with high quantum efficiency in the infrared region.
  • The photodetection device of the present disclosure includes: a silicon germanium layer in which a photoelectric conversion section is formed; an inter-pixel light shielding film formed on a first surface side that is a light incident surface side of the silicon germanium layer; and a MOS transistor formed on a second surface opposite to the first surface of the silicon germanium layer, wherein the silicon germanium layer has a constant germanium concentration.
  • The method for manufacturing a photodetection device of the present disclosure includes: forming a photoelectric conversion section in a silicon germanium layer; forming an inter-pixel light shielding film on a first surface side that is a light incident surface side of the silicon germanium layer; and forming a MOS transistor on a second surface opposite to the first surface of the silicon germanium layer, the silicon germanium layer being formed with a constant germanium concentration.
  • The electronic device of the present disclosure includes: a silicon germanium layer in which a photoelectric conversion section is formed; an inter-pixel light shielding film formed on a first surface side that is a light incident surface side of the silicon germanium layer; and a MOS transistor formed on a second surface opposite to the first surface of the silicon germanium layer, wherein the silicon germanium layer has a constant germanium concentration.
  • The photodetection device and the electronic device may be independent devices or may be modules incorporated into other devices.
  • FIG. 1 is a cross-sectional view showing a configuration example of a photodetection device according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating a method of manufacturing the photodetection device of FIG. 1.
  • FIG. 7 is a diagram showing an example of a pixel circuit when the photodetection device is a gate-type ToF sensor.
  • FIG. 2 is a cross-sectional view of a pixel when the photodetection device is a gate-type ToF sensor.
  • FIG. 3 is a diagram showing an example of a pixel circuit when the photodetection device is an IR image sensor.
  • FIG. 3 is a cross-sectional view of a pixel when the photodetection device is an IR image sensor.
  • FIG. 3 is a diagram showing an example of a pixel circuit when the photodetection device is a CAPD type ToF sensor.
  • FIG. 3 is a cross-sectional view of a pixel when the photodetection device is a CAPD type ToF sensor.
  • FIG. 7 is a diagram showing an example of a pixel circuit when the photodetection device is a direct ToF sensor using SPAD.
  • FIG. 7 is a cross-sectional view of a pixel when the photodetection device is a direct ToF sensor using SPAD.
  • FIG. 11 is a cross-sectional view of a pixel in which a diffraction structure is added to the back surface of the SiGe layer in FIG. 10.
  • FIG. 12 is a plan view of the diffraction structure of FIG. 11.
  • A cross-sectional view of a pixel in which a diffraction structure is added to the front surface of the SiGe layer in FIG. 10.
  • FIG. 7 is a diagram illustrating a modification of the element isolation section.
  • FIG. 3 is a diagram illustrating an example of a diffraction structure formed by the same formation method as the element isolation portion.
  • FIG. 1 is a block diagram illustrating a schematic configuration example of a photodetection device according to the present disclosure.
  • A diagram explaining examples of use of an image sensor using the photodetection device.
  • FIG. 1 is a block diagram illustrating a configuration example of an imaging device as an electronic device to which the technology of the present disclosure is applied.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system.
  • FIG. 2 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
  • FIG. 1 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 2 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • FIG. 1 is a cross-sectional view showing a configuration example of a photodetection device according to an embodiment of the present disclosure.
  • the photodetection device 1 shown in FIG. 1 includes a photoelectric conversion section that has high quantum efficiency (QE) for light in the infrared region.
  • a photodiode (PD) 12 is formed in the SiGe layer 11 as a photoelectric conversion section.
  • the SiGe layer (silicon germanium layer) 11 is a single crystal layer of silicon germanium (SiGe) formed with a constant germanium concentration.
  • Here, a "constant" concentration means that the variation in the germanium concentration within the layer is controlled typically to within about ±1% of the target concentration, or at least to within about ±10%.
  • The absolute value of the germanium concentration is not limited to a particular range and may be, for example, about 20 to 30%, or about 70 to 80%. The higher the germanium concentration, the higher the absorption coefficient in the infrared region.
  • The SiGe layer 11 is composed of a low-concentration P-type semiconductor region, and a PN-junction photodiode 12 is formed by doping a part of the SiGe layer 11 with an N-type impurity such as phosphorus (P) or arsenic (As) to form an N-type semiconductor region.
  • On the back surface of the SiGe layer 11, which is its upper surface in FIG. 1, an antireflection film 14, a light shielding film 15, and a lens layer 16 are formed in sequence.
  • The lens layer 16 is formed flat on portions of the SiGe layer 11 where the photodiode 12 is not formed, such as the outer peripheral portion outside the pixel array section, while an on-chip lens 17 is formed on portions of the SiGe layer 11 where the photodiode 12 is formed.
  • The antireflection film 14 can be configured with a laminated structure in which a fixed charge film and an oxide film are laminated, using, for example, a high dielectric constant (high-k) insulating thin film formed by the ALD (Atomic Layer Deposition) method. Specifically, hafnium oxide (HfO2), aluminum oxide (Al2O3), titanium oxide (TiO2), STO (strontium titanium oxide), or the like can be used as the fixed charge film.
  • the material of the light shielding film 15 may be any material that blocks light, and for example, a metal material such as tungsten (W), aluminum (Al), or copper (Cu) can be used.
  • the lens layer 16 is formed of a resin material such as styrene resin, acrylic resin, styrene-acrylic copolymer resin, or siloxane resin.
  • the on-chip lens 17 condenses the incident light and makes it enter the photodiode 12 efficiently.
  • an element isolation part 13 is formed around the photodiode 12 in plan view.
  • The element isolation section 13 penetrates the SiGe layer 11 and electrically isolates the photodiode 12 from the adjacent photoelectric conversion sections (photodiodes 12).
  • The element isolation section 13 is formed by filling a trench, formed so as to penetrate the SiGe layer 11, with a metal material such as tungsten (W), aluminum (Al), titanium (Ti), or titanium nitride (TiN).
  • Alternatively, the element isolation section 13 may be constructed by forming at least one layer of the antireflection film 14, which is formed on the back surface of the SiGe layer 11, on the side wall of the trench formed in the SiGe layer 11, and embedding an insulating film such as a silicon oxide film inside the antireflection film 14.
  • A Si layer (silicon layer) 21 is formed on the front surface of the SiGe layer 11, which is its lower surface in FIG. 1.
  • the Si layer 21 is a silicon (Si) cap layer that covers the SiGe layer 11, and has a function of preventing germanium contamination during manufacturing.
  • the Si layer 21 is formed to have a thickness of, for example, about 10 nm.
  • the upper surface (lower surface in FIG. 1) of the Si layer 21 is a transistor formation surface when a MOS transistor is formed, and in the example of FIG. 1, one MOS transistor Tr is formed.
  • the MOS transistor Tr does not necessarily need to be formed on the upper surface of the Si layer 21.
  • two or more MOS transistors Tr may be formed on the upper surface of the Si layer 21.
  • the MOS transistor Tr includes a gate insulating film 31 , a gate electrode 32 formed thereon, and a sidewall 33 formed on a side wall of the gate electrode 32 .
  • the gate electrode 32 of the MOS transistor Tr is connected to a bonding electrode 42 via a contact electrode 41, and the bonding electrode 42 is connected to a bonding electrode 64 of a wiring layer 61 of another stacked semiconductor substrate (not shown). Connected by metal bonding.
  • An insulating layer 43 is formed on the upper surface of the Si layer 21 where the MOS transistor Tr is not formed.
  • The photodetection device 1 is composed of a stacked structure of a semiconductor substrate (compound semiconductor substrate) 51 including the SiGe layer 11 and the Si layer 21, and a semiconductor substrate, not shown, on which at least a logic circuit is formed (hereinafter referred to as a logic substrate). However, as for the logic substrate, only the wiring layer 61 is illustrated in FIG. 1. In FIG. 1, the light incident surface side of the semiconductor substrate 51 including the SiGe layer 11 and the Si layer 21 is the back surface (first surface) of the semiconductor substrate 51, and the side connected to the logic substrate is the front surface (second surface) of the semiconductor substrate 51.
  • a plurality of MOS transistors Tr, a drive circuit for the MOS transistors Tr, and a signal processing circuit that processes a signal according to the charge generated by the photodiode 12 are formed on the logic board.
  • a drive circuit for the MOS transistor Tr formed on the Si layer 21 is also formed on the logic substrate side.
  • An antireflection film 14, a light shielding film 15, and a lens layer 16 are formed on the back side of the semiconductor substrate 51, and a plurality of MOS transistors Tr are formed on the front side of the semiconductor substrate 51, which is bonded to the logic substrate. Note that the antireflection film 14, light shielding film 15, and lens layer 16 formed on the back side of the semiconductor substrate 51 may be omitted as appropriate depending on design conditions.
  • the wiring layer 61 includes multiple layers of metal wiring 62 and an insulating layer 63.
  • the number of layers of the metal wiring 62 is two, but the number of layers of the metal wiring 62 is not limited.
  • a bonding electrode 64 is formed on a bonding surface 66 with the insulating layer 43, which is the upper surface of the wiring layer 61, and the bonding electrode 64 is electrically connected to the bonding electrode 42 by metal bonding.
  • the bonding electrode 64 is connected to a predetermined metal wiring 62 via a contact electrode 65.
  • The contact electrode 41, bonding electrode 42, bonding electrode 64, contact electrode 65, and metal wiring 62 are made of, for example, copper (Cu), tungsten (W), aluminum (Al), gold (Au), or the like; in this embodiment, they are made of copper. Therefore, the bonding electrode 42 and the bonding electrode 64 form a Cu-Cu bond.
  • the insulating layers 43 and 63 are formed of, for example, a SiO2 film, a low-k film (low dielectric constant insulating film), a SiOC film, or the like.
  • the insulating layers 43 and 63 may be composed of a plurality of insulating films made of different materials.
  • the photodetector 1 has the photodiode 12 as a photoelectric conversion section, and the photodiode 12 is formed only in the SiGe layer 11 with good crystallinity and a constant germanium concentration. Since the photodiode 12 formed in the SiGe layer 11 has a high absorption coefficient in the infrared region, its sensitivity to light in the infrared region is improved.
  • The photodetection device 1 also has a back-illuminated structure in which the on-chip lens 17 is formed on the back side of the semiconductor substrate 51 including the SiGe layer 11, and the light condensed by the on-chip lens 17 is photoelectrically converted by the photodiode 12.
  • the back-illuminated structure can enlarge the opening area to the photodiode 12 compared to the front-illuminated structure, and can improve sensitivity. Therefore, according to the photodetecting device 1, a photoelectric conversion section having high quantum efficiency in the infrared region can be realized.
  • Since the structure of the semiconductor substrate 51 included in the photodetection device 1 does not include a SiGe stress relaxation layer, dark current generated due to crystal defects in a SiGe stress relaxation layer can be suppressed. Further, unlike the sensor structure of Patent Document 1, there is no need to provide a high-concentration P-type layer for suppressing the dark current that arises when a SiGe stress relaxation layer is provided.
  • a SiGe heteroepitaxial substrate is prepared as the semiconductor substrate 51 including the SiGe layer 11 and the Si layer 21.
  • As shown in A of FIG. 2, the SiGe heteroepitaxial substrate prepared as the semiconductor substrate 51 has a SiGe stress relaxation layer 112, the SiGe layer 11, and the Si layer 21 laminated on a silicon substrate 111.
  • This SiGe heteroepitaxial substrate can be formed by epitaxially growing the SiGe stress relaxation layer 112, the SiGe layer 11, and the Si layer 21 in this order on the silicon substrate 111.
  • An N-type semiconductor region is formed by doping an N-type impurity such as phosphorus into a desired region of the SiGe layer 11, which is composed of a P-type semiconductor region, whereby a PN-junction photodiode 12 is formed.
  • a MOS transistor Tr is formed at a desired position on the upper surface of the Si layer 21.
  • Specifically, an insulating film that will become the gate insulating film 31 is formed on the entire upper surface of the Si layer 21; the insulating film is then patterned using lithography technology so that it remains at the position where the MOS transistor Tr is formed, thereby forming the gate insulating film 31. After that, the gate electrode 32 and the sidewalls 33 are formed.
  • the gate electrode 32 is made of, for example, polysilicon, and the sidewalls 33 are made of, for example, a silicon nitride film (SiN). Note that in FIG. 2, the symbols of the gate insulating film 31, the gate electrode 32, and the sidewalls 33 are omitted.
  • Next, an insulating layer 43 is formed with a predetermined thickness on the upper surface of the Si layer 21 so as to cover the gate electrode 32 of the MOS transistor Tr; then, openings are formed in the insulating layer 43 above the gate electrode 32 and filled with a predetermined metal material such as copper, thereby forming the contact electrode 41 and the bonding electrode 42.
  • Next, the insulating layer 43 formed on the semiconductor substrate 51 and the wiring layer 61 of the separately manufactured logic substrate are bonded at the bonding surface 66, and the bonded structure is then turned upside down as shown in D of FIG. 2.
  • the silicon substrate 111 which is the uppermost layer of the semiconductor substrate 51, is removed by CMP (Chemical Mechanical Polishing) or the like.
  • the SiGe stress relaxation layer 112 which is the uppermost layer of the semiconductor substrate 51, is removed by CMP or the like.
  • the semiconductor substrate 51 is made up of the SiGe layer 11 and the Si layer 21, and the photodiode 12 is formed in the SiGe layer 11.
  • the element isolation part 13 is formed. Subsequently, a light shielding film 15 and a lens layer 16 including an on-chip lens 17 are formed in this order, thereby completing the cross-sectional structure of the photodetector 1 shown in FIG. 1.
  • In the photodetection device 1, the SiGe stress relaxation layer 112 with many crystal defects is removed, so the high-concentration P-type layer for suppressing dark current required in the sensor structure of Patent Document 1 is unnecessary. Forming such a layer would require implanting impurities into a deep region with high energy, with a risk of damaging the SiGe layer by the high-energy impurity implantation process; since no such layer is required, this damage can also be avoided.
  • Since the surface of the SiGe layer 11 on which the MOS transistor is formed is covered with the Si layer 21, germanium contamination during manufacturing can be prevented. Furthermore, since an oxide film that becomes the gate insulating film 31 of the MOS transistor Tr can be formed on the Si layer 21, the interface state density at the interface of the gate insulating film 31 formed during MOS transistor formation can be reduced.
  • The photodetection device 1 described above may be a ToF (Time of Flight) sensor that measures the distance to an object using irradiated light, or an imaging sensor that receives light including at least infrared light and generates an image according to the amount of received light.
  • a ToF sensor is a sensor that measures the distance to an object by emitting irradiated light toward the object and measuring the time it takes for the irradiated light to be reflected on the object's surface and returned.
  • ToF sensors include indirect ToF sensors and direct ToF sensors.
  • Indirect ToF sensors calculate the distance to the object by detecting the flight time from when the irradiated light is emitted until the reflected light is received as a phase difference, whereas direct ToF sensors calculate the distance to the object by directly measuring the flight time from when the light is emitted until the reflected light is received.
  • FIG. 3 shows an example of a pixel circuit when the photodetection device 1 is a gate-type ToF sensor.
  • a gate-type ToF sensor is an indirect ToF sensor that detects a phase difference by distributing the charge generated by the photodiode 12 between two transfer gates (transfer transistors) and calculates the distance to the object.
  • The pixel 200 includes the photodiode 12 as a photoelectric conversion section. The pixel 200 further includes two each of a transfer transistor TRG, a floating diffusion region FD, an additional capacitance FDL, a switching transistor FDG, an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL, as well as a charge discharge transistor OFG.
  • When the two transfer transistors TRG, floating diffusion regions FD, additional capacitances FDL, switching transistors FDG, amplification transistors AMP, reset transistors RST, and selection transistors SEL are distinguished from one another, they are referred to as transfer transistors TRG1 and TRG2, floating diffusion regions FD1 and FD2, additional capacitances FDL1 and FDL2, switching transistors FDG1 and FDG2, amplification transistors AMP1 and AMP2, reset transistors RST1 and RST2, and selection transistors SEL1 and SEL2.
  • the transfer transistor TRG, the switching transistor FDG, the amplification transistor AMP, the selection transistor SEL, the reset transistor RST, and the charge discharge transistor OFG are composed of, for example, N-type MOS transistors.
  • When the transfer drive signal TRG1g supplied to the gate electrode becomes active, the transfer transistor TRG1 becomes conductive in response, thereby transferring the charge accumulated in the photodiode 12 to the floating diffusion region FD1.
  • When the transfer drive signal TRG2g supplied to the gate electrode becomes active, the transfer transistor TRG2 becomes conductive in response, thereby transferring the charge accumulated in the photodiode 12 to the floating diffusion region FD2.
  • the floating diffusion regions FD1 and FD2 are charge holding parts that temporarily hold the charges transferred from the photodiode 12.
  • the switching transistor FDG1 becomes conductive in response to the active state of the FD drive signal FDG1g supplied to the gate electrode, thereby connecting the additional capacitance FDL1 to the floating diffusion region FD1.
  • the switching transistor FDG2 becomes conductive in response to the active state of the FD drive signal FDG2g supplied to the gate electrode, thereby connecting the additional capacitance FDL2 to the floating diffusion region FD2.
  • Additional capacitances FDL1 and FDL2 can be formed by wiring capacitances, for example.
  • When the reset drive signal supplied to the gate electrode becomes active, the reset transistor RST1 becomes conductive in response, thereby resetting the potential of the floating diffusion region FD1.
  • the reset transistor RST2 becomes conductive in response to the activation of the reset drive signal RSTg supplied to the gate electrode, thereby resetting the potential of the floating diffusion region FD2. Note that when the reset transistors RST1 and RST2 are brought into the active state, the switching transistors FDG1 and FDG2 are also brought into the active state at the same time, and the additional capacitors FDL1 and FDL2 are also reset.
  • At high illuminance with a large amount of incident light, the vertical drive section (for example, the vertical drive section 522 in FIG. 16) that drives the pixel 200 activates the switching transistors FDG1 and FDG2, connecting the floating diffusion region FD1 to the additional capacitance FDL1 and the floating diffusion region FD2 to the additional capacitance FDL2. This allows more charge to be accumulated at high illuminance.
  • At low illuminance, on the other hand, the vertical drive section makes the switching transistors FDG1 and FDG2 inactive and disconnects the additional capacitances FDL1 and FDL2 from the floating diffusion regions FD1 and FD2, respectively. Thereby, the conversion efficiency can be increased.
  • the charge discharge transistor OFG discharges the charge accumulated in the photodiode 12 by becoming conductive in response to the discharge drive signal OFG1g supplied to the gate electrode becoming active.
  • the amplification transistor AMP1 has its source electrode connected to the vertical signal line 211A via the selection transistor SEL1, thereby connecting to a constant current source (not shown) and forming a source follower circuit.
  • the amplification transistor AMP2 has a source electrode connected to the vertical signal line 211B via the selection transistor SEL2, thereby connecting to a constant current source (not shown) and forming a source follower circuit.
  • the selection transistor SEL1 is connected between the source electrode of the amplification transistor AMP1 and the vertical signal line 211A.
  • When the selection drive signal SEL1g supplied to the gate electrode becomes active, the selection transistor SEL1 becomes conductive in response and outputs the pixel signal VSL1 output from the amplification transistor AMP1 to the vertical signal line 211A.
  • the selection transistor SEL2 is connected between the source electrode of the amplification transistor AMP2 and the vertical signal line 211B.
  • When the selection drive signal SEL2g supplied to the gate electrode becomes active, the selection transistor SEL2 becomes conductive in response and outputs the pixel signal VSL2 output from the amplification transistor AMP2 to the vertical signal line 211B.
  • The transfer transistors TRG1 and TRG2, switching transistors FDG1 and FDG2, amplification transistors AMP1 and AMP2, selection transistors SEL1 and SEL2, and charge discharge transistor OFG of the pixel 200 are controlled by the vertical drive section.
  • The additional capacitances FDL1 and FDL2 and the switching transistors FDG1 and FDG2 that control their connection may be omitted, but by providing the additional capacitance FDL and using it selectively according to the amount of incident light, a high dynamic range can be secured.
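The trade-off that the additional capacitance provides can be illustrated with the standard conversion-gain relation G = q / C_FD: connecting FDL raises the node capacitance, lowering the gain per electron but allowing more charge to be held. The capacitance values below are hypothetical, chosen only to show the principle, not taken from this disclosure.

```python
Q_E = 1.602e-19  # elementary charge (C)

def conversion_gain_uv_per_e(c_node_f: float) -> float:
    """Conversion gain in microvolts per electron for node capacitance c_node_f (F)."""
    return Q_E / c_node_f * 1e6

# Hypothetical node capacitances:
c_fd = 1.0e-15   # floating diffusion alone: 1 fF
c_fdl = 4.0e-15  # additional capacitance FDL: 4 fF

low_light = conversion_gain_uv_per_e(c_fd)           # FDG off: high gain
high_light = conversion_gain_uv_per_e(c_fd + c_fdl)  # FDG on: lower gain, more charge

print(f"low illuminance:  {low_light:.1f} uV/e-")    # ~160 uV/e-
print(f"high illuminance: {high_light:.1f} uV/e-")   # ~32 uV/e-
```

Switching between the two node capacitances per the illumination level is what extends the dynamic range.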
  • a reset operation that resets the charge of the pixels 200 is performed on all pixels in the pixel array portion in which the pixels 200 are arranged in a matrix. That is, the charge discharge transistor OFG, the reset transistors RST1 and RST2, and the switching transistors FDG1 and FDG2 are turned on, and the accumulated charge in the photodiode 12, the floating diffusion regions FD1 and FD2, and the additional capacitances FDL1 and FDL2 is discharged.
  • transfer transistors TRG1 and TRG2 are driven alternately. That is, in the first period, the transfer transistor TRG1 is controlled to be on and the transfer transistor TRG2 is controlled to be off. During this first period, charges generated in the photodiode 12 are transferred to the floating diffusion region FD1. In the second period following the first period, the transfer transistor TRG1 is controlled to be off and the transfer transistor TRG2 is controlled to be on. In this second period, charges generated in the photodiode 12 are transferred to the floating diffusion region FD2. As a result, charges generated in the photodiode 12 are alternately distributed and accumulated in the floating diffusion regions FD1 and FD2.
  • each pixel 200 in the pixel array section is selected line-sequentially.
  • selection transistors SEL1 and SEL2 are turned on.
  • the charges accumulated in the floating diffusion region FD1 are outputted as the pixel signal VSL1 via the vertical signal line 211A.
  • the charges accumulated in the floating diffusion region FD2 are output as a pixel signal VSL2 via the vertical signal line 211B.
  • the reflected light received by the pixel 200 is delayed from the timing of irradiation by the light source according to the distance to the target object.
  • The distribution ratio of the charges accumulated in the two floating diffusion regions FD1 and FD2 changes according to this delay time, which depends on the distance to the target object, so the distance to the object can be found from the distribution ratio of the charges accumulated in the two floating diffusion regions FD1 and FD2.
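The relationship just described can be written as a small sketch: assuming an ideal rectangular light pulse of width Tp, a round-trip delay shorter than Tp, and no ambient light, the fraction of charge that lands in FD2 equals the delay divided by Tp, from which the distance follows. This is a simplified model of the two-tap scheme, not the disclosure's own signal processing.

```python
C = 299_792_458.0  # speed of light (m/s)

def distance_from_taps(q1: float, q2: float, pulse_width_s: float) -> float:
    """Distance (m) from the two-tap charge split of a pulsed indirect ToF pixel.

    Assumes an ideal rectangular pulse of width pulse_width_s, a round-trip
    delay shorter than the pulse width, and no background light.
    """
    ratio = q2 / (q1 + q2)         # fraction of charge delayed into FD2
    delay = ratio * pulse_width_s  # round-trip time of flight
    return C * delay / 2.0         # halve for the one-way distance

# Equal charge in FD1 and FD2 with a 10 ns pulse -> 5 ns delay -> ~0.75 m
print(distance_from_taps(q1=100.0, q2=100.0, pulse_width_s=10e-9))
```

In a real sensor a second measurement without illumination is typically subtracted to remove the ambient-light contribution before taking the ratio.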
  • FIG. 4 is a cross-sectional view showing an example of the configuration of the pixel 200 when the photodetection device 1 is a gate-type ToF sensor.
  • In FIG. 4, parts corresponding to the cross-sectional view shown in FIG. 1 are denoted by the same reference numerals, and explanations of those parts are omitted as appropriate.
  • The reference numerals of the gate insulating film 31, gate electrode 32, and sidewall 33 of the MOS transistor Tr are omitted.
  • Photodiodes 12 are formed in the SiGe layer 11 for each pixel 200.
  • the element isolation section 13 is formed at the boundary between the pixels 200 and functions as a pixel isolation section that isolates the photodiode 12 into pixel units.
  • the light-shielding film 15 is formed at the boundary between the pixels 200, and functions as an inter-pixel light-shielding film.
  • In the element isolation section 13 of FIG. 1, a trench is formed from the back surface (the top surface in FIG. 1) of the SiGe layer 11, so it has a tapered cross-sectional shape in which the groove width (trench opening width) becomes narrower from the back surface toward the front surface of the SiGe layer 11.
  • In the element isolation section 13 of the pixel 200 in FIG. 4, in contrast, a trench is formed from the front surface side of the SiGe layer 11, so it has an inverted tapered cross-sectional shape in which the groove width becomes narrower from the front surface toward the back surface.
  • Two MOS transistors Tr1 and Tr2 are formed on the MOS transistor formation surface where the Si layer 21 is formed. These two MOS transistors Tr1 and Tr2 correspond to the transfer transistors TRG1 and TRG2 of the pixel circuit in FIG. 3.
  • Since the photodetection device 1 is a gate-type ToF sensor in which the photodiode 12 of each pixel 200 is formed in the SiGe layer 11, a ToF sensor with high quantum efficiency in the infrared region and improved light-receiving sensitivity can be realized.
  • FIG. 5 shows an example of a pixel circuit when the photodetector 1 is an IR image sensor.
  • In order to distribute and accumulate the charge generated in the photodiode 12 in the two floating diffusion regions FD1 and FD2, the pixel 200 had two each of the transfer transistor TRG, floating diffusion region FD, additional capacitance FDL, switching transistor FDG, amplification transistor AMP, reset transistor RST, and selection transistor SEL.
  • When the photodetection device 1 is an IR image sensor, only one charge holding section is required to temporarily hold the charges generated in the photodiode 12, so there is also only one each of the transfer transistor TRG, floating diffusion region FD, additional capacitance FDL, switching transistor FDG, amplification transistor AMP, reset transistor RST, and selection transistor SEL.
  • In other words, compared with the pixel 200 having the circuit configuration shown in FIG. 3, in FIG. 5 the second set of elements, such as the amplification transistor AMP2 and the selection transistor SEL2, is omitted. The floating diffusion region FD2 and the vertical signal line 211B are also omitted.
  • FIG. 6 is a cross-sectional view showing an example of the configuration of the pixel 200 when the photodetector 1 is an IR image sensor.
  • In FIG. 6, the same reference numerals are given to the parts corresponding to the cross-sectional view shown in FIG. 1, and the description of those parts is omitted as appropriate. However, the reference numerals of the gate insulating film 31 and sidewalls 33 of the MOS transistor Tr are omitted.
  • Also when the photodetection device 1 is an IR image sensor, photodiodes 12 are formed in the SiGe layer 11 in units of the pixels 200.
  • the element isolation section 13 is formed at the boundary between the pixels 200 and functions as a pixel isolation section that isolates the photodiode 12 into pixel units.
  • the light-shielding film 15 is formed at the boundary between the pixels 200, and functions as an inter-pixel light-shielding film.
  • The element isolation part 13 is formed by forming a trench from the front surface side of the SiGe layer 11, as in the case of the gate-type ToF sensor shown in FIG. 4, and therefore has an inverted tapered cross-sectional shape in which the trench width becomes narrower from the front side toward the back side.
  • Two MOS transistors Tr1 and Tr2 are formed on the MOS transistor formation surface where the Si layer 21 is formed. These two MOS transistors Tr1 and Tr2 correspond to, for example, the transfer transistor TRG1 and the charge discharge transistor OFG of the pixel circuit in FIG. 5.
  • the other pixel transistors of the pixel circuit in FIG. 3, such as the switching transistor FDG1, the amplification transistor AMP1, the selection transistor SEL1, and the reset transistor RST1, are formed on the logic substrate side (not shown).
  • In FIGS. 1 and 4, bonding electrodes 42 and 64 are formed on the bonding surface 66 between the wiring layer 61 and the insulating layer 43, and the voltage applied to the gate electrode 32 of the MOS transistor Tr was supplied via the bonding electrodes 42 and 64.
  • In the cross-sectional view of the pixel 200 in FIG. 6, the gate electrode 32 of the MOS transistor Tr1 is connected to a predetermined metal wiring 62 of the wiring layer 61 via a contact electrode.
  • the gate electrode 32 of the MOS transistor Tr2 is connected to a predetermined metal wiring 62 of the wiring layer 61 via a contact electrode 242.
  • The voltage applied to the gate electrodes 32 of the MOS transistors Tr1 and Tr2 may thus be supplied only through vias (through electrodes) without passing through the bonding electrodes 42 and 64.
  • the applied voltage may be supplied via the bonding electrodes 42 and 64 as in FIGS. 1 and 4.
  • the photodetection device 1 may be an RGBIR image sensor that receives infrared light and RGB light.
  • For example, the pixel array of the photodetection device 1 may be formed by assigning, to 2 x 2 = 4 pixels, an R pixel that receives R (red) light, a G pixel that receives G (green) light, a B pixel that receives B (blue) light, and an IR pixel that receives IR (infrared) light, and repeatedly arranging this unit in a matrix.
  • Whether each pixel 200 is an R pixel, B pixel, G pixel, or IR pixel can be controlled, for example, by inserting a color filter layer between the photodiode 12 and the on-chip lens 17.
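The 2 x 2 repetition described above can be sketched in a few lines. This is illustrative only: the text says the four pixel types share a 2 x 2 unit but does not fix their positions, so the placement below is an assumption.

```python
# Hypothetical sketch of tiling a 2x2 RGB-IR unit into a pixel array.
# The placement of R/G/B/IR inside the unit is an assumption.

UNIT = [["R", "G"],
        ["B", "IR"]]  # assumed arrangement of the 2x2 unit

def pixel_type(row: int, col: int) -> str:
    """Pixel type at (row, col) when the 2x2 unit is tiled in a matrix."""
    return UNIT[row % 2][col % 2]

print([[pixel_type(r, c) for c in range(4)] for r in range(2)])
# → [['R', 'G', 'R', 'G'], ['B', 'IR', 'B', 'IR']]
```

Under this sketch, a color filter layer over each pixel would simply follow the same tiling.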
  • The CAPD type ToF sensor is an indirect ToF sensor that applies a voltage directly to the semiconductor substrate 51 to generate a current within the substrate, and rapidly modulates a wide photoelectric conversion region within the substrate, thereby distributing the photoelectrically converted charges.
  • FIG. 7 shows an example of a pixel circuit when the photodetection device 1 is a CAPD type ToF sensor.
  • the pixel 200 in FIG. 7 has signal extraction sections 301-1 and 301-2 within the semiconductor substrate 51.
  • the signal extraction section 301-1 includes at least an N+ semiconductor region 311-1, which is an N-type semiconductor region, and a P+ semiconductor region 312-1, which is a P-type semiconductor region.
  • the signal extraction section 301-2 includes at least an N+ semiconductor region 311-2, which is an N-type semiconductor region, and a P+ semiconductor region 312-2, which is a P-type semiconductor region.
  • the pixel 200 has a transfer transistor 321A, an FD 322A, a reset transistor 323A, an amplification transistor 324A, and a selection transistor 325A for the signal extraction section 301-1.
  • the pixel 200 includes a transfer transistor 321B, an FD 322B, a reset transistor 323B, an amplification transistor 324B, and a selection transistor 325B for the signal extraction section 301-2.
  • the vertical drive section applies a predetermined voltage MIX1 (first voltage) to the P+ semiconductor region 312-1 and a predetermined voltage MIX2 (second voltage) to the P+ semiconductor region 312-2.
  • one of the voltages MIX1 and MIX2 is 1.5V, and the other is 0V.
  • the P+ semiconductor regions 312-1 and 312-2 are voltage application parts to which a first voltage or a second voltage is applied.
  • the N+ semiconductor regions 311-1 and 311-2 are charge detection sections that detect and accumulate charges generated by photoelectric conversion of light incident on the semiconductor substrate 51.
  • When the drive signal supplied to its gate electrode becomes active, the transfer transistor 321A becomes conductive in response, thereby transferring the charges accumulated in the N+ semiconductor region 311-1 to the FD 322A.
  • Similarly, when the drive signal supplied to its gate electrode becomes active, the transfer transistor 321B becomes conductive in response, thereby transferring the charges accumulated in the N+ semiconductor region 311-2 to the FD 322B.
  • the FD 322A temporarily holds the charge supplied from the N+ semiconductor region 311-1.
  • the FD 322B temporarily holds the charge supplied from the N+ semiconductor region 311-2.
  • When the reset drive signal RSTg supplied to the gate electrode becomes active, the reset transistor 323A becomes conductive in response, thereby resetting the potential of the FD 322A to a predetermined level (reset voltage VDD).
  • the reset transistor 323B becomes conductive in response to the activation of the reset drive signal RSTg supplied to the gate electrode, thereby resetting the potential of the FD 322B to a predetermined level (reset voltage VDD). Note that when reset transistors 323A and 323B are activated, transfer transistors 321A and 321B are also activated at the same time.
  • The amplification transistor 324A has its source electrode connected to the vertical signal line 211A via the selection transistor 325A, and constitutes a source follower circuit together with the load MOS of the constant current source circuit section 326A connected to one end of the vertical signal line 211A.
  • The amplification transistor 324B has its source electrode connected to the vertical signal line 211B via the selection transistor 325B, and constitutes a source follower circuit together with the load MOS of the constant current source circuit section 326B connected to one end of the vertical signal line 211B.
  • the selection transistor 325A is connected between the source electrode of the amplification transistor 324A and the vertical signal line 211A.
  • When the selection drive signal SELg supplied to the gate electrode becomes active, the selection transistor 325A becomes conductive in response and outputs the pixel signal from the amplification transistor 324A to the vertical signal line 211A.
  • the selection transistor 325B is connected between the source electrode of the amplification transistor 324B and the vertical signal line 211B.
  • When the selection drive signal SELg supplied to the gate electrode becomes active, the selection transistor 325B becomes conductive in response and outputs the pixel signal from the amplification transistor 324B to the vertical signal line 211B.
  • Transfer transistors 321A and 321B, reset transistors 323A and 323B, amplification transistors 324A and 324B, and selection transistors 325A and 325B of pixel 200 are controlled by, for example, a vertical drive unit that drives pixel 200.
  • FIG. 8 is a cross-sectional view showing an example of the configuration of the pixel 200 when the photodetection device 1 is a CAPD type ToF sensor.
  • the entire SiGe layer 11 of the semiconductor substrate 51 serves as a photoelectric conversion section.
  • the element separation section 13 is formed at the boundary of the pixel 200, and functions as a pixel separation section that separates the photoelectric conversion section into pixel units.
  • the light shielding film 15 is formed at the boundary between the pixels 200, and functions as an inter-pixel light shielding film.
  • The element isolation part 13 is formed by forming a trench from the front surface side of the SiGe layer 11, as in the case of the gate-type ToF sensor shown in FIG. 4, and therefore has an inverted tapered cross-sectional shape in which the trench width becomes narrower from the front side toward the back side.
  • A P-well 331 is formed in the center of the pixel 200 near the interface on the front surface side of the SiGe layer 11, and a signal extraction section 301-1 and a signal extraction section 301-2 are formed.
  • the signal extraction section 301-1 includes at least an N+ semiconductor region 311-1 and a P+ semiconductor region 312-1.
  • the signal extraction section 301-2 includes at least an N+ semiconductor region 311-2 and a P+ semiconductor region 312-2.
  • A predetermined voltage MIX1 is applied to the P+ semiconductor region 312-1 of the signal extraction section 301-1 from the logic substrate side via the predetermined metal wiring 62 of the wiring layer 61, the contact electrode 65, the bonding electrodes 64 and 42, and the contact electrode 41.
  • A signal DET1 corresponding to the charge obtained by photoelectric conversion is output to the metal wiring 62 on the logic substrate side via a predetermined contact electrode 41, the bonding electrodes 42 and 64, and the contact electrode 65.
  • A predetermined voltage MIX2 is applied to the P+ semiconductor region 312-2 of the signal extraction section 301-2 from the logic substrate side via the predetermined metal wiring 62 of the wiring layer 61, the contact electrode 65, the bonding electrodes 64 and 42, and the contact electrode 41.
  • A signal DET2 corresponding to the charge obtained by photoelectric conversion is output to the metal wiring 62 on the logic substrate side via a predetermined contact electrode 41, the bonding electrodes 42 and 64, and the contact electrode 65.
  • the vertical drive unit drives the pixel 200 and distributes a signal according to the charge obtained by photoelectric conversion to FD 322A and FD 322B (FIG. 7).
  • the vertical drive section applies voltage to the two P+ semiconductor regions 312.
  • the vertical drive unit applies a voltage of 1.5V to the P+ semiconductor region 312-1 and a voltage of 0V to the P+ semiconductor region 312-2.
  • When infrared light (reflected light) enters the SiGe layer 11 from the outside through the on-chip lens 17, the infrared light is photoelectrically converted within the SiGe layer 11 into electron-hole pairs.
  • the obtained electrons are guided in the direction of the P+ semiconductor region 312-1 by the electric field between the P+ semiconductor regions 312 and move into the N+ semiconductor region 311-1.
  • electrons generated by photoelectric conversion are used as signal charges for detecting a signal corresponding to the amount of infrared light incident on the pixel 200, that is, the amount of received infrared light.
  • The accumulated charge in the N+ semiconductor region 311-1 is transferred to the FD 322A directly connected to the N+ semiconductor region 311-1, and the signal DET1 corresponding to the charge transferred to the FD 322A is read out via the amplification transistor 324A and the vertical signal line 211A.
  • This pixel signal is a signal indicating the amount of charge corresponding to the electrons detected by the N+ semiconductor region 311-1, that is, the amount of charge accumulated in the FD 322A.
  • the pixel signal can also be said to be a signal indicating the amount of infrared light received by the pixel 200.
  • pixel signals corresponding to electrons detected in the N+ semiconductor region 311-2 may also be used for distance measurement as appropriate.
  • Next, a voltage is applied to the two P+ semiconductor regions 312 by the vertical drive unit so that an electric field in the direction opposite to the electric field previously generated in the SiGe layer 11 is produced.
  • a voltage of 1.5V is applied to the P+ semiconductor region 312-2, and a voltage of 0V is applied to the P+ semiconductor region 312-1.
  • infrared light (reflected light) from the outside enters the SiGe layer 11 through the on-chip lens 17, and the infrared light is photoelectrically converted within the SiGe layer 11 to generate electrons and holes. Once converted into pairs, the obtained electrons are guided toward the P+ semiconductor region 312-2 by the electric field between the P+ semiconductor regions 312 and move into the N+ semiconductor region 311-2.
  • The accumulated charge in the N+ semiconductor region 311-2 is transferred to the FD 322B directly connected to the N+ semiconductor region 311-2, and the signal DET2 corresponding to the charge transferred to the FD 322B is read out via the amplification transistor 324B and the vertical signal line 211B.
  • pixel signals corresponding to electrons detected in the N+ semiconductor region 311-1 may also be appropriately used for distance measurement in the same manner as in the N+ semiconductor region 311-2.
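The alternating drive described above can be summarized in a small sketch. This is illustrative only; the helper name is hypothetical, while the 1.5 V / 0 V values are the examples given in the text.

```python
# Illustrative sketch: the vertical drive unit alternates which P+
# semiconductor region receives 1.5 V, steering photoelectrons toward
# N+ region 311-1 (phase 0) or N+ region 311-2 (phase 1).

def mix_voltages(phase: int):
    """Return (MIX1, MIX2) for the given accumulation phase (0 or 1)."""
    return (1.5, 0.0) if phase == 0 else (0.0, 1.5)

for phase in (0, 1):
    mix1, mix2 = mix_voltages(phase)
    target = "N+ 311-1 / FD 322A" if mix1 > mix2 else "N+ 311-2 / FD 322B"
    print(f"phase {phase}: MIX1={mix1} V, MIX2={mix2} V -> electrons to {target}")
```

The two FDs thus accumulate complementary fractions of the photoelectrons, and the distance follows from their ratio as in the gate-type case.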
  • the MOS transistor Tr is not formed on the front surface of the SiGe layer 11. Therefore, in the example of FIG. 8, the Si layer 21 is formed on the front surface of the SiGe layer 11, similar to the other cross-sectional configurations described above, but this Si layer 21 can be omitted. Since it is not necessary to form the Si layer 21, the manufacturing process becomes easier.
  • FIG. 9 shows an example of a pixel circuit when the photodetection device 1 is a direct ToF sensor using SPAD.
  • the pixel 200 in FIG. 9 includes a SPAD 401 and a readout circuit 402 composed of a transistor 411 and an inverter 412.
  • the pixel 200 also includes a switch 413.
  • the transistor 411 is a P-type MOS transistor.
  • the cathode of the SPAD 401 is connected to the drain of the transistor 411, as well as to the input terminal of the inverter 412 and one end of the switch 413.
  • the anode of the SPAD 401 is connected to a power supply voltage VA (hereinafter also referred to as anode voltage VA).
  • the SPAD 401 is a photodiode (single photon avalanche photodiode) that avalanche-amplifies generated electrons when incident light is applied and outputs a signal of cathode voltage VS.
  • the power supply voltage VA supplied to the anode of the SPAD 401 is, for example, a negative bias (negative potential) of about -20V.
  • the transistor 411 is a constant current source that operates in the saturation region, and performs passive quenching by functioning as a quenching resistor.
  • the source of the transistor 411 is connected to the power supply voltage VE, and the drain is connected to the cathode of the SPAD 401, the input terminal of the inverter 412, and one end of the switch 413.
  • the power supply voltage VE is also supplied to the cathode of the SPAD 401.
  • a pull-up resistor can also be used instead of the transistor 411 connected in series with the SPAD 401.
  • A voltage larger than the breakdown voltage VBD of the SPAD 401 (i.e., with an excess bias above VBD) is applied to the SPAD 401.
  • For example, when the breakdown voltage VBD of the SPAD 401 is 20 V and a voltage 3 V higher than VBD is applied, the power supply voltage VE supplied to the source of the transistor 411 is 3 V.
  • the breakdown voltage VBD of the SPAD 401 varies greatly depending on temperature and other factors. Therefore, the applied voltage applied to the SPAD 401 is controlled (adjusted) according to changes in the breakdown voltage VBD. For example, if the power supply voltage VE is a fixed voltage, the anode voltage VA is controlled (adjusted).
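The adjustment described above can be made concrete with a small sketch. This is not from the patent text; the function name is hypothetical, and it assumes the reverse bias across the SPAD is VE - VA, which should equal VBD plus the desired excess bias.

```python
# Hedged sketch: with a fixed cathode supply VE, the anode voltage VA is
# adjusted so that the reverse bias VE - VA stays at VBD + V_excess even
# as the breakdown voltage VBD drifts with temperature.

def anode_voltage(vbd: float, v_excess: float, ve: float = 3.0) -> float:
    """Return the anode voltage VA that keeps a constant excess bias."""
    # reverse bias across the SPAD: VE - VA = VBD + V_excess
    return ve - (vbd + v_excess)

print(anode_voltage(vbd=20.0, v_excess=3.0))  # → -20.0 (the example values in the text)
print(anode_voltage(vbd=21.0, v_excess=3.0))  # → -21.0 after VBD rises by 1 V
```

Keeping the excess bias constant keeps the avalanche trigger probability stable over temperature.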
  • the switch 413 has one end connected to the cathode of the SPAD 401, the input terminal of the inverter 412, and the drain of the transistor 411, and the other end connected to the ground (GND).
  • the switch 413 can be composed of, for example, an N-type MOS transistor, and is turned on and off according to the gating control signal VG supplied from the vertical drive section.
  • the vertical drive section supplies a high or low gating control signal VG to the switch 413 of each pixel 200 and turns the switch 413 on and off, thereby setting each pixel 200 of the pixel array section as an active pixel or an inactive pixel.
  • An active pixel is a pixel that detects incident photons
  • an inactive pixel is a pixel that does not detect incident photons.
  • When the pixel 200 is set as an active pixel, the switch 413 is turned off as described above, and the SPAD 401 is thereby set to Geiger mode.
  • When a photon enters the SPAD 401 in this state, avalanche multiplication occurs and a current flows through the SPAD 401; this current also flows through the transistor 411, causing a voltage drop due to the resistance component of the transistor 411.
  • When the cathode voltage VS of the SPAD 401 drops below 0 V, the anode-cathode voltage of the SPAD 401 falls below the breakdown voltage VBD, so the avalanche multiplication stops.
  • In other words, the current generated by avalanche multiplication flows through the transistor 411 and causes a voltage drop; as the cathode voltage VS falls with this drop, the anode-cathode voltage falls below the breakdown voltage VBD and the avalanche multiplication stops. This operation is the quench operation.
  • The inverter 412 outputs a low (Lo) pixel signal PFout when the cathode voltage VS, which is its input voltage, is equal to or higher than a predetermined threshold voltage Vth, and outputs a high (Hi) pixel signal PFout when the cathode voltage VS is less than the threshold voltage Vth. Therefore, when a photon enters the SPAD 401, avalanche multiplication occurs, and the cathode voltage VS falls below the threshold voltage Vth, the pixel signal PFout is inverted from low level to high level. Conversely, when the avalanche multiplication converges and the cathode voltage VS rises to the threshold voltage Vth or higher, the pixel signal PFout is inverted from high level to low level.
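The inverter behavior above can be sketched as a simple threshold function applied to the cathode voltage over one detection cycle. This is illustrative only; the threshold value and the voltage trace are assumptions, not values from the text.

```python
# Illustrative sketch of the read-out logic around the inverter 412:
# PFout is low while the cathode voltage VS is at or above the threshold
# Vth, and high while VS is below it.

V_TH = 1.5  # assumed inverter threshold voltage Vth [V]

def pfout(vs: float) -> str:
    """Pixel signal PFout for a given cathode voltage VS."""
    return "Lo" if vs >= V_TH else "Hi"

# Assumed cathode voltage over one photon-detection cycle: idle at VE = 3 V,
# dropping during avalanche multiplication, then recharging after the quench.
trace = [3.0, 2.0, 0.5, -0.5, 0.8, 2.0, 3.0]
print([pfout(vs) for vs in trace])
# → ['Lo', 'Lo', 'Hi', 'Hi', 'Hi', 'Lo', 'Lo']  (one output pulse per photon)
```

Each detected photon thus appears as a single Hi pulse on PFout, whose arrival time can be used for direct ToF measurement.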
  • When the pixel 200 is set as an inactive pixel, the switch 413 is turned on.
  • the cathode voltage VS of the SPAD 401 becomes 0V.
  • the voltage between the anode and cathode of the SPAD 401 becomes lower than the breakdown voltage VBD, so that even if a photon enters the SPAD 401, it does not react.
  • FIG. 10 is a cross-sectional view showing an example of the configuration of the pixel 200 when the photodetector 1 is a direct ToF sensor using SPAD.
  • the SiGe layer 11 of the semiconductor substrate 51 is configured as an N-well region.
  • the element separation section 13 is formed at the boundary of the pixel 200, and functions as a pixel separation section that separates the photoelectric conversion section into pixel units.
  • the light-shielding film 15 is formed at the boundary between the pixels 200, and functions as an inter-pixel light-shielding film.
  • The element isolation section 13 is formed by forming a trench from the front surface side of the SiGe layer 11, as in the case of the gate-type ToF sensor shown in FIG. 4, and therefore has an inverted tapered cross-sectional shape in which the trench width becomes narrower from the front side toward the back side.
  • a P-type diffusion layer 441 and an N-type diffusion layer 442 are formed in the center of the pixel 200 and near the front surface side interface of the SiGe layer 11.
  • An avalanche multiplication region 443 is formed by a depletion layer formed in a region where the P-type diffusion layer 441 and the N-type diffusion layer 442 are connected.
  • a hole accumulation layer 444 is formed over the entire area from the front surface to the back surface of the SiGe layer 11 in the inner region of the element isolation section 13 that is the peripheral portion of the boundary of the pixel 200.
  • the hole accumulation layer 444 is connected to a high concentration P-type diffusion layer 445 formed at the front surface side interface of the SiGe layer 11.
  • the SiGe layer 11 is controlled to be N-type by implanting N-type impurities, and forms an electric field that transfers electrons generated by photoelectric conversion in the pixel 200 to the avalanche multiplication region 443.
  • the P-type diffusion layer 441 is a dense P-type diffusion layer (P+) formed over almost the entire pixel region in the planar direction.
  • the N-type diffusion layer 442 is a dense N-type diffusion layer (N+) formed near the surface of the semiconductor substrate 51 and covering almost the entire pixel region, like the P-type diffusion layer 441.
  • The N-type diffusion layer 442 is a contact layer connected to the contact electrode 41 as a cathode electrode for supplying the voltage that forms the avalanche multiplication region 443, and a part of the N-type diffusion layer 442 has a convex shape extending to the contact electrode 41 on the front surface of the semiconductor substrate 51.
  • a power supply voltage VE is applied to the N-type diffusion layer 442 from the connected contact electrode 41 .
  • the hole accumulation layer 444 is a P-type diffusion layer (P) and accumulates holes. Further, the hole accumulation layer 444 is connected to a high concentration P-type diffusion layer 445 that is electrically connected to the contact electrode 41 as an anode electrode of the SPAD 401.
  • The high-concentration P-type diffusion layer 445 is a dense P-type diffusion layer (P++) formed near the surface of the SiGe layer 11 in the periphery of the pixel 200 in plan view, and constitutes a contact layer that electrically connects the hole accumulation layer 444 to the contact electrode 41.
  • a power supply voltage VA is applied to the heavily doped P-type diffusion layer 445 from the connected contact electrode 41 .
  • The SPAD 401 as a photoelectric conversion unit includes the SiGe layer 11 of the semiconductor substrate 51, the P-type diffusion layer 441, the N-type diffusion layer 442, the hole accumulation layer 444, and the high-concentration P-type diffusion layer 445; the hole accumulation layer 444 is connected to a first contact electrode 41 serving as the anode electrode, and the N-type diffusion layer 442 is connected to a second contact electrode 41 serving as the cathode electrode.
  • the SiGe layer 11 may be controlled to be P-type by implanting P-type impurities.
  • In that case, the voltage applied to the N-type diffusion layer 442 becomes the power supply voltage VA, and the voltage applied to the high-concentration P-type diffusion layer 445 becomes the power supply voltage VE.
  • the transistor 411, inverter 412, and switch 413 that constitute the readout circuit 402 of the pixel 200 are formed on the logic substrate side.
  • the MOS transistor Tr is not formed on the front surface of the SiGe layer 11. Therefore, in the example of FIG. 10, the Si layer 21 is formed on the front surface of the SiGe layer 11, as in the other cross-sectional configurations described above, but this Si layer 21 can be omitted. Since it is not necessary to form the Si layer 21, the manufacturing process becomes easier.
  • FIG. 11 shows an example of a cross-sectional configuration in which a diffraction structure is added to the pixel 200 of the direct ToF sensor shown in FIG. 10.
  • the pixel 200 in FIG. 11 is provided with a diffraction structure 481 in which a recess is formed at a predetermined depth at the interface on the light incident surface side of the SiGe layer 11, and a silicon oxide film as an insulating film is embedded in the recess.
  • the diffraction structure 481 can be formed as STI (Shallow Trench Isolation) using isotropic dry etching, for example.
  • FIG. 12 is a plan view showing the planar shape of the diffraction structure 481 in FIG. 11.
  • The diffraction structure 481 can be formed, for example, in a planar shape of a 4x4 grating pattern, as shown in FIG. 12. Note that, of course, the planar shape of the diffraction structure 481 is not limited to the 4x4 grating pattern, and may be any other shape.
  • FIG. 13 shows an example of a cross-sectional configuration in which the same diffraction structure 481 as in FIG. 11 is added to the front surface side of the SiGe layer 11 of the pixel 200 of the direct ToF sensor shown in FIG. 10.
  • the configuration other than the diffraction structure 481 is the same as that of the pixel 200 shown in FIG. 10, so a description thereof will be omitted.
  • By providing the diffraction structure 481, the optical path length can be increased due to the light scattering effect, and the light-receiving sensitivity can be improved.
  • FIGS. 11 to 13 show examples in which the diffraction structure 481 is added to the pixel structure of the direct ToF sensor shown in FIG. 10, but it goes without saying that the diffraction structure 481 may also be added to the pixel structures of the gate-type ToF sensor shown in FIG. 4, the IR image sensor shown in FIG. 6, and the CAPD type ToF sensor shown in FIG. 8.
  • FIG. 14 is a diagram illustrating a modification of the element separation section 13 (pixel separation section) in the photodetection device 1.
  • The element isolation section 13 may have a structure in which a trench penetrates the SiGe layer 11, as shown in A of FIG. 14.
  • Alternatively, the element isolation part 13 may have a structure in which it is formed in only part of the SiGe layer 11 in the depth direction, without penetrating the SiGe layer 11.
  • FIG. 14C shows an example in which the element isolation portion 13 is formed to a depth halfway through the SiGe layer 11 by a method of forming a trench from the back surface (the top surface in FIG. 14) of the SiGe layer 11.
  • FIG. 14 also shows an example in which the element isolation part 13 is formed to a depth halfway through the SiGe layer 11 by forming a trench from the front surface (the bottom surface in FIG. 14) of the SiGe layer 11.
  • The pixel 200 described above has the diffraction structure 481 formed by STI or the like, but the diffraction structure of the pixel 200 may instead be formed using trenches.
  • FIG. 15A shows an example in which a diffraction structure 491 is provided in the region near the center of the pixel 200 using a method of forming a trench from the back side of the SiGe layer 11.
  • FIG. 15B shows an example in which a diffraction structure 491 is provided in a region near the center of the pixel 200 using a method of forming a trench from the front surface side of the SiGe layer 11.
  • In either case, the trench can be formed deeper than the diffraction structure 481 shown in FIGS. 11 to 13.
  • FIG. 15C is a plan view showing the planar shape of the diffraction structure 491.
  • the diffraction structure 491 can be formed, for example, in a cross-shaped planar shape that intersects at the center of the pixel, as shown in FIG. 15C.
  • the planar shape of the diffraction structure 491 is not limited to a cross-shaped planar shape, and may be a grating pattern like the diffraction structure 481 or other shapes.
  • FIG. 16 is a block diagram showing a schematic configuration example of the photodetecting device 1 having the above-mentioned pixel 200.
  • The photodetecting device 1 shown in FIG. 16 includes a pixel array section 521, in which the pixels 200 described above are arranged two-dimensionally in the row and column directions, and a peripheral circuit section.
  • the peripheral circuit section includes, for example, a vertical drive section 522, a column processing section 523, a horizontal drive section 524, a system control section 525, and the like.
  • the photodetector 1 is further provided with a signal processing section 526 and a data storage section 527.
  • the signal processing section 526 and the data storage section 527 may be mounted on the same substrate as the photodetection device 1, or may be arranged on a substrate in a module separate from the photodetection device 1.
  • The pixel array section 521 has a configuration in which pixels 200, which generate charges according to the amount of received light and output signals according to the charges, are arranged in a matrix in the row and column directions.
  • the row direction refers to the direction in which the pixels 200 are arranged in the horizontal direction
  • the column direction refers to the direction in which the pixels 200 are arranged in the vertical direction.
  • the row direction is the horizontal direction in the figure
  • the column direction is the vertical direction in the figure. The pixel array section 521 thus includes a plurality of pixels 200 that photoelectrically convert incident light and output a signal corresponding to the resulting charge.
  • When the pixel 200 has the configuration explained in FIGS. 3 and 4, the photodetection device 1 is a gate-type ToF sensor, and when it has the configuration explained in FIGS. 5 and 6, the photodetection device 1 is an IR image sensor. Further, when the pixel 200 has the configuration explained in FIGS. 7 and 8, the photodetection device 1 is a CAPD type ToF sensor, and when the pixel 200 has the configuration explained in FIGS. 9 and 10, the photodetection device 1 is a direct ToF sensor using SPAD.
  • For the matrix-like pixel arrangement, a pixel drive line 528 is wired along the row direction for each pixel row, and a vertical signal line 529 is wired along the column direction for each pixel column.
  • the pixel drive line 528 transmits a drive signal for driving when reading a signal from the pixel 200.
  • Although the pixel drive line 528 is shown as one wiring in FIG. 16, it is not limited to one wiring.
  • One end of the pixel drive line 528 is connected to an output end corresponding to each row of the vertical drive section 522.
  • the vertical signal line 529 corresponds to the vertical signal lines 211A and 211B described with reference to FIG. 3 and the like.
  • the vertical drive section 522 is composed of a shift register, an address decoder, etc., and drives each pixel 200 of the pixel array section 521 simultaneously or in units of rows. That is, the vertical drive unit 522 forms a control circuit that controls the operation of each pixel 200 of the pixel array unit 521, together with the system control unit 525 that controls the vertical drive unit 522.
  • Pixel signals output from each pixel 200 in a pixel row according to drive control by the vertical drive unit 522 are input to the column processing unit 523 through the vertical signal line 529.
  • the column processing unit 523 performs predetermined signal processing on the pixel signal output from each pixel 200 through the vertical signal line 529, and temporarily holds the pixel signal after the signal processing. Specifically, the column processing unit 523 performs noise removal processing, AD (Analog to Digital) conversion processing, etc. as signal processing.
  • the horizontal drive section 524 is composed of a shift register, an address decoder, etc., and sequentially selects unit circuits corresponding to the pixel columns of the column processing section 523. By this selective scanning by the horizontal driving section 524, pixel signals subjected to signal processing for each unit circuit in the column processing section 523 are sequentially output.
• The system control unit 525 includes a timing generator that generates various timing signals, and performs drive control of the vertical drive section 522, the column processing section 523, the horizontal drive section 524, and the like based on the various timing signals generated by the timing generator.
  • the signal processing unit 526 has at least an arithmetic processing function, and performs various signal processing such as arithmetic processing based on the pixel signal output from the column processing unit 523.
  • the data storage section 527 temporarily stores data necessary for signal processing in the signal processing section 526.
  • the photodetector 1 configured as described above has a circuit configuration called a column ADC type in which an AD conversion circuit that performs AD conversion processing in the column processing section 523 is arranged for each pixel column.
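The column ADC readout flow described above (row-wise selection by the vertical drive section, column-parallel AD conversion in the column processing section, and sequential output under the horizontal drive section) can be modeled schematically as follows. This is an illustrative sketch only; the names and the simple quantizer are assumptions, not the disclosed circuit.

```python
# Schematic model of a column-ADC readout: one pixel row is selected at a
# time, every column converts its analog level in parallel, and the
# converted values are scanned out sequentially. Illustrative only.

def read_out(pixel_array, lsb_volts=0.01):
    frame = []
    for row in pixel_array:                # vertical drive: row-wise selection
        # per-column AD conversion (simple uniform quantizer as a stand-in)
        digital_row = [round(v / lsb_volts) for v in row]
        frame.append(digital_row)          # horizontal drive: sequential output
    return frame

analog = [[0.10, 0.25], [0.05, 0.30]]      # pixel voltages per row
codes = read_out(analog)                   # [[10, 25], [5, 30]]
```

The point of the column-parallel arrangement is that the conversion of all pixels in a row happens simultaneously, so the frame rate is limited by the number of rows rather than the total number of pixels.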
  • the photodetector 1 is a back-illuminated device that has a high quantum efficiency in the infrared region because a photodiode 12 as a photoelectric conversion section is formed in the SiGe layer 11.
  • the photodetection device 1 is mounted on a vehicle, for example, and can measure the distance to an object outside the vehicle and generate a photographed image.
  • the photodetecting device 1 can be installed in a smartphone or the like, and can measure the distance to an object or generate a photographed image.
  • FIG. 17 is a diagram showing an example of use of an image sensor using the above-described photodetection device 1.
  • the above-described photodetection device 1 can be used as an image sensor in various cases for sensing light such as infrared light, visible light, ultraviolet light, and X-rays, for example, as described below.
• Devices that take images for viewing purposes, such as digital cameras and mobile devices with camera functions
  • Devices used for transportation such as in-vehicle sensors that take pictures of the rear, surroundings, and interior of the car, surveillance cameras that monitor moving vehicles and roads, and distance sensors that measure the distance between vehicles, etc.
• Devices used in home appliances such as TVs, refrigerators, and air conditioners, which photograph user gestures and operate the appliances according to those gestures
• Devices used for medical and healthcare purposes, such as endoscopes and devices that perform blood vessel imaging by receiving infrared light
• Devices used for security, such as surveillance cameras for crime prevention and cameras for person authentication
• Devices used for beauty purposes, such as skin measurement devices that photograph the skin and microscopes that photograph the scalp
• Devices used for sports, such as action cameras and wearable cameras for sports applications
• Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
• The technology of the present disclosure is not limited to application to photodetection devices. That is, the technology of the present disclosure is applicable to all electronic devices that use a solid-state imaging device in an image capture unit (photoelectric conversion unit), such as an imaging device like a digital still camera or a video camera, a mobile terminal device having an imaging function, or a copying machine that uses a solid-state imaging device in an image reading unit.
  • the solid-state imaging device may be formed as a single chip, or may be a module having an imaging function in which an imaging section and a signal processing section or an optical system are packaged together.
  • FIG. 18 is a block diagram showing a configuration example of an imaging device as an electronic device to which the technology of the present disclosure is applied.
• The imaging device 1000 in FIG. 18 includes an optical section 1001 consisting of a lens group and the like, a solid-state imaging device (imaging device) 1002 in which the configuration of the photodetection device 1 in FIG. 1 is adopted, and a DSP (Digital Signal Processor) circuit 1003.
  • the imaging apparatus 1000 also includes a frame memory 1004, a display section 1005, a recording section 1006, an operation section 1007, and a power supply section 1008.
  • DSP circuit 1003, frame memory 1004, display section 1005, recording section 1006, operation section 1007, and power supply section 1008 are interconnected via bus line 1009.
  • the optical section 1001 takes in incident light (image light) from a subject and forms an image on the imaging surface of the solid-state imaging device 1002.
  • the solid-state imaging device 1002 converts the amount of incident light imaged onto the imaging surface by the optical section 1001 into an electrical signal for each pixel, and outputs the electric signal as a pixel signal.
• As the solid-state imaging device 1002, the photodetection device 1 of FIG. 1, that is, a photodetection device that has high quantum efficiency in the infrared region and improved sensitivity to light in the infrared region, can be used.
  • the display unit 1005 is configured with a thin display such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and displays moving images or still images captured by the solid-state imaging device 1002.
  • a recording unit 1006 records a moving image or a still image captured by the solid-state imaging device 1002 on a recording medium such as a hard disk or a semiconductor memory.
  • the operation unit 1007 issues operation commands regarding various functions of the imaging device 1000 under operation by the user.
  • a power supply unit 1008 appropriately supplies various power supplies that serve as operating power for the DSP circuit 1003, frame memory 1004, display unit 1005, recording unit 1006, and operation unit 1007 to these supply targets.
• By using the photodetection device 1 described above, the sensitivity to light in the infrared region can be improved. Therefore, the quality of captured images can also be improved in the imaging device 1000, such as a video camera, a digital still camera, or a camera module for mobile devices such as mobile phones.
  • Example of application to endoscopic surgery system The technology according to the present disclosure (this technology) can be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 19 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (present technology) can be applied.
  • FIG. 19 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
• The endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 is composed of a lens barrel 11101 whose distal end is inserted into a body cavity of a patient 11132 over a predetermined length, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
• In the illustrated example, an endoscope 11100 configured as a so-called rigid scope having a rigid lens barrel 11101 is shown, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101.
• A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated through the objective lens toward an observation target within the body cavity of the patient 11132.
• The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from an observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
  • the CCU 11201 is configured with a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102, and performs various image processing on the image signal, such as development processing (demosaic processing), for displaying an image based on the image signal.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under control from the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
• For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
• A treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for cauterizing or incising tissue, sealing blood vessels, or the like.
• The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and a working space for the operator.
  • the recorder 11207 is a device that can record various information regarding surgery.
  • the printer 11208 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies irradiation light to the endoscope 11100 when photographing the surgical site can be configured, for example, from a white light source configured by an LED, a laser light source, or a combination thereof.
• When the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203.
• In this case, the laser light from each of the RGB laser light sources is irradiated onto the observation target in a time-division manner, and the drive of the image sensor of the camera head 11102 is controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the image sensor.
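The time-division color capture described above can be sketched as follows (an illustrative assumption, not the disclosed implementation): three monochrome frames, each taken under R, G, or B laser illumination in turn, are combined into one color frame without any color filter on the sensor.

```python
# Combine three monochrome frames captured under time-divided R, G, B
# illumination into one color frame (illustrative sketch only).

def combine_rgb(frame_r, frame_g, frame_b):
    """Each frame is a 2-D list of intensities; the result holds an
    (R, G, B) tuple per pixel."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)
    ]

r = [[10, 20]]   # frame under red illumination
g = [[30, 40]]   # frame under green illumination
b = [[50, 60]]   # frame under blue illumination
color = combine_rgb(r, g, b)   # [[(10, 30, 50), (20, 40, 60)]]
```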
  • the driving of the light source device 11203 may be controlled so that the intensity of the light it outputs is changed at predetermined time intervals.
• By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the changes in light intensity to acquire images in a time-division manner and compositing the images, an image with a high dynamic range can be generated.
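The exposure-synchronized composition described above can be sketched as follows. This is a simplified illustration under assumed names and a simple merge rule: pixels that saturate in the long-exposure frame are replaced by the short-exposure values, normalized by the exposure ratio.

```python
# Merge a long- and a short-exposure frame into one high-dynamic-range
# frame (simplified sketch; the merge rule is an assumption).

def merge_hdr(long_frame, short_frame, exposure_ratio, saturation=255):
    merged = []
    for lrow, srow in zip(long_frame, short_frame):
        merged.append([
            # saturated long-exposure pixel: use the normalized short value
            s * exposure_ratio if l >= saturation else l
            for l, s in zip(lrow, srow)
        ])
    return merged

long_f  = [[100, 255]]   # second pixel saturated
short_f = [[ 25,  80]]   # captured with 1/4 the exposure
hdr = merge_hdr(long_f, short_f, exposure_ratio=4)   # [[100, 320]]
```

The merged frame keeps the low-noise long-exposure values in dark areas while recovering detail in areas that would otherwise clip, which is the effect the time-division acquisition aims at.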
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band compatible with special light observation.
• In special light observation, for example, by utilizing the wavelength dependence of light absorption in body tissue and irradiating light in a narrower band than the irradiation light used in normal observation (that is, white light), so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is photographed with high contrast.
  • fluorescence observation may be performed in which an image is obtained using fluorescence generated by irradiating excitation light.
• In fluorescence observation, it is possible to irradiate body tissue with excitation light and observe fluorescence from the body tissue (autofluorescence observation), or to locally inject a reagent such as indocyanine green (ICG) into body tissue and irradiate the tissue with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be able to supply narrowband light and/or excitation light compatible with such special light observation.
  • FIG. 20 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 19.
  • the camera head 11102 includes a lens unit 11401, an imaging section 11402, a driving section 11403, a communication section 11404, and a camera head control section 11405.
  • the CCU 11201 includes a communication section 11411, an image processing section 11412, and a control section 11413. Camera head 11102 and CCU 11201 are communicably connected to each other by transmission cable 11400.
  • the lens unit 11401 is an optical system provided at the connection part with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 is composed of an image sensor.
  • the imaging unit 11402 may include one image sensor (so-called single-plate type) or a plurality of image sensors (so-called multi-plate type).
  • image signals corresponding to RGB are generated by each imaging element, and a color image may be obtained by combining them.
• Further, the imaging unit 11402 may be configured to include a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (three-dimensional) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue at the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each imaging element.
  • the imaging unit 11402 does not necessarily have to be provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is constituted by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405. Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 to the CCU 11201 via the transmission cable 11400 as RAW data.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405.
• The control signal includes information regarding imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of capturing, and/or information specifying the magnification and focus of the captured image.
  • the above-mentioned frame rate, exposure value, magnification, focus, and other imaging conditions may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
• In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • the image signal and control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal, which is RAW data, transmitted from the camera head 11102.
• The control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of a captured image obtained by such imaging. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a captured image showing the surgical site, etc., based on the image signal subjected to image processing by the image processing unit 11412.
• The control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like.
  • the control unit 11413 may use the recognition result to superimpose and display various types of surgical support information on the image of the surgical site. By displaying the surgical support information in a superimposed manner and presenting it to the surgeon 11131, it becomes possible to reduce the burden on the surgeon 11131 and allow the surgeon 11131 to proceed with the surgery reliably.
  • the transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
  • communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the lens unit 11401 and the imaging section 11402 of the camera head 11102 among the configurations described above.
• Specifically, the photodetection device 1 as an IR image sensor having the configuration described in FIGS. 5 and 6 can be applied as the imaging section 11402.
  • the technology according to the present disclosure (this technology) can be applied to various products.
• For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as a car, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 21 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
• In FIG. 21, as the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
• For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operations of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp.
• Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, may be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
  • the external information detection unit 12030 detects information external to the vehicle in which the vehicle control system 12000 is mounted.
  • an imaging section 12031 is connected to the outside-vehicle information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
• Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, cars, obstacles, signs, or characters on the road surface, or may perform distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electrical signal as an image or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • a driver condition detection section 12041 that detects the condition of the driver is connected to the in-vehicle information detection unit 12040.
• The driver condition detection section 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver condition detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
• The microcomputer 12051 calculates control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010.
• For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
• Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the outside information detection unit 12030.
• For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio and image output unit 12052 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 22 is a diagram showing an example of the installation position of the imaging section 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle 12100.
  • An imaging unit 12101 provided in the front nose and an imaging unit 12105 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 12100.
  • Imaging units 12102 and 12103 provided in the side mirrors mainly capture images of the sides of the vehicle 12100.
  • An imaging unit 12104 provided in the rear bumper or back door mainly captures images of the rear of the vehicle 12100.
  • the images of the front acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 22 shows an example of the imaging range of the imaging units 12101 to 12104.
  • An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose
  • imaging ranges 12112 and 12113 indicate imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively
• an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.
• For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
• Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
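The preceding-vehicle extraction logic described above can be sketched as follows. This is an illustrative model only; the object representation and field names are assumptions, not taken from the patent.

```python
# Sketch of preceding-vehicle extraction from distance information:
# among detected objects on the own travel path moving in roughly the
# same direction at or above a speed threshold, pick the closest one.
# Illustrative only; the object layout is an assumption.

def extract_preceding_vehicle(objects, min_speed_kmh=0.0):
    """objects: list of dicts with 'distance_m', 'speed_kmh' (speed in
    roughly the same direction as the own vehicle), and 'on_path'
    (True if the object is on the own travel path)."""
    candidates = [
        o for o in objects
        if o["on_path"] and o["speed_kmh"] >= min_speed_kmh
    ]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

objs = [
    {"distance_m": 40.0, "speed_kmh": 55.0, "on_path": True},
    {"distance_m": 25.0, "speed_kmh": 60.0, "on_path": True},
    {"distance_m": 10.0, "speed_kmh": 0.0,  "on_path": False},
]
lead = extract_preceding_vehicle(objs)   # the on-path object at 25 m
```

The selected object can then drive the inter-vehicle-distance control loop (automatic braking and acceleration) mentioned above.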
• Further, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of a collision, the microcomputer 12051 can provide driving support for collision avoidance by outputting a warning to the driver via the audio speaker 12061 and the display section 12062, and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
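The collision-risk determination described above can be sketched, for example, with a time-to-collision criterion. The patent does not specify the risk metric; the TTC rule, names, and threshold below are assumptions for illustration.

```python
# Illustrative collision-risk check using time-to-collision (TTC).
# The metric and threshold are assumptions for this sketch.

def collision_risk(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Return True when the obstacle would be reached within the TTC
    threshold at the current closing speed."""
    if closing_speed_mps <= 0.0:       # not closing in on the obstacle
        return False
    return distance_m / closing_speed_mps < ttc_threshold_s

risky = collision_risk(distance_m=15.0, closing_speed_mps=10.0)  # TTC = 1.5 s
safe  = collision_risk(distance_m=50.0, closing_speed_mps=10.0)  # TTC = 5.0 s
```

When the check fires, the system would issue the warning and, if necessary, the forced deceleration or avoidance steering described above.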
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether the pedestrian is present in the images captured by the imaging units 12101 to 12104.
• Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
• When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output section 12052 controls the display unit 12062 to superimpose and display a rectangular contour line for emphasis on the recognized pedestrian.
  • the audio image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
  • the photodetection device 1 as an IR imaging sensor having the configuration described in FIGS. 5 and 6 can be applied as the imaging unit 12031.
  • the photodetection device 1 as a direct ToF sensor having the configuration described in FIGS. 9 and 10 can be applied as the imaging unit 12031.
  • in the photodetection device described above, the first conductivity type is P type, the second conductivity type is N type, and electrons are used as the signal charge.
  • the present disclosure can also be applied to a photodetection device that uses holes as the signal charge. That is, the first conductivity type can be N type, the second conductivity type can be P type, and each of the aforementioned semiconductor regions can be configured with a semiconductor region of the opposite conductivity type.
  • the present disclosure is not limited to application to a photodetection device that detects the distribution of the amount of incident visible light and captures it as an image. It can also be applied to a photodetection device that detects the distribution of the amount of incident infrared rays, X-rays, particles, or the like, and, in a broader sense, to photodetection devices in general (physical quantity distribution detection devices), such as fingerprint detection sensors that detect the distribution of other physical quantities such as pressure and capacitance and capture them as images.
  • the technology of the present disclosure can also take the following configurations.
  • the silicon germanium layer is formed with a constant germanium concentration.
  • the photodetection device according to any one of (1) to (4) above, which is a gate-type ToF sensor having, in each pixel, the photoelectric conversion section, a transfer transistor, an amplification transistor, a selection transistor, a reset transistor, and a charge discharge transistor.
  • the photodetection device according to any one of (1) to (5) above, which is a CAPD-type ToF sensor including, in each pixel, the photoelectric conversion section, a voltage application section that applies a predetermined voltage to the photoelectric conversion section, and a charge detection section that detects the charge generated in the photoelectric conversion section.
  • the photodetection device according to any one of (1) to (5) above, which is a direct ToF sensor having, in each pixel, an avalanche photodiode including an avalanche multiplication region as the photoelectric conversion section.
  • the photodetection device according to any one of (1) to (13), further comprising a diffraction structure on the first surface side of the silicon germanium layer.
  • a photodetection device comprising: a silicon germanium layer in which a photoelectric conversion section is formed; an inter-pixel light shielding film formed on a first surface side that is the light incident surface side of the silicon germanium layer; and a MOS transistor formed on a second surface side opposite to the first surface of the silicon germanium layer.
  • the silicon germanium layer is formed with a constant germanium concentration.
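The collision-avoidance support described in the bullets above (classify obstacles, score a collision risk, warn and decelerate when the risk reaches a set value) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `Obstacle` type, the time-to-collision heuristic in `collision_risk`, and `RISK_THRESHOLD` are all assumptions introduced here; the actual microcomputer 12051 works from the distance information provided by the imaging units 12101 to 12104.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float         # distance from the vehicle, from the imaging units
    closing_speed_mps: float  # relative speed toward the vehicle (negative = receding)
    visible_to_driver: bool   # classification: visible vs. difficult to see

RISK_THRESHOLD = 0.5  # set value above which driving support is triggered

def collision_risk(obs: Obstacle) -> float:
    """Map time-to-collision (TTC) to a 0..1 risk score (illustrative heuristic)."""
    if obs.closing_speed_mps <= 0:
        return 0.0  # not closing in, no collision risk
    ttc = obs.distance_m / obs.closing_speed_mps  # seconds until collision
    risk = max(0.0, min(1.0, 1.0 - ttc / 10.0))   # TTC under 10 s starts to score
    # Obstacles the driver cannot see are weighted as riskier.
    return min(1.0, risk * (1.5 if not obs.visible_to_driver else 1.0))

def support_action(obs: Obstacle) -> str:
    """Warning via speaker/display plus forced deceleration when risk is high."""
    return "warn_and_brake" if collision_risk(obs) >= RISK_THRESHOLD else "none"

# A hidden obstacle 8 m ahead, closing at 4 m/s, triggers support.
print(support_action(Obstacle(8.0, 4.0, visible_to_driver=False)))  # → warn_and_brake
```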
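The two-step pedestrian-recognition procedure above (extract feature points outlining an object, then pattern-match the point series) can be sketched in a simplified form. Everything here is illustrative: `PEDESTRIAN_TEMPLATE`, the bounding-box normalization, and the mean-distance matching metric are stand-ins for the far richer feature extraction and pattern matching an actual infrared-camera pipeline performs.

```python
import math

# Hypothetical outline template: five (x, y) feature points of a "pedestrian".
PEDESTRIAN_TEMPLATE = [(0.5, 0.0), (0.3, 0.3), (0.7, 0.3), (0.4, 1.0), (0.6, 1.0)]

def normalize(points):
    """Scale outline points into a unit bounding box so matching is size-invariant."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    x0, y0 = min(xs), min(ys)
    return [((x - x0) / w, (y - y0) / h) for x, y in points]

def match_score(points, template):
    """Mean distance between corresponding normalized points (lower = better match)."""
    pts, tpl = normalize(points), normalize(template)
    return sum(math.dist(p, t) for p, t in zip(pts, tpl)) / len(tpl)

def is_pedestrian(outline_points, threshold=0.1):
    return match_score(outline_points, PEDESTRIAN_TEMPLATE) < threshold

# A scaled-and-shifted copy of the template outline matches exactly.
scaled = [(x * 40 + 5, y * 90 + 10) for x, y in PEDESTRIAN_TEMPLATE]
print(is_pedestrian(scaled))  # → True
```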
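The direct ToF sensor of the configurations above measures distance from the round-trip time of a light pulse, d = c·t/2. The sketch below shows that conversion, plus an illustrative histogram-peak step: real SPAD-based sensors accumulate many photon arrival times into a histogram and take the peak bin, but the bin width and counts here are assumptions for illustration only.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the target given the pulse's round-trip time: d = c * t / 2."""
    return C * t_seconds / 2.0

def distance_from_histogram(counts, bin_width_s):
    """Pick the peak bin of a photon-arrival histogram and convert it to distance."""
    peak_bin = max(range(len(counts)), key=counts.__getitem__)
    t = (peak_bin + 0.5) * bin_width_s  # use the bin-center arrival time
    return distance_from_round_trip(t)

# A pulse returning after about 66.7 ns corresponds to a target roughly 10 m away.
print(round(distance_from_round_trip(66.7e-9), 2))  # → 10.0
```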

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The present disclosure relates to: a photodetection device that makes it possible to obtain a photoelectric conversion unit having a high quantum efficiency in the infrared region; a method for manufacturing the photodetection device; and an electronic apparatus. The photodetection device comprises: a silicon germanium layer provided with a photoelectric conversion unit; an inter-pixel light shielding film formed on a first surface side of the silicon germanium layer, the first surface side being the light incidence surface side; and a MOS transistor formed on a second surface side of the silicon germanium layer, the second surface side being opposite the first surface side. The silicon germanium layer is formed so as to have a constant germanium concentration. This technology can be applied, for example, to a time-of-flight sensor or an imaging sensor.
PCT/JP2022/034521 2022-09-15 2022-09-15 Photodetection device, method for producing same, and electronic apparatus WO2024057470A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/034521 WO2024057470A1 (fr) 2022-09-15 2022-09-15 Photodetection device, method for producing same, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/034521 WO2024057470A1 (fr) 2022-09-15 2022-09-15 Photodetection device, method for producing same, and electronic apparatus

Publications (1)

Publication Number Publication Date
WO2024057470A1 true WO2024057470A1 (fr) 2024-03-21

Family

ID=90274547

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/034521 WO2024057470A1 (fr) Photodetection device, method for producing same, and electronic apparatus 2022-09-15 2022-09-15

Country Status (1)

Country Link
WO (1) WO2024057470A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013033864A * 2011-08-02 2013-02-14 Sony Corp Method for manufacturing solid-state imaging element, solid-state imaging element, and electronic apparatus
US20150279894A1 * 2014-03-28 2015-10-01 Taiwan Semiconductor Manufacturing Company, Ltd. CMOS Image Sensor with Epitaxial Passivation Layer
JP2016001633A * 2014-06-11 2016-01-07 Sony Corporation Solid-state imaging element and electronic device
JP2020516200A * 2017-04-04 2020-05-28 Artilux Inc. High-speed light sensing apparatus III
JP2021158418A * 2020-03-25 2021-10-07 Toppan Printing Co., Ltd. Solid-state imaging element and imaging system
WO2022014365A1 * 2020-07-17 2022-01-20 Sony Semiconductor Solutions Corporation Light-receiving element, manufacturing apparatus for same, and electronic device


Similar Documents

Publication Publication Date Title
JP7439214B2 (ja) Solid-state imaging element and electronic apparatus
JP2023086799A (ja) Photodetection element
KR20180117601A (ko) Solid-state imaging element
CN115696074B (zh) Photodetection device
US20240047499A1 Solid-state imaging device, method for manufacturing the same, and electronic apparatus
JP7399105B2 (ja) Solid-state imaging element and video recording device
JP7452962B2 (ja) Imaging device
JPWO2018180575A1 (ja) Solid-state imaging element, electronic apparatus, and manufacturing method
KR20210075075A (ko) Imaging element and electronic apparatus
TW202133421A (zh) Solid-state imaging device and electronic device
US20240079428A1 Imaging device
EP4124010A1 (fr) Sensor assembly, method for manufacturing same, and imaging device
WO2024057470A1 (fr) Photodetection device, method for producing same, and electronic apparatus
CN114270516A (zh) Imaging element and imaging device
WO2023248926A1 (fr) Imaging element and electronic device
WO2023248925A1 (fr) Imaging element and electronic device
WO2023188899A1 (fr) Light detection device and electronic apparatus
WO2024058010A1 (fr) Light detection device and electronic apparatus
WO2024024515A1 (fr) Photodetection device and ranging system
WO2024057814A1 (fr) Light detection device and electronic instrument
WO2023234069A1 (fr) Imaging device and electronic apparatus
WO2023210238A1 (fr) Light detection device and electronic apparatus
US20220344390A1 Organic cis image sensor
CN117716504A (zh) Photodetection device, method for manufacturing photodetection device, and electronic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22958794

Country of ref document: EP

Kind code of ref document: A1