WO2021215299A1 - Imaging element and imaging device - Google Patents

Imaging element and imaging device

Info

Publication number
WO2021215299A1
WO2021215299A1 (PCT/JP2021/015278)
Authority
WO
WIPO (PCT)
Prior art keywords
light
semiconductor substrate
image pickup
pixel
pickup device
Prior art date
Application number
PCT/JP2021/015278
Other languages
English (en)
Japanese (ja)
Inventor
雅史 坂東
至通 熊谷
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021215299A1

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 Devices including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 Devices controlled by radiation
    • H01L 27/146 Imager structures
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H04N 25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N 25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Definitions

  • The present disclosure relates to, for example, a back-illuminated imaging element and an imaging device equipped with the element.
  • In a back-illuminated CMOS (Complementary Metal Oxide Semiconductor) image sensor, light transmitted without being absorbed by the light receiving portion may be reflected by the wiring layer (metal layer) provided below the light receiving portion and re-enter the light receiving portion. When the intensity of this reflected light is non-uniform from pixel to pixel, optical color mixing occurs between adjacent pixels.
  • Patent Document 1 discloses a solid-state imaging device in which image quality is improved by periodically providing, below the light receiving portion, a first reflector that is uniform for each pixel and a second reflector between adjacent pixels.
  • Image pickup devices are thus required to further improve image quality.
  • An image pickup element as an embodiment of the present disclosure includes: a first semiconductor substrate having a first surface as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate, in which a plurality of wiring layers are laminated with interlayer insulating layers in between; and a first light absorbing layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more.
  • An image pickup apparatus as an embodiment of the present disclosure includes the image pickup element described above.
  • In the image pickup element and the image pickup apparatus of these embodiments, a first light absorbing layer formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more is provided between the second surface of the first semiconductor substrate, which has a light receiving portion for each pixel and whose first surface serves as the light incident surface, and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side. As a result, light that passes through the first semiconductor substrate without being absorbed by the light receiving portion is less likely to be reflected by the wiring layers in the multilayer wiring layer and re-enter adjacent pixels.
  • 2-2. Modification 2 (example in which unevenness is provided on the surface of the wiring layer directly below the light absorption layer)
  • 2-3. Modification 3 (example in which a light absorption layer is laminated on a plurality of wiring layers)
  • 2-4. Modification 4 (example in which a light absorption layer is provided on the surface of the pixel separation portion facing the semiconductor substrate)
  • 2-5. Modification 5 (example of an image pickup element provided with a charge holding portion)
  • 2-6. Modification 6 (example in which pixel transistors are provided on a separate substrate and a light absorption layer is provided between the semiconductor substrate having the light receiving portion and the other substrate)
  • FIG. 1 shows an example of a cross-sectional configuration of an image pickup device (image pickup device 1) according to an embodiment of the present disclosure.
  • FIG. 2 shows an example of the overall configuration of an image pickup apparatus (image pickup apparatus 100) including the image pickup element 1 shown in FIG. 1.
  • The image pickup apparatus 100 is, for example, a CMOS image sensor used in an electronic device such as a digital still camera or a video camera, and has, as an image pickup area, a pixel portion 100A in which a plurality of pixels are two-dimensionally arranged in a matrix.
  • The image pickup element 1 is a so-called back-illuminated image sensor that constitutes one pixel (unit pixel P) in the CMOS image sensor or the like. The image pickup element 1 has a configuration in which a semiconductor substrate 10, in which a light receiving portion 11 is embedded, and a multilayer wiring layer 20 having a plurality of wiring layers (for example, wiring layers 24, 25, 26) are laminated.
  • the semiconductor substrate 10 has a first surface (front surface) 10A and a second surface (back surface) 10B that face each other.
  • the multilayer wiring layer 20 is provided on the first surface 10A of the semiconductor substrate 10, and the second surface 10B of the semiconductor substrate 10 is a light incident surface.
  • Specifically, in the image pickup element 1 of the present embodiment, the light absorption layer 23 is provided between the first surface 10A of the semiconductor substrate 10 and the plurality of wiring layers (wiring layers 24, 25, 26) provided in the multilayer wiring layer 20, more precisely between the first surface 10A and the wiring layer 24, which is the wiring layer provided closest to the semiconductor substrate 10.
  • The light absorption layer 23 is formed using a material whose absorption coefficient at long wavelengths (for example, wavelengths of 750 nm or more) is larger than that of the semiconductor substrate 10.
  • Here, the semiconductor substrate 10 corresponds to a specific example of the "first semiconductor substrate" of the present disclosure, the first surface 10A of the semiconductor substrate 10 corresponds to a specific example of the "second surface" of the present disclosure, the second surface 10B corresponds to a specific example of the "first surface" of the present disclosure, and the light absorption layer 23 corresponds to a specific example of the "first light absorption layer" of the present disclosure.
  • the semiconductor substrate 10 is composed of, for example, a silicon substrate. As described above, the light receiving portion 11 is embedded in the semiconductor substrate 10 for each unit pixel P, for example.
  • the light receiving unit 11 is, for example, a PIN (Positive Intrinsic Negative) type photodiode PD, and has a pn junction in a predetermined region of the semiconductor substrate 10.
  • the first surface 10A of the semiconductor substrate 10 is further provided with a floating diffusion FD and a pixel circuit that outputs a pixel signal based on the electric charge output from each pixel.
  • the pixel circuit has, for example, a transfer transistor TR, an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL as pixel transistors.
  • FIG. 3 shows an example of the pixel circuit of the image pickup device 1 shown in FIG.
  • the transfer transistor TR is connected between the light receiving unit 11 and the FD.
  • A drive signal TGsig is applied to the gate electrode of the transfer transistor TR. When the drive signal TGsig becomes active, the transfer gate of the transfer transistor TR becomes conductive, and the signal charge accumulated in the light receiving unit 11 is transferred to the floating diffusion FD via the transfer transistor TR.
  • the FD is connected between the transfer transistor TR and the amplification transistor AMP.
  • the FD converts the signal charge transferred by the transfer transistor TR into a voltage signal and outputs it to the amplification transistor AMP.
  • the reset transistor RST is connected between the FD and the power supply unit.
  • a drive signal RSTsig is applied to the gate electrode of the reset transistor RST.
  • When this drive signal RSTsig becomes active, the reset gate of the reset transistor RST becomes conductive, and the potential of the FD is reset to the level of the power supply unit.
  • The amplification transistor AMP has its gate electrode connected to the FD and its drain electrode connected to the power supply unit, and serves as the input of a readout circuit for the voltage signal held by the FD, a so-called source follower circuit. That is, with its source electrode connected to the vertical signal line Lsig via the selection transistor SEL, the amplification transistor AMP forms a source follower circuit together with a constant current source connected to one end of the vertical signal line Lsig.
  • the selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line Lsig.
  • A drive signal SELsig is applied to the gate electrode of the selection transistor SEL. When the drive signal SELsig becomes active, the selection transistor SEL becomes conductive and the unit pixel P is selected. The read signal (pixel signal) output from the amplification transistor AMP is then output to the vertical signal line Lsig via the selection transistor SEL, as sketched below.
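  • The reset, transfer, and selection sequence described above can be summarized in a short, hypothetical simulation. This is only an illustrative sketch: the class name, conversion gain, and quantum efficiency below are assumptions for explanation, not values from the present disclosure.

    # Minimal model of the pixel circuit described above: transfer transistor TR,
    # floating diffusion FD, reset transistor RST, amplification transistor AMP
    # (source follower), and selection transistor SEL.
    class UnitPixel:
        CONV_GAIN_UV_PER_E = 60.0  # assumed FD conversion gain (uV per electron)

        def __init__(self):
            self.pd_charge_e = 0   # signal charge accumulated in light receiving unit 11
            self.fd_charge_e = 0   # charge held on the floating diffusion FD

        def expose(self, photons, qe=0.6):
            # Photoelectric conversion: charge corresponding to the received light.
            self.pd_charge_e += int(photons * qe)

        def pulse_rstsig(self):
            # RSTsig active: reset gate conducts, FD potential resets to supply level.
            self.fd_charge_e = 0

        def pulse_tgsig(self):
            # TGsig active: transfer gate conducts, charge moves from PD to FD.
            self.fd_charge_e += self.pd_charge_e
            self.pd_charge_e = 0

        def read(self, selsig_active):
            # SELsig active: AMP drives the vertical signal line Lsig via SEL.
            if not selsig_active:
                return None
            return self.fd_charge_e * self.CONV_GAIN_UV_PER_E  # pixel signal (uV)

    pixel = UnitPixel()
    pixel.expose(photons=10_000)
    pixel.pulse_rstsig()                   # reset FD before transfer
    pixel.pulse_tgsig()                    # transfer accumulated charge to FD
    print(pixel.read(selsig_active=True))  # voltage output on Lsig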
  • the multilayer wiring layer 20 is provided on the first surface 10A side of the semiconductor substrate 10 as described above.
  • The multilayer wiring layer 20 has, for example, an insulating layer 21, a gate wiring layer 22, and an interlayer insulating layer 27 in which the light absorption layer 23 and the plurality of wiring layers 24, 25, 26 are provided.
  • the insulating layer 21 is provided on the first surface 10A of the semiconductor substrate 10, for example, as a gate insulating layer of a pixel transistor.
  • Examples of the material of the insulating layer 21 include silicon oxide (SiOx), silicon nitride (SiNx), and silicon oxynitride (SiOxNy).
  • the gate wiring layer 22 is provided with, for example, the gate electrodes of the transfer transistor TR, the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL described above.
  • the gate wiring layer 22 is formed using, for example, polysilicon (Poly-Si).
  • The light absorption layer 23 is for absorbing light that passes through the semiconductor substrate 10 without being absorbed by the light receiving unit 11 and enters the multilayer wiring layer 20. It is provided, for example, in the interlayer insulating layer 27, between the first surface 10A of the semiconductor substrate 10 and the wiring layer 24.
  • the light absorption layer 23 is formed so as to be symmetrical with respect to the optical center C of the unit pixel P, for example, for each unit pixel P.
  • the light absorption layer 23 can be formed by using, for example, a material having an absorption coefficient larger than the absorption coefficient of the semiconductor substrate 10 with respect to a wavelength of 750 nm or more. Examples of such a material include metal oxides and chalcopyrite compounds.
  • Specific metal oxides include, for example, tungsten oxide (WOx), molybdenum oxide (MoOx), and composite materials thereof.
  • Specific examples of chalcopyrite compounds include copper indium selenide (CuInSe) and copper indium gallium selenide (CuInGaSe).
  • Alternatively, the light absorption layer 23 can be formed using a graphite compound, carbon nanotubes (CNT), graphene, a fullerene derivative, or the like.
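  • The role of the absorption coefficient can be made concrete with the Beer–Lambert law. For light of intensity I_0 traversing a film of thickness d and absorption coefficient α, the transmitted intensity is

    I(d) = I_0 · exp(−α · d)

  • Light that reaches the wiring layers and is reflected back toward the substrate crosses the light absorption layer 23 twice, so the returning intensity is attenuated by a factor of exp(−2αd). As a purely illustrative example (these numbers are assumptions, not values from the present disclosure): for α = 1 × 10^5 cm^−1 at 750 nm and d = 100 nm, exp(−2αd) = exp(−2) ≈ 0.14, i.e., roughly 86% of the would-be reflected light is absorbed, whereas silicon, whose absorption coefficient at this wavelength is on the order of 10^3 cm^−1, would absorb almost none of it over the same distance.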
  • FIG. 1 shows an example in which the light absorption layer 23 is provided for each unit pixel P, but the present invention is not limited to this.
  • the light absorption layer 23 may be formed as a common layer for a plurality of pixels arranged two-dimensionally in a matrix on the pixel portion 100A.
  • the wiring layers 24, 25, and 26 are for, for example, driving the light receiving unit 11, transmitting signals, applying voltage to each unit, and the like.
  • the wiring layers 24, 25, and 26 are laminated in the interlayer insulating layer 27 in the order of the wiring layers 24, 25, and 26 from the semiconductor substrate 10 side, respectively.
  • the wiring layers 24, 25, and 26 are formed of, for example, copper (Cu) or aluminum (Al).
  • The interlayer insulating layer 27 is provided on the insulating layer 21 so as to cover the gate wiring layer 22, and, as described above, contains the light absorption layer 23 and the wiring layers 24, 25, 26.
  • The interlayer insulating layer 27 is formed using, for example, silicon oxide (SiOx), silicon nitride (SiNx), or silicon oxynitride (SiOxNy).
  • a light-shielding film 12, a protective layer 13, and an on-chip lens 14 are provided on the second surface 10B side of the semiconductor substrate 10.
  • the light-shielding film 12 is for preventing oblique light incident from the light incident side S1 from being incident on adjacent unit pixels P, and is provided between adjacent unit pixels P, for example.
  • the light-shielding film 12 is formed of, for example, a metal film such as tungsten (W).
  • The protective layer 13 contains, for example, the light-shielding film 12, protects the second surface 10B of the semiconductor substrate 10, and flattens the surface on the light incident side S1.
  • The protective layer 13 is formed of, for example, silicon oxide (SiOx), silicon nitride (SiNx), or silicon oxynitride (SiOxNy).
  • the on-chip lens 14 is for condensing the light incident from the light incident side S1 on the light receiving unit 11.
  • The on-chip lens 14 is formed using a high refractive index material, specifically, an inorganic material such as silicon oxide (SiOx) or silicon nitride (SiNx). Alternatively, an organic material having a high refractive index, such as an episulfide resin, a thietane compound, or a resin thereof, may be used.
  • the shape of the on-chip lens 14 is not particularly limited, and various lens shapes such as a hemispherical shape and a semi-cylindrical shape can be used. As shown in FIG. 1, the on-chip lens 14 may be provided for each unit pixel P, or for example, one on-chip lens may be provided for each of a plurality of unit pixels P.
  • The image pickup apparatus 100 is, for example, a CMOS image sensor that captures incident light (image light) from a subject via an optical lens system (not shown), converts the amount of incident light imaged on the imaging surface into an electric signal in pixel units, and outputs it as pixel signals.
  • The image pickup apparatus 100 has the pixel portion 100A as an imaging area on the semiconductor substrate 10 and, in the peripheral region of the pixel portion 100A, has, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and an input/output terminal 116.
  • the pixel unit 100A has, for example, a plurality of unit pixels P arranged two-dimensionally in a matrix.
  • a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
  • The pixel drive line Lread transmits a drive signal for reading a signal from a pixel.
  • One end of the pixel drive line Lread is connected to the output end corresponding to each row of the vertical drive circuit 111.
  • the vertical drive circuit 111 is a pixel drive unit composed of a shift register, an address decoder, etc., and drives each unit pixel P of the pixel unit 100A, for example, in row units.
  • the signal output from each unit pixel P of the pixel row selectively scanned by the vertical drive circuit 111 is supplied to the column signal processing circuit 112 through each of the vertical signal lines Lsig.
  • the column signal processing circuit 112 is composed of an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
  • The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives the horizontal selection switches of the column signal processing circuits 112 while scanning them. By this selective scanning by the horizontal drive circuit 113, the signals of the pixels transmitted through each of the vertical signal lines Lsig are sequentially output to the horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 10 through the horizontal signal line 121.
  • The output circuit 114 performs signal processing on the signals sequentially supplied from each of the column signal processing circuits 112 via the horizontal signal line 121 and outputs them.
  • the output circuit 114 may, for example, perform only buffering, or may perform black level adjustment, column variation correction, various digital signal processing, and the like.
  • The circuit portion including the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 10, or may be arranged in an external control IC. Further, those circuit portions may be formed on another substrate connected by a cable or the like.
  • the control circuit 115 receives a clock given from the outside of the semiconductor substrate 10, data for instructing an operation mode, and the like, and outputs data such as internal information of the image pickup apparatus 100.
  • the control circuit 115 further includes a timing generator that generates various timing signals, and the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, and the like based on the various timing signals generated by the timing generator. Controls the drive of peripheral circuits.
  • the input / output terminal 116 exchanges signals with the outside.
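  • The row-by-row flow through these peripheral circuits can be summarized with a short sketch. The data structures and function names below are illustrative assumptions for explanation, not the circuit implementation.

    # Sketch of the readout flow: the vertical drive circuit 111 selects one
    # pixel row at a time, the column signal processing circuits 112 receive
    # that row's signals in parallel over the vertical signal lines Lsig, and
    # the horizontal drive circuit 113 scans them onto horizontal signal line 121.
    ROWS, COLS = 4, 6
    pixel_array = [[r * COLS + c for c in range(COLS)] for r in range(ROWS)]

    def read_frame(pixels):
        frame = []
        for row in pixels:                # vertical drive circuit 111: row selection
            sampled = list(row)           # column circuits 112: sample every Lsig in parallel
            line_out = []
            for value in sampled:         # horizontal drive circuit 113: column scanning
                line_out.append(value)    # onto the horizontal signal line 121
            frame.append(line_out)        # output circuit 114: buffering / processing
        return frame

    print(read_frame(pixel_array))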
  • As described above, in the present embodiment, the light absorption layer 23 is provided between the first surface 10A of the semiconductor substrate 10, in which the light receiving portion 11 is embedded, and the plurality of wiring layers 24, 25, 26 in the multilayer wiring layer 20 provided on the first surface 10A side of the semiconductor substrate 10. The light absorption layer 23 is formed using a material having an absorption coefficient larger than that of the semiconductor substrate 10 for wavelengths of, for example, 750 nm or more.
  • In a back-illuminated CIS, in which the wiring layers are arranged on the side of the light receiving portion opposite to the light incident side, the incident light is not blocked by the wiring layers, gate electrodes, and the like, so a decrease in sensitivity due to miniaturization can be suppressed. However, long-wavelength light that is not absorbed by the light receiving portion and passes through to the wiring layer side may be reflected by the wiring layers (metal layers) and re-enter the light receiving portion.
  • The non-uniformity of the reflection intensity between adjacent pixels can be relaxed by, for example, giving the wiring layers formed below each pixel a symmetrical layout in units of shared pixels, which reduces optical color mixing between adjacent pixels. However, even if the layout of the metal layers such as the wirings is symmetric for each shared pixel, the arrangement of the metal layers within each individual pixel remains optically asymmetric, so light reflected by the metal layers re-enters the light receiving portions of adjacent pixels and optical color mixing still occurs.
  • In the method of Patent Document 1 mentioned above, image quality is improved by periodically providing, below the light receiving portion, a first reflecting plate that is uniform for each pixel and a second reflecting plate between adjacent pixels. However, control of reflected light using optical interference has a large wavelength dependence, and it is difficult to obtain the desired characteristics over a wide wavelength band.
  • In contrast, in the present embodiment, the light absorption layer 23, containing a material having an absorption coefficient larger than that of the semiconductor substrate 10 for wavelengths of, for example, 750 nm or more, is formed between the first surface 10A of the semiconductor substrate 10, in which the light receiving portion 11 is embedded for each unit pixel P, and the plurality of wiring layers 24, 25, 26 provided in the multilayer wiring layer 20.
  • As a result, reflection of the light that is not absorbed by the light receiving unit 11 and enters the multilayer wiring layer 20 at the plurality of wiring layers 24, 25, 26 is suppressed, and the resulting optical color mixing into adjacent unit pixels P is reduced. Therefore, image quality can be improved.
  • In addition, since the light transmitted through the light receiving portion 11 and incident on the multilayer wiring layer 20 is absorbed by the light absorption layer 23, the layout freedom of the plurality of wiring layers 24, 25, 26 formed below the light absorption layer 23 can be improved.
  • FIG. 4A shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1A) according to the first modification of the present disclosure.
  • FIG. 4B schematically shows an example of the planar shape of the light absorption layer 23 in the image pickup element 1A shown in FIG. 4A. Note that FIG. 4A shows a cross section taken along the line I-I shown in FIG. 4B.
  • In the above embodiment, an example in which a uniform light absorption layer 23 is provided for each unit pixel P is shown, but the present disclosure is not limited to this. The planar shape of the light absorption layer 23 only needs to have symmetry with respect to the optical center C of the unit pixel P; the image pickup element 1A of this modification differs from the above embodiment in that an opening 23H is formed in the light absorption layer 23 in the vicinity of the optical center C.
  • When the light transmitted through the semiconductor substrate 10 and incident on the multilayer wiring layer 20 is absorbed by the light absorption layer 23 as described above, not only the optical color mixing into adjacent unit pixels P but also the re-incidence of light on the pixel's own light receiving portion is reduced, so the sensitivity may decrease. The opening 23H near the optical center C allows part of the transmitted light to be reflected back toward the pixel's own light receiving portion, suppressing this loss of sensitivity.
  • FIG. 5 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1B) according to the second modification of the present disclosure.
  • The image pickup element 1B of this modification differs from the above embodiment in that, among the plurality of wiring layers 24, 25, 26, the uppermost wiring layer 24, that is, the one directly below the light absorption layer 23, has an uneven (concave-convex) surface.
  • The effect of suppressing light reflection at the plurality of wiring layers 24, 25, 26 by forming the light absorption layer 23 depends on the absorption coefficient of the light absorbing material constituting the light absorption layer 23 and on the film thickness of the light absorption layer 23.
  • FIG. 6 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1C) according to the third modification of the present disclosure.
  • the image pickup device 1C of the present modification is different from the above-described embodiment in that the light absorption layer 23 is laminated on each of the plurality of wiring layers 24, 25, and 26.
  • the light absorption layer 23 is laminated and formed on each of the plurality of wiring layers 24, 25, and 26. Specifically, a light absorption layer 23A is formed on the wiring layer 24, a light absorption layer 23B is formed on the wiring layer 25, and a light absorption layer 23C is formed on the wiring layer 26.
  • In the above embodiment, the light absorption layer 23 and the wiring layers 24, 25, 26 are formed spaced apart from each other with the interlayer insulating layer 27 in between, but the light absorption layer 23 may be laminated directly on the wiring layers 24, 25, 26 as in this modification.
  • FIG. 7 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1D) according to the modified example 4 of the present disclosure.
  • The image pickup element 1D of this modification differs from the above embodiment in that a pixel separation unit 15 is provided between adjacent unit pixels P and a light absorption layer 16 having the same configuration as the light absorption layer 23 of the above embodiment is provided on its surface.
  • The pixel separation unit 15 is for optically and electrically separating adjacent unit pixels P, and extends between adjacent light receiving units 11 from the second surface 10B toward the first surface 10A of the semiconductor substrate 10.
  • the pixel separation portion 15 is integrally formed with, for example, a light-shielding film 12 provided on the second surface 10B side of the semiconductor substrate 10, and is formed of, for example, a metal film such as tungsten (W) like the light-shielding film 12.
  • a protective layer 13 extends between the pixel separation unit 15 and the semiconductor substrate 10, and the semiconductor substrate 10 and the pixel separation unit 15 are electrically separated from each other.
  • The light absorption layer 16 is provided on the surface of the pixel separation unit 15, specifically on the surface facing the semiconductor substrate 10. As a result, for example, the intensity of light that enters from the second surface 10B of the semiconductor substrate 10, is reflected by the pixel separation unit 15 and the wiring layers 24, 25, 26, and leaks into adjacent unit pixels P is reduced.
  • the light absorption layer 16 corresponds to a specific example of the "second light absorption layer" of the present disclosure.
  • In this way, in this modification, the pixel separation unit 15 for separating adjacent light receiving units 11 is provided between adjacent unit pixels P, and the light absorption layer 16 is further provided on the surface of the pixel separation unit 15 facing the semiconductor substrate 10.
  • Although FIG. 7 shows an example in which the end portion of the pixel separation unit 15 terminates within the semiconductor substrate 10, the pixel separation unit 15 may, as in the image pickup element 1E (see FIG. 8) described below, completely penetrate the semiconductor substrate 10 from the second surface 10B to the first surface 10A.
  • FIG. 8 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1E) according to the modification 5 of the present disclosure.
  • The image pickup element 1E of this modification realizes a so-called global shutter type back-illuminated CIS in which, for example, a charge holding unit 17 that temporarily stores the electric charge generated by the light receiving unit 11 is provided in each unit pixel P.
  • the global shutter method is basically a method of performing global exposure that starts exposure for all pixels at the same time and ends exposure for all pixels at the same time.
  • Here, "all pixels" means all the pixels of the portion appearing in the image, excluding dummy pixels and the like.
  • the global shutter method also includes a method of performing global exposure not only on all the pixels of the portion appearing in the image but also on the pixels in a predetermined region.
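  • The timing difference between this global exposure and a row-sequential (rolling) readout can be illustrated as follows; the durations below are arbitrary assumptions for explanation only.

    # Exposure windows per row: in the global shutter method all rows expose
    # simultaneously; in a rolling shutter each row's window is offset in time.
    ROWS = 5
    EXPOSURE_MS = 10.0    # assumed exposure duration
    LINE_DELAY_MS = 1.0   # assumed per-row offset of a rolling shutter

    def exposure_windows(global_shutter):
        windows = []
        for row in range(ROWS):
            start = 0.0 if global_shutter else row * LINE_DELAY_MS
            windows.append((row, start, start + EXPOSURE_MS))
        return windows

    for row, start, end in exposure_windows(global_shutter=True):
        print(f"global  row {row}: exposes {start:4.1f}-{end:4.1f} ms")
    for row, start, end in exposure_windows(global_shutter=False):
        print(f"rolling row {row}: exposes {start:4.1f}-{end:4.1f} ms")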
  • the electric charge holding unit (MEM) 17 is for temporarily holding the electric charge until the electric charge generated in the light receiving unit 11 is transferred to the floating diffusion FD.
  • the charge holding portion 17 is formed, for example, embedded in the first surface 10A side of the semiconductor substrate 10 in the same manner as the light receiving portion 11.
  • In the global shutter method, stray light entering the charge holding unit 17 becomes a false signal (parasitic light sensitivity). For this reason, a light-shielding portion continuous with the light-shielding film 12 is formed on the second surface 10B above the charge holding unit 17 of the semiconductor substrate 10 and between the light receiving unit 11 and the charge holding unit 17.
  • By applying the present technology, it becomes possible to prevent light that passes through the semiconductor substrate 10 and is reflected by the plurality of wiring layers 24, 25, 26 from re-entering the charge holding unit 17. This makes it possible to reduce the generation of false signals while improving the image quality of the global shutter type back-illuminated CIS.
  • In the image pickup element 1E of this modification as well, the light absorption layer 16 may be provided on the respective surfaces of the pixel separation unit 15 and of the light-shielding film 12 between the light receiving unit 11 and the charge holding unit 17.
  • FIG. 9 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1F) according to the modification 6 of the present disclosure.
  • The image pickup element 1F of this modification differs from the above embodiment in that the pixel transistors constituting the pixel circuit, such as the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL, are provided on a substrate (semiconductor substrate 30) different from the semiconductor substrate 10, and the light absorption layer 23 is provided between the semiconductor substrate 10 and the semiconductor substrate 30.
  • the first surface 10A of the semiconductor substrate 10 in this modification is provided with a floating diffusion FD and a transfer transistor TR constituting a pixel circuit.
  • In the multilayer wiring layer 20, the gate electrode of the transfer transistor TR is provided together with the light absorption layer 23, the latter being provided, for example, as a layer common to a plurality of unit pixels P.
  • the semiconductor substrate 30 has a first surface (front surface) 30A and a second surface (back surface) 30B facing each other, and pixel transistors such as an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL are provided on the first surface 30A.
  • the semiconductor substrate 30 corresponds to a specific example of the "second semiconductor substrate" of the present disclosure.
  • On the first surface 30A of the semiconductor substrate 30, a multilayer wiring layer 40 is provided in which, for example, a gate wiring layer 41, including the gate electrodes of the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL, and a plurality of wiring layers 42, 43, 44 are contained in an interlayer insulating layer 45.
  • an insulating layer 28 is provided on the second surface 30B of the semiconductor substrate 30.
  • The semiconductor substrate 10 and the semiconductor substrate 30 are bonded to each other, with the upper surface of the interlayer insulating layer 27 formed on the first surface 10A of the semiconductor substrate 10 and the upper surface of the insulating layer 28 serving as the bonding surface S3, so that the first surface 10A of the semiconductor substrate 10 and the second surface 30B of the semiconductor substrate 30 face each other.
  • In this way, in this modification, the amplification transistor AMP, the reset transistor RST, the selection transistor SEL, and the like constituting the pixel circuit are provided on the semiconductor substrate 30, and the light absorption layer 23 is provided between the semiconductor substrate 10, in which the light receiving portion 11 is embedded, and the semiconductor substrate 30.
  • FIG. 10 shows an example of a schematic configuration of an image pickup system 2 provided with an image pickup device (for example, an image pickup device 1) according to the above-described embodiment and modifications 1 to 6.
  • the imaging system 2 is, for example, an imaging device such as a digital still camera or a video camera, or an electronic device such as a mobile terminal device such as a smartphone or a tablet terminal.
  • the image pickup system 2 includes, for example, an image pickup element 1, an optical system 241, a shutter device 242, a DSP circuit 243, a frame memory 244, a display unit 245, a storage unit 246, an operation unit 247, and a power supply unit 248.
  • the image pickup element 1, the DSP circuit 243, the frame memory 244, the display unit 245, the storage unit 246, the operation unit 247, and the power supply unit 248 are connected to each other via a bus line 249.
  • the image sensor 1 outputs image data according to the incident light.
  • the optical system 241 is configured to have one or a plurality of lenses, guides light (incident light) from a subject to an image pickup device 1, and forms an image on a light receiving surface of the image pickup device 1.
  • the shutter device 242 is arranged between the optical system 241 and the image sensor 1, and controls the light irradiation period and the light shielding period of the image sensor 1 according to the control of the drive circuit.
  • the DSP circuit 243 is a signal processing circuit that processes a signal (image data) output from the image sensor 1.
  • the frame memory 244 temporarily holds the image data processed by the DSP circuit 243 in frame units.
  • the display unit 245 comprises a panel-type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays a moving image or a still image captured by the image sensor 1.
  • the storage unit 246 records image data of a moving image or a still image captured by the image pickup device 1 on a recording medium such as a semiconductor memory or a hard disk.
  • the operation unit 247 issues operation commands for various functions of the image pickup system 2 according to the operation by the user.
  • the power supply unit 248 appropriately supplies various power sources serving as operating power sources for the image sensor 1, DSP circuit 243, frame memory 244, display unit 245, storage unit 246, and operation unit 247 to these supply targets.
  • FIG. 11 shows an example of a flowchart of an imaging operation in the imaging system 2.
  • the user instructs the start of imaging by operating the operation unit 247 (step S101).
  • the operation unit 247 transmits an image pickup command to the image pickup device 1 (step S102).
  • Upon receiving the image pickup command, the image pickup element 1 (specifically, its system control circuit) executes imaging by a predetermined imaging method (step S103).
  • the image sensor 1 outputs the image data obtained by the image pickup to the DSP circuit 243.
  • The image data is the all-pixel data of the pixel signals generated based on the electric charge temporarily held in the floating diffusion FD.
  • the DSP circuit 243 performs predetermined signal processing (for example, noise reduction processing) based on the image data input from the image sensor 1 (step S104).
  • the DSP circuit 243 stores the image data subjected to the predetermined signal processing in the frame memory 244, and the frame memory 244 stores the image data in the storage unit 246 (step S105). In this way, the imaging in the imaging system 2 is performed.
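  • The flow of steps S101 to S105 can be condensed into a hypothetical sketch. The class and method names, and the stand-in data, are assumptions for illustration only, not the actual interfaces of the imaging system 2.

    # Imaging flow of FIG. 11: user command (S101), image pickup command (S102),
    # imaging by the element (S103), DSP signal processing (S104), storage (S105).
    class ImagingSystem:
        def capture(self):
            raw = self.image_pickup()          # S102 -> S103
            processed = self.dsp_process(raw)  # S104: e.g. noise reduction
            self.store(processed)              # S105: frame memory -> storage unit

        def image_pickup(self):
            # Stand-in for the all-pixel signals read out of the element.
            return [[120, 300, -4], [125, 128, 131]]

        def dsp_process(self, image):
            # Simple stand-in processing: clamp each pixel to an 8-bit range.
            return [[min(max(px, 0), 255) for px in row] for row in image]

        def store(self, image):
            print("stored frame:", image)

    ImagingSystem().capture()  # S101: user instructs the start of imaging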
  • the image pickup devices 1, 1A to 1F according to the above-described embodiment and modifications 1 to 6 are applied to the image pickup system 2.
  • This allows the image pickup element 1 to be miniaturized or made high-definition, so a small or high-definition imaging system 2 can be provided.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 12 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps.
  • Radio waves transmitted from a portable device that substitutes for the key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as a person, a vehicle, an obstacle, a sign, or characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing.
  • The microcomputer 12051 can calculate the control target value of the driving force generator, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 12051 can perform coordinated control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generator, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform coordinated control for anti-glare purposes, such as controlling the headlamps according to the position of a preceding or oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the audio image output unit 12052 transmits the output signal of at least one of the audio and the image to the output device capable of visually or audibly notifying the passenger or the outside of the vehicle of the information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 13 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 13 shows an example of the photographing range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or an image pickup element having pixels for phase difference detection.
  • Based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative velocity with respect to the vehicle 12100). Further, the microcomputer 12051 can set, in advance, an inter-vehicle distance to be secured from a preceding vehicle and perform automatic braking control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform coordinated control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 classifies three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extracts the data, and uses it for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062 and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
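  • A minimal sketch of this distance-based logic follows, computing the relative velocity from the temporal change of distance and checking a collision-risk criterion. The thresholds and the time-to-collision test are assumptions for illustration, not the system's actual decision rule.

    def relative_velocity(d_prev_m, d_curr_m, dt_s):
        # Temporal change of distance; negative means the object is approaching.
        return (d_curr_m - d_prev_m) / dt_s

    def collision_risk(d_curr_m, rel_v_mps, ttc_limit_s=2.0):
        if rel_v_mps >= 0:               # object holding distance or receding
            return False
        ttc_s = d_curr_m / -rel_v_mps    # time to collision
        return ttc_s < ttc_limit_s       # risk at/above set value -> alarm, deceleration

    v = relative_velocity(d_prev_m=30.0, d_curr_m=25.0, dt_s=0.5)  # -10 m/s
    print(v, collision_risk(25.0, v))    # TTC = 2.5 s: approaching, no forced action yet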
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
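  • The two-step procedure above (feature extraction, then pattern matching on the outline) can be approximated with OpenCV's HOG person detector. This is a stand-in for illustration, not the algorithm used by the system described here, and the blank frame is a placeholder for an infrared image.

    import cv2
    import numpy as np

    hog = cv2.HOGDescriptor()  # gradient-feature extraction stage
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())  # matching stage

    frame = np.zeros((480, 640), dtype=np.uint8)  # stand-in for an infrared frame
    rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))

    for (x, y, w, h) in rects:
        # Corresponds to the square contour line superimposed on display unit 12062.
        cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)
    print("pedestrians found:", len(rects))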
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a square contour line for emphasizing the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
  • the above is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
  • the image pickup apparatus 100 can be applied to the image pickup unit 12031.
  • By applying the technique according to the present disclosure to the imaging unit 12031, a high-definition captured image with less noise can be obtained, so highly accurate control using the captured image can be performed in the mobile body control system.
  • the technology according to the present disclosure (the present technology) can be applied to various products.
  • the techniques according to the present disclosure may be applied to endoscopic surgery systems.
  • FIG. 14 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
  • FIG. 14 shows a surgeon (doctor) 11131 performing surgery on patient 11132 on patient bed 11153 using the endoscopic surgery system 11000.
  • The endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 equipped with various devices for endoscopic surgery.
  • the endoscope 11100 is composed of a lens barrel 11101 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100; the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the imaging element by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 11201.
  • The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various image processing on the image signal, such as development processing (demosaic processing), for displaying an image based on the image signal.
  • the display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of, for example, a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 11100 when photographing an operating part or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
  • the treatment tool control device 11205 controls the drive of the energy treatment tool 11112 for ablation of tissue, incision, sealing of blood vessels, and the like.
  • The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate it, for the purpose of securing the field of view of the endoscope 11100 and securing the operator's working space.
  • the recorder 11207 is a device capable of recording various information related to surgery.
  • the printer 11208 is a device capable of printing various information related to surgery in various formats such as texts, images, and graphs.
  • the light source device 11203 that supplies the irradiation light to the endoscope 11100 when photographing the surgical site can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • When a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image.
  • In this case, the observation target may be irradiated with laser light from each of the RGB laser light sources in a time-division manner, and the drive of the image sensor of the camera head 11102 may be controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B are captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the changes in light intensity, acquiring images in a time-division manner, and synthesizing them, a so-called high dynamic range image free of blocked-up shadows and blown-out highlights can be generated, as sketched below.
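  • The synthesis step can be sketched with a simple exposure fusion. The per-pixel weighting and the gain values below are assumptions for illustration, not the method actually used by the light source device 11203 or the CCU 11201.

    import numpy as np

    def fuse_hdr(frames, gains):
        # frames: images captured in a time-division manner while the output
        # light intensity changes; gains: relative intensity of each capture.
        acc = np.zeros(frames[0].shape, dtype=np.float64)
        wsum = np.zeros_like(acc)
        for img, gain in zip(frames, gains):
            f = img.astype(np.float64)
            w = 1.0 - np.abs(f - 127.5) / 127.5 + 1e-6  # favor well-exposed pixels
            acc += w * (f / gain)   # normalize each frame back to a common scale
            wsum += w
        return acc / wsum

    dark = np.full((2, 2), 40, dtype=np.uint8)     # frame under low light output
    bright = np.full((2, 2), 200, dtype=np.uint8)  # frame under high light output
    print(fuse_hdr([dark, bright], gains=[0.25, 1.25]))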
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which light in a band narrower than the irradiation light used in normal observation (that is, white light) is applied by utilizing the wavelength dependence of light absorption in body tissue, and predetermined tissue such as blood vessels in the surface layer of the mucous membrane is photographed with high contrast.
  • alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • in fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of that reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be capable of supplying narrow-band light and/or excitation light corresponding to such special light observation.
  • FIG. 15 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image pickup unit 11402 is composed of an image pickup element.
  • the number of image sensors constituting the image pickup unit 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type).
  • in the case of the multi-plate type, for example, image signals corresponding to each of R, G, and B may be generated by the respective image sensors, and a color image may be obtained by combining them.
  • the image pickup unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to 3D (Dimensional) display, respectively.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the imaging unit 11402 does not necessarily have to be provided on the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU11201.
  • the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • the control signal includes, for example, information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • in the latter case, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
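  • as an illustration of how such an AE function can be closed over the acquired image signal, the sketch below shows one proportional control step; the target level, gain, and limits are illustrative assumptions, not values from this disclosure.

```python
# Minimal proportional auto-exposure step. The AE function is only named
# in the text; this controller and its constants are illustrative.
def auto_exposure_step(mean_luma: float, exposure: float,
                       target: float = 0.45, k: float = 0.5,
                       min_exp: float = 1e-4, max_exp: float = 1.0) -> float:
    """mean_luma: mean luminance of the last frame, in [0, 1].
    exposure: current normalized exposure setting. Returns the next one,
    nudged toward the target mean luminance and clamped to the limits."""
    error = target - mean_luma
    return min(max(exposure * (1.0 + k * error), min_exp), max_exp)
```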
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • Image signals and control signals can be transmitted by telecommunications, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of the captured image obtained by the imaging of the surgical site and the like. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • the control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like, based on the image signal processed by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shape, color, edge, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific body part, bleeding, mist during use of the energy treatment tool 11112, and so on.
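  • as one rough instance of such shape/color/edge-based recognition, the sketch below flags bright, low-saturation regions with strong edges as candidate tool areas; the thresholds and the overall approach are illustrative assumptions, since the disclosure does not specify a particular algorithm.

```python
# Rough edge/color-based detector for metallic tool candidates; thresholds
# are illustrative assumptions only.
import cv2
import numpy as np

def find_tool_candidates(bgr: np.ndarray):
    """Return bounding rectangles of bright, low-saturation regions with
    strong edges (a crude signature of metallic surgical tools)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Low saturation + high value: roughly "gray and bright" pixels.
    low_sat = cv2.inRange(hsv, (0, 0, 120), (180, 60, 255))
    edges = cv2.Canny(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 50, 150)
    # Keep bright gray pixels that lie near strong edges.
    mask = cv2.bitwise_and(low_sat, cv2.dilate(edges, np.ones((5, 5), np.uint8)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
```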
  • the control unit 11413 may superimpose various types of surgical support information on the image of the surgical site by using the recognition result. By superimposing the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 that connects the camera head 11102 and CCU11201 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
  • here, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technique according to the present disclosure can be suitably applied to the imaging unit 11402 provided on the camera head 11102 of the endoscope 11100.
  • the imaging unit 11402 can be miniaturized or have high definition, so that a compact or high-definition endoscope 11100 can be provided.
  • although the present disclosure has been described above with reference to the embodiment, Modifications 1 to 6, the application example, and the practical application examples, the present disclosure is not limited to the above-described embodiment and the like, and various modifications are possible.
  • for example, an optical member such as a color filter, in which a red filter that transmits light in the red wavelength region, a green filter that transmits light in the green wavelength region, and a blue filter that transmits light in the blue wavelength region are provided in a regular color arrangement (for example, a Bayer arrangement), may be provided in the pixel portion 100A.
  • the present disclosure may also have the following configurations.
  • in the present technology having the following configurations, a first light absorption layer, formed using a material whose absorption coefficient for wavelengths of 750 nm or more is larger than the absorption coefficient of the first semiconductor substrate, is provided between the second surface of the first semiconductor substrate having a light receiving portion for each pixel, the second surface being opposite to the first surface serving as the light incident surface, and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side.
  • (1) An image pickup device including: a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and having a plurality of wiring layers laminated with an interlayer insulating layer in between; and a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wirings and formed using a material having an absorption coefficient for wavelengths of 750 nm or more that is larger than the absorption coefficient of the first semiconductor substrate.
  • (2) The image pickup device according to (1), wherein the first light absorption layer is provided for each pixel.
  • (3) The image pickup device according to (1) or (2), wherein the first light absorption layer is formed symmetrically with respect to the optical center of the pixel.
  • (4) The image pickup device according to any one of (1) to (3), wherein the first light absorption layer has an opening with respect to the optical center of the pixel.
  • (5) The image pickup device according to any one of (1) to (4), wherein the first light absorption layer is provided as a layer common to a plurality of the pixels.
  • (6) The image pickup device according to any one of (1) to (5), wherein, among the plurality of wiring layers, the wiring layer directly below the first light absorption layer has a concavo-convex shape on its surface.
  • (7) The image pickup device according to any one of (1) to (6), wherein the plurality of wiring layers have a laminated structure together with the first light absorption layer.
  • (8) The image pickup device according to any one of (1) to (7), wherein the first semiconductor substrate is a silicon substrate.
  • (9) The image pickup device according to any one of (1) to (8), wherein the first light absorption layer is formed containing a metal oxide or a chalcopyrite-based compound.
  • (10) The image pickup device according to any one of (1) to (9), wherein the first light absorption layer is formed using tungsten oxide, molybdenum oxide, copper indium selenide, or copper indium gallium selenide.
  • (11) The image pickup device according to any one of (1) to (10), further including: a pixel separation portion that has a light-shielding property and separates the adjacent pixels from the first surface toward the second surface; and a second light absorption layer formed on a surface of the pixel separation portion facing the first semiconductor substrate.
  • (12) The image pickup device according to any one of (1) to (11), further including a charge holding portion that is provided for each pixel in the first semiconductor substrate and accumulates the charge generated in the light receiving portion.
  • (13) The image pickup device according to (12), wherein the light receiving portion and the charge holding portion are arranged side by side in the plane direction within the first semiconductor substrate, the image pickup device further including a light-shielding portion extending between the light receiving portion and the charge holding portion from the first surface toward the second surface.
  • (14) The image pickup device according to (13), further including a pixel separation portion that has a light-shielding property and separates the adjacent pixels from the first surface toward the second surface, wherein the pixel separation portion and the light-shielding portion are formed continuously on the first surface side.
  • (15) The image pickup device according to any one of (1) to (14), further including a second semiconductor substrate provided with a pixel transistor constituting a pixel circuit that outputs a pixel signal based on the charge output from the pixel, wherein the second semiconductor substrate is arranged on the second surface side of the first semiconductor substrate with the first light absorption layer in between.
  • (16) An image pickup apparatus provided with an image pickup device, the image pickup device including: a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and having a plurality of wiring layers laminated with an interlayer insulating layer in between; and a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wirings and formed using a material having an absorption coefficient for wavelengths of 750 nm or more that is larger than the absorption coefficient of the first semiconductor substrate.
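  • for reference, the role of the 750 nm absorption-coefficient condition can be made concrete with the Beer–Lambert law (standard optics, shown only for illustration and not specific to this disclosure): light of intensity I₀ entering a layer of thickness d with absorption coefficient α(λ) is attenuated as below, so a first light absorption layer with a larger near-infrared absorption coefficient than silicon suppresses the residual light far more strongly than the same thickness of the silicon substrate would.

```latex
I(d) = I_0 \, e^{-\alpha(\lambda)\, d},
\qquad
\frac{I_{\mathrm{abs}}(d)}{I_{\mathrm{Si}}(d)}
  = e^{-\left(\alpha_{\mathrm{abs}}(\lambda)-\alpha_{\mathrm{Si}}(\lambda)\right) d} < 1
\quad \text{for } \alpha_{\mathrm{abs}}(\lambda) > \alpha_{\mathrm{Si}}(\lambda),\ \lambda \ge 750\ \mathrm{nm}.
```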

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

An imaging element according to an embodiment of the present invention comprises: a first semiconductor substrate that has a first surface serving as a light incident surface and a second surface on the side opposite to the first surface, and that includes a light receiving unit generating an electric charge corresponding to the amount of light received at each pixel by photoelectric conversion; a multilayer wiring layer disposed on the second surface side of the first semiconductor substrate and having a plurality of wiring layers laminated with an interlayer insulating layer therebetween; and a first light absorption layer that is disposed between the second surface of the first semiconductor substrate and the plurality of wirings and is formed using a material having a larger absorption coefficient for wavelengths of 750 nm or more than the absorption coefficient of the first semiconductor substrate.
PCT/JP2021/015278 2020-04-21 2021-04-13 Élément d'imagerie et dispositif d'imagerie WO2021215299A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-075689 2020-04-21
JP2020075689 2020-04-21

Publications (1)

Publication Number Publication Date
WO2021215299A1 true WO2021215299A1 (fr) 2021-10-28

Family

ID=78269360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015278 WO2021215299A1 (fr) 2020-04-21 2021-04-13 Élément d'imagerie et dispositif d'imagerie

Country Status (1)

Country Link
WO (1) WO2021215299A1 (fr)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011077580A1 (fr) * 2009-12-26 2011-06-30 キヤノン株式会社 Dispositif et système d'imagerie à semi-conducteurs
JP2012018951A (ja) * 2010-07-06 2012-01-26 Sony Corp 固体撮像素子及びその製造方法、並びに固体撮像装置及び撮像装置
JP2012175050A (ja) * 2011-02-24 2012-09-10 Sony Corp 固体撮像装置、および、その製造方法、電子機器
JP2012204562A (ja) * 2011-03-25 2012-10-22 Sony Corp 固体撮像装置、および、その製造方法、電子機器
JP2012231032A (ja) * 2011-04-26 2012-11-22 Canon Inc 固体撮像素子及び撮像装置
JP2013038176A (ja) * 2011-08-05 2013-02-21 Toshiba Information Systems (Japan) Corp 裏面照射型固体撮像素子
JP2013065688A (ja) * 2011-09-16 2013-04-11 Sony Corp 固体撮像素子および製造方法、並びに電子機器
JP2014086702A (ja) * 2012-10-26 2014-05-12 Canon Inc 固体撮像装置、その製造方法、およびカメラ
JP2014096540A (ja) * 2012-11-12 2014-05-22 Canon Inc 固体撮像装置およびその製造方法ならびにカメラ
JP2015106621A (ja) * 2013-11-29 2015-06-08 ソニー株式会社 固体撮像素子および製造方法、並びに電子機器
JP2015128187A (ja) * 2015-03-24 2015-07-09 ソニー株式会社 固体撮像装置及び電子機器
WO2016052249A1 (fr) * 2014-10-03 2016-04-07 ソニー株式会社 Élément de formation d'image à semi-conducteurs, procédé de production et dispositif électronique
JP2016082133A (ja) * 2014-10-20 2016-05-16 ソニー株式会社 固体撮像素子及び電子機器
US20160181294A1 (en) * 2013-07-15 2016-06-23 Galaxycore Shanghai Limited Corporation Backside illuminated image sensor and manufacturing method therefor
JP2018022076A (ja) * 2016-08-04 2018-02-08 大日本印刷株式会社 貫通電極基板及び電子機器
WO2019130820A1 (fr) * 2017-12-26 2019-07-04 ソニーセミコンダクタソリューションズ株式会社 Élément d'imagerie et dispositif d'imagerie


Similar Documents

Publication Publication Date Title
JP7284171B2 (ja) 固体撮像装置
WO2020189534A1 (fr) Élément de capture d'image et élément semi-conducteur
WO2021235101A1 (fr) Dispositif d'imagerie à semi-conducteurs
JP2019012739A (ja) 固体撮像素子および撮像装置
WO2021124975A1 (fr) Dispositif d'imagerie à semi-conducteurs et instrument électronique
WO2021106732A1 (fr) Dispositif d'imagerie et instrument électronique
JP2021019171A (ja) 固体撮像素子および電子機器
WO2021241019A1 (fr) Élément d'imagerie et dispositif d'imagerie
WO2021100332A1 (fr) Dispositif à semi-conducteur, dispositif de capture d'image monolithique et dispositif électronique
WO2019176303A1 (fr) Circuit de commande de dispositif d'imagerie et dispositif d'imagerie
WO2022131034A1 (fr) Dispositif d'imagerie
WO2022172711A1 (fr) Élément de conversion photoélectrique et dispositif électronique
WO2022009627A1 (fr) Dispositif d'imagerie à semi-conducteurs et dispositif électronique
KR20240037943A (ko) 촬상 장치
WO2021140958A1 (fr) Élément d'imagerie, son procédé de fabrication et dispositif électronique
WO2021215299A1 (fr) Élément d'imagerie et dispositif d'imagerie
TW202118279A (zh) 攝像元件及攝像裝置
WO2024162113A1 (fr) Détecteur optique, élément optique et dispositif électronique
WO2024162114A1 (fr) Détecteur de lumière, élément optique et appareil électronique
EP4415047A1 (fr) Dispositif d'imagerie
WO2024057814A1 (fr) Dispositif de détection de lumière et instrument électronique
WO2024085005A1 (fr) Photodétecteur
WO2021157250A1 (fr) Élément de réception de lumière, dispositif d'imagerie à semi-conducteurs, et appareil électronique
WO2023162496A1 (fr) Dispositif d'imagerie
US20240038807A1 (en) Solid-state imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21791600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21791600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP