WO2021215299A1 - Imaging element and imaging device - Google Patents

Imaging element and imaging device

Info

Publication number
WO2021215299A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
semiconductor substrate
image pickup
pixel
pickup device
Prior art date
Application number
PCT/JP2021/015278
Other languages
French (fr)
Japanese (ja)
Inventor
雅史 坂東
至通 熊谷
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021215299A1 publication Critical patent/WO2021215299A1/en

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Definitions

  • The present disclosure relates to, for example, a back-illuminated imaging element and an imaging device equipped with such an element.
  • CMOS (Complementary Metal Oxide Semiconductor)
  • In such an element, light transmitted through the light receiving portion without being absorbed may be reflected by the wiring layer (metal layer) provided below the light receiving portion and re-enter the light receiving portion.
  • When the intensity of this reflected light is non-uniform from pixel to pixel, optical color mixing occurs between adjacent pixels.
  • Patent Document 1 discloses a solid-state imaging device in which image quality is improved by periodically providing, below the light receiving portion, a first reflector that is uniform for each pixel and a second reflector between adjacent pixels.
  • Image pickup devices are nevertheless required to further improve image quality.
  • The image pickup element as an embodiment of the present disclosure includes: a first semiconductor substrate that has a first surface serving as a light incident surface and a second surface opposite to the first surface, and that has a light receiving portion generating, by photoelectric conversion, a charge corresponding to the amount of light received for each pixel; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and having a plurality of wiring layers laminated with interlayer insulating layers in between; and a first light absorbing layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and formed using a material whose absorption coefficient for wavelengths of 750 nm or more is larger than that of the first semiconductor substrate.
  • The image pickup device as an embodiment of the present disclosure includes the image pickup element according to the embodiment of the present disclosure.
  • That is, in the image pickup element, the first semiconductor substrate, which has a light receiving portion for each pixel, has a second surface opposite to the first surface serving as the light incident surface; a first light absorbing layer, formed using a material whose absorption coefficient for wavelengths of 750 nm or more is larger than that of the first semiconductor substrate, is provided between this second surface and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side. As a result, light transmitted through the first semiconductor substrate without being absorbed by the light receiving portion is less likely to be reflected by the wiring layers provided in the multilayer wiring layer and re-enter adjacent pixels.
  • 2-2. Modification 2 (example in which unevenness is provided on the surface of the wiring layer directly below the light absorption layer)
  • 2-3. Modification 3 (example in which a light absorption layer is laminated on a plurality of wiring layers)
  • 2-4. Modification 4 (example in which a light absorption layer is provided on the surface of the pixel separation portion facing the semiconductor substrate)
  • 2-5. Modification 5 (example of an image sensor provided with a charge holding portion)
  • 2-6. Modification 6 (example in which pixel transistors are provided on a separate substrate and a light absorption layer is provided between the semiconductor substrate having the light receiving portion and the other substrate)
  • FIG. 1 shows an example of a cross-sectional configuration of an image pickup device (image pickup device 1) according to an embodiment of the present disclosure.
  • FIG. 2 shows an example of the overall configuration of the imaging device (imaging device 100) shown in FIG. 1.
  • The imaging device 100 is, for example, a CMOS image sensor used in an electronic device such as a digital still camera or a video camera, and has, as an imaging area, a pixel portion 100A in which a plurality of pixels are two-dimensionally arranged in a matrix.
  • the image sensor 1 is a so-called back-illuminated image sensor that constitutes one pixel (unit pixel P) in the CMOS image sensor or the like.
  • The image pickup element 1 has a configuration in which a semiconductor substrate 10, in which a light receiving portion 11 is embedded, and a multilayer wiring layer 20 having a plurality of wiring layers (for example, wiring layers 24, 25, 26) are laminated.
  • the semiconductor substrate 10 has a first surface (front surface) 10A and a second surface (back surface) 10B that face each other.
  • the multilayer wiring layer 20 is provided on the first surface 10A of the semiconductor substrate 10, and the second surface 10B of the semiconductor substrate 10 is a light incident surface.
  • The image pickup element 1 of the present embodiment is provided with a light absorption layer 23 between the first surface 10A of the semiconductor substrate 10 and the plurality of wiring layers (wiring layers 24, 25, 26) provided in the multilayer wiring layer 20; specifically, the light absorption layer 23 is provided between the first surface 10A and the wiring layer 24, the wiring layer closest to the semiconductor substrate 10.
  • the light absorption layer 23 is formed by using a material having an absorption coefficient larger than the absorption coefficient of the semiconductor substrate 10 with respect to a long wavelength (for example, a wavelength of 750 nm or more).
  • The semiconductor substrate 10 corresponds to a specific example of the "first semiconductor substrate" of the present disclosure; the first surface 10A of the semiconductor substrate 10 corresponds to a specific example of the "second surface" of the present disclosure; the second surface 10B corresponds to a specific example of the "first surface" of the present disclosure; and the light absorption layer 23 corresponds to a specific example of the "first light absorbing layer" of the present disclosure.
  • the semiconductor substrate 10 is composed of, for example, a silicon substrate. As described above, the light receiving portion 11 is embedded in the semiconductor substrate 10 for each unit pixel P, for example.
  • the light receiving unit 11 is, for example, a PIN (Positive Intrinsic Negative) type photodiode PD, and has a pn junction in a predetermined region of the semiconductor substrate 10.
  • the first surface 10A of the semiconductor substrate 10 is further provided with a floating diffusion FD and a pixel circuit that outputs a pixel signal based on the electric charge output from each pixel.
  • the pixel circuit has, for example, a transfer transistor TR, an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL as pixel transistors.
  • FIG. 3 shows an example of the pixel circuit of the image pickup device 1 shown in FIG.
  • the transfer transistor TR is connected between the light receiving unit 11 and the FD.
  • a drive signal TGsig is applied to the gate electrode of the transfer transistor TR.
  • When the drive signal TGsig becomes active, the transfer gate of the transfer transistor TR becomes conductive, and the signal charge accumulated in the light receiving unit 11 is transferred to the FD via the transfer transistor TR.
  • the FD is connected between the transfer transistor TR and the amplification transistor AMP.
  • the FD converts the signal charge transferred by the transfer transistor TR into a voltage signal and outputs it to the amplification transistor AMP.
  • the reset transistor RST is connected between the FD and the power supply unit.
  • a drive signal RSTsig is applied to the gate electrode of the reset transistor RST.
  • When this drive signal RSTsig becomes active, the reset gate of the reset transistor RST becomes conductive, and the potential of the FD is reset to the level of the power supply unit.
  • The gate electrode of the amplification transistor AMP is connected to the FD, and its drain electrode is connected to the power supply unit; the amplification transistor AMP thus serves as the input unit of a so-called source follower circuit that reads out the voltage signal held by the FD. That is, with its source electrode connected to the vertical signal line Lsig via the selection transistor SEL, the amplification transistor AMP constitutes a source follower circuit together with a constant current source connected to one end of the vertical signal line Lsig.
  • the selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line Lsig.
  • a drive signal SELsig is applied to the gate electrode of the selection transistor SEL.
  • When the drive signal SELsig becomes active, the selection transistor SEL becomes conductive and the unit pixel P enters the selected state.
  • the read signal (pixel signal) output from the amplification transistor AMP is output to the vertical signal line Lsig via the selection transistor SEL.
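The transfer/reset/readout sequence described above can be summarized as a small behavioral model. This is an illustrative sketch only: the class, method names, and numeric levels below are hypothetical and not taken from the patent, and a real pixel operates on charge and voltage rather than the arbitrary digital numbers used here.

```python
# Hypothetical behavioral model of the 4-transistor pixel described above
# (transfer TR, reset RST, amplification AMP, selection SEL).
class UnitPixelModel:
    def __init__(self, supply_level=1000):
        self.supply_level = supply_level  # reset level of the FD (arbitrary units)
        self.photodiode_charge = 0        # charge accumulated in light receiving unit 11
        self.fd = 0                       # floating diffusion FD
        self.selected = False

    def expose(self, charge):
        """Accumulate photo-generated charge in the light receiving unit."""
        self.photodiode_charge += charge

    def reset(self):
        """RSTsig active: the FD is reset to the power-supply level."""
        self.fd = self.supply_level

    def transfer(self):
        """TGsig active: charge moves from the photodiode to the FD."""
        self.fd -= self.photodiode_charge  # signal charge lowers the FD potential
        self.photodiode_charge = 0

    def select(self):
        """SELsig active: the pixel drives the vertical signal line via AMP."""
        self.selected = True

    def read(self):
        """Source-follower output of the FD level (unity gain assumed)."""
        assert self.selected, "pixel must be selected before reading"
        return self.fd

# Correlated double sampling: read the reset level, then the signal level.
p = UnitPixelModel()
p.expose(300)
p.select()
p.reset()
reset_level = p.read()    # 1000
p.transfer()
signal_level = p.read()   # 700
print(reset_level - signal_level)  # net signal: 300
```

Reading the reset level before the transfer, as in the last lines, is how a real readout chain cancels the pixel-to-pixel variation of the FD reset level.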
  • the multilayer wiring layer 20 is provided on the first surface 10A side of the semiconductor substrate 10 as described above.
  • The multilayer wiring layer 20 has, for example, an insulating layer 21, a gate wiring layer 22, and an interlayer insulating layer 27 in which the light absorption layer 23 and the plurality of wiring layers 24, 25, 26 are provided.
  • the insulating layer 21 is provided on the first surface 10A of the semiconductor substrate 10, for example, as a gate insulating layer of a pixel transistor.
  • Examples of the material of the insulating layer 21 include silicon oxide (SiO x ), silicon nitride (SiN x ), silicon oxynitride (SiO x N y ), and the like.
  • the gate wiring layer 22 is provided with, for example, the gate electrodes of the transfer transistor TR, the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL described above.
  • the gate wiring layer 22 is formed using, for example, polysilicon (Poly-Si).
  • The light absorption layer 23 is for absorbing light that passes through the semiconductor substrate 10 without being absorbed by the light receiving unit 11 and enters the multilayer wiring layer 20; it is provided, for example, in the interlayer insulating layer 27, between the first surface 10A of the semiconductor substrate 10 and the wiring layer 24.
  • the light absorption layer 23 is formed so as to be symmetrical with respect to the optical center C of the unit pixel P, for example, for each unit pixel P.
  • the light absorption layer 23 can be formed by using, for example, a material having an absorption coefficient larger than the absorption coefficient of the semiconductor substrate 10 with respect to a wavelength of 750 nm or more. Examples of such a material include metal oxides and chalcopyrite compounds.
  • Specific metal oxides include, for example, tungsten oxide (WO x ), molybdenum oxide (MoO x ), and composite materials thereof.
  • Specific examples of chalcopyrite compounds include copper indium selenide (CuInSe) and copper indium gallium selenide (CuInGaSe).
  • Alternatively, the light absorption layer can be formed using a graphite compound, carbon nanotubes (CNT), graphene, fullerene derivatives, or the like.
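The role of the absorption coefficient can be illustrated with the single-pass Beer-Lambert relation, absorbed fraction = 1 - exp(-alpha * d). The coefficients below are placeholder values chosen only to reflect the ordering the text relies on (absorber much larger than silicon at wavelengths of 750 nm and above); they are not figures from the patent.

```python
import math

def absorbed_fraction(alpha_per_um: float, thickness_um: float) -> float:
    """Single-pass Beer-Lambert absorbed fraction, 1 - exp(-alpha * d)."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um)

alpha_si = 0.1        # hypothetical: weak silicon absorption in the NIR, 1/um
alpha_absorber = 10.0 # hypothetical: strong absorber (e.g. a metal oxide), 1/um
d = 0.2               # layer thickness in micrometres

f_si = absorbed_fraction(alpha_si, d)
f_abs = absorbed_fraction(alpha_absorber, d)
print(f"{f_si:.3f} vs {f_abs:.3f}")  # the strong absorber captures far more light
assert f_abs > f_si
```

With these placeholder numbers a 0.2 um layer of the strong absorber removes most of the light that silicon of the same thickness would let through, which is the qualitative point of using a material with a larger absorption coefficient than the substrate.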
  • FIG. 1 shows an example in which the light absorption layer 23 is provided for each unit pixel P, but the present invention is not limited to this.
  • the light absorption layer 23 may be formed as a common layer for a plurality of pixels arranged two-dimensionally in a matrix on the pixel portion 100A.
  • the wiring layers 24, 25, and 26 are for, for example, driving the light receiving unit 11, transmitting signals, applying voltage to each unit, and the like.
  • The wiring layers 24, 25, and 26 are laminated in the interlayer insulating layer 27 in this order from the semiconductor substrate 10 side.
  • the wiring layers 24, 25, and 26 are formed of, for example, copper (Cu) or aluminum (Al).
  • The interlayer insulating layer 27 is provided on the insulating layer 21 so as to cover the gate wiring layer 22, and, as described above, has the light absorption layer 23 and the wiring layers 24, 25, 26 within the layer.
  • The interlayer insulating layer 27 is formed using, for example, silicon oxide (SiO x ), silicon nitride (SiN x ), silicon oxynitride (SiO x N y ), or the like.
  • a light-shielding film 12, a protective layer 13, and an on-chip lens 14 are provided on the second surface 10B side of the semiconductor substrate 10.
  • the light-shielding film 12 is for preventing oblique light incident from the light incident side S1 from being incident on adjacent unit pixels P, and is provided between adjacent unit pixels P, for example.
  • the light-shielding film 12 is formed of, for example, a metal film such as tungsten (W).
  • The protective layer 13 contains, for example, the light-shielding film 12 within the layer, protects the second surface 10B of the semiconductor substrate 10, and flattens the surface on the light incident side S1.
  • the protective layer 13 is formed of, for example, silicon oxide (SiO x ), silicon nitride (SiN x ), silicon oxynitride (SiO x N y ), or the like.
  • the on-chip lens 14 is for condensing the light incident from the light incident side S1 on the light receiving unit 11.
  • The on-chip lens 14 is formed using a high refractive index material, specifically an inorganic material such as silicon oxide (SiO x ) or silicon nitride (SiN x ). Alternatively, an organic material having a high refractive index, such as an episulfide resin, a thietane compound, or a resin thereof, may be used.
  • the shape of the on-chip lens 14 is not particularly limited, and various lens shapes such as a hemispherical shape and a semi-cylindrical shape can be used. As shown in FIG. 1, the on-chip lens 14 may be provided for each unit pixel P, or for example, one on-chip lens may be provided for each of a plurality of unit pixels P.
  • The imaging device 100 is, for example, a CMOS image sensor; it takes in incident light (image light) from a subject via an optical lens system (not shown), converts the amount of incident light imaged on the imaging surface into an electric signal in pixel units, and outputs it as pixel signals.
  • The imaging device 100 has the pixel portion 100A as an imaging area on the semiconductor substrate 10 and, in the peripheral region of the pixel portion 100A, has, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and an input/output terminal 116.
  • the pixel unit 100A has, for example, a plurality of unit pixels P arranged two-dimensionally in a matrix.
  • a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
  • The pixel drive line Lread transmits a drive signal for reading a signal from a pixel.
  • One end of the pixel drive line Lread is connected to the output end corresponding to each row of the vertical drive circuit 111.
  • the vertical drive circuit 111 is a pixel drive unit composed of a shift register, an address decoder, etc., and drives each unit pixel P of the pixel unit 100A, for example, in row units.
  • the signal output from each unit pixel P of the pixel row selectively scanned by the vertical drive circuit 111 is supplied to the column signal processing circuit 112 through each of the vertical signal lines Lsig.
  • the column signal processing circuit 112 is composed of an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
  • The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives the horizontal selection switches of the column signal processing circuit 112 while scanning them. By this selective scanning by the horizontal drive circuit 113, the signals of the pixels transmitted through each of the vertical signal lines Lsig are sequentially output to a horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 10 through the horizontal signal line 121.
  • The output circuit 114 performs signal processing on the signals sequentially supplied from each of the column signal processing circuits 112 via the horizontal signal line 121 and outputs the processed signals.
  • the output circuit 114 may, for example, perform only buffering, or may perform black level adjustment, column variation correction, various digital signal processing, and the like.
  • The circuit portion including the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 10, or may be arranged in an external control IC. Those circuit portions may also be formed on another substrate connected by a cable or the like.
  • the control circuit 115 receives a clock given from the outside of the semiconductor substrate 10, data for instructing an operation mode, and the like, and outputs data such as internal information of the image pickup apparatus 100.
  • the control circuit 115 further includes a timing generator that generates various timing signals, and the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, and the like based on the various timing signals generated by the timing generator. Controls the drive of peripheral circuits.
  • the input / output terminal 116 exchanges signals with the outside.
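The row-sequential readout described above, row selection by the vertical drive circuit followed by column scanning by the horizontal drive circuit, can be summarized in a few lines. The function and data layout below are illustrative assumptions for this sketch, not part of the patent.

```python
# Minimal sketch of row-sequential readout: one pixel row is selected at a
# time, all columns of that row are sampled in parallel onto the vertical
# signal lines, and the columns are then scanned out onto the horizontal
# signal line one by one.
def read_frame(pixel_values):
    """pixel_values: 2-D list [row][col] of stored pixel signals."""
    output_stream = []
    n_rows = len(pixel_values)
    n_cols = len(pixel_values[0])
    for row in range(n_rows):                 # vertical drive: row selection
        column_samples = [pixel_values[row][c] for c in range(n_cols)]
        for col in range(n_cols):             # horizontal drive: column scan
            output_stream.append(column_samples[col])
    return output_stream

frame = [[1, 2], [3, 4]]
print(read_frame(frame))  # [1, 2, 3, 4] (row by row, column by column)
```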
  • In the image pickup element 1, the light absorption layer 23 is provided between the first surface 10A of the semiconductor substrate 10, in which the light receiving portion 11 is embedded, and the plurality of wiring layers 24, 25, 26 in the multilayer wiring layer 20 provided on the first surface 10A side of the semiconductor substrate 10. The light absorption layer 23 is formed using a material whose absorption coefficient for wavelengths of, for example, 750 nm or more is larger than that of the semiconductor substrate 10.
  • In a back-illuminated CIS, in which the wiring layers are arranged on the side of the light receiving portion opposite to the light incident side, the incident light is not blocked by the wiring layers including the gate electrodes and the like, so a decrease in sensitivity due to miniaturization can be suppressed.
  • However, long-wavelength light that is not absorbed by the light receiving portion and is transmitted to the wiring layer side may be reflected by the wiring layers (metal layers) and re-enter the light receiving portion.
  • The non-uniform reflection intensity between adjacent pixels can be relaxed, and optical color mixing between adjacent pixels reduced, by, for example, giving the wiring layers formed below each pixel a symmetrical layout in units of shared pixels.
  • However, even when the layout of the metal layers such as wirings is symmetric for each shared pixel, if the arrangement of the metal layers within a pixel is optically asymmetric, the light reflected by the metal layers re-enters the light receiving portions of the adjacent pixels and optical color mixing occurs.
  • In the method of improving image quality by periodically providing, below the light receiving portion, a first reflecting plate that is uniform for each pixel and a second reflecting plate between adjacent pixels, the control of reflected light relies on optical interference, which has a large wavelength dependence, and it is therefore difficult to obtain the desired characteristics over a wide wavelength band.
  • In contrast, in the present embodiment, a light absorption layer 23 containing a material whose absorption coefficient for wavelengths of, for example, 750 nm or more is larger than that of the semiconductor substrate 10 is formed between the first surface 10A of the semiconductor substrate 10, in which the light receiving portion 11 is embedded and formed for each unit pixel P, and the plurality of wiring layers 24, 25, 26 provided in the multilayer wiring layer 20.
  • As a result, for the light that is not absorbed by the light receiving unit 11 and is incident on the multilayer wiring layer 20, optical color mixing into adjacent unit pixels P caused by reflection at the plurality of wiring layers 24, 25, 26 provided in the multilayer wiring layer 20 is reduced. Image quality can therefore be improved.
  • In addition, since the light transmitted through the light receiving portion 11 and incident on the multilayer wiring layer 20 is absorbed by the light absorption layer 23, the freedom of layout of the plurality of wiring layers 24, 25, 26 formed below the light absorption layer 23 can be improved.
  • FIG. 4A shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1A) according to the first modification of the present disclosure.
  • FIG. 4B schematically shows an example of the planar shape of the light absorption layer 23 in the image pickup element 1A shown in FIG. 4A. Note that FIG. 4A shows a cross section taken along the line I-I shown in FIG. 4B.
  • In the above embodiment, an example in which a uniform light absorption layer 23 is provided for each unit pixel P is shown, but the present invention is not limited to this.
  • The planar shape of the light absorption layer 23 may have symmetry with respect to the optical center C in the unit pixel P; the light absorption layer 23 of the image pickup element 1A of this modification differs from the above-described embodiment in that an opening 23H is formed in the vicinity of the optical center C.
  • When the light transmitted through the semiconductor substrate 10 and incident on the multilayer wiring layer 20 is absorbed by the light absorption layer 23 as described above, not only the optical color mixing into adjacent unit pixels P but also the re-incidence into the pixel itself is reduced, so the sensitivity may be lowered.
  • FIG. 5 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1B) according to the second modification of the present disclosure.
  • The image sensor 1B of the present modification differs from the above embodiment in that a concave-convex shape is formed on the surface of the uppermost wiring layer 24 among the plurality of wiring layers 24, 25, 26, that is, the wiring layer directly below the light absorption layer 23.
  • The effect of suppressing light reflection at the plurality of wiring layers 24, 25, 26 by forming the light absorption layer 23 depends on the absorption coefficient of the light absorbing material constituting the light absorption layer 23 and on the film thickness of the light absorption layer 23.
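As a rough model of that dependence: light reflected by a wiring layer traverses the absorbing layer twice, so the re-emerging intensity scales as approximately R * exp(-2 * alpha * d). The reflectance and coefficient values below are placeholders for illustration only, not figures from the patent.

```python
import math

# Back-of-envelope double-pass model: light reaches a metal wiring layer of
# reflectance R through an absorbing layer of absorption coefficient alpha
# and thickness d, then passes through the layer again on the way back.
def reflected_fraction(reflectance, alpha_per_um, thickness_um):
    return reflectance * math.exp(-2.0 * alpha_per_um * thickness_um)

r_metal = 0.9  # hypothetical wiring reflectance
for d in (0.0, 0.1, 0.2, 0.4):  # absorber thickness in micrometres
    print(d, round(reflected_fraction(r_metal, 10.0, d), 3))
```

Each additional increment of thickness suppresses the reflected intensity exponentially, which is why both the material's absorption coefficient and the film thickness matter.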
  • FIG. 6 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1C) according to the third modification of the present disclosure.
  • the image pickup device 1C of the present modification is different from the above-described embodiment in that the light absorption layer 23 is laminated on each of the plurality of wiring layers 24, 25, and 26.
  • the light absorption layer 23 is laminated and formed on each of the plurality of wiring layers 24, 25, and 26. Specifically, a light absorption layer 23A is formed on the wiring layer 24, a light absorption layer 23B is formed on the wiring layer 25, and a light absorption layer 23C is formed on the wiring layer 26.
  • In the above embodiment, the light absorption layer 23 and the wiring layers 24, 25, 26 are formed spaced apart from each other with the interlayer insulating layer 27 in between; however, the light absorption layer 23 may be laminated directly on the wiring layers 24, 25, 26 as in the present modification.
  • FIG. 7 shows an example of the cross-sectional configuration of the image pickup element (image pickup element 1D) according to Modification 4 of the present disclosure.
  • The image pickup element 1D of this modification differs from the above-described embodiment in that a pixel separation unit 15 is provided between adjacent unit pixels P and a light absorption layer 16 having the same configuration as the light absorption layer 23 in the above embodiment is provided on its surface.
  • The pixel separation unit 15 is for optically and electrically separating adjacent unit pixels P, and extends between adjacent light receiving units 11 from the second surface 10B toward the first surface 10A of the semiconductor substrate 10.
  • the pixel separation portion 15 is integrally formed with, for example, a light-shielding film 12 provided on the second surface 10B side of the semiconductor substrate 10, and is formed of, for example, a metal film such as tungsten (W) like the light-shielding film 12.
  • a protective layer 13 extends between the pixel separation unit 15 and the semiconductor substrate 10, and the semiconductor substrate 10 and the pixel separation unit 15 are electrically separated from each other.
  • The light absorption layer 16 is provided on the surface of the pixel separation unit 15, specifically on the surface facing the semiconductor substrate 10. As a result, for example, the intensity of light that enters from the second surface 10B of the semiconductor substrate 10, is reflected by the pixel separation unit 15 and the wiring layers 24, 25, 26, and leaks into adjacent unit pixels P is reduced.
  • the light absorption layer 16 corresponds to a specific example of the "second light absorption layer" of the present disclosure.
  • In this modification, the pixel separation unit 15 for separating the adjacent light receiving units 11 is provided between adjacent unit pixels P, and the light absorption layer 16 is further provided on the surface of the pixel separation unit 15 facing the semiconductor substrate 10.
  • Although FIG. 7 shows an example in which the end portion of the pixel separation unit 15 terminates within the semiconductor substrate 10, the pixel separation unit 15 may completely penetrate the semiconductor substrate 10 from the second surface 10B toward the first surface 10A, as in, for example, the image sensor 1E (see FIG. 8) described below.
  • FIG. 8 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1E) according to the modification 5 of the present disclosure.
  • The image sensor 1E of this modification realizes a so-called global-shutter back-illuminated CIS in which, for example, a charge holding unit 17 that temporarily stores the electric charge generated by the light receiving unit 11 is provided in each unit pixel P.
  • the global shutter method is basically a method of performing global exposure that starts exposure for all pixels at the same time and ends exposure for all pixels at the same time.
  • Here, all the pixels mean all the pixels of the portion appearing in the image, excluding dummy pixels and the like.
  • the global shutter method also includes a method of performing global exposure not only on all the pixels of the portion appearing in the image but also on the pixels in a predetermined region.
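The difference between this global exposure and a conventional rolling readout can be sketched with per-row exposure windows. The timing model and numbers below are illustrative assumptions, not from the patent.

```python
# In global-shutter mode every row shares the same exposure window; in
# rolling mode each row's window is delayed by one line time, producing
# row-to-row skew. Times are in arbitrary illustrative units.
def exposure_windows(n_rows, exposure, line_time, global_shutter):
    windows = []
    for row in range(n_rows):
        start = 0 if global_shutter else row * line_time
        windows.append((start, start + exposure))
    return windows

rolling = exposure_windows(4, exposure=10, line_time=2, global_shutter=False)
global_ = exposure_windows(4, exposure=10, line_time=2, global_shutter=True)
print(rolling)  # rows start at 0, 2, 4, 6 -> skew between rows
print(global_)  # every row exposes over the same (0, 10) window
assert len(set(global_)) == 1
```

The shared window is what makes the charge holding unit 17 necessary: each pixel must store its charge until its row's turn in the sequential readout.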
  • the electric charge holding unit (MEM) 17 is for temporarily holding the electric charge until the electric charge generated in the light receiving unit 11 is transferred to the floating diffusion FD.
  • the charge holding portion 17 is formed, for example, embedded in the first surface 10A side of the semiconductor substrate 10 in the same manner as the light receiving portion 11.
  • Since stray light entering the charge holding unit 17 produces a false signal (parasitic light sensitivity), a light-shielding portion continuous with the light-shielding film 12 is formed on the second surface 10B above the charge holding unit 17 of the semiconductor substrate 10 and between the light receiving unit 11 and the charge holding unit 17.
  • By applying the present technology, it becomes possible to prevent light that passes through the semiconductor substrate 10 and is reflected by the plurality of wiring layers 24, 25, 26 from re-entering the charge holding unit 17. This makes it possible to reduce the generation of false signals while improving the image quality of the global-shutter back-illuminated CIS.
  • In the image sensor 1E of this modification as well, the light absorption layer 16 may be provided on the respective surfaces of the pixel separation unit 15 and of the light-shielding film 12 between the light receiving unit 11 and the charge holding unit 17.
  • FIG. 9 shows an example of the cross-sectional configuration of the image pickup device (image pickup device 1F) according to the modification 6 of the present disclosure.
  • The image pickup element 1F of this modification differs from the above embodiment in that pixel transistors such as the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL constituting the pixel circuit are provided on a substrate (semiconductor substrate 30) different from the semiconductor substrate 10, and in that the light absorbing layer 23 is provided between the semiconductor substrate 10 and the semiconductor substrate 30.
  • the first surface 10A of the semiconductor substrate 10 in this modification is provided with a floating diffusion FD and a transfer transistor TR constituting a pixel circuit.
  • In the multilayer wiring layer 20, the light absorption layer 23 is provided, for example, as a layer common to a plurality of unit pixels P, together with the gate electrode of the transfer transistor TR.
  • the semiconductor substrate 30 has a first surface (front surface) 30A and a second surface (back surface) 30B facing each other, and pixel transistors such as an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL are provided on the first surface 30A.
  • the semiconductor substrate 30 corresponds to a specific example of the "second semiconductor substrate" of the present disclosure.
  • On the first surface 30A of the semiconductor substrate 30, the multilayer wiring layer 40 is provided, which includes, in an interlayer insulating layer 45, for example, a gate wiring layer 41 containing the gate electrodes of the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL, and a plurality of wiring layers 42, 43, and 44.
  • an insulating layer 28 is provided on the second surface 30B of the semiconductor substrate 30.
  • The semiconductor substrate 10 and the semiconductor substrate 30 are bonded together so that the first surface 10A of the semiconductor substrate 10 and the second surface 30B of the semiconductor substrate 30 face each other, with the upper surface of the interlayer insulating layer 27 formed on the first surface 10A of the semiconductor substrate 10 and the upper surface of the insulating layer 28 serving as the bonding surface S3.
  • As described above, in this modification, the amplification transistor AMP, the reset transistor RST, the selection transistor SEL, and the like constituting the pixel circuit are provided on the semiconductor substrate 30, and the light absorbing layer 23 is provided between the semiconductor substrate 10, in which the light receiving portion 11 is formed so as to be embedded, and the semiconductor substrate 30.
  • FIG. 10 shows an example of a schematic configuration of an image pickup system 2 provided with an image pickup device (for example, an image pickup device 1) according to the above-described embodiment and modifications 1 to 6.
  • the imaging system 2 is, for example, an imaging device such as a digital still camera or a video camera, or an electronic device such as a mobile terminal device such as a smartphone or a tablet terminal.
  • the image pickup system 2 includes, for example, an image pickup element 1, an optical system 241, a shutter device 242, a DSP circuit 243, a frame memory 244, a display unit 245, a storage unit 246, an operation unit 247, and a power supply unit 248.
  • the image pickup element 1, the DSP circuit 243, the frame memory 244, the display unit 245, the storage unit 246, the operation unit 247, and the power supply unit 248 are connected to each other via a bus line 249.
  • the image sensor 1 outputs image data according to the incident light.
  • the optical system 241 is configured to have one or a plurality of lenses, guides light (incident light) from a subject to an image pickup device 1, and forms an image on a light receiving surface of the image pickup device 1.
  • the shutter device 242 is arranged between the optical system 241 and the image sensor 1, and controls the light irradiation period and the light shielding period of the image sensor 1 according to the control of the drive circuit.
  • the DSP circuit 243 is a signal processing circuit that processes a signal (image data) output from the image sensor 1.
  • the frame memory 244 temporarily holds the image data processed by the DSP circuit 243 in frame units.
  • the display unit 245 comprises a panel-type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays a moving image or a still image captured by the image sensor 1.
  • the storage unit 246 records image data of a moving image or a still image captured by the image pickup device 1 on a recording medium such as a semiconductor memory or a hard disk.
  • the operation unit 247 issues operation commands for various functions of the image pickup system 2 according to the operation by the user.
  • the power supply unit 248 appropriately supplies various power sources serving as operating power sources for the image sensor 1, DSP circuit 243, frame memory 244, display unit 245, storage unit 246, and operation unit 247 to these supply targets.
  • FIG. 11 shows an example of a flowchart of an imaging operation in the imaging system 2.
  • the user instructs the start of imaging by operating the operation unit 247 (step S101).
  • the operation unit 247 transmits an image pickup command to the image pickup device 1 (step S102).
  • Upon receiving the image pickup command, the image pickup device 1 (specifically, its system control circuit) executes imaging (step S103).
  • the image sensor 1 outputs the image data obtained by the image pickup to the DSP circuit 243.
  • The image data is data for all pixels of the pixel signals generated based on the electric charge temporarily held in the floating diffusion FD.
  • the DSP circuit 243 performs predetermined signal processing (for example, noise reduction processing) based on the image data input from the image sensor 1 (step S104).
  • the DSP circuit 243 stores the image data subjected to the predetermined signal processing in the frame memory 244, and the frame memory 244 stores the image data in the storage unit 246 (step S105). In this way, the imaging in the imaging system 2 is performed.
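The imaging sequence of FIG. 11 (steps S101 to S105) can be sketched as a simple pipeline. This is only an illustrative stand-in: the function, the toy "noise reduction" clip, and the frame key are hypothetical names, not part of the disclosure.

```python
def run_imaging_sequence():
    """Sketch of the FIG. 11 flow; component behavior is an illustrative stand-in."""
    log = []
    log.append("S101: user operates operation unit 247 to start imaging")
    log.append("S102: operation unit 247 sends image pickup command to image sensor 1")
    raw = [[10, 200], [30, 250]]                 # S103: sensor outputs image data
    log.append("S103: image sensor 1 executes imaging and outputs image data")
    # S104: DSP circuit applies predetermined signal processing (toy noise clip)
    denoised = [[min(p, 240) for p in row] for row in raw]
    log.append("S104: DSP circuit 243 applies signal processing")
    storage = {"frame0": denoised}               # S105: frame memory -> storage unit
    log.append("S105: frame held in frame memory 244 and stored in storage unit 246")
    return log, storage

log, storage = run_imaging_sequence()
print("\n".join(log))
```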
  • the image pickup devices 1, 1A to 1F according to the above-described embodiment and modifications 1 to 6 are applied to the image pickup system 2.
  • In the imaging system 2, the image sensor 1 can be miniaturized or given high definition, so a miniaturized or high-definition imaging system 2 can be provided.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 12 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a head lamp, a back lamp, a brake lamp, a winker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as a person, a vehicle, an obstacle, a sign, or characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing, based on the detection information input from the driver state detection unit 12041.
  • The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, follow-up driving based on inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio image output unit 12052 transmits the output signal of at least one of the audio and the image to the output device capable of visually or audibly notifying the passenger or the outside of the vehicle of the information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 13 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 13 shows an example of the photographing range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
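The superimposition of the four camera views into a bird's-eye image can be sketched as below. Real systems first warp each camera image onto the ground plane with a homography; in this illustrative toy, the tiles are assumed to be already warped, and the canvas layout, tile sizes, and function names are all hypothetical.

```python
def compose_birds_eye(front, left, right, rear, h=6, w=6):
    """Paste four pre-warped camera tiles into an h x w top-view grid.

    Real systems project each camera image to the ground plane with a
    homography first; here the 2x2 tiles are assumed already warped.
    """
    canvas = [["." for _ in range(w)] for _ in range(h)]

    def paste(tile, r0, c0):
        for r, row in enumerate(tile):
            for c, v in enumerate(row):
                canvas[r0 + r][c0 + c] = v

    paste(front, 0, 2)   # front-nose view at the top of the canvas
    paste(rear, 4, 2)    # rear bumper / back door view at the bottom
    paste(left, 2, 0)    # left side-mirror view
    paste(right, 2, 4)   # right side-mirror view
    return canvas

view = compose_birds_eye(front=[["F", "F"], ["F", "F"]],
                         left=[["L", "L"], ["L", "L"]],
                         right=[["R", "R"], ["R", "R"]],
                         rear=[["B", "B"], ["B", "B"]])
print("\n".join("".join(row) for row in view))
```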
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or an image pickup element having pixels for phase difference detection.
  • For example, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative velocity with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104.
  • Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from a preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
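The distance and its temporal change described above can be turned into a relative velocity, from which a follow-up controller decides between braking and acceleration. The sketch below is an illustrative toy: the thresholds, margin, and command names are assumptions, not values from the disclosure.

```python
def relative_velocity(d_prev_m, d_now_m, dt_s):
    """Relative velocity toward the preceding vehicle (positive = closing in)."""
    return (d_prev_m - d_now_m) / dt_s

def follow_command(distance_m, target_gap_m, closing_mps, margin_m=2.0):
    """Toy follow-up controller maintaining a preset inter-vehicle gap.

    Thresholds and command names are illustrative assumptions.
    """
    if distance_m < target_gap_m - margin_m or closing_mps > 0:
        return "brake"       # gap too small, or still closing: automatic braking
    if distance_m > target_gap_m + margin_m:
        return "accelerate"  # gap too large: automatic acceleration (follow start)
    return "hold"

v = relative_velocity(d_prev_m=32.0, d_now_m=30.0, dt_s=0.5)  # closing at 4 m/s
cmd = follow_command(distance_m=30.0, target_gap_m=25.0, closing_mps=v)
print(v, cmd)
```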
  • For example, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062 and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
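The classification and the collision-risk thresholding described above can be sketched as follows. The category labels come from the text; the time-to-collision heuristic, the 2-second threshold, and the response strings are illustrative assumptions only.

```python
def classify(obj):
    """Toy classifier over pre-extracted 3D-object data (categories from the text)."""
    categories = {"two_wheeler", "car", "large_vehicle", "pedestrian", "pole", "other"}
    return obj["kind"] if obj["kind"] in categories else "other"

def collision_response(distance_m, closing_mps, risk_threshold_s=2.0):
    """Warn and force-decelerate when a simple time-to-collision estimate
    falls below a set value (threshold is an illustrative assumption)."""
    if closing_mps <= 0:
        return "none"                       # not closing: no collision risk
    ttc = distance_m / closing_mps          # simple time-to-collision estimate
    if ttc < risk_threshold_s:
        return "alarm+forced_deceleration"  # via speaker/display and drive unit
    return "monitor"

print(classify({"kind": "pedestrian"}))
print(collision_response(distance_m=10.0, closing_mps=8.0))  # TTC = 1.25 s
```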
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a rectangular contour line for emphasizing the recognized pedestrian.
  • Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
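The two-step pedestrian recognition procedure (feature-point extraction followed by pattern matching against an outline) can be sketched on tiny binary grids. The local-maximum "feature" test and the stored template below are illustrative toys, not the actual detectors used in such systems.

```python
def extract_feature_points(img):
    """Feature points = pixels brighter than all 4 neighbours (toy detector)."""
    pts = set()
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            if all(img[r][c] > n for n in
                   (img[r-1][c], img[r+1][c], img[r][c-1], img[r][c+1])):
                pts.add((r, c))
    return pts

def matches_template(points, template, tolerance=0):
    """Pattern matching: does the point set contain the template outline?"""
    missing = sum(1 for p in template if p not in points)
    return missing <= tolerance

frame = [[0, 0, 0, 0],
         [0, 9, 0, 0],
         [0, 0, 9, 0],
         [0, 0, 0, 0]]
pedestrian_outline = {(1, 1), (2, 2)}  # hypothetical stored outline template
pts = extract_feature_points(frame)
print(matches_template(pts, pedestrian_outline))
```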
  • the above is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
  • Specifically, the image pickup device 1 can be applied to the image pickup unit 12031.
  • By applying the technique according to the present disclosure to the image pickup unit 12031, a high-definition captured image with less noise can be obtained, so highly accurate control using the captured image can be performed in the moving body control system.
  • the technology according to the present disclosure (the present technology) can be applied to various products.
  • the techniques according to the present disclosure may be applied to endoscopic surgery systems.
  • FIG. 14 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
  • FIG. 14 shows a surgeon (doctor) 11131 performing surgery on patient 11132 on patient bed 11153 using the endoscopic surgery system 11000.
  • The endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 equipped with various devices for endoscopic surgery.
  • the endoscope 11100 is composed of a lens barrel 11101 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100, and the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the imaging element by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 11201.
  • The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing on the image signal, such as development processing (demosaic processing), for displaying an image based on the image signal.
  • the display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
  • The light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
  • the treatment tool control device 11205 controls the drive of the energy treatment tool 11112 for ablation of tissue, incision, sealing of blood vessels, and the like.
  • The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing the working space of the operator.
  • the recorder 11207 is a device capable of recording various information related to surgery.
  • the printer 11208 is a device capable of printing various information related to surgery in various formats such as texts, images, and graphs.
  • the light source device 11203 that supplies the irradiation light to the endoscope 11100 when photographing the surgical site can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • When a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
  • In this case, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner by irradiating the observation target with the laser light from each of the RGB laser light sources in a time-division manner and controlling the drive of the image sensor of the camera head 11102 in synchronization with the irradiation timing. According to this method, a color image can be obtained without providing a color filter on the image sensor.
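The time-division method above can be sketched as merging three monochrome frames, each captured under R, G, or B illumination, into one color image; no on-sensor color filter is involved. The function name and toy frame values are illustrative.

```python
def merge_time_division_rgb(frame_r, frame_g, frame_b):
    """Merge three monochrome frames (captured in time-division under R, G, B
    illumination) into one RGB image; no color filter on the sensor is needed."""
    h, w = len(frame_r), len(frame_r[0])
    return [[(frame_r[y][x], frame_g[y][x], frame_b[y][x])
             for x in range(w)] for y in range(h)]

r = [[200, 0], [0, 50]]     # frame captured under red laser illumination
g = [[10, 255], [0, 50]]    # frame captured under green laser illumination
b = [[10, 0], [255, 50]]    # frame captured under blue laser illumination
color = merge_time_division_rgb(r, g, b)
print(color[0][0])  # reddish pixel assembled from the three exposures
```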
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and synthesizing those images, a so-called high-dynamic-range image free of blocked-up shadows and blown-out highlights can be generated.
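The synthesis of time-divided frames acquired at different light intensities can be sketched as a toy fusion rule: keep the bright frame where it is not blown out, and fall back to the scaled-up dark frame in the highlights. The 4x intensity ratio, clip level, and function name are illustrative assumptions.

```python
def fuse_hdr(bright_frame, dark_frame, clip=250):
    """Toy HDR synthesis: take the bright frame where it is not blown out,
    otherwise use the dark frame (captured at 1/4 the light intensity,
    an illustrative assumption) scaled back up by 4x.
    """
    fused = []
    for row_b, row_d in zip(bright_frame, dark_frame):
        fused.append([b if b < clip else min(d * 4, 1020)
                      for b, d in zip(row_b, row_d)])
    return fused

bright = [[120, 255], [255, 30]]  # 255 = blown-out highlight
dark = [[30, 90], [200, 7]]       # same scene at 1/4 the light intensity
print(fuse_hdr(bright, dark))
```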
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is photographed with high contrast by irradiating light in a narrower band than the irradiation light (that is, white light) used in normal observation, utilizing the wavelength dependence of light absorption in body tissue.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be capable of supplying narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 15 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • CCU11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and CCU11201 are communicably connected to each other by a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image pickup unit 11402 is composed of an image pickup element.
  • the image sensor constituting the image pickup unit 11402 may be one (so-called single plate type) or a plurality (so-called multi-plate type).
  • each image pickup element may generate an image signal corresponding to each of RGB, and a color image may be obtained by synthesizing them.
  • Further, the image pickup unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the imaging unit 11402 does not necessarily have to be provided on the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is composed of an actuator, and the zoom lens and focus lens of the lens unit 11401 are moved by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU11201.
  • the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • The control signal includes information on the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • The imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • In the latter case, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
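The automatic setting of the exposure value from the acquired image signal can be sketched as a simple feedback loop that nudges exposure toward a target mean brightness. The proportional gain, target level, and linear toy sensor response are illustrative assumptions, not the actual AE algorithm of any such system.

```python
def auto_exposure_step(mean_level, exposure, target=128, gain=0.005):
    """One AE iteration: nudge exposure toward a target mean brightness."""
    return exposure * (1 + gain * (target - mean_level))

exposure = 1.0
mean = 40.0  # underexposed initial frame
for _ in range(50):
    exposure = auto_exposure_step(mean, exposure)
    mean = min(255.0, 40.0 * exposure)  # toy sensor response, linear until clip
print(round(mean))  # converges near the target brightness of 128
```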
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • Image signals and control signals can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of the captured image obtained by the imaging of the surgical site and the like. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • In addition, the control unit 11413 causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal processed by the image processing unit 11412.
  • At that time, the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific biological site, bleeding, mist generated when the energy treatment tool 11112 is used, and the like.
  • When displaying the captured image of the surgical site on the display device 11202, the control unit 11413 may superimpose and display various kinds of surgery support information on the image of the surgical site by using the recognition result. By superimposing the surgery support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced, and the surgeon 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 that connects the camera head 11102 and CCU11201 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
  • In the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technique according to the present disclosure can be suitably applied to the imaging unit 11402 provided on the camera head 11102 of the endoscope 11100.
  • the imaging unit 11402 can be miniaturized or have high definition, so that a compact or high-definition endoscope 11100 can be provided.
  • the present disclosure has been described above with reference to the embodiments and modifications 1 to 6, application examples and application examples, the present disclosure is not limited to the above-described embodiments and the like, and various modifications are possible. ..
  • For example, an optical member such as a color filter, in which a red filter that transmits light in the red wavelength region, a green filter that transmits light in the green wavelength region, and a blue filter that transmits light in the blue wavelength region are provided in a regular color arrangement (for example, a Bayer arrangement), may further be provided in the pixel portion 100A.
  • The present disclosure may also have the following configurations.
  • According to the present technology having the following configurations, a first light absorption layer formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more is provided between the second surface of the first semiconductor substrate having a light receiving portion for each pixel, the second surface being on the side opposite to the first surface serving as the light incident surface, and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side.
  • (1) An imaging element including: a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and including a plurality of wiring layers laminated with interlayer insulating layers in between; and a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more.
  • (2) The imaging element according to (1), wherein the first light absorption layer is provided for each pixel.
  • (3) The imaging element according to (1) or (2), wherein the first light absorption layer is formed symmetrically with respect to the optical center of the pixel.
  • (4) The imaging element according to any one of (1) to (3), wherein the first light absorption layer has an opening at the optical center of the pixel.
  • (5) The imaging element according to any one of (1) to (4), wherein the first light absorption layer is provided as a layer common to a plurality of the pixels.
  • (6) The imaging element according to any one of (1) to (5), wherein, among the plurality of wiring layers, the wiring layer directly below the first light absorption layer has an uneven surface.
  • (7) The imaging element according to any one of (1) to (6), wherein the plurality of wiring layers have a laminated structure with the first light absorption layer.
  • (8) The imaging element according to any one of (1) to (7), wherein the first semiconductor substrate is a silicon substrate.
  • (9) The imaging element according to any one of (1) to (8), wherein the first light absorption layer contains a metal oxide or a chalcopyrite-based compound.
  • (10) The imaging element according to any one of (1) to (9), wherein the first light absorption layer is formed using tungsten oxide, molybdenum oxide, copper indium selenide, or copper indium gallium selenide.
  • (11) The imaging element according to any one of (1) to (10), further including: a pixel separation portion that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface; and a second light absorption layer formed on a surface of the pixel separation portion facing the first semiconductor substrate.
  • (12) The imaging element according to any one of (1) to (11), further including a charge holding portion that is provided for each pixel in the first semiconductor substrate and accumulates the charge generated in the light receiving portion.
  • (13) The imaging element according to (12), wherein the light receiving portion and the charge holding portion are arranged side by side in the plane direction in the first semiconductor substrate, the imaging element further including a light-shielding portion extending between the light receiving portion and the charge holding portion from the first surface toward the second surface.
  • (14) The imaging element according to (13), further including a pixel separation portion that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface, wherein the pixel separation portion and the light-shielding portion are formed continuously on the first surface side.
  • (15) The imaging element according to any one of (1) to (14), further including a second semiconductor substrate provided with a pixel transistor constituting a pixel circuit that outputs a pixel signal based on the charge output from the pixel, wherein the second semiconductor substrate is arranged on the second surface side of the first semiconductor substrate with the first light absorption layer in between.
  • (16) An imaging device including an imaging element, the imaging element including: a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and including a plurality of wiring layers laminated with interlayer insulating layers in between; and a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more.


Abstract

An imaging element according to one embodiment of the present disclosure comprises: a first semiconductor substrate that has a first surface serving as a light incidence surface and a second surface on the reverse side from the first surface, and also has a light-reception unit that generates, via photoelectric conversion, an electrical charge corresponding to the amount of light received at each pixel; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and having a plurality of wiring layers laminated with interlayer insulating layers therebetween; and a first light-absorption layer that is provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and is formed using a material having a larger absorption coefficient for wavelengths of 750 nm or greater than the absorption coefficient of the first semiconductor substrate.

Description

Imaging element and imaging device
The present disclosure relates to, for example, a back-illuminated imaging element and an imaging device including the same.
In a back-illuminated CMOS (Complementary Metal Oxide Semiconductor) image sensor (CIS), light that passes through the light receiving portion without being absorbed may be reflected by a wiring layer (metal layer) provided below the light receiving portion and re-enter the light receiving portion. When the intensity of this reflected light is non-uniform from pixel to pixel, optical color mixing occurs between adjacent pixels. To address this, for example, Patent Document 1 discloses a solid-state imaging device in which image quality is improved by periodically providing, below the light receiving portion, a first reflector uniform for each pixel and a second reflector between adjacent pixels.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2014-53429
As described above, imaging devices are required to provide improved image quality.
It is desirable to provide an imaging element and an imaging device capable of improving image quality.
An imaging element according to an embodiment of the present disclosure includes: a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and having, for each pixel, a light receiving portion that generates a charge corresponding to the amount of received light by photoelectric conversion; a multilayer wiring layer provided on the second surface side of the first semiconductor substrate and including a plurality of wiring layers laminated with interlayer insulating layers in between; and a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers and formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more.
An imaging device according to an embodiment of the present disclosure includes the imaging element according to the above embodiment of the present disclosure.
In the imaging element and the imaging device according to the embodiments of the present disclosure, the first light absorption layer, formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more, is provided between the second surface of the first semiconductor substrate having a light receiving portion for each pixel, the second surface being opposite to the first surface serving as the light incident surface, and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side. This reduces the re-entry into adjacent pixels of light that has passed through the first semiconductor substrate without being absorbed by the light receiving portion and has been reflected by the wiring layers provided in the multilayer wiring layer.
FIG. 1 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to an embodiment of the present disclosure. FIG. 2 is a block diagram showing the overall configuration of the imaging device shown in FIG. 1. FIG. 3 is an equivalent circuit diagram of the imaging element shown in FIG. 1. FIG. 4A is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 1 of the present disclosure. FIG. 4B is a schematic plan view showing an example of the configuration of the light absorption layer in the imaging element shown in FIG. 4A. FIG. 5 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 2 of the present disclosure. FIG. 6 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 3 of the present disclosure. FIG. 7 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 4 of the present disclosure. FIG. 8 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 5 of the present disclosure. FIG. 9 is a schematic cross-sectional view showing an example of the schematic configuration of an imaging element according to Modification 6 of the present disclosure. FIG. 10 is a diagram showing an example of the schematic configuration of an imaging system including the imaging element according to the above embodiment and Modifications 1 to 6.
FIG. 11 is a diagram showing an example of an imaging procedure in the imaging system of FIG. 10. FIG. 12 is a block diagram showing an example of the schematic configuration of a vehicle control system. FIG. 13 is an explanatory diagram showing an example of the installation positions of an outside-vehicle information detection unit and an imaging unit. FIG. 14 is a diagram showing an example of the schematic configuration of an endoscopic surgery system. FIG. 15 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to the drawings. The following description is a specific example of the present disclosure, and the present disclosure is not limited to the following aspects. Furthermore, the present disclosure is not limited to the arrangement, dimensions, dimensional ratios, and the like of the components shown in the drawings. The description is given in the following order.
1. Embodiment (an example in which a light absorption layer is provided between the surface of a semiconductor substrate and a plurality of wiring layers in a multilayer wiring layer)
  1-1. Configuration of the imaging element
  1-2. Configuration of the imaging device
  1-3. Workings and effects
2. Modifications
  2-1. Modification 1 (another example of the planar shape of the light absorption layer)
  2-2. Modification 2 (an example in which unevenness is provided on the surface of the wiring layer directly below the light absorption layer)
  2-3. Modification 3 (an example in which the light absorption layer is laminated on a plurality of wiring layers)
  2-4. Modification 4 (an example in which a light absorption layer is provided on the surface of the pixel separation portion facing the semiconductor substrate)
  2-5. Modification 5 (an example of an imaging element including a charge holding portion)
  2-6. Modification 6 (an example in which pixel transistors are provided on a separate substrate, with a light absorption layer provided between that substrate and the semiconductor substrate having the light receiving portion)
3. Application example
4. Practical application examples
<1. Embodiment>
FIG. 1 shows an example of the cross-sectional configuration of an imaging element (imaging element 1) according to an embodiment of the present disclosure. FIG. 2 shows an example of the overall configuration of the imaging device (imaging device 100) shown in FIG. 1. The imaging device 100 is, for example, a CMOS image sensor used in electronic equipment such as digital still cameras and video cameras, and has, as an imaging area, a pixel portion 100A in which a plurality of pixels are two-dimensionally arranged in a matrix. The imaging element 1 is a so-called back-illuminated imaging element constituting one pixel (unit pixel P) in such a CMOS image sensor.
(1-1. Configuration of the imaging element)
The imaging element 1 has a structure in which a semiconductor substrate 10, in which a light receiving portion 11 is embedded, and a multilayer wiring layer 20 having a plurality of wiring layers (for example, wiring layers 24, 25, and 26) are stacked. The semiconductor substrate 10 has a first surface (front surface) 10A and a second surface (back surface) 10B opposite to each other. The multilayer wiring layer 20 is provided on the first surface 10A of the semiconductor substrate 10, and the second surface 10B of the semiconductor substrate 10 serves as the light incident surface. In the imaging element 1 of the present embodiment, a light absorption layer 23 is provided between the first surface 10A of the semiconductor substrate 10 and the plurality of wiring layers (wiring layers 24, 25, and 26) in the multilayer wiring layer 20, specifically, between the first surface 10A and the wiring layer 24 closest to the semiconductor substrate 10. The light absorption layer 23 is formed using a material whose absorption coefficient for long wavelengths (for example, wavelengths of 750 nm or more) is larger than that of the semiconductor substrate 10.
The semiconductor substrate 10 corresponds to a specific example of the "first semiconductor substrate" of the present disclosure. The first surface 10A of the semiconductor substrate 10 corresponds to a specific example of the "second surface" of the present disclosure, and the second surface 10B corresponds to a specific example of the "first surface" of the present disclosure. The light absorption layer 23 corresponds to a specific example of the "first light absorption layer" of the present disclosure.
The semiconductor substrate 10 is composed of, for example, a silicon substrate. As described above, the light receiving portion 11 is embedded in the semiconductor substrate 10, for example, for each unit pixel P. The light receiving portion 11 is, for example, a PIN (Positive Intrinsic Negative) photodiode PD and has a pn junction in a predetermined region of the semiconductor substrate 10.
Although not illustrated, the first surface 10A of the semiconductor substrate 10 is further provided with a floating diffusion FD and a pixel circuit that outputs a pixel signal based on the charge output from each pixel. The pixel circuit has, as pixel transistors, for example, a transfer transistor TR, an amplification transistor AMP, a reset transistor RST, and a selection transistor SEL.
FIG. 3 shows an example of the pixel circuit of the imaging element 1 shown in FIG. 1.
The transfer transistor TR is connected between the light receiving portion 11 and the floating diffusion FD. A drive signal TGsig is applied to the gate electrode of the transfer transistor TR. When the drive signal TGsig becomes active, the transfer gate of the transfer transistor TR becomes conductive, and the signal charge accumulated in the light receiving portion 11 is transferred to the FD via the transfer transistor TR.
The FD is connected between the transfer transistor TR and the amplification transistor AMP. The FD converts the signal charge transferred by the transfer transistor TR into a voltage signal (charge-to-voltage conversion) and outputs it to the amplification transistor AMP.
The reset transistor RST is connected between the FD and the power supply unit. A drive signal RSTsig is applied to the gate electrode of the reset transistor RST. When the drive signal RSTsig becomes active, the reset gate of the reset transistor RST becomes conductive, and the potential of the FD is reset to the level of the power supply unit.
The gate electrode of the amplification transistor AMP is connected to the FD, and its drain electrode is connected to the power supply unit; the amplification transistor AMP serves as the input section of a readout circuit for the voltage signal held by the FD, a so-called source follower circuit. That is, with its source electrode connected to the vertical signal line Lsig via the selection transistor SEL, the amplification transistor AMP constitutes a source follower circuit together with a constant current source connected to one end of the vertical signal line Lsig.
The selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line Lsig. A drive signal SELsig is applied to the gate electrode of the selection transistor SEL. When the drive signal SELsig becomes active, the selection transistor SEL becomes conductive and the unit pixel P enters the selected state. As a result, the readout signal (pixel signal) output from the amplification transistor AMP is output to the vertical signal line Lsig via the selection transistor SEL.
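The reset, exposure, transfer, and readout sequence described above can be summarized in a simple behavioral model. This is only an illustrative sketch of the signal flow, not circuit-level behavior; the floating-diffusion capacitance, source-follower gain, and reset voltage below are assumed example values, not values from the present disclosure.

```python
# Behavioral sketch of the 4T pixel readout described above.
# C_FD, the source-follower gain, and the reset level are illustrative assumptions.
E = 1.602e-19  # elementary charge [C]

class UnitPixel:
    def __init__(self, c_fd=1.5e-15, sf_gain=0.85, v_reset=2.8):
        self.c_fd = c_fd          # floating-diffusion capacitance [F] (assumed)
        self.sf_gain = sf_gain    # source-follower (AMP) gain (assumed)
        self.v_reset = v_reset    # reset level set by RST [V] (assumed)
        self.q_pd = 0.0           # signal charge accumulated in the photodiode [C]
        self.v_fd = v_reset       # FD node voltage [V]

    def reset(self):              # RSTsig active: FD returns to the supply level
        self.v_fd = self.v_reset

    def expose(self, electrons):  # photoelectric conversion in the light receiving portion
        self.q_pd += electrons * E

    def transfer(self):           # TGsig active: PD charge moves to the FD,
        self.v_fd -= self.q_pd / self.c_fd   # where it is converted to a voltage
        self.q_pd = 0.0

    def read(self, selected):     # SELsig active: AMP drives the vertical signal line
        return self.sf_gain * self.v_fd if selected else None

p = UnitPixel()
p.reset()
p.expose(10000)                   # 10,000 signal electrons (example value)
p.transfer()
signal = p.read(selected=True)    # pixel signal on the vertical signal line Lsig
```

The model makes the charge-to-voltage conversion at the FD explicit (ΔV = Q / C_FD) and shows that a pixel contributes to the vertical signal line only while its row is selected.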
The multilayer wiring layer 20 is provided on the first surface 10A side of the semiconductor substrate 10 as described above. The multilayer wiring layer 20 has, for example, an insulating layer 21, a gate wiring layer 22, and an interlayer insulating layer 27 in which the light absorption layer 23 and the plurality of wiring layers 24, 25, and 26 are provided.
The insulating layer 21 is provided on the first surface 10A of the semiconductor substrate 10, for example, as a gate insulating layer of the pixel transistors. Examples of the material of the insulating layer 21 include silicon oxide (SiOx), silicon nitride (SiNx), and silicon oxynitride (SiOxNy).
The gate wiring layer 22 is provided with, for example, the gate electrodes of the transfer transistor TR, the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL described above. The gate wiring layer 22 is formed using, for example, polysilicon (Poly-Si).
The light absorption layer 23 is provided to absorb light that has passed through the semiconductor substrate 10 without being absorbed by the light receiving portion 11 and has entered the multilayer wiring layer 20. It is provided, for example, within the interlayer insulating layer 27, between the first surface 10A of the semiconductor substrate 10 and the wiring layer 24. The light absorption layer 23 is formed, for example, for each unit pixel P, for example, symmetrically with respect to the optical center C of the unit pixel P. As described above, the light absorption layer 23 can be formed using a material having an absorption coefficient larger than that of the semiconductor substrate 10 for wavelengths of, for example, 750 nm or more. Examples of such a material include metal oxides and chalcopyrite-based compounds. Specific metal oxides include, for example, tungsten oxide (WOx), molybdenum oxide (MoOx), and composite materials thereof. Specific chalcopyrite-based compounds include, for example, copper indium selenide (CuInSe) and copper indium gallium selenide (CuInGaSe). In addition, the light absorption layer can be formed using a graphite compound, carbon nanotubes (CNT), graphene, a fullerene derivative, or the like.
Although FIG. 1 shows an example in which the light absorption layer 23 is provided for each unit pixel P, this is not limitative. As shown in FIG. 9, for example, the light absorption layer 23 may be formed as a layer common to the plurality of pixels two-dimensionally arranged in a matrix in the pixel portion 100A.
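The role of the absorption coefficient can be illustrated with the Beer-Lambert law, under which the fraction of light transmitted through a layer of thickness d is exp(-αd). The α values and thicknesses below are rough illustrative assumptions (silicon absorbs near-infrared light only weakly), not values stated in the present disclosure:

```python
# Beer-Lambert sketch of why a high-absorption-coefficient layer helps:
# transmitted fraction I/I0 = exp(-alpha * d). All numeric values are
# illustrative assumptions, not values from the disclosure.
import math

def transmitted_fraction(alpha_per_cm, thickness_um):
    """Fraction of light remaining after traversing the layer."""
    return math.exp(-alpha_per_cm * thickness_um * 1e-4)  # convert um -> cm

# Silicon absorbs wavelengths of 750 nm and above only weakly, so a thin
# photodiode passes a large residual fraction toward the wiring layers.
si_residual = transmitted_fraction(5e2, 3.0)        # ~3 um silicon (assumed alpha)

# A material with a much larger alpha at the same wavelength (e.g. a metal
# oxide or chalcopyrite compound, as named in the text) absorbs most of that
# residual light before it can reach and reflect off the wiring layers.
absorber_residual = transmitted_fraction(1e5, 0.3)  # ~0.3 um layer (assumed alpha)
```

With these assumed coefficients, the silicon layer transmits most of the long-wavelength light while the thin absorption layer passes only a few percent, which is the mechanism by which the layer suppresses reflection-induced color mixing.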
The wiring layers 24, 25, and 26 are used, for example, for driving the light receiving portion 11, for signal transmission, for applying voltages to the respective portions, and the like. The wiring layers 24, 25, and 26 are stacked within the interlayer insulating layer 27 in this order from the semiconductor substrate 10 side, each with the interlayer insulating layer 27 in between. The wiring layers 24, 25, and 26 are formed using, for example, copper (Cu) or aluminum (Al).
The interlayer insulating layer 27 is provided on the insulating layer 21 so as to cover the gate wiring layer 22 and, as described above, contains the light absorption layer 23 and the wiring layers 24, 25, and 26. Like the insulating layer 21, the interlayer insulating layer 27 is formed using, for example, silicon oxide (SiOx), silicon nitride (SiNx), silicon oxynitride (SiOxNy), or the like.
On the second surface 10B side of the semiconductor substrate 10, for example, a light-shielding film 12, a protective layer 13, and an on-chip lens 14 are provided.
The light-shielding film 12 prevents oblique light incident from the light incident side S1 from entering adjacent unit pixels P, and is provided, for example, between adjacent unit pixels P. The light-shielding film 12 is formed of, for example, a metal film such as tungsten (W).
The protective layer 13 contains, for example, the light-shielding film 12 within the layer, protects the second surface 10B of the semiconductor substrate 10, and flattens the surface on the light incident side S1. The protective layer 13 is formed using, for example, silicon oxide (SiOx), silicon nitride (SiNx), silicon oxynitride (SiOxNy), or the like.
The on-chip lens 14 condenses the light incident from the light incident side S1 onto the light receiving portion 11. The on-chip lens 14 is formed using a high refractive index material, specifically, an inorganic material such as silicon oxide (SiOx) or silicon nitride (SiNx). Alternatively, an organic material having a high refractive index, such as an episulfide-based resin or a thietane compound or its resin, may be used. The shape of the on-chip lens 14 is not particularly limited, and various lens shapes such as a hemispherical shape and a semi-cylindrical shape can be used. As shown in FIG. 1, the on-chip lens 14 may be provided for each unit pixel P, or, for example, one on-chip lens may be provided for a plurality of unit pixels P.
(1-2. Configuration of the imaging device)
The imaging device 100 is, for example, a CMOS image sensor that takes in incident light (image light) from a subject via an optical lens system (not shown), converts the amount of incident light imaged on the imaging surface into an electric signal on a pixel-by-pixel basis, and outputs the result as pixel signals. The imaging device 100 has, on the semiconductor substrate 10, the pixel portion 100A as an imaging area, and also has, in the peripheral region of the pixel portion 100A, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and input/output terminals 116.
 画素部100Aには、例えば、行列状に2次元配置された複数の単位画素Pを有している。この単位画素Pには、例えば、画素行ごとに画素駆動線Lread(具体的には行選択線およびリセット制御線)が配線され、画素列ごとに垂直信号線Lsigが配線されている。画素駆動線Lreadは、画素からの信号読み出しのための駆動信号を伝送するものである。画素駆動線Lreadの一端は、垂直駆動回路111の各行に対応した出力端に接続されている。 The pixel unit 100A has, for example, a plurality of unit pixels P arranged two-dimensionally in a matrix. In the unit pixel P, for example, a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column. The pixel drive line Lead transmits a drive signal for reading a signal from a pixel. One end of the pixel drive line Lead is connected to the output end corresponding to each line of the vertical drive circuit 111.
 The vertical drive circuit 111 is a pixel drive unit composed of a shift register, an address decoder, and the like, and drives the unit pixels P of the pixel unit 100A, for example, in units of rows. Signals output from the unit pixels P of a pixel row selectively scanned by the vertical drive circuit 111 are supplied to the column signal processing circuit 112 through the respective vertical signal lines Lsig. The column signal processing circuit 112 is composed of an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
 The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives the horizontal selection switches of the column signal processing circuit 112 while scanning them. Through this selective scanning by the horizontal drive circuit 113, the signals of the pixels transmitted through the respective vertical signal lines Lsig are sequentially output to the horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 10 through the horizontal signal line 121.
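The readout flow just described (the vertical drive circuit selects one pixel row at a time, the column circuits process all columns of that row in parallel, and the horizontal drive circuit scans the column outputs onto the horizontal signal line) can be sketched in software. This is only an illustrative model; the `sample()` gain and the array contents are assumptions, not part of this disclosure.

```python
# Illustrative sketch of the row-by-row readout described above.

def sample(raw):
    # Stand-in for the per-column amplifier in the column
    # signal processing circuit (assumed gain of 2).
    return raw * 2

def read_out_frame(pixel_array):
    """pixel_array: 2-D list of raw pixel values (rows x columns).
    Returns the values in the order they would appear on the
    horizontal signal line."""
    frame = []
    for row in pixel_array:                 # vertical drive: row selection
        sampled = [sample(v) for v in row]  # column circuits, one per Lsig
        for value in sampled:               # horizontal drive: column scan
            frame.append(value)             # onto the horizontal signal line
    return frame

if __name__ == "__main__":
    print(read_out_frame([[1, 2], [3, 4]]))  # [2, 4, 6, 8]
```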
 The output circuit 114 performs signal processing on the signals sequentially supplied from the column signal processing circuits 112 via the horizontal signal line 121 and outputs the result. The output circuit 114 may, for example, perform only buffering, or may perform black level adjustment, column variation correction, various kinds of digital signal processing, and the like.
 The circuit portion composed of the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 10, or may be arranged in an external control IC. These circuit portions may also be formed on another substrate connected by a cable or the like.
 The control circuit 115 receives a clock supplied from outside the semiconductor substrate 10, data instructing an operation mode, and the like, and outputs data such as internal information of the imaging device 100. The control circuit 115 further includes a timing generator that generates various timing signals, and controls the driving of peripheral circuits such as the vertical drive circuit 111, the column signal processing circuit 112, and the horizontal drive circuit 113 on the basis of the various timing signals generated by the timing generator.
 The input/output terminal 116 exchanges signals with the outside.
(1-3. Action and effects)
 In the image pickup device 1 of the present embodiment, the light absorption layer 23 is provided between the first surface 10A of the semiconductor substrate 10, in which the light receiving units 11 are embedded, and the plurality of wiring layers 24, 25, and 26 in the multilayer wiring layer 20 provided on the first surface 10A side of the semiconductor substrate 10. The light absorption layer 23 is formed of a material having an absorption coefficient larger than that of the semiconductor substrate 10 for wavelengths of, for example, 750 nm or more. This prevents light that passes through the semiconductor substrate 10 without being absorbed in the light receiving unit 11 and enters the multilayer wiring layer 20 from being reflected by the plurality of wiring layers 24, 25, and 26 and re-entering, for example, the light receiving unit 11 of an adjacent unit pixel P. This will be described below.
 In contrast to the so-called front-illuminated CIS, in which the wiring layer is arranged on the light incident side of the light receiving unit, the back-illuminated CIS described above, in which the wiring layer is arranged on the side of the light receiving unit opposite to the light incident side, does not have its incident light blocked by the wiring layer including gate electrodes and the like, and can therefore suppress the decrease in sensitivity that accompanies miniaturization. In the back-illuminated CIS, however, long-wavelength light that is not absorbed in the light receiving unit and passes through to the wiring layer side may be reflected by the wiring layer (metal layer) and re-enter the light receiving unit.
 If the intensity of this reflected light is non-uniform from pixel to pixel, optical color mixing occurs between adjacent pixels. Such color mixing can be mitigated and reduced by, for example, laying out the wiring layers formed below the pixels so that they are symmetrical in units of shared pixels, which evens out the non-uniform reflection intensity between adjacent pixels.
 However, even if the layout of metal layers such as wiring is symmetrical in units of shared pixels, when the arrangement of the metal layers within a pixel is optically asymmetrical, light reflected by the metal layers may re-enter the light receiving unit of an adjacent pixel and cause optical color mixing. For example, as described above, a solid-state imaging device has been reported in which image quality is improved by periodically providing, below the light receiving unit, a first reflecting plate uniform for each pixel and a second reflecting plate between adjacent pixels; however, control of reflected light using optical interference is strongly wavelength-dependent, and it is considered difficult to obtain the desired characteristics over a wide wavelength band.
 In the present embodiment, by contrast, the light absorption layer 23, containing a material having an absorption coefficient larger than that of the semiconductor substrate 10 for wavelengths of, for example, 750 nm or more, is formed between the first surface 10A of the semiconductor substrate 10, in which a light receiving unit 11 is embedded for each unit pixel P, and the plurality of wiring layers 24, 25, and 26 provided in the multilayer wiring layer 20. As a result, light that passes through the semiconductor substrate 10 without being absorbed in the light receiving unit 11 and enters the multilayer wiring layer 20 is absorbed in the light absorption layer 23, making it possible to reduce re-entry into the light receiving units 11 of adjacent unit pixels P caused by reflection at the plurality of wiring layers 24, 25, and 26.
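The role of the absorption coefficient can be illustrated with the Beer-Lambert law, I/I0 = exp(-αd). The numbers below are illustrative assumptions, not values from this disclosure: silicon's near-infrared absorption coefficient is on the order of a few hundred per centimeter (weak, so much of the light reaches the wiring layers), while an absorber with an assumed coefficient of 1e5 /cm attenuates the light both on the way down and on the way back up after reflection.

```python
import math

def transmitted_fraction(alpha_per_cm, thickness_um, passes=1):
    """Beer-Lambert transmission I/I0 = exp(-alpha * d * passes)."""
    return math.exp(-alpha_per_cm * thickness_um * 1e-4 * passes)

# With an assumed coefficient of ~535 /cm for silicon in the
# near-infrared, a 3 um photodiode transmits most of the light
# through to the multilayer wiring layer.
si = transmitted_fraction(535, 3.0)

# A 100 nm absorbing layer with an assumed coefficient of 1e5 /cm
# attenuates the reflected light over two passes (down and back up).
absorber = transmitted_fraction(1e5, 0.1, passes=2)

print(f"through 3 um Si (NIR): {si:.2f}")            # ~0.85
print(f"double pass, 100 nm absorber: {absorber:.2f}")  # ~0.14
```

The calculation makes the design point concrete: even a thin layer suppresses the reflected component strongly, provided its absorption coefficient greatly exceeds that of the substrate.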
 As described above, in the image pickup device 1 of the present embodiment, optical color mixing into adjacent unit pixels P, caused by reflection at the plurality of wiring layers 24, 25, and 26 in the multilayer wiring layer 20 of light that enters the multilayer wiring layer 20 without being absorbed in the light receiving unit 11, is reduced. It is therefore possible to improve image quality.
 Further, in the present embodiment, light that passes through the light receiving unit 11 and enters the multilayer wiring layer 20 is absorbed in the light absorption layer 23, so the degree of freedom in the layout of the plurality of wiring layers 24, 25, and 26 formed below the light absorption layer 23 can be improved.
 Next, Modifications 1 to 6 of the present disclosure will be described. In the following, components similar to those of the above embodiment are denoted by the same reference numerals, and their description is omitted as appropriate.
<2. Modifications>
(2-1. Modification 1)
 FIG. 4A shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1A) according to Modification 1 of the present disclosure. FIG. 4B schematically shows an example of the planar shape of the light absorption layer 23 in the image pickup device 1A shown in FIG. 4A. FIG. 4A shows a cross section taken along the line I-I shown in FIG. 4B. In the above embodiment, an example in which a uniform light absorption layer 23 is provided for each unit pixel P was shown, but the present disclosure is not limited to this. The planar shape of the light absorption layer 23 need only, for example, have symmetry with respect to the optical center C within the unit pixel P; the light absorption layer 23 of the image pickup device 1A of this modification differs from the above embodiment in that an opening 23H is formed in the vicinity of the optical center C.
 When a uniform light absorption layer 23 is provided within the unit pixel P as in the above embodiment, light that passes through the semiconductor substrate 10 and enters the multilayer wiring layer 20 is absorbed in the light absorption layer 23 as described above. Although optical color mixing into adjacent unit pixels P is thus reduced, re-entry into the pixel's own light receiving unit is also reduced, so the sensitivity may decrease.
 In this modification, by contrast, a light absorption layer having an opening 23H near the optical center of the unit pixel P is provided between the first surface 10A of the semiconductor substrate 10 and the plurality of wiring layers 24, 25, and 26 in the multilayer wiring layer 20. As a result, light that passes through the semiconductor substrate 10 and enters the multilayer wiring layer 20 near the optical center of the unit pixel P is reflected by the underlying wiring layers 24, 25, and 26 and re-enters the light receiving unit 11 of its own pixel through the opening 23H. It is therefore possible to reduce optical color mixing into adjacent unit pixels P and improve image quality while suppressing the decrease in sensitivity.
(2-2. Modification 2)
 FIG. 5 shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1B) according to Modification 2 of the present disclosure. The image pickup device 1B of this modification differs from the above embodiment in that, among the plurality of wiring layers 24, 25, and 26, an uneven shape is formed on the surface of, for example, the uppermost wiring layer 24, that is, the one formed directly below the light absorption layer 23.
 The effect of suppressing light reflection at the plurality of wiring layers 24, 25, and 26 obtained by forming the light absorption layer 23 depends on the absorption coefficient of the light absorbing material constituting the light absorption layer 23 and on the film thickness of the light absorption layer 23.
 In this modification, by contrast, an uneven shape is formed on the surface of, for example, the wiring layer 24. As a result, light transmitted through the light absorption layer 23 is diffusely reflected at the surface of the wiring layer 24, extending the in-layer optical path length of the light that re-enters the light absorption layer 23 from the wiring layer 24 side. Compared with the above embodiment, optical color mixing into adjacent unit pixels P is therefore further reduced, and image quality can be further improved.
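The path-length effect of the roughened surface can be illustrated numerically: light scattered back at an angle θ from the normal traverses the absorbing layer over a path d/cos θ instead of d, so its return transmission drops. The coefficient and thickness below are assumed for illustration only.

```python
import math

def double_pass_transmission(alpha_per_cm, thickness_um, theta_deg):
    """Transmission through the absorbing layer going down at normal
    incidence and coming back up at scattering angle theta_deg,
    per the Beer-Lambert law."""
    d = thickness_um * 1e-4  # layer thickness in cm
    down = math.exp(-alpha_per_cm * d)
    up = math.exp(-alpha_per_cm * d / math.cos(math.radians(theta_deg)))
    return down * up

# Assumed 1e5 /cm absorber, 100 nm thick: the more oblique the
# diffuse reflection, the less light survives the return trip.
for theta in (0, 30, 60):
    print(theta, round(double_pass_transmission(1e5, 0.1, theta), 3))
```

The monotonic drop with angle is the point of the modification: scattering alone does not absorb light, but it lengthens the path through the absorber, which does.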
(2-3. Modification 3)
 FIG. 6 shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1C) according to Modification 3 of the present disclosure. The image pickup device 1C of this modification differs from the above embodiment in that a light absorption layer 23 is laminated on each of the plurality of wiring layers 24, 25, and 26.
 In this modification, as described above, a light absorption layer 23 is laminated on each of the plurality of wiring layers 24, 25, and 26. Specifically, a light absorption layer 23A is formed on the wiring layer 24, a light absorption layer 23B on the wiring layer 25, and a light absorption layer 23C on the wiring layer 26.
 Thus, while the above embodiment and other examples show the light absorption layer 23 and the wiring layers 24, 25, and 26 formed apart from each other with the interlayer insulating layer 27 in between, the light absorption layer 23 may also be laminated directly on the wiring layers 24, 25, and 26 as in this modification.
(2-4. Modification 4)
 FIG. 7 shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1D) according to Modification 4 of the present disclosure. The image pickup device 1D of this modification differs from the above embodiment in that a pixel separation unit 15 is provided between adjacent unit pixels P, and a light absorption layer 16 having the same configuration as the light absorption layer 23 in the above embodiment is provided on its surface.
 The pixel separation unit 15 optically and electrically separates adjacent unit pixels P from each other, and extends between adjacent light receiving units 11 from the second surface 10B of the semiconductor substrate 10 toward the first surface 10A. The pixel separation unit 15 is, for example, formed integrally with the light-shielding film 12 provided on the second surface 10B side of the semiconductor substrate 10 and, like the light-shielding film 12, is formed of a metal film of, for example, tungsten (W). A protective layer 13, for example, extends between the pixel separation unit 15 and the semiconductor substrate 10, electrically separating the semiconductor substrate 10 from the pixel separation unit 15.
 The light absorption layer 16 is provided on the surface of the pixel separation unit 15, specifically on the surface facing the semiconductor substrate 10. This reduces the intensity of light that, for example, enters from the second surface 10B of the semiconductor substrate 10 and leaks into adjacent unit pixels P through reflection at the pixel separation unit 15 and the wiring layers 24, 25, and 26. This light absorption layer 16 corresponds to a specific example of the "second light absorption layer" of the present disclosure.
 Thus, in this modification, the pixel separation unit 15 separating adjacent light receiving units 11 is provided between adjacent unit pixels P, and the light absorption layer 16 is further provided on the surface of the pixel separation unit 15 facing the semiconductor substrate 10. Compared with the above embodiment, optical color mixing into adjacent unit pixels P is thereby further reduced, and image quality can be further improved.
 Although FIG. 7 shows an example in which the end of the pixel separation unit 15 lies within the semiconductor substrate 10, the pixel separation unit 15 may also completely separate the semiconductor substrate 10 from its second surface 10B to its first surface 10A, as in the image pickup device 1E (see FIG. 8) described next.
(2-5. Modification 5)
 FIG. 8 shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1E) according to Modification 5 of the present disclosure. The image pickup device 1E of this modification realizes a so-called global shutter type back-illuminated CIS in which, for example, a charge holding unit 17 that temporarily stores the charge generated in the light receiving unit 11 is provided within the unit pixel P.
 The global shutter method is basically a method of performing global exposure, in which exposure starts simultaneously for all pixels and ends simultaneously for all pixels. Here, "all pixels" means all the pixels of the portion appearing in the image; dummy pixels and the like are excluded. The global shutter method also includes a method of moving the region subjected to global exposure while performing global exposure in units of multiple rows (for example, several tens of rows) rather than simultaneously for all pixels, provided that the time difference and image distortion are small enough not to pose a problem. The global shutter method further includes a method of performing global exposure not on all the pixels appearing in the image but on the pixels of a predetermined region.
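The difference between the global exposure described above and a row-sequential (rolling) exposure can be sketched as per-row exposure time windows. The function and its time units are an illustrative model only, not part of this disclosure.

```python
def exposure_windows(n_rows, exposure, row_delay, mode):
    """Return (start, end) exposure times for each row.
    mode "global":  all rows expose over the same window;
    mode "rolling": each row starts row_delay after the previous one,
                    producing the skew a global shutter avoids."""
    windows = []
    for r in range(n_rows):
        start = 0 if mode == "global" else r * row_delay
        windows.append((start, start + exposure))
    return windows

print(exposure_windows(3, 10, 2, "global"))   # [(0, 10), (0, 10), (0, 10)]
print(exposure_windows(3, 10, 2, "rolling"))  # [(0, 10), (2, 12), (4, 14)]
```

In the global case all rows capture the same moment, which is why the charge of each pixel must then be parked somewhere (the charge holding unit 17 below) until its row is read out.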
 As described above, the charge holding unit (MEM) 17 temporarily holds the charge generated in the light receiving unit 11 until that charge is transferred to the floating diffusion FD. The charge holding unit 17 is, for example, embedded in the first surface 10A side of the semiconductor substrate 10 in the same manner as the light receiving unit 11. In a charge-holding global shutter, stray light reaching the charge holding unit 17 becomes a false signal (parasitic light sensitivity); therefore, a light-shielding portion continuous with, for example, the light-shielding film 12 is formed on the second surface 10B of the semiconductor substrate 10 above the charge holding unit 17 and between the light receiving unit 11 and the charge holding unit 17. A gap for transferring the charge generated in the light receiving unit 11 to the charge holding unit 17 is formed between the end of the light-shielding portion (light-shielding film 12) within the semiconductor substrate 10 and the first surface 10A of the semiconductor substrate 10.
 Thus, by combining the present technology with a global shutter type back-illuminated CIS, it is possible to prevent light that passes through the semiconductor substrate 10 and is reflected by the plurality of wiring layers 24, 25, and 26 from re-entering the charge holding unit 17. This makes it possible to reduce the generation of false signals while improving the image quality of a global shutter type back-illuminated CIS.
 As with the image pickup device 1D of Modification 4 above, the image pickup device 1E of this modification may also be provided with a light absorption layer 16 on each of the surfaces of the pixel separation unit 15 and of the light-shielding film 12 between the light receiving unit 11 and the charge holding unit 17.
(2-6. Modification 6)
 FIG. 9 shows an example of the cross-sectional configuration of an image pickup device (image pickup device 1F) according to Modification 6 of the present disclosure. The image pickup device 1F of this modification differs from the above embodiment in that pixel transistors such as the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL constituting the pixel circuit are provided on a substrate (semiconductor substrate 30) different from the semiconductor substrate 10, and the light absorption layer 23 is provided between the semiconductor substrate 10 and the semiconductor substrate 30.
 Although not shown, the first surface 10A of the semiconductor substrate 10 in this modification is provided with the floating diffusion FD and the transfer transistor TR constituting the pixel circuit. In the multilayer wiring layer 20, the light absorption layer 23 is provided together with, for example, the gate electrode of the transfer transistor TR, for example as a layer common to a plurality of unit pixels P.
 The semiconductor substrate 30 has a first surface (front surface) 30A and a second surface (back surface) 30B facing each other, and pixel transistors such as the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL are provided on the first surface 30A. This semiconductor substrate 30 corresponds to a specific example of the "second semiconductor substrate" of the present disclosure.
 On the first surface 30A of the semiconductor substrate 30, a multilayer wiring layer 40 is provided, comprising, within an interlayer insulating layer 45, a gate wiring layer 41 including, for example, the gate electrodes of the amplification transistor AMP, the reset transistor RST, and the selection transistor SEL, and a plurality of wiring layers 42, 43, and 44. The semiconductor substrate 10 and the semiconductor substrate 30, specifically the floating diffusion FD provided on the first surface 10A of the semiconductor substrate 10 and the gate electrode of the amplification transistor AMP provided on the first surface 30A of the semiconductor substrate 30, are electrically connected via, for example, a through wiring 46.
 An insulating layer 28, for example, is provided on the second surface 30B of the semiconductor substrate 30. The semiconductor substrate 10 and the semiconductor substrate 30 are bonded together with the first surface 10A of the semiconductor substrate 10 facing the second surface 30B of the semiconductor substrate 30, using as the bonding surface S3 the upper surface of the interlayer insulating layer 27 formed on the first surface 10A of the semiconductor substrate 10 and the upper surface of the insulating layer 28.
 Thus, in this modification, the amplification transistor AMP, the reset transistor RST, the selection transistor SEL, and the like constituting the pixel circuit are provided on the semiconductor substrate 30, and the light absorption layer 23 is provided between the semiconductor substrate 30 and the semiconductor substrate 10, in which the light receiving units 11 are embedded. In addition to the effects of the above embodiment, this modification therefore makes it possible to improve area efficiency.
<3. Application example>
 FIG. 10 shows an example of a schematic configuration of an imaging system 2 including an image pickup device (for example, the image pickup device 1) according to the above embodiment and Modifications 1 to 6.
 The imaging system 2 is, for example, an imaging apparatus such as a digital still camera or a video camera, or an electronic device such as a mobile terminal device, e.g., a smartphone or a tablet terminal. The imaging system 2 includes, for example, the image pickup device 1, an optical system 241, a shutter device 242, a DSP circuit 243, a frame memory 244, a display unit 245, a storage unit 246, an operation unit 247, and a power supply unit 248. In the imaging system 2, the image pickup device 1, the DSP circuit 243, the frame memory 244, the display unit 245, the storage unit 246, the operation unit 247, and the power supply unit 248 are connected to one another via a bus line 249.
 The image pickup device 1 outputs image data corresponding to the incident light. The optical system 241 includes one or more lenses, and guides light (incident light) from a subject to the image pickup device 1 to form an image on its light receiving surface. The shutter device 242 is arranged between the optical system 241 and the image pickup device 1, and controls the light irradiation period and the light-shielding period of the image pickup device 1 under the control of a drive circuit. The DSP circuit 243 is a signal processing circuit that processes the signal (image data) output from the image pickup device 1. The frame memory 244 temporarily holds the image data processed by the DSP circuit 243 in units of frames. The display unit 245 is composed of a panel-type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays moving images or still images captured by the image pickup device 1. The storage unit 246 records the image data of the moving images or still images captured by the image pickup device 1 on a recording medium such as a semiconductor memory or a hard disk. The operation unit 247 issues operation commands for the various functions of the imaging system 2 in accordance with user operations. The power supply unit 248 appropriately supplies the various kinds of power that serve as operating power for the image pickup device 1, the DSP circuit 243, the frame memory 244, the display unit 245, the storage unit 246, and the operation unit 247 to these supply targets.
 Next, the imaging procedure in the imaging system 2 will be described.
 FIG. 11 shows an example of a flowchart of the imaging operation in the imaging system 2. The user instructs the start of imaging by operating the operation unit 247 (step S101). The operation unit 247 then transmits an imaging command to the image sensor 1 (step S102). Upon receiving the imaging command, the image sensor 1 (specifically, its system control circuit) executes imaging by a predetermined imaging method (step S103).
 The image sensor 1 outputs the image data obtained by the imaging to the DSP circuit 243. Here, the image data is the data for all pixels of the pixel signals generated based on the electric charge temporarily held in the floating diffusion FD. The DSP circuit 243 performs predetermined signal processing (for example, noise reduction) on the image data input from the image sensor 1 (step S104). The DSP circuit 243 causes the frame memory 244 to hold the processed image data, and the frame memory 244 causes the storage unit 246 to store the image data (step S105). Imaging in the imaging system 2 is performed in this way.
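The sequence of steps S101 to S105 can be sketched as a short Python illustration. This is a hedged sketch only: the class and function names (ImageSensor, DSPCircuit, run_imaging) and the toy data are hypothetical stand-ins for the hardware blocks, not names or values from the disclosure.

```python
# Minimal sketch of the imaging sequence S101-S105 (hypothetical names).
class ImageSensor:
    def capture(self):
        # S103: pixel signals for all pixels, generated from the charge
        # temporarily held in the floating diffusion FD
        return [[10, 12], [11, 13]]  # toy 2x2 "image data"

class DSPCircuit:
    def process(self, image):
        # S104: predetermined signal processing (noise reduction etc.);
        # here just a trivial clamp to the 8-bit range as a placeholder
        return [[min(v, 255) for v in row] for row in image]

def run_imaging(sensor, dsp, frame_memory, storage):
    # S101/S102: the operation unit has sent an imaging command
    image = sensor.capture()          # S103
    processed = dsp.process(image)    # S104
    frame_memory.append(processed)    # S105: hold in frame memory...
    storage.append(frame_memory[-1])  # ...then record to the storage unit
    return processed

frame_memory, storage = [], []
result = run_imaging(ImageSensor(), DSPCircuit(), frame_memory, storage)
```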
 In this application example, the image sensors 1 and 1A to 1F according to the above embodiment and Modifications 1 to 6 are applied to the imaging system 2. Since the image sensor 1 can thus be miniaturized or given higher definition, a compact or high-definition imaging system 2 can be provided.
 <4. Application example>
(Example of application to mobile bodies)
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any kind of mobile body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
 FIG. 12 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
 The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 12, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
 The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
 The vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, and characters on the road surface, or distance detection processing.
 The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of light received. The imaging unit 12031 can output the electric signal as an image or as ranging information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
 The vehicle interior information detection unit 12040 detects information inside the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
 The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following travel based on the inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, lane departure warning, and the like.
 The microcomputer 12051 can also perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
 The microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
 The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the vehicle occupants or the outside of the vehicle of information. In the example of FIG. 12, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
 FIG. 13 is a diagram showing an example of the installation positions of the imaging unit 12031.
 In FIG. 13, the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
 The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images of the area ahead of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the areas to the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images of the area behind the vehicle 12100. The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 FIG. 13 also shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
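The superimposition of the four imaging ranges into an overhead composite can be illustrated very roughly as follows. A real system warps each camera image with an inverse perspective mapping before blending; this hedged sketch skips the warp and simply pastes each (assumed pre-warped) view into its region of a toy top-view canvas, with all sizes and placements chosen arbitrarily for illustration.

```python
def paste(canvas, tile, top, left_col):
    """Overwrite a rectangular region of the canvas with a camera view."""
    for r, row in enumerate(tile):
        for c, v in enumerate(row):
            canvas[top + r][left_col + c] = v

H, W = 6, 6
canvas = [[0] * W for _ in range(H)]   # 0 = not covered / vehicle body

front = [[1] * W] * 2   # imaging range 12111 (front nose camera 12101)
rear  = [[4] * W] * 2   # imaging range 12114 (rear bumper/back door 12104)
left  = [[2], [2]]      # imaging range 12112 (left side mirror 12102)
right = [[3], [3]]      # imaging range 12113 (right side mirror 12103)

paste(canvas, front, 0, 0)       # area ahead of the vehicle
paste(canvas, rear, H - 2, 0)    # area behind the vehicle
paste(canvas, left, 2, 0)        # area to the left of the vehicle
paste(canvas, right, 2, W - 1)   # area to the right of the vehicle
# The untouched center cells correspond to the vehicle 12100 itself.
```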
 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or an imaging element having pixels for phase difference detection.
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), thereby extracting as the preceding vehicle, in particular, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more). Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be maintained from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
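The preceding-vehicle selection described above (the closest on-path object traveling in roughly the same direction at 0 km/h or more) can be sketched as follows. The track structure, its field names, and the two-sample relative-speed estimate are illustrative assumptions, not part of the disclosure.

```python
# Hedged sketch: each track holds two distance samples (meters) taken dt
# seconds apart for one detected three-dimensional object.

def relative_speed(d_prev, d_now, dt):
    """m/s; positive when the object is pulling away from the host vehicle."""
    return (d_now - d_prev) / dt

def extract_preceding_vehicle(tracks, host_speed_kmh, dt=0.1,
                              min_speed_kmh=0.0):
    """Pick the closest on-path object moving in roughly the same direction
    at or above min_speed_kmh (the '0 km/h or more' criterion)."""
    best = None
    for t in tracks:
        if not t["on_path"]:
            continue
        rel_mps = relative_speed(t["d_prev"], t["d_now"], dt)
        obj_speed_kmh = host_speed_kmh + rel_mps * 3.6  # m/s -> km/h
        if obj_speed_kmh >= min_speed_kmh:
            if best is None or t["d_now"] < best["d_now"]:
                best = t
    return best

tracks = [
    {"id": 1, "on_path": True,  "d_prev": 30.2, "d_now": 30.0},  # car ahead
    {"id": 2, "on_path": True,  "d_prev": 12.0, "d_now": 10.0},  # oncoming
    {"id": 3, "on_path": False, "d_prev": 8.0,  "d_now": 8.0},   # off path
]
leader = extract_preceding_vehicle(tracks, host_speed_kmh=50.0)
```

The oncoming track is rejected because its implied own speed is negative, i.e. it is not traveling in the same direction as the host vehicle.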
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines the collision risk, which indicates the degree of danger of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
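The collision-risk decision can be sketched with a simple time-to-collision (TTC) based score. The risk formula and the warning/braking thresholds below are illustrative assumptions; a production system would use a far richer model.

```python
# Hedged sketch of a collision-risk decision: map TTC to a 0..1 risk score
# and trigger assistance when the score reaches a set value.

def time_to_collision(distance_m, closing_speed_mps):
    if closing_speed_mps <= 0:        # not closing: no collision course
        return float("inf")
    return distance_m / closing_speed_mps

def collision_risk(distance_m, closing_speed_mps):
    """0..1 score; higher means a more imminent collision."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    return 1.0 / (1.0 + ttc)

def assistance_action(risk, warn_level=0.2, brake_level=0.4):
    if risk >= brake_level:
        return "forced_deceleration"  # via drive system control unit 12010
    if risk >= warn_level:
        return "warn_driver"          # via speaker 12061 / display 12062
    return "none"

# Obstacle 20 m ahead, closing at 10 m/s -> TTC = 2 s.
action = assistance_action(collision_risk(20.0, 10.0))
```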
 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on the series of feature points indicating the contour of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
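The two-step recognition described above (feature-point extraction followed by pattern matching against a contour template) can be sketched on a toy infrared image. The boundary-point extractor, the two-point template, and the tolerance are all illustrative assumptions, not the disclosure's actual algorithm.

```python
# Hedged sketch: (1) extract feature points as the boundary pixels of
# bright (warm) regions, (2) match the point sequence against a stored
# contour template by mean offset after aligning the first points.

def extract_feature_points(image, threshold=128):
    """Return (row, col) points on the boundary of above-threshold pixels."""
    pts = []
    h, w = len(image), len(image[0])
    for r in range(h):
        for c in range(w):
            if image[r][c] < threshold:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= rr < h and 0 <= cc < w)
                   or image[rr][cc] < threshold for rr, cc in neighbors):
                pts.append((r, c))
    return pts

def matches_pedestrian(points, template, tol=1.0):
    """Crude pattern match: same point count, small mean residual."""
    if len(points) != len(template):
        return False
    dr = points[0][0] - template[0][0]
    dc = points[0][1] - template[0][1]
    err = sum(abs(p[0] - (t[0] + dr)) + abs(p[1] - (t[1] + dc))
              for p, t in zip(points, template)) / len(points)
    return err <= tol

image = [
    [0, 0, 0],
    [0, 200, 0],   # a warm vertical blob in a cold background
    [0, 200, 0],
    [0, 0, 0],
]
PEDESTRIAN_TEMPLATE = [(0, 0), (1, 0)]   # toy 2-point contour template
recognized = matches_pedestrian(extract_feature_points(image),
                                PEDESTRIAN_TEMPLATE)
```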
 An example of a mobile body control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031. Specifically, the imaging device 100 can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, a high-definition captured image with little noise can be obtained, so that highly accurate control using the captured image can be performed in the mobile body control system.
(Example of application to an endoscopic surgery system)
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
 FIG. 14 is a diagram showing an example of the schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
 FIG. 14 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11153 using the endoscopic surgery system 11000. As shown in the figure, the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
 The endoscope 11100 is composed of a lens barrel 11101, a region of which extending a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
 An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100, and the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens. The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
 An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU: Camera Control Unit) 11201.
 The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and a display device 11202. Furthermore, the CCU 11201 receives the image signal from the camera head 11102 and performs various kinds of image processing on the image signal for displaying an image based on the image signal, such as development processing (demosaic processing).
 Under the control of the CCU 11201, the display device 11202 displays an image based on the image signal on which image processing has been performed by the CCU 11201.
 The light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies the endoscope 11100 with irradiation light for imaging the surgical site or the like.
 The input device 11204 is an input interface for the endoscopic surgery system 11000. Via the input device 11204, the user can input various kinds of information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
 The treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for tissue cauterization, incision, blood vessel sealing, and the like. The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and securing the operator's working space. The recorder 11207 is a device capable of recording various kinds of information related to the surgery. The printer 11208 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, and graphs.
 The light source device 11203, which supplies the endoscope 11100 with irradiation light for imaging the surgical site, can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof. When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so that the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner by irradiating the observation target with the laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the imaging element of the camera head 11102 in synchronization with the irradiation timing. According to this method, a color image can be obtained without providing a color filter on the imaging element.
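The time-division capture described above can be sketched as follows: three monochrome frames, each captured while one of the R, G, B lasers illuminates the target, are stacked into a single color frame, so no color filter is needed. The frame contents are toy data for illustration.

```python
# Hedged sketch: merge three monochrome frames, captured in synchronization
# with sequential R, G, B laser illumination, into one color frame.

def merge_time_division(frame_r, frame_g, frame_b):
    """Stack co-sited monochrome samples into (R, G, B) pixel tuples."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)
    ]

frame_r = [[200, 10], [10, 10]]   # captured while the R laser is on
frame_g = [[10, 200], [10, 10]]   # captured while the G laser is on
frame_b = [[10, 10], [200, 10]]   # captured while the B laser is on
color = merge_time_division(frame_r, frame_g, frame_b)
```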
 The driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining the images, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
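The time-division high-dynamic-range synthesis can be sketched per pixel: shadow detail comes from the frame captured at higher light intensity, highlight detail from the frame captured at lower intensity. The 4x intensity ratio, the saturation level, and the selection rule are illustrative assumptions.

```python
# Hedged sketch of HDR synthesis from two frames captured under different
# illumination intensities.

SATURATION = 255

def merge_hdr(frame_low, frame_high, gain_high=4.0):
    """Per pixel: prefer the high-intensity frame (better shadow detail)
    unless it is saturated, in which case fall back to the low-intensity
    frame, keeping everything on the low-intensity radiometric scale."""
    out = []
    for row_low, row_high in zip(frame_low, frame_high):
        out_row = []
        for lo, hi in zip(row_low, row_high):
            if hi < SATURATION:
                out_row.append(hi / gain_high)  # linearize to the low scale
            else:
                out_row.append(float(lo))       # highlights from low frame
        out.append(out_row)
    return out

frame_low  = [[10.0, 250.0]]   # shadow detail is noisy/lost here
frame_high = [[40.0, 255.0]]   # 4x intensity; the bright pixel clips
hdr = merge_hdr(frame_low, frame_high)
```

The merged frame keeps the clean shadow value from the bright exposure and the unclipped highlight value from the dim one.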
 The light source device 11203 may also be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation. In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as blood vessels in the surface layer of the mucous membrane is imaged with high contrast by irradiating light in a narrower band than the irradiation light used in normal observation (that is, white light), utilizing the wavelength dependence of light absorption in body tissue. Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiation with excitation light. In fluorescence observation, it is possible, for example, to irradiate body tissue with excitation light and observe the fluorescence from the body tissue (autofluorescence observation), or to locally inject a reagent such as indocyanine green (ICG) into body tissue and irradiate the body tissue with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 11203 can be configured to be able to supply narrow-band light and/or excitation light corresponding to such special light observation.
 FIG. 15 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG. 14.
 The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
 The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. The observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is configured by combining a plurality of lenses, including a zoom lens and a focus lens.
 The imaging unit 11402 is composed of an imaging element. The imaging unit 11402 may be composed of one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type). When the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements, and a color image may be obtained by combining them. Alternatively, the imaging unit 11402 may be configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye, respectively, for 3D (dimensional) display. The 3D display enables the operator 11131 to grasp the depth of the biological tissue in the surgical site more accurately. When the imaging unit 11402 is of the multi-plate type, a plurality of lens units 11401 may be provided corresponding to the respective imaging elements.
 The imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately behind the objective lens.
 The drive unit 11403 is composed of an actuator and, under the control of the camera head control unit 11405, moves the zoom lens and the focus lens of the lens unit 11401 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 11402 can thereby be adjusted as appropriate.
 The communication unit 11404 includes a communication device for transmitting and receiving various kinds of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
 The communication unit 11404 also receives, from the CCU 11201, a control signal for controlling driving of the camera head 11102, and supplies it to the camera head control unit 11405. The control signal includes information on imaging conditions, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
 The imaging conditions such as the frame rate, exposure value, magnification, and focus described above may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 on the basis of the acquired image signal. In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
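The patent does not describe how the CCU derives the exposure value from the image signal, but the kind of feedback an AE function applies can be caricatured as a damped proportional step toward a target brightness. Everything below (function name, target, gain) is an invented illustration, not the disclosed algorithm:

```python
def auto_exposure_step(mean_brightness: float, exposure: float,
                       target: float = 0.5, gain: float = 0.5) -> float:
    """Proportional auto-exposure: scale the exposure toward a target brightness.

    mean_brightness and target are normalized to [0, 1]; a frame darker than
    the target raises the exposure, a brighter one lowers it.
    """
    if mean_brightness <= 0:
        return exposure * 2.0  # completely dark frame: open up aggressively
    error = target / mean_brightness   # ratio form keeps exposure positive
    return exposure * (error ** gain)  # gain < 1 damps the correction

exp = 1.0
for measured in (0.125, 0.3, 0.45):    # simulated mean brightness per frame
    exp = auto_exposure_step(measured, exp)
print(round(exp, 3))
```

The damping (gain below 1) is what keeps such a loop from oscillating when the scene brightness changes between frames.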
 The camera head control unit 11405 controls driving of the camera head 11102 on the basis of the control signal received from the CCU 11201 via the communication unit 11404.
 The communication unit 11411 includes a communication device for transmitting and receiving various kinds of information to and from the camera head 11102. The communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
 The communication unit 11411 also transmits, to the camera head 11102, a control signal for controlling driving of the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
 The image processing unit 11412 performs various kinds of image processing on the image signal, which is RAW data transmitted from the camera head 11102.
 The control unit 11413 performs various kinds of control relating to imaging of the surgical site and the like by the endoscope 11100 and to display of the captured image obtained by such imaging. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
 The control unit 11413 also causes the display device 11202 to display a captured image showing the surgical site and the like, on the basis of the image signal subjected to image processing by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the edge shape, color, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific biological parts, bleeding, mist during use of the energy treatment tool 11112, and so on. When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition result to superimpose various kinds of surgery support information on the image of the surgical site. Superimposing the surgery support information and presenting it to the operator 11131 can reduce the burden on the operator 11131 and allow the operator 11131 to proceed with the surgery reliably.
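The edge detection mentioned above can be illustrated with a simple gradient threshold. This toy sketch (the function name and threshold are invented for illustration, and real systems use far more elaborate recognition) merely flags pixels where intensity changes sharply, the raw material for finding tool outlines:

```python
import numpy as np

def edge_mask(image: np.ndarray, threshold: float) -> np.ndarray:
    """Return a boolean mask of pixels whose gradient magnitude exceeds threshold.

    Uses numpy's finite-difference gradient estimate.
    """
    img = image.astype(float)
    gy, gx = np.gradient(img)
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

# A dark frame with a bright square "instrument": edges appear at its border.
frame = np.zeros((8, 8))
frame[2:6, 2:6] = 1.0
mask = edge_mask(frame, threshold=0.25)
print(mask.sum())  # number of edge pixels found along the square's border
```

A recognizer would then classify connected groups of such edge pixels (together with color cues, as the text notes) as forceps, tissue boundaries, bleeding, or mist.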
 The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
 In the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
 An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be suitably applied to the imaging unit 11402 provided in the camera head 11102 of the endoscope 11100. Applying the technology according to the present disclosure to the imaging unit 11402 makes it possible to miniaturize the imaging unit 11402 or to increase its definition, so that a compact or high-definition endoscope 11100 can be provided.
 The present disclosure has been described above with reference to the embodiment, Modifications 1 to 6, the application examples, and the practical examples; however, the present disclosure is not limited to the embodiment and the like described above, and various modifications are possible. For example, an optical member such as a color filter may be provided below the on-chip lens 14, for example for each unit pixel P: for instance, a red filter that transmits light in the red wavelength range, a green filter that transmits light in the green wavelength range, and a blue filter that transmits light in the blue wavelength range, arranged in a regular color arrangement (for example, a Bayer arrangement) within the pixel section 100A.
 The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
 The present disclosure may also have the following configurations. According to the present technology having the following configurations, a first light absorption layer, formed using a material having an absorption coefficient larger than that of the first semiconductor substrate for wavelengths of 750 nm or more, is provided between the second surface of a first semiconductor substrate having a light receiving section for each pixel (the first surface being the light incident surface and the second surface its opposite side) and the plurality of wiring layers constituting the multilayer wiring layer provided on the second surface side. This reduces re-entry into adjacent pixels, through reflection at the wiring layers in the multilayer wiring layer, of light that has passed through the first semiconductor substrate without being absorbed in the light receiving section, making it possible to improve image quality.
(1)
 An imaging element including:
 a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, the first semiconductor substrate having, for each pixel, a light receiving section that generates, by photoelectric conversion, a charge corresponding to the amount of received light;
 a multilayer wiring layer provided on the second surface side of the first semiconductor substrate, in which a plurality of wiring layers are stacked with interlayer insulating layers in between; and
 a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers, the first light absorption layer being formed using a material having an absorption coefficient larger than the absorption coefficient of the first semiconductor substrate for wavelengths of 750 nm or more.
(2)
 The imaging element according to (1), in which the first light absorption layer is provided for each of the pixels.
(3)
 The imaging element according to (1) or (2), in which the first light absorption layer is formed symmetrically with respect to the optical center of the pixel.
(4)
 The imaging element according to any one of (1) to (3), in which the first light absorption layer has an opening at the optical center of the pixel.
(5)
 The imaging element according to any one of (1) to (4), in which the first light absorption layer is provided as a layer common to a plurality of the pixels.
(6)
 The imaging element according to any one of (1) to (5), in which, among the plurality of wiring layers, at least the wiring layer directly below the first light absorption layer has an uneven surface.
(7)
 The imaging element according to any one of (1) to (6), in which the plurality of wiring layers have a stacked structure with the first light absorption layer.
(8)
 The imaging element according to any one of (1) to (7), in which the first semiconductor substrate is a silicon substrate.
(9)
 The imaging element according to any one of (1) to (8), in which the first light absorption layer contains a metal oxide or a chalcopyrite-based compound.
(10)
 The imaging element according to any one of (1) to (9), in which the first light absorption layer is formed using tungsten oxide, molybdenum oxide, copper indium selenide, or copper indium gallium selenide.
(11)
 The imaging element according to any one of (1) to (10), further including:
 a pixel separation section that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface; and
 a second light absorption layer formed on a surface of the pixel separation section facing the first semiconductor substrate.
(12)
 The imaging element according to any one of (1) to (11), further including a charge holding section provided for each pixel in the first semiconductor substrate, the charge holding section accumulating the charge generated in the light receiving section.
(13)
 The imaging element according to (12), in which the light receiving section and the charge holding section are arranged side by side in a planar direction in the first semiconductor substrate, the imaging element further including a light-shielding section extending from the first surface toward the second surface between the light receiving section and the charge holding section.
(14)
 The imaging element according to (13), further including a pixel separation section that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface, in which the pixel separation section and the light-shielding section are formed continuously on the first surface side.
(15)
 The imaging element according to any one of (1) to (14), further including a second semiconductor substrate provided with pixel transistors constituting a pixel circuit that outputs a pixel signal based on the charge output from the pixel, in which the second semiconductor substrate is disposed on the second surface side of the first semiconductor substrate with the first light absorption layer in between.
(16)
 An imaging device having an imaging element, the imaging element including:
 a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, the first semiconductor substrate having, for each pixel, a light receiving section that generates, by photoelectric conversion, a charge corresponding to the amount of received light;
 a multilayer wiring layer provided on the second surface side of the first semiconductor substrate, in which a plurality of wiring layers are stacked with interlayer insulating layers in between; and
 a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers, the first light absorption layer being formed using a material having an absorption coefficient larger than the absorption coefficient of the first semiconductor substrate for wavelengths of 750 nm or more.
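The effect attributed to the first light absorption layer above follows from the Beer-Lambert law, I = I0 * exp(-alpha * d): silicon absorbs near-infrared light (wavelengths of 750 nm and above) only weakly, so a thin layer with a much larger absorption coefficient placed before the wiring layers can remove the transmitted light before it reflects back. The coefficients below are illustrative orders of magnitude only, not values from the patent:

```python
import math

def transmitted_fraction(alpha_per_cm: float, thickness_um: float) -> float:
    """Beer-Lambert transmission I/I0 = exp(-alpha * d) through a layer."""
    d_cm = thickness_um * 1e-4  # micrometres -> centimetres
    return math.exp(-alpha_per_cm * d_cm)

# Illustrative values: silicon absorbs ~850 nm light weakly (alpha on the
# order of 1e3 /cm), while a strongly absorbing layer might reach ~1e5 /cm.
through_si = transmitted_fraction(1e3, thickness_um=3.0)        # thin Si layer
through_absorber = transmitted_fraction(1e5, thickness_um=0.3)  # absorber

print(f"through 3 um of Si:       {through_si:.2f}")
print(f"through 0.3 um absorber:  {through_absorber:.2f}")
```

Under these illustrative numbers, a few microns of silicon still passes most of the near-infrared light, while a sub-micron layer with a hundredfold larger absorption coefficient removes roughly 95% of it, which is why such a layer suppresses reflection-induced re-entry into adjacent pixels.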
 This application claims priority based on Japanese Patent Application No. 2020-075689 filed with the Japan Patent Office on April 21, 2020, the entire contents of which are incorporated herein by reference.
 It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they fall within the scope of the appended claims or their equivalents.

Claims (16)

  1.  An imaging element comprising:
     a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, the first semiconductor substrate having, for each pixel, a light receiving section that generates, by photoelectric conversion, a charge corresponding to the amount of received light;
     a multilayer wiring layer provided on the second surface side of the first semiconductor substrate, in which a plurality of wiring layers are stacked with interlayer insulating layers in between; and
     a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers, the first light absorption layer being formed using a material having an absorption coefficient larger than the absorption coefficient of the first semiconductor substrate for wavelengths of 750 nm or more.
  2.  The imaging element according to claim 1, wherein the first light absorption layer is provided for each of the pixels.
  3.  The imaging element according to claim 1, wherein the first light absorption layer is formed symmetrically with respect to the optical center of the pixel.
  4.  The imaging element according to claim 1, wherein the first light absorption layer has an opening at the optical center of the pixel.
  5.  The imaging element according to claim 1, wherein the first light absorption layer is provided as a layer common to a plurality of the pixels.
  6.  The imaging element according to claim 1, wherein, among the plurality of wiring layers, at least the wiring layer directly below the first light absorption layer has an uneven surface.
  7.  The imaging element according to claim 1, wherein the plurality of wiring layers have a stacked structure with the first light absorption layer.
  8.  The imaging element according to claim 1, wherein the first semiconductor substrate is a silicon substrate.
  9.  The imaging element according to claim 1, wherein the first light absorption layer contains a metal oxide or a chalcopyrite-based compound.
  10.  The imaging element according to claim 1, wherein the first light absorption layer is formed using tungsten oxide, molybdenum oxide, copper indium selenide, or copper indium gallium selenide.
  11.  The imaging element according to claim 1, further comprising:
     a pixel separation section that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface; and
     a second light absorption layer formed on a surface of the pixel separation section facing the first semiconductor substrate.
  12.  The imaging element according to claim 1, further comprising a charge holding section provided for each pixel in the first semiconductor substrate, the charge holding section accumulating the charge generated in the light receiving section.
  13.  The imaging element according to claim 12, wherein the light receiving section and the charge holding section are arranged side by side in a planar direction in the first semiconductor substrate, the imaging element further comprising a light-shielding section extending from the first surface toward the second surface between the light receiving section and the charge holding section.
  14.  The imaging element according to claim 13, further comprising a pixel separation section that has a light-shielding property and separates adjacent pixels from the first surface toward the second surface, wherein the pixel separation section and the light-shielding section are formed continuously on the first surface side.
  15.  The imaging element according to claim 1, further comprising a second semiconductor substrate provided with pixel transistors constituting a pixel circuit that outputs a pixel signal based on the charge output from the pixel, wherein the second semiconductor substrate is disposed on the second surface side of the first semiconductor substrate with the first light absorption layer in between.
  16.  An imaging device comprising an imaging element, the imaging element including:
     a first semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, the first semiconductor substrate having, for each pixel, a light receiving section that generates, by photoelectric conversion, a charge corresponding to the amount of received light;
     a multilayer wiring layer provided on the second surface side of the first semiconductor substrate, in which a plurality of wiring layers are stacked with interlayer insulating layers in between; and
     a first light absorption layer provided between the second surface of the first semiconductor substrate and the plurality of wiring layers, the first light absorption layer being formed using a material having an absorption coefficient larger than the absorption coefficient of the first semiconductor substrate for wavelengths of 750 nm or more.
PCT/JP2021/015278 2020-04-21 2021-04-13 Imaging element and imaging device WO2021215299A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-075689 2020-04-21
JP2020075689 2020-04-21

Publications (1)

Publication Number Publication Date
WO2021215299A1 true WO2021215299A1 (en) 2021-10-28

Family

ID=78269360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015278 WO2021215299A1 (en) 2020-04-21 2021-04-13 Imaging element and imaging device

Country Status (1)

Country Link
WO (1) WO2021215299A1 (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011077580A1 (en) * 2009-12-26 2011-06-30 キヤノン株式会社 Solid-state imaging device and imaging system
JP2012018951A (en) * 2010-07-06 2012-01-26 Sony Corp Solid state image pickup element and method of manufacturing the same, solid state image pickup device and image pickup device
JP2012175050A (en) * 2011-02-24 2012-09-10 Sony Corp Solid state image pickup device, manufacturing method of the same and electronic equipment
JP2012204562A (en) * 2011-03-25 2012-10-22 Sony Corp Solid state image pickup device, manufacturing method of the solid state image pickup device, and electronic apparatus
JP2012231032A (en) * 2011-04-26 2012-11-22 Canon Inc Solid state imaging device and imaging apparatus
JP2013038176A (en) * 2011-08-05 2013-02-21 Toshiba Information Systems (Japan) Corp Rear face irradiation type solid-state imaging element
JP2013065688A (en) * 2011-09-16 2013-04-11 Sony Corp Solid state imaging element, manufacturing method, and electronic apparatus
JP2014086702A (en) * 2012-10-26 2014-05-12 Canon Inc Solid state image pickup device, manufacturing method therefor, and camera
JP2014096540A (en) * 2012-11-12 2014-05-22 Canon Inc Solid state image pickup device, manufacturing method for the same and camera
JP2015106621A (en) * 2013-11-29 2015-06-08 ソニー株式会社 Solid-state imaging element and manufacturing method, and electronic equipment
JP2015128187A (en) * 2015-03-24 2015-07-09 ソニー株式会社 Solid state image pickup device and electronic apparatus
WO2016052249A1 (en) * 2014-10-03 2016-04-07 ソニー株式会社 Solid-state imaging element, production method, and electronic device
JP2016082133A (en) * 2014-10-20 2016-05-16 ソニー株式会社 Solid-state imaging device and electronic apparatus
US20160181294A1 (en) * 2013-07-15 2016-06-23 Galaxycore Shanghai Limited Corporation Backside illuminated image sensor and manufacturing method therefor
JP2018022076A (en) * 2016-08-04 2018-02-08 大日本印刷株式会社 Through electrode substrate and electronic apparatus
WO2019130820A1 (en) * 2017-12-26 2019-07-04 ソニーセミコンダクタソリューションズ株式会社 Imaging element and imaging device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21791600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21791600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP