WO2022130776A1 - Light detection device, light detection system, electronic apparatus, and mobile body - Google Patents

Light detection device, light detection system, electronic apparatus, and mobile body

Info

Publication number
WO2022130776A1
Authority
WO
WIPO (PCT)
Prior art keywords
photoelectric conversion
light
conversion unit
unit
photodetector
Prior art date
Application number
PCT/JP2021/038761
Other languages
English (en)
Japanese (ja)
Inventor
秀起 辻合
利彦 林
賢一 村田
瑛子 平田
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Priority to US18/256,100 (published as US20240031703A1)
Publication of WO2022130776A1


Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/1462Coatings
    • H01L27/14621Colour filter arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H01L27/14623 Optical shielding
    • H01L27/14601 Structural or functional details thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/705 Pixels for depth measurement, e.g. RGBZ

Definitions

  • The present disclosure relates to a photodetector, a photodetection system, an electronic device, and a mobile body provided with a photoelectric conversion element that performs photoelectric conversion.
  • Solid-state image sensors are required to offer improved functionality.
  • A photodetector according to an embodiment of the present disclosure includes an effective region provided with a photoelectric conversion element that detects irradiation light and performs photoelectric conversion, and a peripheral region adjacent to the effective region.
  • The photoelectric conversion element has a laminated structure including: a first photoelectric conversion unit that detects light in a first wavelength range of the irradiation light and performs photoelectric conversion; a second photoelectric conversion unit, provided so as to overlap the first photoelectric conversion unit, that detects light in a second wavelength range and performs photoelectric conversion; and a first optical filter, sandwiched between the first and second photoelectric conversion units, that transmits light in the second wavelength range more readily than light in the first wavelength range. The peripheral region is provided with a second optical filter that likewise transmits light in the second wavelength range more readily than light in the first wavelength range.
  • Thus, the peripheral region adjacent to the effective region, in which light in the first wavelength range is detected and photoelectrically converted, is also provided with a second optical filter that transmits light in the second wavelength range more readily than light in the first wavelength range. This makes it possible to prevent the first-wavelength component of unnecessary light irradiating the peripheral region from entering the second photoelectric conversion unit, either directly or through the optical filter.
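The selectivity described above can be illustrated with a toy numerical model; the transmittance values below are assumptions for illustration only and are not taken from the disclosure:

```python
# Toy model of the second optical filter in the peripheral region: it passes
# the second wavelength range (e.g. infrared) far more readily than the first
# (e.g. visible), so visible stray light is attenuated before it can reach
# the lower, infrared-sensitive photoelectric conversion unit.
# The transmittance numbers are illustrative assumptions only.
T_FIRST = 0.05   # assumed transmittance for the first wavelength range (visible)
T_SECOND = 0.90  # assumed transmittance for the second wavelength range (IR)

def transmitted(intensity: float, transmittance: float) -> float:
    """Light intensity remaining after passing through the filter."""
    return intensity * transmittance

visible_leak = transmitted(100.0, T_FIRST)   # unwanted stray light
ir_signal = transmitted(100.0, T_SECOND)     # wanted signal light
```

Even this crude model shows the intent: the filter suppresses the first-wavelength stray component by more than an order of magnitude relative to the second-wavelength signal.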
  • Vertical cross-sectional view showing an example of the schematic configuration of an image pickup element applied to the pixel portion shown in FIG. 1A.
  • Horizontal cross-sectional view showing an example of the schematic configuration of an image pickup element applied to the pixel portion shown in FIG. 1A.
  • Vertical cross-sectional view showing an example of the schematic configuration of an image pickup element applied to the pixel portion shown in FIG. 1A.
  • Horizontal cross-sectional view showing an example of the schematic configuration of an image pickup element applied to the pixel portion shown in FIG. 1A.
  • Another horizontal cross-sectional view showing an example of the schematic configuration of the image pickup element applied to the pixel portion shown in FIG. 1A.
  • Enlarged vertical cross-sectional view showing a main part of the image pickup element shown in FIG. 2A.
  • Another enlarged vertical cross-sectional view showing a main part of the image pickup element shown in FIG. 2A.
  • Vertical cross-sectional view showing an example of the schematic configuration of the peripheral portion shown in FIG. 1B.
  • Enlarged horizontal cross-sectional view showing a part of the peripheral portion shown in FIG. 3C.
  • Enlarged schematic cross-sectional view showing the through electrode shown in FIG. 2A and its periphery.
  • Enlarged schematic plan view showing the through electrode shown in FIG. 2A and its periphery.
  • Circuit diagram showing an example of a readout circuit of the organic photoelectric conversion unit shown in FIG. 2A.
  • Schematic cross-sectional view showing an example of the schematic configuration of an image pickup element as a first modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Horizontal cross-sectional view showing an example of the schematic configuration of an image pickup element as a second modification of the first embodiment, applied to the pixel portion shown in FIG. 1B.
  • Horizontal cross-sectional view showing an example of the peripheral portion of the image pickup element as the second modification of the first embodiment shown in FIG.
  • Enlarged vertical cross-sectional view showing a main part of the image pickup element shown in FIG. 8A.
  • Another enlarged vertical cross-sectional view showing a main part of the image pickup element shown in FIG. 8A.
  • Vertical cross-sectional view showing an example of the schematic configuration of the peripheral portion shown in FIG. 8B.
  • Enlarged vertical cross-sectional view showing a main part of an image pickup element as a third modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Another enlarged vertical cross-sectional view showing a main part of an image pickup element as a third modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Enlarged vertical cross-sectional view showing a main part of an image pickup element as a fourth modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Another enlarged vertical cross-sectional view showing a main part of an image pickup element as a fourth modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Enlarged vertical cross-sectional view showing a main part of an image pickup element as a fifth modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Another enlarged vertical cross-sectional view showing a main part of an image pickup element as a fifth modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Vertical cross-sectional view showing an example of the schematic configuration of the peripheral portion shown in FIG. 12B.
  • Horizontal cross-sectional view showing an example of the schematic configuration of an image pickup element as a sixth modification of the first embodiment, applied to the pixel portion shown in FIG. 1A.
  • Enlarged vertical cross-sectional view showing a main part of the image pickup element as the sixth modification of the first embodiment shown in FIG. 13.
  • Vertical cross-sectional view showing an example of the schematic configuration of the peripheral portion shown in FIG. 14A.
  • Schematic cross-sectional view showing an example of the schematic configuration of an image pickup element as a first modification of the second embodiment, applied to the peripheral portion shown in FIG. 15B.
  • Schematic cross-sectional view showing an example of the schematic configuration of an image pickup element as a second modification of the second embodiment, applied to the peripheral portion shown in FIG.
  • Schematic cross-sectional view showing an example of the schematic configuration of an image pickup element as a third modification of the second embodiment, applied to the peripheral portion shown in FIG. 15B.
  • Schematic cross-sectional view showing an example of the schematic configuration of an image pickup element as a fourth modification of the second embodiment, applied to the peripheral portion shown in FIG. 15B.
  • Schematic diagram showing an example of the overall configuration of the photodetection system according to the third embodiment of the present disclosure.
  • Schematic diagram showing an example of the circuit configuration of the photodetection system shown in FIG. 20A.
  • Block diagram showing an example of the schematic configuration of an in-vivo information acquisition system.
  • Diagram showing an example of the schematic configuration of an endoscopic surgery system.
  • Block diagram showing an example of the functional configuration of a camera head and a CCU.
  • Block diagram showing an example of the schematic configuration of a vehicle control system.
  • Explanatory diagram showing an example of the installation positions of the vehicle exterior information detection unit and the image pickup unit.
  • Explanatory diagram schematically showing another configuration example of the pixel portion and its peripheral portion shown in FIG. 1A.
  • Plan view schematically showing a configuration example of the pixel portion and the peripheral portion of a solid-state image pickup device as another first modification of the present disclosure.
  • Vertical cross-sectional view showing an example of the schematic configuration of the pixel portion and the peripheral portion shown in FIG. 28.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Schematic plan view showing, in an enlarged manner, the vicinity of the pad opening region shown in FIG. 29.
  • Vertical cross-sectional view schematically showing a configuration example of the pixel portion and the peripheral portion of a solid-state image sensor as another sixth modification of the present disclosure.
  • 1. First Embodiment: an example of a solid-state image pickup device in which an optical filter is also arranged in the peripheral portion surrounding a pixel portion provided with vertical spectroscopic image pickup elements.
  • 2. Second Embodiment: an example of an image pickup device in which a black-level reference element including two or more layers of light-shielding films is arranged in the peripheral portion surrounding a pixel portion provided with vertical spectroscopic image pickup elements.
  • 3. Third Embodiment: an example of a photodetection system including a light emitting device and a photodetector.
  • 4. Application example to electronic devices
  • 5.
  • FIG. 1A shows an overall configuration example of the solid-state image sensor 1 according to the first embodiment of the present disclosure.
  • FIG. 1B is a schematic diagram showing the pixel portion 100 and its periphery in an enlarged manner.
  • The solid-state image sensor 1 is, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • The solid-state image sensor 1 captures incident light (image light) from a subject via, for example, an optical lens system, converts the incident light imaged on the image pickup surface into an electric signal on a pixel-by-pixel basis, and outputs it as pixel signals.
  • The solid-state image sensor 1 includes, for example, a pixel portion 100 as an effective region and a peripheral portion 101 as a peripheral region adjacent to the pixel portion 100, both on a semiconductor substrate 11.
  • The peripheral portion 101 is provided, for example, so as to surround the pixel portion 100.
  • The peripheral portion 101 is provided with, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, an input/output terminal 116, and the like.
  • The solid-state image sensor 1 is a specific example corresponding to the "photodetector" of the present disclosure.
  • A plurality of pixels P are two-dimensionally arranged, for example in a matrix, in the pixel unit 100.
  • A contact region 102 is provided in a part of the peripheral portion 101 to connect the contact layer 57 (described later) and the lead-out wiring 58 (described later).
  • The pixel unit 100 includes a plurality of pixel rows, each composed of a plurality of pixels P arranged in the horizontal direction (the horizontal direction of the drawing), and a plurality of pixel columns, each composed of a plurality of pixels P arranged in the vertical direction (the vertical direction of the drawing).
  • In the pixel unit 100, one pixel drive line Lread (row selection line and reset control line) is wired for each pixel row, and one vertical signal line Lsig is wired for each pixel column.
  • The pixel drive line Lread transmits a drive signal for reading a signal from each pixel P.
  • The ends of the plurality of pixel drive lines Lread are connected to a plurality of output terminals of the vertical drive circuit 111, one corresponding to each pixel row.
  • The peripheral portion 101 is provided with an optical filter 90 (described later).
  • The vertical drive circuit 111 is composed of a shift register, an address decoder, and the like, and is a pixel drive unit that drives each pixel P in the pixel unit 100, for example in units of pixel rows.
  • The signal output from each pixel P of a pixel row selectively scanned by the vertical drive circuit 111 is supplied to the column signal processing circuit 112 through each of the vertical signal lines Lsig.
  • The column signal processing circuit 112 is composed of an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
  • The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives each horizontal selection switch of the column signal processing circuit 112 while scanning. By this selective scanning by the horizontal drive circuit 113, the signals of the pixels P transmitted through the plurality of vertical signal lines Lsig are sequentially output to the horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 11 through the horizontal signal line 121.
  • The output circuit 114 processes and outputs the signals sequentially supplied from each of the column signal processing circuits 112 via the horizontal signal line 121.
  • The output circuit 114 may, for example, perform only buffering, or may perform black level adjustment, column variation correction, various kinds of digital signal processing, and the like.
  • The circuit portion including the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 11, or may be arranged in an external control IC. Alternatively, these circuit portions may be formed on another substrate connected by a cable or the like.
  • The control circuit 115 receives a clock supplied from outside the semiconductor substrate 11, data instructing an operation mode, and the like, and outputs data such as internal information of the pixels P, which are image pickup elements.
  • The control circuit 115 further has a timing generator that generates various timing signals, and controls the driving of peripheral circuits such as the vertical drive circuit 111, the column signal processing circuit 112, and the horizontal drive circuit 113 based on the various timing signals generated by the timing generator.
  • The input/output terminal 116 exchanges signals with the outside.
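The row-by-row readout sequence described above can be sketched as a simplified software model; the array size and stand-in pixel values below are illustrative assumptions, not part of the disclosure:

```python
# Simplified model of the readout path: the vertical drive circuit 111 selects
# one pixel row at a time over the pixel drive line Lread, the column signal
# processing circuits 112 sample all columns of that row in parallel over the
# vertical signal lines Lsig, and the horizontal drive circuit 113 then scans
# the sampled values sequentially onto the horizontal signal line 121.
ROWS, COLS = 4, 4
frame = [[r * COLS + c for c in range(COLS)] for r in range(ROWS)]  # stand-in pixel values

def read_frame(pixels):
    horizontal_signal_line = []
    for row in pixels:                 # vertical drive: select each row in turn
        column_samples = list(row)     # column circuits: sample all columns in parallel
        for value in column_samples:   # horizontal drive: sequential column scan
            horizontal_signal_line.append(value)
    return horizontal_signal_line
```

The model makes the ordering explicit: an entire row is sampled at once, but the samples leave the chip one column at a time, row after row.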
  • FIG. 2A schematically shows an example of the vertical cross-sectional configuration, along the thickness direction, of pixel P1, one of the plurality of pixels P arranged in a matrix in the pixel unit 100.
  • FIG. 2B schematically shows an example of the horizontal cross-sectional configuration along the stacking-plane direction orthogonal to the thickness direction, at the height position in the Z-axis direction indicated by the arrow IIB in FIG. 2A.
  • FIG. 2C schematically shows an example of the horizontal cross-sectional configuration along the stacking-plane direction orthogonal to the thickness direction, at the height position in the Z-axis direction indicated by the arrow IIC in FIG. 2A.
  • The thickness direction (stacking direction) of the pixel P1 is the Z-axis direction, and the in-plane directions parallel to the stacking plane orthogonal to the Z-axis direction are the X-axis direction and the Y-axis direction.
  • The X-axis, Y-axis, and Z-axis directions are orthogonal to one another.
  • The pixel P1 is a so-called vertical spectroscopic image sensor, having a structure in which, for example, one photoelectric conversion unit 10 and one organic photoelectric conversion unit 20 are laminated in the Z-axis direction, which is the thickness direction.
  • The pixel P1, which is an image pickup element, is a specific example corresponding to the "photodetection element" of the present disclosure.
  • The pixel P1 further has an intermediate layer 40 provided between the photoelectric conversion unit 10 and the organic photoelectric conversion unit 20, and a multilayer wiring layer 30 provided on the side opposite to the organic photoelectric conversion unit 20 as viewed from the photoelectric conversion unit 10.
  • On the organic photoelectric conversion unit 20, a sealing film 51, a partition wall 52, a plurality of color filters 53, and a lens layer 54 including on-chip lenses (OCL) provided corresponding to the respective color filters 53 are laminated along the Z-axis direction, in this order from the position closest to the organic photoelectric conversion unit 20.
  • The partition wall 52 may be formed of a low-refractive-index material, such as SiOx, having a refractive index lower than that of the color filters 53.
  • The sealing film 51 and the partition wall 52 may each be provided in common for a plurality of pixels P.
  • The sealing film 51 has a structure in which transparent insulating films 51-1 to 51-3 of, for example, AlOx are laminated. Further, an antireflection film 55 (described later with reference to FIG. 3A and the like) may be provided so as to cover the lens layer 54.
  • The peripheral portion 101 may be provided with a black filter 56 (described later with reference to FIG. 3A and the like).
  • The plurality of color filters 53 include, for example, a color filter that mainly transmits red, a color filter that mainly transmits green, and a color filter that mainly transmits blue.
  • The pixel P1 of the present embodiment is provided with red, green, and blue color filters 53, and the organic photoelectric conversion unit 20 receives red light, green light, and blue light, respectively, to acquire a color visible-light image.
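For illustration, the per-pixel red/green/blue filter assignment can be modeled as a repeating tile. The disclosure does not fix a particular layout, so the Bayer-like 2x2 tile below is purely an assumption:

```python
# Hypothetical Bayer-like arrangement of the color filters 53 over the pixel
# array; each pixel P's organic photoelectric conversion unit 20 then receives
# the color selected by its filter. The 2x2 repeating tile is an illustrative
# assumption, not taken from the disclosure.
PATTERN = [["G", "R"],
           ["B", "G"]]

def filter_color(row: int, col: int) -> str:
    """Color filter seen by the pixel at (row, col) under the assumed tile."""
    return PATTERN[row % 2][col % 2]
```

Under this assumed tile, every 2x2 block of pixels samples one red, two green, and one blue value, from which a full-color visible image can be interpolated.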
  • The photoelectric conversion unit 10 is an indirect TOF (hereinafter referred to as iTOF) sensor that acquires a distance image (distance information) based on, for example, the time of flight of light (Time-of-Flight; TOF).
  • The photoelectric conversion unit 10 includes, for example, a semiconductor substrate 11, a photoelectric conversion region 12, a fixed charge layer 13, a pair of transfer transistors (TG) 14A and 14B, charge-voltage conversion units (FD) 15A and 15B, which are floating diffusion regions, an inter-pixel region light-shielding wall 16, and a through electrode 17.
  • The semiconductor substrate 11 is, for example, an n-type silicon (Si) substrate including a front surface 11A and a back surface 11B, and has a p-well in a predetermined region.
  • The front surface 11A faces the multilayer wiring layer 30.
  • The back surface 11B is the surface facing the intermediate layer 40, and preferably has a fine uneven structure (RIG structure) formed on it. This is effective for confining light incident on the semiconductor substrate 11 with a wavelength in the infrared region (for example, a wavelength of 880 nm or more and 1040 nm or less), as the second wavelength range, inside the semiconductor substrate 11. A similar fine uneven structure may be formed on the front surface 11A.
  • The photoelectric conversion region 12 is a photoelectric conversion element composed of, for example, a PIN (Positive Intrinsic Negative) photodiode (PD), and includes a pn junction formed in a predetermined region of the semiconductor substrate 11.
  • The photoelectric conversion region 12 detects and receives light with a wavelength in the infrared region out of the light from the subject, and generates and stores an electric charge corresponding to the amount of received light by photoelectric conversion.
  • The fixed charge layer 13 is provided so as to cover the back surface 11B of the semiconductor substrate 11.
  • The fixed charge layer 13 has, for example, a negative fixed charge in order to suppress the generation of dark current due to the interface states of the back surface 11B, which is the light-receiving surface of the semiconductor substrate 11.
  • The electric field induced by the fixed charge layer 13 forms a hole storage layer in the vicinity of the back surface 11B of the semiconductor substrate 11.
  • The hole storage layer suppresses the generation of electrons from the back surface 11B.
  • The fixed charge layer 13 also includes a portion extending in the Z-axis direction between the inter-pixel region light-shielding wall 16 and the photoelectric conversion region 12.
  • The fixed charge layer 13 is preferably formed using an insulating material.
  • Examples of the constituent materials of the fixed charge layer 13 include hafnium oxide (HfOx), aluminum oxide (AlOx), zirconium oxide (ZrOx), tantalum oxide (TaOx), titanium oxide (TiOx), and lanthanum oxide (LaOx).
  • Each of the pair of TGs 14A and 14B extends in the Z-axis direction from, for example, the front surface 11A to the photoelectric conversion region 12.
  • The TGs 14A and 14B transfer the electric charge stored in the photoelectric conversion region 12 to the pair of FDs 15A and 15B according to applied drive signals.
  • The pair of FDs 15A and 15B are floating diffusion regions that each convert the electric charge transferred from the photoelectric conversion region 12 via the TGs 14A and 14B into an electric signal (for example, a voltage signal) and output it.
  • Reset transistors (RST) 143A and 143B are connected to the FDs 15A and 15B, which are further connected to the vertical signal line Lsig (FIG. 1A) via amplification transistors (AMP) 144A and 144B and selection transistors (SEL) 145A and 145B.
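Since the photoelectric conversion unit 10 is described as an iTOF sensor with a pair of taps (TG 14A/FD 15A and TG 14B/FD 15B), the distance estimate can be sketched as below. This is a generic pulsed two-tap iTOF calculation, not the disclosure's own formula; the pulse width and charge values are illustrative assumptions.

```python
# Generic pulsed 2-tap indirect ToF sketch: tap A (FD 15A) integrates charge
# during the emitted pulse window, tap B (FD 15B) during the immediately
# following window. The fraction of charge landing in tap B encodes the
# round-trip delay of the reflected light.
C = 299_792_458.0  # speed of light in m/s

def itof_depth(q_a: float, q_b: float, pulse_width_s: float) -> float:
    total = q_a + q_b
    if total <= 0.0:
        raise ValueError("no signal charge collected")
    round_trip_delay = (q_b / total) * pulse_width_s
    return C * round_trip_delay / 2.0  # halve the round trip for one-way distance

# With a 30 ns pulse and equal charges on both taps, the charge ratio implies
# a 15 ns round trip, i.e. a target at roughly 2.25 m.
depth = itof_depth(1.0, 1.0, 30e-9)
```

Note that the charge ratio cancels out the absolute reflectance of the target, which is the practical reason two taps are integrated per pixel.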
  • FIGS. 3A and 3B are enlarged cross-sectional views each showing a main part of the pixel P1 shown in FIG. 2A.
  • FIG. 3A shows a cross section in the direction of the arrows along the IIIA-IIIA cutting line shown in FIGS. 2B and 2C, and FIG. 3B shows a cross section in the direction of the arrows along the IIIB-IIIB cutting line shown in FIGS. 2B and 2C.
  • FIG. 3C is a vertical cross-sectional view showing an example of the schematic configuration of the peripheral portion 101 shown in FIG. 1B.
  • FIG. 3D is a horizontal cross-sectional view showing a part of the peripheral portion 101 shown in FIG. 3C in an enlarged manner.
  • FIG. 3D schematically shows an example of the horizontal cross-sectional configuration at the height position in the Z-axis direction indicated by the arrow IIID in FIG. 3C.
  • FIG. 3C corresponds to a cross section in the direction of the arrows along the IIIC-IIIC cutting line shown in FIG. 3D.
  • FIG. 4A is a cross-sectional view along the Z axis showing, in an enlarged manner, the inter-pixel region light-shielding wall 16 surrounding the through electrode 17, and FIG. 4B is a cross-sectional view along the XY plane of the same inter-pixel region light-shielding wall 16 surrounding the through electrode 17.
  • FIG. 4A represents a cross section in the direction of the arrows along the IVB-IVB line shown in FIG. 4B.
  • the inter-pixel region light-shielding wall 16 is provided at a boundary portion with other adjacent pixels P in the XY plane.
  • the inter-pixel region shading wall 16 includes, for example, a portion extending along the XZ plane and a portion extending along the YZ plane, and is provided so as to surround the photoelectric conversion region 12 of each pixel P. Further, the inter-pixel region light-shielding wall 16 may be provided so as to surround the through electrode 17. As a result, oblique incident of unnecessary light on the photoelectric conversion region 12 between adjacent pixels P can be suppressed, and color mixing can be prevented.
  • the inter-pixel region light-shielding wall 16 is made of, for example, a material containing at least one of a simple substance metal having a light-shielding property, a metal alloy, a metal nitride, and a metal silicide. More specifically, as the constituent materials of the interpixel region light-shielding wall 16, Al (aluminum), Cu (copper), Co (cobalt), W (tungsten), Ti (titanium), Ta (tantal), Ni ( Examples thereof include nickel), Mo (molybdenum), Cr (chromium), Ir (iridium), platinum iridium, TiN (titanium nitride), and tungsten silicon compounds.
  • the constituent material of the interpixel region light-shielding wall 16 is not limited to the metal material, and graphite may be used for the constituent material. Further, the inter-pixel region light-shielding wall 16 is not limited to the conductive material, and may be made of a non-conductive material having light-shielding properties such as an organic material. Further, an insulating layer Z1 made of an insulating material such as SiOx (silicon oxide) or aluminum oxide may be provided between the interpixel region light-shielding wall 16 and the through electrode 17. Alternatively, the interpixel region shading wall 16 and the through electrode 17 may be insulated by providing a gap between the interpixel region shading wall 16 and the through electrode 17.
  • the insulating layer Z1 may not be provided. Further, the insulating layer Z2 may be provided outside the inter-pixel region shading wall 16, that is, between the inter-pixel region shading wall 16 and the fixed charge layer 13.
  • the insulating layer Z2 is made of an insulating material such as SiOx (silicon oxide) or aluminum oxide.
  • Alternatively, the inter-pixel region light-shielding wall 16 and the fixed charge layer 13 may be insulated from each other by providing a gap between them.
  • The insulating layer Z2 ensures electrical insulation between the inter-pixel region light-shielding wall 16 and the semiconductor substrate 11. Further, when the inter-pixel region light-shielding wall 16 is provided so as to surround the through electrode 17 and is made of a conductive material, the insulating layer Z1 ensures electrical insulation between the inter-pixel region light-shielding wall 16 and the through electrode 17.
  • The through electrode 17 is a connecting member that electrically connects the read electrode 26 of the organic photoelectric conversion unit 20, provided on the back surface 11B side of the semiconductor substrate 11, to the FD 131 and the AMP 133 provided on the front surface 11A of the semiconductor substrate 11 (see FIG. 6 described later).
  • the through electrode 17 is, for example, a transmission path for transmitting the signal charge generated in the organic photoelectric conversion unit 20 and transmitting the voltage for driving the charge storage electrode 25.
  • the through electrode 17 can be provided so as to extend in the Z-axis direction from the read electrode 26 of the organic photoelectric conversion unit 20 to the multilayer wiring layer 30 through the semiconductor substrate 11, for example.
  • the through electrode 17 can satisfactorily transfer the signal charge generated by the organic photoelectric conversion unit 20 provided on the back surface 11B side of the semiconductor substrate 11 to the front surface 11A side of the semiconductor substrate 11.
  • The through electrode 17 penetrates the inside of the inter-pixel region light-shielding wall 44 in the Z-axis direction. That is, the electrically insulating fixed charge layer 13 and the inter-pixel region light-shielding wall 44 (described later) are provided around the through electrode 17, whereby the through electrode 17 is electrically isolated from the p-well region of the semiconductor substrate 11.
  • The through electrode 17 includes a first through electrode portion 17-1 penetrating the inside of the inter-pixel region light-shielding wall 44 in the Z-axis direction and a second through electrode portion 17-2 penetrating the inside of the inter-pixel region light-shielding wall 16 in the Z-axis direction.
  • the first through electrode portion 17-1 and the second through electrode portion 17-2 are connected via, for example, a connection electrode portion 17-3.
  • The maximum in-plane dimension of the connection electrode portion 17-3 is greater than both the maximum in-plane dimension of the first through electrode portion 17-1 and that of the second through electrode portion 17-2.
  • The through electrode 17 can be formed using, for example, a silicon material doped with an impurity, such as PDAS (Phosphorus-Doped Amorphous Silicon), or one or more metal materials such as aluminum (Al), tungsten (W), titanium (Ti), cobalt (Co), platinum (Pt), palladium (Pd), copper (Cu), hafnium (Hf), and tantalum (Ta).
  • The multilayer wiring layer 30 has, for example, RST143A and 143B, AMP144A and 144B, SEL145A and 145B, and the like, which together with TG14A and 14B form a readout circuit.
  • the intermediate layer 40 may have, for example, an insulating layer 41 and an optical filter 42 embedded in the insulating layer 41.
  • The intermediate layer 40 may further have an inter-pixel region light-shielding wall 44 as a first light-shielding member that blocks light having a wavelength in the infrared region (for example, a wavelength of 880 nm or more and 1040 nm or less) as the second wavelength region.
  • The insulating layer 41 is, for example, a single-layer film made of one of inorganic insulating materials such as silicon oxide (SiOx), silicon nitride (SiNx), and silicon oxynitride (SiON), or a laminated film made of two or more of these.
  • Alternatively, the insulating layer 41 may be made of an organic insulating material such as polymethyl methacrylate (PMMA), polyvinylphenol (PVP), polyvinyl alcohol (PVA), polyimide, polycarbonate (PC), polyethylene terephthalate (PET), polystyrene, N-2(aminoethyl)-3-aminopropyltrimethoxysilane (AEAPTMS), 3-mercaptopropyltrimethoxysilane (MPTMS), tetraethoxysilane (TEOS), or octadecyltrichlorosilane (OTS).
  • The inter-pixel region light-shielding wall 44 is made of a material that mainly blocks light in the infrared region, for example, a single-layer film made of one of inorganic insulating materials such as silicon oxide (SiOx), silicon nitride (SiNx), and silicon oxynitride (SiON), or a laminated film made of two or more of these.
  • the inter-pixel region light-shielding wall 44 may be integrally formed with the insulating layer 41.
  • The inter-pixel region light-shielding wall 44 surrounds the optical filter 42 along the XY plane orthogonal to the thickness direction (Z-axis direction) so as to overlap at least a part of the optical filter 42. Like the inter-pixel region light-shielding wall 16, the inter-pixel region light-shielding wall 44 suppresses oblique incidence of unnecessary light on the photoelectric conversion region 12 between adjacent pixels P1 and prevents color mixing.
  • The optical filter 42 has a transmission band in the infrared region, where photoelectric conversion is performed in the photoelectric conversion region 12. That is, the optical filter 42 transmits light having a wavelength in the infrared region, i.e., infrared light, more easily than light having a wavelength in the visible region (for example, a wavelength of 400 nm or more and 700 nm or less) as the first wavelength region.
  • The optical filter 42 can be made of, for example, an organic material, and is designed to selectively transmit light in the infrared region while absorbing at least part of the light having wavelengths in the visible region.
  • the optical filter 42 is made of an organic material such as a phthalocyanine derivative.
  • the plurality of optical filters 42 provided in the pixel unit 100 may have substantially the same shape and substantially the same size as each other.
  • the SiN layer 45 may be provided on the back surface of the optical filter 42, that is, the surface facing the organic photoelectric conversion unit 20. Further, the SiN layer 46 may be provided on the surface of the optical filter 42, that is, the surface facing the photoelectric conversion unit 10. Further, an insulating layer 47 made of, for example, SiOx may be provided between the semiconductor substrate 11 and the SiN layer 46.
  • The organic photoelectric conversion unit 20 has, for example, a read electrode 26, a semiconductor layer 21, an organic photoelectric conversion layer 22, and an upper electrode 23 stacked in this order from the position closest to the photoelectric conversion unit 10.
  • The organic photoelectric conversion unit 20 further has an insulating layer 24 provided below the semiconductor layer 21 and a charge storage electrode 25 provided so as to face the semiconductor layer 21 via the insulating layer 24.
  • the charge storage electrode 25 and the read electrode 26 are separated from each other, and are provided, for example, in the same layer.
  • The read electrode 26 is in contact with the upper end of the through electrode 17. Further, as illustrated, the organic photoelectric conversion unit 20 is connected to the lead-out wiring 58 via the contact layer 57 in the peripheral portion 101.
  • The upper electrode 23, the organic photoelectric conversion layer 22, and the semiconductor layer 21 may each be provided in common to some of the plurality of pixels P1 (FIG. 2A) in the pixel unit 100, or in common to all of the plurality of pixels P1 in the pixel unit 100. The same applies to the modifications described below.
  • another organic layer may be provided between the organic photoelectric conversion layer 22 and the semiconductor layer 21 and between the organic photoelectric conversion layer 22 and the upper electrode 23.
  • The read electrode 26, the upper electrode 23, and the charge storage electrode 25 are each made of a light-transmissive conductive film, for example ITO (indium tin oxide).
  • Alternatively, a tin oxide (SnOx)-based material to which a dopant is added, or a zinc oxide (ZnO)-based material to which a dopant is added, may be used.
  • Examples of zinc oxide-based materials include aluminum zinc oxide (AZO) to which aluminum (Al) is added as a dopant, gallium zinc oxide (GZO) to which gallium (Ga) is added, and indium zinc oxide (IZO) to which indium (In) is added.
  • As the constituent materials of the read electrode 26, the upper electrode 23, and the charge storage electrode 25, CuI, InSbO4, ZnMgO, CuInO2, MgIn2O4, CdO, ZnSnO3, or TiO2 may also be used.
  • Alternatively, a spinel-type oxide or an oxide having the YbFe2O4 structure may be used.
  • The organic photoelectric conversion layer 22 converts light energy into electrical energy and is formed so as to contain, for example, two or more kinds of organic materials that function as a p-type semiconductor and an n-type semiconductor.
  • The p-type semiconductor relatively functions as an electron donor, and the n-type semiconductor relatively functions as an electron acceptor.
  • the organic photoelectric conversion layer 22 has a bulk heterojunction structure in the layer.
  • The bulk heterojunction structure has p/n junction interfaces formed by mixing the p-type and n-type semiconductors; excitons generated upon light absorption separate into electrons and holes at these p/n junction interfaces.
  • In addition to the p-type and n-type semiconductors, the organic photoelectric conversion layer 22 may further include a so-called dye material that photoelectrically converts light in a predetermined wavelength band while transmitting light in other wavelength bands, i.e., it may be composed of three types of materials. The p-type semiconductor, the n-type semiconductor, and the dye material preferably have mutually different absorption maximum wavelengths, which makes it possible to absorb light over a wide range of the visible region.
  • the organic photoelectric conversion layer 22 can be formed, for example, by mixing the above-mentioned various organic semiconductor materials and using a spin coating technique.
  • the organic photoelectric conversion layer 22 may be formed by using a vacuum vapor deposition method, a printing technique, or the like.
  • The semiconductor layer 21 is preferably made of a material having a large bandgap value (for example, a bandgap value of 3.0 eV or more) and a higher mobility than the material constituting the organic photoelectric conversion layer 22. Specific examples include oxide semiconductor materials such as IGZO; transition metal dichalcogenides; silicon carbide; diamond; graphene; carbon nanotubes; and organic semiconductor materials such as condensed polycyclic hydrocarbon compounds and condensed heterocyclic compounds.
  • The charge storage electrode 25 forms a kind of capacitor together with the insulating layer 24 and the semiconductor layer 21, and the charge generated in the organic photoelectric conversion layer 22 accumulates in a part of the semiconductor layer 21, for example, in the region of the semiconductor layer 21 corresponding to the charge storage electrode 25 via the insulating layer 24.
  • one charge storage electrode 25 is provided corresponding to each of one color filter 53 and one on-chip lens.
  • the charge storage electrode 25 is connected to, for example, a vertical drive circuit 111.
  • the insulating layer 24 can be formed of, for example, the same inorganic insulating material and organic insulating material as the insulating layer 41.
  • the organic photoelectric conversion unit 20 detects a part or all of the wavelengths in the visible light region. Further, it is desirable that the organic photoelectric conversion unit 20 has no sensitivity to the infrared light region.
  • the light incident from the upper electrode 23 side is absorbed by the organic photoelectric conversion layer 22.
  • The excitons (electron-hole pairs) generated thereby move to the interface between the electron donor and the electron acceptor constituting the organic photoelectric conversion layer 22, where they undergo exciton separation, that is, dissociate into electrons and holes.
  • The charges generated here, that is, electrons and holes, move to the upper electrode 23 or the semiconductor layer 21 by diffusion due to carrier concentration differences or by the internal electric field caused by the potential difference between the upper electrode 23 and the charge storage electrode 25, and are detected as a photocurrent.
  • the read electrode 26 has a positive potential and the upper electrode 23 has a negative potential.
  • the holes generated by the photoelectric conversion in the organic photoelectric conversion layer 22 move to the upper electrode 23.
  • On the other hand, the electrons generated by photoelectric conversion in the organic photoelectric conversion layer 22 are attracted to the charge storage electrode 25 and accumulate in a part of the semiconductor layer 21, for example, in the region of the semiconductor layer 21 corresponding to the charge storage electrode 25 via the insulating layer 24.
  • The charge (for example, electrons) accumulated in the region of the semiconductor layer 21 corresponding to the charge storage electrode 25 via the insulating layer 24 is read out as follows. Specifically, the potential V26 is applied to the read electrode 26, and the potential V25 is applied to the charge storage electrode 25. Here, the potential V26 is made higher than the potential V25 (V25 < V26). By doing so, the electrons accumulated in the region of the semiconductor layer 21 corresponding to the charge storage electrode 25 are transferred to the read electrode 26.
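As an illustrative sketch (not part of the patent text), the readout condition described above, in which electrons accumulated under the charge storage electrode transfer to the read electrode only when the read potential V26 exceeds the storage potential V25, can be expressed as a simple predicate. The function name and the example voltages are hypothetical:

```python
def electrons_transfer(v_storage: float, v_read: float) -> bool:
    """Electrons (negative carriers) drift toward the higher potential.

    Electrons accumulated under the charge storage electrode (potential
    V25 = v_storage) are transferred to the read electrode (potential
    V26 = v_read) only when V26 > V25, i.e. when V25 < V26 holds.
    """
    return v_read > v_storage

# During accumulation the read potential is kept at or below the storage
# potential, so no transfer occurs; raising V26 above V25 triggers readout.
accumulating = electrons_transfer(v_storage=3.0, v_read=1.0)  # False
reading_out = electrons_transfer(v_storage=1.0, v_read=3.0)   # True
```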
  • the peripheral portion 101 is provided with an optical filter 90 as a second optical filter.
  • the optical filter 90 transmits infrared light more easily than visible light.
  • the optical filter 90 may be provided in the same layer as the layer in which the optical filter 42 is provided, for example.
  • the constituent material of the optical filter 90 may be substantially the same as or different from the constituent material of the optical filter 42.
  • the optical filter 42 and the optical filter 90 may both be made of substantially the same organic material.
  • A plurality of optical filters 90 are provided in the peripheral portion 101, and each of the plurality of optical filters 90 may be surrounded, along the XY plane orthogonal to the Z-axis direction, by a peripheral region light-shielding wall 49 serving as a second light-shielding member that blocks at least infrared light. Further, the plurality of optical filters 90 provided in the peripheral portion 101 may have substantially the same shape and substantially the same size as each other.
  • The arrangement pitch WX44 (see FIG. 2B) of the inter-pixel region light-shielding walls 44 arranged in the X-axis direction may be substantially equal to the arrangement pitch WX49 (see FIG. 3D) of the peripheral region light-shielding walls 49 arranged in the X-axis direction.
  • Similarly, the arrangement pitch WY44 (see FIG. 2B) of the inter-pixel region light-shielding walls 44 arranged in the Y-axis direction may be substantially equal to the arrangement pitch WY49 (see FIG. 3D) of the peripheral region light-shielding walls 49 arranged in the Y-axis direction.
  • the arrangement pitch WX44 and the arrangement pitch WX49 may be substantially equal to the arrangement pitch WY44 and the arrangement pitch WY49.
  • Alternatively, the arrangement pitch WX44 and the arrangement pitch WX49 may be different from the arrangement pitch WY44 and the arrangement pitch WY49.
  • The planar shape of the optical filter 90, partitioned by the peripheral region light-shielding wall 49 along the XY plane, is not limited to a substantially rectangular shape; it may be a polygon other than a quadrangle, such as a hexagon, or may be, for example, circular or elliptical.
  • the peripheral portion 101 may be further provided with a light-shielding film 60 provided so as to overlap the peripheral region light-shielding wall 49 in the Z-axis direction.
  • the light-shielding film 60 is provided, for example, in the layer between the semiconductor substrate 11 and the SiN layer 46, but is not limited thereto.
  • the light-shielding film 60 can be made of a metal material such as W (tungsten).
  • FIG. 5 is a circuit diagram showing an example of a readout circuit of the photoelectric conversion unit 10 constituting the pixel P shown in FIG. 2A.
  • the readout circuit of the photoelectric conversion unit 10 has, for example, TG14A, 14B, OFG146, FD15A, 15B, RST143A, 143B, AMP144A, 144B, and SEL145A, 145B.
  • TG14A and 14B are connected between the photoelectric conversion region 12 and FD15A and 15B.
  • When a drive signal is applied to the gate electrodes of the TG14A and 14B and the TG14A and 14B become active, the transfer gates of the TG14A and 14B become conductive.
  • the signal charge converted in the photoelectric conversion region 12 is transferred to the FDs 15A and 15B via the TGs 14A and 14B.
  • OFG146 is connected between the photoelectric conversion region 12 and the power supply.
  • When a drive signal is applied to the gate electrode of the OFG146 and the OFG146 becomes active, the OFG146 becomes conductive.
  • the signal charge converted in the photoelectric conversion region 12 is discharged to the power supply via the OFG 146.
  • FD15A, 15B are connected between TG14A, 14B and AMP144A, 144B.
  • the FD15A and 15B convert the signal charge transferred by the TG14A and 14B into a voltage signal and output it to the AMP 144A and 144B.
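The charge-to-voltage conversion performed by the floating diffusions follows the usual capacitor relation V = Q / C. The sketch below is a generic illustration of that relation with hypothetical values; the capacitance and electron count are not figures from the patent:

```python
ELEMENTARY_CHARGE = 1.602e-19  # charge of one electron, in coulombs

def fd_voltage(num_electrons: int, c_fd_farads: float) -> float:
    """Voltage step on a floating diffusion after a charge transfer: V = Q / C."""
    return num_electrons * ELEMENTARY_CHARGE / c_fd_farads

# 1000 electrons transferred onto a 1 fF floating diffusion give about 160 mV.
v_signal = fd_voltage(num_electrons=1000, c_fd_farads=1e-15)
```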
  • RST143A, 143B is connected between FD15A, 15B and a power supply.
  • When a drive signal is applied to the gate electrodes of RST143A and 143B and RST143A and 143B become active, the reset gates of RST143A and 143B become conductive.
  • As a result, the potentials of FD15A and 15B are reset to the power supply level.
  • the AMP 144A and 144B have a gate electrode connected to the FD15A and 15B and a drain electrode connected to the power supply, respectively.
  • The AMP 144A and 144B are input units of a so-called source follower circuit that reads out the voltage signals held by the FD15A and 15B. That is, the source electrodes of the AMP 144A and 144B are connected to the vertical signal line Lsig via the SEL145A and 145B, respectively, forming a source follower circuit with the constant current source connected to one end of the vertical signal line Lsig.
  • the SEL145A and 145B are connected between the source electrodes of the AMP 144A and 144B and the vertical signal line Lsig, respectively.
  • When a drive signal is applied to each gate electrode of the SEL145A and 145B and the SEL145A and 145B become active, the SEL145A and 145B become conductive and the pixel P enters the selected state.
  • the read signal (pixel signal) output from the AMP 144A and 144B is output to the vertical signal line Lsig via the SEL145A and 145B.
  • the solid-state image sensor 1 irradiates a subject with an optical pulse in the infrared region, and receives the light pulse reflected from the subject in the photoelectric conversion region 12 of the photoelectric conversion unit 10.
  • a plurality of charges are generated by the incident light pulse in the infrared region.
  • the plurality of charges generated in the photoelectric conversion region 12 are alternately distributed to the FD15A and the FD15B by alternately supplying the drive signals to the pair of TGs 14A and 14B over an equal time period.
  • the charge accumulation amount in the FD15A and the charge accumulation amount in the FD15B become phase-modulated values. Since the round-trip time of the optical pulse is estimated by demodulating these, the distance between the solid-state image sensor 1 and the subject can be obtained.
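The two-tap demodulation described above can be sketched as follows. This is a textbook pulsed indirect time-of-flight calculation under idealized assumptions (rectangular light pulse, no ambient light or offsets); the function and the 20 ns pulse width are illustrative, not taken from the patent:

```python
C_LIGHT = 299_792_458.0  # speed of light in m/s

def tof_distance(q_a: float, q_b: float, pulse_width_s: float) -> float:
    """Estimate subject distance from the charges of a two-tap ToF pixel.

    q_a and q_b are the charge amounts accumulated in the two floating
    diffusions (here standing in for FD15A and FD15B) during the two
    alternating gate windows. For an ideal rectangular pulse, the
    round-trip delay is t_round = pulse_width * q_b / (q_a + q_b),
    and the distance is half the round-trip path at the speed of light.
    """
    t_round = pulse_width_s * q_b / (q_a + q_b)
    return C_LIGHT * t_round / 2.0

# Equal charges in both taps correspond to a delay of half the pulse width;
# with a 20 ns pulse this yields a distance of roughly 1.5 m.
d = tof_distance(q_a=1.0, q_b=1.0, pulse_width_s=20e-9)
```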
  • FIG. 6 is a circuit diagram showing an example of a readout circuit of the organic photoelectric conversion unit 20 constituting the pixel P1 shown in FIG. 2A.
  • the readout circuit of the organic photoelectric conversion unit 20 has, for example, FD131, RST132, AMP133, and SEL134.
  • the FD 131 is connected between the read electrode 26 and the AMP 133.
  • the FD 131 converts the signal charge transferred by the read electrode 26 into a voltage signal and outputs it to the AMP 133.
  • the RST132 is connected between the FD131 and the power supply.
  • When a drive signal is applied to the gate electrode of the RST 132 and the RST 132 becomes active, the reset gate of the RST 132 becomes conductive.
  • the potential of the FD 131 is reset to the level of the power supply.
  • the AMP 133 has a gate electrode connected to the FD 131 and a drain electrode connected to the power supply.
  • the source electrode of the AMP 133 is connected to the vertical signal line Lsig via the SEL134.
  • the SEL134 is connected between the source electrode of the AMP 133 and the vertical signal line Lsig.
  • When a drive signal is applied to the gate electrode of the SEL 134 and the SEL 134 becomes active, the SEL 134 becomes conductive and the pixel P1 enters the selected state.
  • the read signal (pixel signal) output from the AMP 133 is output to the vertical signal line Lsig via the SEL134.
  • As described above, the solid-state imaging device 1 of the present embodiment has, stacked in order from the light incident side, an organic photoelectric conversion unit 20 that detects and photoelectrically converts light having a wavelength in the visible region, an optical filter 42 having a transmission band in the infrared region, and a photoelectric conversion unit 10 that detects and photoelectrically converts light having a wavelength in the infrared region. Therefore, a visible light image composed of the red, green, and blue light signals obtained from the red pixel PR, green pixel PG, and blue pixel PB, respectively, and an infrared light image using the infrared light signals obtained from all of the plurality of pixels P can be acquired simultaneously at the same position in the XY in-plane direction. High integration in the XY in-plane direction can thus be realized.
  • Further, since the photoelectric conversion unit 10 has the pair of TG14A, 14B and FD15A, 15B, it is possible to acquire an infrared light image as a distance image containing information on the distance to the subject. Therefore, according to the solid-state image sensor 1 of the present embodiment, a high-resolution visible light image and an infrared light image having depth information can be obtained simultaneously.
  • In the present embodiment, the inter-pixel region light-shielding wall 44 surrounding the optical filter 42 is provided. This suppresses leakage light from adjacent pixels P1 and unnecessary light from the surroundings from entering the photoelectric conversion unit 10 directly or through the optical filter 42. The noise received by the photoelectric conversion unit 10 is therefore reduced, and improvements in the S/N ratio, resolution, ranging accuracy, and the like can be expected for the solid-state image sensor 1.
  • Further, in the present embodiment, the optical filter 90, which transmits infrared light more easily than visible light, is provided in the peripheral portion 101 adjacent to the pixel unit 100, which detects visible light and performs photoelectric conversion. Therefore, visible light among the unnecessary light incident on the peripheral portion 101 can be prevented from entering the photoelectric conversion unit 10 directly or through the optical filter 90. As a result, the noise received by the photoelectric conversion unit 10 can be further reduced, and improvements in the S/N ratio, resolution, ranging accuracy, and the like can be expected for the solid-state image sensor 1.
  • When the optical filter 42 and the optical filter 90 are made of an organic material, they can be formed collectively by, for example, a coating method.
  • Since the optical filters 90 are arranged so as to surround the optical filters 42 located in the pixel unit 100, the flatness of the plurality of optical filters 42 in the XY plane is improved, and the variation in thickness among the plurality of optical filters 42 is further reduced. Therefore, the variation in infrared light detection sensitivity among the pixels P1 in the pixel unit 100 is reduced, and the solid-state image sensor 1 can exhibit even better imaging performance.
  • Further, the organic photoelectric conversion unit 20 has a structure in which the read electrode 26, the semiconductor layer 21, the organic photoelectric conversion layer 22, and the upper electrode 23 are stacked in this order, and has an insulating layer 24 provided below the semiconductor layer 21 and a charge storage electrode 25 provided so as to face the semiconductor layer 21 via the insulating layer 24. Therefore, the charge generated by photoelectric conversion in the organic photoelectric conversion layer 22 can be accumulated in a part of the semiconductor layer 21, for example, in the region of the semiconductor layer 21 corresponding to the charge storage electrode 25 via the insulating layer 24.
  • In the present embodiment, a plurality of on-chip lenses, a plurality of color filters 53, and a plurality of charge storage electrodes 25 are provided so as to overlap one photoelectric conversion region 12 in the Z-axis direction. Therefore, when at least some of the plurality of color filters 53 have different colors, the difference in infrared light detection sensitivity can be reduced compared with a case where one on-chip lens, one color filter 53, one charge storage electrode 25, and one photoelectric conversion region 12 are provided at mutually corresponding positions in the Z-axis direction.
  • That is, when one on-chip lens, one color filter 53, one charge storage electrode 25, and one photoelectric conversion region 12 are provided at mutually corresponding positions in the Z-axis direction, the transmittance of infrared light through the color filter 53 differs depending on its color. The intensity of the infrared light reaching the photoelectric conversion region 12 therefore differs among, for example, the red, blue, and green pixels, and as a result the infrared light detection sensitivity varies among the plurality of pixels.
  • In contrast, in the present embodiment, the infrared light transmitted through the plurality of color filters 53 is incident on each photoelectric conversion region 12, so the difference in infrared light detection sensitivity arising among the plurality of pixels P1 can be reduced.
  • In the present embodiment, red, green, and blue color filters 53 are provided and red, green, and blue light are received, respectively, so that a color visible light image is acquired. However, a black-and-white visible light image may also be acquired without providing the color filters 53.
  • FIG. 7 schematically shows an example of a vertical cross-sectional configuration along the thickness direction of the pixel P1A as the first modification (modification 1-1) in the first embodiment.
  • the semiconductor layer 21 may not be provided.
  • the organic photoelectric conversion layer 22 is connected to the read electrode 26, and the charge storage electrode 25 is provided so as to face the organic photoelectric conversion layer 22 via the insulating layer 24. In such a configuration, the electric charge generated by the photoelectric conversion in the organic photoelectric conversion layer 22 is accumulated in the organic photoelectric conversion layer 22.
  • a kind of capacitor is formed by the organic photoelectric conversion layer 22, the insulating layer 24, and the charge storage electrode 25 during the photoelectric conversion in the organic photoelectric conversion layer 22. Therefore, for example, it is possible to remove the electric charge in the organic photoelectric conversion layer 22 at the start of exposure, that is, to completely deplete the organic photoelectric conversion layer 22. As a result, kTC noise can be reduced, so that deterioration of image quality due to random noise can be suppressed.
  • FIG. 8A schematically shows a configuration example of a horizontal cross section of the pixel P1B provided in the pixel portion 100B as the second modification (modification example 1-2) in the first embodiment, including the optical filter 42.
  • FIG. 8B is an enlarged horizontal cross-sectional view showing a part of the peripheral portion 101B adjacent to the pixel portion 100B.
  • FIGS. 9A and 9B both show an enlarged configuration example of a vertical cross section of the pixel P1B shown in FIG. 8A along the Z-axis direction.
  • FIG. 9A shows a vertical cross section, viewed in the arrow direction, along the IXA-IXA cutting line shown in FIG. 8A, and FIG. 9B shows a vertical cross section, viewed in the arrow direction, along the IXB-IXB cutting line shown in FIG. 8A.
  • FIG. 9C shows a vertical cross section, viewed in the arrow direction, of the peripheral portion 101B along the IXC-IXC cutting line shown in FIG. 8B. Note that FIG. 8B schematically represents an example of the horizontal cross-sectional configuration at the height position in the Z-axis direction indicated by the arrow VIIIB in FIG. 9C.
  • the pixel P1B as a second modification of the first embodiment further has a metal partition wall 48 provided in a gap between adjacent optical filters 42.
  • The metal partition wall 48 is embedded in the inter-pixel region light-shielding wall 44, and surrounds the optical filter 42 along the XY plane orthogonal to the thickness direction (Z-axis direction) so as to overlap at least a part of the optical filter 42.
  • Pixel P1B has substantially the same configuration as pixel P1 except that it further has a metal partition wall 48.
  • the metal partition wall 48 may be provided on the peripheral portion 101B adjacent to the pixel portion 100B.
  • The metal partition wall 48 may be embedded in, for example, the peripheral region light-shielding wall 49, and may surround the optical filter 90 along the XY plane so as to overlap at least a part of the optical filter 90 in the XY in-plane direction.
  • the metal partition wall 48 is made of a conductive material containing a metal element such as Al (aluminum), W (tungsten), or Cu (copper). Therefore, as shown in FIGS. 8A and 9B, the electrically insulating inter-pixel region light-shielding wall 44 is interposed between the through electrode 17 and the metal partition wall 48.
  • the metal partition wall 48 is provided so as to surround the optical filter 42 and the optical filter 90, respectively. This further suppresses leakage light from other adjacent pixels P1B and unnecessary light from the surroundings from entering the photoelectric conversion unit 10 directly or through the optical filter 42. As a result, the noise received by the photoelectric conversion unit 10 can be further reduced, and further improvement in the S/N ratio, resolution, distance measurement accuracy, and the like can be expected in the solid-state image sensor 1.
  • FIG. 10C is a vertical cross-sectional view showing an example of a schematic configuration in a peripheral portion 101C adjacent to a pixel portion 100C.
  • FIGS. 10A to 10C correspond to FIGS. 9A to 9C, which represent the pixel portion 100B and the peripheral portion 101B as the second modification of the first embodiment, respectively.
  • the pixel portion 100C and the peripheral portion 101C as the third modification (modification example 1-3) in the first embodiment are thinner than the pixel portion 100B and the peripheral portion 101B as the second modification.
  • In the pixel portion 100B, the insulating layer 47 covering the back surface 11B, which forms the RIG structure of the semiconductor substrate 11, is made relatively thick so that its upper surface becomes a flat surface.
  • In the pixel portion 100C, by contrast, the insulating layer 47 covering the back surface 11B is made relatively thin.
  • leakage light from other adjacent pixels P1C and unnecessary light from the surroundings can be suppressed from entering the photoelectric conversion unit 10 directly or through the optical filter 42.
  • the connection electrode portion 17-3 is provided at a position overlapping the optical filter 42 in the XY in-plane direction. Therefore, as compared with the pixel portion 100 and the peripheral portion 101, in which the connection electrode portion 17-3 is provided in a layer different from that of the optical filter 42, this arrangement contributes to reducing the overall thickness.
  • FIGS. 11A and 11B schematically show a configuration example of a vertical cross section, including the optical filter 42A, of the pixel P1D provided in the pixel portion 100D as the fourth modification (modification example 1-4) in the first embodiment.
  • FIG. 11C is a vertical cross-sectional view showing an example of a schematic configuration of the peripheral portion 101D adjacent to the pixel portion 100D. FIGS. 11A to 11C correspond to FIGS. 9A to 9C, which represent the pixel portion 100B and the peripheral portion 101B as the second modification of the first embodiment, respectively.
  • the pixel portion 100D and the peripheral portion 101D as the fourth modification use optical filters 42A and 90A, each having a multilayer structure in which, for example, a plurality of inorganic layers made of an inorganic material are laminated, instead of the optical filters 42 and 90 made of an organic material. Except for this point, the configurations of the pixel portion 100D and the peripheral portion 101D are substantially the same as the configurations of the pixel portion 100B and the peripheral portion 101B.
  • the optical filter 42A has a laminated structure in which first inorganic layers 421 having a relatively high refractive index for visible light and second inorganic layers 422 having a relatively low refractive index for visible light are alternately laminated.
  • Similarly, the optical filter 90A has a laminated structure in which first inorganic layers 901 having a relatively high refractive index for visible light and second inorganic layers 902 having a relatively low refractive index for visible light are alternately laminated.
  • the first inorganic layers 421 and 901 may be made of, for example, hydrogenated amorphous silicon (a-Si:H) or the like.
  • the second inorganic layers 422 and 902 may be made of, for example, silicon oxide (SiO). In the example shown in FIGS. 11A to 11C, the optical filter 42A has a four-layer structure in which a first inorganic layer 421A, a second inorganic layer 422A, a first inorganic layer 421B, and a second inorganic layer 422B are laminated in order on the insulating layer 47.
  • the optical filter 90A has a five-layer structure in which a first inorganic layer 901A, a second inorganic layer 902A, a first inorganic layer 901B, a second inorganic layer 902B, and a first inorganic layer 901C are laminated in order on the insulating layer 47.
  • the present disclosure is not limited to this.
  • In the optical filters 42A and 90A, multiple reflection of visible light occurs, while infrared light is transmitted without being reflected. Therefore, visible light that has passed through the organic photoelectric conversion unit 20 is reflected by the optical filters 42A and 90A and is incident on the organic photoelectric conversion unit 20 again. Accordingly, an improvement in photoelectric conversion efficiency in the organic photoelectric conversion unit 20 can be expected.
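  • An alternating high-index/low-index stack of this kind behaves as a Bragg reflector whose layer thicknesses follow the standard quarter-wave rule t = λ0/(4n). The sketch below is illustrative only: the refractive indices (a-Si:H taken as about 3.6 and SiO as about 1.45) and the 550 nm design wavelength are assumed values for the example, not figures from this disclosure.

```python
def quarter_wave_thickness(center_wavelength_nm: float, refractive_index: float) -> float:
    """Physical thickness of an optical quarter-wave layer: t = lambda0 / (4 * n)."""
    return center_wavelength_nm / (4.0 * refractive_index)

# Assumed illustrative indices at a 550 nm design wavelength (not from the disclosure).
N_HIGH = 3.6    # hydrogenated amorphous silicon (a-Si:H), high index
N_LOW = 1.45    # silicon oxide (SiO), low index

t_high = quarter_wave_thickness(550.0, N_HIGH)  # thickness of each first inorganic layer
t_low = quarter_wave_thickness(550.0, N_LOW)    # thickness of each second inorganic layer
```

Under these assumptions the high-index layers come out to roughly 38 nm and the low-index layers to roughly 95 nm; a stack of such pairs strongly reflects visible light near the design wavelength while passing longer infrared wavelengths.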
  • In the pixel portion 100D and the peripheral portion 101D as the fourth modification, since the inter-pixel region light-shielding wall 44, the peripheral region light-shielding wall 49, and the metal partition wall 48 are provided, it is possible, as with the pixel portion 100B and the peripheral portion 101B, to suppress leakage light from other adjacent pixels P1D and unnecessary light from the surroundings from entering the photoelectric conversion unit 10 directly or through the optical filter 42A.
  • FIGS. 12A and 12B schematically show a configuration example of a vertical cross section, including the optical filter 42B, of the pixel P1E provided in the pixel portion 100E as the fifth modification (modification example 1-5) in the first embodiment.
  • FIG. 12C is a vertical cross-sectional view showing an example of a schematic configuration of the peripheral portion 101E adjacent to the pixel portion 100E. FIGS. 12A to 12C correspond to FIGS. 9A to 9C, which represent the pixel portion 100B and the peripheral portion 101B as the second modification of the first embodiment, respectively.
  • In the pixel portion 100E and the peripheral portion 101E, optical filters 42B and 90B having a single-layer structure made of an inorganic material are adopted instead of the optical filters 42 and 90. Except for this point, the configurations of the pixel portion 100E and the peripheral portion 101E are substantially the same as the configurations of the pixel portion 100B and the peripheral portion 101B.
  • the optical filters 42B and 90B are made of, for example, hydrogenated amorphous silicon (a-Si:H), hydrogenated amorphous silicon-germanium (a-SiGe:H), or the like.
  • the photoelectric conversion unit 10 can detect infrared light.
  • Further, since the inter-pixel region light-shielding wall 44, the peripheral region light-shielding wall 49, and the metal partition wall 48 are provided, the pixel portion 100E and the peripheral portion 101E as the fifth modification can obtain the same effects as the pixel portion 100B and the peripheral portion 101B.
  • FIG. 13 schematically shows a configuration example of a horizontal cross section including the connection electrode portion 17-3 in the pixel P1F provided in the pixel portion 100F as the sixth modification (modification example 1-6) in the first embodiment.
  • FIG. 14A schematically shows an example of a configuration of a vertical cross section in the pixel P1F.
  • FIG. 14B is a vertical cross-sectional view showing an example of a schematic configuration of a peripheral portion 101F adjacent to the pixel portion 100F.
  • FIG. 13 corresponds to FIG. 2C showing the pixel portion 100 of the first embodiment.
  • FIGS. 14A and 14B correspond to FIGS. 3A and 3C, which represent the pixel portion 100 and the peripheral portion 101 of the first embodiment, respectively.
  • In the pixel portion 100F and the peripheral portion 101F, the connection electrode portion 17-3 and the light-shielding film 60 are provided so that their occupied areas are larger than those in the pixel portion 100 and the peripheral portion 101 of the first embodiment.
  • In the pixel portion 100, the connection electrode portion 17-3 has a dimension smaller than the gap between adjacent optical filters 42, that is, it is provided so as not to overlap the optical filters 42 in the Z-axis direction.
  • In the pixel portion 100F, by contrast, the connection electrode portion 17-3 is provided so as to extend in the X-axis direction or the Y-axis direction into the region overlapping the optical filter 42 in the Z-axis direction.
  • Similarly, in the peripheral portion 101, the light-shielding film 60 has a dimension smaller than the gap between adjacent optical filters 90, that is, it is provided so as not to overlap the optical filters 90 in the Z-axis direction.
  • In the peripheral portion 101F, the light-shielding film 60 is provided so as to extend in the X-axis direction or the Y-axis direction into the region overlapping the optical filter 90 in the Z-axis direction. Except for these points, the configurations of the pixel portion 100F and the peripheral portion 101F as the sixth modification are substantially the same as the configurations of the pixel portion 100 and the peripheral portion 101 of the first embodiment.
  • In the pixel portion 100F as the sixth modification, the connection electrode portion 17-3 spreads along the XY plane, so that the infrared light incident on the semiconductor substrate 11 can be confined inside the semiconductor substrate 11.
  • the photoelectric conversion efficiency in the photoelectric conversion region 12 can be increased.
  • In the peripheral portion 101F as the sixth modification, since the light-shielding film 60 extends along the XY plane, it is possible to block peripheral light transmitted through the peripheral region light-shielding wall 49.
  • FIG. 15A is a schematic view of a vertical cross section showing a part of the pixel portion 200 in the solid-state image sensor 2 according to the second embodiment of the present disclosure in an enlarged manner.
  • FIG. 15B is a schematic view of a vertical cross section showing a part of the peripheral portion 201 adjacent to the pixel portion 200 in an enlarged manner.
  • the peripheral portion 201 may be provided so as to surround the pixel portion 200, for example, along the XY plane.
  • the solid-state image sensor 2 has substantially the same configuration as the solid-state image sensor 1 according to the first embodiment, except that the pixel unit 200 and the peripheral unit 201 are provided in place of the pixel unit 100 and the peripheral unit 101, respectively.
  • the pixel unit 200 is a specific example corresponding to the "effective region" of the present disclosure.
  • the peripheral portion 201 is a specific example corresponding to the “peripheral region” of the present disclosure.
  • the components common to the pixel portion 100 and the peripheral portion 101 in the pixel portion 200 and the peripheral portion 201 are designated by the same reference numerals, and the description thereof will be omitted as appropriate.
  • FIG. 15A schematically shows an example of a vertical cross-sectional configuration along the Z-axis direction of one pixel P2 among the plurality of pixels P2 arranged in a matrix in the pixel unit 200.
  • the pixel P2 is a vertical spectroscopic image sensor having a structure in which, for example, one photoelectric conversion unit 10 and one organic photoelectric conversion unit 20 are laminated in the Z-axis direction, that is, in the so-called vertical direction.
  • An optical filter 42 is provided between the photoelectric conversion unit 10 and the organic photoelectric conversion unit 20.
  • the pixel P2 is a specific example corresponding to the "photodetection element" of the present disclosure.
  • FIG. 15B schematically shows an example of a vertical cross-sectional configuration along the Z-axis direction of one of a plurality of black level reference pixels BP arranged so as to surround the periphery of the pixel portion 200 in the XY plane.
  • the black level reference pixel BP has an organic photoelectric conversion unit 20B as a third photoelectric conversion unit, a photoelectric conversion unit 10B as a fourth photoelectric conversion unit, a light-shielding film 61 as a first light-shielding portion, and a light-shielding film 62 as a second light-shielding portion.
  • the black level reference pixel BP may further include an optical filter 90.
  • the black level reference pixel BP is an element that detects a black level reference value, and is a specific example corresponding to the “black level reference element” of the present disclosure.
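  • As a rough illustration of how such black level reference elements are typically used, the mean signal of the light-shielded reference pixels can be subtracted from each effective pixel to remove the dark offset. This is a generic sketch with hypothetical names, not the actual signal processing of the disclosure.

```python
def subtract_black_level(effective_pixels, black_reference_pixels):
    """Subtract the mean dark signal of shielded reference pixels (clamped at zero)."""
    black = sum(black_reference_pixels) / len(black_reference_pixels)
    return [max(p - black, 0.0) for p in effective_pixels]

# Hypothetical raw values: three effective pixels, three shielded reference pixels.
corrected = subtract_black_level([110, 120, 64], [10, 12, 8])
```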
  • Like the organic photoelectric conversion unit 20, the organic photoelectric conversion unit 20B detects light in the first wavelength range (for example, visible light) of the irradiation light L applied to the solid-state image sensor 2 and performs photoelectric conversion.
  • the photoelectric conversion unit 10B is provided so as to overlap the organic photoelectric conversion unit 20B in the Z-axis direction and, like the photoelectric conversion unit 10, detects light in the second wavelength range (for example, infrared light) of the irradiation light L applied to the solid-state image sensor 2 and performs photoelectric conversion.
  • the light-shielding film 61 and the light-shielding film 62 suppress the transmission of the irradiation light L irradiated to the solid-state image sensor 2.
  • the light-shielding film 61 is provided on the incident side of the irradiation light L when viewed from the organic photoelectric conversion unit 20B, that is, on the side opposite to the photoelectric conversion unit 10B when viewed from the organic photoelectric conversion unit 20B.
  • the light-shielding film 62 is sandwiched between the organic photoelectric conversion unit 20B and the photoelectric conversion unit 10B in the Z-axis direction.
  • the light-shielding film 61 may also serve as, for example, the contact layer 57 connected to the lead-out wiring 58.
  • At least one of the light-shielding film 61 and the light-shielding film 62 may include, for example, a metal layer made of a metal material.
  • the metal layer is made of a metal material containing at least one of, for example, Al (aluminum), W (tungsten), Ta (tantalum), TaN (tantalum nitride), Ti (titanium) and Cu (copper).
  • For example, in order to set both the attenuation of light having a wavelength of 700 nm and the attenuation of light having a wavelength of 950 nm to 120 dB, the film thickness of the light-shielding film 61 may be set to 205 nm and the film thickness of the light-shielding film 62 may be set to 35 nm. If the same attenuation were to be provided by the light-shielding film 61 alone, its thickness would need to be 240 nm.
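  • As a back-of-envelope check of these figures, if one assumes for illustration that the attenuation in dB scales linearly with the total metal thickness (a simplification; real films also have interface and material effects), the single-film thickness of 240 nm for 120 dB implies 0.5 dB per nm, and the 205 nm and 35 nm films together reach the same 120 dB:

```python
# Illustrative only: assumes attenuation (dB) is proportional to metal thickness.
TARGET_DB = 120.0
SINGLE_FILM_NM = 240.0                    # thickness if film 61 alone provided 120 dB
db_per_nm = TARGET_DB / SINGLE_FILM_NM    # 0.5 dB/nm under this assumption

film_61_nm = 205.0                        # light-shielding film 61
film_62_nm = 35.0                         # light-shielding film 62
total_db = (film_61_nm + film_62_nm) * db_per_nm
```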
  • the photoelectric conversion unit 10 and the photoelectric conversion unit 10B may be integrally provided so as to extend from the pixel unit 200 to the peripheral unit 201 in the first layer Lv1. Similarly, the organic photoelectric conversion unit 20 and the organic photoelectric conversion unit 20B may be integrally provided so as to extend from the pixel unit 200 to the peripheral unit 201 in the second layer Lv2. Further, both the optical filter 42 and the optical filter 90 may be provided in the third layer Lv3.
  • the black level reference pixels BP arranged so as to surround the periphery of the pixel portion 200 are provided with two light-shielding films, the light-shielding film 61 and the light-shielding film 62, overlapping in the Z-axis direction. Therefore, as compared with the case where only one of the light-shielding film 61 and the light-shielding film 62 is provided, each of the light-shielding film 61 and the light-shielding film 62 can be made thinner while maintaining the light-shielding performance against the irradiation light L.
  • when the photoelectric conversion unit 10 and the photoelectric conversion unit 10B are integrally provided, they can be formed collectively, so that the manufacturing process of the solid-state image sensor 2 can be simplified.
  • Similarly, when the organic photoelectric conversion unit 20 and the organic photoelectric conversion unit 20B are integrally provided, they can be formed collectively, so that the manufacturing process of the solid-state image sensor 2 can be simplified.
  • When both the optical filter 42 and the optical filter 90 are provided in the third layer Lv3, the optical filter 42 and the optical filter 90 can be formed collectively, so that the manufacturing process of the solid-state image sensor 2 can be simplified.
  • FIG. 16 schematically shows an example of a vertical cross-sectional configuration along the thickness direction of the black level reference pixel BP1 as the first modification (modification example 2-1) in the second embodiment.
  • This modification differs from the black level reference pixel BP of the second embodiment in that the light-shielding film 62 is arranged between the optical filter 90 and the photoelectric conversion unit 10B.
  • FIG. 17 schematically shows an example of a vertical cross-sectional configuration along the thickness direction of the black level reference pixel BP2 as the second modification (modification 2-2) in the second embodiment.
  • This modification differs from the black level reference pixel BP of the second embodiment in that the optical filter 90 is replaced with the light-shielding film 62.
  • FIG. 18 schematically shows an example of a vertical cross-sectional configuration along the thickness direction of the black level reference pixel BP3 as the third modification (modification example 2-3) in the second embodiment.
  • This modification differs from the black level reference pixel BP of the second embodiment in that the optical filter 90 is not provided and the light-shielding film 62 is provided as a metal film formed by, for example, a CVD method.
  • FIG. 19 schematically shows an example of a vertical cross-sectional configuration along the thickness direction of the black level reference pixel BP4 as the fourth modification (modification example 2-4) in the second embodiment.
  • This modification differs from the black level reference pixel BP of the second embodiment in that the light-shielding film 62 is divided and arranged at three locations.
  • Specifically, the light-shielding film 62 is divided into three portions, light-shielding films 62-1 to 62-3: the light-shielding film 62-1 is arranged between the wiring layer M and the optical filter 90, the light-shielding film 62-2 is arranged in the same layer as the connection electrode portion 17-3, and the light-shielding film 62-3 is provided so as to cover the back surface 11B of the semiconductor substrate 11.
  • FIG. 20A is a schematic diagram showing an example of the overall configuration of the photodetection system 301 according to the third embodiment of the present disclosure.
  • FIG. 20B is a schematic diagram showing an example of the circuit configuration of the photodetection system 301.
  • the photodetection system 301 includes a light emitting device 310 as a light source unit that emits light L2, and a photodetector 320 as a light receiving unit having a photoelectric conversion element.
  • the photodetector 320 the solid-state image sensor 1 described above can be used.
  • the light detection system 301 may further include a system control unit 330, a light source drive unit 340, a sensor control unit 350, a light source side optical system 360, and a camera side optical system 370.
  • the photodetector 320 can detect light L1 and light L2.
  • the light L1 is ambient light from the outside that has been reflected by the subject (measurement object) 300 (FIG. 20A).
  • the light L2 is light that is emitted by the light emitting device 310 and then reflected by the subject 300.
  • the light L1 is, for example, visible light, and the light L2 is, for example, infrared light.
  • the light L1 can be detected by the organic photoelectric conversion unit in the photodetector 320, and the light L2 can be detected by the photoelectric conversion unit in the photodetector 320.
  • the image information of the subject 300 can be acquired from the light L1, and the distance information between the subject 300 and the photodetection system 301 can be acquired from the light L2.
  • the photodetection system 301 can be mounted on an electronic device such as a smartphone or a moving body such as a car.
  • the light emitting device 310 can be composed of, for example, a semiconductor laser, a surface-emitting semiconductor laser, or a vertical cavity surface emitting laser (VCSEL).
  • the photoelectric conversion unit can measure the distance to the subject 300 by, for example, a time-of-flight (ToF) method based on the flight time of light.
  • As a method by which the photodetector 320 detects the light L2 emitted from the light emitting device 310, for example, a structured light method or a stereo vision method can be adopted.
  • In the structured light method, the distance between the photodetection system 301 and the subject 300 can be measured by projecting light of a predetermined pattern onto the subject 300 and analyzing the degree of distortion of the pattern.
  • In the stereo vision method, the distance between the photodetection system 301 and the subject 300 can be measured by acquiring two or more images of the subject 300 viewed from two or more different viewpoints.
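  • The ranging relations behind these methods can be sketched as follows. The formulas (direct ToF distance d = c·Δt/2 and pinhole stereo depth Z = f·B/d) are standard textbook relations, and all numeric parameters below are hypothetical examples, not values from this disclosure.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Direct time-of-flight: the light travels out and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d for disparity d between two views."""
    return focal_length_px * baseline_m / disparity_px

d_tof = tof_distance(10e-9)                # a 10 ns round trip corresponds to ~1.5 m
z_stereo = stereo_depth(800.0, 0.1, 16.0)  # hypothetical camera: f = 800 px, B = 10 cm
```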
  • the light emitting device 310 and the photodetector 320 can be synchronously controlled by the system control unit 330.
  • FIG. 21 is a block diagram showing a configuration example of an electronic device 2000 to which the present technology is applied.
  • the electronic device 2000 has a function as, for example, a camera.
  • the electronic device 2000 includes an optical unit 2001 composed of a lens group and the like, a photodetector 2002 to which the above-described solid-state image sensor 1 or the like is applied, and a DSP (Digital Signal Processor) circuit 2003, which is a camera signal processing circuit.
  • the electronic device 2000 also includes a frame memory 2004, a display unit 2005, a recording unit 2006, an operation unit 2007, and a power supply unit 2008.
  • the DSP circuit 2003, the frame memory 2004, the display unit 2005, the recording unit 2006, the operation unit 2007, and the power supply unit 2008 are connected to each other via the bus line 2009.
  • the optical unit 2001 captures incident light (image light) from the subject and forms an image on the image pickup surface of the photodetector 2002.
  • the photodetector 2002 converts the amount of incident light imaged on the imaging surface by the optical unit 2001 into an electric signal in pixel units and outputs it as a pixel signal.
  • the display unit 2005 comprises a panel-type display device such as a liquid crystal panel or an organic EL panel, and displays a moving image or a still image captured by the photodetector 2002.
  • the recording unit 2006 records a moving image or a still image captured by the optical detection device 2002 on a recording medium such as a hard disk or a semiconductor memory.
  • the operation unit 2007 issues operation commands for various functions of the electronic device 2000 in response to user operations.
  • the power supply unit 2008 appropriately supplies various power sources that serve as operating power sources for the DSP circuit 2003, the frame memory 2004, the display unit 2005, the recording unit 2006, and the operation unit 2007 to these supply targets.
  • the technique according to the present disclosure can be applied to various products.
  • the technique according to the present disclosure may be applied to, for example, an in-vivo information acquisition system using a capsule endoscope.
  • FIG. 22 is a block diagram showing an example of a schematic configuration of a patient's internal information acquisition system using a capsule endoscope to which the technique according to the present disclosure (the present technique) can be applied.
  • the internal information acquisition system 10001 is composed of a capsule endoscope 10100 and an external control device 10200.
  • the capsule endoscope 10100 is swallowed by the patient at the time of examination.
  • the capsule endoscope 10100 has an imaging function and a wireless communication function; until it is naturally excreted from the patient, it moves inside organs such as the stomach and intestine by peristaltic movement, sequentially captures images of the inside of the organs (hereinafter also referred to as internal images) at predetermined intervals, and sequentially wirelessly transmits information about the internal images to the external control device 10200 outside the body.
  • the external control device 10200 comprehensively controls the operation of the internal information acquisition system 10001. Further, the external control device 10200 receives information about the internal images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the internal images on a display device (not shown).
  • In this way, the internal information acquisition system 10001 can obtain internal images of the patient's body at any time from when the capsule endoscope 10100 is swallowed until it is excreted.
  • the capsule endoscope 10100 has a capsule-type housing 10101, in which a light source unit 10111, an image pickup unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
  • the light source unit 10111 is composed of, for example, a light source such as an LED (light emitting diode), and irradiates the imaging field of view of the imaging unit 10112 with light.
  • the image pickup unit 10112 is composed of an image pickup element and an optical system including a plurality of lenses provided in front of the image pickup element.
  • the reflected light (hereinafter referred to as observation light) of the light with which the body tissue to be observed is irradiated is collected by the optical system and made incident on the image pickup element.
  • the observation light incident on the image pickup device is photoelectrically converted, and an image signal corresponding to the observation light is generated.
  • the image signal generated by the image pickup unit 10112 is provided to the image processing unit 10113.
  • the image processing unit 10113 is composed of a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and performs various signal processing on the image signal generated by the image pickup unit 10112.
  • the image processing unit 10113 provides the signal-processed image signal to the wireless communication unit 10114 as RAW data.
  • the wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal processed by the image processing unit 10113, and transmits the image signal to the external control device 10200 via the antenna 10114A. Further, the wireless communication unit 10114 receives a control signal related to the drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 provides the control unit 10117 with a control signal received from the external control device 10200.
  • the power feeding unit 10115 is composed of an antenna coil for receiving power, a power regeneration circuit that regenerates power from the current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using the so-called non-contact charging principle.
  • the power supply unit 10116 is composed of a secondary battery and stores the electric power generated by the power feeding unit 10115.
  • In FIG. 22, in order to avoid complicating the drawing, arrows indicating the supply destinations of the electric power from the power supply unit 10116 are omitted; however, the electric power stored in the power supply unit 10116 is supplied to the light source unit 10111, the image pickup unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive them.
  • the control unit 10117 is composed of a processor such as a CPU, and appropriately controls the driving of the light source unit 10111, the image pickup unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
  • the external control device 10200 is composed of a processor such as a CPU and GPU, or a microcomputer or a control board on which a processor and a storage element such as a memory are mixedly mounted.
  • the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
  • the irradiation condition of light to the observation target in the light source unit 10111 can be changed by a control signal from the external control device 10200.
  • Similarly, the imaging conditions (for example, the frame rate, the exposure value, and the like in the imaging unit 10112) may be changed by a control signal from the external control device 10200. Further, the content of processing in the image processing unit 10113 and the conditions under which the wireless communication unit 10114 transmits the image signal may be changed by a control signal from the external control device 10200.
  • the external control device 10200 performs various image processing on the image signal transmitted from the capsule type endoscope 10100, and generates image data for displaying the captured internal image on the display device.
  • As the image processing, various kinds of signal processing can be performed, for example, development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing).
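  • As a toy illustration of the NR (noise reduction) processing mentioned above, a simple moving-average filter smooths pixel noise along one line of an image. This is a generic sketch for explanation only, not the actual processing performed by the external control device 10200.

```python
def box_blur_1d(signal, radius=1):
    """Moving-average noise reduction over a 1-D line of pixel values."""
    n = len(signal)
    out = []
    for i in range(n):
        # Average each sample with its neighbors within `radius`, clipped at the edges.
        window = signal[max(0, i - radius): min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

smoothed = box_blur_1d([0, 9, 0, 9, 0])  # isolated spikes are damped toward the mean
```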
  • the external control device 10200 controls the drive of the display device to display the captured internal image based on the generated image data.
  • the external control device 10200 may cause a recording device (not shown) to record the generated image data, or may cause a printing device (not shown) to print it out.
  • the above is an example of an in-vivo information acquisition system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to, for example, the image pickup unit 10112 among the configurations described above. This makes it possible to obtain high detection accuracy despite the small size.
  • the technique according to the present disclosure (the present technique) can be applied to various products.
  • the techniques according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 23 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
  • FIG. 23 illustrates a surgeon (doctor) 11131 performing surgery on a patient 11132 lying on a patient bed 11133, using the endoscopic surgery system 11000.
  • the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, and a support arm device 11120 that supports the endoscope 11100,
  • and a cart 11200 equipped with various devices for endoscopic surgery.
  • the endoscope 11100 is composed of a lens barrel 11101 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • a light source device 11203 is connected to the endoscope 11100, and the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
  • the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image pickup device, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to the camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing, such as development processing (demosaic processing), on the image signal for displaying an image based on the image signal.
  • the display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light for photographing the surgical site or the like to the endoscope 11100.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.) via the input device 11204.
  • the treatment tool control device 11205 controls the drive of the energy treatment tool 11112 for cauterizing, incising, sealing a blood vessel, or the like.
  • the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing the working space of the operator.
  • the recorder 11207 is a device capable of recording various information related to surgery.
  • the printer 11208 is a device capable of printing various information related to surgery in various formats such as text, images, and graphs.
  • the light source device 11203, which supplies irradiation light to the endoscope 11100 when photographing the surgical site, can be composed of a white light source made up of, for example, an LED, a laser light source, or a combination thereof.
  • When the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image.
  • Further, by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the drive of the image sensor of the camera head 11102 in synchronization with the irradiation timing, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the image pickup element.
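The time-division RGB capture described above can be illustrated with a minimal sketch. The composition step simply stacks the three monochrome frames captured under R, G, and B laser illumination into one color image; the frame sizes and values below are hypothetical:

```python
import numpy as np

def compose_color(frames):
    """Stack monochrome frames captured in successive R, G, B illumination
    time slots into one color image (no color filter needed on the sensor)."""
    return np.stack([frames["R"], frames["G"], frames["B"]], axis=-1)

h, w = 4, 6
# hypothetical frames: uniform responses under each laser color
frames = {c: np.full((h, w), v) for c, v in (("R", 0.9), ("G", 0.5), ("B", 0.1))}
color = compose_color(frames)   # shape (4, 6, 3)
```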
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the changes in light intensity to acquire images in a time-division manner and synthesizing those images, a so-called high-dynamic-range image free of blocked-up shadows and blown-out highlights can be generated.
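The high-dynamic-range synthesis described above can be sketched as follows. This is an illustrative model, not the device's actual algorithm: a frame taken under strong illumination is normalized by its illumination gain, and its saturated pixels fall back to the frame taken under weak illumination.

```python
import numpy as np

def merge_hdr(dark_frame, bright_frame, dark_gain, bright_gain, sat=0.95):
    """Merge two frames captured at different light intensities: use the
    strongly illuminated frame where it is not saturated (better shadows),
    and fall back to the weakly illuminated frame elsewhere (no blown
    highlights). Each frame is normalized by its illumination gain."""
    return np.where(bright_frame < sat,
                    bright_frame / bright_gain,
                    dark_frame / dark_gain)

scene = np.array([0.1, 0.5])             # hypothetical scene radiances
bright = np.clip(scene * 4.0, 0.0, 1.0)  # strong light: [0.4, 1.0] (clipped)
dark = np.clip(scene * 1.0, 0.0, 1.0)    # weak light: no clipping
hdr = merge_hdr(dark, bright, 1.0, 4.0)  # recovers the scene radiances
```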
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which, by utilizing the wavelength dependence of light absorption in body tissue, the surface layer of the mucous membrane is irradiated with light in a narrower band than the irradiation light used during normal observation (that is, white light), and a predetermined tissue such as a blood vessel is thereby photographed with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • In fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be capable of supplying narrowband light and / or excitation light corresponding to such special light observation.
  • FIG. 24 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU11201 shown in FIG. 23.
  • the camera head 11102 includes a lens unit 11401, an image pickup unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • CCU11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and CCU11201 are communicably connected to each other by a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image pickup element constituting the image pickup unit 11402 may be one (so-called single plate type) or a plurality (so-called multi-plate type).
  • In the case of the multi-plate type, for example, image signals corresponding to each of R, G, and B may be generated by the respective image pickup elements, and a color image may be obtained by synthesizing them.
  • the image pickup unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to the 3D (dimensional) display, respectively.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the living tissue in the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the image pickup unit 11402 does not necessarily have to be provided on the camera head 11102.
  • the image pickup unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the image pickup unit 11402 can be adjusted appropriately.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various information to and from the CCU11201.
  • the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • the control signal includes information about the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
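An AE function of the kind mentioned above can be sketched as a simple feedback loop. This toy version (the actual control law in the camera head or CCU is not specified in the text) repeatedly scales the exposure value until the mean luminance approaches a target:

```python
def auto_exposure(measure_luma, target=0.5, steps=12, gain=0.8):
    """Toy AE loop: adjust the exposure value (ev) until the measured mean
    luminance approaches the target. `gain` damps the correction so the
    loop converges smoothly instead of oscillating."""
    ev = 1.0
    for _ in range(steps):
        luma = measure_luma(ev)
        ev *= (target / max(luma, 1e-6)) ** gain
    return ev

# Hypothetical scene: luminance proportional to exposure, clipped at 1.0.
scene_brightness = 0.2
ev = auto_exposure(lambda e: min(1.0, scene_brightness * e))
# converges toward ev = target / scene_brightness = 2.5
```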
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • Image signals and control signals can be transmitted by electric communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of the captured image obtained by the imaging of the surgical site and the like. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • Further, the control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like based on the image signal processed by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques.
  • For example, the control unit 11413 can recognize a surgical tool such as forceps, a specific biological site, bleeding, mist during use of the energy treatment tool 11112, and the like by detecting the shape, color, and the like of the edges of objects included in the captured image.
  • Further, the control unit 11413 may superimpose and display various kinds of surgical support information on the image of the surgical site by using the recognition result. By superimposing and displaying the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 connecting the camera head 11102 and CCU11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
  • In the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the above is an example of an endoscopic surgery system to which the technique according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to, for example, the image pickup unit 11402 of the camera head 11102 among the configurations described above.
  • By applying the technique according to the present disclosure to the image pickup unit 11402, a clearer image of the surgical site can be obtained, so that the visibility of the surgical site for the operator is improved.
  • the technique according to the present disclosure may also be applied to other systems, for example, a microscopic surgery system.
  • the technique according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 25 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • the outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle outside information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for a person, a vehicle, an obstacle, a sign, characters on the road surface, or the like.
  • the image pickup unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the image pickup unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the vehicle interior information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether or not the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 can perform cooperative control for the purpose of automated driving in which the vehicle travels autonomously without depending on the driver's operation.
  • the microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for anti-glare purposes, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a head-up display.
  • FIG. 26 is a diagram showing an example of the installation position of the image pickup unit 12031.
  • the image pickup unit 12031 has image pickup units 12101, 12102, 12103, 12104, and 12105.
  • the image pickup units 12101, 12102, 12103, 12104, 12105 are provided at positions such as, for example, the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
  • the image pickup unit 12101 provided in the front nose and the image pickup section 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the image pickup units 12102 and 12103 provided in the side mirror mainly acquire images of the side of the vehicle 12100.
  • the image pickup unit 12104 provided in the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the image pickup unit 12105 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 26 shows an example of the shooting range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 can be obtained.
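The bird's-eye view composition mentioned above can be sketched as follows. This toy version assumes each camera image has already been ground-projected into a patch with a known offset on a top-view canvas (the offsets and pixel values below are hypothetical), and simply pastes the patches onto one canvas:

```python
import numpy as np

def birds_eye(canvas_shape, patches):
    """Compose a top-view image by pasting ground-projected patches from the
    front, rear, and side cameras onto one canvas. `patches` maps a
    (row, col) canvas offset to an already ground-projected patch; the
    perspective projection itself is assumed to have been done upstream."""
    canvas = np.zeros(canvas_shape)
    for (r, c), patch in patches.items():
        h, w = patch.shape
        # where patches overlap, keep the brighter sample
        canvas[r:r + h, c:c + w] = np.maximum(canvas[r:r + h, c:c + w], patch)
    return canvas

patches = {
    (0, 2): np.full((2, 4), 0.8),   # front camera strip
    (6, 2): np.full((2, 4), 0.6),   # rear camera strip
    (2, 0): np.full((4, 2), 0.4),   # left side strip
    (2, 6): np.full((4, 2), 0.5),   # right side strip
}
top_view = birds_eye((8, 8), patches)
```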
  • At least one of the image pickup units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera including a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
  • Based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100). Further, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance from a preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of, for example, automated driving in which the vehicle travels autonomously without relying on the driver's operation.
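The follow-up driving logic described above (relative speed from successive distance samples, plus automatic brake/acceleration control around a target inter-vehicle gap) can be sketched with toy rules. The thresholds and command names are hypothetical, not part of the disclosure:

```python
def relative_speed(d_prev, d_curr, dt):
    """Relative speed of an object from two successive distance samples
    (negative means the gap is closing)."""
    return (d_curr - d_prev) / dt

def follow_command(distance, rel_speed, target_gap):
    """Toy follow-up control: brake (toward follow-up stop) when inside the
    target inter-vehicle gap and still closing; accelerate (follow-up start)
    when the gap has opened up; otherwise hold speed."""
    if distance < target_gap and rel_speed <= 0:
        return "brake"
    if distance > target_gap and rel_speed >= 0:
        return "accelerate"
    return "hold"
```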
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data concerning three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines the collision risk indicating the degree of risk of collision with each obstacle, and, when the collision risk is equal to or higher than a set value and there is a possibility of collision, can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062 and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
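The collision-risk determination described above can be illustrated with a time-to-collision (TTC) calculation. The TTC thresholds and action names below are hypothetical stand-ins for the set values mentioned in the text:

```python
def time_to_collision(distance, closing_speed):
    """Time to collision in seconds; infinite when the gap is not closing."""
    return distance / closing_speed if closing_speed > 0 else float("inf")

def collision_response(distance, closing_speed, ttc_alarm=3.0, ttc_brake=1.5):
    """Grade the collision risk and pick a driver-support action:
    forced deceleration when collision is imminent, an alarm to the driver
    when the risk exceeds the set value, otherwise no intervention."""
    ttc = time_to_collision(distance, closing_speed)
    if ttc < ttc_brake:
        return "forced_deceleration"
    if ttc < ttc_alarm:
        return "warn_driver"
    return "none"
```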
  • At least one of the image pickup units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
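The pattern matching step of the pedestrian recognition procedure above can be sketched with a simple normalized cross-correlation template match. This is an illustrative stand-in, not the actual recognition algorithm; the template and image are toy data:

```python
import numpy as np

def match_score(window, template):
    """Normalized cross-correlation between an image window and a pedestrian
    contour template (a stand-in for the pattern matching step)."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

def find_pedestrians(image, template, threshold=0.99):
    """Slide the template over the image and keep the positions whose
    match score exceeds the threshold."""
    th, tw = template.shape
    hits = []
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            if match_score(image[r:r + th, c:c + tw], template) > threshold:
                hits.append((r, c))
    return hits

template = np.array([[1.0, 2.0], [3.0, 4.0]])   # toy contour template
image = np.zeros((6, 6))
image[2:4, 3:5] = template                      # embed one "pedestrian"
hits = find_pedestrians(image, template)
```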
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a square contour line for emphasizing the recognized pedestrian.
  • Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
  • the above is an example of a vehicle control system to which the technique according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to, for example, the image pickup unit 12031 among the configurations described above.
  • By applying the technique according to the present disclosure to the image pickup unit 12031, it is possible to obtain a captured image that is easier to see, and thus driver fatigue can be reduced.
  • the image pickup apparatus of the present disclosure may be in the form of a module in which an image pickup unit and a signal processing unit or an optical system are packaged together.
  • that is, it may include a solid-state image pickup device that converts the amount of incident light imaged on the image pickup surface via an optical lens system into an electric signal on a pixel-by-pixel basis and outputs it as a pixel signal, together with the optical lens system mounted on the solid-state image pickup device.
  • the photoelectric conversion element of the present disclosure is not limited to such an image pickup device.
  • it may be any as long as it detects light from a subject, receives light, generates an electric charge according to the amount of received light by photoelectric conversion, and accumulates it.
  • the output signal may be a signal of image information or a signal of distance measurement information.
  • In the above embodiments, the photoelectric conversion unit 10 as the second photoelectric conversion unit is an iTOF sensor, but the present disclosure is not limited to this. That is, the second photoelectric conversion unit is not limited to one that detects light having a wavelength in the infrared region, and may detect light in another wavelength region. Further, when the photoelectric conversion unit 10 is not an iTOF sensor, only one transfer transistor (TG) may be provided.
  • In the above embodiments, an image pickup element in which the photoelectric conversion unit 10 including the photoelectric conversion region 12 and the organic photoelectric conversion unit 20 including the organic photoelectric conversion layer 22 are laminated with the intermediate layer 40 interposed therebetween has been illustrated, but the present disclosure is not limited to this.
  • For example, the photoelectric conversion element of the present disclosure may have a structure in which two organic photoelectric conversion regions are laminated, or a structure in which two inorganic photoelectric conversion regions are laminated.
  • Further, in the above embodiments, the photoelectric conversion unit 10 mainly detects light of wavelengths in the infrared region and performs photoelectric conversion, while the organic photoelectric conversion unit 20 mainly detects light of wavelengths in the visible region, but the photoelectric conversion element of the present disclosure is not limited to this.
  • the wavelength ranges in which the first photoelectric conversion unit and the second photoelectric conversion unit exhibit sensitivity can be set arbitrarily.
  • constituent materials of each component of the photoelectric conversion element of the present disclosure are not limited to the materials mentioned in the above-described embodiments and the like.
  • For example, the first photoelectric conversion unit or the second photoelectric conversion unit that receives light in the visible light region and performs photoelectric conversion may include quantum dots.
  • the photodetector of the present disclosure is not limited to this.
  • For example, as in the solid-state image sensor 1 of the first embodiment, the peripheral portion 101 as a peripheral region may be arranged so as to face two sides of the pixel portion 100 as an effective region.
  • The same applies to the solid-state image sensor 2 of the second embodiment.
  • FIG. 28 is a schematic plan view showing the pixel portion 100 of the solid-state image sensor 3 and its periphery in an enlarged manner.
  • FIG. 29 is a schematic cross-sectional view of the pixel portion 100 of the solid-state image sensor 3 and its surroundings. Note that FIG. 29 shows a cross section in the direction of the arrows along the XXIX-XXIX cutting line shown in FIG. 28.
  • the solid-state image sensor 3 includes a pixel portion 100 as an effective region and a peripheral portion 101 as a peripheral region adjacent to the pixel portion 100.
  • FIG. 30 is an enlarged plan view of the vicinity of two adjacent pad openings 103K among the plurality of pad openings 103K.
  • the pad opening region 103 is provided with a pad 71 at the bottom of each of the plurality of pad openings 103K.
  • the pad 71 is a connection terminal for electrically connecting the solid-state image sensor 3 and an external device.
  • the pad 71 is provided on, for example, the multilayer wiring layer 30.
  • the pad 71 is made of a highly conductive metal material such as Al (aluminum) or an aluminum alloy.
  • a guard ring 72 is formed so as to surround the pad opening 103K.
  • In the guard ring 72, the first conductive layer 72-1, the second conductive layer 72-2, and the third conductive layer 72-3 are laminated in order in the thickness direction, so that the guard ring 72 penetrates the intermediate layer 40 in the thickness direction.
  • the pad opening region 103 is provided with a wiring layer 73. As shown in FIG. 30, a plurality of wiring layers 73 are provided so as to surround the pad opening 103K in the area inside the guard ring 72.
  • the wiring layer 73 is provided in the same layer as the wiring layer M of the pixel unit 100.
  • the wiring layer 73 includes, for example, a structure in which wiring M1, wiring M2, and wiring M3 are laminated via an insulating layer 41.
  • the wiring M1, the wiring M2, and the wiring M3 may be electrically connected to each other by a connection layer extending in the thickness direction.
  • the wiring M1, the wiring M2, and the wiring M3 are made of a highly conductive material such as ITO.
  • the wiring layer 73 is not electrically connected to the outside and is an electrically isolated dummy wiring.
  • the plurality of pad openings 103K are provided so as to penetrate the wiring layer 73 in the thickness direction. Therefore, the respective end faces M1T, M2T, and M3T of the wiring M1, the wiring M2, and the wiring M3 of the wiring layer 73 are exposed on the inner wall surface 103KS of the pad opening 103K.
  • because the wiring layer 73, whose end faces are exposed on the inner wall surface 103KS of the pad opening 103K, is provided in the peripheral portion 101, the occurrence of a step between the pixel portion 100 and the peripheral portion 101 during the manufacturing process of the solid-state image sensor 3 can be eliminated. Specifically, if the wiring layer 73 were not provided, a step could occur between the pixel portion 100 and the peripheral portion 101 when, for example, the wiring layer M is formed in the pixel portion 100. As a result, processing accuracy, such as that of pattern formation of the organic photoelectric conversion unit 20, could decrease, reducing the yield of the manufacturing process.
  • because the wiring layer 73 is provided in the intermediate layer 40 of the peripheral portion 101, the flatness of the peripheral portion 101 can be improved. As a result, the possibility of processing defects in the organic photoelectric conversion unit 20 and the like can be further reduced.
  • the layout of the pad opening area 103 is not limited to that shown in FIG.
  • the solid-state image sensor 3 can adopt, for example, the layouts shown in FIGS. 31A to 31E, respectively.
  • a plurality of wiring layers 73 are arranged so as to surround the pad opening 103K, but the present disclosure is not limited thereto.
  • one annular wiring layer 73A may be provided so as to surround the pad opening 103K.
  • the inner side surface of the annular wiring layer 73A is exposed on the inner wall surface 103KS of the pad opening 103K.
  • separately from the wiring layer 73, part of which is exposed on the inner wall surface 103KS of the pad opening 103K, a plurality of wiring layers 73B may be provided inside the guard ring 72. Further, for example, as in the pad opening region 103C shown in FIG. 31C, a plurality of wiring layers 73C may be provided outside the guard ring 72.
  • for example, as in the pad opening region 103D shown in FIG. 31D, both the annular wiring layer 73A, part of which is exposed on the inner wall surface 103KS of the pad opening 103K, and the wiring layers 73B inside the guard ring 72 may be provided. Further, for example, as in the pad opening region 103E shown in FIG. 31E, both the annular wiring layer 73A and the wiring layers 73C outside the guard ring 72 may be provided.
  • FIG. 32 is a schematic cross-sectional view of the pixel portion 100 of the solid-state image sensor 3A and its surroundings, and corresponds to FIG. 29.
  • the wiring layer 73 provided in the pad opening region 103 of the peripheral portion 101 is connected to the semiconductor substrate 11 of the photoelectric conversion unit 10 via the connection layer 74.
  • FIG. 33 is a schematic cross-sectional view of the pixel portion 100 of the solid-state image sensor 3B and its surroundings, and corresponds to FIG. 29.
  • the end portion of a part of the wiring layer 73D provided in the pad opening region 103 of the peripheral portion 101 is exposed to the edge 3BT of the solid-state image sensor 3B.
  • the recesses provided in the peripheral region may be filled with a resin material.
  • the present disclosure may also be embodied as, for example, the solid-state image sensor 4 according to another (fourth) modification shown in FIGS. 34 and 35.
  • FIG. 34 is a schematic plan view showing the pixel portion 100 of the solid-state image sensor 4 and its periphery in an enlarged manner.
  • FIG. 35 is a schematic cross-sectional view showing an enlarged peripheral portion 401 of the solid-state image sensor 4. Note that FIG. 35 corresponds to FIG. 3C of the first embodiment.
  • the configuration of the solid-state image sensor 4 is such that the black filter 56 extends to the contact region 102 to which the contact layer 57 and the lead-out wiring 58 are connected in the peripheral portion 401.
  • the configuration of the solid-state image sensor 4 is substantially the same as the configuration of the solid-state image sensor 1 except for that point.
  • the black filter 56 is formed so as to cover the contact region 102 and the contact region 104, occupying a wider region than that in which the contact layer 57 is formed.
  • the black filter 56 is a resin material containing a black pigment such as carbon black.
  • the contact area 104 is an area in which the contact layer 57 and the organic photoelectric conversion unit 20 are connected.
  • because the black filter 56 is formed over a wider range, the flatness of the surface of the peripheral portion 401 is improved.
  • the step between the surface of the pixel portion 100 and the surface of the peripheral portion 401 is alleviated, so the flatness of the base on which the color filter 53 is formed is improved. Variation in the thickness of the film that becomes the color filter 53 can therefore be reduced, and when the color filter 53 is patterned into a desired shape by etching, defects such as the film remaining as a residue in regions where it is not needed, or variation in the thickness of the patterned color filter 53, can be reduced. It is thus possible to avoid the occurrence of color unevenness.
  • the solid-state image sensor 4 may have a peripheral portion 401A (another fifth modification) having the configuration shown in FIG. 36.
  • an additional film 75 is formed so as to cover the steeper wall surfaces among the surfaces covered by the black filter 56.
  • specifically, the additional film 75 is formed between the black filter 56 and the insulating film 51-1, which is provided along the bottom surface and the side wall surface of the groove.
  • the additional film 75 is made of, for example, the same constituent material as the partition wall 52, and is formed at the same time as the partition wall 52.
  • for example, a low-temperature oxide film (LTO film) is formed by a thermal CVD method or the like so as to completely cover the pixel portion 100 and the peripheral portion 401A, and the low-temperature oxide film is then selectively etched so that the additional film 75 remains in the groove portion of the contact region 102 of the peripheral portion 401A and in a part of the groove portion of the contact region 104.
  • the additional film 75 may be formed in a stepped portion other than the contact region 102 and the contact region 104.
  • where steep wall surfaces are present, a void is likely to occur between those surfaces and the film covering them.
  • in particular, a void is likely to occur between the insulating film 51-1 and the black filter 56 covering the insulating film 51-1. By providing the additional film 75 as in the peripheral portion 401A shown in FIG. 36, the generation of voids between the insulating film 51-1 and the black filter 56 can be suppressed. As a result, the structure of the solid-state image sensor 4 can be stabilized, and cracks caused by changes in the temperature environment or by deterioration over time can be effectively prevented, further improving reliability.
  • the solid-state image sensor 4 may have a peripheral portion 401B (another sixth modification) having the configuration shown in FIG. 37.
  • the low refractive index resin 76 is formed separately from the black filter 56 so as to fill a portion having a particularly large step, for example, the groove portion of the contact region 102, and the black filter 56 may be formed so as to cover the low refractive index resin 76. In that case, it is more preferable to provide a low-temperature oxide film (LTO film) 77 between the low refractive index resin 76 and the black filter 56.
  • when the low refractive index resin 76 is selectively etched, forming a resist pattern directly on the low refractive index resin 76 may allow the resist to enter voids inside the low refractive index resin 76. By covering the upper surface of the low refractive index resin 76 with the low-temperature oxide film 77 before forming the resist pattern, entry of the resist into the low refractive index resin 76 can be avoided.
  • the height position of the upper surface of the low refractive index resin 76 is substantially the same as the height position of the upper end of the partition wall 52 of the pixel portion 100, for example.
  • in the photodetector according to one embodiment of the present disclosure, unnecessary light is cut by the second optical filter, so noise received by the second photoelectric conversion unit can be suppressed. Therefore, when this photodetector is used in, for example, an image pickup device, improvements in the S/N ratio, resolution, distance measurement accuracy, and the like can be expected. Note that the effects described in the present specification are merely examples; the effects of the present disclosure are not limited to these, and other effects may also be obtained.
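As a toy illustration of the S/N effect described above (not taken from the patent; every value below is a hypothetical assumption), out-of-band light that leaks into the second photoelectric conversion unit contributes both an offset and shot noise, so blocking it with a filter raises the attainable SNR:

```python
# Hypothetical illustration of the S/N improvement from filtering
# out-of-band (e.g. visible) light before it reaches the IR-sensing
# second photoelectric conversion unit. All values are made up.
import math

def total_noise(signal_e, stray_e, read_e):
    # Shot noise scales with the square root of collected charge;
    # stray light adds its own shot noise on top of read noise.
    return math.sqrt(signal_e + stray_e + read_e ** 2)

def snr_db(signal_e, noise_e):
    # SNR in decibels from signal electrons and RMS noise electrons.
    return 20 * math.log10(signal_e / noise_e)

ir_signal = 1000.0    # in-band (IR) photoelectrons
visible_leak = 400.0  # stray visible-light photoelectrons without a filter
read_noise = 10.0     # read noise, electrons RMS

snr_unfiltered = snr_db(ir_signal, total_noise(ir_signal, visible_leak, read_noise))
snr_filtered = snr_db(ir_signal, total_noise(ir_signal, 0.0, read_noise))
print(round(snr_unfiltered, 1), round(snr_filtered, 1))  # 28.2 29.6
```

Even in this crude model the filtered channel gains over a decibel; in practice the filter's pass band and the actual stray-light level determine the benefit.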
  • the present technology can have the following configurations. (1) A photodetection device including an effective region provided with a photodetector that detects irradiation light and performs photoelectric conversion, and a peripheral region adjacent to the effective region.
  • the photodetector includes: a first photoelectric conversion unit that detects light in a first wavelength range of the irradiation light and performs photoelectric conversion; and a second photoelectric conversion unit that is provided so as to overlap the first photoelectric conversion unit and detects light in a second wavelength range of the irradiation light to perform photoelectric conversion.
  • a plurality of the second optical filters are provided in the peripheral region. Each of the plurality of second optical filters is surrounded by a second light-shielding member that blocks light in at least the second wavelength region along a plane orthogonal to the thickness direction (4) or.
  • the photodetector according to (10) above, wherein the light-shielding film is made of a metal material.
  • the photodetector according to (11) above, wherein the light-shielding film reflects light in the first wavelength range.
  • the photodetector according to any one of (1) to (15) above, wherein the recess is filled with a resin material. (17) Equipped with a light emitting device that emits infrared light and a photodetector,
  • the photodetector is An effective region provided with a photodetector that detects visible light from the outside and the infrared light from the light emitting device. It has the effective area and the adjacent peripheral area.
  • the photodetector is A first photoelectric conversion unit that detects visible light and performs photoelectric conversion, A second photoelectric conversion unit that is provided so as to overlap with the first photoelectric conversion unit and that detects infrared light and performs photoelectric conversion.
  • a photodetection system provided with a second optical filter in the peripheral region, which allows the infrared light to pass through more easily than the visible light.
  • the photodetector is An effective area provided with a photoelectric conversion element that detects irradiation light and performs photoelectric conversion, It has the effective area and the adjacent peripheral area.
  • the photoelectric conversion element is A first photoelectric conversion unit that detects light in the first wavelength range of the irradiation light and performs photoelectric conversion, A second photoelectric conversion unit which is provided so as to overlap with the first photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • the photodetection system is equipped with a photodetection system having a light emitting device that emits irradiation light and a photodetector.
  • the photodetector is An effective region provided with a photoelectric conversion element that detects the irradiation light and performs photoelectric conversion, and It has the effective area and the adjacent peripheral area.
  • the photoelectric conversion element is A first photoelectric conversion unit that detects light in the first wavelength range of the irradiation light and performs photoelectric conversion, A second photoelectric conversion unit which is provided so as to overlap with the first photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • the photodetector is A first photoelectric conversion unit that detects light in the first wavelength range of the irradiation light and performs photoelectric conversion, A second photoelectric conversion unit which is provided so as to overlap with the first photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • An optical filter sandwiched between the first photoelectric conversion unit and the second photoelectric conversion unit, in which light in the second wavelength region is more easily transmitted than light in the first wavelength region.
  • the black level reference element is A third photoelectric conversion unit that detects light in the first wavelength region of the irradiation light and performs photoelectric conversion, A fourth photoelectric conversion unit which is provided so as to overlap with the third photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • a first light-shielding unit provided on the side opposite to the fourth photoelectric conversion unit when viewed from the third photoelectric conversion unit and suppressing the transmission of the irradiation light, and a light-shielding unit.
  • a photodetector including a second light-shielding unit that is sandwiched between the third photoelectric conversion unit and the fourth photoelectric conversion unit and suppresses the transmission of the irradiation light.
  • the metal material contains at least one of Al (aluminum), W (tungsten), Ta (tantalum), TaN (tantalum nitride), Ti (titanium), and Cu (copper).
  • the first photoelectric conversion unit and the third photoelectric conversion unit are integrally provided so as to extend from the effective region to the peripheral region in the first layer.
  • the second photoelectric conversion unit and the fourth photoelectric conversion unit are integrally provided so as to extend from the effective region to the peripheral region in the second layer; the photodetector according to any one of (20) to (23) above.
  • (25) Equipped with a light emitting device that emits infrared light and a photodetector,
  • the photodetector is An effective region provided with a photodetector that detects visible light from the outside and the infrared light from the light emitting device. It is provided with a peripheral area adjacent to the effective area and provided with a black level reference element for detecting a black level reference value.
  • the photodetector is A first photoelectric conversion unit that detects visible light and performs photoelectric conversion, A second photoelectric conversion unit that is provided so as to overlap with the first photoelectric conversion unit and that detects infrared light and performs photoelectric conversion. It has a first laminated structure that is sandwiched between the first photoelectric conversion unit and the second photoelectric conversion unit and includes an optical filter that allows the infrared light to pass through more easily than visible light.
  • the black level reference element is A third photoelectric conversion unit that detects visible light and performs photoelectric conversion, A fourth photoelectric conversion unit that is provided so as to overlap with the third photoelectric conversion unit and that detects infrared light and performs photoelectric conversion.
  • a first light-shielding unit provided on the side opposite to the fourth photoelectric conversion unit when viewed from the third photoelectric conversion unit and suppressing transmission of visible light and infrared light. It has a second laminated structure that is sandwiched between the third photoelectric conversion unit and the fourth photoelectric conversion unit and includes a second light-shielding portion that suppresses the transmission of visible light and infrared light.
  • a light detection system. (26) Equipped with an optical unit, a signal processing unit, and a photodetector.
  • the photodetector is An effective area provided with a photodetector that detects irradiation light, and It is provided with a peripheral area adjacent to the effective area and provided with a black level reference element for detecting a black level reference value.
  • the photodetector is A first photoelectric conversion unit that detects light in the first wavelength range of the irradiation light and performs photoelectric conversion, A second photoelectric conversion unit which is provided so as to overlap with the first photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • An optical filter sandwiched between the first photoelectric conversion unit and the second photoelectric conversion unit, in which light in the second wavelength region is more easily transmitted than light in the first wavelength region.
  • the black level reference element is A third photoelectric conversion unit that detects light in the first wavelength region of the irradiation light and performs photoelectric conversion, A fourth photoelectric conversion unit which is provided so as to overlap with the third photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • a first light-shielding unit provided on the side opposite to the fourth photoelectric conversion unit when viewed from the third photoelectric conversion unit and suppressing the transmission of the irradiation light, and a light-shielding unit.
  • It is equipped with a photodetection system having a light emitting device that emits irradiation light and a photodetector.
  • the photodetector is An effective region provided with a photoelectric conversion element that detects the irradiation light, and It is provided with a peripheral area adjacent to the effective area and provided with a black level reference element for detecting a black level reference value.
  • the photodetector is A first photoelectric conversion unit that detects light in the first wavelength range of the irradiation light and performs photoelectric conversion, A second photoelectric conversion unit which is provided so as to overlap with the first photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • An optical filter sandwiched between the first photoelectric conversion unit and the second photoelectric conversion unit, in which light in the second wavelength region is more easily transmitted than light in the first wavelength region.
  • Has a first laminated structure that includes The black level reference element is A third photoelectric conversion unit that detects light in the first wavelength region of the irradiation light and performs photoelectric conversion, A fourth photoelectric conversion unit which is provided so as to overlap with the third photoelectric conversion unit and detects light in the second wavelength region of the irradiation light to perform photoelectric conversion.
  • a first light-shielding unit provided on the side opposite to the fourth photoelectric conversion unit when viewed from the third photoelectric conversion unit and suppressing the transmission of the irradiation light, and a light-shielding unit.
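For context on how a black level reference value like the one recited above is conventionally used (this is general image-sensor practice, not a step taken from the patent; the function name and pixel values are hypothetical): readout processing subtracts the average of the light-shielded reference elements from the active-pixel signals to remove the dark offset.

```python
# Conventional optical-black correction (illustrative only): subtract
# the mean value of light-shielded reference pixels from active pixels,
# clamping negative results to zero.
def black_level_correct(active, shielded):
    black = sum(shielded) / len(shielded)  # black level reference value
    return [max(0.0, v - black) for v in active]

# Hypothetical readings: three active pixels, three shielded ones.
corrected = black_level_correct([120.0, 64.0, 70.0], [63.0, 65.0, 64.0])
print(corrected)  # [56.0, 0.0, 6.0]
```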

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

Provided is a light detection device having high functionality. The light detection device includes an effective region provided with a photoelectric conversion element for detecting irradiation light and performing photoelectric conversion thereof, and a peripheral region adjacent to the effective region. The photoelectric conversion element has a laminated structure including: a first photoelectric conversion unit that detects light in a first wavelength range of the irradiation light and performs photoelectric conversion thereof; a second photoelectric conversion unit that is arranged so as to overlap the first photoelectric conversion unit and that detects light in a second wavelength range of the irradiation light and performs photoelectric conversion thereof; and a first optical filter that is interposed between the first photoelectric conversion unit and the second photoelectric conversion unit and through which light in the second wavelength range is transmitted more easily than light in the first wavelength range. The peripheral region is provided with a second optical filter through which light in the second wavelength range is transmitted more easily than light in the first wavelength range.
PCT/JP2021/038761 2020-12-16 2021-10-20 Light detection device, light detection system, electronic apparatus, and mobile body WO2022130776A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/256,100 US20240031703A1 (en) 2020-12-16 2021-10-20 Light detection apparatus, light detection system, electronic equipment, and mobile body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020208717 2020-12-16
JP2020-208717 2020-12-16

Publications (1)

Publication Number Publication Date
WO2022130776A1 true WO2022130776A1 (fr) 2022-06-23

Family

ID=82059739

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/038761 WO2022130776A1 (fr) Light detection device, light detection system, electronic apparatus, and mobile body

Country Status (2)

Country Link
US (1) US20240031703A1 (fr)
WO (1) WO2022130776A1 (fr)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006295125A * 2005-01-18 2006-10-26 Matsushita Electric Ind Co Ltd Solid-state imaging device, method for manufacturing same, and camera
JP2009111225A * 2007-10-31 2009-05-21 Fujifilm Corp Solid-state imaging element and method for manufacturing same
JP2010123779A * 2008-11-20 2010-06-03 Sony Corp Solid-state imaging device and imaging device
JP2011199798A * 2010-03-24 2011-10-06 Sony Corp Physical information acquisition device, solid-state imaging device, and physical information acquisition method
JP2011198855A * 2010-03-17 2011-10-06 Fujifilm Corp Photoelectric-conversion-film-stacked solid-state imaging element and imaging device
JP2011243945A * 2010-03-19 2011-12-01 Fujifilm Corp Photoelectric-conversion-layer-stacked solid-state imaging element and imaging device
JP2011244010A * 2011-08-08 2011-12-01 Fujifilm Corp Solid-state imaging element
WO2012070164A1 * 2010-11-24 2012-05-31 Panasonic Corp Semiconductor imaging device and method for manufacturing same
JP2012227478A * 2011-04-22 2012-11-15 Panasonic Corp Solid-state imaging device
JP2013070030A * 2011-09-06 2013-04-18 Sony Corp Imaging element, electronic device, and information processing device
JP2016046508A * 2014-08-22 2016-04-04 VisEra Technologies Company Limited Imaging device having dummy patterns
JP2017038011A * 2015-08-12 2017-02-16 Sony Interactive Entertainment Inc. Imaging element, image sensor, imaging device, and information processing device
JP2018063378A * 2016-10-14 2018-04-19 Sony Semiconductor Solutions Corp Optical device, optical sensor, and imaging device
JP2020010062A * 2019-10-02 2020-01-16 Canon Inc Photoelectric conversion device and imaging system
JP2020150264A * 2019-03-13 2020-09-17 Samsung Electronics Co., Ltd. Sensor and electronic device including the sensor
WO2020195564A1 * 2019-03-25 2020-10-01 Sony Semiconductor Solutions Corp Imaging device


Also Published As

Publication number Publication date
US20240031703A1 (en) 2024-01-25

Similar Documents

Publication Publication Date Title
US11217617B2 (en) Imaging element and solid-state imaging device
CN111295761A Imaging element, method for manufacturing imaging element, and electronic device
US11469262B2 Photoelectric converter and solid-state imaging device
WO2022131268A1 Photoelectric conversion element, light detection apparatus, light detection system, electronic device, and mobile body
KR102609022B1 Light receiving element, method for manufacturing light receiving element, imaging element, and electronic device
US11817466B2 Photoelectric conversion element, photodetector, photodetection system, electronic apparatus, and mobile body
JP2019012739A Solid-state imaging element and imaging device
KR102550831B1 Solid-state imaging element, electronic device, and manufacturing method
JP2019047392A Imaging element and solid-state imaging device
WO2022131090A1 Optical detection device, optical detection system, electronic equipment, and mobile body
WO2021172121A1 Multilayer film and imaging element
WO2022130776A1 Light detection device, light detection system, electronic apparatus, and mobile body
WO2020012842A1 Photoelectric conversion element
WO2022131033A1 Photoelectric conversion element, light detection device, light detection system, electronic apparatus, and mobile body
WO2022224567A1 Light detection device, light detection system, electronic apparatus, and mobile body
WO2023067969A1 Light detection device, method for manufacturing same, electronic apparatus, and mobile body
WO2022131101A1 Photoelectric conversion element, light detection device, light detection system, electronic equipment, and mobile body
US20220302197A1 Imaging element and imaging device
US20220376128A1 Imaging device and electronic apparatus
WO2023153308A1 Photoelectric conversion element and optical detection device
JP7344114B2 Imaging element and electronic device
TW202232792A Solid-state imaging element and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21906140

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18256100

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21906140

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP