WO2015198876A1 - Imaging element, and electronic device - Google Patents


Info

Publication number
WO2015198876A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
diffusion layer
image sensor
gate voltage
region
Prior art date
Application number
PCT/JP2015/066830
Other languages
French (fr)
Japanese (ja)
Inventor
Shunsuke Furuse
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2015198876A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G01J1/02 Details
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14603 Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/30 Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36 Investigating two or more bands of a spectrum by separate detectors
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L31/00 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
    • H01L31/08 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors
    • H01L31/10 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors, characterised by at least one potential-jump barrier or surface barrier, e.g. phototransistors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors

Definitions

  • This technology relates to image sensors and electronic devices. More specifically, the present invention relates to an image sensor and an electronic device that have improved performance for detecting the intensity of each wavelength of incident light.
  • each pixel has three diffusion layers, at depths of 0.2 μm, 0.6 μm, and 2 μm, stacked in a silicon substrate.
  • each pixel has a three-layer structure designed so that the three primary colors of light, red (R), green (G), and blue (B), are transmitted to and received at different depths according to the wavelength-dependent transmission characteristics of silicon.
  • light is incident from the surface of the silicon substrate; all of R, G, and B are captured in the uppermost layer, R and G (B having been absorbed by the uppermost layer) are captured in the middle layer, and only R (G and B having been absorbed by the uppermost and middle layers) is captured in the lowermost layer.
  • the G value is obtained by subtracting the R value captured in the lowermost layer from the RG value captured in the middle layer
  • the B value is obtained by subtracting the R and G values from the RGB values captured in the uppermost layer.
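The three-layer subtraction scheme above can be sketched in a few lines of code; the function and signal names are illustrative and not taken from the patent.

```python
def rgb_from_stacked_layers(top, middle, bottom):
    """Recover R, G, B from a three-layer stacked photodiode readout.

    Per the scheme above (all values hypothetical):
      top    = R + G + B  (all components captured in the uppermost layer)
      middle = R + G      (B already absorbed above the middle layer)
      bottom = R          (only long wavelengths reach the lowermost layer)
    """
    r = bottom
    g = middle - bottom   # G = (R + G) - R
    b = top - middle      # B = (R + G + B) - (R + G)
    return r, g, b

# Example: layer readouts of 6, 3, 1 imply R = 1, G = 2, B = 3.
```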
  • the photodiode configuration described above separates R, G, and B in the depth direction, but the position at which electrons are captured cannot be changed, the circuit is complicated, and the degree of design freedom is low. Furthermore, since independent color information is generated continuously within one pixel, it is very difficult to distinguish correct data from noise in the continuous portions, and complicated software is required to obtain the RGB characteristics.
  • a spectroscopic sensor having a structure in which a single photodiode corresponding to incident light is provided and the potential depth of the photodiode can be controlled by changing the gate voltage has been proposed (for example, see Patent Document 2).
  • the wavelength and intensity of incident light can be measured by changing the gate voltage, thereby changing the depth at which electrons generated by light incident on the photodiode are captured.
  • Patent Document 3 proposes a spectroscopic sensor having a structure in which the potential depth of a photodiode can be controlled by changing the gate voltage as a back-illuminated image sensor. Also in Patent Document 3, since the gate electrode is on the incident surface side, the gate electrode absorbs light, and it is necessary to consider the absorption of the gate electrode at the time of analysis.
  • This technology has been made in view of such a situation, and enables a highly sensitive spectroscopic measurement with a single element.
  • An imaging device includes: a diffusion layer formed on a semiconductor substrate; and an electrode that is formed on a side different from a light incident side of the diffusion layer and to which a gate voltage is applied.
  • by changing the gate voltage, the depth from the surface of the diffusion layer down to which charges generated by incident light are captured is changed, and for each changed gate voltage the current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
  • the electrode can be made of metal.
  • the first substrate including the diffusion layer and the electrode and the second substrate including the circuit for performing the analysis may be joined.
  • the bonding can be a bump bonding.
  • a floating diffusion may be further provided, and the floating diffusion may be shared by a plurality of imaging elements.
  • a photoelectric conversion film can be further provided on the side of the diffusion layer different from the side where the electrode is formed.
  • An electronic apparatus includes a diffusion layer formed on a semiconductor substrate, and an electrode formed on a side different from a light incident side of the diffusion layer and to which a gate voltage is applied.
  • by changing the gate voltage, the depth from the surface of the diffusion layer down to which charges generated by incident light are captured is changed, and an image sensor analyzes the intensity at each wavelength of the incident light by measuring, for each changed gate voltage, the current indicating the amount of charge generated in the region from the surface to that depth; a processing unit processes the signal from the image sensor.
  • An imaging device includes a diffusion layer formed on a semiconductor substrate and an electrode that is formed on a side different from the light incident side of the diffusion layer and to which a gate voltage is applied. By changing the gate voltage, the depth from the surface of the diffusion layer down to which the charge generated by incident light is captured is changed, and by measuring, for each changed gate voltage, the current indicating the amount of charge generated in the region from the surface to that depth, the intensity of each wavelength of incident light is analyzed.
  • the electronic device includes the image sensor.
  • spectroscopic measurement can be performed with high sensitivity by a single element.
  • the present technology is an image sensor capable of performing spectroscopic measurement.
  • the wavelength of light that is photoelectrically converted from light incident on the semiconductor device changes depending on the depth from the surface of the semiconductor.
  • a spectral characteristic can be measured by providing a gate electrode in a spectroscopic sensor that constitutes an imaging device and variably controlling the potential of a charge well in which electrons converted from photons accumulate.
  • a solid-state image sensor that extracts a color image from the measured spectral characteristics can be configured.
  • the semiconductor type is silicon (Si)
  • the electrons are excited and light is converted into electric charges.
  • λ0 is approximately 1.0 μm.
  • FIG. 2 shows the wavelength dependence of the energy (eV) of light incident on the semiconductor and the absorption coefficient α (cm⁻¹).
  • the effective absorption region becomes shallower. That is, short-wavelength light is almost entirely absorbed and photoelectrically converted at positions close to the surface of the semiconductor. In contrast, long-wavelength light reaches deeper positions below the semiconductor surface before being photoelectrically converted.
  • the current is measured while changing the potential depth at which electrons (or holes) generated by light incident on the semiconductor device are collected.
  • wavelength information of incident light can be obtained.
  • the current generated up to the depth (position) W in the semiconductor can be obtained by calculation.
  • the light intensity attenuates exponentially; the light intensity Φ at a depth x is therefore expressed by equation (2): Φ(x) = Φ0 · e^(−αx), where Φ0 is the intensity at the surface.
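As a rough numerical illustration of the exponential attenuation of equation (2), the fraction of light absorbed between the surface and a capture depth W is 1 − e^(−αW). The absorption coefficients below are order-of-magnitude assumptions for silicon at blue and red wavelengths, not figures from the patent.

```python
import math

def intensity_at_depth(phi0, alpha, x):
    """Equation (2): intensity remaining after traversing depth x [cm]
    of a material with absorption coefficient alpha [cm^-1]."""
    return phi0 * math.exp(-alpha * x)

def fraction_absorbed(alpha, w):
    """Fraction of incident photons absorbed between the surface and depth w."""
    return 1.0 - math.exp(-alpha * w)

# Illustrative (approximate) absorption coefficients for silicon:
alpha_blue = 5e4   # ~450 nm: absorbed within a fraction of a micron
alpha_red = 3e3    # ~650 nm: penetrates several microns
w = 0.5e-4         # 0.5 um capture depth, expressed in cm
```

With these assumed values, blue light is almost completely absorbed within 0.5 μm of the surface, while most red light penetrates deeper, which is the depth dependence the spectroscopic sensor exploits.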
  • each parameter is as follows.
  • A1, A2: incident light intensities [W/cm²]
  • S: sensor area [cm²]
  • I1: measured current value when the electron capture position is W1 [A]
  • I2: measured current value when the electron capture position is W2 [A]
  • ν1: frequency, ν1 = c/λ1
  • ν2: frequency, ν2 = c/λ2
  • Each parameter in equation (6) is as follows.
  • when the incident light is separated into three wavelengths, a current I3 measured at a third electron capture position W3 is added to equation (4). The calculation then proceeds in the same way as the two-wavelength case, and the incoming light can be separated into three wavelengths. Similarly, to disperse incident light into 100 wavelengths, the electron capture position is changed and the current measured 100 times.
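The separation calculation generalizes naturally to N wavelengths measured at N capture depths: the N current readings form a linear system in the N unknown intensities. The sketch below assumes a simple photocurrent model in which every photon absorbed above the capture depth contributes one electron; this model, and all numerical values in the usage, are illustrative stand-ins for the patent's equations (4)-(6), not reproductions of them.

```python
import numpy as np

Q = 1.602e-19   # elementary charge [C]
H = 6.626e-34   # Planck constant [J s]
C = 3.0e8       # speed of light [m/s]

def unmix_intensities(currents, depths, wavelengths, alphas, area):
    """Recover per-wavelength intensities A_i [W/cm^2] from currents I_j [A]
    measured at electron capture depths W_j [cm].

    Assumed model (one electron per photon absorbed above depth W_j):
      I_j = area * sum_i A_i * (Q * lam_i / (H * C)) * (1 - exp(-alpha_i * W_j))
    One depth per wavelength gives a square system, solved directly.
    """
    lam = np.asarray(wavelengths)   # [m]
    alpha = np.asarray(alphas)      # [cm^-1]
    w = np.asarray(depths)          # [cm]
    # M[j, i]: current per unit intensity of wavelength i at capture depth j
    M = area * (Q * lam / (H * C)) * (1.0 - np.exp(-np.outer(w, alpha)))
    return np.linalg.solve(M, np.asarray(currents))
```

A quick self-consistency check: simulate currents from known intensities with the same model, then verify that `unmix_intensities` recovers them.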
  • FIGS. 3 and 4 are diagrams showing the configuration of an imaging apparatus equipped with a column-parallel analog-digital converter.
  • the solid-state imaging device 10 includes a pixel unit 11, a vertical scanning circuit 18, a column-parallel analog-digital converter (ADC) 15, a digital-analog converter (DAC: Digital-Analog Converter) 19 that generates a ramp wave, a logic control circuit 20, and a digital-output low-voltage differential signaling (LVDS: Low Voltage Differential Signaling) interface (I/F) 16.
  • a plurality of pixels 12 are two-dimensionally arranged in a matrix, for example.
  • the pixel 12 has a function as a spectroscopic sensor, for example, and includes a photodiode, a polycrystalline silicon layer to which an impurity is added, and a gate electrode that applies a gate voltage to the polycrystalline silicon layer.
  • the pixel unit has a configuration in which pixel drive wirings are arranged in units of pixel rows and vertical signal lines 24 are arranged in units of pixel columns.
  • Each pixel 12 of the pixel unit 11 is driven by a pixel drive wiring 25 extending in the row direction.
  • the pixel signal is an analog signal and is output to the vertical signal line 24 extending in the column direction.
  • a column parallel ADC 15 which is a column parallel analog / digital conversion device includes a comparator 21 and a counter 23.
  • the comparator 21 compares the ramp wave generated from the DAC 19 that generates the ramp wave with the analog signal from each pixel 12.
  • the counter 23 is composed of, for example, an up / down counter that counts the comparison time until the comparison in the comparator 21 is completed and holds the result.
  • a phase locked loop (PLL: Phase Locked Loop) 17 is incorporated to generate a high speed count clock.
  • a logic control circuit 20 that generates an internal clock and a vertical scanning circuit 18 that controls row address and row scanning are arranged as a control circuit for sequentially reading out signals from the pixel unit 11.
  • the digital output LVDS interface (I / F) 16 processes and outputs a signal from the column parallel ADC 15. For example, only buffering may be performed, or black level adjustment, column variation correction, various digital signal processing, and the like may be performed before that. Further, although not particularly illustrated, other various signal processing circuits may be arranged.
  • the column-parallel ADC 15 is configured by the comparator 21 and the counter (up/down counter) 23; an asynchronous up/down counter capable of high-speed operation with a single count-control clock is preferred.
  • the up/down counter configuration of this embodiment is preferable because it has many advantages, such as a simplified circuit and high-speed operation.
  • instead of an up/down counter, two counters may be provided, or, rather than arranging counters in parallel, the memory means may be doubled.
  • FIG. 5 is a cross-sectional view showing the configuration of the image sensor.
  • the image sensor 100 is composed of an upper part 101 and a lower part 102.
  • the upper part 101 is a photogate type CIS (CMOS Image Sensor) substrate, and the lower part 102 is a logic substrate.
  • the image sensor 100 is a back-illuminated image sensor in which a CIS substrate and a logic substrate are bonded.
  • the photogate forms a depletion layer in the active layer in order to accumulate, in the active layer, one type of the charge carriers generated by photoelectric conversion.
  • The upper part 101 is provided with an on-chip lens (hereinafter referred to as a lens) 121, and an optical black region 122 is provided except in the portion where the lens 121 is provided.
  • the portion where the lens 121 is provided defines the pixel region.
  • a semiconductor substrate under the lens 121 is provided with a P-well region 123 through an n-region, and a back gate region 124 is also provided in a part thereof.
  • the P well region 123 which is a diffusion layer is also provided with a floating diffusion (FD) region 125 and a reset region (reset transistor) 126.
  • an n+ region 127 is also provided in the n− region. These regions are connected to the logic circuit in the lower part 102 by wiring.
  • the back gate region 124 is connected to the logic circuit in the lower portion 102 by the wiring 141
  • the floating diffusion region 125 is connected to the logic circuit in the lower portion 102 by the wiring 144
  • the reset region 126 is connected to the logic circuit in the lower portion 102 by the wiring 146.
  • the n + region 127 is connected to the logic circuit in the lower part 102 by a wiring 147.
  • a gate electrode 131, a transfer gate 132, and a reset gate 133 are also provided below the P well region 123 in the upper portion 101, and wirings are also connected to these gates.
  • the gate electrode 131 is connected to the logic circuit in the lower part 102 by the wiring 142
  • the transfer gate 132 is connected to the logic circuit in the lower part 102 by the wiring 143
  • the reset gate 133 is connected to the logic circuit in the lower part 102 by the wiring 145.
  • the gate electrode 131, the transfer gate 132, and the reset gate 133 are each formed in the polysilicon layer 130 to which impurities are added.
  • the image sensor 100 is provided with the gate electrode 131 in the spectroscopic sensor constituting the image sensor, and variably controls the potential of the charge well in which the electrons converted from photons accumulate; in this way, the spectral characteristics can be measured.
  • the spectral characteristic can be measured by varying the voltage of the gate electrode 131. This will be described with reference to FIG.
  • the horizontal axis represents position along a cross-section through the potential distribution
  • the vertical axis represents the potential.
  • the potential cross-sectional view is a cross-sectional view from the lens 121 at a predetermined position in the P well region 123 toward the gate electrode 131.
  • the solid thick line is a potential graph when the gate voltage is low
  • the solid thin line is a potential graph when the gate voltage is high. From the graph of FIG. 6, it can be seen that as the position moves from the lens 121 side in the direction of the gate electrode 131, the potential increases, reaches a maximum value at a predetermined position, and then decreases. That is, the potential reaches the maximum value at a predetermined position (depth) of the P well region 123.
  • the position where the potential becomes the maximum value differs depending on the gate voltage.
  • the dotted line indicates the position where the maximum value is obtained.
  • the image sensor 100 can be used as a spectroscopic sensor. Further, since it can be used as a spectroscopic sensor, a color image can be extracted from the measured spectral characteristics, and can be used as an image sensor for photographing a color image. Such gate voltage control and spectral processing (analysis) are performed by a logic circuit provided in the lower part 102.
  • the imaging element 100 shown in FIG. 5 has a structure in which the potential depth of the quantum well can be controlled by changing the gate voltage applied from the gate electrode 131, combining spectroscopic measurement with high sensitivity.
  • the driving of the image sensor 100 is performed by changing the gate voltage in accordance with the vertical readout.
  • An example of a method for spectroscopic measurement of incident light by driving the image sensor 100 will be described below.
  • incident light enters, a gate voltage of, for example, 1 V is applied via the gate electrode 131, and the current flowing at that time is read.
  • a gate voltage of 2 V is applied to the P well region 123, and the current flowing at that time is read.
  • a gate voltage of 5 V is applied to the P well region 123, and the current flowing at that time is read.
  • the intensity of each wavelength of the incident light is calculated from the above equation (2) based on the current values measured in this way. For example, by raising the voltage of the gate electrode 131 as described above, the position where the potential reaches its maximum moves deeper, as shown in FIG. 6. As a result, electrons are captured in the order red → red + green → red + green + blue.
  • visible light ranges from 400 nm to 700 nm, a width of 300 nm. Therefore, if the imaging device 100 has a unit resolution of 10 nm and the voltage is changed in steps corresponding to this resolution, the wavelength characteristics (spectral characteristics) can be obtained in 30 steps. Increasing the resolution improves the accuracy of the spectral characteristics.
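The measurement sequence above (step the gate voltage, read the current, repeat for each resolution unit) can be sketched as a driver loop. The voltage range, the linear voltage-to-depth spacing, and the `read_current` callback are all assumptions for illustration; the patent specifies only example voltages (1 V, 2 V, 5 V) and does not state how voltage maps to capture depth.

```python
def sweep_gate_and_read(read_current, v_min=1.0, v_max=5.0, n_steps=30):
    """Step the gate voltage through n_steps values (30 steps for a
    400-700 nm range at 10 nm resolution), reading the photocurrent
    after each step. `read_current` stands in for the actual readout
    chain; linear voltage spacing is an illustrative assumption.
    """
    readings = []
    for i in range(n_steps):
        v_gate = v_min + (v_max - v_min) * i / (n_steps - 1)
        readings.append((v_gate, read_current(v_gate)))
    return readings
```

The resulting list of (gate voltage, current) pairs is the raw input to the per-wavelength intensity analysis described earlier; a finer voltage step would directly raise the spectral resolution.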
  • color images can be extracted by reproducing colors from the obtained spectral characteristics. Therefore, it is possible to configure the image sensor 100 that does not require a color filter.
  • the image sensor 100 that can also be used as a spectroscopic sensor in the present embodiment is configured to include the gate electrode 131, the polysilicon layer 130, and the P well region 123.
  • the gate electrode 131 is provided below the lens 121 and the P well region 123.
  • the gate electrode 131 is provided at a position that is different from the light incident side and does not prevent the light from entering the photodiode.
  • the gate electrode 131 is provided on the light incident side, for example, between the lens 121 and the P well region 123, the gate electrode 131 needs to be a transparent electrode. In this case, the light incident on the gate electrode 131 is absorbed, and the sensitivity may be lowered.
  • the gate electrode 131 is provided below the P well region 123, and therefore does not absorb the light incident through the lens 121. It is possible to prevent the sensitivity from decreasing.
  • the gate electrode 131 does not need to be a transparent electrode.
  • the gate electrode 131 can be formed of metal and can be used as a light-shielding film for reducing the influence of light on the logic circuit and the like of the lower part 102.
  • the upper part 101 and the lower part 102 can be designed and manufactured separately.
  • a logic circuit can be formed using a supporting substrate, temperature restrictions can be eliminated, and a profile of a light receiving portion (a photogate provided in the upper portion 101) can be easily created.
  • the image sensor 100 can be reduced in height.
  • since an organic film such as a color filter can be eliminated, the device can be manufactured using a high-temperature reflow process, yielding an imaging element 100 that can be used even under conditions of 240 °C or more.
  • FIG. 7 shows the configuration of the image sensor according to the second embodiment.
  • the image pickup device 200 shown in FIG. 7 has the same basic configuration as the image pickup device 100 shown in FIG. 5 except that the upper part 101 and the lower part 102 are bump-bonded.
  • In the image sensor 200 shown in FIG. 7, parts similar to those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
  • the upper part 101 and the lower part 102 of the image sensor 200 are joined by bumps 202-1 to 202-6.
  • the upper part 101 and the lower part 102 can also be configured to be joined by bonding the bumps 202 formed on the upper part 101 to the bumps 202 formed on the lower part 102.
  • also in the image sensor 200, since the gate electrode 131 is provided below the P well region 123 as in the image sensor 100 shown in FIG. 5, the same effects as those of the image sensor 100 can be obtained.
  • FIG. 8 shows the configuration of the image sensor according to the third embodiment.
  • the image pickup device 300 shown in FIG. 8 has the same basic configuration as the image pickup device 100 shown in FIG. 5 except that a plurality of pixels share a floating diffusion or the like.
  • the image sensor 300 is also composed of an upper part 301 and a lower part 302, similar to the image sensor 100.
  • the upper portion 301 is a photogate type CIS substrate, and the lower portion 302 is a logic substrate.
  • the upper portion 301 is provided with a lens 321, and an optical black region 322 is formed except for a portion where the lens 321 is provided.
  • a P-well region 323 is provided below the lens 321, and a back gate region 324 is provided in part of the P-well region 323. Since the imaging element 300 shares a floating diffusion and the like with other pixels, unlike the imaging element 100 shown in FIG. 5, the P-well region 323 is not provided with a floating diffusion (FD) region or a reset region.
  • an n+ region 325 is provided, and the n+ region 325 is connected to an n+ region 361 provided in the lower portion 302 via a wiring 343. Through this n+ region 325, wiring 343, and n+ region 361, the current from the P well region 323 is read out to the lower portion 302.
  • the upper portion 301 is provided with a wiring 341 connected to the back gate region 324, and this wiring 341 is also connected to the wiring of the back gate region of another pixel.
  • in the figure, shared wirings are not drawn all the way from the upper portion 301 to the lower portion 302 but only partway; a wiring drawn partway indicates that it is shared with other pixels.
  • a gate electrode 331 is formed below the P well region 323 via a polysilicon layer 330 to which impurities are added, as in the other embodiments.
  • a wiring 342 is connected to the gate electrode 331, and the wiring 342 is shared with other pixels. The same gate voltage is applied to the pixels sharing the wiring 342 at the same timing.
  • a floating diffusion region 363 and a reset region 365 are provided in the lower portion 302.
  • a transfer gate 362 for transferring charges from the n + region 361 to the floating diffusion region 363 and a reset gate 364 for resetting the floating diffusion region 363 are also provided in the lower portion 302.
  • the floating diffusion region 363 is shared with other pixels, and a wiring 344 is connected thereto.
  • the layout shown in FIG. 9 is obtained when the image sensor 300 is viewed from the upper (lens 321) side.
  • the image sensor shown in FIG. 9A has a configuration in which one floating diffusion is shared by four pixels.
  • the photogate contact 381 is connected to a gate electrode 331 provided below each photodiode.
  • the photogate contact 381 is shared by all pixels. With this configuration, an imaging apparatus including the imaging element 300 can be reduced in size.
  • an optical black region 322 is formed between each photodiode.
  • the transfer gate 362 and the like can be disposed below the P well region 323 (photodiode). That is, as shown in FIG. 9B, when the imaging device 300 is viewed from the lens 321 side, the transfer gate 362 of each photodiode is arranged at a position that overlaps the P well region 323 constituting the photodiode.
  • the imaging device including the imaging device 300 can be further reduced in size.
  • also in the image sensor 300, since the gate electrode 331 is provided below the P well region 323 as in the image sensor 100 shown in FIG. 5, the same effects as those of the image sensor 100 can be obtained.
  • the image sensor 300 shown in FIG. 8 can be miniaturized by sharing the floating diffusion and the like with other pixels.
  • the image sensor 300 shown in FIG. 8 may be configured such that the upper portion 301 and the lower portion 302 are joined by bumps.
  • FIG. 10 shows the configuration of the image sensor according to the fourth embodiment.
  • the image pickup device 400 shown in FIG. 10 has the same basic configuration as the image pickup device 100 shown in FIG. 5, except that the image pickup device 400 has an organic photoelectric conversion film.
  • in FIG. 10, the same parts as those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
  • the image sensor 400 has a configuration in which an organic photoelectric conversion film 451 is added to the image sensor 100 shown in FIG.
  • the organic photoelectric conversion film 451 is directly below the lens 121 and is provided between the lens 121 and the P well region 123.
  • an electrode 461 is provided on the upper side of the organic photoelectric conversion film 451, and an electrode 462 is provided on the lower side.
  • the organic photoelectric conversion film 451 is configured to be sandwiched between the electrode 461 and the electrode 462.
  • the electrode 461 is connected to the logic circuit in the lower portion 402 by a wiring 471.
  • the electrode 462 is connected to the logic circuit in the lower portion 402 by a wiring 472.
  • the organic photoelectric conversion film 451 is a film having sensitivity to a specific color.
  • the organic photoelectric conversion film 451 is a film that absorbs blue light and transmits light of other colors.
  • the description will be continued assuming that the film absorbs blue light.
  • when blue light is absorbed by the organic photoelectric conversion film 451, an electron-hole pair is generated; it is separated into an electron and a hole by the electric field applied by the electrodes 461 and 462 in the organic photoelectric conversion film 451 and read out to the logic circuit.
  • the blue intensity is calculated according to the read charge amount.
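The conversion from the read charge amount to a blue intensity can be sketched as follows; this is a minimal illustration assuming one electron-hole pair per absorbed photon and complete readout (the function name and values are not from the patent):

```python
Q_E = 1.602e-19   # elementary charge [C]
H = 6.626e-34     # Planck constant [J*s]
C = 3.0e8         # speed of light [m/s]

def intensity_from_charge(charge_c, exposure_s, area_cm2, wavelength_m=450e-9):
    """Estimate light intensity [W/cm^2] from the charge read out of the film.

    Assumes one electron per absorbed photon and full charge collection.
    """
    photons = charge_c / Q_E                               # collected electrons
    power = photons * (H * C / wavelength_m) / exposure_s  # absorbed optical power [W]
    return power / area_cm2
```

In practice the film's quantum efficiency and the readout gain would scale this estimate.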
  • by providing the organic photoelectric conversion film 451 in this way, the number of colors to be detected while varying the voltage of the gate electrode 131 can be reduced (the range of wavelengths to be detected can be narrowed). As a result, analysis becomes easier than with the image sensor 100 shown in FIG. 5, and since the amount of information to be analyzed is reduced, the time required for analysis can be shortened.
  • in the image sensor 400, the same effect as the image sensor 100 can be obtained because the gate electrode 131 is provided below the P well region 123, as in the image sensor 100 shown in FIG. 5.
  • since the image sensor 400 includes the organic photoelectric conversion film 451, there is a temperature restriction imposed by the organic photoelectric conversion film 451.
  • the image sensor 400 shown in FIG. 10 can also be configured such that the upper portion 401 and the lower portion 402 are joined by bumps.
  • the image sensor 400 shown in FIG. 10 can also have a configuration in which a floating diffusion or the like is shared by a plurality of pixels.
  • FIG. 11 is a block diagram illustrating an example of a configuration of an electronic apparatus according to the present technology, for example, an imaging apparatus.
  • an imaging apparatus 1000 according to the present technology includes an optical system including a lens group 1001 and the like, a solid-state imaging device (image sensor) 1002, a DSP (Digital Signal Processor) circuit 1003, a frame memory 1004, a display unit 1005, a recording unit 1006, an operation unit 1007, and a power supply unit 1008.
  • the DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, the operation unit 1007, and the power supply unit 1008 are connected to one another via a bus line 1009.
  • the lens group 1001 takes in incident light (image light) from a subject and forms an image on the imaging surface of the solid-state imaging device 1002.
  • the solid-state imaging device 1002 converts the amount of incident light imaged on the imaging surface by the lens group 1001 into an electrical signal in units of pixels and outputs it as a pixel signal.
  • the DSP circuit 1003 processes a signal from the solid-state image sensor 1002.
  • the solid-state imaging device 1002 has pixels for constructing an image of a photographed subject, and the DSP circuit 1003 also performs processing such as processing the signals from these pixels and developing them in the frame memory 1004.
  • the display unit 1005 includes a panel type display device such as a liquid crystal display device or an organic EL (electroluminescence) display device, and displays a moving image or a still image captured by the solid-state image sensor 1002.
  • the recording unit 1006 records a moving image or a still image captured by the solid-state imaging device 1002 on a recording medium such as a DVD (Digital Versatile Disk).
  • the operation unit 1007 issues operation commands for various functions of the imaging apparatus in response to user operations.
  • the power supply unit 1008 appropriately supplies operating power to the DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, and the operation unit 1007.
  • the CPU 1010 controls each unit in the imaging apparatus 1000.
  • the imaging apparatus having the above-described configuration can be used as an imaging apparatus such as a video camera, a digital still camera, and a camera module for mobile devices such as a mobile phone.
  • the above-described imaging element can be used as the solid-state imaging element 1002.
  • the present technology may also have the following configurations.
  • An image sensor that analyzes the intensity of each wavelength of the incident light by measuring a current indicating the amount of the charge generated in the region.
  • The imaging element according to any one of (1) to (4), further including a floating diffusion, wherein the floating diffusion is shared by a plurality of imaging elements.
  • An electronic apparatus comprising: an image sensor that analyzes the intensity for each wavelength of the incident light by measuring a current indicating the amount of the electric charge generated in the region; and a processing unit that processes a signal from the image sensor.

Abstract

 The present technique pertains to an imaging element and an electronic device with which it is possible to improve light-splitting performance. The present invention is provided with a diffusion layer formed on a semiconductor substrate, and an electrode formed on a side of the diffusion layer different from the side at which light is incident, the electrode being subjected to a gate voltage. The gate voltage is varied to vary the depth, from the surface of the diffusion layer, at which charge generated by incident light in the diffusion layer is captured, and for each value of the varied gate voltage, a current is measured which indicates the amount of charge generated in the region from the surface to the corresponding depth, whereby the intensity for each wavelength of the incident light is analyzed. This technique can be applied to a spectral sensor which is an element for performing light splitting, or to an imaging element for producing color information using the result of light splitting.

Description

Image Sensor and Electronic Device
The present technology relates to an image sensor and an electronic device, and more particularly to an image sensor and an electronic device with improved performance in detecting the intensity of each wavelength of incident light.
There have been attempts to acquire RGB color information with a single photodiode (for example, Patent Document 1). In the photodiode described in Patent Document 1, each pixel is arranged so that three diffusion layers with depths of 0.2 μm, 0.6 μm, and 2 μm are stacked in a silicon substrate. Each pixel thus has a three-layer structure designed so that, owing to the transmission characteristics of silicon, the layers at different depths transmit and receive different wavelengths corresponding to the three primary colors of light: red (R), green (G), and blue (B).
For example, all RGB wavelengths enter from the surface of the silicon substrate; the top layer captures all of RGB, the middle layer captures RG (the B component having been absorbed in the top layer), and the bottom layer captures only the R component (B and G having been absorbed in the top and middle layers). The G value is then obtained by subtracting the R value captured in the bottom layer from the RG value captured in the middle layer, and the B value is obtained by subtracting the R and G values from the RGB value captured in the top layer.
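The subtraction described above is simple arithmetic; as a sketch (the function name and sample values are hypothetical illustrations, not taken from Patent Document 1):

```python
def rgb_from_stacked_layers(top_rgb, middle_rg, bottom_r):
    """Recover R, G, B from the charges captured by the three stacked layers.

    top_rgb:   value captured by the top layer    (R + G + B)
    middle_rg: value captured by the middle layer (R + G; B absorbed above)
    bottom_r:  value captured by the bottom layer (R only)
    """
    r = bottom_r
    g = middle_rg - bottom_r   # G = (R+G) - R
    b = top_rgb - middle_rg    # B = (R+G+B) - R - G
    return r, g, b
```

For instance, layer readings of (90, 60, 25) would decompose into R = 25, G = 35, B = 30.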
With this configuration, an image incorporating the three primary colors of light can in principle be generated despite the sensor being a single chip.
The photodiode configuration described above separates RGB in the depth direction, but the position at which electrons are captured cannot be changed and the circuit is complicated, so the degree of design freedom is low. Furthermore, since independent color information is generated continuously in one pixel, it is very difficult to distinguish correct data from noise in the successive portions, and complicated software is required to obtain the RGB characteristics.
For this reason, a spectroscopic sensor has been proposed that includes, for example, a single photodiode receiving the incident light and has a structure in which the potential depth of the photodiode can be voltage-controlled by changing a gate voltage (see, for example, Patent Document 2). In this spectroscopic sensor, the wavelength and intensity of the incident light can be measured by changing the gate voltage so that the depth at which electrons generated by light incident on the photodiode are captured varies with the gate voltage.
U.S. Patent No. 5,965,875; JP 2005-10114 A; JP 2009-168742 A
As described above, in the case of the front-illuminated type, the gate electrode absorbs part of the light, so the absorption by the gate electrode had to be taken into account in the analysis.
Patent Document 3 proposes a back-illuminated image sensor with a spectroscopic-sensor structure in which the potential depth of the photodiode can be voltage-controlled by changing the gate voltage. In Patent Document 3 as well, however, the gate electrode is on the incident surface side, so the gate electrode absorbs light and its absorption must be taken into account in the analysis.
The present technology has been made in view of such circumstances, and enables highly sensitive spectroscopic measurement with a single element.
An image sensor according to one aspect of the present technology includes a diffusion layer formed in a semiconductor substrate and an electrode formed on a side of the diffusion layer different from the side on which light is incident, to which a gate voltage is applied. By changing the gate voltage, the depth from the surface of the diffusion layer at which charges generated in the diffusion layer by incident light are captured is changed, and for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
The electrode may be made of metal.
A first substrate including the diffusion layer and the electrode and a second substrate including a circuit that performs the analysis may be bonded together.
The bonding may be bump bonding.
A floating diffusion may further be provided, and the floating diffusion may be shared by a plurality of imaging elements.
A photoelectric conversion film may further be provided on a side of the diffusion layer different from the side on which the electrode is formed.
An electronic apparatus according to one aspect of the present technology includes an image sensor that includes a diffusion layer formed in a semiconductor substrate and an electrode formed on a side of the diffusion layer different from the side on which light is incident, to which a gate voltage is applied, and that analyzes the intensity of the incident light at each wavelength by changing the gate voltage to change the depth from the surface of the diffusion layer at which charges generated by incident light are captured, and by measuring, for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth; and a processing unit that processes a signal from the image sensor.
In the image sensor according to one aspect of the present technology, a diffusion layer formed in a semiconductor substrate and an electrode formed on a side of the diffusion layer different from the side on which light is incident, to which a gate voltage is applied, are provided. By changing the gate voltage, the depth from the surface of the diffusion layer at which charges generated by incident light are captured is changed, and for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
The electronic apparatus according to one aspect of the present technology includes the image sensor.
According to the present technology, spectroscopic measurement can be performed with high sensitivity by a single element.
The effects described here are not necessarily limited, and any of the effects described in the present disclosure may be obtained.
FIG. 1 is a diagram explaining the principle of obtaining wavelength information of light incident on the surface of a semiconductor.
FIG. 2 is a diagram showing the wavelength dependence of the energy of light incident on a semiconductor and of the absorption coefficient.
FIG. 3 is a diagram showing the configuration of an embodiment of an image sensor to which the present technology is applied.
FIG. 4 is a diagram showing the circuit configuration of an image sensor to which the present technology is applied.
FIG. 5 is a diagram showing the configuration of an image sensor.
FIG. 6 is a diagram showing the relationship between potential and position.
FIG. 7 is a diagram showing the configuration of an image sensor.
FIG. 8 is a diagram showing the configuration of an image sensor.
FIG. 9 is a diagram for explaining the arrangement of photodiodes.
FIG. 10 is a diagram showing the configuration of an image sensor.
FIG. 11 is a diagram showing the configuration of an electronic device.
Hereinafter, modes for carrying out the present technology (hereinafter referred to as embodiments) will be described in the following order.
1. About spectroscopic measurement
2. Configuration of the imaging apparatus
3. Configuration of the image sensor according to the first embodiment
4. Configuration of the image sensor according to the second embodiment
5. Configuration of the image sensor according to the third embodiment
6. Configuration of the image sensor according to the fourth embodiment
7. Electronic device
<About spectroscopic measurement>
Prior to the description of specific embodiments, an outline of the present technology will be given. The present technology concerns an image sensor capable of spectroscopic measurement. For light incident on a semiconductor device, the wavelength that is photoelectrically converted varies with the depth from the semiconductor surface. Exploiting this property of light, spectral characteristics can be measured by providing a gate electrode in the spectroscopic sensor that constitutes the image sensor and variably controlling the potential of the charge well in which electrons converted from photons accumulate. By using this spectroscopic sensor as a pixel, a solid-state image sensor that extracts a color image from the measured spectral characteristics can be constructed.
First, the basic principle for obtaining wavelength information of light incident on a semiconductor surface will be described with reference to FIG. 1. When light strikes a semiconductor, electron-hole pairs are generated in the semiconductor by the light energy hν, where h is the Planck constant and ν is the optical frequency.
This is due to the interaction between light and the semiconductor, and depends on the semiconductor material and the wavelength. For example, silicon (Si) has a band gap Eg (forbidden band) of about 1.1 eV, and incident light with hν > Eg excites electrons from the valence band to the conduction band, converting light into charge. The wavelength λ0 at which hν0 = hc/λ0 = Eg is called the fundamental absorption edge, and it gives the upper limit of the wavelength that can be photoelectrically converted in the semiconductor. For Si, λ0 ≈ 1.0 μm.
As shown in FIG. 1, viewed from the region where absorption occurs, if the incident light intensity is I0 and the surface reflectance is R, the light actually entering the semiconductor is (1−R)·I0. Letting the light intensity at depth x from the semiconductor surface be I and the intensity at (x+dx) be (I+dI), an absorption coefficient α can be defined such that dI = −α·I·dx. Integrating this expression gives the following equation (1).
$$ I(x) = (1 - R)\, I_0\, e^{-\alpha x} \qquad (1) $$
Equation (1) shows the light intensity distribution in the depth direction; the region down to about x0 = 1/α can be regarded as the effective absorption region.
Next, FIG. 2 shows the wavelength dependence of the energy (eV) of light incident on a semiconductor and of the absorption coefficient α (cm⁻¹). The higher the photon energy (the shorter the wavelength), the larger the absorption coefficient α, and the shallower the effective absorption region. That is, short-wavelength light is almost entirely absorbed and photoelectrically converted near the semiconductor surface, whereas long-wavelength light penetrates deeper from the semiconductor surface before being photoelectrically converted.
Therefore, in the present technology, the current generated by electrons (or holes) produced by the incident light is measured while the potential depth over which they can be collected is varied. By this method, wavelength information of the incident light can be obtained.
For example, when monochromatic light is incident, the current generated down to a depth (position) W in the semiconductor can be obtained by calculation. When light enters a semiconductor, its intensity attenuates exponentially, so the light intensity Φ at depth x is expressed by the following equation (2).
$$ \Phi(x) = \Phi_0\, e^{-\alpha x} \qquad (2) $$
In equation (2), Φ0 is the incident light intensity [W/cm²] and α is the absorption coefficient [cm⁻¹]. From this, the fraction of the light absorbed by depth W is given by the following equation (3).
$$ \frac{\Phi_0 - \Phi(W)}{\Phi_0} = 1 - e^{-\alpha W} \qquad (3) $$
From these, the current generated down to the depth W is determined by the following equation (4).
$$ I = \frac{q\, S\, \Phi_0}{h\nu}\left(1 - e^{-\alpha W}\right) \qquad (4) $$
In equation (4), S is the area of the light receiving portion [cm²], hν is the photon energy [J], and q is the elementary charge [C].
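Equations (2) through (4) can be checked numerically; a minimal sketch (the physical constants are standard, while the sample wavelengths, absorption coefficients, depth, and sensor area are illustrative assumptions, not values from the text):

```python
import math

Q = 1.602e-19   # elementary charge [C]
H = 6.626e-34   # Planck constant [J*s]
C = 3.0e8       # speed of light [m/s]

def photocurrent_to_depth(phi0, wavelength_m, alpha_per_cm, depth_cm, area_cm2):
    """Current from charge generated between the surface and depth W, equation (4)."""
    absorbed_fraction = 1.0 - math.exp(-alpha_per_cm * depth_cm)    # equation (3)
    photon_energy = H * C / wavelength_m                            # h*nu [J]
    return Q * area_cm2 * phi0 * absorbed_fraction / photon_energy  # [A]

# Short wavelengths are absorbed closer to the surface (larger alpha), so for the
# same capture depth a blue-like line yields more current than a red-like line:
i_short = photocurrent_to_depth(1e-3, 450e-9, 2.5e4, 1e-4, 1e-4)
i_long  = photocurrent_to_depth(1e-3, 650e-9, 2.5e3, 1e-4, 1e-4)
```

Increasing the capture depth W raises the absorbed fraction toward 1, which is the mechanism the gate voltage exploits below.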
Suppose light of two wavelengths (λ1 and λ2) is incident simultaneously with intensities A1 and A2. Measuring the current due to the electrons generated between the surface and the electron capture position W1 yields a current I1. Next, measuring the current due to the electrons generated down to the electron capture position W2 yields a current I2. Writing equation (4) separately for each wavelength then gives the following equation (5).
$$ I_1 = qS\left[\frac{A_1}{h\nu_1}\left(1 - e^{-\alpha_1 W_1}\right) + \frac{A_2}{h\nu_2}\left(1 - e^{-\alpha_2 W_1}\right)\right] $$
$$ I_2 = qS\left[\frac{A_1}{h\nu_1}\left(1 - e^{-\alpha_1 W_2}\right) + \frac{A_2}{h\nu_2}\left(1 - e^{-\alpha_2 W_2}\right)\right] \qquad (5) $$
Here, each parameter is as follows.
A1, A2: incident light intensities [W/cm²]
S: sensor area [cm²]
W1, W2: electron capture positions [cm]
α1, α2: absorption coefficients at the respective wavelengths [cm⁻¹]
I1: measured current when the electron capture position is W1 [A]
I2: measured current when the electron capture position is W2 [A]
ν1 = c/λ1, ν2 = c/λ2: optical frequencies
In equation (5), c is the speed of light, S is the area of the light receiving portion, hν is the photon energy, and q is the elementary charge; all quantities other than the incident light intensities A1 and A2 are known. By solving these two simultaneous equations, the incident light intensities A1 and A2 can therefore be obtained as in the following equation (6).
$$ A_1 = \frac{k_{22}\, I_1 - k_{12}\, I_2}{k_{11} k_{22} - k_{12} k_{21}}, \qquad A_2 = \frac{k_{11}\, I_2 - k_{21}\, I_1}{k_{11} k_{22} - k_{12} k_{21}} \qquad (6) $$

Each parameter in equation (6) is as follows.

$$ k_{11} = \frac{qS}{h\nu_1}\left(1 - e^{-\alpha_1 W_1}\right),\quad k_{12} = \frac{qS}{h\nu_2}\left(1 - e^{-\alpha_2 W_1}\right),\quad k_{21} = \frac{qS}{h\nu_1}\left(1 - e^{-\alpha_1 W_2}\right),\quad k_{22} = \frac{qS}{h\nu_2}\left(1 - e^{-\alpha_2 W_2}\right) \qquad (7) $$
For example, to separate the incident light into three wavelengths, an equation for the current I3 at the electron capture position W3 is added to equation (5). Calculating in the same way as in the two-wavelength case then separates the incoming light into three wavelengths. Similarly, to resolve incident light into 100 wavelengths, the electron capture position is varied 100 times and the current is measured each time.
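The two-wavelength solution of equations (5) and (6) can be sketched in code; extending to N wavelengths means solving an N×N linear system in the same way. The coefficient helper and sample values below are illustrative assumptions, not from the text:

```python
import math

Q = 1.602e-19   # elementary charge [C]
H = 6.626e-34   # Planck constant [J*s]
C = 3.0e8       # speed of light [m/s]

def k_coeff(wavelength_m, alpha_per_cm, depth_cm, area_cm2):
    """Coefficient q*S*(1 - e^(-alpha*W)) / (h*nu) linking intensity A to current I."""
    return Q * area_cm2 * (1.0 - math.exp(-alpha_per_cm * depth_cm)) / (H * C / wavelength_m)

def unmix_two_wavelengths(i1, i2, w1, w2, lam1, lam2, a1, a2, area):
    """Solve the 2x2 system of equation (5) for the intensities A1, A2."""
    k11 = k_coeff(lam1, a1, w1, area); k12 = k_coeff(lam2, a2, w1, area)
    k21 = k_coeff(lam1, a1, w2, area); k22 = k_coeff(lam2, a2, w2, area)
    det = k11 * k22 - k12 * k21
    return (k22 * i1 - k12 * i2) / det, (k11 * i2 - k21 * i1) / det

# Round trip: synthesize currents from known intensities, then recover them.
area, w1, w2 = 1e-4, 0.5e-4, 2.0e-4
lam1, lam2, a1, a2 = 450e-9, 650e-9, 2.5e4, 2.5e3
i1 = k_coeff(lam1, a1, w1, area) * 1e-3 + k_coeff(lam2, a2, w1, area) * 2e-3
i2 = k_coeff(lam1, a1, w2, area) * 1e-3 + k_coeff(lam2, a2, w2, area) * 2e-3
A1, A2 = unmix_two_wavelengths(i1, i2, w1, w2, lam1, lam2, a1, a2, area)
```

The two capture depths must give sufficiently different absorbed fractions for the two wavelengths, otherwise the system is ill-conditioned and small current noise dominates the recovered intensities.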
<Configuration of the imaging apparatus>
Next, the configuration of an imaging apparatus provided with an image sensor to which the present technology is applied will be described. FIGS. 3 and 4 show the configuration of an imaging apparatus equipped with column-parallel analog-digital converters.
As shown in FIGS. 3 and 4, the solid-state imaging device 10 includes a pixel unit 11, a vertical scanning circuit 18, a column-parallel analog-digital converter (ADC) 15, a digital-analog converter (DAC) 19 that generates a ramp wave, a logic control circuit 20, and a digital-output low-voltage differential signaling (LVDS) interface (I/F) 16.
In the pixel unit 11, a plurality of pixels 12 are arranged two-dimensionally, for example in a matrix. Each pixel 12 functions, for example, as a spectroscopic sensor and includes a photodiode, a polycrystalline silicon layer to which an impurity is added, and a gate electrode that applies a gate voltage to the polycrystalline silicon layer. Pixel drive wirings are arranged for each pixel row, and vertical signal lines 24 are arranged for each pixel column. Each pixel 12 of the pixel unit 11 is driven by a pixel drive wiring 25 extending in the row direction, and the pixel signal, which is an analog signal, is output to a vertical signal line 24 extending in the column direction.
The column-parallel ADC 15, a column-parallel analog/digital conversion device, consists of comparators 21 and counters 23. The comparator 21 compares the ramp wave generated by the DAC 19 with the analog signal from each pixel 12, and the counter 23, for example an up/down counter, counts the comparison time until the comparison in the comparator 21 completes and holds the result.
To operate the counter 23, which consists of an up/down counter or the like, at high speed, a phase-locked loop (PLL) 17 is incorporated to generate a high-speed count clock. As control circuits for sequentially reading out the signals of the pixel unit 11, a logic control circuit 20 that generates an internal clock and a vertical scanning circuit 18 that controls row addressing and row scanning are provided.
The digital-output LVDS interface (I/F) 16 processes and outputs the signals from the column-parallel ADC 15. In some cases only buffering is performed; in others, black level adjustment, column variation correction, and various kinds of digital signal processing are performed beforehand. Although not shown, various other signal processing circuits may also be provided.
In this embodiment, the column-parallel ADC 15 consists of the comparator 21 and the counter (up/down counter) 23; the up/down counter is preferably an asynchronous up/down counter capable of high-speed operation with a single count control clock. The up/down counter configuration of this embodiment is preferable because it has many advantages, such as circuit simplification and high-speed operation. Alternatively, instead of an up/down counter, duplicate counters may be provided, or the counters may not be column-parallel and duplicate memory means may be provided.
<Configuration of the image sensor according to the first embodiment>
Next, the configuration of the image sensor (pixel 12) will be described. FIG. 5 is a cross-sectional view showing the configuration of the image sensor.
The image sensor 100 consists of an upper portion 101 and a lower portion 102. The upper portion 101 is a photogate-type CIS (CMOS Image Sensor) substrate, and the lower portion 102 is a logic substrate. The image sensor 100 is a back-illuminated image sensor in which the CIS substrate and the logic substrate are bonded. The photogate is used to form a depletion layer so that one of the two types of carriers generated by photoelectric conversion in the active layer is accumulated in the active layer.
The upper portion 101 is provided with an on-chip lens (hereinafter, lens) 121; the area other than the portion where the lens 121 is provided is an optical black region 122, a pixel region that defines the black reference of the video signal.
In the semiconductor substrate below the lens 121, a P-well region 123 is provided via an n− region, and a back gate region 124 is provided in part of it. The P well region 123, which is a diffusion layer, is also provided with a floating diffusion (FD) region 125 and a reset region (reset transistor) 126.
On the right side of the P well region 123 in the figure, an n+ region 127 is provided in the n− region. These regions are connected to the logic circuit of the lower portion 102 by wirings: the back gate region 124 by a wiring 141, the floating diffusion region 125 by a wiring 144, the reset region 126 by a wiring 146, and the n+ region 127 by a wiring 147.
 Below the P-well region 123 of the upper part 101, a gate electrode 131, a transfer gate 132, and a reset gate 133 are also provided, and wiring is connected to these gates as well: the gate electrode 131 is connected to the logic circuit of the lower part 102 via wiring 142, the transfer gate 132 via wiring 143, and the reset gate 133 via wiring 145.
 The gate electrode 131, the transfer gate 132, and the reset gate 133 are each formed in an impurity-doped polysilicon layer 130.
 As described with reference to FIGS. 1 and 2, the image sensor 100 is configured so that spectral characteristics can be measured by providing the gate electrode 131 in the spectral sensor constituting the image sensor and variably controlling the potential of the charge well in which electrons converted from photons accumulate.
 In the image sensor 100 shown in FIG. 5, the spectral characteristics can be measured by varying the voltage of the gate electrode 131. This will be described with reference to FIG. 6. In the graph shown in FIG. 6, the horizontal axis represents position along a potential cross section and the vertical axis represents potential. The potential cross section is taken at a predetermined position in the P-well region 123, running from the lens 121 toward the gate electrode 131.
 In FIG. 6, the thick solid line is the potential when the gate voltage is low, and the thin solid line is the potential when the gate voltage is high. The graph shows that, as the position moves from the lens 121 side toward the gate electrode 131, the potential rises, peaks at a predetermined position, and then falls. That is, the potential reaches its maximum at a predetermined position (depth) in the P-well region 123.
 FIG. 6 also shows that the position of the potential maximum depends on the gate voltage; the dotted lines indicate where the maximum occurs. When the gate voltage is low, the potential peaks on the lens 121 side (shallow side) of the P-well region 123; when the gate voltage is raised, the potential peaks closer to the gate electrode 131 (deep side).
 By exploiting this behavior, the image sensor 100 can be used as a spectral sensor. Because it can serve as a spectral sensor, a color image can be extracted from the measured spectral characteristics, so it can also be used as an image sensor for capturing color images. Such gate voltage control and the processing (analysis) related to the spectroscopy are performed by the logic circuit provided in the lower part 102.
 The image sensor 100 shown in FIG. 5 has a structure in which the potential depth of the quantum well can be voltage-controlled by changing the gate voltage applied from the gate electrode 131, combining the wavelength dependence of absorption along the depth direction of the semiconductor with the high sensitivity of back-side illumination.
 The image sensor 100 is driven by varying the gate voltage in step with the vertical readout. An example of a method of spectroscopic measurement of incident light by driving the image sensor 100 is given below.
 First, while incident light enters, a gate voltage of, for example, 1 V is applied via the gate electrode 131, and the current flowing at that time is read. Next, a gate voltage of 2 V is applied to the P-well region 123, and the resulting current is read. Then a gate voltage of 5 V is applied to the P-well region 123, and the resulting current is read.
 Based on the current values measured in this way, the intensity of each wavelength of the incident light is calculated from equation (2) above. For example, raising the voltage of the gate electrode 131 as described moves the position of the potential maximum to the deep side, as shown in FIG. 6. As a result, electrons are generated in the order red → red + green → red + green + blue.
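 The drive-and-read sequence above can be sketched in code. This is an illustrative model only: the function names and the example current values are hypothetical, the real per-wavelength intensities come from equation (2) of the document (not reproduced here), and the sketch shows just the sweep plus the successive-difference idea implied by the red → red + green → red + green + blue ordering.

```python
# Hypothetical sketch of the readout sequence; apply_gate_voltage() and
# read_current() stand in for the actual sensor interface, which the
# document does not specify. Gate voltages (1 V, 2 V, 5 V) follow the text.

def sweep_and_read(apply_gate_voltage, read_current, voltages=(1.0, 2.0, 5.0)):
    """Apply each gate voltage in turn and collect the cumulative currents."""
    readings = []
    for v in voltages:
        apply_gate_voltage(v)
        readings.append(read_current())
    return readings

def decompose_cumulative(readings):
    """Recover per-band signals from cumulative readings by successive
    differences, e.g. [R, G, B] from [R, R+G, R+G+B]."""
    bands = []
    previous = 0.0
    for total in readings:
        bands.append(total - previous)
        previous = total
    return bands

# Example with arbitrary simulated currents (not values from the document):
red, green, blue = decompose_cumulative([3.0, 8.0, 10.0])
```

 For instance, with simulated cumulative currents [3.0, 8.0, 10.0], the successive differences give the red, green, and blue contributions separately.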
 For example, in the visible region the wavelength spans 300 nm, from 400 nm to 700 nm. If the image sensor 100 has a resolution of 10 nm, varying the voltage in steps corresponding to this resolution yields the wavelength characteristics (spectral characteristics) in 30 readouts. Increasing the resolution improves the accuracy of the spectral characteristics.
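 The readout count quoted above follows from simple arithmetic, sketched here as a minimal helper (the function name is illustrative, not from the document):

```python
# Sweeping a 400-700 nm range at 10 nm resolution takes one gate-voltage
# step, and therefore one readout, per 10 nm band: (700 - 400) / 10 = 30.

def num_readouts(wavelength_min_nm, wavelength_max_nm, resolution_nm):
    span_nm = wavelength_max_nm - wavelength_min_nm
    return span_nm // resolution_nm

print(num_readouts(400, 700, 10))  # -> 30
```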
 Also, a color image can be extracted by reproducing colors from the obtained spectral characteristics. This makes it possible to configure the image sensor 100 without a color filter.
 As described above, the image sensor 100 of this embodiment, which can also be used as a spectral sensor, includes the gate electrode 131, the polysilicon layer 130, and the P-well region 123. In the image sensor 100 shown in FIG. 5, the gate electrode 131 is provided below the lens 121 and the P-well region 123; in other words, it is on the side opposite to the light-incident side, at a position where it does not obstruct light entering the photodiode.
 If the gate electrode 131 were provided on the light-incident side, for example between the lens 121 and the P-well region 123, it would have to be a transparent electrode. In that case, the gate electrode 131 could absorb the incident light and reduce the sensitivity.
 In the image sensor 100 shown in FIG. 5, however, the gate electrode 131 is provided below the P-well region 123, so it does not absorb light incident through the lens 121, and a loss of sensitivity can be prevented.
 Moreover, because it is not on the light-incident side, the gate electrode 131 need not be a transparent electrode. For example, it can be formed of metal, and it can then also serve as a light-shielding film that reduces the influence of light on the logic circuit and the like of the lower part 102.
 Furthermore, by providing the lens 121, the P-well region 123, and so on in the upper part 101 and the logic circuit and so on in the lower part 102, the upper part 101 and the lower part 102 can be designed and manufactured separately. For example, the logic circuit can be formed on a support substrate, which removes temperature constraints and makes it easier to create the profile of the light-receiving portion (the photogate provided in the upper part 101).
 It also becomes possible to prevent hot carriers from entering the photogate from the logic substrate. In addition, since there is no color filter, the image sensor 100 can be made thinner. Because the structure can eliminate organic films such as color filters, it can also be manufactured using a high-temperature reflow process, yielding an image sensor 100 usable even under conditions of 240 degrees or higher. An SiN lens formed at high temperature can also be used.
 Also, as described above, as the gate voltage is applied, the information that can be extracted increases in the order red → red + green → red + green + blue, which makes the analysis easy.
<Configuration of Image Sensor in Second Embodiment>
 FIG. 7 shows the configuration of the image sensor according to the second embodiment. The image sensor 200 shown in FIG. 7 has the same basic configuration as the image sensor 100 shown in FIG. 5, except that the upper part 101 and the lower part 102 are bump-bonded. In the image sensor 200 shown in FIG. 7, parts similar to those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
 As shown in FIG. 7, the upper part 101 and the lower part 102 of the image sensor 200 are joined by bumps 202-1 to 202-6. In this way, when the upper part 101 and the lower part 102 are joined, they can be configured to be joined by bonding the bumps 202 formed on the upper part 101 to the bumps 202 formed on the lower part 102.
 In the image sensor 200 shown in FIG. 7 as well, the gate electrode 131 is provided below the P-well region 123, as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained.
<Configuration of Image Sensor in Third Embodiment>
 FIG. 8 shows the configuration of the image sensor according to the third embodiment. The image sensor 300 shown in FIG. 8 has the same basic configuration as the image sensor 100 shown in FIG. 5, except that a plurality of pixels share a floating diffusion and other elements.
 Like the image sensor 100, the image sensor 300 is composed of an upper part 301 and a lower part 302. The upper part 301 is a photogate-type CIS substrate, and the lower part 302 is a logic substrate. The upper part 301 is provided with a lens 321, and the area other than the portion where the lens 321 is provided serves as an optical black region 322.
 A P-well region 323 is provided below the lens 321, and a back gate region 324 is provided in a part of it. Since the image sensor 300 shares a floating diffusion and other elements with other pixels, unlike the image sensor 100 shown in FIG. 5, the P-well region 323 is not provided with a floating diffusion (FD) region or a reset region.
 An n+ region 325 is provided in the P-well region 323 and is connected via wiring 343 to an n+ region 361 provided in the lower part 302. Through this n+ region 325, wiring 343, and n+ region 361, the current from the P-well region 323 is read out to the lower part 302.
 The upper part 301 is provided with wiring 341 connected to the back gate region 324; this wiring 341 is also connected to the back gate wiring of other pixels. In FIG. 8, shared wiring is not drawn all the way from the upper part 301 to the lower part 302 but only partway, and wiring drawn partway in this manner indicates that it is shared with other pixels.
 Below the P-well region 323, as in the other embodiments, a gate electrode 331 is formed via an impurity-doped polysilicon layer 330. Wiring 342 is connected to the gate electrode 331 and is shared with other pixels. The same gate voltage is applied at the same timing to the pixels sharing the wiring 342.
 In the image sensor 300 shown in FIG. 8, a floating diffusion region 363 and a reset region 365 are provided in the lower part 302. A transfer gate 362 for transferring charge from the n+ region 361 to the floating diffusion region 363 and a reset gate 364 for resetting the floating diffusion region 363 are also provided in the lower part 302.
 The floating diffusion region 363 is shared with other pixels, and wiring 344 is connected to it.
 For example, when four pixels share a floating diffusion, the layout viewed from the upper (lens 321) side of the image sensor 300 is as shown in FIG. 9. The image sensor shown in A of FIG. 9 has a configuration in which four pixels share one floating diffusion.
 The floating diffusion region 363 is at the center, and the P-well regions 323-1 to 323-4 constituting the photodiodes (PD) are arranged around it. Transfer gates 362-1 to 362-4 for transferring the charge of each photodiode to the floating diffusion region 363 are provided at positions adjacent to the respective photodiodes.
 The photogate contact 381 is connected to the gate electrode 331 provided below each photodiode, and is shared by all pixels. This configuration makes it possible to miniaturize an imaging apparatus including the image sensor 300.
 In the image sensor shown in A of FIG. 9, the areas between the photodiodes serve as the optical black region 322.
 As shown in FIG. 8, when the floating diffusion region 363 is arranged in the lower part 302 on the logic side, the transfer gate 362 and the like can be arranged below the P-well region 323 (photodiode). That is, as shown in B of FIG. 9, when the image sensor 300 is viewed from the lens 321 side, the transfer gate 362 of each photodiode is arranged at a position overlapping the P-well region 323 constituting that photodiode.
 This arrangement makes it possible to further miniaturize an imaging apparatus including the image sensor 300.
 In the image sensor 300 shown in FIG. 8 as well, the gate electrode 331 is provided below the P-well region 323, as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained.
 Furthermore, the image sensor 300 shown in FIG. 8 can be miniaturized by sharing the floating diffusion and other elements with other pixels.
 As with the image sensor 200 shown in FIG. 7, the image sensor 300 shown in FIG. 8 can also be configured so that the upper part 301 and the lower part 302 are joined by bumps.
<Configuration of Image Sensor in Fourth Embodiment>
 FIG. 10 shows the configuration of the image sensor according to the fourth embodiment. The image sensor 400 shown in FIG. 10 has the same basic configuration as the image sensor 100 shown in FIG. 5, except that it has an organic photoelectric conversion film. In the image sensor 400 shown in FIG. 10, parts similar to those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
 The image sensor 400 has a configuration in which an organic photoelectric conversion film 451 is added to the image sensor 100 shown in FIG. 5. The organic photoelectric conversion film 451 is directly below the lens 121, between the lens 121 and the P-well region 123.
 An electrode 461 is provided on the upper side of the organic photoelectric conversion film 451, and an electrode 462 on the lower side, so that the organic photoelectric conversion film 451 is sandwiched between the electrodes 461 and 462. The electrode 461 is connected to the logic circuit of the lower part 402 via wiring 471, and the electrode 462 via wiring 472.
 The organic photoelectric conversion film 451 is a film sensitive to a specific color. If, for example, it is sensitive to blue, it absorbs blue light and transmits light of the other colors. The description here continues on the assumption that it is a film that absorbs blue light.
 When white light enters through the lens 121, blue light is selectively absorbed by the organic photoelectric conversion film 451, while light of the other colors passes through the organic photoelectric conversion film 451 and is absorbed in the P-well region 123. The organic photoelectric conversion film 451 generates electron-hole pairs according to the amount of light it absorbs.
 These electron-hole pairs are separated into electrons and holes by the electric field applied within the organic photoelectric conversion film 451 by the electrodes 461 and 462, and are read out to the logic circuit. The blue intensity is calculated according to the amount of charge read out.
 Light other than blue light is absorbed in the P-well region 123, and the red and green intensities are calculated by changing the voltage of the gate electrode 131, as in the other embodiments described above.
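 Under the assumption stated above (the film absorbs blue), the remaining analysis needs only two gate-voltage readings from the silicon. A hypothetical sketch, with all function names and values as illustrative assumptions rather than anything taken from the document:

```python
# Blue is measured directly by the organic film's own readout, so the
# gate-voltage sweep only has to separate red from green in the light
# transmitted to the P-well: a "red" reading, then a "red + green" reading.

def decompose_with_film(blue_film_signal, cumulative_readings):
    """cumulative_readings holds two gate-sweep readings (red, then
    red + green); blue comes from the organic film's readout."""
    red = cumulative_readings[0]
    green = cumulative_readings[1] - cumulative_readings[0]
    return {"red": red, "green": green, "blue": blue_film_signal}

# Example with arbitrary values:
result = decompose_with_film(2.0, [3.0, 8.0])
```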
 Providing the organic photoelectric conversion film 451 in this way reduces the number of colors to be detected while varying the voltage of the gate electrode 131 (narrows the wavelength range to be detected). This makes the analysis easier than, for example, with the image sensor 100 shown in FIG. 5, and, because the amount of information to analyze decreases, it also shortens the time required for the analysis.
 In the image sensor 400 shown in FIG. 10 as well, the gate electrode 131 is provided below the P-well region 123, as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained. However, because the organic photoelectric conversion film 451 is present, temperature restrictions imposed by the organic photoelectric conversion film 451 do apply.
 As with the image sensor 200 shown in FIG. 7, the image sensor 400 shown in FIG. 10 can also be configured so that the upper part 401 and the lower part 402 are joined by bumps.
 Also, as with the image sensor 300 shown in FIG. 8, the image sensor 400 shown in FIG. 10 can be configured so that a floating diffusion and other elements are shared by a plurality of pixels.
<Electronic Equipment>
 FIG. 11 is a block diagram illustrating an example of the configuration of an electronic apparatus according to the present technology, for example an imaging apparatus. As shown in FIG. 11, an imaging apparatus 1000 according to the present technology includes an optical system including a lens group 1001 and the like, a solid-state image sensor (imaging device) 1002, a DSP (Digital Signal Processor) circuit 1003, a frame memory 1004, a display unit 1005, a recording unit 1006, an operation unit 1007, a power supply unit 1008, and so on. The DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, the operation unit 1007, and the power supply unit 1008 are connected to one another via a bus line 1009.
 The lens group 1001 takes in incident light (image light) from a subject and forms an image on the imaging surface of the solid-state image sensor 1002. The solid-state image sensor 1002 converts the amount of incident light imaged on the imaging surface by the lens group 1001 into an electrical signal pixel by pixel and outputs it as a pixel signal.
 The DSP circuit 1003 processes signals from the solid-state image sensor 1002. For example, the solid-state image sensor 1002 has pixels for constructing an image of the photographed subject, and the DSP circuit 1003 also performs processing such as processing the signals from those pixels and developing them in the frame memory 1004.
 The display unit 1005 consists of a panel-type display device such as a liquid crystal display device or an organic EL (electroluminescence) display device, and displays moving images or still images captured by the solid-state image sensor 1002. The recording unit 1006 records moving images or still images captured by the solid-state image sensor 1002 on a recording medium such as a DVD (Digital Versatile Disc).
 The operation unit 1007 issues operation commands for the various functions of the imaging apparatus under operation by the user. The power supply unit 1008 appropriately supplies the various power sources serving as operating power for the DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, and the operation unit 1007 to these supply targets. A CPU 1010 controls each unit in the imaging apparatus 1000.
 An imaging apparatus with the above configuration can be used as a video camera, a digital still camera, or a camera module for mobile devices such as mobile phones. In such an imaging apparatus, the image sensors described above can be used as the solid-state image sensor 1002.
 The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
 The embodiments of the present technology are not limited to those described above, and various modifications are possible without departing from the gist of the present technology.
 The present technology can also have the following configurations.
 (1)
 An image sensor including:
 a diffusion layer formed in a semiconductor substrate; and
 an electrode to which a gate voltage is applied, the electrode being formed on a side of the diffusion layer different from the side on which light is incident,
 wherein changing the gate voltage changes the depth from the surface of the diffusion layer down to which charge generated in the diffusion layer by incident light is captured, and the intensity of each wavelength of the incident light is analyzed by measuring, for each changed gate voltage, a current indicating the amount of the charge generated in the region from the surface side down to that depth.
 (2)
 The image sensor according to (1), in which the electrode is made of metal.
 (3)
 The image sensor according to (1) or (2), configured by bonding a first substrate including the diffusion layer and the electrode to a second substrate including a circuit that performs the analysis.
 (4)
 The image sensor according to (3), in which the bonding is bump bonding.
 (5)
 The image sensor according to any one of (1) to (4), further including a floating diffusion, in which the floating diffusion is shared by a plurality of image sensors.
 (6)
 The image sensor according to any one of (1) to (5), further including a photoelectric conversion film on a side of the diffusion layer different from the side on which the electrode is formed.
 (7)
 An electronic apparatus including:
 an image sensor including a diffusion layer formed in a semiconductor substrate, and an electrode to which a gate voltage is applied, the electrode being formed on a side of the diffusion layer different from the side on which light is incident, wherein changing the gate voltage changes the depth from the surface of the diffusion layer down to which charge generated in the diffusion layer by incident light is captured, and the intensity of each wavelength of the incident light is analyzed by measuring, for each changed gate voltage, a current indicating the amount of the charge generated in the region from the surface side down to that depth; and
 a processing unit that processes a signal from the image sensor.
 100 image sensor, 101 upper part, 102 lower part, 121 lens, 122 optical black region, 123 P-well region, 124 back gate region, 125 floating diffusion region, 126 reset region, 127 n+ region, 131 gate electrode, 132 transfer gate, 133 reset gate

Claims (7)

  1.  An image sensor comprising:
      a diffusion layer formed in a semiconductor substrate; and
      an electrode to which a gate voltage is applied, the electrode being formed on a side of the diffusion layer different from the side on which light is incident,
      wherein changing the gate voltage changes the depth from the surface of the diffusion layer down to which charge generated in the diffusion layer by incident light is captured, and the intensity of each wavelength of the incident light is analyzed by measuring, for each changed gate voltage, a current indicating the amount of the charge generated in the region from the surface side down to that depth.
  2.  The image sensor according to claim 1, wherein the electrode is made of metal.
  3.  The image sensor according to claim 1, configured by bonding a first substrate including the diffusion layer and the electrode to a second substrate including a circuit that performs the analysis.
  4.  The image sensor according to claim 3, wherein the bonding is bump bonding.
  5.  The image sensor according to claim 1, further comprising a floating diffusion, wherein the floating diffusion is shared by a plurality of image sensors.
  6.  The image sensor according to claim 1, further comprising a photoelectric conversion film on a side of the diffusion layer different from the side on which the electrode is formed.
  7.  An electronic apparatus comprising:
     an image sensor including a diffusion layer formed in a semiconductor substrate, and an electrode to which a gate voltage is applied, the electrode being formed on a side of the diffusion layer different from a side on which light is incident, wherein the depth from a surface of the diffusion layer at which charges generated in the diffusion layer by incident light are captured is changed by changing the gate voltage, and the intensity of the incident light is analyzed for each wavelength by measuring, for each gate voltage, a current indicating the amount of the charge generated in a region from the surface to that depth; and
     a processing unit that processes a signal from the image sensor.
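The wavelength analysis recited in the claims exploits the fact that shorter wavelengths are absorbed closer to the silicon surface than longer ones, so each capture depth (set by the gate voltage) integrates a different wavelength-dependent fraction of the incident light. The following is a minimal numerical sketch of that principle, not an implementation from the patent; the absorption coefficients, depths, and intensities are illustrative placeholder values:

```python
import numpy as np

# Illustrative absorption coefficients (1/um) for three wavelengths,
# short to long; real silicon values depend on wavelength, temperature,
# and doping, and are assumptions here.
alpha = np.array([20.0, 1.0, 0.3])

# Capture depths (um) selected by three gate-voltage settings (assumed values).
depths = np.array([0.5, 2.0, 6.0])

# Beer-Lambert: fraction of light of each wavelength absorbed between the
# surface and depth d is 1 - exp(-alpha * d).
A = 1.0 - np.exp(-np.outer(depths, alpha))  # shape (n_depths, n_wavelengths)

# True per-wavelength intensities (unknown in practice; used to simulate data).
intensity_true = np.array([3.0, 5.0, 2.0])

# The current measured at each gate voltage is proportional to the total
# charge generated from the surface down to the corresponding depth.
currents = A @ intensity_true

# Recover the per-wavelength intensities by inverting the absorption model.
intensity_est = np.linalg.solve(A, currents)
print(intensity_est)  # close to [3. 5. 2.]
```

In practice the depth scan gives more measurements than wavelengths of interest, so a least-squares fit would replace the exact solve.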
PCT/JP2015/066830 2014-06-24 2015-06-11 Imaging element, and electronic device WO2015198876A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014128839A JP2016009739A (en) 2014-06-24 2014-06-24 Image pick-up device and electronic apparatus
JP2014-128839 2014-06-24

Publications (1)

Publication Number Publication Date
WO2015198876A1 true WO2015198876A1 (en) 2015-12-30

Family

ID=54937967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/066830 WO2015198876A1 (en) 2014-06-24 2015-06-11 Imaging element, and electronic device

Country Status (2)

Country Link
JP (1) JP2016009739A (en)
WO (1) WO2015198876A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7005886B2 (en) * 2016-03-31 2022-01-24 ソニーグループ株式会社 Solid-state image sensor and electronic equipment
KR20210049103A (en) 2018-09-11 2021-05-04 소니 세미컨덕터 솔루션즈 가부시키가이샤 Solid state image sensor
WO2020170936A1 (en) * 2019-02-20 2020-08-27 ソニーセミコンダクタソリューションズ株式会社 Imaging device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004146816A (en) * 2002-09-30 2004-05-20 Matsushita Electric Ind Co Ltd Solid image pickup device and equipment using the same
JP2005010114A (en) * 2003-06-23 2005-01-13 Japan Science & Technology Agency Measuring method of incidence light, and sensor having spectroscopic mechanism using method
JP2006120922A (en) * 2004-10-22 2006-05-11 Fuji Film Microdevices Co Ltd Photoelectric conversion film laminated type color solid state imaging device
JP2007067075A (en) * 2005-08-30 2007-03-15 Nippon Hoso Kyokai <Nhk> Color imaging device
JP2009014459A (en) * 2007-07-03 2009-01-22 Hamamatsu Photonics Kk Backside-illuminated distance measuring sensor and distance measuring device
JP2009168742A (en) * 2008-01-18 2009-07-30 Sony Corp Spectral sensor, solid imaging element, and imaging apparatus
JP2009180512A (en) * 2008-01-29 2009-08-13 Fujifilm Corp Spectroscopic sensor, fluorescence detection method and fluorescence detection device using spectroscopic sensor
JP2010225927A (en) * 2009-03-24 2010-10-07 Sony Corp Solid-state image pickup device, drive method therefor, and electronic apparatus
JP2011029337A (en) * 2009-07-23 2011-02-10 Sony Corp Solid-state imaging device, method of manufacturing the same, and electronic apparatus
JP2011114324A (en) * 2009-11-30 2011-06-09 Sony Corp Solid-state imaging device and electronic apparatus
JP2011181595A (en) * 2010-02-26 2011-09-15 Panasonic Corp Solid-state imaging device and camera
JP2012104704A (en) * 2010-11-11 2012-05-31 Toshiba Corp Solid-state imaging device and method of manufacturing the same
JP2013197697A (en) * 2012-03-16 2013-09-30 Sony Corp Solid-state image pickup device and electronic apparatus
JP2013214930A (en) * 2012-04-04 2013-10-17 Nippon Hoso Kyokai <Nhk> Rear face irradiation type imaging element, driving apparatus and imaging apparatus both provided with rear face irradiation type imaging element, and driving method of rear face irradiation type imaging element


Also Published As

Publication number Publication date
JP2016009739A (en) 2016-01-18

Similar Documents

Publication Publication Date Title
JP7264187B2 (en) Solid-state imaging device, its driving method, and electronic equipment
US10903279B2 (en) Solid state image sensor pixel electrode below a photoelectric conversion film
WO2017130728A1 (en) Solid-state imaging device and electronic device
JP6879919B2 (en) Manufacturing method of solid-state image sensor, electronic device, and solid-state image sensor
JP5061915B2 (en) Solid-state imaging device and imaging apparatus
US10942304B2 (en) Solid-state imaging element, manufacturing method of the same, and electronic device
CN108369953B (en) Imaging device and electronic apparatus
WO2010004683A1 (en) Solid-state imaging device
US9287302B2 (en) Solid-state imaging device
KR20220066188A (en) Solid-state imaging element, production method thereof, and electronic device
US10536659B2 (en) Solid-state image capturing element, manufacturing method therefor, and electronic device
WO2016104177A1 (en) Solid-state image capture element, method for manufacturing same, and electronic component
JPWO2017159362A1 (en) Solid-state imaging device, manufacturing method thereof, and electronic device
WO2015198876A1 (en) Imaging element, and electronic device
WO2016084629A1 (en) Solid state imaging element and electronic equipment
WO2022149488A1 (en) Light detection device and electronic apparatus
WO2022190867A1 (en) Imaging device and ranging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15811309

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15811309

Country of ref document: EP

Kind code of ref document: A1