WO2020093197A1 - Image sensor and control method therefor - Google Patents

Image sensor and control method therefor

Info

Publication number
WO2020093197A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel block
pixels
photoelectric conversion
conversion elements
charge
Prior art date
Application number
PCT/CN2018/113967
Other languages
French (fr)
Inventor
Makoto Monoi
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to CN201880003761.9A priority Critical patent/CN109804617B/en
Priority to PCT/CN2018/113967 priority patent/WO2020093197A1/en
Publication of WO2020093197A1 publication Critical patent/WO2020093197A1/en

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14603 Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • H01L27/14607 Geometry of the photosensitive area
    • H01L27/14609 Pixel-elements with integrated switching, control, storage or amplification elements
    • H01L27/14612 Pixel-elements with integrated switching, control, storage or amplification elements involving a transistor
    • H01L27/1462 Coatings
    • H01L27/14621 Colour filter arrangements
    • H01L27/14625 Optical elements or arrangements associated with the device
    • H01L27/14627 Microlenses
    • H01L27/14641 Electronic components shared by two or more pixel-elements, e.g. one amplifier shared by two pixel elements
    • H01L27/1464 Back illuminated imager structures
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/778 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising amplifiers shared between a plurality of pixels, i.e. at least one part of the amplifier must be on the sensor array itself

Definitions

  • the present disclosure relates to an image sensor and a control method therefor, and, more particularly, to an image sensor for use in the camera function or the like of a cellular phone and a control method therefor.
  • An image sensor such as a CMOS image sensor that converts light to an electric signal has a plurality of photoelectric conversion elements in a matrix form for each color of an array of color filters.
  • the photoelectric conversion elements can be implemented by, for example, photodiodes, and convert incident light to charges corresponding to the amount of light and store the charges. The stored charges are converted to a voltage, which is in turn converted to a digital signal for each corresponding color and output.
  • the quantity of photoelectric conversion elements disposed for each corresponding color is increasing. Accordingly, the quantity of pixels in a pixel block (a plurality of pixels corresponding to the photoelectric conversion elements for each color) also increases.
  • As the quantity of pixels included in a pixel block increases, however, the number of processing steps required for analog/digital conversion (hereinafter referred to as "ADC step") increases. The increase in the number of processing steps increases read noise.
  • the present disclosure provides an image sensor capable of increasing the processing speed while maintaining high sensitivity and high resolution, and a control method therefor.
  • a first aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
  • a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
  • a signal converter connected to the plurality of charge/voltage converters so as to collectively convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block,
  • each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels.
  • a plurality of pixel blocks are arranged in a matrix form.
  • the charge/voltage converter converts only charges output from the photoelectric conversion elements in one pixel block to voltages.
  • the charge/voltage converter is shared only by the photoelectric conversion elements in one pixel block.
  • the pixel block includes pixels arranged in a 3x3 matrix form.
  • the plurality of charge/voltage converters includes at least a first charge/voltage converter shared by two of the photoelectric conversion elements and a second charge/voltage converter shared by three of the photoelectric conversion elements.
  • the plurality of charge/voltage converters includes a first charge/voltage converter shared by three of the photoelectric conversion elements and a second charge/voltage converter shared by six of the photoelectric conversion elements.
  • the plurality of charge/voltage converters includes a first charge/voltage converter shared by two of the photoelectric conversion elements, a second charge/voltage converter shared by three of the photoelectric conversion elements, and a third charge/voltage converter shared by four of the photoelectric conversion elements.
  • the plurality of charge/voltage converters includes three charge/voltage converters each shared by three of the photoelectric conversion elements.
  • the pixel block includes pixels arranged in a 2x4 matrix form.
  • the plurality of charge/voltage converters includes two charge/voltage converters each shared by four of the photoelectric conversion elements.
  • the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
  • an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
  • the predetermined angle θ is 45 degrees.
  • one lens is stacked on adjacent two or four pixels included in the pixel block.
  • a second aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
  • at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
  • a signal converter connected to the at least one charge/voltage converter so as to collectively convert signals of voltages output from the at least one charge/voltage converter included in each pixel block,
  • each of the pixels in the pixel block corresponding to the one filter color is connected to the signal converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter.
  • a plurality of pixel blocks are arranged in a matrix form.
  • the at least one charge/voltage converter converts only charges output from the photoelectric conversion elements included in one pixel block to voltages.
  • the at least one charge/voltage converter is shared only by the photoelectric conversion elements included in one pixel block.
  • the pixel block includes an array of eight pixels.
  • individual rows of the pixel block include three pixels, two pixels and three pixels, respectively.
  • the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
  • an array of lenses with a size of 1/9 of a size of the pixel block is stacked on the pixel block.
  • a third aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
  • the pixel block includes pixels arranged in a 2x4 matrix form.
  • the image sensor includes a plurality of pixel blocks, one of the plurality of pixel blocks including eight pixels whose longitudinal direction is a horizontal direction, and another of the plurality of pixel blocks including eight pixels whose longitudinal direction is a vertical direction.
  • an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
  • the predetermined angle θ is 45 degrees.
  • one lens is stacked on adjacent two or four pixels included in the pixel block.
  • a fourth aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively.
  • a fifth aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
  • a sixth aspect provides an image sensor including:
  • a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form.
  • one lens is stacked on adjacent two or four pixels included in the pixel block.
  • a seventh aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the plurality of charge/voltage converters so as to convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block, the each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels, the method including the steps of:
  • An eighth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements corresponding to pixels constituting the pixel block, at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the at least one charge/voltage converter so as to convert signals of voltages output from the at least one charge/voltage converter included in each pixel block, wherein the each of the pixels in the pixel block corresponding to the one filter color is connected to the signal converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter, the method including the steps of:
  • a ninth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction, the method including the steps of:
  • a tenth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively, the method including the steps of:
  • An eleventh aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another, the method including the steps of:
  • a twelfth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form, the method including the steps of:
  • the foregoing configurations can increase the processing speed while maintaining high sensitivity and high resolution.
  • FIG. 1 is a block diagram including a configurational example of a solid-state image sensing device according to one embodiment.
  • Fig. 2 is a diagram showing a circuit included in a pixel array of a CMOS image sensor.
  • Fig. 3A is a diagram showing the configuration of a signal reading side of the CMOS image sensor.
  • Fig. 3B is a diagram showing a color filter array on a light-receiving side of the circuit in Fig. 3A.
  • Fig. 4A is a cross-sectional view of the pixel array taken along arrow IVA in Figs. 3A and 3B.
  • Fig. 4B is a diagram showing potentials of the pixel array shown in Fig. 4A.
  • Fig. 4C is a pulse timing chart of control signals.
  • Fig. 5 is a diagram showing an example of the color filter array.
  • Fig. 6 is a diagram showing an example of the color filter array.
  • Fig. 7 is a diagram showing an example of the color filter array.
  • Fig. 8A is a diagram showing a pixel array in which a pixel block for one filter color includes 2x2 pixels.
  • Fig. 8B is a diagram showing signals read out in an ADC step.
  • Fig. 8C is a pulse timing chart of control signals.
  • Fig. 9A is a diagram showing a pixel array in which a pixel block for one filter color includes 3x3 pixels.
  • Fig. 9B is a diagram showing signals read out in an ADC step.
  • Fig. 10A is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 10B is a diagram showing signals read out in an ADC step.
  • Fig. 11A is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 11B is a cross-sectional view taken along horizontal arrow XIB in Fig. 11A.
  • Fig. 12 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 13 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 14 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 15 is a diagram showing a color filter array according to one embodiment.
  • Fig. 16 is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 17A is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 17B is a cross-sectional view taken along vertical arrow XVIIB in Fig. 17A.
  • Fig. 17C is a cross-sectional view taken along horizontal arrow XVIIC in Fig. 17A.
  • Fig. 18 is a diagram showing a color filter array according to one embodiment.
  • Fig. 19A is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 19B is a cross-sectional view taken along horizontal arrow XIXB in Fig. 19A.
  • Fig. 20 is a diagram showing a color filter array according to one embodiment.
  • Fig. 21A is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 21B is a cross-sectional view taken along horizontal arrow XXIB in Fig. 21A.
  • Fig. 21C is a cross-sectional view taken along vertical arrow XXIC in Fig. 21A.
  • Fig. 22 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
  • Fig. 23A is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 23B is a cross-sectional view taken along vertical arrow XXIIIB in Fig. 23A.
  • Fig. 24A is a diagram for describing the operation of phase difference AF.
  • Fig. 24B is a diagram for describing the operation of phase difference AF.
  • Fig. 25 is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 26A is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 26B is a cross-sectional view taken along horizontal arrow XXVIB in Fig. 26A.
  • Fig. 27 is a diagram showing a lens part which is stacked on the pixel array.
  • Fig. 28 is a diagram showing a color filter array according to one embodiment.
  • Fig. 29 is a diagram showing a lens part according to one embodiment.
  • Fig. 30 is a diagram showing an example of a combination of a pixel array and a lens part.
  • Fig. 31 is a diagram showing a color filter array according to one embodiment.
  • Fig. 32 is a diagram showing an example of a combination of a color filter array and a lens part.
  • Fig. 33 is a diagram showing an example of a combination of a color filter array and a lens part.
  • Fig. 34A is a diagram showing the configuration of a signal reading side of a pixel array in Fig. 33.
  • Fig. 34B is a diagram showing an example of the layout of transistors in the pixel array in Fig. 34A.
  • Fig. 35 is a flowchart for a signal reading method according to one embodiment.
  • Fig. 1 is a block diagram including a configurational example of an image sensor according to this embodiment.
  • a CMOS image sensor 100 includes a pixel array 104.
  • the pixel array 104 has a plurality of pixel circuits arrayed in a two dimensional form (matrix form) of N rows by M columns.
  • a vertical scanning circuit 102 that supplies a pixel drive signal is disposed on one end side of the pixel array 104 (left side in the figure) .
  • the pixel array 104 and the vertical scanning circuit 102 are connected together by transfer gate (hereinafter also simply referred to as "TG") pulse signal lines 114.
  • signal converters 108 and a horizontal scanning circuit 110 which are connected to signal lines 116 of individual columns, are disposed on the lower end side of an imaging area (underside in the figure) .
  • the CMOS image sensor 100 includes a timing controller 106.
  • the timing controller 106 generates and outputs a master clock or a clock obtained by frequency-dividing the master clock.
  • the vertical scanning circuit 102, the signal converters 108 and the horizontal scanning circuit 110 are controlled in synchronism with the clock output from the timing controller 106.
  • the vertical scanning circuit 102 sets addresses and controls vertical scanning.
  • the signal converters 108 each perform signal conversion to convert, for example, an analog output from pixels to a digital output, and output the digital output to an output circuit 112.
  • the horizontal scanning circuit 110 sequentially selects the signal converters 108 in synchronism with the clock output from the timing controller 106, and reads a signal from the selected signal converter 108 and outputs the signal to the output circuit 112.
  • the output circuit 112 converts the digital outputs obtained through conversion in the signal converters 108 to signals corresponding to a color arrangement, and outputs the signals.
  • the converted signals are output to a display controller 120 for displaying an image on, for example, a display, and an AF controller 118 for controlling autofocusing (AF) .
  • the display controller 120 includes a digital signal processor (DSP) or the like.
  • the AF controller 118 includes a central processing unit (CPU) or the like.
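  • As a rough, illustrative model of the data flow just described (the class and method names below are assumptions for illustration, not taken from the patent), one frame readout could be sketched as follows.

```python
# Illustrative data-flow model of Fig. 1: the timing controller clocks the
# vertical scanning circuit 102 (row selection), the signal converters 108
# (A/D conversion) and the horizontal scanning circuit 110, which feeds the
# output circuit 112. All names and values are assumptions for illustration.

class CmosImageSensorModel:
    def __init__(self, num_rows, num_cols):
        self.num_rows = num_rows
        self.num_cols = num_cols

    def read_frame(self, analog_pixels):
        """analog_pixels: num_rows x num_cols matrix of pixel voltages."""
        frame = []
        for row in range(self.num_rows):  # vertical scanning: select one row at a time
            digital_row = [self._convert(v) for v in analog_pixels[row]]  # signal converters 108
            frame.append(digital_row)     # horizontal scanning + output circuit
        return frame

    @staticmethod
    def _convert(voltage, full_scale=1.0, bits=10):
        clipped = min(max(voltage, 0.0), full_scale)
        return round(clipped / full_scale * (2 ** bits - 1))

sensor = CmosImageSensorModel(num_rows=2, num_cols=3)
print(sensor.read_frame([[0.1, 0.5, 0.9], [0.2, 0.4, 0.8]]))
```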
  • Fig. 2 is a diagram showing a circuit included in the pixel array 104 of the CMOS image sensor 100 according to the embodiment.
  • TG pulse signal lines VTG1_n (hereinafter also simply referred to as "VTG1_n" or the like), VTG2_n, VTG3_n, VTG4_n, VTG1_n+1, VTG2_n+1, VTG3_n+1, and VTG4_n+1 correspond to the TG pulse signal lines 114 in Fig. 1.
  • signal lines Vsig in Fig. 2 correspond to the signal lines 116 in Fig. 1.
  • a pixel block 202 encircled by a broken line in Fig. 2 includes four photodiodes PD1_n,m (hereinafter also simply referred to as "PD1" or the like), PD2_n,m+1, PD3_n,m+1, and PD4_n,m, where m and n respectively indicate the column number and the row number of the pixel block included in the pixel array.
  • the photodiode serves as a photoelectric conversion element that converts incident light to charges whose quantity corresponds to the amount of the light.
  • the cathodes of the PD1_n,m, PD2_n,m+1, PD3_n,m+1, and PD4_n,m are respectively connected to transfer gates TG1_n,m (hereinafter also simply referred to as "TG1" or the like), TG2_n,m+1, TG3_n,m+1, and TG4_n,m.
  • the VTG1_n, VTG2_n, VTG3_n, and VTG4_n are respectively connected to the input parts of the gates of the TG1_n,m, TG2_n,m+1, TG3_n,m+1, and TG4_n,m.
  • the TG is turned on when a transfer signal from the VTG is input to its gate.
  • the drain of the TG shares a diffusion layer with the source of a reset transistor (hereinafter also simply referred to as "RS” or the like) to form a floating diffusion (hereinafter also simply referred to as "FD” or the like) .
  • the RS is turned on when a reset signal is input from a signal line VRS_n, resetting the potential of the FD to the potential of the power supply line.
  • the FD serves as a charge/voltage converter that converts charges to a voltage.
  • the FD has a capacitance Cfd for storing charges generated by an associated one of the PD1 to PD4, and is connected to the input part of the gate of an amplifier transistor (hereinafter also simply referred to as "AMP" ) .
  • the AMP is connected to the signal line Vsig via a select transistor (hereinafter also simply referred to as "SL” ) .
  • the SL is turned on when a control signal is input to its gate from a signal line VSL_n.
  • the signal line is connected with a constant current source (not shown) , so that when the SL is turned on, the AMP and the constant current source form a source follower circuit, and the AMP outputs a low impedance signal corresponding to the potential of the FD to a vertical signal line Vsig. Voltages output from the individual pixels via the vertical signal lines Vsig are output to the signal converters 108.
  • the signal converter 108 includes a CDS (Correlated Double Sampling) circuit (hereinafter also simply referred to as "CDS" ) and an analog/digital converting circuit (hereinafter also simply referred to as "ADC" ) .
  • the CDS holds a reset voltage and a signal voltage output from the FD by means of a sample/hold circuit (S/H) , and obtains the difference between those voltages.
  • the ADC collectively converts signals of analog voltages output from the CDS's to digital signals.
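  • The effect of the CDS and ADC described above can be illustrated with a minimal sketch; the function names, voltage values and converter parameters below are assumptions, not the patent's implementation.

```python
# Minimal sketch of correlated double sampling (CDS) followed by A/D conversion.
# Function names, voltages and converter parameters are illustrative assumptions.

def cds(reset_sample_v, signal_sample_v):
    """Return the reset-referenced signal: held signal sample minus held reset sample."""
    return signal_sample_v - reset_sample_v

def adc(voltage_v, full_scale_v=1.0, bits=10):
    """Quantize a voltage to an unsigned digital code with a simple ideal ADC."""
    clipped = min(max(voltage_v, 0.0), full_scale_v)
    return round(clipped / full_scale_v * (2 ** bits - 1))

# Example: 0.45 V held at reset, 0.75 V held after charge transfer.
print(adc(cds(reset_sample_v=0.45, signal_sample_v=0.75)))  # 307 for a 10-bit, 1 V range
```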
  • Fig. 3A is a diagram for describing PDs that share one FD shown in Fig. 2.
  • Fig. 3A shows the configuration of the signal reading side of the pixel array.
  • Fig. 3A shows two adjacent pixel blocks, each of which consists of the four PDs PD1 to PD4 as described above.
  • the four PD1 to PD4 share one FD via the TG1 to TG4, and the charges of the four PD1 to PD4 can be converted to voltages by this FD.
  • the four PDs share one circuit including the AMP, SL, RS, and FD.
  • Fig. 3B shows a color filter array on the light-receiving side of the pixel array in Fig. 3A.
  • Each of the pixel blocks corresponds to one color, namely, green (Gr, Gb) , red (R) , or blue (B) .
  • the green pixel is distinctly shown as Gr when the color of a pixel block horizontally adjoining to the green pixel is red, and as Gb when the color of the horizontally adjoining pixel block is blue.
  • Fig. 4A is a cross-sectional view of the pixel array taken along arrow IVA in Figs. 3A and 3B.
  • light is input from the underside in the figure to a pixel array 200 via a lens part 302 and a color filter 310 that passes light of a specific color, and charges are read out from the upper side in the figure.
  • the pixel array 200 is implemented on a single semiconductor substrate, and P-wells 306 are provided between the PD1 to PD4.
  • One PD includes a first PD 302_1 and PD 302_2 having different impurity concentrations, both combined to form a single n layer.
  • Fig. 4B is a diagram showing potentials of the pixel array shown in Fig. 4A.
  • Fig. 4C is a pulse timing chart of control signals for the pixel array in Fig. 4A. Next, the operation of the pixel array is described referring to Figs. 2 to 4.
  • the RS is turned on by the control signal from the signal line VRS_n, so that the voltage of the FD becomes the same as the voltage of the reset transistor.
  • Figs. 5 to 7 are diagrams for describing the relationship between the color filter array and the resolution.
  • Fig. 5 shows a Bayer array which is generally used in the color filter array of an image sensor.
  • Gr and R or B and Gb are alternately arrayed horizontally.
  • a pixel block corresponding to each filter color has one PD. Therefore, the length, a1, of one side of the pixel block is the same as the length, b1, of one side of one pixel corresponding to the PD.
  • Fig. 6 shows an example in which the pixel block corresponding to each filter color has two pixels (PD) vertically and two pixels horizontally (2x2) , so that the length, a2, of one side of the pixel block is twice the length, b2, of one side of one pixel.
  • the pixel block corresponding to each filter color has three pixels (PD) vertically and three pixels horizontally (3x3) , so that the length, a3, of one side of the pixel block is three times the length, b3, of one side of one pixel.
  • the sensitivity of the image sensor depends on the total area of the PDs corresponding to one filter color.
  • the length a1, a2, a3 of one side of the pixel block corresponds to the area of the PD, and has a relation of a1 < a2 < a3, so that with the sum of the signals from the 2x2 or 3x3 PDs, the sensitivities in Figs. 5 to 7 become Fig. 5 < Fig. 6 < Fig. 7.
  • the resolution of the image sensor depends on the size of the pixels corresponding to one filter color.
  • the length b1, b2, b3 of one side of one pixel has a relation of b1 > b2 > b3, so that the resolutions in Figs. 5 to 7 become Fig. 5 < Fig. 6 < Fig. 7.
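  • To make the trade-off concrete, the following sketch compares relative sensitivity (proportional to the pixel-block area a^2) and relative resolution (proportional to 1/b). The micrometer values are illustrative assumptions chosen only to satisfy a1 < a2 < a3 and b1 > b2 > b3; they are not taken from the patent.

```python
# Relative comparison following Figs. 5 to 7: sensitivity scales with the
# pixel-block area (a^2) for one filter color, resolution with 1/b (pixel size).
# The micrometer values are assumptions, not values from the patent.

layouts = {
    "Fig. 5 (1 pixel per color)": {"a_um": 2.0, "b_um": 2.0},
    "Fig. 6 (2x2 pixel block)":   {"a_um": 2.4, "b_um": 1.2},
    "Fig. 7 (3x3 pixel block)":   {"a_um": 3.0, "b_um": 1.0},
}

for name, dims in layouts.items():
    sensitivity = dims["a_um"] ** 2   # total photodiode area per color
    resolution = 1.0 / dims["b_um"]   # smaller pixels resolve finer detail
    print(f"{name}: relative sensitivity {sensitivity:.1f}, relative resolution {resolution:.2f}")
```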
  • the pixel array 200 shown in Fig. 8A includes pixel blocks 212 of three filter colors.
  • the pixel block for one filter color includes 2x2 or four pixels 214, each pixel including one PD.
  • the four PDs in the pixel block 212 for Gr share one FD via the TG1 to TG4.
  • One FD has a capacitance Cfd and can collectively convert the charges supplied from the four PDs sharing the FD to voltages.
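  • In other words, the charges of the four PDs are summed on the shared FD before readout; ignoring charge-transfer losses (a simplification, not a statement from the patent), the FD output voltage can be written as

$$V_{FD} = \frac{1}{C_{fd}} \sum_{i=1}^{4} Q_{\mathrm{PD}_i}$$

where $Q_{\mathrm{PD}_i}$ is the charge transferred from the i-th PD through its transfer gate.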
  • the FD of the Gr pixel block 212 and the FD of the B pixel block 212 are connected to a source follower circuit SFC including an AMP, and are connected to the ADC by a metal signal line Vsig.
  • Each ADC corresponds to a read channel, and the pixel array 200 can read signals separately via two read channels Ch. 1 and Ch. 2.
  • Fig. 8B shows signals read out in an ADC step.
  • the ADC treats the processing by all the ADC circuits from the beginning of the processing to the end thereof as one step.
  • each row corresponds to one ADC step, and the number of ADC steps is 2.
  • Fig. 8C is a pulse timing chart of control signals.
  • An ADC step 1 (AD1) starts at the rising of the pulse of the reset signal at time t2, and ends at the falling of the next pulse.
  • charges for Gr are read via the read channel Ch. 1
  • charges for R are read via the read channel Ch. 2.
  • an ADC step 2 starts at the end of the AD1 period, and ends at the falling of the select signal VSL_n.
  • charges for B are read via the read channel Ch. 1
  • charges for Gb are read via the read channel Ch. 2.
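  • The two-step schedule of Figs. 8B and 8C can be summarized in a small table; the sketch below (illustrative only, not part of the patent) records which color each read channel digitizes in each ADC step.

```python
# Readout schedule implied by Figs. 8B and 8C for 2x2 pixel blocks: the two read
# channels digitize in parallel, so all four colors need only two ADC steps.
schedule = {
    "AD1": {"Ch.1": "Gr", "Ch.2": "R"},
    "AD2": {"Ch.1": "B",  "Ch.2": "Gb"},
}

num_adc_steps = len(schedule)
colors_read = sorted({color for step in schedule.values() for color in step.values()})
print(num_adc_steps, colors_read)  # 2 ['B', 'Gb', 'Gr', 'R']
```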
  • Fig. 9A shows an example of a pixel array in which a pixel block for one filter color includes 3x3 pixels.
  • the PDs connected to the TG1, TG2, TG4 and TG5 share an FD1.
  • the PDs connected to the TG3 and TG6 share an FD3.
  • the PDs connected to the TG7 and TG8 share an FD2.
  • the PDs connected to the TG9 share an FD4.
  • the FD2 to FD4 are also shared by the PDs connected to the TGs for another filter color.
  • the FD1 and FD2 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • Fig. 9B shows the ADC step and signals read out in the ADC step in the pixel array shown in Fig. 9A.
  • One read channel can perform processing for one color in a single ADC step, and processes only the output from one source follower circuit SFC.
  • the number of ADC steps required to carry out ADC processing for pixel blocks of three colors in the pixel array 900 is six, namely, the AD1 to AD6.
  • In the pixel block including 2x2 pixels shown in Fig. 8, charges supplied from the pixel block are added in the charge domain for ADC, whereas in the example shown in Fig. 9, charges from the pixel block are distributed to different read channels or different ADC steps.
  • the signals from the Gr pixel block are separated into Gr1 and Gr2, and the signals from the B pixel block are separated into B1 to B4, and those signals are added color by color in the digital domain.
  • the pixel block including 3x3 pixels increases noise 1.4 to 2 times.
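  • The 1.4x to 2x figure is consistent with adding independently digitized signals: uncorrelated read noise adds in quadrature, so summing k separate conversions multiplies the noise by roughly sqrt(k). A minimal sketch, assuming equal and uncorrelated read noise per conversion:

```python
import math

# Read noise of a digital-domain sum of k independently digitized signals grows
# roughly as sqrt(k) relative to a single conversion of charge summed on one FD.
# sigma is an arbitrary per-conversion read-noise figure (an assumption).

def summed_read_noise(num_conversions, sigma=1.0):
    """Quadrature sum of equal, uncorrelated read-noise contributions."""
    return sigma * math.sqrt(num_conversions)

print(summed_read_noise(1))  # 1.0  : charge-domain sum, one conversion
print(summed_read_noise(2))  # ~1.41: e.g. Gr split into Gr1 + Gr2 in Fig. 9
print(summed_read_noise(4))  # 2.0  : e.g. B split into B1 to B4 in Fig. 9
```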
  • the present embodiment deals with the aforementioned increase in the number of ADC steps and an increase in noise.
  • FIG. 10A is a diagram showing the configuration of the signal reading side of the pixel array according to the first embodiment of the present disclosure.
  • a pixel array 1000 included in the image sensor has pixel blocks each having pixels (PDs) arranged in a 3x3 matrix form for each of the filter colors Gr, R, B and Gb.
  • One PD is disposed in correspondence to each pixel in the pixel block.
  • FD1 to FD4 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • each pixel (PD) in the pixel block corresponding to one filter color is connected to one ADC via an associated FD shared by each pixel. Accordingly, AD conversion of the signals from all the PDs in the pixel block can be performed in a single step.
  • FIG. 10B shows the ADC step and signals read out in the ADC step in the pixel array shown in Fig. 10A.
  • the number of ADC steps required to carry out ADC processing for pixel blocks of three colors in the pixel array 1000 is two, namely, the AD1 and the AD2. Therefore, the signals read out in the ADC steps become the same as those in the case of 2x2 pixels.
  • Fig. 11A is a diagram showing a lens part 1100 which is stacked on the pixel array 1000 in Fig. 10, and Fig. 11B is a cross-sectional view taken along horizontal arrow XIB in Fig. 11A.
  • As shown in Fig. 11B, light enters the lens part 1100 from the underside, and enters the PDs constituting the pixel array 1000 via a color filter 1110.
  • On-chip lenses 1102 included in the lens part 1100 are formed in accordance with the sizes of the pixels in the pixel array 1000.
  • a pixel block for one filter color is provided with a plurality of FDs that are shared only by pixels in the pixel block to enable signal reading in one ADC step.
  • Fig. 12 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure. Referring to a Gr pixel block in a pixel array 1200, six PDs connected to the TG1 to TG6 share the FD1. Three PDs connected to the TG7 to TG9 share the FD2. The FD1 and FD2 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • Fig. 13 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure.
  • In a Gr pixel block in a pixel array 1300, three PDs connected to the TG1 to TG3 share the FD1.
  • Four PDs connected to the TG4, TG5, TG7 and TG8 share the FD2.
  • Two PDs connected to the TG6 and TG9 share the FD3.
  • the FD1 to FD3 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • Fig. 14 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure.
  • In a Gr pixel block in a pixel array 1400, three PDs connected to the TG1, TG2 and TG4 share the FD1.
  • Three PDs connected to the TG3, TG5 and TG6 share the FD2.
  • Three PDs connected to the TG7 to TG9 share the FD3.
  • the FD1 to FD3 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • Fig. 15 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure.
  • the pixel block includes three rows of pixels, namely, three pixels, two pixels and three pixels from the top, a total of eight pixels 1403, arranged horizontally.
  • Fig. 16 is a diagram showing a lens part which is stacked on the pixel array in Fig. 15.
  • a lens part 1600 includes square-shaped on-chip lenses 1602 and rectangular on-chip lenses 1603 which are formed in accordance with the sizes of the pixels in the pixel array 1500.
  • FIG. 17A shows the configuration of the signal reading side of a pixel array according to this embodiment.
  • a pixel array 1700 included in the image sensor has pixel blocks each having pixels (PDs) arranged in a 3x3 matrix form for each of the filter colors Gr, R, B and Gb.
  • One photodiode is disposed in correspondence to each pixel in a pixel block 1702.
  • An FD is shared by photodiodes corresponding to two pixels 1706 and 1707 adjacent to this FD. Further, the photodiodes connected to the FD include photodiodes corresponding to six pixels 1703, 1704, 1705, 1708, 1709, and 1711 disposed apart from the FD.
  • Fig. 17B is a cross-sectional view taken along vertical arrow XVIIB in Fig. 17A
  • Fig. 17C is a cross-sectional view taken along horizontal arrow XVIIC in Fig. 17A.
  • light enters the lens part 1600 from the underside in the figures, and enters the PDs constituting the pixel array 1700 via a color filter 1710.
  • In a Gr pixel block in the pixel array 1700 shown in Fig. 17A, the photodiodes connected to the TG1 to TG9 share the FD.
  • the FD is connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • each pixel (PD) in the pixel block corresponding to one filter color is connected to one ADC via an associated FD shared by each pixel.
  • the PDs connected to the FD include PDs corresponding to pixels arranged apart from the FD. This allows AD conversion of the signals from all the PDs in the pixel block to be performed in a single step.
  • eight photodiodes share one FD.
  • the capacitance of the FD can be made small. Consequently, noise originating from signal reading can be made smaller.
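  • One way to see the benefit of a small FD capacitance is through the conversion gain q/Cfd: a smaller Cfd yields more microvolts per electron, so a given downstream voltage noise corresponds to fewer electrons. The capacitance values in the sketch below are illustrative assumptions only.

```python
# Conversion gain of a floating diffusion: CG = q / Cfd (volts per electron).
# The capacitance values are illustrative assumptions, not taken from the patent.
ELEMENTARY_CHARGE_C = 1.602e-19

def conversion_gain_uv_per_e(cfd_farads):
    """Microvolts of FD voltage produced per transferred electron."""
    return ELEMENTARY_CHARGE_C / cfd_farads * 1e6

print(round(conversion_gain_uv_per_e(2.0e-15), 1))  # 80.1 uV/e- for Cfd = 2.0 fF
print(round(conversion_gain_uv_per_e(1.0e-15), 1))  # 160.2 uV/e- for Cfd = 1.0 fF
```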
  • Fig. 18 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure.
  • a Gr pixel block 1802 includes four pentagonal pixels 1804, 1806, 1807 and 1809 adjacent to an FD. Further, the pixel block 1802 includes four square-shaped pixels 1803, 1805, 1808 and 1811 disposed apart from the FD.
  • Fig. 19A is a diagram showing a lens part which is stacked on the pixel array shown in Fig. 18.
  • a lens part 1900 including an array of square-shaped on-chip lenses 1902 whose size is 1/9 of the size of the pixel block is stacked on the pixel array 1800 in Fig. 19A.
  • Fig. 19B shows the cross section taken along horizontal arrow XIXB in Fig. 19A.
  • light enters the lens part 1900 from the underside in the figure, and enters the PDs constituting the pixel array 1800 via a color filter 1910.
  • the pixel block 1802 includes four pixels 1804, 1806, 1807 and 1809 adjacent to one another.
  • the pixel block 1802 further includes four pixels 1803, 1805, 1808 and 1811 disposed apart from one another.
  • Fig. 20 shows a color filter array according to one embodiment.
  • a Gr pixel block 2002 includes eight rectangular pixels 1903 whose longitudinal direction is a vertical direction.
  • Fig. 21A is a diagram showing a lens part which is stacked on the pixel array shown in Fig. 20.
  • a lens part 2100 including an array of square-shaped on-chip lenses 2102 whose size is 1/8 of the size of the pixel block is stacked on the pixel block 2002. It is to be noted that the lens part 2100 is disposed at an inclination of an angle θ to the pixel array 2000.
  • Fig. 21B is a cross-sectional view taken along horizontal arrow XXIB in Fig. 21A
  • Fig. 21C is a cross-sectional view taken along vertical arrow XXIC in Fig. 21A.
  • light enters the lens part 2100 from the underside, and enters the PDs constituting the pixel array 2000 via a color filter 1110. That is, light entering the on-chip lenses 2102_1 to 2102_8 enters PDs 2003_1 to 2003_8, respectively.
  • the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction.
  • Because the lens part 2100 including the square-shaped on-chip lenses 2102 is disposed at an inclination of an angle θ to the pixel array 2000, light can be uniformly sampled by dividing the light region into square regions. From the viewpoint of uniform sampling, it is preferable to set the angle θ to 45 degrees. Therefore, it is possible to reduce the influence of the difference between the resolution in the horizontal direction and the resolution in the vertical direction on the output image.
  • Fig. 22 shows the configuration of the signal reading side of the pixel array according to the present embodiment.
  • the PDs connected to the TG1, TG2, TG5 and TG6 share the FD1.
  • the PDs connected to the TG3, TG4, TG7 and TG8 share the FD2.
  • the FD1 and FD2 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
  • Fig. 23A shows a lens part which is stacked on the pixel arrays shown in Figs. 10 to 14, and Fig. 23B shows a cross section taken along vertical arrow XXIIIB in Fig. 23A.
  • light enters a lens part 2300, and enters PDs constituting the pixel array 1000 via a color filter 2310.
  • the lens part 2300 that is provided for the Gr pixel block is constituted by a combination of a rectangular on-chip lens 2304 stacked on two pixels adjoining horizontally, a square-shaped on-chip lens 2305 having the same size as the pixels, and an on-chip lens 2306 stacked on two pixels adjoining vertically.
  • the pixels on which the on-chip lens 2304 is stacked are constituted by two PDs, and can be used for both imaging and phase difference autofocus (AF) .
  • Figs. 24A and 24B are diagrams for describing the operation of phase difference AF.
  • when the PDs adjoining horizontally operate as a phase difference AF sensor, a flux of light entering from the imaging lens is received separately by the two PDs, as shown in Figs. 24A and 24B.
  • the AF controller 118 in Fig. 1 detects signals from the two PDs, and measures a deviation of the focus based on the interval between two images formed on the image sensor. Then, the optical system is controlled based on the measured deviation to bring the optical system into focus.
  • the adjoining PDs are combined with other PDs in the pixel block to form pixels of one color.
  • the on-chip lens 2304 can be used to detect a phase difference in the horizontal direction
  • the on-chip lens 2306 can be used to detect a phase difference in the vertical direction.
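  • A common way to turn the two sub-pixel images into a focus measure is to estimate their relative shift, for example with a sum-of-absolute-differences search; the sketch below is a generic illustration of that idea, not the AF controller's actual algorithm.

```python
# Generic disparity estimate between the "left" and "right" PD line signals of a
# phase-difference AF pair, using a sum-of-absolute-differences search.
# Illustrative only; not the AF controller's actual algorithm.

def estimate_shift(left, right, max_shift=4):
    """Return the integer shift that best aligns `right` with `left`."""
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + shift])
                 for i in range(len(left)) if 0 <= i + shift < len(right)]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift

left = [10, 12, 30, 80, 30, 12, 10, 10]
right = [10, 10, 12, 30, 80, 30, 12, 10]      # same profile, displaced by one pixel
print(estimate_shift(left, right))            # 1: the measured image separation
```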
  • Fig. 25 shows a lens part which is stacked on the pixel arrays shown in Figs. 10 to 14.
  • a part corresponding to a Gr pixel block is constituted by a combination of a rectangular on-chip lens 2502 stacked on 2x2 or four pixels, and a square-shaped on-chip lens 2503 having the same size as the pixels.
  • the pixels on which the on-chip lens 2502 is stacked are constituted by four PDs, and can be used for both imaging and phase difference autofocus (AF) .
  • when the PDs adjoining horizontally operate as a phase difference AF sensor, a flux of light entering from the imaging lens is separately received by the two PDs, as shown in Figs. 24A and 24B.
  • the AF controller 118 in Fig. 1 detects signals from the two PDs, and measures a deviation of the focus based on the interval between two images formed on the image sensor. Then, the optical system is controlled based on the measured deviation to bring the optical system into focus.
  • the two PDs adjoining vertically are likewise used for phase difference AF control.
  • the adjoining PDs are combined with other PDs in the pixel block to form pixels of one color.
  • Fig. 26A is a diagram showing a lens part which is stacked on the pixel array 2000 shown in Fig. 20.
  • a lens part 2600 including an array of four on-chip lenses 2603 whose size is 1/4 of the size of the pixel block is stacked on a pixel block 2602.
  • Fig. 26B shows a cross section taken along horizontal arrow XXVIB in Fig. 26A. As shown in Figs. 26A and 26B, light enters the lens part 2600, and enters the PDs constituting the pixel array 2000 via a color filter 2610.
  • the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction.
  • one on-chip lens 2603 is stacked on two PDs, which brings about an advantage such that the PDs can be used as a phase difference AF sensor as described above.
  • the combination of the on-chip lens 2603 and two PDs enables detection of a phase difference in the horizontal direction.
  • Fig. 27 is a diagram showing a lens part which is stacked on the pixel array 2000 shown in Fig. 20.
  • two on-chip lenses 2703 provided for a Gr pixel block 2702 are each stacked on 2x2 or four pixels.
  • the pixels on which the on-chip lens 2703 is stacked are constituted by four PDs, and can be used for both imaging and phase difference autofocus (AF) .
  • the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction.
  • one on-chip lens 2703 is stacked on four PDs, which brings about an advantage such that the PDs can be used as a phase difference AF sensor as described above.
  • Fig. 28 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure.
  • a Gr pixel block 2802 includes eight rectangular pixels 2803 whose longitudinal direction is a horizontal direction.
  • Fig. 29 shows a lens part 2900 which is stacked on the pixel array 2800 shown in Fig. 28, and Fig. 30 shows an example of a combination of the lens part 2900 and the pixel array 2800.
  • the lens part 2900 is formed so that four square-shaped on-chip lenses 2902 are included in a pixel block 2802.
  • the longitudinal direction of the pixels 2803 is a horizontal direction, so that the resolution in the horizontal direction becomes lower than that in the vertical direction.
  • stacking one on-chip lens 2902 on two PDs brings about an advantage such that the PDs can be used as a phase difference AF sensor as described above.
  • the combination of the on-chip lens 2902 and two PDs enables detection of a phase difference in the vertical direction.
  • Fig. 31 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure.
  • a Gr pixel block 3002 includes four rectangular pixels 3003 whose longitudinal direction is a horizontal direction, and four rectangular pixels 3004 whose longitudinal direction is a vertical direction.
  • Fig. 32 shows an example of a combination of the lens part 2900 shown in Fig. 29 and the color filter array shown in Fig. 31.
  • the lens part 2900 is formed so that four square-shaped on-chip lenses 2902 are included in the pixel block 3002.
  • the on-chip lens 2902 is stacked on two pixels 3003 whose longitudinal direction is a horizontal direction, and the on-chip lens 2904 is stacked on two pixels 3004 whose longitudinal direction is a vertical direction.
  • the combination of the on-chip lens 2902 and two PDs enables detection of a phase difference in the vertical direction.
  • the combination of the on-chip lens 2904 and two PDs enables detection of a phase difference in the horizontal direction.
  • Fig. 33 shows a combination of a color filter array on the light receiving side of a pixel array and a lens part according to another embodiment of the present disclosure.
  • a lens part 3300 combines the combination 2600 of the pixels and the lens part shown in Fig. 26A and the combination 2900 of the pixels and the lens part shown in Fig. 30.
  • Fig. 34A shows the configuration of the signal reading side of the pixel array shown in Fig. 33.
  • In an upper left region 3402 of a pixel array 3400, in a Gr pixel block 3406, PDs connected to four TGs share one FD. Two FDs included in the Gr pixel block and two FDs included in the B pixel block are connected to one ADC for the read channel Ch. 1 via one source follower circuit SFC.
  • a region 3404 is a configurational example of the signal reading side of the pixel array having the color filter array shown in Fig. 28.
  • Fig. 34B shows an example of the layout of transistors in the pixel array in Fig. 34A.
  • two FDs included in the Gr pixel block and two FDs included in the B pixel block are connected to the source follower circuit SFC including an AMP, SL and RS provided between those two pixel blocks.
  • Referring to the flowchart of Fig. 35, a variable i is initialized to 0 (S3501).
  • signal charges are read out from FDs of an ith row for each read channel (S3503) .
  • signal charges stored in all the FDs included in the pixel block for one filter color are supplied to the vertical signal lines via the SFC circuit.
  • ADC is performed on the read signals (S3504) .
  • i+1 is set to i (S3505) .
  • the processing of the steps S3503 to S3505 is repeated N times, where N is the number of rows of the pixel block (S3502 to S3506).
  • the rows from which pixel signals are read out may be different for different read channels.
  • signals from the pixels may be read out in such a way that the read channel Ch. 1 reads signals from the pixels of the nth row, the read channel Ch. 2 reads signals from the pixels of the (n+1) th row, and so forth.
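  • The flow of Fig. 35 (S3501 to S3506), including the per-channel row offset mentioned above, might be sketched as follows; the readout and conversion routines are placeholders, not the patent's implementation.

```python
# Row-by-row readout loop following Fig. 35 (S3501 to S3506). read_row_from_fds
# and adc_convert are placeholders; the per-channel row offset reflects the
# remark that different read channels may read different rows (e.g. n and n+1).

def read_row_from_fds(channel, row):
    return f"analog({channel}, row {row})"        # placeholder for S3503

def adc_convert(analog_signal):
    return f"digital[{analog_signal}]"            # placeholder for S3504

def read_frame(num_rows, channels=("Ch.1", "Ch.2")):
    frame = []
    i = 0                                          # S3501: initialize row index
    while i < num_rows:                            # S3502/S3506: loop over the N rows
        row_result = {}
        for offset, channel in enumerate(channels):
            row = (i + offset) % num_rows          # Ch.1 reads row i, Ch.2 reads row i+1
            row_result[channel] = adc_convert(read_row_from_fds(channel, row))
        frame.append(row_result)
        i += 1                                     # S3505: i <- i + 1
    return frame

print(read_frame(num_rows=2))
```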
  • a pixel array according to one embodiment may be combined with on-chip lenses according to another embodiment.
  • a lens part 2100 including an array of square-shaped on-chip lenses 2102 whose size is 1/8 of the size of the pixel block in Fig. 21A may be stacked on the pixel array indicated in any one of the eighth to thirteenth embodiments.
  • the lens part 2100 may be disposed at an inclination of an angle θ to the pixel array 2000.

Abstract

Provided are an image sensor capable of increasing the processing speed while maintaining high sensitivity and high resolution, and a control method therefor. The image sensor includes: a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, one photoelectric conversion element corresponding to each of pixels constituting the pixel block; a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements included in the pixel block to voltages; and a signal converter connected to the plurality of charge/voltage converters so as to collectively convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block. In one embodiment, an image sensor includes at least one charge/voltage converter shared by at least one of photoelectric conversion elements corresponding to pixels arranged apart from the at least one charge/voltage converter.

Description

IMAGE SENSOR AND CONTROL METHOD THEREFOR Technical Field
The present disclosure relates to an image sensor and a control method therefor, and, more particularly, to an image sensor for use in the camera function or the like of a cellular phone and a control method therefor.
Background Art
An image sensor such as a CMOS image sensor that converts light to an electric signal has a plurality of photoelectric conversion elements in a matrix form for each color of an array of color filters. The photoelectric conversion elements can be implemented by, for example, photodiodes, and convert incident light to charges corresponding to the amount of light and store the charges. The stored charges are converted to a voltage, which is in turn converted to a digital signal for each corresponding color and output.
Recently, to meet a demand for highly sensitive and high-resolution image sensors, the quantity of photoelectric conversion elements disposed for each corresponding color is increasing. Accordingly, the quantity of a plurality of pixels (pixel block) corresponding to the photoelectric conversion elements for each color also increases.
As the quantity of pixels included in a pixel block increases, however, the number of processing steps required for analog/digital conversion (hereinafter referred to as “ADC step”) increases. The increase in the number of processing steps increases the readout noise.
Summary of Invention
The present disclosure provides an image sensor capable of increasing the processing speed while maintaining high sensitivity and high resolution, and a control method therefor.
A first aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
a signal converter connected to the plurality of charge/voltage converters so as to collectively convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block,
the each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels.
According to a first possible implementation of the first aspect, a plurality of pixel blocks are arranged in a matrix form.
In the first aspect or the first possible implementation of the first aspect, according to a second possible implementation of the first aspect, the charge/voltage converter converts only charges output from the photoelectric conversion elements in one pixel block to voltages.
In the first aspect or the first or second possible implementation of the first aspect, according to a third possible implementation of the first aspect, the charge/voltage converter is shared only by the photoelectric conversion elements in one pixel block.
In the first aspect or any one of the first to third possible implementations of the first aspect, according to a fourth possible implementation of the first aspect, the pixel block includes pixels arranged in a 3x3 matrix form.
In the fourth possible implementation of the first aspect, according to a fifth possible implementation of the first aspect, the plurality of charge/voltage converters includes at least a first charge/voltage converter shared by two of the photoelectric conversion elements and a second charge/voltage converter shared by three of the photoelectric conversion elements.
In the fourth possible implementation of the first aspect, according to a sixth possible implementation of the first aspect, the plurality of charge/voltage converters includes a first charge/voltage converter shared by three of the photoelectric conversion elements and a second charge/voltage converter shared by six of the photoelectric conversion elements.
In the fourth possible implementation of the first aspect, according to a seventh possible implementation of the first aspect, the plurality of charge/voltage converters includes a first charge/voltage converter shared by two of the photoelectric conversion elements, a second charge/voltage converter shared by three of the photoelectric conversion elements, and a third charge/voltage converter shared by four of the photoelectric conversion elements.
In the fourth possible implementation of the first aspect, according to an eighth possible implementation of the first aspect, the plurality of charge/voltage converters includes three charge/voltage converters each shared by three of the photoelectric conversion elements.
In the first aspect or any one of the first to third possible implementations of the first aspect, according to a ninth possible implementation of the first aspect, the pixel block includes pixels arranged in a 2x4 matrix form.
In the ninth possible implementation of the first aspect, according to a tenth possible implementation of the first aspect, the plurality of charge/voltage converters includes two charge/voltage converters each shared by four of the photoelectric conversion elements.
In the ninth or tenth possible implementation of the first aspect, according to an eleventh possible implementation of the first aspect, the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
In any one of the ninth to eleventh possible implementations of the first aspect, according to a twelfth possible implementation of the first aspect, an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
In the twelfth possible implementation of the first aspect, according to a thirteenth possible implementation of the first aspect, the predetermined angle θ is 45 degrees.
In any one of the fourth to thirteenth possible implementations of the first aspect, according to a fourteenth possible implementation of the first aspect, one lens is stacked on adjacent two or four pixels included in the pixel block.
A second aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
a signal converter connected to the at least one charge/voltage converter so as to collectively convert signals of voltages output from the at least one charge/voltage converter included in each pixel block,
wherein the each of the pixels in the pixel block corresponding to the one filter color is connected to the signal  converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter.
According to a first possible implementation of the second aspect, a plurality of pixel blocks are arranged in a matrix form.
In the second aspect or the first possible implementation of the second aspect, according to a second possible implementation of the second aspect, the at least one charge/voltage converter converts only charges output from the photoelectric conversion elements included in one pixel block to voltages.
In the second aspect or the first or second possible implementation of the second aspect, according to a third possible implementation of the second aspect, the at least one charge/voltage converter is shared only by the photoelectric conversion elements included in one pixel block.
In the second aspect or any one of the first to third possible implementations of the second aspect, according to a fourth possible implementation of the second aspect, the pixel block includes an array of eight pixels.
In the second aspect or the fourth possible implementation of the second aspect, according to a fifth possible implementation of the second aspect, individual rows  of the pixel block include three pixels, two pixels and three pixels, respectively.
In the second aspect or the fourth possible implementation of the second aspect, according to a sixth possible implementation of the second aspect, the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
In the second aspect or any one of the first to fifth possible implementations of the second aspect, according to a seventh possible implementation of the second aspect, an array of lenses with a size of 1/9 of a size of the pixel block is stacked on the pixel block.
A third aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
According to a first possible implementation of the third aspect, the pixel block includes pixels arranged in a 2x4 matrix form.
In the first possible implementation of the third aspect, according to a second possible implementation of the third aspect, the image sensor includes a plurality of pixel blocks, one of the plurality of pixel blocks including eight pixels  whose longitudinal direction is a horizontal direction, and another of the plurality of pixel blocks including eight pixels whose longitudinal direction is a vertical direction.
In the third aspect or the first or second possible implementation of the third aspect, according to a third possible implementation of the third aspect, an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
In the third possible implementation of the third aspect, according to a fourth possible implementation of the third aspect, the predetermined angle θ is 45 degrees.
In the third aspect or the first or second possible implementation of the third aspect, according to a fifth possible implementation, one lens is stacked on adjacent two or four pixels included in the pixel block.
A fourth aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively.
A fifth aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively  corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
A sixth aspect provides an image sensor including:
a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form.
According to a first possible implementation of the sixth aspect, one lens is stacked on adjacent two or four pixels included in the pixel block.
A seventh aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the plurality of charge/voltage converters so as to convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block, the each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels, the method including the steps of:
collectively supplying signals of voltages output from the plurality of charge/voltage converters included in a first pixel block to the signal converter; and
collectively supplying signals of voltages output from the plurality of charge/voltage converters included in a second pixel block to the signal converter.
An eighth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements corresponding to pixels constituting the pixel block, at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the at least one charge/voltage converter so as to convert signals of voltages output from the at least one charge/voltage converter included in each pixel block, wherein the each of the pixels in the pixel block corresponding to the one filter color is connected to the signal converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter, the method including the steps of:
collectively supplying signals of voltages output from the at least one charge/voltage converter included in a first pixel block to the signal converter; and
collectively supplying signals of voltages output from the at least one charge/voltage converter included in a second pixel block to the signal converter.
A ninth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction, the method including the steps of:
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
A tenth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively, the method including the steps of:
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
An eleventh aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another, the method including the steps of:
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
A twelfth aspect provides a signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form, the method including the steps of:
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
The foregoing configurations can increase the processing speed while maintaining high sensitivity and high resolution.
Brief Description of Drawings
[Fig. 1] Fig. 1 is a block diagram including a configurational example of a solid-state image sensing device according to one embodiment.
[Fig. 2] Fig. 2 is a diagram showing a circuit included in a pixel array of a CMOS image sensor.
[Fig. 3A] Fig. 3A is a diagram showing the configuration of a signal reading side of the CMOS image sensor.
[Fig. 3B] Fig. 3B is a diagram showing a color filter array on a light-receiving side of the circuit in Fig. 3A.
[Fig. 4A] Fig. 4A is a cross-sectional view of the pixel array taken along arrow IVA in Figs. 3A and 3B.
[Fig. 4B] Fig. 4B is a diagram showing potentials of the pixel array shown in Fig. 4A.
[Fig. 4C] Fig. 4C is a pulse timing chart of control signals.
[Fig. 5] Fig. 5 is a diagram showing an example of the color filter array.
[Fig. 6] Fig. 6 is a diagram showing an example of the color filter array.
[Fig. 7] Fig. 7 is a diagram showing an example of the color filter array.
[Fig. 8A] Fig. 8A is a diagram showing a pixel array in which a pixel block for one filter color includes 2x2 pixels.
[Fig. 8B] Fig. 8B is a diagram showing signals read out in an ADC step.
[Fig. 8C] Fig. 8C is a pulse timing chart of control signals.
[Fig. 9A] Fig. 9A is a diagram showing a pixel array in which a pixel block for one filter color includes 3x3 pixels.
[Fig. 9B] Fig. 9B is a diagram showing signals read out in an ADC step.
[Fig. 10A] Fig. 10A is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 10B] Fig. 10B is a diagram showing signals read out in an ADC step.
[Fig. 11A] Fig. 11A is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 11B] Fig. 11B is a cross-sectional view taken along horizontal arrow XIB in Fig. 11A.
[Fig. 12] Fig. 12 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 13] Fig. 13 is a diagram showing the  configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 14] Fig. 14 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 15] Fig. 15 is a diagram showing a color filter array according to one embodiment.
[Fig. 16] Fig. 16 is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 17A] Fig. 17A is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 17B] Fig. 17B is a cross-sectional view taken along vertical arrow XVIIB in Fig. 17A.
[Fig. 17C] Fig. 17C is a cross-sectional view taken along horizontal arrow XVIIC in Fig. 17A.
[Fig. 18] Fig. 18 is a diagram showing a color filter array according to one embodiment.
[Fig. 19A] Fig. 19A is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 19B] Fig. 19B is a cross-sectional view taken along horizontal arrow XIXB in Fig. 19A.
[Fig. 20] Fig. 20 is a diagram showing a color filter array according to one embodiment.
[Fig. 21A] Fig. 21A is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 21B] Fig. 21B is a cross-sectional view taken  along horizontal arrow XXIB in Fig. 21A.
[Fig. 21C] Fig. 21C is a cross-sectional view taken along vertical arrow XXIC in Fig. 21A.
[Fig. 22] Fig. 22 is a diagram showing the configuration of a signal reading side of a pixel array according to one embodiment.
[Fig. 23A] Fig. 23A is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 23B] Fig. 23B is a cross-sectional view taken along vertical arrow XXIIIB in Fig. 23A.
[Fig. 24A] Fig. 24A is a diagram for describing the operation of phase difference AF.
[Fig. 24B] Fig. 24B is a diagram for describing the operation of phase difference AF.
[Fig. 25] Fig. 25 is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 26A] Fig. 26A is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 26B] Fig. 26B is a cross-sectional view taken along horizontal arrow XXVIB in Fig. 26A.
[Fig. 27] Fig. 27 is a diagram showing a lens part which is stacked on the pixel array.
[Fig. 28] Fig. 28 is a diagram showing a color filter array according to one embodiment.
[Fig. 29] Fig. 29 is a diagram showing a lens part according to one embodiment.
[Fig. 30] Fig. 30 is a diagram showing an example of  a combination of a pixel array and a lens part.
[Fig. 31] Fig. 31 is a diagram showing a color filter array according to one embodiment.
[Fig. 32] Fig. 32 is a diagram showing an example of a combination of a color filter array and a lens part.
[Fig. 33] Fig. 33 is a diagram showing an example of a combination of a color filter array and a lens part.
[Fig. 34A] Fig. 34A is a diagram showing the configuration of a signal reading side of a pixel array in Fig. 33.
[Fig. 34B] Fig. 34B is a diagram showing an example of the layout of transistors in the pixel array in Fig. 34A.
[Fig. 35] Fig. 35 is a flowchart for a signal reading method according to one embodiment.
Description of Embodiments
To make persons skilled in the art understand the technical solutions in the present disclosure better, the following clearly and completely describes the technical solutions in the embodiments of the present disclosure with reference to the accompanying drawings in the embodiments of the present disclosure. Apparently, the described embodiments are merely some rather than all of the embodiments of the present disclosure. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present invention.
(First Embodiment)
To begin with, the operational principle of the present embodiment is described with reference to Figs. 1 to 9.
Fig. 1 is a block diagram including a configurational example of an image sensor according to this embodiment.
The CMOS image sensor 100 includes a pixel array 104. The pixel array 104 has a plurality of pixel circuits arrayed in a two-dimensional form (matrix form) of N rows by M columns. A vertical scanning circuit 102 that supplies a pixel drive signal is disposed on one end side of the pixel array 104 (left side in the figure). The pixel array 104 and the vertical scanning circuit 102 are connected together by transfer gate (hereinafter also simply referred to as "TG") pulse signal lines 114. In addition, signal converters 108 and a horizontal scanning circuit 110, which are connected to signal lines 116 of individual columns, are disposed on the lower end side of an imaging area (underside in the figure).
The CMOS image sensor 100 includes a timing controller 106. The timing controller 106 generates and outputs a master clock or a clock obtained by frequency-dividing the master clock. The vertical scanning circuit 102, the signal converters 108 and the horizontal scanning circuit 110 are controlled in synchronism with the clock output from the timing controller 106.
The vertical scanning circuit 102 sets addresses and controls vertical scanning. The signal converters 108 each perform signal conversion to convert, for example, an analog output from the pixels to a digital output, and output the digital output to an output circuit 112. The horizontal scanning circuit 110 sequentially selects the signal converters 108 in synchronism with the clock output from the timing controller 106, and reads a signal from the selected signal converter 108 and outputs the signal to the output circuit 112.
The output circuit 112 converts the digital outputs obtained through conversion in the signal converters 108 to signals corresponding to a color arrangement, and outputs the signals. The converted signals are output to a display controller 120 for displaying an image on, for example, a display, and an AF controller 118 for controlling autofocusing (AF) . The display controller 120 includes a digital signal processor (DSP) or the like. The AF controller 118 includes a central processing unit (CPU) or the like.
Fig. 2 is a diagram showing a circuit included in the pixel array 104 of the CMOS image sensor 100 according to the embodiment. In Fig. 2, TG pulse signal lines VTG1_n (hereinafter also simply referred to as "VTG1_n" or the like) , VTG2_n, VTG3_n, VTG4_n, VTG1_n+1, VTG2_n+1, VTG3_n+1, and VTG4_n+1 correspond to the TG pulse signal lines 114 in Fig. 1. Further, signal lines Vsig in Fig. 2 correspond to the signal lines 116 in Fig. 1.
A pixel block 202 encircled by a broken line in Fig. 2 includes four photodiodes PD1_n, m (hereinafter also simply referred to as "PD1" or the like) , PD2_n, m+1, PD3_n, m+1, and PD4_n, m, where m and n respectively indicate the column number and the row number of the pixel block included in the pixel array. It is to be noted that although the present embodiment relates to an example which has nine photodiodes for one filter color to form one pixel block, as described later referring to Fig. 10 or the like, four photodiodes are included here for the purpose of descriptive simplicity.
The photodiode serves as a photoelectric conversion element that converts incident light to charges whose quantity corresponds to the amount of the light. The cathodes of the PD1_n, m, PD2_n, m+1, PD3_n, m+1, and PD4_n, m are respectively connected to transfer gates TG1_n, m (hereinafter also simply referred to as "TG1" or the like) , TG2_n, m+1, TG3_n, m+1, and TG4_n, m. The VTG1_n, VTG2_n, VTG3_n, and VTG4_n are respectively connected to the input parts of the gates of the TG1_n, m, TG2_n, m+1, TG3_n, m+1, and TG4_n, m. The TG is turned on when a transfer signal from the VTG is input to its gate. The drain of the TG shares a diffusion layer with the source of a reset transistor (hereinafter also simply referred to as "RS" or the like) to form a floating diffusion (hereinafter also simply referred to as "FD" or the like) . When the TG is turned on, the charges stored by photoelectric conversion of the PD are transferred to the FD.
The RS is turned on when a reset signal is input from a signal line VRS_n, resetting the potential of the FD to the potential of the power supply line.
The FD serves as a charge/voltage converter that converts charges to a voltage. The FD has a capacitance Cfd for storing  charges generated by an associated one of the PD1 to PD4, and is connected to the input part of the gate of an amplifier transistor (hereinafter also simply referred to as "AMP" ) . The AMP is connected to the signal line Vsig via a select transistor (hereinafter also simply referred to as "SL" ) . The SL is turned on when a control signal is input to its gate from a signal line VSL_n. The signal line is connected with a constant current source (not shown) , so that when the SL is turned on, the AMP and the constant current source form a source follower circuit, and the AMP outputs a low impedance signal corresponding to the potential of the FD to a vertical signal line Vsig. Voltages output from the individual pixels via the vertical signal lines Vsig are output to the signal converters 108. The signal converter 108 includes a CDS (Correlated Double Sampling) circuit (hereinafter also simply referred to as "CDS" ) and an analog/digital converting circuit (hereinafter also simply referred to as "ADC" ) . The CDS holds a reset voltage and a signal voltage output from the FD by means of a sample/hold circuit (S/H) , and obtains the difference between those voltages. The ADC collectively converts signals of analog voltages output from the CDS's to digital signals.
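For illustration only, the correlated double sampling and quantization described above can be summarized by the following minimal Python sketch. The voltage levels, full-scale range and bit depth are placeholder assumptions and are not taken from this disclosure.

    def correlated_double_sampling(v_reset, v_signal):
        # The CDS output is the difference between the held reset level and the
        # held signal level, which cancels the reset offset of the FD.
        return v_reset - v_signal

    def adc(v, full_scale=1.0, bits=10):
        # Ideal ADC model: clamp the CDS output and quantize it to a digital code.
        v = max(0.0, min(v, full_scale))
        return int(v / full_scale * (2 ** bits - 1))

    # Example: reset level 0.9 V, signal level 0.4 V after charge transfer.
    print(adc(correlated_double_sampling(0.9, 0.4)))   # -> 511 (about half scale)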
Fig. 3A is a diagram for describing PDs that share one FD shown in Fig. 2. Fig. 3A shows the configuration of the signal reading side of the pixel array. Fig. 3A shows two adjacent pixel blocks, each of which consists of four PD1 to PD4 as described above. The four PD1 to PD4 share one FD via the TG1 to TG4, and the charges of the four PD1 to PD4 can be converted to voltages by this FD. According to the present embodiment, by appropriately determining, for such a single FD, the quantity and the arrangement (connection relation) of the PDs sharing the FD, it is possible to suppress an increase in the number of ADC steps which would otherwise originate from an increase in the quantity of pixels (PDs), as described later referring to Fig. 10 or the like.
Referring to Fig. 3A, eight PDs share one circuit including the AMP, SL, RS, and FD.
Fig. 3B shows a color filter array on the light-receiving side of the pixel array in Fig. 3A. Each of the pixel blocks corresponds to one color, namely, green (Gr, Gb) , red (R) , or blue (B) . The green pixel is distinctly shown as Gr when the color of a pixel block horizontally adjoining the green pixel is red, and as Gb when the color of the horizontally adjoining pixel block is blue.
Fig. 4A is a cross-sectional view of the pixel array taken along arrow IVA in Figs. 3A and 3B. In Fig. 4A, light is input from the underside in the figure to a pixel array 200 via a lens part 302 and a color filter 310 that passes light of a specific color, and charges are read out from the upper side in the figure. The pixel array 200 is implemented on a single semiconductor substrate, and P-wells 306 are provided between the PD1 to PD4. One PD includes a first PD 302_1 and a second PD 302_2 having different impurity concentrations, both combined to form a single n layer.
Fig. 4B is a diagram showing potentials of the pixel array shown in Fig. 4A. In the figure, the PD1 and PD4 on the left side correspond to a pixel block of the nth row, and the PD1 and PD4 on the right side correspond to a pixel block of the (n+1) th row, with the subscripts m and n being omitted. Fig. 4C is a pulse timing chart of control signals for the pixel array in Fig. 4A. Next, the operation of the pixel array is described referring to Figs. 2 to 4.
First, while no light is incident, charges generated in the photodiodes by a dark current or the like are removed. At time t1, control signals from the VTG1_n to VTG4_n go to a high level, turning on the TG1, TG2, TG3 and TG4 of the nth row. At the same time, when the control signal from the signal line VRS_n goes to a high level, the RS is turned on. As a result, the charges of the PD1, PD2, PD3 and PD4 of the nth row flow via the FD to the RS. Consequently, the voltages of the PD1, PD2, PD3 and PD4 are reset.
Next, when the control signals from the VTG1_n+1 to VTG4_n+1 go to a high level, the TG1, TG2, TG3 and TG4 of the (n+1) th row are turned on. At the same time, when the control signal from the signal line VRS_n goes to a high level, the RS is turned on. As a result, the charges of the PD1, PD2, PD3 and PD4 of the (n+1) th row flow via the FD to the RS. Consequently, the voltages of the PD1, PD2, PD3 and PD4 are reset.
Next, at time t2, the RS is turned on by the control signal from the signal line VRS_n, so that the voltage of the FD becomes the same as the voltage of the reset transistor.
At time t3, which is a point of time after a predetermined charge integration period Tint has passed from the time t1, when the control signals from the signal lines VTG1_n to VTG4_n go to a high level, the TG1, TG2, TG3 and TG4 of the nth row are turned on. As a result, charges of the PD1 to PD4 flow into the FD of the nth row. At the same time, when the control signal from the signal line VSL_n goes to a high level, the SL is turned on. As a result, the charges stored in the FD of the nth row are transferred via the AMP and the SL to the signal line Vsig. This signal is supplied to the ADC.
Subsequently, at time t4, when control signals from the VTG1_n+1 to VTG4_n+1 go to a high level while the SL keeps its on-state, the TG1, TG2, TG3 and TG4 of the (n+1) th row are turned on. As a result, charges of the PD1 to PD4 flow into the FD of the (n+1) th row. Since the control signal from the signal line VSL_n remains at a high level, the SL is kept on. As a result, the charges stored in the FD of the (n+1) th row are transferred via the AMP and the SL to the signal line Vsig. This signal is supplied to the ADC.
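For illustration only, the reset-integrate-transfer sequence described above can be reduced to the following bookkeeping sketch in Python; the photocurrent values and the integration period are placeholders, and the model ignores dark current, reset noise and saturation.

    def integrate_and_transfer(photocurrents, t_int):
        """Accumulate charge in each PD over Tint, then transfer it to the shared FD."""
        pd_charges = [i_ph * t_int for i_ph in photocurrents]   # Q = I * Tint for each PD
        return sum(pd_charges)    # TG1 to TG4 pulsed together: the charges add on the FD

    # Four PDs of one row sharing an FD (photocurrents in arbitrary units).
    print(integrate_and_transfer([1.0, 1.2, 0.8, 1.1], t_int=10.0))   # -> 41.0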
Figs. 5 to 7 are diagrams for describing the relationship between the color filter array and the resolution.
Fig. 5 shows a Bayer array which is generally used in the color filter array of an image sensor. In the Bayer array, Gr and R or B and Gb are alternately arrayed horizontally. In the example shown in Fig. 5, a pixel block corresponding to each filter color has one PD. Therefore, the length, a1, of one side of the pixel block is the same as the length, b1, of one side of one pixel corresponding to the PD. Fig. 6 shows an example in which the pixel block corresponding to each filter color has  two pixels (PD) vertically and two pixels horizontally (2x2) , so that the length, a2, of one side of the pixel block is twice the length, b2, of one side of one pixel. Fig. 7 shows an example in which the pixel block corresponding to each filter color has three pixels (PD) vertically and three pixels horizontally (3x3) , so that the length, a3, of one side of the pixel block is three times the length, b3, of one side of one pixel.
The sensitivity of the image sensor depends on the total area of the PDs corresponding to one filter color. The length a1, a2, a3 of one side of the pixel block corresponds to the area of the PD, and has a relation of a1<a2<a3, so that with the sum of the signals from the 2x2 or 3x3 PDs, the sensitivities in Figs. 5 to 7 become Fig. 5<Fig. 6<Fig. 7.
Moreover, the resolution of the image sensor depends on the size of the pixels corresponding to one filter color. The length b1, b2, b3 of one side of one pixel has a relation of b1>b2>b3, so that the resolutions in Figs. 5 to 7 become Fig. 5<Fig. 6<Fig. 7.
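As a rough numeric illustration of these two relations, the following sketch normalizes the single-pixel block of Fig. 5 to 1 and treats sensitivity as proportional to the block area a^2 and resolution as proportional to 1/b; the proportionalities and the values of a and b below are placeholders chosen only to satisfy a1<a2<a3 and b1>b2>b3.

    # Relative figures for the arrays of Figs. 5 to 7 (placeholder dimensions).
    arrays = {
        "Fig. 5 (1x1)": {"a": 1.0, "b": 1.0},
        "Fig. 6 (2x2)": {"a": 1.6, "b": 0.8},   # a = 2*b
        "Fig. 7 (3x3)": {"a": 2.1, "b": 0.7},   # a = 3*b
    }
    for name, d in arrays.items():
        sensitivity = d["a"] ** 2      # ~ total photosensitive area per filter color
        resolution = 1.0 / d["b"]      # ~ spatial sampling rate per axis
        print(f"{name}: sensitivity ~{sensitivity:.2f}, resolution ~{resolution:.2f}")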
Next, referring to Figs. 8A to 8C, the operation of the ADC according to a Comparative Example to the present embodiment is described. The pixel array 200 shown in Fig. 8A includes pixel blocks 212 of three filter colors. The pixel block for one filter color includes 2x2 or four pixels 214, each pixel including one PD. The four PDs in the pixel block 212 for Gr share one FD via the TG1 to TG4. One FD has a capacitance Cfd and can collectively convert the charges supplied from the four PDs sharing the FD to voltages. The FD of the Gr pixel block 212 and the FD of the B pixel block 212 are connected to a source follower circuit SFC including an AMP, and are connected to the ADC by a metal signal line Vsig. Each ADC corresponds to a read channel, and the pixel array 200 can read signals separately via two read channels Ch. 1 and Ch. 2.
Fig. 8B shows signals read out in an ADC step. The ADC treats the processing by all the ADC circuits from the beginning of the processing to the end thereof as one step. In Fig. 8B, each row corresponds to one ADC step, and the number of ADC steps is 2. Fig. 8C is a pulse timing chart of control signals. An ADC step 1 (AD1) starts at the rising of the pulse of the reset signal at time t2, and ends at the falling of the next pulse. In this duration, charges for Gr are read via the read channel Ch. 1, and charges for R are read via the read channel Ch. 2. Then, an ADC step 2 (AD2) starts at the end of the AD1 period, and ends at the falling of the select signal VSL_n. In this duration, charges for B are read via the read channel Ch. 1, and charges for Gb are read via the read channel Ch. 2.
Fig. 9A shows an example of a pixel array in which a pixel block for one filter color includes 3x3 pixels. With reference to the Gr pixel block in a pixel array 900, the PDs connected to the TG1, TG2, TG4 and TG5 share an FD1. In addition, the PDs connected to the TG3 and TG6 share an FD3. Further, the PDs connected to the TG7 and TG8 share an FD2. Furthermore, the PDs connected to the TG9 share an FD4. The FD2 to FD4 are also shared by the PDs connected to the TGs for another filter color. Moreover, the FD1 and FD2 are connected to the ADC of  the read channel Ch. 1 via one source follower circuit SFC.
Fig. 9B shows the ADC step and signals read out in the ADC step in the pixel array shown in Fig. 9A. One read channel can perform processing for one color in a single ADC step, and processes only the output from one source follower circuit SFC. The number of ADC steps required to carry out ADC processing for pixel blocks of three colors in the pixel array 900 is six, namely, the AD1 to AD6.
Therefore, when the quantity of pixels included in the pixel block increases from 2x2 shown in Fig. 8 to 3x3 shown in Fig. 9, the number of ADC steps is tripled. As a result, the time required for the ADC steps is also substantially tripled.
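The growth in the number of ADC steps can be expressed with a small counting sketch, on the simplifying assumption that one ADC step digitizes one source follower output per read channel. The per-channel assignments for Fig. 8 follow the description above; the split of the R and Gb signals for Fig. 9 is assumed for symmetry with the Gr and B split and is not stated in this disclosure.

    def adc_steps(outputs_per_channel):
        # The channels run in parallel, so the step count is set by the busiest channel.
        return max(len(outputs) for outputs in outputs_per_channel.values())

    # 2x2 blocks (Fig. 8): each color block drives one SFC output.
    fig8 = {"Ch.1": ["Gr", "B"], "Ch.2": ["R", "Gb"]}
    # 3x3 blocks, Comparative Example (Fig. 9): block outputs are split across FD groups.
    fig9 = {"Ch.1": ["Gr1", "Gr2", "B1", "B2", "B3", "B4"],
            "Ch.2": ["R1", "R2", "R3", "Gb1", "Gb2", "Gb3"]}   # R/Gb split assumed

    print(adc_steps(fig8))   # -> 2
    print(adc_steps(fig9))   # -> 6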
When the pixel block is divided, noise increases. In the case of the CMOS image sensor, noise is generated due to a variation in the quality of amplifier transistors or the like. This noise increases as the frequency of reading signals increases, and as the capacitance of the FD becomes larger. For the pixel block including 2x2 pixels shown in Fig. 8, charges supplied from the pixel block are added in the charge domain for ADC, whereas in the example shown in Fig. 9, charges from the pixel block are distributed to different read channels or different ADC steps. For example, the signals from the Gr pixel block are separated into Gr1 and Gr2, and the signals from the B pixel block are separated into B1 to B4, and those signals are added color by color in the digital domain. Compared with the pixel block including 2x2 pixels, therefore, the pixel block including 3x3 pixels increases noise 1.4 to 2 times. The present embodiment deals with the aforementioned increase in the number of ADC steps and an increase in noise.
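The 1.4 to 2 times figure is consistent with adding statistically independent reads in the digital domain, where read noise grows roughly with the square root of the number of separate reads. A minimal check, offered as an illustration of that rule of thumb rather than as the noise model of this disclosure:

    import math

    def relative_noise(num_separate_reads):
        # Summing k independently read signals adds their read-noise powers,
        # so the combined noise grows as sqrt(k) relative to a single read.
        return math.sqrt(num_separate_reads)

    print(round(relative_noise(2), 2))   # Gr split into Gr1 and Gr2 -> 1.41
    print(round(relative_noise(4), 2))   # B split into B1 to B4    -> 2.0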
Fig. 10A is a diagram showing the configuration of the signal reading side of the pixel array according to the first embodiment of the present disclosure. In Fig. 10A and the subsequent descriptions, like or same reference symbols are given to those elements which are like or same as the elements shown in Figs. 8A and 9A to avoid redundant descriptions. A pixel array 1000 included in the image sensor has pixel blocks each having pixels (PDs) arranged in a 3x3 matrix form for each of the filter colors Gr, R, B and Gb. One PD is disposed in correspondence to each pixel in the pixel block.
Referring to the Gr pixel block in the pixel array 1000, three PDs connected to the TG1, TG4 and TG5 share the FD2. Two PDs connected to the TG2 and TG3 share the FD1. Two PDs connected to the TG6 and TG9 share the FD3. Two PDs connected to the TG7 and TG8 share the FD4. The FD1 to FD4 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
As described above, according to the signal reading configuration of the pixel array according to this embodiment, each pixel (PD) in the pixel block corresponding to one filter color is connected to one ADC via an associated FD shared by each pixel. Accordingly, AD conversion of the signals from all the PDs in the pixel block can be performed in a single step.
Such a configuration allows the signals from the pixel block for one filter color to be read out in a single ADC step. Fig. 10B shows the ADC step and signals read out in the ADC step in the pixel array shown in Fig. 10A. The number of ADC steps required to carry out ADC processing for pixel blocks of three colors in the pixel array 1000 is two, namely, the AD1 and the AD2. Therefore, the signals read out in the ADC step become the same as those in the case of 2x2 pixels.
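The TG-to-FD grouping of Fig. 10A can be written down as data and checked mechanically. The group membership below is taken from the description above; the check itself is only an illustrative assumption about what single-step readout requires.

    # Gr pixel block of Fig. 10A: which transfer gates (pixels) share which FD.
    gr_block = {
        "FD1": ["TG2", "TG3"],
        "FD2": ["TG1", "TG4", "TG5"],
        "FD3": ["TG6", "TG9"],
        "FD4": ["TG7", "TG8"],
    }

    # Every pixel of the 3x3 block is covered exactly once ...
    covered = sorted(tg for tgs in gr_block.values() for tg in tgs)
    assert covered == [f"TG{i}" for i in range(1, 10)]

    # ... and all four FDs drive the single source follower of read channel Ch. 1,
    # so one ADC step digitizes the whole block.
    print(f"{len(gr_block)} FDs -> 1 SFC -> 1 ADC step for the Gr block")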
Fig. 11A is a diagram showing a lens part 1100 which is stacked on the pixel array 1000 in Fig. 10, and Fig. 11B is a cross-sectional view taken along horizontal arrow XIB in Fig. 11A. In Fig. 11B, light enters the lens part 1100 from the underside, and enters the PDs constituting the pixel array 1000 via a color filter 1110. On-chip lenses 1102 included in the lens part 1100 are formed in accordance with the sizes of the pixels in the pixel array 1000.
(Second Embodiment)
In the present disclosure, a pixel block for one filter color is provided with a plurality of FDs that are shared only by pixels in the pixel block to enable signal reading in one ADC step. In the following embodiment, an example of the layout of pixels is illustrated. Fig. 12 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure. Referring to a Gr pixel block in a pixel array 1200, six PDs connected to the TG1 to TG6 share the FD1. Three PDs connected to the TG7 to TG9 share the FD2. The FD1 and FD2 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
(Third Embodiment)
Fig. 13 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure. Referring to a Gr pixel block in a pixel array 1300, three PDs connected to the TG1 to TG3 share the FD1. Four PDs connected to the TG4, TG5, TG7 and TG8 share the FD2. Two PDs connected to the TG6 and TG9 share the FD3. The FD1 to FD3 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
(Fourth Embodiment)
Fig. 14 shows the configuration of the signal reading side of a pixel array according to another embodiment of the present disclosure. Referring to a Gr pixel block in a pixel array 1400, three PDs connected to the TG1, TG2 and TG4 share the FD1. Three PDs connected to the TG3, TG5 and TG6 share the FD2. Three PDs connected to the TG7 to TG9 share the FD3. The FD1 to FD3 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
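The second to fourth embodiments change only the grouping, so the same mechanical check applies to each layout; the groupings below are transcribed from the descriptions of Figs. 12 to 14 above, and the check is an illustration rather than part of this disclosure.

    layouts = {
        "Fig. 12": {"FD1": [1, 2, 3, 4, 5, 6], "FD2": [7, 8, 9]},
        "Fig. 13": {"FD1": [1, 2, 3], "FD2": [4, 5, 7, 8], "FD3": [6, 9]},
        "Fig. 14": {"FD1": [1, 2, 4], "FD2": [3, 5, 6], "FD3": [7, 8, 9]},
    }
    for fig, groups in layouts.items():
        pixels = sorted(p for g in groups.values() for p in g)
        assert pixels == list(range(1, 10)), fig   # each TG belongs to exactly one FD
        print(f"{fig}: {len(groups)} shared FDs, all nine pixels read in one ADC step")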
(Fifth Embodiment)
Fig. 15 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure. Referring to a Gr pixel block 1502 in a pixel array 1500, the pixel block includes three rows of pixels, namely, three pixels, two pixels and three pixels from the top, for a total of eight pixels 1403, arranged horizontally.
Fig. 16 is a diagram showing a lens part which is stacked on the pixel array in Fig. 15. A lens part 1600 includes  square-shaped on-chip lenses 1602 and rectangular on-chip lenses 1603 which are formed in accordance with the sizes of the pixels in the pixel array 1500.
Fig. 17A shows the configuration of the signal reading side of a pixel array according to this embodiment. A pixel array 1700 included in the image sensor has pixel blocks each having pixels (PDs) arranged in a 3x3 matrix form for each of the filter colors Gr, R, B and Gb. One photodiode is disposed in correspondence to each pixel in a pixel block 1702.
An FD is shared by photodiodes corresponding to two pixels 1706 and 1707 adjacent to this FD. Further, the photodiodes connected to the FD include photodiodes corresponding to six  pixels  1703, 1704, 1705, 1708, 1709, and 1711 disposed apart from the FD.
Fig. 17B is a cross-sectional view taken along vertical arrow XVIIB in Fig. 17A, and Fig. 17C is a cross-sectional view taken along horizontal arrow XVIIC in Fig. 17A. In Figs. 17B and 17C, light enters the lens part 1600 from the underside in the figures, and enters the PDs constituting the pixel array 1700 via a color filter 1710. Referring to a Gr pixel block in the pixel array 1700 shown in Fig. 17A, photodiodes connected to the TG1 to TG9 share the FD. The FD is connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
According to the signal reading configuration of the pixel array according to this embodiment, each pixel (PD) in the pixel block corresponding to one filter color is connected to one ADC via an associated FD shared by each pixel. Further,  the PDs connected to the FD include PDs corresponding to pixels arranged apart from the FD. This allows AD conversion of the signals from all the PDs in the pixel block to be performed in a single step.
According to this embodiment, eight photodiodes share one FD. As compared with the case of 3x3 pixels, therefore, the capacitance of the FD can be made small. Consequently, noise originating from signal reading can be made smaller.
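The link between a smaller FD capacitance and lower read noise can be illustrated by the conversion gain of the floating diffusion, i.e. the voltage produced per electron, which is q/Cfd; the capacitance values below are placeholders and are not taken from this disclosure.

    Q_E = 1.602e-19   # electron charge in coulombs

    def conversion_gain_uV_per_e(c_fd_farads):
        # A smaller FD capacitance gives a larger signal swing per electron, so the
        # same downstream read noise corresponds to fewer input-referred electrons.
        return Q_E / c_fd_farads * 1e6

    for c_fd in (2.0e-15, 1.5e-15):   # placeholder capacitances in farads
        print(f"Cfd = {c_fd * 1e15:.1f} fF -> {conversion_gain_uV_per_e(c_fd):.0f} uV/e-")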
(Sixth Embodiment)
Fig. 18 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure. In a pixel array 1800 shown in Fig. 18, a Gr pixel block 1802 includes four  pentagonal pixels  1804, 1806, 1807 and 1809 adjacent to an FD. Further, the pixel block 1802 includes four square-shaped  pixels  1803, 1805, 1808 and 1811 disposed apart from the FD.
Fig. 19A is a diagram showing a lens part which is stacked on the pixel array shown in Fig. 18. A lens part 1900 including an array of square-shaped on-chip lenses 1902 whose size is 1/9 of the size of the pixel block is stacked on the pixel array 1800 in Fig. 19A.
Fig. 19B shows the cross section taken along horizontal arrow XIXB in Fig. 19A. In Fig. 19B, light enters the lens part 1900 from the underside in the figure, and enters the PDs constituting the pixel array 1800 via a color filter 1910.
The pixel block 1802 includes four  pixels  1804, 1806, 1807 and 1809 adjacent to one another. The pixel block 1802 further  includes four  pixels  1803, 1805, 1808 and 1811 disposed apart from one another.
(Seventh Embodiment)
Fig. 20 shows a color filter array according to one embodiment. In a pixel array 2000, a Gr pixel block 2002 includes eight rectangular pixels 2003 whose longitudinal direction is a vertical direction.
Fig. 21A is a diagram showing a lens part which is stacked on the pixel array shown in Fig. 20. A lens part 2100 including an array of square-shaped on-chip lenses 2102 whose size is 1/8 of the size of the pixel block is stacked on the pixel array 2000. It is to be noted that the lens part 2100 is disposed at an inclination of an angle θ to the pixel array 2000.
Fig. 21B is a cross-sectional view taken along horizontal arrow XXIB in Fig. 21A, and Fig. 21C is a cross-sectional view taken along vertical arrow XXIC in Fig. 21A. In Fig. 21B, light enters the lens part 2100 from the underside, and enters the PDs constituting the pixel array 2000 via a color filter 1110. That is, light entering the on-chip lenses 2102_1 to 2102_8 enters the PDs 2003_1 to 2003_8, respectively.
In the pixel array 2000, the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction. However, as the lens part 2100 including the square-shaped on-chip lenses 2102 is disposed at an inclination of an angle θ to the pixel array 2000, light can be uniformly sampled by dividing the light region into square  regions. From the viewpoint of uniform sampling, it is preferable to set the angle θ to 45 degrees. Therefore, it is possible to reduce the influence of the difference between the resolution in the horizontal direction and the resolution in the vertical direction on the output image.
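A short geometric sketch of the inclined lens grid follows; the pitch is a placeholder, and 45 degrees is used because it is the preferred value stated above.

    import math

    def lens_centers(rows, cols, pitch, theta_deg):
        """Centers of square on-chip lenses laid out on a grid rotated by theta
        relative to the pixel array axes."""
        th = math.radians(theta_deg)
        centers = []
        for r in range(rows):
            for c in range(cols):
                x, y = c * pitch, r * pitch
                centers.append((x * math.cos(th) - y * math.sin(th),
                                x * math.sin(th) + y * math.cos(th)))
        return centers

    # With theta = 45 degrees the sample points remain a uniform square grid, merely
    # rotated, so horizontal and vertical scene detail are sampled alike.
    print(lens_centers(2, 2, pitch=1.0, theta_deg=45.0))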
Fig. 22 shows the configuration of the signal reading side of the pixel array according to the present embodiment. Referring to a Gr pixel block in the pixel array 2000, the PDs connected to the TG1, TG2, TG5 and TG6 share the FD1. The PDs connected to the TG3, TG4, TG7 and TG8 share the FD2. The FD1 and FD2 are connected to the ADC of the read channel Ch. 1 via one source follower circuit SFC.
(Eighth Embodiment)
Regarding the following individual embodiments, various aspects of a lens part which is stacked on a pixel array are described.
Fig. 23A shows a lens part which is stacked on the pixel arrays shown in Figs. 10 to 14, and Fig. 23B shows a cross section taken along vertical arrow XXIIIB in Fig. 23A. As shown in Figs. 23A and 23B, light enters a lens part 2300, and enters PDs constituting the pixel array 1000 via a color filter 2310.
Of the lens part 2300, the part that is provided for the Gr pixel block is constituted by a combination of a rectangular on-chip lens 2304 stacked on two pixels adjoining horizontally, a square-shaped on-chip lens 2305 having the same size as the pixels, and an on-chip lens 2306 stacked on two pixels adjoining vertically.
The pixels on which the on-chip lens 2304 is stacked are constituted by two PDs, and can be used for both imaging and phase difference autofocus (AF) . Figs. 24A and 24B are diagrams for describing the operation of phase difference AF. When the PDs adjoining horizontally operate as a phase difference AF sensor, a flux of light entering from the imaging lens is received separately by two PDs as shown in Figs. 24A and 24B. For example, the AF controller 118 in Fig. 1 detects signals from the two PDs, and measures a deviation of the focus based on the interval between two images formed on the image sensor. Then, the optical system is controlled based on the measured deviation to bring the optical system into focus. At the time of imaging, on the other hand, the adjoining PDs are combined with other PDs in the pixel block to form pixels of one color.
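For illustration, the interval between the two images can be estimated from two one-dimensional pixel profiles as in the following sketch; the sum-of-squared-differences search is a generic technique chosen for this example and is not stated to be the AF algorithm of this disclosure, and the profiles are invented toy data.

    def estimate_shift(left, right, max_shift=4):
        """Return the integer shift that best aligns the right profile to the left one."""
        best_shift, best_err = 0, float("inf")
        n = len(left)
        for s in range(-max_shift, max_shift + 1):
            err, count = 0.0, 0
            for i in range(n):
                j = i + s
                if 0 <= j < n:
                    err += (left[i] - right[j]) ** 2
                    count += 1
            if err / count < best_err:
                best_shift, best_err = s, err / count
        return best_shift

    # Toy profiles of an out-of-focus edge seen by the two PD groups.
    left = [0, 0, 1, 5, 9, 9, 9, 9]
    right = [0, 0, 0, 0, 1, 5, 9, 9]
    print(estimate_shift(left, right))   # -> 2 (the images are displaced by two samples)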
In the example shown in Fig. 23A, the on-chip lens 2304 can be used to detect a phase difference in the horizontal direction, and the on-chip lens 2306 can be used to detect a phase difference in the vertical direction.
(Ninth Embodiment)
Fig. 25 shows a lens part which is stacked on the pixel arrays shown in Figs. 10 to 14.
Of a lens part 2500, a part corresponding to a Gr pixel block is constituted by a combination of a rectangular on-chip lens 2502 stacked on 2x2 or four pixels, and a square-shaped on-chip lens 2503 having the same size as the pixels.
The pixels on which the on-chip lens 2502 is stacked are constituted by four PDs, and can be used for both imaging and phase difference autofocus (AF) . When the PDs adjoining horizontally operate as a phase difference AF sensor, a flux of light entering from the imaging lens is separately received by two PDs as shown in Figs. 24A and 24B. For example, the AF controller 118 in Fig. 1 detects signals from the two PDs, and measures a deviation of the focus based on the interval between two images formed on the image sensor. Then, the optical system is controlled based on the measured deviation to bring the optical system into focus. The two PDs adjoining vertically likewise are used for phase difference AF control. At the time of imaging, on the other hand, the adjoining PDs are combined with other PDs in the pixel block to form pixels of one color.
(Tenth Embodiment)
Fig. 26A is a diagram showing a lens part which is stacked on the pixel array 2000 shown in Fig. 20. A lens part 2600 including an array of four on-chip lenses 2603 whose size is 1/4 of the size of the pixel block is stacked on a pixel block 2602.
Fig. 26B shows a cross section taken along horizontal arrow XXVIB in Fig. 26A. As shown in Figs. 26A and 26B, light enters the lens part 2600, and enters the PDs constituting the pixel array 2000 via a color filter 2610.
In the pixel array 2000, the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction. However, one on-chip lens 2603 is stacked on two PDs, which brings about an advantage such that the PDs can be used as a phase difference AF sensor as described  above. In the example shown in Fig. 26A, the combination of the on-chip lens 2603 and two PDs enables detection of a phase difference in the horizontal direction.
(Eleventh Embodiment)
Fig. 27 is a diagram showing a lens part which is stacked on the pixel array 2000 shown in Fig. 20.
Of a lens part 2700, two on-chip lenses 2703 provided for a Gr pixel block 2702 are each stacked on 2x2 or four pixels.
The pixels on which the on-chip lens 2703 is stacked are constituted by four PDs, and can be used for both imaging and phase difference autofocus (AF) .
In the pixel array 2000, the longitudinal direction of the pixels 2003 is a vertical direction, so that the resolution in the vertical direction becomes lower than that in the horizontal direction. However, one on-chip lens 2703 is stacked on four PDs, which brings about an advantage such that the PDs can be used as a phase difference AF sensor as described above.
(Twelfth Embodiment)
Fig. 28 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure. In a pixel array 2800, a Gr pixel block 2802 includes eight rectangular pixels 2803 whose longitudinal direction is a horizontal direction.
Fig. 29 shows a lens part 2900 which is stacked on the pixel array 2800 shown in Fig. 28, and Fig. 30 shows an example of a combination of the lens part 2900 and the pixel array 2800. The lens part 2900 is formed so that four square-shaped on-chip lenses 2902 are included in a pixel block 2802.
In the pixel array 2800, the longitudinal direction of the pixels 2803 is a horizontal direction, so that the resolution in the horizontal direction becomes lower than that in the vertical direction. However, stacking one on-chip lens 2902 on two PDs brings about an advantage such that the PDs can be used as a phase difference AF sensor as described above. In the example shown in Fig. 30, the combination of the on-chip lens 2902 and two PDs enables detection of a phase difference in the vertical direction.
(Thirteenth Embodiment)
Fig. 31 shows a color filter array on the light receiving side of a pixel array according to another embodiment of the present disclosure. In a pixel array 3000, a Gr pixel block 3002 includes four rectangular pixels 3003 whose longitudinal direction is a horizontal direction, and four rectangular pixels 3004 whose longitudinal direction is a vertical direction.
Fig. 32 shows an example of a combination of the lens part 2900 shown in Fig. 29 and the color filter array shown in Fig. 31. The lens part 2900 is formed so that four square-shaped on-chip lenses 2902 are included in the pixel block 3002.
In the pixel array 3000, the on-chip lens 2902 is stacked on two pixels 3003 whose longitudinal direction is a horizontal direction, and the on-chip lens 2904 is stacked on two pixels 3004 whose longitudinal direction is a vertical direction. This brings about an advantage such that the on-chip lens 2902 can be used as a phase difference AF sensor as described above. In the example shown in Fig. 32, the combination of the on-chip lens 2902 and two PDs enables detection of a phase difference in the vertical direction. In addition, the combination of the on-chip lens 2904 and two PDs enables detection of a phase difference in the horizontal direction.
(Fourteenth Embodiment)
Fig. 33 shows a combination of a color filter array on the light receiving side of a pixel array and a lens part according to another embodiment of the present disclosure. A lens part 3300 combines the combination 2600 of the pixels and the lens part shown in Fig. 26A with the combination 2900 of the pixels and the lens part shown in Fig. 30.
Fig. 34A shows the configuration of the signal reading side of the pixel array shown in Fig. 33. In an upper left region 3402 of a pixel array 3400, in a Gr pixel block 3406, PDs connected to four TGs share one FD. Two FDs included in the Gr pixel block and two FDs included in the B pixel block are connected to one ADC for the read channel Ch. 1 via one source follower circuit SFC. A region 3404 is a configurational example of the signal reading side of the pixel array having the color filter array shown in Fig. 28.
Fig. 34B shows an example of the layout of transistors in the pixel array in Fig. 34A. In Fig. 34, two FDs included in the Gr pixel block and two FDs included in the B pixel block are connected to the source follower circuit SFC including an  AMP, SL and RS provided between those two pixel blocks.
Next, referring to Fig. 35, procedures of a signal reading method which is executed by the image sensors according to the above-described various embodiments are described. A variable i is initialized to 0 (S3501) . Then, signal charges are read out from FDs of an ith row for each read channel (S3503) . In this process, signal charges stored in all the FDs included in the pixel block for one filter color are supplied to the vertical signal lines via the SFC circuit. Then, ADC is performed on the read signals (S3504) . Next, i+1 is set to i (S3505) . The processing of the steps S3503 to S3505 is repeated by the number of rows, N, of the pixel block (S3502 to S3506) .
Although signals from the pixels of the same row are read out for all the read channels in the foregoing procedure, the rows from which pixel signals are read out may differ between read channels. For example, the read channel Ch. 1 may read signals from the pixels of the nth row while the read channel Ch. 2 reads signals from the pixels of the (n+1)th row, and so forth.
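As a rough illustration of the flow of Fig. 35 and of the per-channel row offset just mentioned, the loop could be organized as in the sketch below. The names read_fd_row() and adc() are hypothetical placeholders for the analog readout chain and the A/D converter, and the staggering rule shown is only one possible choice.

```python
# Illustrative sketch of the reading procedure of Fig. 35 (steps S3501-S3506),
# optionally offsetting the row read by each channel as described above.

def read_frame(num_block_rows, channels, read_fd_row, adc, stagger=False):
    samples = []
    i = 0                                            # S3501: initialize row index
    while i < num_block_rows:                        # S3502 / S3506: repeat for N rows
        for ch_index, channel in enumerate(channels):
            row = (i + ch_index) % num_block_rows if stagger else i
            fd_voltages = read_fd_row(channel, row)  # S3503: all FDs of the block row via the SFC
            samples.append([adc(v) for v in fd_voltages])  # S3504: A/D conversion
        i += 1                                       # S3505: next row
    return samples
```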
(Other Embodiments)
Although the foregoing embodiments have been described using the RGB layout by way of example, the present disclosure can be applied to another color layout such as black and white.
In addition, a pixel array according to one embodiment may be combined with on-chip lenses according to another embodiment. For example, a lens part 2100 including an array of square-shaped on-chip lenses 2102 whose size is 1/8 of the size of the pixel block in Fig. 21A may be stacked on the pixel array described in any one of the eighth to thirteenth embodiments. In this case, the lens part 2100 may be disposed at an inclination of an angle θ to the pixel array 2000.
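A simple way to picture such an inclined lens layout is to place the lens centers on their own grid and rotate that grid by θ relative to the pixel array. The pitch, the origin of rotation and the units in the sketch below are assumptions made only for illustration.

```python
import math

# Illustrative sketch: centers of square on-chip lenses whose pitch is assumed
# to be 1/8 of the pixel-block size, laid out on a grid rotated by theta
# (e.g. 45 degrees) relative to the pixel array. Coordinates are in units of
# one pixel block, rotated about the array origin.

def inclined_lens_centers(blocks_x, blocks_y, theta_deg=45.0, lenses_per_block_side=8):
    pitch = 1.0 / lenses_per_block_side
    cos_t = math.cos(math.radians(theta_deg))
    sin_t = math.sin(math.radians(theta_deg))
    centers = []
    for u in range(int(blocks_x / pitch)):
        for v in range(int(blocks_y / pitch)):
            x, y = (u + 0.5) * pitch, (v + 0.5) * pitch
            centers.append((x * cos_t - y * sin_t, x * sin_t + y * cos_t))
    return centers
```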
The foregoing descriptions are merely specific implementation manners of the present invention, but are not intended to limit the protection scope of the present invention. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (39)

  1. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
    a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
    a signal converter connected to the plurality of charge/voltage converters so as to collectively convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block,
    the each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels.
  2. The image sensor according to claim 1, wherein
    a plurality of pixel blocks are arranged in a matrix form.
  3. The image sensor according to claim 1 or 2, wherein
    the charge/voltage converter converts only charges output from the photoelectric conversion elements in one pixel block to voltages.
  4. The image sensor according to any one of claims 1 to 3, wherein
    the charge/voltage converter is shared only by the photoelectric conversion elements in one pixel block.
  5. The image sensor according to any one of claims 1 to 4, wherein
    the pixel block includes pixels arranged in a 3x3 matrix form.
  6. The image sensor according to claim 5, wherein
    the plurality of charge/voltage converters includes at least a first charge/voltage converter shared by two of the photoelectric conversion elements and a second charge/voltage converter shared by three of the photoelectric conversion elements.
  7. The image sensor according to claim 5, wherein
    the plurality of charge/voltage converters includes a first charge/voltage converter shared by three of the photoelectric conversion elements and a second charge/voltage converter shared by six of the photoelectric conversion elements.
  8. The image sensor according to claim 5, wherein
    the plurality of charge/voltage converters includes a first charge/voltage converter shared by two of the photoelectric conversion elements, a second charge/voltage converter shared by three of the photoelectric conversion elements, and a third charge/voltage converter shared by four of the photoelectric conversion elements.
  9. The image sensor according to claim 5, wherein
    the plurality of charge/voltage converters includes three charge/voltage converters each shared by three of the photoelectric conversion elements.
  10. The image sensor according to any one of claims 1 to 4, wherein
    the pixel block includes pixels arranged in a 2x4 matrix form.
  11. The image sensor according to claim 10, wherein
    the plurality of charge/voltage converters includes two charge/voltage converters shared by four of the photoelectric conversion elements.
  12. The image sensor according to claim 10 or 11, wherein
    the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
  13. The image sensor according to any one of claims 10 to 12, wherein
    an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
  14. The image sensor according to claim 13, wherein
    the predetermined angle θ is 45 degrees.
  15. The image sensor according to any one of claims 5 to 14, wherein
    one lens is stacked on adjacent two or four pixels included in the pixel block.
  16. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block;
    at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages; and
    a signal converter connected to the at least one charge/voltage converter so as to collectively convert signals of voltages output from the at least one charge/voltage converter included in each pixel block,
    wherein the each of the pixels in the pixel block corresponding to the one filter color is connected to the signal converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter.
  17. The image sensor according to claim 16, wherein
    a plurality of pixel blocks are arranged in a matrix form.
  18. The image sensor according to claim 16 or 17, wherein
    the at least one charge/voltage converter converts only charges output from the photoelectric conversion elements included in one pixel block to voltages.
  19. The image sensor according to any one of claims 16 to 18, wherein
    the at least one charge/voltage converter is shared only by the photoelectric conversion elements included in one pixel block.
  20. The image sensor according to any one of claims 16 to 19, wherein
    the pixel block includes an array of eight pixels.
  21. The image sensor according to claim 20, wherein
    individual rows of the pixel block include three pixels, two pixels and three pixels, respectively.
  22. The image sensor according to claim 20, wherein
    the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
  23. The image sensor according to any one of claims 16 to 22, wherein
    an array of lenses with a size of 1/9 of a size of the pixel block is stacked on the pixel block.
  24. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction.
  25. The image sensor according to claim 24, wherein
    the pixel block includes pixels arranged in a 2x4 matrix form.
  26. The image sensor according to claim 25, wherein the image sensor comprises a plurality of pixel blocks, one of the plurality of pixel blocks including eight pixels whose longitudinal direction is a horizontal direction, and another of the plurality of pixel blocks including eight pixels whose longitudinal direction is a vertical direction.
  27. The image sensor according to any one of claims 24 to 26, wherein
    an array of lenses with a size of 1/8 of a size of the pixel block is stacked on the pixel block, the lenses being arranged to be inclined by a predetermined angle θ to the pixels.
  28. The image sensor according to claim 27, wherein
    the predetermined angle θ is 45 degrees.
  29. The image sensor according to any one of claims 24 to 26, wherein
    one lens is stacked on adjacent two or four pixels included in the pixel block.
  30. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively.
  31. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another.
  32. An image sensor comprising:
    a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form.
  33. The image sensor according to claim 32, wherein
    one lens is stacked on adjacent two or four pixels included in the pixel block.
  34. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, a plurality of charge/voltage converters converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the plurality of charge/voltage converters so as to convert signals of voltages output from the plurality of charge/voltage converters included in each pixel block, the each of the pixels in the pixel block corresponding to the one filter color being connected to the signal converter via an associated one of the plurality of charge/voltage converters which is shared by the each of the pixels, the method comprising the steps of:
    collectively supplying signals of voltages output from the plurality of charge/voltage converters included in a first pixel block to the signal converter; and
    collectively supplying signals of voltages output from the plurality of charge/voltage converters included in a second pixel block to the signal converter.
  35. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements corresponding to pixels constituting the pixel block, at least one charge/voltage converter converting charges output from the photoelectric conversion elements in the pixel block to voltages, and a signal converter connected to the at least one charge/voltage converter so as to convert signals of voltages output from the at least one charge/voltage converter included in each pixel block, wherein the each of the pixels in the pixel block corresponding to the one filter color is connected to the signal converter via the at least one charge/voltage converter shared by the each of the pixels, and the photoelectric conversion elements connected to the at least one charge/voltage converter include a photoelectric conversion element corresponding to pixels arranged apart from the at least one charge/voltage converter, the method comprising the steps of:
    collectively supplying signals of voltages output from the at least one charge/voltage converter included in a first pixel block to the signal converter; and
    collectively supplying signals of voltages output from the at least one charge/voltage converter included in a second pixel block to the signal converter.
  36. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixels included in the pixel block include eight pixels whose longitudinal direction is a horizontal direction or a vertical direction, the method comprising the steps of:
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
  37. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein individual rows of the pixel block include three pixels, two pixels and three pixels, respectively, the method comprising the steps of:
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
  38. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes four pixels adjacent to one another, and four pixels arranged apart from one another, the method comprising the steps of:
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
  39. A signal reading method for an image sensor including a pixel block corresponding to one filter color and including a plurality of photoelectric conversion elements, the plurality of photoelectric conversion elements respectively corresponding to pixels constituting the pixel block, wherein the pixel block includes pixels arranged in a 3x3 matrix form, the method comprising the steps of:
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a first pixel block to a signal converter; and
    collectively supplying signals of voltages output from the plurality of photoelectric conversion elements included in a second pixel block to the signal converter.
PCT/CN2018/113967 2018-11-05 2018-11-05 Image sensor and control method therefor WO2020093197A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880003761.9A CN109804617B (en) 2018-11-05 2018-11-05 Image sensor and control method thereof
PCT/CN2018/113967 WO2020093197A1 (en) 2018-11-05 2018-11-05 Image sensor and control method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/113967 WO2020093197A1 (en) 2018-11-05 2018-11-05 Image sensor and control method therefor

Publications (1)

Publication Number Publication Date
WO2020093197A1 true WO2020093197A1 (en) 2020-05-14

Family

ID=66558647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/113967 WO2020093197A1 (en) 2018-11-05 2018-11-05 Image sensor and control method therefor

Country Status (2)

Country Link
CN (1) CN109804617B (en)
WO (1) WO2020093197A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017018188A1 (en) * 2015-07-24 2017-02-02 ソニーセミコンダクタソリューションズ株式会社 Image sensor and electronic device
CN108063925A (en) * 2016-11-08 2018-05-22 爱思开海力士有限公司 Imaging sensor
US20180255215A1 (en) * 2017-03-06 2018-09-06 SK Hynix Inc. Image sensor

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI433307B (en) * 2008-10-22 2014-04-01 Sony Corp Solid state image sensor, method for driving a solid state image sensor, imaging apparatus, and electronic device
JP2014072788A (en) * 2012-09-28 2014-04-21 Sony Corp Solid-state imaging device, driving method and electronic apparatus
KR102414030B1 (en) * 2015-11-18 2022-06-29 에스케이하이닉스 주식회사 Image sensor

Also Published As

Publication number Publication date
CN109804617A (en) 2019-05-24
CN109804617B (en) 2020-07-07

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18939148; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18939148; Country of ref document: EP; Kind code of ref document: A1)