CN110650301A - Image sensor, imaging method and device - Google Patents


Info

Publication number
CN110650301A
CN110650301A
Authority
CN
China
Prior art keywords
pixel
color
light
image sensor
layer
Prior art date
Legal status
Granted
Application number
CN201910972469.4A
Other languages
Chinese (zh)
Other versions
CN110650301B (en
Inventor
Yang Xin (杨鑫)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910972469.4A priority Critical patent/CN110650301B/en
Publication of CN110650301A publication Critical patent/CN110650301A/en
Application granted granted Critical
Publication of CN110650301B publication Critical patent/CN110650301B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/71 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N25/75 Circuitry for providing, modifying or processing image signals from the pixel array
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The embodiments of the application disclose an image sensor, an imaging method, and an imaging device. The image sensor comprises a color filter array, N layers of pixel arrays, and a readout circuit, where N is greater than 1 and less than the number of color light components of the color model. The color filter array is arranged above the N layers of pixel arrays and allows light of the color light components of the color model to pass through. At least one of the N layers of pixel arrays converts light of at least two different color light components into photo-generated charges, the at least two kinds not including the N kinds. The readout circuit converts the photo-generated charges into electrical signals and outputs the electrical signals to form an image.

Description

Image sensor, imaging method and device
Technical Field
The embodiment of the application relates to electronic technology, in particular to an image sensor, an imaging method and imaging equipment.
Background
Foveon X3 is the first image sensor in the world that can capture all colors at a single pixel location. Digital cameras usually employ a Charge-Coupled Device (CCD) or Complementary Metal-Oxide-Semiconductor (CMOS) sensor that records only one of the three color light components of the Red-Green-Blue (RGB) model (red light, green light, or blue light) at each pixel. Foveon X3 instead employs three layers of photoelectric conversion elements, each recording one of the RGB color light components. The three photosensitive layers of Foveon X3 thus capture the RGB colors at different depths, which ensures that the RGB colors are fully captured, yielding sharper images and better color.
However, Foveon X3 suffers from a series of problems such as high power consumption and severe heat generation during operation.
Disclosure of Invention
In view of this, the embodiments of the present application provide an image sensor, an imaging method and an imaging apparatus. The technical scheme of the embodiment of the application is realized as follows:
In a first aspect, an embodiment of the present application provides an image sensor, including a color filter array, N layers of pixel arrays, and a readout circuit, where N is greater than 1 and less than the number of color light components of the color model. The color filter array is arranged above the N layers of pixel arrays and allows light of the color light components of the color model to pass through. At least one of the N layers of pixel arrays converts light of at least two different color light components into photo-generated charges, the at least two kinds not including the N kinds. The readout circuit converts the photo-generated charges into electrical signals and outputs the electrical signals to form an image.
In a second aspect, an embodiment of the present application provides an imaging method, including: turning on an image sensor; transmitting light of the color light components of a color model through a color filter array of the image sensor; converting light of at least two different color light components into photo-generated charges by at least one of the N layers of pixel arrays of the image sensor, where N is greater than 1 and less than the number of color light components of the color model, and the at least two kinds do not include the N kinds; and converting the photo-generated charges into electrical signals by a readout circuit of the image sensor and outputting the electrical signals to form an image.
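The steps of this method can be sketched as a toy software pipeline (every function name and value here is a hypothetical illustration; in the real sensor, filtering and charge conversion happen in silicon, not code):

```python
# Toy sketch of the claimed imaging method for an RGB color model (N = 2).
# All names and values are hypothetical illustrations.

def color_filter(light, passband):
    """Transmit only the color components in the filter's passband."""
    return {c: v for c, v in light.items() if c in passband}

def pixel_layer(light, absorbed):
    """Convert the listed components into 'photo-generated charge' (numbers here)."""
    charge = {c: light[c] for c in absorbed if c in light}
    remaining = {c: v for c, v in light.items() if c not in absorbed}
    return charge, remaining

def readout(charge):
    """Convert charge into an electrical-signal value per component."""
    return dict(charge)  # identity gain, for the sketch

scene = {"R": 0.8, "G": 0.5, "B": 0.3}                    # incident light
filtered = color_filter(scene, ("R", "G", "B"))           # filter passes the model's components
layer1_charge, rest = pixel_layer(filtered, ("B", "G"))   # layer 1 absorbs two components
layer2_charge, _ = pixel_layer(rest, ("R",))              # layer 2 absorbs the remaining one
signal = {**readout(layer1_charge), **readout(layer2_charge)}
print(signal)  # {'B': 0.3, 'G': 0.5, 'R': 0.8}
```

Note how the first layer handles two of the three components, matching the "at least two" clause with N = 2 layers for a 3-component model.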
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and the image sensor according to the embodiment of the present application, where the memory stores a computer program that is executable on the processor, and the processor executes the computer program to implement the steps in the imaging method according to the embodiment of the present application.
In an embodiment of the present application, an image sensor includes a color filter array, N layers of pixel arrays, and a readout circuit, where N is greater than 1 and less than the number of color light components of the color model; at least one of the N layers of pixel arrays converts light of at least two different color light components into photo-generated charges. In this way, the power consumed and the heat generated during operation of the image sensor are reduced, while better image quality is achieved with fewer pixel arrays.
Drawings
FIG. 1 is a schematic structural diagram of an image sensor according to an embodiment of the present disclosure;
FIG. 2 is a schematic structural diagram of another image sensor according to an embodiment of the present disclosure;
FIG. 3A is a schematic structural diagram of another image sensor according to an embodiment of the present disclosure;
FIG. 3B is a schematic cross-sectional view of a first layer of a pixel array according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a third pixel unit and a pixel unit of a first layer of a pixel array being stacked in a staggered manner according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a third pixel unit and a pixel unit of a first layer of a pixel array being stacked in a staggered manner according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the working principle of Foveon X3;
FIG. 7 is a schematic structural diagram of another image sensor according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a color filter array according to an embodiment of the present application;
FIG. 9 is a schematic cross-sectional view of a pixel array of each layer according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a readout circuit according to an embodiment of the present application;
FIG. 11 is a schematic flow chart of an implementation of an imaging method according to an embodiment of the present application;
fig. 12 is a hardware entity diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, specific technical solutions of the present application will be described in further detail below with reference to the accompanying drawings in the embodiments of the present application. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
It should be noted that the terms "first", "second", and "third" in the embodiments of the present application are used only to distinguish similar objects and do not denote a specific ordering. It should be understood that "first", "second", and "third" may be interchanged where permitted, so that the embodiments described herein can be implemented in orders other than those illustrated or described.
An embodiment of the present application provides an image sensor, fig. 1 is a schematic structural diagram of the image sensor in the embodiment of the present application, and as shown in fig. 1, an image sensor 10 includes: n layers of pixel arrays 101 to 10N, a color filter array 111, and a readout circuit 121, N being greater than 1 and smaller than the number of color light components of the color model; wherein,
the color filter array 111 is disposed above the N-layer pixel arrays 101 to 10N for allowing light of color light components of the color model to pass therethrough, with a transmission direction of the light as a reference direction.
In the embodiment of the present application, the color model may be of various kinds; for example, it may be an RGB model, a CMYK model, or a Lab model. The RGB model has 3 color light components: red (R), green (G), and blue (B). Accordingly, the N-layer pixel array may be implemented as a two-layer pixel array; an exemplary structure is the image sensor shown in fig. 2 in the following embodiment. The CMYK model has 4 color light components: cyan (C), magenta (M), yellow (Y), and black (K). Accordingly, the N-layer pixel array may be implemented as a two-layer or a three-layer pixel array.
It will be appreciated that each color filter in the color filter array serves to transmit light of its corresponding color light components. For example, for the RGB model, the color filter array may consist of violet color filters, which allow blue light and red light to pass through, and yellow color filters, which allow green light and red light to pass through. As another example, for the CMYK model, the color filter array may consist of a cyan filter transmitting cyan light, a magenta filter transmitting magenta light, a yellow filter transmitting yellow light, and a black filter transmitting black light.
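As a sketch, the filter-to-passband mapping described above can be written out explicitly (the data structures and the function name are illustrative assumptions, not part of the patent):

```python
# Sketch of the color-filter passbands described above.
# Data structures and names are hypothetical illustrations.
RGB_FILTERS = {
    "violet": {"B", "R"},   # violet filter passes blue and red light
    "yellow": {"G", "R"},   # yellow filter passes green and red light
}
CMYK_FILTERS = {
    "cyan": {"C"}, "magenta": {"M"}, "yellow": {"Y"}, "black": {"K"},
}

def transmits(filters, name, component):
    """True if the named filter passes the given color component."""
    return component in filters[name]

print(transmits(RGB_FILTERS, "violet", "B"))  # True
print(transmits(RGB_FILTERS, "violet", "G"))  # False: violet blocks green
```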
At least one of the N-layered pixel arrays 101 to 10N for converting light of at least two different color light components into photo-generated charges, the at least two excluding the N.
It should be noted that, the other pixel arrays except the at least one pixel array in the N-layer pixel array are configured to convert light of at least one color light component into photo-generated charges. For example, assume that N is 2, where one layer of pixel arrays is used to convert light of two different color light components into photo-generated charges, and another layer of pixel arrays is used to convert light of one or two of the remaining color light components into photo-generated charges.
A readout circuit 121 for converting the photo-generated charges into electrical signals and outputting the electrical signals to form an image.
In an embodiment of the present application, an image sensor includes a color filter array, N layers of pixel arrays, and a readout circuit, where N is greater than 1 and less than the number of color light components of the color model; at least one of the N layers of pixel arrays converts light of at least two different color light components into photo-generated charges. In this way, the power consumed and the heat generated during operation are reduced while better image quality is obtained with fewer pixel arrays. In addition, because the image sensor has fewer pixel array layers than the number of color light components of the color model, the process difficulty of manufacturing it is reduced. For example, for the RGB model with N = 2, i.e., a two-layer pixel array, the process difficulty is lower than that of Foveon X3, which has a three-layer pixel array.
An embodiment of the present application further provides an image sensor, fig. 2 is a schematic structural diagram of the image sensor in the embodiment of the present application, and as shown in fig. 2, an image sensor 20 includes: a first layer of pixel array 201 and a second layer of pixel array 202, a color filter array 203, and readout circuitry 204; wherein,
the color filter array 203 is disposed on the first layer pixel array 201, taking the transmission direction of the light as a reference direction, for allowing the light of the color light component of the RGB color model to pass through.
It should be noted that the first-layer pixel array refers to the layer of pixel array that the light ray first reaches, and as shown in fig. 2, the first-layer pixel array 201 is stacked on the second-layer pixel array 202, and the color filter array 203 is stacked on the first-layer pixel array 201, with the direction of light ray transmission as a reference direction.
And a first layer pixel array 201 arranged between the color filter array 203 and the second layer pixel array 202 for converting the light rays of two different color light components into photo-generated charges.
Here, the two different color light components may be any two of the three color light components (red light, green light, and blue light) of the RGB model. For example, they may be blue light and green light; accordingly, the second-layer pixel array is configured to absorb red light (i.e., convert red light into photo-generated charge). As another example, the two different color light components may be red light and green light, in which case the second-layer pixel array is configured to absorb blue light.
It should be noted that in other embodiments, the first layer pixel array 201 can also be used to convert only one color light component into photo-generated charges, and the second layer pixel array 202 can be used to convert the remaining two different color light components into photo-generated charges. For example, a first layer of pixel array 201 is used to convert red light into photo-generated charge and a second layer of pixel array 202 is used to convert blue and green light into photo-generated charge. For another example, the first layer of pixel array 201 is configured to convert green light into photo-generated charge and the second layer of pixel array 202 is configured to convert blue light and red light into photo-generated charge. That is, the first layer pixel array 201 can be used to convert light of any color light component in the RGB model into photo-generated charges.
It should be further noted that, for a color model (e.g., CMYK model) having four color light components, the pixel array combination of the image sensor may be any one of the following:
the first method is as follows: two layers of pixel arrays are included, where a first layer of pixel arrays may be used to convert light of one, two, or three different color light components of the color model into photo-generated charge, and a second layer of pixel arrays is used to convert light of the remaining color light components of the color model into photo-generated charge. Taking the CMYK model as an example, the first layer of pixel array may be used to absorb cyan light and magenta light, and the second layer of pixel array may be used to absorb yellow light and black light.
The second method is as follows: the sensor comprises three layers of pixel arrays, where any one layer may convert light of two different color light components of the color model into photo-generated charges, and the other two layers each convert light of one of the remaining color light components into photo-generated charges. Taking the CMYK model as an example, the first layer of the pixel array may absorb cyan light and magenta light, while the second and third layers absorb yellow light and black light, respectively.
And a second layer of pixel array 202 for converting light of the remaining one of the color light components of the color model into photo-generated charges.
A readout circuit 204 for converting the photo-generated charge into an electrical signal and outputting the electrical signal to form an image.
In the embodiment of the present application, the image sensor using the RGB model has two layers of pixel arrays; compared with Foveon X3, which has three layers, the number of pixel units is greatly reduced, so the amount of electrical-signal data output by the readout circuit is reduced. This lowers the complexity of the RGB reconstruction algorithm, improves color accuracy, and provides the conditions for obtaining a higher frame rate.
An embodiment of the present invention further provides an image sensor, and fig. 3A is a schematic structural diagram of the image sensor in the embodiment of the present invention, and as shown in fig. 3A, the image sensor 30 includes: a first layer of pixel array 301 and a second layer of pixel array 302, a color filter array 303, and readout circuitry 304; wherein,
the first-layer pixel array 301, which is disposed between the color filter array 303 and the second-layer pixel array 302, includes M first pixel units 3011 and L second pixel units 3012, where M and L are integers greater than 0.
When implemented, the first pixel units 3011 and the second pixel units 3012 are arranged alternately, and M may be equal to L. The arrangement may be the arrangement shown in fig. 3B, or may be another arrangement.
The second layer pixel array 302 includes K third pixel units 3021, where K is an integer greater than 0.
It is understood that the first pixel unit, the second pixel unit and the third pixel unit are three different pixel units, and the different pixel units correspondingly absorb light with different wavelengths. For example, a first pixel unit is used for absorbing blue light with a wavelength of 407 nanometers (nm) to 505nm, a second pixel unit is used for absorbing green light with a wavelength of 505nm to 525nm, and a third pixel unit is used for absorbing red light with a wavelength of 640nm to 780 nm.
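Using the example wavelength bands quoted above, the mapping from wavelength to absorbing pixel unit can be sketched as follows (band edges are taken from the example; the function itself is a hypothetical illustration):

```python
# Sketch: classify an incident wavelength (nm) to the pixel unit that
# absorbs it, using the example bands quoted above. Names are illustrative.
BANDS = {
    "first (blue)": (407, 505),
    "second (green)": (505, 525),
    "third (red)": (640, 780),
}

def absorbing_unit(wavelength_nm):
    for unit, (lo, hi) in BANDS.items():
        if lo <= wavelength_nm <= hi:
            return unit
    return None  # outside the quoted bands (e.g. the 525 to 640 nm gap)

print(absorbing_unit(450))  # first (blue)
print(absorbing_unit(700))  # third (red)
print(absorbing_unit(600))  # None
```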
The color filter array 303, which is disposed on the first layer pixel array 301 with reference to the transmission direction of light, includes M first color filters 3031 and L second color filters 3032, the first color filters 3031 are stacked in alignment with the first pixel units 3011, and the second color filters 3032 are stacked in alignment with the second pixel units 3012.
A first color filter 3031 for allowing light of the color light component corresponding to the first pixel unit 3011 and light of the color light component corresponding to the third pixel unit 3021 to pass through.
And a second color filter 3032 for allowing light of the color light component corresponding to the second pixel unit 3012 and light of the color light component corresponding to the third pixel unit 3021 to pass therethrough.
It is understood that since the three pixel units absorb light of different wavelengths, the color filters stacked above them differ accordingly. For example, if the first pixel unit absorbs blue light and the second pixel unit absorbs green light, then the first color filter is a violet filter and the second is a yellow filter: the violet filter passes blue light and red light, the blue light being absorbed by the first pixel unit and the red light by the third pixel unit; the yellow filter passes green light and red light, the green light being absorbed by the second pixel unit and the red light by the third pixel unit. As another example, if the first pixel unit absorbs red light and the second pixel unit absorbs blue light, then the first color filter is a magenta filter and the second is a cyan filter: the magenta filter passes red light and green light, the red light being absorbed by the first pixel unit and the green light by the third pixel unit; the cyan filter passes blue light and green light, the blue light being absorbed by the second pixel unit and the green light by the third pixel unit.
In the present embodiment, both color filters can transmit light absorbed by the pixel cells of the second layer of the pixel array, so that even if the number of the pixel cells of the second layer is small, the image quality is not affected.
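A minimal sketch of why a violet/yellow site pair plus the red-absorbing second layer recovers all three RGB samples (toy numbers; averaging the red channel across the two sites is an illustrative assumption, not the patent's processing):

```python
# Toy recovery of R, G, B from one violet-filtered site and one
# yellow-filtered site, each backed by the red-absorbing second layer.
scene = {"R": 0.7, "G": 0.4, "B": 0.2}

# Violet site: filter passes B and R; layer 1 absorbs B, layer 2 absorbs R.
violet_site = {"B": scene["B"], "R": scene["R"]}
# Yellow site: filter passes G and R; layer 1 absorbs G, layer 2 absorbs R.
yellow_site = {"G": scene["G"], "R": scene["R"]}

# Each site reads two channels, so the pair jointly recovers R, G, and B.
recovered = {
    "B": violet_site["B"],
    "G": yellow_site["G"],
    "R": (violet_site["R"] + yellow_site["R"]) / 2,  # red sampled at both sites
}
print(recovered)  # {'B': 0.2, 'G': 0.4, 'R': 0.7}
```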
Each of the first, second and third pixel units 3011, 3012 and 3021 is configured to convert light of its corresponding color light component into the photo-generated charge.
It will be appreciated that different types of pixel units absorb light of different wavelengths. For example, a first pixel unit is configured to absorb blue light, a second pixel unit to absorb red light, and a third pixel unit to absorb green light. This is because the three different pixel units include different photoelectric conversion elements. For example, as shown in Table 1, the photosensitive region of the cylindrical PhotoDiode (PD) in the first pixel unit, which absorbs blue light, has a diameter of 60 nm; that of the second pixel unit, which absorbs red light, has a diameter of 120 nm; and that of the third pixel unit, which absorbs green light, has a diameter of 90 nm.
TABLE 1

Pixel unit          Color light component   Photodiode photosensitive-region diameter
First pixel unit    Blue light              60 nm
Second pixel unit   Red light               120 nm
Third pixel unit    Green light             90 nm
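Table 1 can be checked against the sub-wavelength idea: each photodiode diameter is smaller than the shortest wavelength of the light it absorbs (a sketch using values quoted in this document; the wavelength bands are those from the earlier example, and the pairing of units to bands follows Table 1's colors):

```python
# Sketch checking the sub-wavelength property: PD diameter < shortest
# absorbed wavelength, using values quoted in the text (names illustrative).
units = [
    # (unit / color, PD diameter in nm, shortest absorbed wavelength in nm)
    ("first / blue", 60, 407),
    ("third / green", 90, 505),
    ("second / red", 120, 640),
]
for name, diameter, wavelength in units:
    assert diameter < wavelength
    print(f"{name}: {diameter} nm PD < {wavelength} nm light, sub-wavelength")
```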
A readout circuit 304 for converting the photo-generated charge into an electrical signal and outputting the electrical signal to form an image.
In this embodiment, the image sensor using the RGB model has two layers of pixel arrays: the first layer has two kinds of pixel units and the second layer has one kind. Because the second layer has fewer pixel units, metal wiring can be routed through the gaps of the second-layer pixel array. Compared with the opposite arrangement (one kind of pixel unit in the first layer and two kinds in the second), this arrangement makes the routing of the metal wires simpler, greatly reducing process difficulty.
In other embodiments, each of the pixel units includes a plurality of photoelectric conversion elements 305; each photoelectric conversion element is provided with a photosensitive area with a diameter smaller than the light wavelength of the color light component corresponding to the photoelectric conversion element so as to convert the light of the color light component corresponding to the photoelectric conversion element into the photo-generated charges.
The diameter of the photosensitive region is the distance between two points on its edge measured through its center. For example, if the photoelectric conversion element is a cylindrical photodiode, the photosensitive region is circular and its diameter is the diameter of that circle; if the element is a square photodiode, the photosensitive region is square and its diameter is the length of the square's diagonal.
It can be understood that, in this embodiment, the photoelectric conversion element has a photosensitive area with a diameter smaller than the light wavelength of the color light component corresponding to the photosensitive area, so that the incident light can generate optical resonance in the cavity of the photoelectric conversion element, thereby enhancing the optical state density of the photoelectric conversion element, increasing the quantum efficiency, and improving the image quality.
For example, when the diameter of the photosensitive region of the cylindrical photodiode is 60nm, more than 95% of blue light can be absorbed; when the diameter of the photosensitive area of the cylindrical photodiode is 90nm, more than 90% of green light can be absorbed.
In implementation, the numbers of photoelectric conversion elements of the three diameters may be the same or different. However, the spacing between adjacent photoelectric conversion elements should be at least 50 nm, which reduces crosstalk. The thickness of the photosensitive region is generally between 80 nm and 500 nm; the greater the thickness, the higher the light absorption rate.
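The 50 nm spacing constraint bounds how many photodiodes fit within a pixel. A rough sketch, assuming a hypothetical 1000 nm pixel pitch (not a value from the patent):

```python
# Sketch: how many cylindrical photodiodes of a given diameter fit along
# one side of a pixel while keeping the minimum 50 nm spacing quoted above.
# The 1000 nm pixel pitch is an assumed example.
def pds_per_side(pixel_pitch_nm, pd_diameter_nm, min_gap_nm=50):
    # n diodes and (n - 1) gaps must fit: n*d + (n - 1)*g <= pitch,
    # so n <= (pitch + g) / (d + g).
    return (pixel_pitch_nm + min_gap_nm) // (pd_diameter_nm + min_gap_nm)

print(pds_per_side(1000, 60))   # 9 blue PDs per side
print(pds_per_side(1000, 90))   # 7 green PDs per side
print(pds_per_side(1000, 120))  # 6 red PDs per side
```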
In other embodiments, the number of pixel cells of the second layer pixel array 302 is less than the number of pixel cells of the first layer pixel array 301.
For example, the number of pixel units of the second layer pixel array 302 may be 3/4 or 1/2, etc. of the number of pixel units of the first layer pixel array 301. Of course, the number of pixel units of the second layer pixel array 302 may be equal to the number of pixel units of the first layer pixel array 301, but then the power consumption of the image sensor is increased during operation, and accordingly, the amount of heat generated is also increased.
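The saving can be put in rough numbers. A toy count for a block of four first-layer pixel sites, comparing a three-layer stack with the two-layer design at half second-layer count (the block size is an illustrative assumption):

```python
# Toy sample count for a block of four first-layer pixel sites (hypothetical
# block size). A three-layer stack reads 3 samples per site; the two-layer
# design reads one sample per first-layer site plus a half-count second layer.
first_layer_sites = 4
three_layer_samples = first_layer_sites * 3                      # Foveon-style stack
two_layer_samples = first_layer_sites + first_layer_sites // 2   # second layer at 1/2 count
print(three_layer_samples, two_layer_samples)  # 12 6
```

Fewer samples per block means less electrical-signal data for the readout circuit, consistent with the power and frame-rate claims above.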
In the case where the number of pixel units of the second-layer pixel array 302 is smaller than the number of pixel units of the first-layer pixel array 301, as shown in fig. 4, the third pixel unit 3021 and the first and second pixel units 3011 and 3012 may be stacked with misalignment.
It should be noted that there are many ways of stacking by misalignment, for example, the third pixel unit is misaligned with the pixel units of the first layer of pixel array by one or more photoelectric conversion elements; for another example, when the number of the third pixel units is half of the number of the pixel units of the first-layer pixel array, as shown in fig. 5, the third pixel unit 3021 is disposed at a central position below the four pixel units of the first-layer pixel array.
The main working principle of Foveon X3, as shown in fig. 6, is to exploit the difference in the absorption depths of light of different wavelengths in silicon, measuring the signals obtained at different depths and thereby detecting the three RGB colors within a single pixel.
However, Foveon X3 has high power consumption, severe heat generation, a large pixel size, a large data volume, a low frame rate, potentially severe spectral crosstalk, complicated RGB reconstruction algorithms, inaccurate color, poor color performance at light sensitivity (ISO) settings of 100 or below, and high color noise in low light.
Based on this, an exemplary application of the embodiment of the present application in a practical application scenario will be described below.
The embodiment of the application provides a double-layer stacked Complementary Metal-Oxide-Semiconductor (CMOS) Image Sensor (CIS) based on sub-wavelength photodiodes. As shown in fig. 7, the image sensor 70 comprises two layers of pixels, the first layer consisting of two types of pixels. One type is covered by a violet color filter 701, beneath which are several cylindrical photodiodes 702 of 60 nm diameter that absorb blue light; the other type is covered by a yellow color filter 703, beneath which are several cylindrical photodiodes 704 of 90 nm diameter that absorb green light. The second layer of pixels consists of several cylindrical photodiodes 705 of 120 nm diameter that absorb red light. In this implementation, the numbers of photodiodes of the three diameters are the same. The second layer has half as many pixels as the first layer, each located beneath four pixels of the first layer.
Thus, this pixel-stacking approach improves the signal-to-noise ratio and the resolution of the CIS and reduces false color in the demosaicing process.
As shown in fig. 8, a color filter array covers the pixel photodiodes. It comprises a violet filter P, which absorbs green light and transmits blue light and red light, and a yellow filter Y, which absorbs blue light and transmits green light and red light; the violet filters P and yellow filters Y are arranged alternately.
The working method of the stacked-pixel CIS structure shown in fig. 7 is as follows: after light passes through color filter P (or Y), the blue light (or green light) traverses the array of cylindrical photodiodes, where more than 95% of the blue light (more than 90% of the green light) is absorbed through resonance absorption in the cylindrical photodiodes, converted into electrical signals stored in the first-layer PDs, and read out to obtain the B (or G) channel signal; the red light is hardly absorbed. When the light reaches the second-layer PDs, the red light is absorbed there by cylindrical photodiodes of about 120 nm diameter.
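A toy signal budget using the absorption fractions quoted above (photon counts are arbitrary integers chosen for the illustration; this is a sketch, not a radiometric model):

```python
# Toy per-site signal budget using the quoted absorption fractions:
# >95% of blue and >90% of green absorbed in the first layer,
# red passing through nearly unabsorbed. Counts are arbitrary.
incident = {"B": 1000, "G": 1000, "R": 1000}     # photons per channel

b_signal_layer1 = incident["B"] * 95 // 100      # ~95% of blue in layer 1 (P site)
g_signal_layer1 = incident["G"] * 90 // 100      # ~90% of green in layer 1 (Y site)
r_to_layer2 = incident["R"]                      # red reaches the second layer

print(b_signal_layer1, g_signal_layer1, r_to_layer2)  # 950 900 1000
```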
The cylindrical photodiodes under the color filter P have a diameter of about 60 nm, and those under the color filter Y have a diameter of about 90 nm. The thickness of the photosensitive region of the photodiode is between 80 nm and 500 nm; the greater the thickness, the higher the absorption of light.
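The text only states that absorption increases with thickness over the 80-500 nm range; a Beer-Lambert exponential is one simple way to model that monotonic relationship. The absorption coefficient below is an arbitrary illustration, not a value from the patent:

```python
import math

def absorbed_fraction(thickness_nm, alpha_per_nm=0.005):
    """Beer-Lambert-style model: 1 - exp(-alpha * t); alpha is illustrative."""
    return 1.0 - math.exp(-alpha_per_nm * thickness_nm)

# Absorption rises monotonically across the 80-500 nm range given in the text.
thin, thick = absorbed_fraction(80), absorbed_fraction(500)
```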
As shown in fig. 9, the left diagram is a schematic cross-section of the first layer of photodiodes, mainly showing a photodiode 901 for absorbing blue light (blue photodiode for short) and a photodiode 902 for absorbing green light (green photodiode for short), as well as the second-layer photodiode 903 for absorbing red light and the circuit 904 connecting the blue and green photodiodes to a transfer gate. The number of cylindrical photodiodes is determined by the pixel size, and the spacing between adjacent cylindrical photodiodes must be kept greater than 50 nm.
The readout circuit of the stacked CIS pixel is similar to that of a conventional pixel structure, as shown in fig. 10. It works as follows. Step 1, exposure: electron-hole pairs generated by incident light are separated by the electric field of the pinned photodiode (PPD); electrons move to the n region and holes move to the p region. Step 2, reset: at the end of exposure, RST is activated and the readout region is reset to a high level. Step 3, readout of the reset level: after the reset is finished, the reset level is read out and the readout signal is stored in a first capacitor. Step 4, charge transfer: TX is activated, transferring the charge completely from the photosensitive region to the n+ region for readout. Step 5, readout of the signal level. It should be noted that each layer of photodiodes has such a readout circuit.
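The five steps above amount to correlated double sampling (CDS): the reset level stored in step 3 is subtracted from the signal level read in step 5, cancelling the shared reset offset. A toy numerical sketch, with an illustrative conversion gain not taken from the text:

```python
def read_pixel(photo_electrons, reset_offset_uV, gain_uV_per_e=50.0):
    """Toy CDS readout mirroring steps 2-5 of the sequence described above."""
    reset_level = reset_offset_uV                 # steps 2-3: reset, store the level
    signal_level = reset_level + photo_electrons * gain_uV_per_e  # steps 4-5
    return signal_level - reset_level             # CDS: subtract the reset level

# The result depends only on the photo-generated charge, not the reset offset:
a = read_pixel(1000, reset_offset_uV=120.0)
b = read_pixel(1000, reset_offset_uV=-35.0)
```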
In the embodiment of the application, by using stacked pixels based on sub-wavelength photodiodes, the false color of the demosaicing process is reduced and the resolving power of the CIS is improved compared with a Bayer-array CIS. Compared with three-layer stacked pixels, power consumption is reduced; and since the CIS structure provided in the embodiment of the application has fewer second-layer pixels, power consumption can be reduced further.
In other embodiments, the cylindrical photodiodes may be replaced with regular-polygonal-prism photodiodes.
In other embodiments, the thickness of the red and green photodiodes may be increased as appropriate to increase the absorption of these two kinds of light.
In other embodiments, the assignment of the three colors R, G and B may be changed arbitrarily, although it should be noted that the color filters need to be changed accordingly.
Based on the foregoing embodiments, an imaging method is provided in the embodiments of the present application, fig. 11 is a schematic implementation flow diagram of the imaging method in the embodiments of the present application, and as shown in fig. 11, the method at least includes the following steps 111 to 114:
step 111, turning on an image sensor;
step 112, transmitting light rays of color light components of a color model through a color filter array of the image sensor;
step 113, converting light rays of at least two different color light components into photo-generated charges through at least one of N layers of pixel arrays of the image sensor, wherein the at least two colors do not include the N types, and N is greater than 1 and smaller than the number of color light components of the color model;
step 114, converting the photo-generated charges into electrical signals by a readout circuit of the image sensor, and outputting the electrical signals to form an image.
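Steps 111 to 114 can be strung together as a toy pipeline. The filter-transmission map follows fig. 8; the helper name, scene values, and gain are illustrative assumptions:

```python
TRANSMITS = {"P": {"blue", "red"}, "Y": {"green", "red"}}  # per fig. 8
GAIN = 1.0  # illustrative charge-to-signal conversion factor

def imaging_method(scene, pixel_types):
    """Toy version of steps 111-114 for a row of first-layer pixels."""
    sensor_on = True                                  # step 111: turn on the sensor
    image = []
    for ptype in pixel_types:
        # step 112: the color filter transmits this pixel's color components
        transmitted = {c: v for c, v in scene.items() if c in TRANSMITS[ptype]}
        # step 113: the pixel array converts the light into photo-generated charge
        charge = sum(transmitted.values())
        # step 114: the readout circuit converts the charge into an electrical signal
        image.append(charge * GAIN)
    return image

row = imaging_method({"blue": 2.0, "green": 3.0, "red": 1.0}, ["P", "Y"])
```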
In other embodiments, the converting light of at least two different color light components into photo-generated charge by at least one of the N-layer pixel arrays of the image sensor comprises: in a case where the N-layer pixel array includes a first-layer pixel array and a second-layer pixel array, and the first-layer pixel array is disposed between the color filter array and the second-layer pixel array, light rays of two different color light components are converted into photo-generated charges by the first-layer pixel array.
In other embodiments, the converting light of two different color light components into photo-generated charges by the first-layer pixel array includes: converting the light of the color light component corresponding to each pixel unit into the photo-generated charges through the M first pixel units and the L second pixel units of the first-layer pixel array.
In other embodiments, the method further includes: converting the light of the corresponding color light component into the photo-generated charges through the K third pixel units of the second-layer pixel array.
In other embodiments, the converting the light of the corresponding color light components into the photo-generated charges by the M first pixel units and the L second pixel units of the first-layer pixel array includes: converting the light of the corresponding color light component into the photo-generated charges through the photoelectric conversion elements of each pixel unit, wherein each photoelectric conversion element has a photosensitive region with a diameter smaller than the wavelength of the light of its corresponding color light component.
In other embodiments, the transmitting light rays of the color light components of the color model through the color filter array of the image sensor includes: transmitting light rays of color light components corresponding to the first pixel unit and light rays of color light components corresponding to the third pixel unit through the M first color filters in the color filter array; transmitting light rays of the color light component corresponding to the second pixel unit and light rays of the color light component corresponding to the third pixel unit through the L second color filters in the color filter array; wherein the first color filter is stacked in alignment with the first pixel unit and the second color filter is stacked in alignment with the second pixel unit.
The above description of the method embodiment is similar to the description of the image sensor embodiment and has similar beneficial effects. For technical details not disclosed in the method embodiment of the present application, refer to the description of the image sensor embodiment of the present application.
It should be noted that, in the embodiments of the present application, if the above-described imaging method is implemented in the form of a software functional module and sold or used as a standalone product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for enabling an electronic device (which may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a robot, a drone, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present application provides an electronic device. Fig. 12 is a schematic diagram of a hardware entity of the electronic device according to the embodiment of the present application. As shown in fig. 12, the electronic device 120 includes a memory 1201, a processor 1202 and an image sensor 1203. The memory 1201 stores a computer program operable on the processor 1202, and the processor 1202 implements the steps of the imaging method provided in the above embodiments when executing the program.
The memory 1201 is configured to store instructions and applications executable by the processor 1202, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 1202 and modules in the electronic device 120, and may be implemented by a FLASH memory (FLASH) or a Random Access Memory (RAM).
Correspondingly, the embodiment of the present application provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in the imaging method provided in the above embodiment.
Here, it should be noted that: the above description of the storage medium and device embodiments is similar to the description of the method embodiments above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the storage medium and apparatus of the present application, reference is made to the description of the embodiments of the method or the embodiments of the image sensor of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed image sensor, apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for enabling an electronic device (which may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a robot, a drone, or the like) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several image sensor embodiments provided by the present application can be combined arbitrarily without conflict to obtain a new image sensor embodiment.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An image sensor, comprising: the color filter array, the N layers of pixel arrays and the readout circuit, wherein N is greater than 1 and less than the number of color light components of the color model; wherein,
the color filter array is arranged above the N layers of pixel arrays and is used for allowing light rays of color light components of the color model to penetrate;
at least one of said N pixel arrays for converting light of at least two different said color light components into photo-generated charges, said at least two not including said N;
the readout circuit is used for converting the photo-generated charges into electric signals and outputting the electric signals to form images.
2. The image sensor of claim 1, wherein the color model is an RGB model, and correspondingly,
the N-layer pixel array includes a first layer pixel array and a second layer pixel array, the first layer pixel array disposed between the color filter array and the second layer pixel array.
3. The image sensor of claim 2, wherein the first layer pixel array comprises M first pixel cells and L second pixel cells, the second layer pixel array comprises K third pixel cells, M, L and K each being an integer greater than 0; wherein,
each of the first pixel unit, the second pixel unit and the third pixel unit is respectively used for converting light rays of color light components corresponding to the pixel unit into the photo-generated charges.
4. The image sensor of claim 3, wherein each of the pixel units comprises a plurality of photoelectric conversion elements; wherein,
each photoelectric conversion element is provided with a photosensitive area with a diameter smaller than the light wavelength of the corresponding color light component so as to convert the light of the corresponding color light component into the photo-generated charges.
5. The image sensor of claim 3, wherein the color filter array comprises the M first color filters and the L second color filters, the first color filters being stacked in alignment with the first pixel cells and the second color filters being stacked in alignment with the second pixel cells; wherein,
the first color filter is used for allowing the light rays of the color light component corresponding to the first pixel unit and the light rays of the color light component corresponding to the third pixel unit to penetrate;
the second color filter is used for allowing the light rays of the color light component corresponding to the second pixel unit and the light rays of the color light component corresponding to the third pixel unit to penetrate.
6. The image sensor of claim 3, wherein the number of pixel cells of the second layer pixel array is less than the number of pixel cells of the first layer pixel array.
7. The image sensor of claim 6, wherein the third pixel cell is stacked misaligned with the first pixel cell and the second pixel cell.
8. A method of imaging, the method comprising:
turning on an image sensor;
transmitting light rays of color light components of a color model through a color filter array of the image sensor;
converting light of at least two different color light components into photo-generated charges by at least one of N layers of pixel arrays of the image sensor, the at least two not including the N, N being greater than 1 and less than the number of color light components of the color model;
converting the photo-generated charge into an electrical signal by a readout circuit of the image sensor and outputting the electrical signal to form an image.
9. The method of claim 8, wherein converting light of at least two different color light components into photo-generated charges by at least one of the N-layer pixel arrays of the image sensor comprises:
in a case where the N-layer pixel array includes a first-layer pixel array and a second-layer pixel array, and the first-layer pixel array is disposed between the color filter array and the second-layer pixel array, light rays of two different color light components are converted into photo-generated charges by the first-layer pixel array.
10. An electronic device comprising a memory, a processor and an image sensor according to any of claims 1 to 7, the memory storing a computer program executable on the processor, wherein the processor, when executing the program, performs the steps of the imaging method according to claim 8 or 9.
CN201910972469.4A 2019-10-14 2019-10-14 Image sensor, imaging method and device Active CN110650301B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910972469.4A CN110650301B (en) 2019-10-14 2019-10-14 Image sensor, imaging method and device

Publications (2)

Publication Number Publication Date
CN110650301A true CN110650301A (en) 2020-01-03
CN110650301B CN110650301B (en) 2022-03-01

Family

ID=68993992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910972469.4A Active CN110650301B (en) 2019-10-14 2019-10-14 Image sensor, imaging method and device

Country Status (1)

Country Link
CN (1) CN110650301B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246133A (en) * 2020-01-09 2020-06-05 Oppo广东移动通信有限公司 Image sensor and image processing method
WO2022037557A1 (en) * 2020-08-19 2022-02-24 华为技术有限公司 Image sensor, signal processing method, and related device
EP4006976A1 (en) * 2020-11-30 2022-06-01 Samsung Electronics Co., Ltd. Image sensor
WO2023283919A1 (en) * 2021-07-16 2023-01-19 华为技术有限公司 Image sensor and electronic device
WO2023098638A1 (en) * 2021-11-30 2023-06-08 维沃移动通信有限公司 Image sensor, photographic module, electronic device and photographing method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103765590A (en) * 2011-09-01 2014-04-30 佳能株式会社 Solid-state image sensor
US20150270314A1 (en) * 2014-03-19 2015-09-24 Kabushiki Kaisha Toshiba Solid-state imaging device
CN105898118A (en) * 2015-02-16 2016-08-24 三星电子株式会社 Image sensor and imaging apparatus including the same
CN107039473A (en) * 2015-11-30 2017-08-11 三星电子株式会社 Imaging sensor and the electronic installation including it
CN107251224A (en) * 2015-02-26 2017-10-13 索尼半导体解决方案公司 Solid-state imaging element and electronic equipment
CN107799540A (en) * 2016-09-02 2018-03-13 三星电子株式会社 Semiconductor devices
CN108474884A (en) * 2016-01-29 2018-08-31 富士胶片株式会社 Composition, film, near infrared ray cut-off filter, laminated body, pattern forming method, solid-state imaging element, image display device, infrared sensor and colour filter
US20180254356A1 (en) * 2017-03-01 2018-09-06 Phase Sensitive Innovations, Inc. Diamond-backed photodiodes, diamond-sandwiched photodiodes, photodiode systems and related methods of manufacture
CN110099230A (en) * 2019-04-23 2019-08-06 Oppo广东移动通信有限公司 Image processing method and device and storage medium
CN110112155A (en) * 2019-04-23 2019-08-09 Oppo广东移动通信有限公司 Pixel unit, imaging sensor and image processing method and storage medium


Also Published As

Publication number Publication date
CN110650301B (en) 2022-03-01

Similar Documents

Publication Publication Date Title
CN110650301B (en) Image sensor, imaging method and device
KR101129128B1 (en) Circuit and photo sensor overlap for backside illumination image sensor
CN110740277B (en) Image sensor, electronic device and imaging method
TWI567955B (en) Color filters for sub-diffraction limit sensors
US8134115B2 (en) Color filters for sub-diffraction limit-sized light sensors
EP2446474B1 (en) Gradient color filters for sub-diffraction limit sensors
TWI556418B (en) Image sensor
KR102372745B1 (en) Image sensor and electronic device having the same
CN110740236B (en) Image sensor, electronic device, image processing method, and storage medium
CN110049261B (en) Pixel structure, image sensor and terminal
US20110155908A1 (en) Color filter array and image obtaining apparatus
CN110677606B (en) Pixel structure, CIS and terminal
CN110071130B (en) CMOS image sensor, image processing method and storage medium
CN110505419B (en) Pixel structure, image sensor and terminal
CN109951661A (en) Imaging sensor and electronic equipment
CN110475083B (en) Pixel structure, image sensor and terminal
CN110797366A (en) Pixel structure, complementary metal oxide semiconductor image sensor and terminal
JP5253856B2 (en) Solid-state imaging device
CN110891137A (en) Image sensor, electronic device, image processing method, and storage medium
CN110677605B (en) Laminated CIS, image processing method, storage medium and terminal device
Gouveia et al. On evolution of cmos image sensors
CN110854145A (en) Pixel structure, image sensor and terminal
CN111182247B (en) Pixel structure, image sensor and terminal
CN110690237B (en) Image sensor, signal processing method and storage medium
CN111200724B (en) Polarization type CIS, image processing method, storage medium and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant