CN115023945A - Hybrid pixel circuit for acquiring frame-based images and event-based images - Google Patents

Hybrid pixel circuit for acquiring frame-based images and event-based images

Info

Publication number
CN115023945A
Authority
CN
China
Prior art keywords
pixel
transfer gate
photodiodes
transistor
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080095165.5A
Other languages
Chinese (zh)
Inventor
石井隆雄
工藤義治
物井诚
史斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN115023945A publication Critical patent/CN115023945A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/778Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising amplifiers shared between a plurality of pixels, i.e. at least one part of the amplifier must be on the sensor array itself
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Abstract

The present invention provides a pixel circuit. The pixel circuit includes: a current change detector for detecting a change in photocurrent from an input port; one or more pixel cells, each pixel cell comprising one or more photodiodes and corresponding transfer gate transistors and a reset transistor, wherein the source of each transfer gate transistor is connected to the cathode of the corresponding photodiode and the drain of each transfer gate transistor is connected to the source of the reset transistor; and one or more selector circuits, each connecting the drain of the reset transistor to the input port of the current change detector or to a fixed voltage in response to a control signal. The present invention achieves good registration between frame-based images and event-based images.

Description

Hybrid pixel circuit for acquiring frame-based images and event-based images
Technical Field
The present invention relates to a pixel circuit for acquiring frame-based images and event-based images.
Background
Conventional image data is obtained by a frame-based image sensor. Frame-based image data consists of light intensity data for each pixel on a frame-by-frame basis. Event-based image sensors are also known as Dynamic Vision Sensors (DVS) and, thanks to their change detection function with low latency (high speed), high dynamic range, and low power consumption, are expected to be used mainly as components of new camera systems for mobile devices. Event-based image data is a data array that includes at least the location of an event in the image area and its event timestamp.
However, there is a problem with event-based image sensors in that conventional images cannot be obtained with such sensors alone. Therefore, their application is limited to the field of machine vision, such as automotive applications. As a solution to this problem, a dual camera system equipped with an event-based image sensor and a frame-based image sensor may be used, but such a dual camera system increases the size of a camera module and requires implementation of an algorithm to register two images, i.e., an event-based image and a frame-based image.
For example, the prior art has at least the following two problems:
In terms of resolution, it is difficult for an event-based image sensor to improve image resolution because its pixel circuit is more complicated than that of a conventional frame-based image sensor. This problem is even more serious in Asynchronous Time-based Image Sensors (ATIS), Dynamic and Active-pixel Vision Sensors (DAVIS), and similar sensors, because an additional circuit is required to measure light intensity.
In terms of image quality, because event-based pixels and frame-based pixels differ in size, a composite arrangement of the two pixel types not only makes it difficult to match the periodicity of the pixel arrangement, but also generates periodic fixed-pattern noise due to the difference in periodicity.
Disclosure of Invention
In order to achieve both types of image acquisition without the above problems, the present invention discloses a pixel circuit and a method of driving the same. In the pixel circuit, a portion of the frame-based image sensor (e.g., a photodiode) is connected to an event detection circuit without significant changes in image characteristics (resolution, color reconstruction, noise, etc.). Meanwhile, the embodiment enables a conventional color image and a change detection image to be obtained from one image sensor circuit. This means that the frame-based image and the event-based image have a good registration, thereby reducing the image processing load.
The present invention provides a pixel circuit to achieve good registration of frame-based images and event-based images.
According to a first aspect, there is provided a pixel circuit comprising: a current change detector for detecting a change in photocurrent from the input port; one or more pixel cells, each pixel cell comprising one or more photodiodes and a corresponding transfer gate transistor and a reset transistor, wherein a source of each transfer gate transistor is connected to a cathode of a corresponding photodiode and a drain of each transfer gate transistor is connected to a source of the reset transistor; one or more selector circuits, each selector circuit responsive to a control signal to connect a drain of the reset transistor of one or more of the pixel cells to the input port of the current change detector or a fixed voltage.
In a first possible implementation of the first aspect, a clamping photodiode is used as the one or more photodiodes of the one or more pixel cells.
In a second possible implementation form of the first aspect, the current change detector and the one or more selector circuits are arranged on separate substrates.
In a third possible implementation manner of the first aspect, a method for operating each pixel unit is provided, and includes: connecting the drain of the reset transistor to an input port of the current change detector; turning on the reset transistor and the transfer gate transistor to detect a change in light intensity as an event-based sensor mode; connecting the drain of the reset transistor to the fixed voltage; as a frame-based sensor mode, the transfer gate transistor is turned off to accumulate photoelectrons, and each transistor in the pixel unit is switched to output a signal corresponding to the amount of accumulated photoelectrons.
With reference to the third possible implementation manner, in a fourth possible implementation manner of the first aspect, the method further includes: the mode is set separately for each set of one or more of the pixel cells connected to the same selector circuit.
In a fifth possible implementation manner of the first aspect, a method for operating each pixel unit is provided, and includes: connecting the drain of the reset transistor to an input port of the current change detector; turning on the reset transistor and a portion of the transfer gate transistor as an event-based sensor mode to detect a change in light intensity at the photodiode connected to the transfer gate transistor; turning off the remaining transfer gate transistors to accumulate photoelectrons generated in the photodiodes connected to the remaining transfer gate transistors as a frame-based sensor mode; connecting the drain of the reset transistor to the fixed voltage; each transistor in the pixel unit is switched to output a signal corresponding to the accumulated photoelectron amount.
With reference to the fifth possible implementation manner, in a sixth possible implementation manner of the first aspect, the method further includes: a mode is individually set for one or more of the photodiodes of each group in the pixel unit.
In a seventh possible implementation manner of the first aspect, an image sensor includes: the pixel circuit; a processor for reconstructing image data by reading out photoelectrons accumulated by the photodiode, the image data being acquired by the current change detector.
In an eighth possible implementation manner of the first aspect, the camera system includes: the pixel circuit; a processor for reconstructing image data by reading out photoelectrons accumulated by the photodiode, the image data being acquired by the current change detector.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description only show some embodiments of the invention and that still further drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 shows an example of a pixel circuit 1 provided by an embodiment of the present invention;
fig. 2 illustrates a typical event-based pixel circuit;
fig. 3 illustrates a typical frame-based pixel circuit;
fig. 4 shows the operation of the pixel circuit 1 when the reset drain is connected to the current variation detector 200;
fig. 5 shows the operation of the pixel circuit 1 when the reset drain is connected to a fixed voltage;
fig. 6 shows an example of a pixel circuit 1 including one pixel unit 300;
fig. 7 shows an example of a pixel circuit 1 including a plurality of pixel cells 300;
fig. 8 shows an example of a pixel circuit 1 including a plurality of selector circuits 100 and a plurality of pixel cells 300;
fig. 9 shows an example of a pixel cell 300 including one photodiode;
fig. 10 shows an example of a pixel cell 300 including a plurality of photodiodes;
fig. 11 shows an example of a pixel cell 300 including a transistor for a Dual Conversion Gain (DCG) function;
fig. 12 shows an operation example of the pixel circuit 1 during an exposure period;
fig. 13 shows an example of the operation of the pixel circuit 1 during a readout period;
fig. 14 shows an example of switching control between a frame-based sensor mode and an event-based sensor mode;
fig. 15 shows an example of a pixel unit having an overflow gate (OFG);
fig. 16 shows a binning method performed by the current change detector 200 for a plurality of pixel cells 300;
fig. 17 shows a binning method performed by the current change detector 200 on a plurality of photodiodes;
fig. 18 shows a binning method performed by the current change detector 200 for color combinations;
fig. 19 shows an example of a pixel arrangement on a substrate;
fig. 20 shows another example of a pixel arrangement on a substrate;
fig. 21 shows another example of a pixel arrangement on a substrate;
fig. 22 shows another example of a pixel arrangement on a substrate;
fig. 23 shows an example of a stacked structure of a pixel circuit provided by another embodiment of the present invention;
fig. 24 shows an example of a three-stack sensor;
fig. 25 shows an example of a dual-stack sensor;
fig. 26 shows another example of a dual-stack sensor;
fig. 27 shows an overview of generating a superimposed image for SLAM;
fig. 28 illustrates inter-frame compensation using an event-based image.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. The described embodiments are only a part, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art according to embodiments of the invention without inventive step shall fall within the scope of protection of the invention.
Fig. 1 shows an example of a pixel circuit 1 provided by an embodiment of the present invention. The pixel circuit 1 comprises a selector circuit 100, a current change detector 200 and one or more pixel cells 300. The pixel circuit 1 functions not only as a frame-based pixel circuit but also as an event-based pixel circuit. The selector circuit 100 connects the drain of the reset transistor 30 in the pixel unit 300 (hereinafter referred to as "reset drain") to either a fixed voltage such as VDD or an input port of the current change detector 200.
As shown in fig. 4, when the reset drain is connected to the current change detector 200 and the transfer gate transistors 21 to 24 and the reset transistor 30 are turned on, the pixel unit 300 operates in an event-based sensor mode and the pixel circuit 1 operates as the event-based pixel circuit of fig. 2, while the current change detector 200 monitors the photocurrent. Only one photodiode is shown in fig. 2, and the transfer gate transistors 21 to 24 and the reset transistor 30 are omitted.
As shown in fig. 5, when the reset drain is connected to the fixed voltage such as VDD, the pixel unit 300 operates in a frame-based sensor mode, and the pixel circuit 1 operates as a frame-based pixel circuit in fig. 3. In fig. 5, the transfer gate transistors 21 to 24 are in an off state, and the photodiodes accumulate photoelectrons.
A clamping photodiode may be used as the photodiode of the pixel unit 300. Photoelectrons are accumulated while the region around the cathode of the clamping photodiode is filled with holes. Dark current can be reduced with a clamping photodiode.
The selector circuit 100 switches between the two types of pixel circuits in response to a control signal (not shown in fig. 1) from outside the pixel circuit 1.
Fig. 2 shows a typical event-based pixel circuit. Such circuits are known as Dynamic Vision Sensors (DVS) or "event-based sensors". Such a circuit converts the photodiode (PD) current into a voltage in a logarithmic manner, detects the voltage change, and outputs a "turn-on event" when the voltage increase exceeds a predetermined threshold voltage, or a "turn-off event" when the voltage decrease exceeds a predetermined threshold voltage. Hereinafter, detecting that the change in current exceeds the predetermined threshold in either direction is also referred to as "event detection". When the selector circuit 100 is connected to the current change detector 200, the photocurrent continuously flows out of the photodiode, and the current change detector 200 detects the turn-on and turn-off events. The address and polarity, i.e., the location where the change is detected and whether the change is an increase or a decrease, can be obtained, and a conventional color image can also be obtained from one image sensor.
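The logarithmic conversion and threshold comparison described above can be summarized in a short behavioural sketch (a simplified numerical model only, not the circuit itself; the threshold value and function names are illustrative assumptions):

```python
import math

# Behavioural sketch of the event detection described above: the photocurrent
# is converted to a voltage logarithmically, and an ON or OFF event is emitted
# when that voltage has changed by more than a threshold since the last event.

THRESHOLD = 0.05  # illustrative threshold (log-voltage change per event)


def detect_events(photocurrents):
    """Yield (sample_index, polarity) whenever the log of the current changes enough."""
    v_ref = math.log(photocurrents[0])  # reference level after the previous event
    for i, i_pd in enumerate(photocurrents[1:], start=1):
        v = math.log(i_pd)
        if v - v_ref > THRESHOLD:
            v_ref = v
            yield i, "ON"       # turn-on event: intensity increased
        elif v_ref - v > THRESHOLD:
            v_ref = v
            yield i, "OFF"      # turn-off event: intensity decreased


if __name__ == "__main__":
    # Photocurrent rising and then falling (arbitrary units).
    samples = [1.0, 1.02, 1.1, 1.3, 1.2, 0.9, 0.7]
    print(list(detect_events(samples)))
```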
Fig. 3 shows a typical frame-based pixel circuit. The circuit is a general-purpose Complementary Metal Oxide Semiconductor (CMOS) image sensor. The frame-based pixel circuit includes photodiodes 11 to 14, transfer gate transistors 21 to 24, a reset transistor 30, a source follower amplifier transistor 40, and a row select transistor 50.
Generally, the amount of light received by the photodiodes 11 to 14 is measured as follows: first, the reset transistor 30 and the transfer gate transistors 21 to 24 are turned on, and the cathodes of the photodiodes 11 to 14 are reset to a depletion state (no electrons).
Then, the transfer gate transistors 21 to 24 are turned off, and photoelectrons are accumulated at the cathodes of the photodiodes 11 to 14. This period is referred to as the "exposure period". During this period, the reset transistor 30 may remain on, and the FD level will be pulled up for charge transfer as described below.
After the charge is accumulated, the reset transistor 30 is turned off, and the transfer gate transistor 21 is switched to transfer the charge to a Floating Diffusion (FD) 60. The term "switch" means to turn from off to on and, after a predetermined period of time, from on to off. The row select transistor 50 is turned on, and the source follower amplifier transistor 40 outputs a voltage corresponding to the amount of charge to an Analog-to-Digital Converter (ADC) (not shown in fig. 3). After the reset transistor 30 is turned off and before the transfer gate transistor 21 is switched, the reference level of the Floating Diffusion (FD) 60 may be measured to perform output level calibration. The row select transistor 50 may be turned on after the reset transistor 30 is turned off. This process is repeated to measure the amount of light received by the photodiodes 12 to 14: the reset transistor 30 is switched and the transfer gate transistor 22 is switched to measure the amount of light received by the photodiode 12; the reset transistor 30 is switched and the transfer gate transistor 23 is switched to measure the amount of light received by the photodiode 13; and the reset transistor 30 is switched and the transfer gate transistor 24 is switched to measure the amount of light received by the photodiode 14. This period is referred to as the "readout period".
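As a reading aid, the reset-expose-transfer-read sequence just described can be written out as a control-sequence sketch (illustrative only; the abbreviations RST, TGk, and SEL are assumed shorthand for the reset transistor, the transfer gates, and the row select transistor, not the actual drive waveforms):

```python
# Illustrative control sequence for the four-photodiode shared pixel of fig. 3.
# Each "step" is just a log entry; a real device is driven by row-driver
# waveforms, not Python calls.

def readout_sequence(n_photodiodes=4):
    steps = []

    # Reset phase: deplete all photodiode cathodes, then start the exposure.
    steps.append(f"RST on, TG1..TG{n_photodiodes} on   -> photodiodes reset to depletion")
    steps.append(f"TG1..TG{n_photodiodes} off           -> exposure period starts")

    # Readout phase: one photodiode at a time through the shared FD.
    for k in range(1, n_photodiodes + 1):
        steps.append("RST pulse                 -> FD reset")
        steps.append("SEL on, read FD           -> reference level (calibration)")
        steps.append(f"TG{k} pulse                 -> charge of PD{k} transferred to FD")
        steps.append(f"read FD                   -> signal level of PD{k}")
    return steps


if __name__ == "__main__":
    for step in readout_sequence():
        print(step)
```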
Here, in fig. 1, when the reset drain is connected to the fixed voltage, the pixel circuit 1 operates as a general-purpose image sensor (frame-based sensor), and when the reset drain is connected to the current change detector 200, the pixel circuit 1 operates as a DVS (event-based sensor). For example, if the current change detector 200 detects a turn-on or turn-off event while the subject is moving, the start of high-speed movie shooting may be triggered by switching from the event-based pixel circuit to the frame-based pixel circuit.
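A minimal sketch of this trigger idea, assuming a simple software-visible control interface (the FakeSensor class and its methods are placeholders invented for illustration, not the sensor's real API):

```python
# Sketch of event-triggered capture: run in event-based mode and, as soon as
# an ON or OFF event is reported, switch the selector to the fixed voltage and
# take a burst of frame-based captures.

class FakeSensor:
    """Minimal stand-in so the sketch runs; a real driver would replace this."""

    def __init__(self):
        self._polls = 0
        self.mode = "frame"

    def set_mode(self, mode):
        self.mode = mode  # "event": reset drain -> detector; "frame": -> VDD

    def poll_event(self):
        self._polls += 1
        # Pretend an ON event at pixel (3, 7) shows up on the fifth poll.
        return (3, 7, "ON", self._polls) if self._polls >= 5 else None

    def capture_frame(self):
        return [[0] * 4 for _ in range(4)]  # dummy 4x4 frame


def run_hybrid(sensor, burst_length=8):
    sensor.set_mode("event")
    while True:
        event = sensor.poll_event()      # None or (x, y, polarity, timestamp)
        if event is not None:
            sensor.set_mode("frame")     # switch to frame-based sensor mode
            frames = [sensor.capture_frame() for _ in range(burst_length)]
            return event, frames


if __name__ == "__main__":
    ev, frames = run_hybrid(FakeSensor())
    print("trigger event:", ev, "| frames captured:", len(frames))
```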
The pixel circuit 1 may include one pixel unit 300 as shown in fig. 6, or a plurality of pixel units 300 as shown in fig. 7. When a plurality of pixel units 300 are provided in the pixel circuit 1, a selector circuit 100 may be provided for each pixel unit 300, or the current change detector 200 may be shared by a plurality of pixel units 300; that is, the pixel circuit 1 may include one or more selector circuits 100. In fig. 7, the pixel circuit 1 includes one selector circuit 100, and the drains of a plurality of the reset transistors 30 are connected to a common node and connected to the current change detector 200 through that selector circuit 100. In fig. 8, the pixel circuit 1 includes a plurality of selector circuits 100, and the drains of a plurality of the reset transistors 30 are connected to a common node and to the current change detector 200 through one of the plurality of selector circuits 100. As described below, in fig. 18, four pixel cells 300 (each including four photodiodes) are connected to one selector circuit 100; in fig. 23, sixteen pixel cells 300 (each including four photodiodes, shown as four small squares in the figure) are connected to one selector circuit 100 by inter-wafer connections (shown as pillars in the figure). The event-based sensor mode and the frame-based sensor mode may be set separately for each group of one or more pixel cells 300 connected to the same selector circuit 100.
In fig. 1, the pixel unit 300 is shared by four photodiodes 11 to 14. However, the number of the photodiodes is not limited to four. The pixel cell shown in fig. 9 includes one photodiode, and the pixel cell shown in fig. 10 includes a plurality of photodiodes.
As shown in fig. 11, the pixel cell may include a transistor 31 used for a Dual Conversion Gain (DCG) function, which obtains a wide dynamic range by combining two conversion gains. When the output range of the source follower transistor or the input range of the ADC is insufficient, the transistor 31, which is connected to the source of the reset transistor, is turned on.
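The effect of the two conversion gains can be seen with a small arithmetic sketch (the capacitance values below are illustrative assumptions, not values from this disclosure):

```python
# Dual conversion gain, numerically: the conversion gain is q / C_FD, so
# switching in an extra capacitance (transistor 31 on) lowers the gain and
# lets the floating diffusion hold a larger signal without clipping.

Q_E = 1.602e-19   # electron charge [C]
C_FD = 1.6e-15    # floating-diffusion capacitance alone [F] (assumed)
C_EXTRA = 4.8e-15 # extra capacitance added when transistor 31 is on [F] (assumed)


def fd_voltage(electrons, dcg_on):
    """Voltage swing at the floating diffusion for a given photoelectron count."""
    c = C_FD + (C_EXTRA if dcg_on else 0.0)
    return electrons * Q_E / c


if __name__ == "__main__":
    for n in (1_000, 20_000):
        hg = fd_voltage(n, dcg_on=False) * 1e3
        lg = fd_voltage(n, dcg_on=True) * 1e3
        print(f"{n:>6} e-:  high gain {hg:7.1f} mV   low gain {lg:7.1f} mV")
```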
Unlike the operation shown in fig. 4, a part of the transfer gate transistors may be turned on and the rest may be turned off, as shown in fig. 12. The photodiode 11 operates in an event-based sensor mode, and the photodiodes 12 to 14 operate in a frame-based sensor mode. The photodiodes connected to the transfer gate transistors in the off state (photodiodes 12 to 14 in fig. 12 and 13) accumulate photoelectrons, and the other photodiode (photodiode 11 in fig. 12 and 13) is used for event detection. In this case, unlike the operation shown in fig. 5, the above-described part of the transfer gate transistors may be turned off and the remaining transfer gate transistors may be turned on, as shown in fig. 13. As shown in fig. 12 and 13, the pixel function of acquiring a frame-based image or detecting events is selectable, i.e., individually controllable, for each photodiode. An event-based sensor mode and a frame-based sensor mode may be separately set for each group of one or more photodiodes in the pixel cell 300. Some of the photodiodes are used for event detection while the remaining photodiodes in the same optical system are used to acquire frame-based images. In fig. 12, the photodiodes for event detection and the photodiodes for acquiring a frame-based image can be separated within one pixel unit 300 by controlling the gates of the transfer gate transistors 21 to 24. The pixel circuit 1 can acquire the two types of image data at the same time by the following method.
During an exposure period, the selector circuit 100 connects the reset drain to the current change detector 200, and turns on the reset transistor 30 and a part of the transfer gate transistors (the transfer gate transistor 21) to connect the photodiode 11 to the current change detector 200. The other transfer gate transistors 22 to 24 are turned off to accumulate photoelectrons. In the example shown in fig. 12, the photodiode 11 may be connected to the current change detector 200 during the period from the resetting of the photodiode to the start of the readout period.
Referring to fig. 13, during a readout period, the selector circuit 100 connects the reset drain to the fixed voltage (VDD) and turns off the transfer gate transistor that was turned on during the exposure period (the transfer gate transistor 21). Then, the reset transistor 30 and the remaining transfer gate transistors 22 to 24 are driven to read out the accumulated photoelectrons as a frame-based image. As described above in connection with fig. 3, the photoelectrons accumulated by the photodiodes may be read out one photodiode at a time, or simultaneously, that is, the sum of the photoelectrons accumulated by a plurality of photodiodes may be read out to improve accuracy.
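The exposure-period and readout-period control of figs. 12 and 13 can be summarized as the following sketch (the SEL, RST, and TG names are assumed shorthand for the selector, reset, and transfer gate controls; no timing values from this disclosure are implied):

```python
# Sketch of the simultaneous operation of figs. 12 and 13 for one pixel unit:
# photodiode 11 is routed to the current change detector during exposure while
# photodiodes 12 to 14 integrate charge; the roles are swapped for readout.

def exposure_period():
    return [
        "SEL: reset drain -> current change detector",
        "RST on, TG21 on        -> photodiode 11 monitored for events",
        "TG22, TG23, TG24 off   -> photodiodes 12 to 14 accumulate photoelectrons",
    ]


def readout_period():
    steps = [
        "SEL: reset drain -> fixed voltage (VDD)",
        "TG21 off               -> event monitoring stops",
    ]
    for tg, pd in ((22, 12), (23, 13), (24, 14)):
        steps.append(f"RST pulse, TG{tg} pulse  -> read photodiode {pd} charge via the FD")
    return steps


if __name__ == "__main__":
    for step in exposure_period() + readout_period():
        print(step)
```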
Fig. 14 shows an example of switching control between the frame-based sensor mode and the event-based sensor mode. The upper diagram shows an example of the exposure period for each row address. The lower diagram shows an example of the signal level that drives the selector circuit 100 for row address i of the image sensor. When the signal is at a high level, the selector circuit 100 connects the reset drain to the current change detector 200 (available for event detection). When the signal is at a low level, the selector circuit 100 connects the reset drain to the fixed voltage.
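A small sketch of the per-row control signal in fig. 14 (the row period and exposure length below are arbitrary illustrative numbers, not values from this disclosure):

```python
# Per-row selector control as in fig. 14: for row i the signal is high (reset
# drain connected to the current change detector, events available) during the
# exposure window and low (reset drain connected to the fixed voltage) during
# readout. Rolling-shutter style: each row's window is offset by row_period.

def selector_level(row, t, row_period=10, exposure=120):
    start = row * row_period
    return 1 if start <= t < start + exposure else 0


if __name__ == "__main__":
    row_i = 3
    trace = "".join(str(selector_level(row_i, t)) for t in range(0, 200, 5))
    print(f"selector level for row {row_i}: {trace}")
```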
As shown in fig. 15, the pixel unit 300 may include overflow gates (OFG) or overflow drains (OFD) 61, 62, and so on, connected to the photodiodes. In fig. 12, the transfer gate transistors 22 to 24 are turned off, so the photocurrents from the photodiodes 12 to 14 essentially do not flow. However, when a photodiode overflows, photoelectrons flow through the transfer gate transistor connected to that photodiode. In this case, the OFG releases the overflowing photoelectrons: a constant voltage is applied to the gate of the OFG, the drain of the OFG is connected to the anode of the photodiode, and the source of the OFG is pulled to a fixed voltage. This reduces the overflow current flowing from the photodiode in the pixel unit 300 to the current change detector 200. Referring to fig. 12 and 13, this is an effective topology for mitigating the effects of overflow current in the above-described operation.
In fig. 12, one photodiode is used for event detection and three photodiodes are used to acquire a frame-based image. However, two or more photodiodes may be used for event detection in order to increase the sensitivity of the event detection. Fig. 16 and 17 show examples of the binning function. In fig. 16, the photocurrents from the reset drains of two or more pixel units 300 may be added and flow into the current change detector 200 so as to improve sensitivity. Arrows indicate the flow of the photocurrent. In fig. 17, the photocurrents from two or more photodiodes may be added in a Floating Diffusion (FD) 60 and flow into the current change detector 200.
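A trivial numerical sketch of the binning in figs. 16 and 17 (the currents are arbitrary units; the point is only that the summed current reaching the detector is larger):

```python
# Binning sketch: photocurrents from several photodiodes (fig. 17) or from the
# reset drains of several pixel units (fig. 16) are summed at a shared node
# before flowing into the current change detector, so the detector sees a
# larger current, which helps at low light levels.

def current_into_detector(per_photodiode_currents):
    """Total photocurrent reaching the current change detector when binned."""
    return sum(per_photodiode_currents)


if __name__ == "__main__":
    single = [0.05]                        # one photodiode (arbitrary units)
    binned = [0.05, 0.06, 0.05, 0.05]      # four binned photodiodes
    print("single photodiode current:", current_into_detector(single))
    print("binned current           :", current_into_detector(binned))
```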
To improve selectivity for light having a particular wavelength or color, the color filter may be arranged as follows:
(i) a transparent or complementary color filter may be placed over the photodiode for event detection to increase its sensitivity. Fig. 19 shows an example of pixel arrangement on a substrate. Four small squares in the upper left corner outlined by thick lines correspond to one pixel cell 300. "F/G" denotes a photodiode (shown by a sparsely hatched rectangle) for acquiring a frame-based image using a green filter, "F/R" denotes a photodiode (shown by a hatched rectangle) for acquiring a frame-based image using a red filter, "F/B" denotes a photodiode (shown by a densely hatched rectangle) for acquiring a frame-based image using a blue filter, and "E" denotes a photodiode for event detection using a transparent filter. In fig. 20, each pixel cell 300 has a photodiode for event detection using a transparent color filter.
(ii) If a photodiode with a red filter is used for event detection, only changes in the red image are detected. The color used for event detection is not limited to one. Signals from photodiodes of different or the same colors that share the selector circuit 100 and the current change detector 200 may be combined by binning to improve sensitivity or to select the color used for sensing. For example, if a photodiode having a red color filter, a photodiode having a blue color filter, and a photodiode having a green color filter are connected to the current change detector 200, the amount of light of the color obtained by mixing the three colors can be measured. In this case, the combined photocurrent corresponds to white (R + G + B = W). The color filter array may include a Bayer array, a Quad Bayer array, RGB, RYYB, RGBW, and other arrangements. The combination of colors is not limited to this example.
The circuit shown in fig. 18 includes one selector circuit 100, one current change detector 200, and four pixel cells 300, and the reset drains of the four pixel cells 300 are connected to the selector circuit 100. For example, as shown in fig. 18, a green filter is placed on the four photodiodes (outlined with a dashed-dotted line) in the upper-left pixel unit, a blue filter is placed on the four photodiodes (outlined with a dotted line) in the lower-left pixel unit, a red filter is placed on the four photodiodes (outlined with a solid line) in the upper-right pixel unit, and a green filter is placed on the four photodiodes (outlined with a dashed-dotted line) in the lower-right pixel unit. In fig. 18, the signals from the uppermost photodiode having a green color filter in the upper-left pixel unit, the uppermost photodiode having a blue color filter in the lower-left pixel unit, the uppermost photodiode having a red color filter in the upper-right pixel unit, and the uppermost photodiode having a green color filter in the lower-right pixel unit are combined.
Fig. 21 shows an example of another pixel arrangement on the substrate. The small squares correspond to respective photodiodes. Four small squares in the upper left corner, outlined by bold lines, correspond to one pixel cell 300, in which pixel cell 300 four photodiodes have green color filters (shown as sparsely shaded rectangles). The four photodiodes in the next pixel cell 300 on the right have blue filters (shown with densely hatched rectangles), the four photodiodes in the next lower pixel cell 300 have red filters (shown with hatched rectangles), the four photodiodes in the lower left pixel cell 300 have green filters (shown with sparsely hatched rectangles), and the color pattern of these four pixel cells 300 is repeated in fig. 21. 16 pixel cells 300 are shown in fig. 21, and the lower left photodiode, denoted by "E" in each pixel cell 300, is used for event detection. The other photodiodes, denoted by "F", are used to acquire the frame-based image.
The bottom right photodiode in each pixel cell 300 shown in fig. 21 is used for event detection. However, to align the optical center with the coordinate system, the upper left and lower right photodiodes may be used for event detection and the upper right and lower left photodiodes may be used for frame-based image acquisition. Alternatively, the upper right and lower left photodiodes may be used for event detection, while the upper left and lower right photodiodes may be used for frame-based image acquisition.
In order to realize a wide dynamic range, a large pixel and a small pixel may be used in one pixel unit 300. The larger pixels may be used either for event detection or for acquiring frame-based images, in both cases to increase sensitivity. Fig. 22 shows an example of another pixel arrangement on the substrate. The large and small squares in the upper left corner outlined in bold lines correspond to one pixel cell 301, in which two photodiodes have green filters (shown as sparsely hatched rectangles). The two photodiodes in the next pixel cell to the right have blue filters (shown with densely hatched rectangles), the two photodiodes in the next pixel cell below have red filters (shown with hatched rectangles), and the two photodiodes in the lower-left pixel cell have green filters (shown with sparsely hatched rectangles). Four pixel cells are shown in fig. 22, with the large photodiode marked "F" being used to acquire a frame-based image and the small photodiode marked "E" being used for event detection.
The pixel circuit provided by the embodiments is a hybrid pixel circuit for acquiring a frame-based image and detecting events, and this circuit can be realized by the following structure without degrading the quality of the output image. Fig. 23 shows an example of a stacked structure of the pixel circuit provided by an embodiment. In fig. 23, the pixel cells 300 are disposed on an upper silicon substrate, the selector circuit 100 and the current change detector 200 are disposed on a lower silicon substrate, and the reset drains of the pixel cells 300 and the selector circuit 100 may be connected by a stacking process such as copper bonding or another silicon wafer-to-wafer connection, shown schematically as four pillars in fig. 23.
The small squares correspond to respective photodiodes. For example, each pixel unit 300 includes four photodiodes arranged in a square. The lower-left 4 × 4 photodiodes in fig. 23 form four pixel units: green filters (shown by the sparsely hatched rectangles) are placed on the four photodiodes in the upper-left pixel unit, red filters (shown by the hatched rectangles) on the four photodiodes in the lower-left pixel unit, blue filters (shown by the densely hatched rectangles) on the four photodiodes in the upper-right pixel unit, and green filters (shown by the sparsely hatched rectangles) on the four photodiodes in the lower-right pixel unit; this color pattern is repeated. The arrangement of the upper-left 8 × 8 photodiodes, the lower-left 8 × 8 photodiodes, the upper-right 8 × 8 photodiodes, and the lower-right 8 × 8 photodiodes used to acquire the frame-based image and detect events may be the same as or different from the arrangement shown in fig. 21.
If each pixel cell 300 includes four photodiodes, the reset drains of 16 pixel cells 300 are connected to one selector circuit 100 through one pillar in fig. 23. The relationship between the number of photodiodes and the number of current change detectors 200 may be determined according to the minimum sizes of the photodiodes and the current change detectors 200, thereby eliminating the mismatch of their minimum sizes.
With this topology, the semiconductor process for the pixel cell 300 can be designed essentially for its image quality and resolution. In this case, it is only necessary to separate the reset drain from the power supply (VDD) and route it to the top metal layer of the power wiring in order to connect it to the other silicon substrate carrying the current change detector 200. Since a fixed voltage is applied to the reset drain, the influence of noise is relatively small. Accordingly, the hybrid image sensor can be manufactured without any modification to the silicon process. The wiring need only be modified in the metal layer around the reset drain of the pixel cell, as shown in fig. 23. Therefore, a silicon process optimized for frame-based image quality (pure NMOS structure, optimized conditions for ion implantation, oxidation, thermal processing, etc.) can be used with almost no change in circuit layout, and the quality characteristics can thus be maintained.
Figs. 24 to 26 show specific implementations of the stacked structure. Fig. 24 shows an example of a three-stack sensor, including a Back-Side Illuminated (BSI) image sensor photodiode layer (including the pixel cells), an event detector layer, and a logic layer including Analog Front End (AFE) components such as the subsequent ADC. The AFE also includes the source follower transistors and other analog circuitry. Fig. 25 shows an example of a dual-stack sensor including a BSI image sensor layer (acquiring frame-based images and including the AFE and some logic circuitry) and an event detector layer including the other logic. Fig. 26 shows another example of a dual-stack sensor including a BSI image sensor layer and an event detector layer. The BSI image sensor layer captures frame-based images and includes some portions of the AFE and logic circuitry (i.e., pixel source followers, etc.); the event detector layer includes the other portions of the AFE (i.e., the ADC) and of the logic circuitry.
In the above description, the inter-wafer connection is applied to the reset drain node, but the present invention is not limited thereto. For example, the inter-wafer connections may be applied to various nodes, such as the Floating Diffusion (FD) 60 or the additional capacitor node for dual conversion gain. In addition, the event detection circuit may further include other image processing units and memory units for storing event data or for other purposes.
The embodiments provide a hybrid image sensor for acquiring frame-based images and event-based images that are well registered to each other without degrading the quality of the output image. With this technique, two types of images of high quality can be obtained by the same optical system and coordinate system. This enables a high quality compensation of the frame-based image with the event-based image. The following applications are expected to achieve this advantage well:
(1) Feature point extraction for Simultaneous Localization and Mapping (SLAM) with low latency, high dynamic range, and low power consumption, so that a mobile device can implement an indoor navigation system with high traceability. When a superimposed guide image is displayed on a frame-based image, this technique can realize SLAM for Augmented Reality (AR) navigation with a lower processing load. Fig. 27 shows an overview of the superimposed image generated for SLAM.
(2) Detection of inter-frame motion of an object to compensate for motion blur and to interpolate inter-frame images. Fig. 28 illustrates inter-frame compensation using an event-based image. This technique can reduce motion blur. In fig. 28, if the subject moves during the period between frame-based image 1 and frame-based image 2 and the image becomes blurred, the image can be interpolated using the acquired event-based images. This technique can also enable the reconstruction of high-speed movies that exceed the frame-rate limit by reconstructing one or more images between two consecutive frame-based images.
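One simple way to picture how events can fill the gap between two frames is the following per-pixel reconstruction sketch (a simplified model with an assumed contrast step per event; it only illustrates the idea, and is not the method claimed here):

```python
import math

# Per-pixel sketch of interpolating between two frame-based images using
# events: each ON/OFF event is treated as a fixed log-intensity step, so the
# intensity at any time between frame 1 and frame 2 can be estimated by
# accumulating the events seen up to that time.

C = 0.2  # assumed log-intensity step per event (contrast threshold)


def intensity_at(t, frame1_intensity, events):
    """Estimate pixel intensity at time t from frame 1 plus events (t_event, polarity)."""
    log_i = math.log(frame1_intensity)
    for t_event, polarity in events:
        if t_event <= t:
            log_i += C if polarity == "ON" else -C
    return math.exp(log_i)


if __name__ == "__main__":
    frame1 = 100.0  # pixel intensity in frame-based image 1 (arbitrary units)
    # Events for this pixel between frame 1 (t = 0) and frame 2 (t = 1).
    events = [(0.2, "ON"), (0.5, "ON"), (0.8, "OFF")]
    for t in (0.25, 0.5, 0.75, 1.0):
        print(f"t = {t:.2f}  estimated intensity = {intensity_at(t, frame1, events):6.1f}")
```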
The invention may be used with other types of hybrid sensors such as global shutter image sensors and rolling shutter image sensors.
The foregoing disclosure is only illustrative of the present invention and is not intended to limit the scope of the invention. It will be understood by those of ordinary skill in the art that all or a portion of the procedures for practicing the above-described embodiments and equivalent modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (9)

1. A pixel circuit, comprising:
a current change detector for detecting a change in photocurrent from the input port;
one or more pixel cells, each pixel cell comprising one or more photodiodes and a corresponding transfer gate transistor and a reset transistor, wherein a source of each transfer gate transistor is connected to a cathode of a corresponding photodiode and a drain of each transfer gate transistor is connected to a source of the reset transistor;
one or more selector circuits, each selector circuit responsive to a control signal to connect a drain of the reset transistor of one or more of the pixel cells to the input port of the current change detector or a fixed voltage.
2. The pixel circuit according to claim 1, wherein a clamping photodiode is used as the one or more photodiodes of the one or more pixel cells.
3. The pixel circuit of claim 1, wherein the current change detector and the one or more selector circuits are disposed on separate substrates.
4. A method of operating each pixel cell of claim 1, the method comprising:
connecting the drain of the reset transistor to an input port of the current change detector;
turning on the reset transistor and the transfer gate transistor to detect a change in light intensity as an event-based sensor mode;
connecting the drain of the reset transistor to the fixed voltage;
as a frame-based sensor mode, the transfer gate transistor is turned off to accumulate photoelectrons, and each transistor in the pixel unit is switched to output a signal corresponding to the amount of accumulated photoelectrons.
5. The method of operation of claim 4, further comprising: the mode is set separately for each set of one or more of the pixel cells connected to the same selector circuit.
6. A method of operating each pixel cell of claim 1, the method comprising:
connecting the drain of the reset transistor to an input port of the current change detector;
turning on the reset transistor and a portion of the transfer gate transistors as an event-based sensor mode to detect a change in light intensity at the photodiodes connected to the portion of the transfer gate transistors; turning off the remaining transfer gate transistors to accumulate photoelectrons generated in the photodiodes connected to the remaining transfer gate transistors as a frame-based sensor mode;
connecting the drain of the reset transistor to the fixed voltage;
each transistor in the pixel unit is switched to output a signal corresponding to the accumulated photoelectron amount.
7. The method of operation of claim 6, further comprising: a mode is set individually for one or more of the photodiodes of each group in the pixel unit.
8. An image sensor, comprising:
the pixel circuit according to claim 1;
a processor for reconstructing image data by reading out photoelectrons accumulated by the photodiode, the image data being acquired by the current change detector.
9. A camera system, the camera system comprising:
the pixel circuit according to claim 1;
a processor for reconstructing image data by reading out photoelectrons accumulated by the photodiode, the image data being acquired by the current change detector.
CN202080095165.5A 2020-02-10 2020-02-10 Hybrid pixel circuit for acquiring frame-based images and event-based images Pending CN115023945A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/074613 WO2021159231A1 (en) 2020-02-10 2020-02-10 Hybrid pixel circuit to capture frame based and event based images

Publications (1)

Publication Number Publication Date
CN115023945A (en) 2022-09-06

Family

ID=77291905

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080095165.5A Pending CN115023945A (en) 2020-02-10 2020-02-10 Hybrid pixel circuit for acquiring frame-based images and event-based images

Country Status (2)

Country Link
CN (1) CN115023945A (en)
WO (1) WO2021159231A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093986A1 (en) * 2021-11-25 2023-06-01 Telefonaktiebolaget Lm Ericsson (Publ) A monolithic image sensor, a camera module, an electronic device and a method for operating a camera module
WO2024008305A1 (en) * 2022-07-08 2024-01-11 Telefonaktiebolaget Lm Ericsson (Publ) An image sensor system, a camera module, an electronic device and a method for operating a camera module for detecting events using infrared

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
PT864223E (en) * 1996-09-27 2002-09-30 Markus Bohm Locally auto-adaptive optical sensor.
KR100279295B1 (en) * 1998-06-02 2001-02-01 윤종용 Active pixel sensor
GB2525625B (en) * 2014-04-29 2017-05-31 Isdi Ltd Device and method
EP3389259B1 (en) * 2015-12-07 2020-09-02 Panasonic Intellectual Property Management Co., Ltd. Solid-state image-capturing device and method for driving solid-state image-capturing device
KR102612194B1 (en) * 2016-12-14 2023-12-11 삼성전자주식회사 Event-based sensor and sensing method
CN107426513B (en) * 2017-07-25 2019-11-12 京东方科技集团股份有限公司 CMOS active pixel sensor and its driving method

Also Published As

Publication number Publication date
WO2021159231A1 (en) 2021-08-19

Similar Documents

Publication Publication Date Title
US20230097274A1 (en) Imaging device including photoelectric converters and capacitive element
US11350044B2 (en) Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus
JP5358136B2 (en) Solid-state imaging device
CN106954008B (en) Image pickup device and image pickup module
US8749675B2 (en) Solid state image pickup device and camera which can prevent color mixture
JP5188275B2 (en) Solid-state imaging device, driving method thereof, and imaging system
CN108389871B (en) Image pickup device
US7812873B2 (en) Image pickup device and image pickup system
JP7018593B2 (en) Imaging device and camera system
US20100079652A1 (en) Solid-state image pickup device and driving method therefor, and electronic apparatus
US10070079B2 (en) High dynamic range global shutter image sensors having high shutter efficiency
US20090290059A1 (en) Connection/separation element in photoelectric converter portion, solid-state imaging device, and imaging apparatus
JP2015056878A (en) Solid-state imaging device
US10873716B2 (en) Dual row control signal circuit for reduced image sensor shading
TW202205652A (en) Solid-state imaging device, method for manufacturing solid-state image device, and electronic apparatus
KR100801758B1 (en) Image sensor and controlling method thereof
CN115023945A (en) Hybrid pixel circuit for acquiring frame-based images and event-based images
Mabuchi et al. CMOS image sensors comprised of floating diffusion driving pixels with buried photodiode
US20050062866A1 (en) Multiplexed pixel column architecture for imagers
US10477126B1 (en) Dual eclipse circuit for reduced image sensor shading
US20240022800A1 (en) Image sensor and electronic device including the same
US20230232133A1 (en) Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus
US20220060646A1 (en) Solid-state imaging apparatus and electronic device
CN115442548A (en) Bit line control to support merged mode for pixel arrays with phase detection autofocus and image sensing photodiodes
CN115550574A (en) Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination