US20210265401A1 - Global shutter image sensor - Google Patents
- Publication number
- US20210265401A1 (U.S. patent application Ser. No. 17/236,433)
- Authority
- US
- United States
- Prior art keywords
- voltage
- charge
- photodiode
- pixel cell
- comparator
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H01L27/14605—Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
- H04N25/772—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising A/D, V/T, V/F, I/T or I/F converters
- H01L27/1461—Pixel-elements with integrated switching, control, storage or amplification elements characterised by the photosensitive area
- H04N25/59—Control of the dynamic range by controlling the amount of charge storable in the pixel, e.g. modification of the charge conversion ratio of the floating node capacitance
- H04N25/771—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising storage means other than floating diffusion
- H04N25/79—Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
- H04N5/3559
- H04N5/37452
Abstract
In one example, a method comprises: within an integration period: enabling a photodiode of a pixel cell to accumulate charge responsive to incident light, and transferring the charge from the photodiode to a charge storage device of the pixel cell. The method further comprises: performing, using a sampling capacitor, a sample-and-hold operation to convert the charge stored in the charge storage device into a voltage; and generating a digital output based on the voltage to represent an intensity of the incident light received by the photodiode.
Description
- This patent application is a continuation of U.S. Non-Provisional patent application Ser. No. 16/369,763, filed Mar. 29, 2019, entitled “GLOBAL SHUTTER IMAGE SENSOR”, which claims priority to U.S. Provisional Patent Application Ser. No. 62/652,220, filed Apr. 3, 2018, entitled “GLOBAL SHUTTER IMAGE SENSOR,” which is assigned to the assignee hereof and is incorporated herein by reference in its entirety for all purposes.
- The disclosure relates generally to image sensors, and more specifically to pixel cell structure including interfacing circuits to determine light intensity for image generation.
- A typical image sensor includes an array of photodiodes to sense incident light by converting photons into charge (e.g., electrons or holes). To reduce image distortion, a global shutter operation can be performed in which each photodiode of the array of photodiodes senses the incident light simultaneously to generate charge. The charge generated by the array of photodiodes can then be quantized by an analog-to-digital converter (ADC) into digital values to generate the image.
- The present disclosure relates to image sensors. More specifically, and without limitation, this disclosure relates to a pixel cell. This disclosure also relates to operating the circuitries of pixel cells to generate a digital representation of the intensity of incident light.
- In one example, a pixel cell is provided. The pixel cell includes a first semiconductor die, the first semiconductor die including a photodiode and a charge sensing device. The pixel cell further includes a sampling capacitor, and a second semiconductor die forming a stack with the first semiconductor die, the second semiconductor die including an interface circuit coupled with the photodiode, the charge sensing device, and the sampling capacitor. The interface circuit is configured to: enable the photodiode to accumulate charge responsive to incident light within an integration period; transfer the charge from the photodiode to the charge sensing device; perform, using the sampling capacitor, a sample-and-hold operation to convert the charge in the charge sensing device into a voltage; and generate a digital output based on the voltage to represent an intensity of the incident light received by the photodiode.
- In some aspects, the pixel cell further includes a sampling switch coupled between the charge sensing device and the sampling capacitor. The interface circuit is configured to, as part of the sample-and-hold operation: enable the sampling switch to cause the sampling capacitor to sample the charge accumulated in the charge sensing device to develop the voltage; and disable the sampling switch to cause the sampling capacitor to hold the voltage.
- In some aspects, the voltage is a first voltage. The charge sensing device is configured to output a second voltage based on the stored charge. The pixel cell further includes a voltage buffer coupled between the charge sensing device and the sampling capacitor and configured to buffer the second voltage to output the first voltage to the sampling capacitor. The sampling capacitor is operated to sample the first voltage received from the voltage buffer when the sampling switch is enabled, and to hold the first voltage after the sampling switch is disabled.
- In some aspects, the sampling switch and the voltage buffer are included in the first semiconductor die.
- In some aspects, the sampling capacitor includes at least one of: a metal capacitor or a semiconductor capacitor sandwiched between the first semiconductor die and the second semiconductor die in the stack, or a metal capacitor or a semiconductor capacitor formed in the second semiconductor die.
- In some aspects, the interface circuit further comprises a resettable comparator. The pixel cell further comprises an AC capacitor coupled between the sampling capacitor and the comparator. The interface circuit is configured to, when the sampling switch is enabled: control the comparator to enter a reset state; operate the AC capacitor to: obtain a first sample of a reset voltage of the charge sensing device caused by a prior reset operation of the charge sensing device; obtain a second sample of an offset of the comparator when the comparator is in the reset state; store a third voltage across the AC capacitor based on the first sample of the reset voltage and the second sample of the offset; and output a fourth voltage to the comparator based on the first voltage and the third voltage. The digital output is generated based on the fourth voltage.
- In some aspects, the pixel cell further comprises a transfer switch coupled between the photodiode and the charge sensing device. The interface circuit is configured to: control the comparator to exit the reset state to hold the third voltage across the AC capacitor; enable the transfer switch to transfer the charge from the photodiode to the charge sensing device, wherein the transfer of the charge develops the first voltage at the sampling capacitor; and disable the transfer switch to stop the transfer of the charge, wherein the disabling of the transfer switch causes the sampling capacitor to hold the first voltage and the AC capacitor to hold the fourth voltage for the generation of the digital output.
- In some aspects, an output of the comparator of the pixel cell is coupled with a memory. The memory is coupled with a counter configured to update a count value periodically based on a clock. The comparator is configured to, after the transfer switch is disabled, compare the fourth voltage against a ramping threshold to output a decision. The memory is configured to store the count value from the counter based on the decision. The stored count value represents the digital output.
- In some aspects, the pixel cell further comprises a selection switch coupled between the output of the comparator and the memory. The interface circuit is configured to: enable the selection switch to transmit the decision to the memory when the pixel cell is selected to store the digital output in the memory; and disable the selection switch to block the decision from the memory when the pixel cell is not selected to store the digital output in the memory.
- In some aspects, the memory and the counter are included in the second semiconductor die.
- In some aspects, the pixel cell further comprises a shutter switch coupled between the photodiode and a charge sink. The interface circuit is configured to: disable the shutter switch to start the integration period and to enable the photodiode to accumulate the charge, and enable the shutter switch to end the integration period and to prevent the photodiode from accumulating the charge.
- In some aspects, the charge sensing device comprises at least one of: a floating drain node, or a pinned storage node.
- In some examples, an image sensor is provided. The image sensor comprises a first semiconductor die, the first semiconductor die including an array of light sensing circuits, each light sensing circuit of the array of light sensing circuits comprising a photodiode and a charge sensing device. The image sensor further comprises an array of sampling capacitors, each sampling capacitor of the array of sampling capacitors corresponding to a light sensing circuit of the array of light sensing circuits. The image sensor further comprises a second semiconductor die forming a stack with the first semiconductor die, the second semiconductor die including an array of interface circuits, each interface circuit of the array of interface circuits, each light sensing circuit of the array of light sensing circuits, and each sampling capacitor of the array of sampling capacitors forming a pixel cell. Each interface circuit of the each pixel cell is configured to: enable the photodiode of the corresponding light sensing circuit to accumulate charge responsive to incident light within a global integration period; transfer the charge from the photodiode to the charge sensing device of the corresponding light sensing circuit; perform, using the corresponding sampling capacitor, a sample-and-hold operation on the charge stored in the charge sensing device to obtain a voltage; and generate a digital output based on the voltage to represent an intensity of the incident light received by the corresponding pixel cell.
- In some aspects, in the each pixel cell: the light sensing circuit further includes a sampling switch coupled between the charge sensing device and the sampling capacitor. The interface circuit is configured to, as part of the sample-and-hold operation: enable the sampling switch to cause the sampling capacitor to sample the charge stored in the charge sensing device to develop the voltage; and disable the sampling switch to cause the sampling capacitor to hold the voltage.
- In some aspects, in the each pixel cell: the voltage is a first voltage. The charge sensing device is configured to output a second voltage based on the stored charge. The light sensing circuit further includes a voltage buffer coupled between the charge sensing device and the sampling capacitor and configured to buffer the second voltage to output the first voltage to the sampling capacitor. The sampling capacitor is operated to sample the first voltage received from the voltage buffer when the sampling switch is enabled, and to hold the first voltage after the sampling switch is disabled.
- In some aspects, in the each pixel cell: the each interface circuit further comprises a resettable comparator. The each light sensing circuit further comprises an AC capacitor coupled between the sampling capacitor and the comparator. The each interface circuit is configured to, when the sampling switch is enabled: control the comparator to enter a reset state; operate the AC capacitor to: obtain a first sample of a reset voltage of the charge sensing device caused by a prior reset operation of the charge sensing device; obtain a second sample of an offset of the comparator when the comparator is in the reset state; store a third voltage across the AC capacitor based on the first sample of the reset voltage and the second sample of the offset; and output a fourth voltage to the comparator based on the first voltage and the third voltage. The digital output is generated based on the fourth voltage.
- In some aspects, the each light sensing circuit further comprises a transfer switch coupled between the photodiode and the charge sensing device. The each interface circuit is configured to: control the comparator to exit the reset state to hold the third voltage across the AC capacitor; enable the transfer switch to transfer the charge from the photodiode to the charge sensing device, wherein the transfer of the charge develops the first voltage at the sampling capacitor; and disable the transfer switch to stop the transfer of the charge, wherein the disabling of the transfer switch causes the sampling capacitor to hold the first voltage and the AC capacitor to hold the fourth voltage for the generation of the digital output.
- In some aspects, the image sensor further includes a controller, a counter, and a bank of memory buffers. Each memory buffer of the bank of memory buffers is coupled with the counter. The counter is configured to update a count value periodically based on a clock. An output of the comparator of the each interface circuit is coupled to the each memory buffer via a selection switch controlled by the controller. The comparator is configured to, after the transfer switch is disabled, compare the fourth voltage against a ramping threshold to generate a decision. The controller is configured to, at different times, enable the selection switches of subsets of the pixel cells to transmit the decisions of the comparators of the selected subsets of the pixel cells to the bank of memory buffers. The bank of memory buffers is configured to store the count values from the counter based on the decisions of the selected subsets of the pixel cells at the different times. The stored count values represent the digital outputs of the pixel cells.
- In some examples, a method is provided. The method comprises: enabling, by an interface circuit, a photodiode of a light sensing circuit to accumulate charge responsive to incident light within an integration period, wherein the light sensing circuit and the interface circuit are in, respectively, a first semiconductor die and a second semiconductor die forming a stack; transferring, by the interface circuit, the charge from the photodiode to a charge sensing device of the light sensing circuit; performing, by the interface circuit and using a sampling capacitor, a sample-and-hold operation to convert the charge stored in the charge sensing device into a voltage; and generating, by the interface circuit, a digital output based on the voltage to represent an intensity of the incident light received by the photodiode.
- In some aspects, the method further comprises: comparing the voltage with a ramping threshold to output a decision; controlling a memory to store a count value from a counter based on the decision; and providing the count value as the digital output. The memory and the counter are in the second semiconductor die.
- Illustrative embodiments are described with reference to the following figures.
- FIG. 1A and FIG. 1B are diagrams of an embodiment of a near-eye display.
- FIG. 2 is an embodiment of a cross section of the near-eye display.
- FIG. 3 illustrates an isometric view of an embodiment of a waveguide display with a single source assembly.
- FIG. 4 illustrates a cross section of an embodiment of the waveguide display.
- FIG. 5 is a block diagram of an embodiment of a system including the near-eye display.
- FIG. 6A, FIG. 6B, and FIG. 6C illustrate examples of a pixel cell and its operations.
- FIG. 7A, FIG. 7B, FIG. 7C, FIG. 7D, and FIG. 7E illustrate examples of a pixel cell and its operations.
- FIG. 8A, FIG. 8B, FIG. 8C, FIG. 8D, and FIG. 8E illustrate an example of an image sensor and its operations.
- FIG. 9 illustrates a flowchart of an example process for measuring light intensity.
- The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated may be employed without departing from the principles, or benefits touted, of this disclosure.
- In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
- In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of certain inventive embodiments. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive.
- A typical image sensor includes an array of pixel cells. Each pixel cell includes a photodiode to sense incident light by converting photons into charge (e.g., electrons or holes). The charge generated by photodiodes of the array of pixel cells can then be quantized by an analog-to-digital converter (ADC) into digital values. The ADC can quantize the charge by, for example, using a comparator to compare a voltage representing the charge with one or more quantization levels, and a digital value can be generated based on the comparison result. The digital values can then be stored in a memory to generate the image.
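As a concrete model of this comparator-based quantization, the sketch below (illustrative only; the function name and level values are hypothetical, not from the patent) assigns a digital code by comparing a pixel voltage against a list of quantization levels:

```python
# Illustrative sketch of comparator-based quantization: the digital code is
# the number of quantization levels the pixel voltage crosses.

def quantize(voltage, levels):
    """Return the index of the highest quantization level the voltage crosses."""
    code = 0
    for i, level in enumerate(levels):
        if voltage >= level:
            code = i + 1  # comparator decision: voltage crosses this level
    return code

# 2-bit example: three levels dividing a 0-1 V full scale into four codes.
levels = [0.25, 0.5, 0.75]
print(quantize(0.6, levels))  # -> 2
```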
- Due to power and chip area limitations, the ADC and the memory are typically shared by at least some of the pixel cells, instead of providing a dedicated ADC and memory to each pixel cell. For example, the array of pixel cells can be divided into multiple groups (e.g., rows or columns of pixel cells), with the pixel cells of each group sharing an ADC and a memory. To accommodate this sharing, a rolling shutter operation can be performed in which each pixel cell within a group takes a turn to be exposed to incident light to generate charge, followed by accessing the ADC to quantize the charge into a digital value and storing the digital value in the memory. Because the rolling shutter operation exposes different pixel cells to incident light at different times, an image generated from a rolling shutter operation can exhibit distortion, especially for images of a moving object and/or images captured while the image sensor is moving. This potential distortion makes the rolling shutter operation unsuitable for augmented reality/mixed reality/virtual reality (AR/MR/VR) applications, wearable applications, etc., in which the image sensor can be part of a headset and can be in motion while capturing images.
- To reduce image distortion, a global shutter operation can be performed in which each pixel cell of the array of pixel cells is exposed to incident light to generate charge simultaneously within a global shutter period (or a global integration period). Each pixel cell can include a charge sensing device to temporarily store the charge generated by the photodiode. When the pixel cell is granted access to the ADC and to the memory, the pixel cell can provide the charge from the charge sensing device to the ADC to perform the quantization to generate the digital value, and then store the digital value in the memory.
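The timing difference between the two shutter schemes can be sketched as follows (a behavioral model with assumed row time and integration time; names and values are illustrative, not circuitry from the patent):

```python
# Behavioral sketch contrasting rolling- and global-shutter exposure windows.
# In a rolling shutter, row i starts integrating t_row seconds after row i-1;
# in a global shutter, every row shares one simultaneous window.

def exposure_windows(num_rows, t_int, t_row, global_shutter):
    """Return (start, end) integration times per row, in seconds."""
    if global_shutter:
        return [(0.0, t_int)] * num_rows  # all rows integrate simultaneously
    return [(i * t_row, i * t_row + t_int) for i in range(num_rows)]

rolling = exposure_windows(num_rows=4, t_int=1e-3, t_row=10e-6, global_shutter=False)
glob = exposure_windows(num_rows=4, t_int=1e-3, t_row=10e-6, global_shutter=True)
# The last rolling row starts 30 us after the first, so a moving scene is
# sampled at different times per row; the global rows are aligned.
```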
- A charge sensing device can be implemented in a pixel cell in various ways, such as a floating drain node, a pinned storage node (of a pinned diode), etc. But a charge sensing device implemented using these techniques is susceptible to noise charge, which can degrade the correlation between the charge stored in the charge sensing device and the incident light intensity, and reduce the sensitivity of the pixel cell. For example, a floating drain node configured as a charge sensing device is susceptible to dark current: leakage current generated at the p-n junction of a semiconductor device due to crystallographic defects. The dark current can flow into the charge sensing device and add to the charge generated by the photodiode. As another example, a pinned storage node can generate charge when photons of the incident light penetrate the semiconductor substrate of the pixel cell and reach the pinned storage node. The charge generated by the charge sensing device adds to the charge generated by the photodiode. In both cases, the charge stored in the charge sensing device differs from the charge generated by the photodiode during the global shutter period, and the quantization result of the charge in the charge sensing device may not accurately represent the incident light intensity.
- Other noise sources can further degrade the accuracy of the incident light intensity representation. For example, during the reset of the charge sensing device between measurements, thermal noise (as well as other noise, such as 1/f noise) can be injected into the charge sensing device as reset noise, adding charge to the floating node that does not reflect the incident light intensity. Moreover, as discussed above, the quantization process typically uses a comparator to compare a voltage representing the charge with one or more quantization levels, with a digital value generated based on the results of the comparisons. The comparator offset can also lead to errors in the comparisons with the quantization levels, which in turn introduce errors in the digital value.
- This disclosure relates to a pixel cell that can improve a global shutter operation. In one example, a pixel cell can include a first semiconductor die, a sampling capacitor, and a second semiconductor die. The first semiconductor die includes a light receiving surface, a photodiode to receive incident light via the light receiving surface, and a charge sensing device to accumulate charge generated by the photodiode. The second semiconductor die forms a stack with the first semiconductor die and includes an interface circuit coupled with the sampling capacitor, the photodiode, and the charge sensing device. The sampling capacitor may include a metal capacitor sandwiched between the first and second semiconductor dies within the stack, or may include a device capacitor formed in the second semiconductor die. The charge sensing device may include a floating drain node, a pinned storage node, etc.
- To perform sensing of incident light, the interface circuit can expose the photodiode to the incident light within an integration period to cause the photodiode to generate charge. The interface circuit can perform, using the sampling capacitor, a sample-and-hold operation on the charge accumulated in the charge sensing device within the integration period to obtain a voltage. More specifically, the pixel cell can include a sampling switch coupled between the charge sensing device and the sampling capacitor to support the sample-and-hold operation. The interface circuit can enable the sampling switch to cause the sampling capacitor to sample the charge accumulated in the charge sensing device to develop the voltage, and then disable the sampling switch to cause the sampling capacitor to hold the voltage. The voltage held at the sampling capacitor, after the sampling switch is disabled, can be quantized to generate the digital output.
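The sample-and-hold behavior described above can be sketched as follows (a behavioral model, not the patent's analog circuit; the class and method names are hypothetical):

```python
# Behavioral sketch of a sample-and-hold stage: while the sampling switch is
# enabled the capacitor tracks the charge-sensing-device voltage; once the
# switch is disabled the capacitor holds the last sampled value for quantization.

class SampleAndHold:
    def __init__(self):
        self.switch_on = False
        self.held_voltage = 0.0

    def enable(self):   # sample phase: capacitor tracks the input
        self.switch_on = True

    def disable(self):  # hold phase: capacitor freezes its voltage
        self.switch_on = False

    def update(self, input_voltage):
        if self.switch_on:
            self.held_voltage = input_voltage
        return self.held_voltage

sh = SampleAndHold()
sh.enable()
sh.update(0.42)        # tracks the sense-node voltage
sh.disable()
print(sh.update(0.9))  # -> 0.42, later input changes no longer affect the output
```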
- The interface circuit of the pixel cell may include a comparator to perform the quantization. The comparator can be coupled with a memory and a counter, both of which can be external to the pixel cell. The counter can update a count value periodically based on a clock. The comparator can compare the voltage held at the sampling capacitor against a ramping threshold to generate a decision. Based on the decision, the memory can store the count value from the counter. The count value stored in the memory can be the digital output.
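The counter-and-memory scheme described above amounts to a single-slope (ramp) ADC, sketched behaviorally below (the 0-1 V range and 256 codes are illustrative assumptions, not values from the patent):

```python
# Single-slope (ramp) ADC sketch: a counter increments with a ramping
# threshold, and the memory latches the count when the comparator trips.

def ramp_adc(pixel_voltage, v_max=1.0, num_codes=256):
    """Latch the counter value at the comparator's decision point."""
    memory = num_codes - 1                      # default if the ramp never crosses
    for count in range(num_codes):              # counter updates each clock cycle
        ramp = v_max * count / (num_codes - 1)  # ramping threshold
        if ramp >= pixel_voltage:               # comparator decision flips
            memory = count                      # memory stores the current count
            break
    return memory

print(ramp_adc(0.5))  # -> 128 for a 0-1 V ramp over 256 codes
```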
- In some examples, to further improve the accuracy of sensing of the incident light, an AC capacitor can be provided between the sampling capacitor and the comparator to store a second voltage representing the reset noise introduced to the charge sensing device and the offset of the comparator. The AC capacitor can also include a metal capacitor sandwiched between the first and second semiconductor dies within the stack. The AC capacitor can combine the second voltage with the voltage held at the sampling capacitor (“a first voltage”) to output a third voltage to the comparator, with the reset noise component removed from the third voltage as a result of the combination. The comparator can compare the third voltage with the ramping threshold to perform the quantization operation, in which the comparator offset component in the third voltage can compensate for the actual offset of the comparator.
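The noise cancellation described above can be checked with a worked numeric sketch (all voltage values are assumed for illustration; the actual operation is analog charge redistribution, not arithmetic in software):

```python
# Numeric sketch of AC-capacitor noise cancellation. During reset, the
# capacitor stores the difference between the noisy reset voltage and the
# comparator's offset-shifted trip point; after charge transfer, that stored
# difference subtracts out of the comparator input.

v_reset, v_noise = 1.0, 0.02   # reset level plus sampled reset (kTC) noise
v_ref, v_offset = 0.5, 0.01    # comparator trip point and its offset
v_signal = 0.3                 # voltage drop caused by photodiode charge

# Sample phase: voltage stored across the AC capacitor ("second voltage")
v_cap = (v_reset + v_noise) - (v_ref + v_offset)

# Hold phase: sampling-capacitor voltage after charge transfer ("first voltage")
v_held = v_reset + v_noise - v_signal

# Comparator input ("third voltage"): the reset noise cancels, and the stored
# offset term counteracts the comparator's actual offset in the comparison.
v_comp_in = v_held - v_cap     # == v_ref + v_offset - v_signal
print(round(v_comp_in, 6))     # -> 0.21, independent of v_noise
```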
- The disclosed techniques can improve light sensing in numerous ways. First, the sampling capacitor can provide an additional charge sensing device to store the charge generated by the photodiode. The sampling capacitor can also be less susceptible to noise charge. For example, the sampling capacitor can be a metal capacitor which is less susceptible to dark current due to crystallographic defects and which does not generate charge when receiving photons. Combined with the techniques of pre-storing the reset noise and the comparator offset in an AC capacitor to reduce the effect of the reset noise and comparator offset on the quantization operation as described above, the accuracy of the light sensing operation, and the fidelity of the image generation operation, can be substantially improved.
- The disclosed techniques can also reduce the footprint of the pixel cells, which allows packing a large number of pixel cells in an image sensor to improve resolution while minimizing the footprint of the image sensor. For example, by stacking the photodiode with the processing circuit to form a pixel cell, and by putting the memory external to the pixel cell, the footprint of the pixel cell can be reduced. Moreover, by forming the sampling capacitor and the AC capacitor between the semiconductor dies, these capacitors do not cover the light receiving surface, which can maximize the available pixel cell area for the light receiving surface and allows the footprint of the pixel cell to be further reduced. With the disclosed techniques, a high resolution image sensor with a small footprint can be achieved, which is especially useful for applications on a wearable device (e.g., a headset) where available space is very limited.
- The disclosed techniques can also improve reliability and speed of image generation. For example, as the memory is positioned outside the pixel cell and does not affect the footprint of the pixel cell, redundant memory devices can be provided to store the digital outputs from each pixel cell to reduce the likelihood of losing the digital outputs (and the pixel values) due to defective memory. Since the memory comprises mostly digital circuits and typically has a very small footprint, adding redundant memory (to be shared by the pixel cells) typically does not significantly increase the footprint of the image sensor. Moreover, compared with an implementation where the pixel cell transmits an analog voltage (e.g., a voltage at the charge sensing device) to an external ADC to perform the quantization operation, the disclosed techniques allow a part of the quantization operation (the comparator comparison) to be performed within the pixel cell, so that only a digital output (the decision of the comparator) is transmitted from the pixel cell to the external memory. Compared with an analog voltage, the digital output can be transmitted with high fidelity (to distinguish between zeroes and ones) and at high speed. All these can improve the reliability and speed of image generation based on the light sensing operations by the pixel cells.
- The disclosed techniques may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
-
FIG. 1A is a diagram of an embodiment of a near-eye display 100. Near-eye display 100 presents media to a user. Examples of media presented by near-eye display 100 include one or more images, video, and/or audio. In some embodiments, audio is presented via an external device (e.g., speakers and/or headphones) that receives audio information from the near-eye display 100, a console, or both, and presents audio data based on the audio information. Near-eye display 100 is generally configured to operate as a virtual reality (VR) display. In some embodiments, near-eye display 100 is modified to operate as an augmented reality (AR) display and/or a mixed reality (MR) display. - Near-
eye display 100 includes a frame 105 and a display 110. Frame 105 is coupled to one or more optical elements. Display 110 is configured for the user to see content presented by near-eye display 100. In some embodiments, display 110 comprises a waveguide display assembly for directing light from one or more images to an eye of the user. - Near-
eye display 100 further includes image sensors 120a, 120b, 120c, and 120d. Each of image sensors 120a-120d may include a pixel array configured to generate image data representing different fields of view along different directions. For example, sensor 120c may be configured to provide image data representing a field of view towards a direction B along the X axis, and sensor 120d may be configured to provide image data representing a field of view towards a direction C along the X axis. - In some embodiments, sensors 120a-120d can be configured as input devices to control or influence the display content of the near-
eye display 100, to provide an interactive VR/AR/MR experience to a user who wears near-eye display 100. For example, sensors 120a-120d can generate physical image data of a physical environment in which the user is located. The physical image data can be provided to a location tracking system to track a location and/or a path of movement of the user in the physical environment. A system can then update the image data provided to display 110 based on, for example, the location and orientation of the user, to provide the interactive experience. In some embodiments, the location tracking system may operate a SLAM algorithm to track a set of objects in the physical environment and within a field of view of the user as the user moves within the physical environment. The location tracking system can construct and update a map of the physical environment based on the set of objects, and track the location of the user within the map. By providing image data corresponding to multiple fields of view, sensors 120a-120d can provide the location tracking system a more holistic view of the physical environment, which can lead to more objects being included in the construction and updating of the map. With such an arrangement, the accuracy and robustness of tracking a location of the user within the physical environment can be improved. - In some embodiments, near-
eye display 100 may further include one or more active illuminators 130 to project light into the physical environment. The light projected can be associated with different frequency spectrums (e.g., visible light, infra-red light, ultra-violet light, etc.), and can serve various purposes. For example, illuminator 130 may project light in a dark environment (or in an environment with low intensity of infra-red light, ultra-violet light, etc.) to assist sensors 120a-120d in capturing images of different objects within the dark environment to, for example, enable location tracking of the user. Illuminator 130 may project certain markers onto the objects within the environment, to assist the location tracking system in identifying the objects for map construction/updating. - In some embodiments,
illuminator 130 may also enable stereoscopic imaging. For example, one or more of sensors 120a-120d may include a pixel array for infra-red (IR) light sensing (an IR pixel array). Illuminator 130 may project a set of IR markers on the object, the images of which can be captured by the IR pixel array. Based on a distribution of the IR markers of the object as shown in the image, the system can estimate a distance of different parts of the object from the IR pixel array, and generate a stereoscopic image of the object based on the distances. Based on the stereoscopic image of the object, the system can determine, for example, a relative position of the object with respect to the user, and can update the image data provided to display 110 based on the relative position information to provide the interactive experience. - As discussed above, near-
eye display 100 may be operated in environments associated with a very wide range of light intensities. For example, near-eye display 100 may be operated in an indoor environment or in an outdoor environment, and/or at different times of the day. Near-eye display 100 may also operate with or without active illuminator 130 being turned on. As a result, image sensors 120a-120d may need to have a wide dynamic range to be able to operate properly (e.g., to generate an output that correlates with the intensity of incident light) across a very wide range of light intensities associated with different operating environments for near-eye display 100. -
FIG. 1B is a diagram of another embodiment of near-eye display 100. FIG. 1B illustrates a side of near-eye display 100 that faces the eyeball(s) 135 of the user who wears near-eye display 100. As shown in FIG. 1B, near-eye display 100 may further include a plurality of illuminators. Near-eye display 100 further includes a plurality of image sensors 150a and 150b. Some of the illuminators may emit light of a certain pattern towards one eyeball of the user, and the light can be reflected by the eyeball. Sensor 150a may include a pixel array to receive the reflected light and generate an image of the reflected pattern. Similarly, other illuminators may emit light of the same pattern towards the other eyeball, with the reflected pattern received by sensor 150b. Sensor 150b may also include a pixel array to generate an image of the reflected pattern. Based on the images of the reflected pattern from sensors 150a and 150b, the system can determine a gaze point of the user. - As discussed above, to avoid damaging the eyeballs of the user,
the illuminators typically output light of very low intensity. In a case where image sensors 150a and 150b comprise the same sensor devices as image sensors 120a-120d of FIG. 1A, the image sensors 120a-120d may need to be able to generate an output that correlates with the intensity of incident light when the intensity of the incident light is very low, which may further increase the dynamic range requirement of the image sensors. - Moreover, the image sensors 120a-120d may need to be able to generate an output at a high speed to track the movements of the eyeballs. For example, a user's eyeball can perform a very rapid movement (e.g., a saccade movement) in which there can be a quick jump from one eyeball position to another. To track the rapid movement of the user's eyeball, image sensors 120a-120d need to generate images of the eyeball at high speed. For example, the rate at which the image sensors generate an image frame (the frame rate) needs to at least match the speed of movement of the eyeball. The high frame rate requires a short total exposure time for all of the pixel cells involved in generating the image frame, as well as high speed for converting the sensor outputs into digital values for image generation. Moreover, as discussed above, the image sensors also need to be able to operate in an environment with low light intensity.
-
FIG. 2 is an embodiment of a cross section 200 of near-eye display 100 illustrated in FIG. 1. Display 110 includes at least one waveguide display assembly 210. An exit pupil 230 is a location where a single eyeball 220 of the user is positioned in an eyebox region when the user wears the near-eye display 100. For purposes of illustration, FIG. 2 shows the cross section 200 associated with eyeball 220 and a single waveguide display assembly 210, but a second waveguide display is used for a second eye of the user. -
Waveguide display assembly 210 is configured to direct image light to an eyebox located at exit pupil 230 and to eyeball 220. Waveguide display assembly 210 may be composed of one or more materials (e.g., plastic, glass, etc.) with one or more refractive indices. In some embodiments, near-eye display 100 includes one or more optical elements between waveguide display assembly 210 and eyeball 220. - In some embodiments,
waveguide display assembly 210 includes a stack of one or more waveguide displays including, but not restricted to, a stacked waveguide display, a varifocal waveguide display, etc. The stacked waveguide display is a polychromatic display (e.g., a red-green-blue (RGB) display) created by stacking waveguide displays whose respective monochromatic sources are of different colors. The stacked waveguide display is also a polychromatic display that can be projected on multiple planes (e.g., a multi-planar colored display). In some configurations, the stacked waveguide display is a monochromatic display that can be projected on multiple planes (e.g., a multi-planar monochromatic display). The varifocal waveguide display is a display that can adjust a focal position of image light emitted from the waveguide display. In alternate embodiments, waveguide display assembly 210 may include the stacked waveguide display and the varifocal waveguide display. -
FIG. 3 illustrates an isometric view of an embodiment of a waveguide display 300. In some embodiments, waveguide display 300 is a component (e.g., waveguide display assembly 210) of near-eye display 100. In some embodiments, waveguide display 300 is part of some other near-eye display or other system that directs image light to a particular location. -
Waveguide display 300 includes a source assembly 310, an output waveguide 320, and a controller 330. For purposes of illustration, FIG. 3 shows the waveguide display 300 associated with a single eyeball 220, but in some embodiments, another waveguide display separate, or partially separate, from the waveguide display 300 provides image light to another eye of the user. -
Source assembly 310 generates image light 355. Source assembly 310 generates and outputs image light 355 to a coupling element 350 located on a first side 370-1 of output waveguide 320. Output waveguide 320 is an optical waveguide that outputs expanded image light 340 to an eyeball 220 of a user. Output waveguide 320 receives image light 355 at one or more coupling elements 350 located on the first side 370-1 and guides received input image light 355 to a directing element 360. In some embodiments, coupling element 350 couples the image light 355 from source assembly 310 into output waveguide 320. Coupling element 350 may be, e.g., a diffraction grating, a holographic grating, one or more cascaded reflectors, one or more prismatic surface elements, and/or an array of holographic reflectors. - Directing
element 360 redirects the received input image light 355 to decoupling element 365 such that the received input image light 355 is decoupled out of output waveguide 320 via decoupling element 365. Directing element 360 is part of, or affixed to, first side 370-1 of output waveguide 320. Decoupling element 365 is part of, or affixed to, second side 370-2 of output waveguide 320, such that directing element 360 is opposed to the decoupling element 365. Directing element 360 and/or decoupling element 365 may be, e.g., a diffraction grating, a holographic grating, one or more cascaded reflectors, one or more prismatic surface elements, and/or an array of holographic reflectors. - Second side 370-2 represents a plane along an x-dimension and a y-dimension.
Output waveguide 320 may be composed of one or more materials that facilitate total internal reflection of image light 355. Output waveguide 320 may be composed of, e.g., silicon, plastic, glass, and/or polymers. Output waveguide 320 has a relatively small form factor. For example, output waveguide 320 may be approximately 50 mm wide along the x-dimension, 30 mm long along the y-dimension, and 0.5-1 mm thick along the z-dimension. -
Controller 330 controls scanning operations of source assembly 310. The controller 330 determines scanning instructions for the source assembly 310. In some embodiments, the output waveguide 320 outputs expanded image light 340 to the user's eyeball 220 with a large field of view (FOV). For example, the expanded image light 340 is provided to the user's eyeball 220 with a diagonal FOV (in x and y) of 60 degrees and/or greater and/or 150 degrees and/or less. The output waveguide 320 is configured to provide an eyebox with a length of 20 mm or greater and/or equal to or less than 50 mm, and/or a width of 10 mm or greater and/or equal to or less than 50 mm. - Moreover,
controller 330 also controls image light 355 generated by source assembly 310, based on image data provided by image sensor 370. Image sensor 370 may be located on first side 370-1 and may include, for example, image sensors 120a-120d of FIG. 1A to generate image data of a physical environment in front of the user (e.g., for location determination). Image sensor 370 may also be located on second side 370-2 and may include image sensors 150a and 150b of FIG. 1B to generate image data of eyeball 220 (e.g., for gaze point determination) of the user. Image sensor 370 may interface with a remote console that is not located within waveguide display 300. Image sensor 370 may provide image data to the remote console, which may determine, for example, a location of the user, a gaze point of the user, etc., and determine the content of the images to be displayed to the user. The remote console can transmit instructions to controller 330 related to the determined content. Based on the instructions, controller 330 can control the generation and outputting of image light 355 by source assembly 310. -
FIG. 4 illustrates an embodiment of a cross section 400 of the waveguide display 300. The cross section 400 includes source assembly 310, output waveguide 320, and image sensor 370. In the example of FIG. 4, image sensor 370 may include a set of pixel cells 402 located on first side 370-1 to generate an image of the physical environment in front of the user. In some embodiments, there can be a mechanical shutter 404 interposed between the set of pixel cells 402 and the physical environment to control the exposure of the set of pixel cells 402. In some embodiments, the mechanical shutter 404 can be replaced by an electronic shutter gate, as discussed below. Each of pixel cells 402 may correspond to one pixel of the image. Although not shown in FIG. 4, it is understood that each of pixel cells 402 may also be overlaid with a filter to control the frequency range of the light to be sensed by the pixel cells. - After receiving instructions from the remote console,
mechanical shutter 404 can open and expose the set of pixel cells 402 in an exposure period. During the exposure period, image sensor 370 can obtain samples of light incident on the set of pixel cells 402, and generate image data based on an intensity distribution of the incident light samples detected by the set of pixel cells 402. Image sensor 370 can then provide the image data to the remote console, which determines the display content and provides the display content information to controller 330. Controller 330 can then determine image light 355 based on the display content information. -
Source assembly 310 generates image light 355 in accordance with instructions from the controller 330. Source assembly 310 includes a source 410 and an optics system 415. Source 410 is a light source that generates coherent or partially coherent light. Source 410 may be, e.g., a laser diode, a vertical cavity surface emitting laser, and/or a light emitting diode. -
Optics system 415 includes one or more optical components that condition the light from source 410. Conditioning light from source 410 may include, e.g., expanding, collimating, and/or adjusting orientation in accordance with instructions from controller 330. The one or more optical components may include one or more lenses, liquid lenses, mirrors, apertures, and/or gratings. In some embodiments, optics system 415 includes a liquid lens with a plurality of electrodes that allows scanning of a beam of light with a threshold value of scanning angle to shift the beam of light to a region outside the liquid lens. Light emitted from the optics system 415 (and also source assembly 310) is referred to as image light 355. -
Output waveguide 320 receives image light 355. Coupling element 350 couples image light 355 from source assembly 310 into output waveguide 320. In embodiments where coupling element 350 is a diffraction grating, a pitch of the diffraction grating is chosen such that total internal reflection occurs in output waveguide 320, and image light 355 propagates internally in output waveguide 320 (e.g., by total internal reflection), toward decoupling element 365. - Directing
element 360 redirects image light 355 toward decoupling element 365 for decoupling from output waveguide 320. In embodiments where directing element 360 is a diffraction grating, the pitch of the diffraction grating is chosen to cause incident image light 355 to exit output waveguide 320 at angle(s) of inclination relative to a surface of decoupling element 365. - In some embodiments, directing
element 360 and/or decoupling element 365 are structurally similar. Expanded image light 340 exiting output waveguide 320 is expanded along one or more dimensions (e.g., may be elongated along the x-dimension). In some embodiments, waveguide display 300 includes a plurality of source assemblies 310 and a plurality of output waveguides 320. Each of source assemblies 310 emits a monochromatic image light of a specific band of wavelength corresponding to a primary color (e.g., red, green, or blue). Each of output waveguides 320 may be stacked together with a distance of separation to output an expanded image light 340 that is multi-colored. -
FIG. 5 is a block diagram of an embodiment of a system 500 including the near-eye display 100. The system 500 comprises near-eye display 100, an imaging device 535, an input/output interface 540, and image sensors 120a-120d and 150a-150b that are each coupled to control circuitries 510. System 500 can be configured as a head-mounted device, a wearable device, etc. - Near-
eye display 100 is a display that presents media to a user. Examples of media presented by the near-eye display 100 include one or more images, video, and/or audio. In some embodiments, audio is presented via an external device (e.g., speakers and/or headphones) that receives audio information from near-eye display 100 and/or control circuitries 510 and presents audio data based on the audio information to a user. In some embodiments, near-eye display 100 may also act as an AR eyewear glass. In some embodiments, near-eye display 100 augments views of a physical, real-world environment with computer-generated elements (e.g., images, video, sound, etc.). - Near-
eye display 100 includes waveguide display assembly 210, one or more position sensors 525, and/or an inertial measurement unit (IMU) 530. Waveguide display assembly 210 includes source assembly 310, output waveguide 320, and controller 330. -
IMU 530 is an electronic device that generates fast calibration data indicating an estimated position of near-eye display 100 relative to an initial position of near-eye display 100 based on measurement signals received from one or more of position sensors 525. -
Imaging device 535 may generate image data for various applications. For example, imaging device 535 may generate image data to provide slow calibration data in accordance with calibration parameters received from control circuitries 510. Imaging device 535 may include, for example, image sensors 120a-120d of FIG. 1A for generating image data of a physical environment in which the user is located, for performing location tracking of the user. Imaging device 535 may further include, for example, image sensors 150a-150b of FIG. 1B for generating image data for determining a gaze point of the user, to identify an object of interest of the user. - The input/
output interface 540 is a device that allows a user to send action requests to the control circuitries 510. An action request is a request to perform a particular action. For example, an action request may be to start or end an application or to perform a particular action within the application. -
Control circuitries 510 provide media to near-eye display 100 for presentation to the user in accordance with information received from one or more of: imaging device 535, near-eye display 100, and input/output interface 540. In some examples, control circuitries 510 can be housed within system 500 configured as a head-mounted device. In some examples, control circuitries 510 can be a standalone console device communicatively coupled with other components of system 500. In the example shown in FIG. 5, control circuitries 510 include an application store 545, a tracking module 550, and an engine 555. - The
application store 545 stores one or more applications for execution by the control circuitries 510. An application is a group of instructions that, when executed by a processor, generates content for presentation to the user. Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications. -
Tracking module 550 calibrates system 500 using one or more calibration parameters and may adjust one or more calibration parameters to reduce error in determination of the position of the near-eye display 100. -
Tracking module 550 tracks movements of near-eye display 100 using slow calibration information from the imaging device 535. Tracking module 550 also determines positions of a reference point of near-eye display 100 using position information from the fast calibration information. -
Engine 555 executes applications within system 500 and receives position information, acceleration information, velocity information, and/or predicted future positions of near-eye display 100 from tracking module 550. In some embodiments, information received by engine 555 may be used for producing a signal (e.g., display instructions) to waveguide display assembly 210 that determines a type of content presented to the user. For example, to provide an interactive experience, engine 555 may determine the content to be presented to the user based on a location of the user (e.g., provided by tracking module 550), a gaze point of the user (e.g., based on image data provided by imaging device 535), or a distance between an object and the user (e.g., based on image data provided by imaging device 535). -
FIG. 6A, FIG. 6B, and FIG. 6C illustrate examples of an image sensor and its operations. As shown in FIG. 6A, image sensor 600 can include an array of pixel cells, including pixel cell 601, and can generate digital intensity data corresponding to pixels of an image. Pixel cell 601 may be part of pixel cells 402 of FIG. 4. As shown in FIG. 6A, pixel cell 601 may include a photodiode 602, a transfer gate 604, and a charge sensing device 606. Photodiode 602 may include, for example, a P-N diode, a P-I-N diode, a pinned diode, etc. Photodiode 602 can generate charge upon receiving light within an exposure period, and the quantity of charge generated within the exposure period can be proportional to the intensity of the light. Photodiode 602 can also store the generated charge. Transfer gate 604 may include, for example, a metal-oxide-semiconductor field-effect transistor (MOSFET), a bipolar junction transistor (BJT), etc. Charge sensing device 606 may include, for example, a floating drain (FD) node of the transistor of transfer gate 604, a pinned storage device formed from a pinned diode, etc. Towards the end of the exposure period, the transfer gate 604 can be enabled to transfer the charge stored in photodiode 602 to charge sensing device 606 to develop a voltage. An array of voltages, including v00, v01, . . . vji, can be obtained. The array of voltages can be quantized by an A/D converter (which can be external or internal to the pixel cells) into digital values. The digital values can be further processed to generate an image 610. - In
FIG. 6A, the presence of charge sensing device 606 in each pixel cell enables image sensor 600 to perform a global shutter operation, even if the pixel cells have to share the A/D converter. Specifically, photodiode 602 of each pixel cell can be exposed to incident light within the same global exposure period to generate charge. The charge can be temporarily stored at charge sensing device 606 of the pixel cell at least until that pixel cell can access the A/D converter to quantize the charge. With such arrangements, a global shutter operation can be supported even if the voltages generated by the pixel cells are not quantized simultaneously. -
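The timing just described — one simultaneous global exposure, per-pixel charge storage, then one-at-a-time access to a shared A/D converter — can be sketched as follows (function and variable names are illustrative):

```python
def global_shutter_readout(photo_charges, quantize):
    """Model of a global shutter operation with a shared ADC.

    All photodiodes integrate over the same global exposure period;
    each pixel's charge is held at its own charge sensing device, and
    the shared ADC then visits the pixels sequentially.
    """
    # End of the global exposure: every pixel transfers its charge to
    # its charge sensing device at the same time.
    stored = list(photo_charges)
    # Sequential readout: the held values are quantized one by one,
    # yet all correspond to the same exposure window.
    return [quantize(q) for q in stored]

# Illustrative 2x2 frame: normalized charge values -> 8-bit codes.
codes = global_shutter_readout([0.1, 0.4, 0.9, 0.2],
                               lambda q: min(int(q * 255), 255))
```

The key property modeled here is that the quantization order does not affect which exposure window each code represents, which is what distinguishes a global shutter from a rolling shutter.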
FIG. 6B and FIG. 6C illustrate cross-sectional views of examples of image sensor 600. Image sensor 600 can include a plurality of pixel cells 601, each including a photodiode 602 and a charge sensing device 606. Charge sensing devices 606 can be, for example, pinned storage nodes, floating drain nodes, etc. As shown in FIG. 6B and FIG. 6C, image sensor 600 can be included in a semiconductor die 620 having a front side surface 622 and a back side surface 624. Front side surface 622 can be the front side of a semiconductor wafer from which semiconductor die 620 is fabricated, whereas back side surface 624 can be the back side of the semiconductor wafer. The front side of the semiconductor wafer can receive doping, ion implantation, etc., to form photodiodes 602 and charge sensing devices 606, such that both photodiodes 602 and charge sensing devices 606 are closer to front side surface 622 than back side surface 624. Metal interconnects 626 can be formed on front side surface 622, as shown in FIG. 6B, or on back side surface 624, as shown in FIG. 6C. Metal interconnects 626 can be used to transfer charge from photodiodes 602 to charge sensing devices 606. -
Image sensor 600 can have different configurations. For example, as shown in FIG. 6B, image sensor 600 can have a back side illumination (BSI) configuration, in which back side surface 624 can be the light receiving surface for image sensor 600. Moreover, as shown in FIG. 6C, image sensor 600 can have a front side illumination (FSI) configuration, in which front side surface 622 can be the light receiving surface for image sensor 600. In both FIG. 6B and FIG. 6C, image sensor 600 may include color filters 632 (e.g., filters 632a, 632b, 632c, etc.) and microlenses 634 overlaid on the light receiving surface (back side surface 624 in FIG. 6B, front side surface 622 in FIG. 6C). Incident light can pass through the color filters and microlenses to reach photodiodes 602. - As described above, noise charge can be added to
charge sensing devices 606, which can introduce error to the light intensity measurement operation. For example, in a case where charge sensing devices 606 are floating drain nodes, dark currents due to crystallographic defects may add noise charge to the charge transferred from photodiodes 602. Because the light intensity measurement operation is meant to measure the quantity of charge generated by photodiodes 602 within the global shutter period, yet the measurement is actually based on the charge stored in charge sensing devices 606, the noise charge from dark currents can introduce error to the light intensity measurement operation. - In a case where
charge sensing devices 606 are pinned storage nodes of pinned diodes, the dark currents may be reduced compared with floating drain nodes, but the pinned diodes can receive photons 640 via the light receiving surface and generate photon noise charge responsive to photons 640, which can also be added to the charge transferred from photodiodes 602. Floating drain nodes, although susceptible to dark currents, typically generate less photon noise charge than pinned diodes. Charge sensing devices 606 in the FSI configuration of FIG. 6C can generate more noise charge than in the BSI configuration of FIG. 6B because charge sensing devices 606 are closer to the light receiving surface in the FSI configuration, and there is a lack of light-shielding structure to block photons 640 from reaching charge sensing devices 606. As a result, in the FSI configuration, charge sensing devices 606 can receive more photons 640 and generate more noise charge compared with the BSI configuration. Moreover, as shown in FIG. 6B and FIG. 6C, in a BSI configuration photodiodes 602 are positioned further away from the light receiving surface than in a FSI configuration. As a result, in a BSI configuration incident light needs to travel a longer distance within semiconductor die 620, and is therefore subject to larger power loss, before reaching photodiodes 602 than in a FSI configuration. As a result, an image sensor 600 having a BSI configuration typically provides a lower light-to-charge conversion rate than a FSI configuration, which may reduce the sensitivity of image sensor 600, especially in a low light environment. -
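The error mechanisms above can be put in rough numeric terms with a toy model; the dark current magnitude and hold time in this sketch are purely illustrative, not values from the disclosure:

```python
def measured_charge(q_photo, dark_current, hold_time):
    # Charge read out from the charge sensing device: the photodiode
    # charge plus noise charge accumulated from dark current while the
    # pixel waits for its turn at the shared A/D converter.
    return q_photo + dark_current * hold_time

# Illustrative numbers: 1000 e- of signal and 50 e-/ms of dark current.
# A pixel quantized 2 ms after the global transfer reports extra charge
# that never came from the incident light.
q = measured_charge(1000, 50, 2)
error = (q - 1000) / 1000
```

In this example the readout reports 1100 e-, a 10% error relative to the photo-generated charge; pixels read out later in the sequence accumulate proportionally more error, which is the motivation for the less leakage-prone sampling capacitor described below.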
FIG. 7A and FIG. 7B illustrate an example of a pixel cell 700. FIG. 7A illustrates a cross-sectional structural view of pixel cell 700, whereas FIG. 7B illustrates a schematic view of pixel cell 700. Pixel cell 700 can perform a global shutter operation with improved noise performance. As shown in FIG. 7A, a pixel cell 700 may include a first semiconductor die 702, a second semiconductor die 704, and a sampling capacitor 706 forming a stack along a vertical direction (e.g., along the z-axis). First semiconductor die 702 may include a photodiode 716 and a charge sensing device 718. Second semiconductor die 704 may include interface circuits 720. In the example of FIG. 7A, sampling capacitor 706 can include a metal capacitor formed from one or more metal layers 708 sandwiched between first semiconductor die 702 and second semiconductor die 704. In some examples, sampling capacitor 706 can also be formed as a device capacitor (e.g., a floating drain node, a pinned storage node, etc.) in one of first semiconductor die 702 or second semiconductor die 704. By stacking photodiode 716, sampling capacitor 706, and interface circuits 720 along a vertical direction, the horizontal footprint of pixel cell 700 (along the x/y axes) can be reduced, which allows packing a large number of pixel cells in an image sensor to improve resolution while minimizing the footprint of the image sensor. Moreover, by forming the sampling capacitor and the AC capacitor between the semiconductor dies, these capacitors do not cover the light receiving surface, which can maximize the available pixel cell area for the light receiving surface and allows the footprint of the pixel cell to be further reduced. With the disclosed techniques, a high resolution image sensor with a small footprint can be achieved, which is especially useful for applications on a wearable device (e.g., a headset) where available space is very limited. - As described above, first semiconductor die 702 may include
photodiode 716 and charge sensing device 718. Photodiode 716 can be exposed to incident light within an integration period to generate and store charge. Towards the end of the integration period, the charge stored in photodiode 716 can be transferred to charge sensing device 718 to develop a voltage. Interface circuits 720 of second semiconductor die 704 may include a control circuit 722 to control sampling capacitor 706 to perform a sample-and-hold operation to sample the voltage and then store the voltage. Interface circuits 720 also include a processing circuit 724 to perform a quantization operation on the stored voltage to generate a digital output representing the intensity of the incident light received by photodiode 716. As described below, the sample-and-hold operation can reduce the exposure of sampling capacitor 706 to dark currents, which can improve the accuracy of the light sensing operation. - First semiconductor die 702 includes a
front side surface 710 and a back side surface 712. Photodiode 716 and charge sensing device 718 can be formed by, for example, a doping process, an ion implantation process, etc., performed on front side surface 710, such that both photodiode 716 and charge sensing device 718 are closer to front side surface 710 than back side surface 712. To improve the light-to-charge conversion rate, pixel cell 700 can have a FSI configuration in which front side surface 710 is configured as the light receiving surface, with a microlens 726 and a color filter 728 positioned on front side surface 710 to focus and filter the incident light. To reduce the effect of photon noise charge generation, charge sensing device 718 can be formed as a floating drain node, a metal capacitor, a polysilicon capacitor, etc. - Referring to
FIG. 7B, first semiconductor die 702 further includes other circuits including, for example, an optional shutter switch 732, a transfer switch 734, a storage reset switch 736, a voltage buffer 738, and a sampling switch 740. The switches can be controlled by control circuit 722 to measure incident light intensity. Specifically, shutter switch 732 (controlled by a signal labelled “AB”) can act as an electronic shutter gate (in lieu of, or in combination with, mechanical shutter 404 of FIG. 4) to control an exposure/integration period within which photodiode 716 can accumulate charge for light intensity measurement. In some examples, shutter switch 732 can also be configured as an anti-blooming gate to prevent charge generated by photodiode 716 from leaking into other pixel cells when the photodiode saturates. In addition, transfer switch 734 can be controlled to transfer the charge from photodiode 716 to charge sensing device 718 to develop a voltage, which can be buffered by voltage buffer 738. Sampling switch 740, together with sampling capacitor 706, can be controlled to perform a sample-and-hold operation on the buffered voltage. Storage reset switch 736 can reset charge sensing device 718 prior to and after the sample-and-hold operation, to start a new light intensity measurement. -
FIG. 7C illustrates an example sequence of control signals for shutter switch 732, transfer switch 734, storage reset switch 736, and sampling switch 740 to perform a sample-and-hold operation. As shown in FIG. 7C, shutter switch 732 can be disabled (by de-asserting the AB signal) at time T0 to start an integration/shutter period within which photodiode 716 can accumulate charge for light intensity measurement. Between times T0 and T1, charge sensing device 718 can be in a reset state, with storage reset switch 736 enabled (by asserting the RST signal), while photodiode 716 is accumulating charge. Between times T1 and T4 can be the sampling period, within which sampling switch 740 is enabled to electrically connect sampling capacitor 706 to the output of voltage buffer 738, which buffers the voltage of charge sensing device 718. During the sampling period, storage reset switch 736 can be disabled (by de-asserting the RST signal). The voltage across sampling capacitor 706 can track the buffered voltage at charge sensing device 718. Between times T2 and T3 within the sampling period, transfer switch 734 can be enabled to transfer the charge accumulated in photodiode 716 to charge sensing device 718. At time T3, transfer switch 734 can be disabled, which ends the integration period, and the voltage at charge sensing device 718 at time T3 can represent the quantity of charge accumulated by photodiode 716 (and transferred to charge sensing device 718) within the integration period between times T0 and T3. Sampling capacitor 706 can sample the buffered voltage at charge sensing device 718 until time T4, such that the voltage at sampling capacitor 706 tracks the buffered voltage at charge sensing device 718. After time T4, sampling capacitor 706 can hold the sampled voltage, which can then be quantized by processing circuit 724 at a subsequent time after time T4. - An image sensor can include an array of
pixel cells 700. To support a global shutter operation, the array of pixel cells 700 can share a global AB signal and a global TG signal so that a global integration period starts at the same time T0 and ends at the same time T3 for each pixel cell 700. The sampling capacitor 706 of each pixel cell can store the voltage representing the charge accumulated by the photodiode 716 of that pixel cell within the global integration period. The voltages stored in the pixel cells can then be quantized by one or more ADCs. - Compared with
pixel cell 601 of FIG. 6, pixel cell 700 can provide improved noise performance. Specifically, charge sensing device 718 is in a reset state for much of the integration period (e.g., between times T0 and T1) and is out of the reset state only during the sampling period. Charge sensing device 718 is more susceptible to dark currents and photons, and can accumulate more dark current noise charge and photon noise charge, during the sampling period than when charge sensing device 718 is in the reset state. If the sampling period is relatively short, the noise charge added to the charge transferred from photodiode 716 can be reduced. Moreover, by performing a sample-and-hold operation, sampling capacitor 706 can be disconnected from charge sensing device 718 after sampling the buffered voltage at charge sensing device 718, which can prevent dark currents from flowing into sampling capacitor 706 from charge sensing device 718 (or other components of first semiconductor die 702) and contaminating the sampled voltage. Further, by implementing sampling capacitor 706 as a metal capacitor rather than a floating drain node or a pinned storage node, sampling capacitor 706 can be less susceptible to dark currents and photon noise charge when holding the sampled voltage. All these can reduce the noise components in the voltage being quantized and can improve the accuracy of the light sensing operations. -
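The track-then-hold behavior described above can be illustrated with a small simulation. The following is a hypothetical Python sketch, not from the patent; the signal names follow FIG. 7C, but the event model and the voltage value are illustrative assumptions.

```python
# Hypothetical sketch of the FIG. 7C sequence: while SAMPLE is asserted the
# sampling capacitor tracks the buffered charge-sensing-device voltage; once
# SAMPLE is de-asserted, the capacitor holds the last tracked value.

def sample_and_hold(events, v_signal=0.75):
    """events: time-ordered (time, signal, asserted) tuples."""
    v_sense = 0.0        # charge sensing device at its reset level (arbitrary 0 V)
    v_cap = None         # voltage on the sampling capacitor
    sampling = False
    for _t, sig, asserted in events:
        if sig == "SAMPLE":
            sampling = asserted
        elif sig == "TG" and not asserted:
            # TG de-asserted: transfer complete, sensing device holds the signal level
            v_sense = v_signal
        if sampling:
            v_cap = v_sense  # capacitor tracks the buffered voltage
    return v_cap             # held value after SAMPLE is de-asserted

# T1: SAMPLE on, T2: TG on, T3: TG off (integration ends), T4: SAMPLE off
seq = [(1, "SAMPLE", True), (2, "TG", True), (3, "TG", False), (4, "SAMPLE", False)]
print(sample_and_hold(seq))  # the held voltage reflects the transferred charge
```

Because the capacitor is disconnected after T4, any dark current arriving at the charge sensing device afterwards never reaches the held value, which is the noise benefit described above.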
FIG. 7D illustrates an example of processing circuit 724. As shown in FIG. 7D, processing circuit 724 includes a comparator 750. Comparator 750 can be coupled with a memory 760 which is also coupled with a counter 762. In some examples, memory 760 and counter 762 can be part of pixel cell 700 and processing circuit 724. In some examples, as described below, memory 760 and counter 762 can be external to pixel cell 700 and shared among an array of pixel cells 700, to reduce the footprint of pixel cell 700. -
Comparator 750, memory 760, and counter 762 can perform a quantization process on the sampled voltage at sampling capacitor 706 (labelled “VS”). Specifically, memory 760 can be a latch memory. Counter 762 can update its output count value (labelled “cnt”) periodically based on a clock signal. Comparator 750 can compare an input voltage (labelled “VCOMP_IN”), which is derived from the sampled voltage at sampling capacitor 706, with a ramping threshold voltage (labelled “VREF”) to generate a decision (labelled “VOUT”). The decision can be a latch signal to control the latch memory to store a count value output by counter 762. When the ramping VREF voltage reaches or exceeds VCOMP_IN, the decision output of comparator 750 trips, and the count value output by counter 762 when the decision trips can be stored in memory 760. The count value stored in memory 760 can represent a quantization result of VCOMP_IN and of VS, which can represent a measurement of the incident light intensity within the global shutter period of FIG. 7C. - As shown in
FIG. 7D, pixel cell 700 further includes an AC capacitor 746 and a comparator reset switch 752, which can be operated to compensate for measurement errors (e.g., comparator offset) introduced by comparator 750, as well as other error signals such as, for example, reset noise introduced to charge sensing device 718 (by assertion of the RST signal), which can be present in the sampled voltage VS. AC capacitor 746 can be implemented as a metal capacitor between first semiconductor die 702 and second semiconductor die 704. AC capacitor 746 can be used to perform two sampling operations within the sampling period. A first sampling operation can be performed prior to the transfer of charge from photodiode 716 to charge sensing device 718, while charge sensing device 718 stores reset noise charge. As part of the first sampling operation, comparator reset switch 752 can be enabled (by assertion of the COMP_RST signal), which shorts the negative input and output terminals of the comparator. As a result of the first sampling operation, AC capacitor 746 can store a voltage (labelled “VCC”) across the capacitor which includes a component of the reset noise and an offset voltage of comparator 750. A second sampling operation can then be performed, in which comparator reset switch 752 is disabled, followed by enabling transfer switch 734 to transfer the charge from photodiode 716 to charge sensing device 718. The VCOMP_IN input voltage can include the latest sampled voltage VS (which represents the charge stored in photodiode 716 and transferred to charge sensing device 718) and the VCC voltage. The reset noise charge component in the latest sampled voltage VS can be cancelled by the reset noise charge component of the VCC voltage, while the comparator offset component in the VCC voltage remains in the VCOMP_IN input voltage.
The comparator offset component in the VCOMP_IN input voltage can cancel out, or substantially reduce, the effect of the comparator offset of comparator 750 when comparator 750 compares the new VCOMP_IN input voltage with the ramping threshold voltage. As both the comparator offset and the reset noise are eliminated or at least substantially reduced, the accuracy of quantization can be improved. -
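The two sampling operations can be checked numerically. The following is a hypothetical Python sketch, not from the patent; all voltage values (VREF, offset, reset level, kTC noise, signal level) are illustrative assumptions chosen only to show the cancellation.

```python
# Hypothetical numeric sketch of the two sampling operations: the reset-noise
# term stored across the AC capacitor cancels the reset-noise term in the
# sampled voltage, while the comparator offset is retained so that it drops
# out of the subsequent comparison against the ramping threshold.

v_ref, v_offset = 1.0, 0.02      # ramp reference and comparator offset (assumed)
v_rst, v_ktc = 0.10, 0.005       # reset level and kTC reset noise (assumed)
v_out = 0.75                     # level after charge transfer (assumed)

# First sampling: comparator reset; the AC capacitor stores VCC
v_cc = (v_ref + v_offset) - (v_rst + v_ktc)

# Second sampling: the comparator input tracks the new sampled voltage plus VCC
v_comp_in = (v_out + v_ktc) + v_cc

# The kTC reset-noise term has cancelled; only signal, VREF, and offset remain
assert abs(v_comp_in - (v_out - v_rst + v_ref + v_offset)) < 1e-12

# The comparator effectively trips at VREF plus its own offset, so the offset
# also drops out of the trip condition, leaving the signal difference:
print(round(v_comp_in - (v_ref + v_offset), 6))  # VS_out - VS_rst
```

The printed difference depends only on the charge transferred from the photodiode, matching the conclusion above that both reset noise and comparator offset are removed from the quantization.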
FIG. 7E illustrates an example sequence of control signals for the sample-and-hold operation including COMP_RST. The timings of the AB, RST, TG, and SAMPLE signals in FIG. 7E are identical to those in FIG. 7C, and their descriptions are not repeated here. As shown in FIG. 7E, within the sampling period and between times T1 and T2, charge transfer from photodiode 716 to charge sensing device 718 has not started. The voltage at charge sensing device 718 (and the sampled voltage VS) can be at a reset voltage VS_rst and can also include a reset noise component VσKTC. Between times T1 and T2, the sampled voltage VS can be as follows: -
VS(T2) = VS_rst + VσKTC (Equation 1) - Moreover, with
comparator reset switch 752 enabled, and the positive terminal of comparator 750 connected to a VREF voltage, the voltage of COMP_IN (VCOMP_IN) can track the VREF voltage but differ by the comparator offset Vcomp_offset, as follows: -
VCOMP_IN(T2) = VREF + Vcomp_offset (Equation 2) - At time T2, the voltage difference VCC between the right plate of AC capacitor 746 (connected with COMP_IN) and the left plate of AC capacitor 746 (connected with sampling capacitor 706) can be as follows:
-
VCC(T2) = VCOMP_IN(T2) − VS(T2) (Equation 3) - Combining
Equations 1-3 gives: -
VCC(T2) = (VREF + Vcomp_offset) − (VS_rst + VσKTC) (Equation 4) - The voltage difference VCC(T2) can represent a result of the first sampling operation.
- Between T2 and T3,
charge transfer switch 734 is enabled, and charge is transferred from photodiode 716 to charge sensing device 718 to develop a new voltage. At time T3, the sampled voltage VS(T3) can include a new voltage VS_out, corresponding to the transferred charge and sampled by sampling capacitor 706, as well as the reset noise component VσKTC which remains at charge sensing device 718, as follows: -
VS(T3) = VS_out + VσKTC (Equation 5) - VS(T3) can represent a result of the second sampling operation.
- At time T3, the
comparator reset switch 752 is disabled. The voltage difference VCC across AC capacitor 746 remains the same as at time T2. Via AC-coupling, the voltage of the right plate of AC capacitor 746 (VCOMP_IN) at time T3 can track VS(T3) but differ by the voltage difference VCC, as follows: -
VCOMP_IN(T3) = VS(T3) + VCC(T2) (Equation 6) - Combining Equation 6 with Equation 4 becomes:
-
VCOMP_IN(T3) = VS_out + VσKTC + (VREF + Vcomp_offset) − (VS_rst + VσKTC) (Equation 7) - As shown in Equation 7, the VσKTC component of VS(T3) and the VσKTC component of VCC(T2) (and VCC(T3)) cancel out. Equation 7 can be simplified as follows:
-
VCOMP_IN(T3) = VS_out − VS_rst + VREF + Vcomp_offset (Equation 8) - After T3, the voltage at VCOMP_IN can be held at VCOMP_IN(T3) when no additional charge is transferred to charge
sensing device 718 and/or after sampling switch 740 is disabled. - As shown in Equation 8, VCOMP_IN(T3) includes a difference component VS_out − VS_rst, which represents the quantity of charge from the photodiode and transferred to charge
sensing device 718 between times T2 and T3. VCOMP_IN(T3) further includes the Vcomp_offset component as well as VREF (from VCC). When comparator 750 compares VCOMP_IN with VREF, the comparator offset introduced by comparator 750 can be cancelled by the Vcomp_offset component, and only the difference VS_out − VS_rst, which represents the quantity of charge from the photodiode, is compared against VREF as part of the quantization process to generate the quantization result. Such arrangements can remove the reset noise and comparator offset from the quantization result and improve the accuracy of light intensity measurement. - As described above, to further reduce the footprint of
pixel cell 700, memory 760 and counter 762 can be positioned external to pixel cell 700 and can be shared among a set of pixel cells 700. FIG. 8A illustrates an example image sensor 800 which includes shared counters and memories. As shown in FIG. 8A, image sensor 800 includes a first semiconductor die 802 and a second semiconductor die 804. Semiconductor die 802 includes an array of light sensing circuits 806, including light sensing circuit 806a, with each light sensing circuit 806 including photodiode 716, charge sensing device 718, shutter switch 732, transfer switch 734, storage reset switch 736, voltage buffer 738, and sampling switch 740. Semiconductor die 804 includes an array of interface circuits 808, including interface circuit 808a. Each interface circuit 808 includes comparator 750 and comparator reset switch 752 and corresponds to one light sensing circuit 806. Image sensor 800 further includes an array of sampling capacitors 706 and an array of AC capacitors 746. Each sampling capacitor 706 and AC capacitor 746 is coupled between a corresponding pair of light sensing circuit 806 and interface circuit 808 to form a pixel cell 810. The arrays of sampling capacitors 706 and AC capacitors 746 can be formed in a metal layer 812 stacked between first semiconductor die 802 and second semiconductor die 804. The light sensing operations of light sensing circuits 806 and interface circuits 808 using sampling capacitors 706 and AC capacitors 746 are similar to the operations described in FIGS. 7B-7E and are not repeated here. - In addition, image sensor 800 includes a
counter 820, a bank of memory buffers 822, and a controller 824, some or all of which can be part of interface circuits 808. Each memory buffer 822 within the bank can be a latch memory similar to memory 760. Counter 820 can update a count value (“cnt”) periodically based on a clock. Counter 820 can output the count value to the bank of memory buffers 822. Pixel cells 810 can control the timing of when the count values are stored in the bank of memory buffers 822 based on comparing the sampled voltages stored at the pixel cells against a ramping threshold to quantize the sampled voltages, as described above. Controller 824 can control the access to the bank of memory buffers 822 among pixel cells 810 to quantize the sampled voltages. In FIG. 8A, controller 824 can allow one row of pixel cells 810 (e.g., a set of pixel cells aligned along the x-axis of FIG. 8A) to access the bank of memory buffers 822 to quantize the sampled voltages at that row of pixel cells 810, followed by another row. Within a column of pixel cells 810 (e.g., a set of pixel cells aligned along the y-axis of FIG. 8A), comparator 750 of each pixel cell can be selectively coupled with a memory buffer within the bank via a row switch 830. Comparator 750 of each pixel cell is also selectively coupled with a power supply via a power switch 832. -
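The single-slope quantization performed by one comparator together with the shared counter and a latch memory buffer can be sketched as follows. This is a hypothetical Python illustration, not from the patent; the step size (10 mV per count) and count range are illustrative assumptions.

```python
# Hypothetical sketch of the single-slope quantization: a free-running
# counter advances with the ramping threshold VREF, and the comparator
# decision latches the current count into the memory buffer when the ramp
# reaches or exceeds the held input voltage.

def quantize(v_comp_in, n_steps=256):
    latched = None
    for cnt in range(n_steps):
        vref = cnt / 100          # ramping threshold, assumed 10 mV per count
        if latched is None and vref >= v_comp_in:
            latched = cnt         # comparator trips; latch memory stores cnt
    return latched

print(quantize(0.75))  # ramp reaches 0.75 V at count 75
```

The stored count is therefore a digital measure of the time at which the ramp intersects the held voltage, which is the quantization scheme the claims describe.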
FIG. 8B and FIG. 8C illustrate example sequences of control signals for image sensor 800. The sequence of control signals in FIG. 8B can be for an example of pixel cell 810 that includes shutter switch 732, whereas the sequence of control signals in FIG. 8C can be for an example of pixel cell 810 that does not include shutter switch 732. In FIG. 8B and FIG. 8C, the RST, TG, AB, and COMP_RST signals can be global signals sent to each pixel cell to perform a global shutter operation in a global integration period, as well as a first sampling operation (to sample reset noise charge and comparator offset) and a second sampling operation (to sample charge accumulated in the global integration period) in a sampling period. Following the sampling period, rows of pixel cells 810 can take turns accessing the bank of memory buffers 822 to perform quantization. For example, when a first row of pixel cells 810 is selected to access the bank of memory buffers 822, row switch 830 of each pixel cell within the row (labelled “ROW[1]”) is enabled to couple the output of comparator 750 to memory buffer 822, whereas power switch 832 of each pixel cell within that row (labelled “ON[1]”) is enabled by controller 824 to enable comparator 750. Comparator 750 of each pixel cell can compare the sampled voltage stored at the pixel cell against the ramping threshold VREF to generate a decision, which can be transmitted via row switch 830 to control the time when memory buffer 822 stores the count value from counter 820. The count values stored in the bank of memory buffers 822 can represent the quantization results for the first row of pixel cells 810. The count values in the bank of memory buffers 822 can be read out (e.g., by an image reconstruction engine) via data_out buses.
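The row-by-row sharing of the counter and memory bank can be sketched end to end. This is a hypothetical Python illustration, not from the patent; the voltages, 10 mV-per-count ramp, and two-by-two frame are illustrative assumptions.

```python
# Hypothetical sketch of the FIG. 8A-8C readout: the controller enables one
# row of pixel cells at a time (ROW[i]/ON[i]); all comparators in that row
# share one counter and one bank of latch memory buffers.

def readout_frame(held_voltages, n_steps=256):
    """held_voltages: 2D list of sample-and-held voltages, one list per pixel row."""
    frame = []
    for row in held_voltages:            # controller selects one row at a time
        bank = [None] * len(row)         # shared bank of latch memory buffers
        for cnt in range(n_steps):       # shared counter drives the ramp
            vref = cnt / 100             # assumed 10 mV per count
            for col, v in enumerate(row):
                if bank[col] is None and vref >= v:
                    bank[col] = cnt      # comparator decision latches the count
        frame.append(bank)               # bank read out over data_out, then next row
    return frame

print(readout_frame([[0.10, 0.75], [0.50, 0.25]]))  # one count per pixel
```

Because every held voltage was captured in the same global integration period, the sequential readout does not reintroduce rolling-shutter distortion; it only serializes the digitization.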
After the count values are read out from the bank of memory buffers for the first row of pixel cells 810, row switches 830 and power switches 832 of the first row of pixel cells 810 (ROW[1] and ON[1]) can be disabled by controller 824. Controller 824 can then select a second row of pixel cells 810 to access the bank of memory buffers 822 to quantize the sampled voltages stored at the second row of pixel cells 810. Row switch 830 and power switch 832 of the second row of pixel cells (ROW[2] and ON[2]) can be enabled by controller 824 to perform the quantization. - Although
FIG. 8A illustrates that a single bank of memory buffers 822 is shared among rows of pixel cells 810, it is understood that multiple banks of memory buffers can be provided, which can increase the number of pixel cells that can concurrently perform quantization with counter 820 and with the memory buffers, so that the speeds of read out and image generation can be increased. For example, as shown in FIG. 8D, two memory banks 840 and 842 can be provided. First memory bank 840 can be shared among rows 850 of pixel cells 810, whereas second memory bank 842 can be shared among rows 852 of pixel cells 810. Moreover, as shown in FIG. 8E, four memory banks can be provided, each shared among a different pair of rows of pixel cells 810: memory bank 872 can be shared among one pair of rows of pixel cells 810, memory bank 874 can be shared among another pair of rows of pixel cells 810, whereas memory bank 876 can be shared among rows 886a and 886b of pixel cells 810. - The arrangements in
FIG. 8A-FIG. 8E, by putting the memory external to the pixel cell, can further reduce the footprint of the pixel cells, which allows packing a large number of pixel cells in an image sensor to improve resolution while minimizing the footprint of the image sensor. Moreover, the reliability and speed of image generation can also be improved. For example, as the memory is positioned outside the pixel cell and does not affect the footprint of the pixel cell, redundant memory devices can be provided to store the digital outputs from each pixel cell to reduce the likelihood of losing the digital outputs (and the pixel values) due to defective memory. Since the memory comprises mostly digital circuits and typically has a very small footprint, adding redundant memory (to be shared by the pixel cells) typically does not significantly increase the footprint of the image sensor. Moreover, compared with an implementation where the pixel cell transmits an analog voltage (e.g., a voltage at the charge sensing device) to an external ADC to perform the quantization operation, the disclosed techniques allow a part of the quantization operation (the comparator comparison) to be performed within the pixel cell, and only a digital output (the decision of the comparator) is transmitted from the pixel cell to the external memory. Compared with an analog voltage, the digital output can be transmitted with high fidelity (to distinguish between zeroes and ones) and at high speed. All these can improve the reliability and speed of image generation based on the light sensing operations by the pixel cells. -
FIG. 9 includes a flowchart that illustrates an example method 900 for performing measurement of light intensity. Method 900 can be performed by, for example, pixel cell 700 of FIG. 7A-FIG. 7E and image sensor 800 of FIG. 8A-FIG. 8D based on the techniques described above. - In
step 902, an interface circuit (e.g., interface circuits 720, interface circuit 808) can enable a photodiode of a light sensing circuit (e.g., light sensing circuit 806) to accumulate charge responsive to incident light within an integration period. The light sensing circuit can be in a first semiconductor die (e.g., first semiconductor die 702, 802, etc.), whereas the interface circuit can be in a second semiconductor die (e.g., second semiconductor die 704, 804, etc.). The first semiconductor die and the second semiconductor die may form a stack, as shown in FIG. 7A and FIG. 8A. The light sensing circuit in the first semiconductor die can be configured as a front-side illumination device or a back-side illumination device as shown in FIG. 6B and FIG. 6C. The enabling can be based on, for example, disabling shutter switch 732 to enable the photodiode to accumulate charge. - In
step 904, the interface circuit can transfer the charge from the photodiode to a charge sensing device (e.g., charge sensing device 718) of the light sensing circuit. The transfer can be performed via transfer switch 734 under the control of the interface circuit. The charge sensing device can be, for example, a floating drain device, a metal capacitor, a polysilicon capacitor, etc. - In
step 906, the interface circuit can perform, using a sampling capacitor (e.g., sampling capacitor 706), a sample-and-hold operation to convert the charge stored in the charge sensing device into a voltage. Specifically, the sampling capacitor can be coupled with the charge sensing device via a sampling switch controlled by the interface circuit. Referring back to FIG. 7C, within a sampling period, the transfer switch can be enabled, as part of step 904, to transfer the charge from the photodiode to the charge sensing device to develop a voltage, while the sampling switch is also enabled to enable the sampling capacitor to track the voltage of the charge sensing device. The transfer switch can be disabled prior to the end of the sampling period to freeze the voltage at the charge sensing device, and the sampling capacitor continues tracking the voltage at the charge sensing device until the sampling period ends. After the sampling period ends, the sampling switch can be disabled, and the sampling capacitor can hold the sampled voltage for a subsequent quantization process. - In some examples, the interface circuit can include a resetable comparator (e.g., comparator 750) and an AC capacitor (e.g., AC capacitor 746). Referring back to
FIG. 7E, as part of step 906, the comparator can be reset within the sampling period to store the comparator offset and reset noise (which is also present in the charge sensing device and reflected in the sampled voltage at the sampling capacitor) in the AC capacitor. The AC capacitor can also track the voltage sampled and held by the sampling capacitor and combine the sampled voltage with the reset noise and comparator offset information to generate an output voltage (e.g., VCOMP_IN). - In
step 908, the interface circuit can generate a digital output based on the voltage sampled and held at the sampling capacitor to represent an intensity of the incident light received by the photodiode. The digital output can be generated based on a quantization process, in which the comparator can compare the output voltage of the AC capacitor with a ramping threshold to generate a decision. The decision can control a memory (e.g., memory 760, memory buffers 822) to store a count value from a counter as the digital output. The memory and counter can be shared among pixel cells, as shown in FIG. 8A to FIG. 8E, such that groups of pixel cells can take turns storing the digital values at the shared memory. - Some portions of this description describe the embodiments of the disclosure in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, and/or hardware.
- Steps, operations, or processes described may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In some embodiments, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the disclosure may also relate to an apparatus for performing the operations described. The apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
- The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
Claims (20)
1. A method comprising:
within an integration period:
enabling a photodiode of a pixel cell to accumulate charge responsive to incident light, and
transferring the charge from the photodiode to a charge storage device of the pixel cell;
performing, using a sampling capacitor, a sample-and-hold operation to convert the charge stored in the charge storage device into a voltage; and
generating a digital output based on the voltage to represent an intensity of the incident light received by the photodiode.
2. The method of claim 1, wherein the sample-and-hold operation comprises:
enabling a sampling switch to cause the sampling capacitor to sample the charge stored in the charge storage device to develop the voltage; and
disabling the sampling switch to cause the sampling capacitor to hold the voltage.
3. The method of claim 2 , wherein the enabling of the sampling switch starts at a first time within the integration period and ends at a second time after the integration period ends.
4. The method of claim 3 , wherein the transferring of the charge from the photodiode to the charge storage device starts after the first time and ends before the second time.
5. The method of claim 1 , wherein:
the pixel cell further includes a shutter switch coupled with the photodiode;
the integration period is started based on disabling the shutter switch; and
the integration period is ended based on enabling the shutter switch.
6. The method of claim 1, wherein the pixel cell further includes:
a transfer switch coupled between the photodiode and the charge storage device; and
a reset switch coupled with the charge storage device;
wherein the integration period is started based on disabling the reset switch and the transfer switch; and
wherein the integration period is ended based on enabling the reset switch and the transfer switch.
7. The method of claim 6, wherein generating the digital output based on the voltage comprises:
generating, using a comparator, a decision based on the voltage and a threshold;
controlling a memory to store a count value from a counter based on the decision; and
providing the count value as the digital output.
8. The method of claim 7, wherein the threshold increases or decreases with time; and
wherein the count value measures a time when the threshold intersects with the voltage.
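The ramp-compare quantization of claims 7 and 8 — a threshold that ramps with time, a comparator decision, and a counter value latched into memory at the crossing — can be sketched behaviorally. This is a minimal single-slope ADC model with hypothetical names, not the patented circuit:

```python
def ramp_adc(voltage, ramp_step=0.01, max_count=255):
    """Single-slope quantization: the threshold rises each counter tick;
    when it crosses the sampled voltage, the comparator decision flips and
    the memory stores the counter value as the digital output."""
    for count in range(max_count + 1):
        threshold = count * ramp_step  # threshold increases with time (claim 8)
        if threshold >= voltage:       # comparator decision
            return count               # count measures the crossing time
    return max_count                   # input above full scale: saturate

print(ramp_adc(0.25))  # -> 25
```

A brighter pixel holds a larger voltage, so the ramp crosses it later and a larger count is latched; inputs beyond the ramp's full scale simply saturate at the maximum count.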
9. The method of claim 7, further comprising: resetting the comparator prior to transferring the charge.
10. The method of claim 9, wherein the charge storage device is coupled with the comparator via an AC capacitor;
wherein the voltage is a first voltage;
wherein the method further comprises:
obtaining, using the AC capacitor, a first sample of a reset voltage of the charge storage device caused by a prior reset operation of the charge storage device;
obtaining, using the AC capacitor, a second sample of an offset of the comparator when the comparator is in the reset state;
storing a second voltage across the AC capacitor based on the first sample of the reset voltage and the second sample of the offset; and
outputting, based on the first voltage and the second voltage, a third voltage to the comparator; and
wherein the digital output is generated based on the third voltage.
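The double-sampling scheme of claim 10 amounts to subtracting a stored error voltage (the sampled reset level plus the sampled comparator offset, held across the AC capacitor) from the signal voltage, so that reset noise and comparator offset cancel in the voltage the comparator actually sees. A simplified arithmetic sketch, with hypothetical names and idealized coupling (capacitor ratios ignored):

```python
def cds_output(first_voltage, reset_voltage, comparator_offset):
    """Correlated double sampling via the AC coupling capacitor.

    The second voltage stored across the AC capacitor combines the reset
    sample and the offset sample; the third voltage passed to the comparator
    is the first (signal) voltage with that stored error subtracted."""
    second_voltage = reset_voltage + comparator_offset  # held on AC capacitor
    third_voltage = first_voltage - second_voltage      # seen by comparator
    return third_voltage

signal = cds_output(first_voltage=1.0, reset_voltage=0.3, comparator_offset=0.05)
```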
11. The method of claim 7, wherein an output of the comparator is coupled with the memory via a selection switch; and
wherein the method further comprises:
enabling the selection switch to transmit the decision to the memory when the pixel cell is selected to store the digital output in the memory; and
disabling the selection switch to block the decision from the memory when the pixel cell is not selected to store the digital output in the memory.
12. The method of claim 7, wherein the comparator is coupled with a power supply via a power switch; and
wherein the method further comprises:
enabling the power switch to enable the comparator to generate the decision when the pixel cell is selected to store the digital output in the memory; and
disabling the power switch to disable the comparator when the pixel cell is not selected to store the digital output in the memory.
13. The method of claim 7, wherein the comparator is part of the pixel cell.
14. The method of claim 1, wherein the charge storage device comprises at least one of: a floating drain node, or a pinned storage node.
15. A method comprising:
within an integration period:
enabling a first photodiode to accumulate a first charge responsive to incident light;
transferring the first charge from the first photodiode to a first charge storage device;
enabling a second photodiode to accumulate a second charge responsive to the incident light; and
transferring the second charge from the second photodiode to a second charge storage device;
performing, using a first sampling capacitor, a first sample-and-hold operation to convert the first charge stored in the first charge storage device into a first voltage;
performing, using a second sampling capacitor, a second sample-and-hold operation to convert the second charge stored in the second charge storage device into a second voltage;
generating a first digital output based on the first voltage to represent an intensity of the incident light received by the first photodiode;
storing the first digital output at a memory;
reading the first digital output from the memory;
generating a second digital output based on the second voltage to represent an intensity of the incident light received by the second photodiode; and
after the first digital output is read from the memory, storing the second digital output at the memory.
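Claim 15's memory sharing — both photodiodes expose and digitize within the same integration period, but the second digital output is written to the memory only after the first has been read out — can be sketched as follows. The names are hypothetical; the point is the store/read ordering that lets two outputs share one memory location:

```python
# Sketch of claim 15's shared-memory readout ordering (hypothetical names).

def readout_shared_memory(first_output, second_output):
    """Store and read two digital outputs through one shared memory slot."""
    log = []
    memory = first_output            # store the first digital output
    log.append(("store", memory))
    log.append(("read", memory))     # read the first output from the memory
    memory = second_output           # only now overwrite with the second output
    log.append(("store", memory))
    log.append(("read", memory))     # read the second output from the memory
    return [value for op, value in log if op == "read"]

print(readout_shared_memory(42, 99))  # -> [42, 99]
```

Because the overwrite is deferred until after the first read, a single in-pixel (or per-column) memory can serve multiple rows of a global-shutter array without corrupting earlier results.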
16. The method of claim 15, further comprising:
within the integration period:
enabling a third photodiode to accumulate a third charge responsive to incident light; and
transferring the third charge from the third photodiode to a third charge storage device;
performing, using a third sampling capacitor, a third sample-and-hold operation to convert the third charge stored in the third charge storage device into a third voltage;
generating a third digital output based on the third voltage to represent an intensity of the incident light received by the third photodiode; and
storing the first digital output and the third digital output at the memory prior to reading of the first digital output from the memory.
17. The method of claim 16, wherein the first photodiode is part of a first pixel cell;
wherein the second photodiode is part of a second pixel cell; and
wherein the third photodiode is part of a third pixel cell.
18. The method of claim 16, wherein the first pixel cell and the third pixel cell are in a first row of pixel cells of an image sensor; and
wherein the second pixel cell is in a second row of the pixel cells of the image sensor.
19. The method of claim 16, wherein each of the first, second, and third digital outputs is generated using, respectively, a first comparator, a second comparator, and a third comparator; and
wherein the first, second, and third comparators are reset during, respectively, the first, second, and third sample-and-hold operations.
20. The method of claim 19, wherein:
the first comparator is part of the first pixel cell;
the second comparator is part of the second pixel cell; and
the third comparator is part of the third pixel cell.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/236,433 US20210265401A1 (en) | 2018-04-03 | 2021-04-21 | Global shutter image sensor |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862652220P | 2018-04-03 | 2018-04-03 | |
US16/369,763 US11004881B2 (en) | 2018-04-03 | 2019-03-29 | Global shutter image sensor |
US17/236,433 US20210265401A1 (en) | 2018-04-03 | 2021-04-21 | Global shutter image sensor |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/369,763 Continuation US11004881B2 (en) | 2018-04-03 | 2019-03-29 | Global shutter image sensor |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210265401A1 | 2021-08-26 |
Family
ID=68057359
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/369,763 Active US11004881B2 (en) | 2018-04-03 | 2019-03-29 | Global shutter image sensor |
US17/236,433 Abandoned US20210265401A1 (en) | 2018-04-03 | 2021-04-21 | Global shutter image sensor |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/369,763 Active US11004881B2 (en) | 2018-04-03 | 2019-03-29 | Global shutter image sensor |
Country Status (7)
Country | Link |
---|---|
US (2) | US11004881B2 (en) |
EP (1) | EP3777132A1 (en) |
JP (1) | JP2021517764A (en) |
KR (1) | KR20200139187A (en) |
CN (1) | CN111955002B (en) |
TW (1) | TW201943093A (en) |
WO (1) | WO2019195162A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10686996B2 (en) | 2017-06-26 | 2020-06-16 | Facebook Technologies, Llc | Digital pixel with extended dynamic range |
US10598546B2 (en) | 2017-08-17 | 2020-03-24 | Facebook Technologies, Llc | Detecting high intensity light in photo sensor |
US11906353B2 (en) | 2018-06-11 | 2024-02-20 | Meta Platforms Technologies, Llc | Digital pixel with extended dynamic range |
US11463636B2 (en) | 2018-06-27 | 2022-10-04 | Facebook Technologies, Llc | Pixel sensor having multiple photodiodes |
US10897586B2 (en) | 2018-06-28 | 2021-01-19 | Facebook Technologies, Llc | Global shutter image sensor |
US11956413B2 (en) | 2018-08-27 | 2024-04-09 | Meta Platforms Technologies, Llc | Pixel sensor having multiple photodiodes and shared comparator |
US11595602B2 (en) | 2018-11-05 | 2023-02-28 | Meta Platforms Technologies, Llc | Image sensor post processing |
US11943561B2 (en) | 2019-06-13 | 2024-03-26 | Meta Platforms Technologies, Llc | Non-linear quantization at pixel sensor |
KR20210035950A (en) * | 2019-09-24 | 2021-04-02 | 삼성전자주식회사 | Image sensor device |
US11936998B1 (en) | 2019-10-17 | 2024-03-19 | Meta Platforms Technologies, Llc | Digital pixel sensor having extended dynamic range |
US11902685B1 (en) | 2020-04-28 | 2024-02-13 | Meta Platforms Technologies, Llc | Pixel sensor having hierarchical memory |
US20210360177A1 (en) * | 2020-05-12 | 2021-11-18 | Microsoft Technology Licensing, Llc | Global shutter time-of-flight camera |
US11910114B2 (en) | 2020-07-17 | 2024-02-20 | Meta Platforms Technologies, Llc | Multi-mode image sensor |
US11956560B2 (en) | 2020-10-09 | 2024-04-09 | Meta Platforms Technologies, Llc | Digital pixel sensor having reduced quantization operation |
CN112804467B (en) * | 2021-04-15 | 2021-06-25 | 北京惠风智慧科技有限公司 | Image coding method and device based on multiple CMOS sensors |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080042046A1 (en) * | 2006-08-21 | 2008-02-21 | Sony Corporation | Physical quantity detection device, method of driving physical quantity detection device, and imaging apparatus |
US20100140732A1 (en) * | 2008-12-09 | 2010-06-10 | Teledyne Scientific & Imaging, Llc | Method and apparatus for backside illuminated image sensors using capacitively coupled readout integrated circuits |
US7830292B2 (en) * | 2005-03-30 | 2010-11-09 | Aptina Imaging Corporation | High density row RAM for column parallel CMOS image sensors |
US20120267511A1 (en) * | 2011-04-19 | 2012-10-25 | Altasens, Inc. | Image sensor with hybrid heterostructure |
US20170295338A1 (en) * | 2016-04-11 | 2017-10-12 | Semiconductor Components Industries, Llc | Backside illuminated global shutter pixel with active reset |
US20180227516A1 (en) * | 2017-02-03 | 2018-08-09 | SmartSens Technology (U.S.), Inc. | Stacked image sensor pixel cell with dynamic range enhancement and selectable shutter modes and in-pixel cds |
US20180278865A1 (en) * | 2015-12-04 | 2018-09-27 | Canon Kabushiki Kaisha | Method for driving imaging apparatus |
US20190043903A1 (en) * | 2017-08-01 | 2019-02-07 | Semiconductor Components Industries, Llc | Stacked image sensor capacitors and related methods |
US20210144330A1 (en) * | 2017-05-10 | 2021-05-13 | Brillnics, Inc. | Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus |
Family Cites Families (176)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4596977A (en) | 1984-12-03 | 1986-06-24 | Honeywell Inc. | Dual slope analog to digital converter with out-of-range reset |
US5053771A (en) | 1990-07-16 | 1991-10-01 | Eastman Kodak Company | Adaptive dual range analog to digital converter |
JP2953297B2 (en) | 1994-03-30 | 1999-09-27 | 日本電気株式会社 | Light receiving element and driving method thereof |
US5844512A (en) | 1997-07-28 | 1998-12-01 | Hewlett-Packard Company | Autoranging apparatus and method for improved dynamic ranging in analog to digital converters |
US6529241B1 (en) | 1998-02-27 | 2003-03-04 | Intel Corporation | Photodetecting device supporting saturation detection and electronic shutter |
US6522395B1 (en) * | 1999-04-30 | 2003-02-18 | Canesta, Inc. | Noise reduction techniques suitable for three-dimensional information acquirable with CMOS-compatible image sensor ICs |
US6486504B1 (en) | 1999-10-26 | 2002-11-26 | Eastman Kodak Company | CMOS image sensor with extended dynamic range |
US6545624B2 (en) | 2000-02-11 | 2003-04-08 | Hyundai Electronics Industries Co., Ltd. | Image sensor with analog-to-digital converter that generates a variable slope ramp signal |
US20030049925A1 (en) | 2001-09-10 | 2003-03-13 | Layman Paul Arthur | High-density inter-die interconnect structure |
EP1470575B1 (en) | 2002-02-01 | 2018-07-25 | MACOM Technology Solutions Holdings, Inc. | Mesa structure avalanche photodiode |
CN1234234C (en) | 2002-09-30 | 2005-12-28 | 松下电器产业株式会社 | Solid-state photographic device and equipment using the photographic device |
US7280143B2 (en) | 2003-04-14 | 2007-10-09 | Micron Technology, Inc. | CMOS image sensor with active reset and 4-transistor pixels |
US6885331B2 (en) | 2003-09-15 | 2005-04-26 | Micron Technology, Inc. | Ramp generation with capacitors |
EP1530363B1 (en) | 2003-11-04 | 2010-08-11 | STMicroelectronics (Research & Development) Limited | Improvements in or relating to image sensors |
KR100574959B1 (en) | 2003-11-24 | 2006-04-28 | 삼성전자주식회사 | Auto exposure CMOS image sensor having auto exposure control function |
US6864817B1 (en) | 2003-12-30 | 2005-03-08 | Freescale Semiconductor, Inc. | Signaling dependent adaptive analog-to-digital converter (ADC) system and method of using same |
EP1732134B1 (en) | 2004-02-27 | 2012-10-24 | National University Corporation Tohoku University | Solid-state imaging device, line sensor, optical sensor, and method for operating solid-state imaging device |
JP4317115B2 (en) | 2004-04-12 | 2009-08-19 | 国立大学法人東北大学 | Solid-state imaging device, optical sensor, and operation method of solid-state imaging device |
GB0412296D0 (en) | 2004-06-02 | 2004-07-07 | Council Cent Lab Res Councils | Imaging device |
US7508431B2 (en) | 2004-06-17 | 2009-03-24 | Hoya Corporation | Solid state imaging device |
JP4349232B2 (en) | 2004-07-30 | 2009-10-21 | ソニー株式会社 | Semiconductor module and MOS solid-state imaging device |
US8144227B2 (en) | 2004-09-02 | 2012-03-27 | Sony Corporation | Image pickup device and image pickup result outputting method |
JP4835856B2 (en) | 2005-01-06 | 2011-12-14 | 日本電気株式会社 | Semiconductor integrated circuit device |
JP4459064B2 (en) | 2005-01-14 | 2010-04-28 | キヤノン株式会社 | Solid-state imaging device, control method thereof, and camera |
JP2006197393A (en) | 2005-01-14 | 2006-07-27 | Canon Inc | Solid-state imaging device, driving method thereof and camera |
TWI429066B (en) | 2005-06-02 | 2014-03-01 | Sony Corp | Semiconductor image sensor module and manufacturing method thereof |
US20070013983A1 (en) | 2005-07-04 | 2007-01-18 | Dai Nippon Printing Co., Ltd. | Holographic viewing device, and holographic viewing card incorporating it |
KR100775058B1 (en) | 2005-09-29 | 2007-11-08 | 삼성전자주식회사 | Pixel Cell, Image Sensor Adopting The Pixel Cell, and Image Processing System Including The Image Sensor |
US7608823B2 (en) | 2005-10-03 | 2009-10-27 | Teledyne Scientific & Imaging, Llc | Multimode focal plane array with electrically isolated commons for independent sub-array biasing |
US7546026B2 (en) | 2005-10-25 | 2009-06-09 | Zoran Corporation | Camera exposure optimization techniques that take camera and scene motion into account |
US7652313B2 (en) | 2005-11-10 | 2010-01-26 | International Business Machines Corporation | Deep trench contact and isolation of buried photodetectors |
JPWO2007097287A1 (en) | 2006-02-20 | 2009-07-16 | パナソニック株式会社 | Imaging device and lens barrel |
US7609079B2 (en) | 2006-03-02 | 2009-10-27 | Dialog Semiconductor Gmbh | Probeless DC testing of CMOS I/O circuits |
US7326903B2 (en) | 2006-06-29 | 2008-02-05 | Noble Peak Vision Corp. | Mixed analog and digital pixel for high dynamic range readout |
JP4855192B2 (en) | 2006-09-14 | 2012-01-18 | 富士フイルム株式会社 | Image sensor and digital camera |
US7361989B1 (en) | 2006-09-26 | 2008-04-22 | International Business Machines Corporation | Stacked imager package |
US8107751B2 (en) | 2007-03-16 | 2012-01-31 | Sharp Laboratories Of America, Inc. | DPCM with adaptive range and PCM escape mode |
US7839703B2 (en) | 2007-06-15 | 2010-11-23 | Micron Technology, Inc. | Subtraction circuits and digital-to-analog converters for semiconductor devices |
US7825966B2 (en) | 2007-06-29 | 2010-11-02 | Omnivision Technologies, Inc. | High dynamic range sensor with blooming drain |
US7940311B2 (en) | 2007-10-03 | 2011-05-10 | Nokia Corporation | Multi-exposure pattern for enhancing dynamic range of images |
US8426793B1 (en) | 2007-10-04 | 2013-04-23 | Geoffrey L. Barrows | Vision sensor |
EP2063630A1 (en) | 2007-11-26 | 2009-05-27 | Thomson Licensing | Video capture device with variable shutter integration time |
US8369458B2 (en) | 2007-12-20 | 2013-02-05 | Ralink Technology Corporation | Wireless receiving system with an adaptively configurable analog to digital converter |
WO2009111556A1 (en) | 2008-03-04 | 2009-09-11 | Mesa Imaging Ag | Drift field demodulation pixel with pinned photo diode |
EP2104234B1 (en) | 2008-03-21 | 2011-08-31 | STMicroelectronics Limited | Analog-to-digital conversion in image sensors |
US8089035B2 (en) | 2008-04-16 | 2012-01-03 | Tower Semiconductor Ltd. | CMOS image sensor with high sensitivity wide dynamic range pixel for high resolution applications |
JP4661912B2 (en) | 2008-07-18 | 2011-03-30 | ソニー株式会社 | Solid-state imaging device and camera system |
EP2234387B8 (en) | 2009-03-24 | 2012-05-23 | Sony Corporation | Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus |
JP5306269B2 (en) | 2009-06-25 | 2013-10-02 | キヤノン株式会社 | Imaging apparatus and imaging method using optical coherence tomography |
US8569807B2 (en) | 2009-09-01 | 2013-10-29 | Taiwan Semiconductor Manufacturing Company, Ltd. | Backside illuminated image sensor having capacitor on pixel region |
CN102334293B (en) | 2009-09-11 | 2014-12-10 | 松下电器产业株式会社 | Analog/digital converter, image sensor system, and camera device |
KR101727270B1 (en) | 2009-11-06 | 2017-04-17 | 삼성전자주식회사 | Image sensor |
KR101111946B1 (en) | 2009-12-17 | 2012-02-14 | 엠텍비젼 주식회사 | Imaging device, image signal processor and method for sharing memory among chips |
US8606051B2 (en) | 2010-08-16 | 2013-12-10 | SK Hynix Inc. | Frame-wise calibration of column-parallel ADCs for image sensor array applications |
KR20120029840A (en) | 2010-09-17 | 2012-03-27 | 삼성전자주식회사 | Method of driving an image sensor |
US8928789B2 (en) | 2010-09-30 | 2015-01-06 | Canon Kabushiki Kaisha | Solid-state imaging apparatus |
US8576276B2 (en) | 2010-11-18 | 2013-11-05 | Microsoft Corporation | Head-mounted display device which provides surround video |
TWI462265B (en) | 2010-11-30 | 2014-11-21 | Ind Tech Res Inst | Image capture device |
KR101754131B1 (en) | 2010-12-01 | 2017-07-06 | 삼성전자주식회사 | Sampling circuit, sampling method, and photo detecting apparatus |
US8294077B2 (en) * | 2010-12-17 | 2012-10-23 | Omnivision Technologies, Inc. | Image sensor having supplemental capacitive coupling node |
US8847136B2 (en) | 2011-01-02 | 2014-09-30 | Pixim, Inc. | Conversion gain modulation using charge sharing pixel |
US8717467B2 (en) | 2011-01-25 | 2014-05-06 | Aptina Imaging Corporation | Imaging systems with array cameras for depth sensing |
US8674282B2 (en) | 2011-03-25 | 2014-03-18 | Aptina Imaging Corporation | Pumped pinned photodiode pixel array |
KR101251744B1 (en) | 2011-04-13 | 2013-04-05 | 엘지이노텍 주식회사 | Wdr pixel array, image sensor including the pixel array and method for operating the same |
KR101241704B1 (en) | 2011-04-14 | 2013-03-19 | 엘지이노텍 주식회사 | Pixel, pixel array, image sensor including the same and method for operating the image sensor |
US8575531B2 (en) | 2011-04-26 | 2013-11-05 | Aptina Imaging Corporation | Image sensor array for back side illumination with global shutter using a junction gate photodiode |
JP5808162B2 (en) | 2011-06-23 | 2015-11-10 | キヤノン株式会社 | Imaging device, imaging apparatus, and driving method of imaging device |
JP5868065B2 (en) | 2011-08-05 | 2016-02-24 | キヤノン株式会社 | Imaging device |
JP5901186B2 (en) | 2011-09-05 | 2016-04-06 | キヤノン株式会社 | Solid-state imaging device and driving method thereof |
US20130056809A1 (en) | 2011-09-07 | 2013-03-07 | Duli Mao | Image Sensor with Reduced Noise by Blocking Nitridation Over Selected Areas |
US8461660B2 (en) | 2011-09-30 | 2013-06-11 | Omnivision Technologies, Inc. | CMOS image sensor with reset shield line |
JP2013084785A (en) | 2011-10-11 | 2013-05-09 | Sony Corp | Solid-state imaging device, and imaging device |
JP2013090127A (en) * | 2011-10-18 | 2013-05-13 | Olympus Corp | Solid-state imaging apparatus and imaging apparatus |
US8804021B2 (en) | 2011-11-03 | 2014-08-12 | Omnivision Technologies, Inc. | Method, apparatus and system for providing improved full well capacity in an image sensor pixel |
JP5963421B2 (en) | 2011-11-17 | 2016-08-03 | オリンパス株式会社 | Solid-state imaging device and imaging device |
KR20130062188A (en) | 2011-12-02 | 2013-06-12 | 삼성전자주식회사 | Image sensor, and image processing device including the same |
KR101801339B1 (en) | 2011-12-07 | 2017-11-27 | 한국전자통신연구원 | Fast frequency comparator with wide dynamic range |
US8754798B2 (en) | 2011-12-21 | 2014-06-17 | Realtek Semiconductor Corp. | High-speed successive-approximation-register analog-to-digital converter and method thereof |
US9531990B1 (en) | 2012-01-21 | 2016-12-27 | Google Inc. | Compound prediction using multiple sources or prediction modes |
CN103258829A (en) | 2012-02-16 | 2013-08-21 | 索尼公司 | Solid-state imaging device, image sensor, method of manufacturing image sensor, and electronic apparatus |
JP6164846B2 (en) | 2012-03-01 | 2017-07-19 | キヤノン株式会社 | Imaging device, imaging system, and driving method of imaging device |
JP5965674B2 (en) | 2012-03-05 | 2016-08-10 | オリンパス株式会社 | Solid-state imaging device and imaging device |
US8569700B2 (en) | 2012-03-06 | 2013-10-29 | Omnivision Technologies, Inc. | Image sensor for two-dimensional and three-dimensional image capture |
TW201340708A (en) | 2012-03-19 | 2013-10-01 | Sony Corp | Solid imaging device and electronic equipment |
IN2014DN08388A (en) | 2012-03-30 | 2015-05-08 | Nikon Corp | |
US8957358B2 (en) | 2012-04-27 | 2015-02-17 | Taiwan Semiconductor Manufacturing Company, Ltd. | CMOS image sensor chips with stacked scheme and methods for forming the same |
US9270906B2 (en) | 2012-05-02 | 2016-02-23 | Semiconductor Components Industries, Llc | Exposure time selection using stacked-chip image sensors |
US8779346B2 (en) | 2012-05-14 | 2014-07-15 | BAE Systems Imaging Solutions Inc. | Digital pixel sensor with reduced noise |
JP5885608B2 (en) | 2012-07-23 | 2016-03-15 | 株式会社東芝 | Solid-state imaging device |
JP6071315B2 (en) | 2012-08-08 | 2017-02-01 | オリンパス株式会社 | Solid-state imaging device and imaging device |
US9531961B2 (en) | 2015-05-01 | 2016-12-27 | Duelight Llc | Systems and methods for generating a digital image using separate color and intensity data |
US9185273B2 (en) | 2012-09-19 | 2015-11-10 | Semiconductor Components Industries, Llc | Imaging pixels with improved dynamic range |
US9343497B2 (en) | 2012-09-20 | 2016-05-17 | Semiconductor Components Industries, Llc | Imagers with stacked integrated circuit dies |
US9094612B2 (en) | 2012-09-25 | 2015-07-28 | Semiconductor Components Industries, Llc | Back side illuminated global shutter image sensors with back side charge storage |
US9478579B2 (en) | 2012-10-16 | 2016-10-25 | Omnivision Technologies, Inc. | Stacked chip image sensor with light-sensitive circuit elements on the bottom chip |
CN103730455B (en) * | 2012-10-16 | 2017-04-12 | 豪威科技股份有限公司 | Stacked chip image sensor with light-sensitive circuit elements on the bottom chip |
ES2476115B1 (en) | 2012-12-11 | 2015-04-20 | Consejo Superior De Investigaciones Científicas (Csic) | METHOD AND DEVICE FOR THE DETECTION OF THE TEMPORARY VARIATION OF LIGHT INTENSITY IN A PHOTOSENSOR MATRIX |
US9153616B2 (en) | 2012-12-26 | 2015-10-06 | Olympus Corporation | Solid-state imaging device and imaging device with circuit elements distributed on multiple substrates, method of controlling solid-state imaging device, and imaging device with circuit elements distributed on multiple substrates |
US8773562B1 (en) | 2013-01-31 | 2014-07-08 | Apple Inc. | Vertically stacked image sensor |
KR20140104169A (en) | 2013-02-20 | 2014-08-28 | 삼성전자주식회사 | Image sensor and computing system having the same |
JP2014236183A (en) | 2013-06-05 | 2014-12-15 | 株式会社東芝 | Image sensor device and method of manufacturing the same |
JP6188451B2 (en) | 2013-06-27 | 2017-08-30 | オリンパス株式会社 | Analog-digital converter and solid-state imaging device |
TWI659652B (en) | 2013-08-05 | 2019-05-11 | 新力股份有限公司 | Camera, electronic equipment |
JP5604703B1 (en) | 2013-09-10 | 2014-10-15 | 弘一 関根 | Solid-state imaging device |
US10043843B2 (en) | 2013-10-01 | 2018-08-07 | Forza Silicon Corporation | Stacked photodiodes for extended dynamic range and low light color discrimination |
JP6394056B2 (en) | 2013-11-27 | 2018-09-26 | ソニー株式会社 | A / D conversion device, gray code generation device, imaging device, and electronic device |
KR102210539B1 (en) | 2013-12-26 | 2021-02-01 | 삼성전자주식회사 | correlated double sampling circuit, analog-to digital converter and image sensor including the same |
KR102159261B1 (en) | 2014-01-21 | 2020-09-23 | 삼성전자 주식회사 | Image sensor capable of correcting output signal |
JP2015146364A (en) | 2014-02-03 | 2015-08-13 | ソニー株式会社 | Solid-state imaging element, method of driving the same, method of manufacturing the same, and electronic equipment |
US9832409B2 (en) | 2014-02-07 | 2017-11-28 | National University Corporation Shizuoka University | Image sensor |
CN110233978B (en) | 2014-02-07 | 2022-03-11 | 拉姆伯斯公司 | Feedthrough compensated image sensor |
KR102245973B1 (en) | 2014-02-17 | 2021-04-29 | 삼성전자주식회사 | Correlated double sampling circuit and image sensor including the same |
JP6278730B2 (en) | 2014-02-20 | 2018-02-14 | オリンパス株式会社 | Solid-state imaging device and imaging system |
EP2924979B1 (en) | 2014-03-25 | 2023-01-18 | IMEC vzw | Improvements in or relating to imaging sensors |
TWI656631B (en) | 2014-03-28 | 2019-04-11 | 日商半導體能源研究所股份有限公司 | Imaging device |
US20150287766A1 (en) | 2014-04-02 | 2015-10-08 | Tae-Chan Kim | Unit pixel of an image sensor and image sensor including the same |
US9531976B2 (en) | 2014-05-29 | 2016-12-27 | Semiconductor Components Industries, Llc | Systems and methods for operating image sensor pixels having different sensitivities and shared charge storage regions |
JP2015230355A (en) | 2014-06-04 | 2015-12-21 | リコーイメージング株式会社 | Imaging device and image pickup element |
JP2015231046A (en) | 2014-06-09 | 2015-12-21 | 株式会社東芝 | Solid state image pickup device |
JP6406888B2 (en) | 2014-06-17 | 2018-10-17 | キヤノン株式会社 | Analog-digital conversion circuit driving method, analog-digital conversion circuit, imaging apparatus, imaging system, analog-digital conversion circuit inspection method |
US9699393B2 (en) | 2014-06-26 | 2017-07-04 | Semiconductor Components Industries, Llc | Imaging systems for infrared and visible imaging with patterned infrared cutoff filters |
KR102134636B1 (en) | 2014-07-14 | 2020-07-16 | 삼성전자주식회사 | Unit pixel of image sensor and image sensor having the same |
WO2016014860A1 (en) | 2014-07-25 | 2016-01-28 | Rambus Inc. | Low-noise, high dynamic-range image sensor |
CN106537898B (en) | 2014-07-25 | 2020-07-28 | 株式会社半导体能源研究所 | Image forming apparatus with a plurality of image forming units |
US9344658B2 (en) | 2014-07-31 | 2016-05-17 | Omnivision Technologies, Inc. | Negative biased substrate for pixels in stacked image sensors |
JP6522919B2 (en) | 2014-10-15 | 2019-05-29 | オリンパス株式会社 | Imaging device, imaging device |
US9325335B1 (en) | 2014-10-24 | 2016-04-26 | Teledyne Scientific & Imaging, Llc | Comparator circuits with local ramp buffering for a column-parallel single slope ADC |
US9332200B1 (en) | 2014-12-05 | 2016-05-03 | Qualcomm Incorporated | Pixel readout architecture for full well capacity extension |
KR102410019B1 (en) | 2015-01-08 | 2022-06-16 | 삼성전자주식회사 | Image sensor |
US9515105B2 (en) | 2015-02-18 | 2016-12-06 | Semiconductor Components Industries, Llc | Dual photodiode image pixels with preferential blooming path |
US9524994B2 (en) | 2015-04-14 | 2016-12-20 | Semiconductor Components Industries, Llc | Image sensor pixels with multiple compartments |
US9819882B2 (en) | 2015-06-05 | 2017-11-14 | Caeleste Cvba | Global shutter high dynamic range sensor |
US9848142B2 (en) | 2015-07-10 | 2017-12-19 | Semiconductor Components Industries, Llc | Methods for clocking an image sensor |
TWI704811B (en) | 2015-07-27 | 2020-09-11 | 日商新力股份有限公司 | Solid-state imaging device, its control method, and electronic equipment |
KR102460175B1 (en) | 2015-08-21 | 2022-10-28 | 삼성전자주식회사 | Shared pixel and image sensor including the same |
US10014333B2 (en) | 2015-08-26 | 2018-07-03 | Semiconductor Components Industries, Llc | Back-side illuminated pixels with interconnect layers |
US9909922B2 (en) | 2015-09-03 | 2018-03-06 | Johnson & Johnson Vision Care, Inc. | Anti-aliasing photodetector system |
US9948875B2 (en) | 2015-10-01 | 2018-04-17 | Semiconductor Components Industries, Llc | High dynamic range imaging pixels with improved readout |
US11297258B2 (en) | 2015-10-01 | 2022-04-05 | Qualcomm Incorporated | High dynamic range solid state image sensor and camera system |
US9654712B2 (en) | 2015-10-07 | 2017-05-16 | Semiconductor Components Industries, Llc | Pixels with a global shutter and high dynamic range |
KR102433575B1 (en) | 2015-10-12 | 2022-08-19 | 삼성전자주식회사 | Image sensor |
US9936151B2 (en) | 2015-10-16 | 2018-04-03 | Capsovision Inc | Single image sensor for capturing mixed structured-light images and regular images |
EP3365916B1 (en) | 2015-10-21 | 2020-12-09 | Heptagon Micro Optics Pte. Ltd. | Demodulation pixel devices, arrays of pixel devices and optoelectronic devices incorporating the same |
US9818777B2 (en) | 2015-11-12 | 2017-11-14 | Stmicroelectronics (Research & Development) Limited | Hybrid analog-digital pixel implemented in a stacked configuration |
US9991306B2 (en) | 2015-12-10 | 2018-06-05 | Semiconductor Components Industries, Llc | Hybrid bonded image sensor and method of operating such image sensor |
CN108886048B (en) | 2016-03-31 | 2022-12-16 | 索尼公司 | Imaging device, method for manufacturing imaging device, and electronic device |
US10015416B2 (en) | 2016-05-24 | 2018-07-03 | Semiconductor Components Industries, Llc | Imaging systems with high dynamic range and phase detection pixels |
US9900117B2 (en) | 2016-05-27 | 2018-02-20 | Nxp Usa, Inc. | Communication unit receiver, integrated circuit and method for ADC dynamic range selection |
WO2017214391A1 (en) * | 2016-06-08 | 2017-12-14 | Invisage Technologies, Inc. | Image sensors with electronic shutter |
EP3258683A1 (en) | 2016-06-13 | 2017-12-20 | ams AG | Image sensor and method for operating an image sensor |
US20170366766A1 (en) | 2016-06-16 | 2017-12-21 | Semiconductor Components Industries, Llc | Image sensors having high dynamic range functionalities |
US9967496B2 (en) | 2016-06-30 | 2018-05-08 | Sony Corporation | Active reset circuit for reset spread reduction in single-slope ADC |
CN109314123B (en) | 2016-07-06 | 2023-06-20 | 索尼半导体解决方案公司 | Imaging element, method of manufacturing imaging element, and electronic apparatus |
IL246796B (en) | 2016-07-14 | 2020-05-31 | Semi Conductor Devices An Elbit Systems Rafael Partnership | A dual band photo-detector and a method thereof |
US9979912B2 (en) | 2016-09-12 | 2018-05-22 | Semiconductor Components Industries, Llc | Image sensors with power supply noise rejection capabilities |
US9800260B1 (en) | 2016-11-04 | 2017-10-24 | Analog Devices Global | Method and apparatus to increase dynamic range in delta-sigma ADC using internal feedback across all integrators in loop-filter |
KR20180072134A (en) | 2016-12-21 | 2018-06-29 | 에스케이하이닉스 주식회사 | Analog-digital converting apparatus, and cmos image sensor thereof |
US20190355782A1 (en) | 2017-01-25 | 2019-11-21 | BAE Systems Imaging Solutions Inc. | Imaging array with extended dynamic range |
US20180220093A1 (en) | 2017-02-01 | 2018-08-02 | Renesas Electronics Corporation | Image sensor |
US10686996B2 (en) | 2017-06-26 | 2020-06-16 | Facebook Technologies, Llc | Digital pixel with extended dynamic range |
US10419701B2 (en) | 2017-06-26 | 2019-09-17 | Facebook Technologies, Llc | Digital pixel image sensor |
US10750097B2 (en) | 2017-08-14 | 2020-08-18 | Facebook Technologies, Llc | Varying exposure time of pixels in photo sensor using motion prediction |
US10825854B2 (en) | 2017-08-16 | 2020-11-03 | Facebook Technologies, Llc | Stacked photo sensor assembly with pixel level interconnect |
US10608101B2 (en) | 2017-08-16 | 2020-03-31 | Facebook Technologies, Llc | Detection circuit for photo sensor with stacked substrates |
US10598546B2 (en) | 2017-08-17 | 2020-03-24 | Facebook Technologies, Llc | Detecting high intensity light in photo sensor |
JP6929750B2 (en) | 2017-09-29 | 2021-09-01 | キヤノン株式会社 | Imaging device, imaging system, moving object |
JP7039236B2 (en) | 2017-09-29 | 2022-03-22 | キヤノン株式会社 | Sequential comparison type AD converter, image pickup device, image pickup system, mobile body |
JP7100439B2 (en) | 2017-10-20 | 2022-07-13 | ブリルニクス シンガポール プライベート リミテッド | Solid-state image sensor, solid-state image sensor driving method, and electronic equipment |
US11393867B2 (en) | 2017-12-06 | 2022-07-19 | Facebook Technologies, Llc | Multi-photodiode pixel cell |
US10827142B2 (en) | 2018-03-02 | 2020-11-03 | Facebook Technologies, Llc | Digital pixel array with adaptive exposure |
US10812742B2 (en) | 2018-04-18 | 2020-10-20 | Facebook Technologies, Llc | Apparatus and method for determining whether a photodiode saturates and outputting a digital value representing a charge from that photodiode based on that determination |
US11233085B2 (en) | 2018-05-09 | 2022-01-25 | Facebook Technologies, Llc | Multi-photo pixel cell having vertical gate structure |
US10804926B2 (en) | 2018-06-08 | 2020-10-13 | Facebook Technologies, Llc | Charge leakage compensation in analog-to-digital converter |
US11089210B2 (en) | 2018-06-11 | 2021-08-10 | Facebook Technologies, Llc | Configurable image sensor |
US11463636B2 (en) | 2018-06-27 | 2022-10-04 | Facebook Technologies, Llc | Pixel sensor having multiple photodiodes |
US11956413B2 (en) | 2018-08-27 | 2024-04-09 | Meta Platforms Technologies, Llc | Pixel sensor having multiple photodiodes and shared comparator |
- 2019
- 2019-03-29 US US16/369,763 patent/US11004881B2/en active Active
- 2019-04-01 WO PCT/US2019/025170 patent/WO2019195162A1/en unknown
- 2019-04-01 KR KR1020207030391A patent/KR20200139187A/en not_active Application Discontinuation
- 2019-04-01 EP EP19723902.3A patent/EP3777132A1/en active Pending
- 2019-04-01 CN CN201980024435.0A patent/CN111955002B/en active Active
- 2019-04-01 JP JP2020547367A patent/JP2021517764A/en active Pending
- 2019-04-03 TW TW108111979A patent/TW201943093A/en unknown
- 2021
- 2021-04-21 US US17/236,433 patent/US20210265401A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7830292B2 (en) * | 2005-03-30 | 2010-11-09 | Aptina Imaging Corporation | High density row RAM for column parallel CMOS image sensors |
US20080042046A1 (en) * | 2006-08-21 | 2008-02-21 | Sony Corporation | Physical quantity detection device, method of driving physical quantity detection device, and imaging apparatus |
US20100140732A1 (en) * | 2008-12-09 | 2010-06-10 | Teledyne Scientific & Imaging, Llc | Method and apparatus for backside illuminated image sensors using capacitively coupled readout integrated circuits |
US20120267511A1 (en) * | 2011-04-19 | 2012-10-25 | Altasens, Inc. | Image sensor with hybrid heterostructure |
US20180278865A1 (en) * | 2015-12-04 | 2018-09-27 | Canon Kabushiki Kaisha | Method for driving imaging apparatus |
US20170295338A1 (en) * | 2016-04-11 | 2017-10-12 | Semiconductor Components Industries, Llc | Backside illuminated global shutter pixel with active reset |
US20180227516A1 (en) * | 2017-02-03 | 2018-08-09 | SmartSens Technology (U.S.), Inc. | Stacked image sensor pixel cell with dynamic range enhancement and selectable shutter modes and in-pixel cds |
US20210144330A1 (en) * | 2017-05-10 | 2021-05-13 | Brillnics, Inc. | Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus |
US20190043903A1 (en) * | 2017-08-01 | 2019-02-07 | Semiconductor Components Industries, Llc | Stacked image sensor capacitors and related methods |
Also Published As
Publication number | Publication date |
---|---|
US11004881B2 (en) | 2021-05-11 |
CN111955002A (en) | 2020-11-17 |
JP2021517764A (en) | 2021-07-26 |
KR20200139187A (en) | 2020-12-11 |
WO2019195162A1 (en) | 2019-10-10 |
US20190305020A1 (en) | 2019-10-03 |
EP3777132A1 (en) | 2021-02-17 |
CN111955002B (en) | 2023-05-02 |
TW201943093A (en) | 2019-11-01 |
Similar Documents
Publication | Title |
---|---|
US20210265401A1 (en) | Global shutter image sensor |
US11863886B2 (en) | Pixel sensor having multiple photodiodes |
US10917589B2 (en) | Digital pixel with extended dynamic range |
US11956413B2 (en) | Pixel sensor having multiple photodiodes and shared comparator |
US11102430B2 (en) | Pixel sensor having multiple photodiodes |
US10903260B2 (en) | Multi-photodiode pixel cell |
US10931884B2 (en) | Pixel sensor having adaptive exposure time |
US10686996B2 (en) | Digital pixel with extended dynamic range |
US10834344B2 (en) | Digital pixel with extended dynamic range |
US10923523B2 (en) | Multi-photodiode pixel cell |
WO2019113278A1 (en) | Multi-photodiode pixel cell |
US20200396399A1 (en) | Non-linear quantization at pixel sensor |
WO2019241238A1 (en) | Pixel cell with multiple photodiodes |
US20230092325A1 (en) | Digital pixel sensor |
US11877080B2 (en) | Pixel sensor having shared readout structure |
US11974044B2 (en) | Pixel sensor having adaptive exposure time |
Legal Events
Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |