WO2023039784A1 - Method and event-based vision sensor for acquiring event-based image during blanking time of frame-based image read-out


Info

Publication number: WO2023039784A1
Application number: PCT/CN2021/118673
Authority: WIPO (PCT)
Other languages: French (fr)
Inventor: Takao Ishii
Original Assignee: Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2021/118673
Publication of WO2023039784A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/47: Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
    • H04N25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/616: Noise processing involving a correlated sampling function, e.g. correlated double sampling [CDS] or triple sampling
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/703: SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/707: Pixels for event detection

Definitions

  • the embodiments of the present invention provide operation methods and circuits for mitigating interference from an event-based operating component to a synchronized (frame-based) operating component.
  • First of all, examples of the frame-based operating component and the event-based operating component are explained separately.
  • Fig. 1 shows an example of a frame-based pixel circuit.
  • This circuit includes a typical pixel circuit and an ADC.
  • a frame-based image sensor has i x j pixels that are arranged in i columns and j rows.
  • the pixel circuit includes a photodiode (PD), a transfer gate transistor (TX), a reset transistor (RST), a source follower amplifier transistor (AMP), a row select transistor (SEL), and a current source that is connected between a source of the SEL and ground.
  • the ADC is connected between the source of the SEL and the current source.
  • a pixel signal (PIXEL), which is an analog signal, is input into an inverted input of a comparator (CMP) via a capacitor for compensating the operating range difference between them (the output range of the pixel signal and the input range of the comparator).
  • a ramp signal (RAMP) generated by a RAMP generator is input into a non-inverted input of the CMP via a capacitor for compensating the operating range difference between them (the output range of the ramp signal and the input range of the comparator).
  • the RAMP generator outputs a RAMP waveform.
  • the RAMP generator may be shared by the ADCs for pixels placed in the same column.
  • An output of the CMP is input into a counter (CNT) , and the count value is stored in a Latch (not shown in Fig. 1) .
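The comparator/counter behaviour described above can be sketched as a small simulation (an illustrative model of a single-slope ADC, not circuitry from the patent; the millivolt ramp parameters are made-up values):

```python
def single_slope_convert(pixel_mv, ramp_start_mv=1000, step_mv=1, max_counts=1024):
    """Count clock cycles until the descending RAMP crosses the pixel level."""
    ramp = ramp_start_mv
    for count in range(max_counts):
        if ramp <= pixel_mv:     # comparator (CMP) output flips at the crossing point
            return count         # the counter (CNT) value is latched as the digital output
        ramp -= step_mv          # the RAMP generator steps the reference down linearly
    return max_counts            # pixel level below the ramp range: full-scale count

print(single_slope_convert(500))
```

A lower pixel voltage takes longer to reach, so it yields a larger count; this is the conversion each descending slope in Fig. 2 performs.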
  • Fig. 2 shows an example of a typical RAMP waveform used for Correlated Double Sampling (CDS) .
  • the solid line shows the ramp signal (RAMP)
  • the dashed line shows pixel output at a reset level.
  • the reset level and the signal level are sampled, respectively, as follows: the first descending slope is used for AD conversion of the pixel output at the reset level, and the second descending slope is used for AD conversion of the pixel output at the signal level.
  • the CNT starts counting at the beginning of the first descending slope, and counts until the first descending slope crosses the dashed line. This count value corresponds to the reset level. If there is no signal from the pixel, the signal level is identical to the reset level, namely, the level of the dashed line does not change.
  • the CNT starts counting at the beginning of the second descending slope, and counts until the second descending slope crosses the dashed line. If there is a signal from the pixel, the dashed line drops below the reset level to the level corresponding to the signal. This count value corresponds to the signal level.
  • the output level is defined as the difference between the count value corresponding to the signal level and the count value corresponding to the reset level.
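The CDS output defined above reduces to a subtraction of the two counts; a minimal sketch with illustrative count values (not figures from the patent):

```python
def cds_output(reset_count, signal_count):
    """Correlated Double Sampling: the digital output is the difference between
    the signal-level count and the reset-level count, which cancels the pixel's
    reset (offset) level from the result."""
    return signal_count - reset_count

# No light: the signal level equals the reset level, so the output is zero.
print(cds_output(120, 120))
# With light: the pixel level drops below the reset level, giving a larger count.
print(cds_output(120, 470))
```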
  • The following describes Embodiment 1 of the present invention.
  • Fig. 3 shows examples of an event-based pixel circuit 100 and an arbiter circuit 200. It is assumed that the event-based pixel circuit 100 in Fig. 3 is located at the i th column and the j th row in the event-based vision sensor. For example, an event-based pixel circuit 100 may be provided for each pixel in the frame-based image sensor.
  • the arbiter circuit 200 includes a column arbiter 210 and a row arbiter 220. Column lines C1, C2, and C3 are connected to the column arbiter 210. Row lines R1 and R2 are connected to the row arbiter 220. These column and row lines also have large load capacitance and may induce a rush current when the signals on them toggle.
  • the event-based pixel circuit 100 includes an event detector 110, transistors 121 to 124, AND gates 131 to 133, and a delay 140.
  • a photodiode may be provided for each event detector 110, or may be shared by the event detector 110 and the frame-based pixel circuit (not shown in Fig. 3) .
  • the event detector 110 converts photodiode current to voltage logarithmically, detects voltage variation, and outputs "ON event” when the voltage variation exceeds a predetermined threshold voltage or outputs "OFF event” when the voltage variation falls below a predetermined threshold voltage.
  • if the event detector 110 outputs "ON event" (High level), the transistor 121 is turned on, and the voltage of the row line R1 of the j th row becomes Low level (hereinafter referred to as "RowReq"). "ON event" (High level) is maintained until the column arbiter 210 outputs "ColAck" on the column line C3 and the event detector 110 receives a reset signal. "RowReq" is detected by the row arbiter 220. The row arbiter 220 arbitrates "RowReq" from a plurality of rows, and outputs "RowAck" to the row line R2 one-by-one.
  • if the row arbiter 220 outputs "RowAck", "ON event" and "RowAck" are input to the AND gate 131, the output of the AND gate 131 becomes High, the transistor 123 is turned on, and the voltage of the column line C1 of the i th column becomes Low. In this way, the arbiter circuit 200 detects "ON event" at the i th column and the j th row.
  • when the column arbiter 210 outputs "ColAck" on the column line C3, "RowAck" and "ColAck" are input to the AND gate 133, the output of the AND gate 133 becomes High, and the delay 140 receives the High level signal from the AND gate 133, delays it, and outputs it to the "RESET" terminal of the event detector 110.
  • the event detector 110 then resets "ON event": the voltage of the "ON" terminal of the event detector 110 becomes Low, the output of the AND gate 131 becomes Low, the transistors 121 and 123 are turned off, and the voltages of the row line R1 and the column line C1 become High.
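The hand-shake sequence described in the preceding bullets can be modelled in software as follows (a simplified sketch; the function name and the event-set representation are our own, and real arbiters resolve requests asynchronously rather than in sorted order):

```python
def handshake(pixel_events):
    """pixel_events: set of (col, row) pairs whose event detector holds an "ON event".
    Returns the acknowledge/reset sequence produced by the arbiters."""
    log = []
    # The row arbiter grants "RowAck" to requesting rows one at a time.
    for row in sorted({r for (_, r) in pixel_events}):
        log.append(("RowAck", row))
        # Pixels on the granted row drive their column lines; the column
        # arbiter acknowledges each requesting column with "ColAck".
        for col in sorted(c for (c, r) in pixel_events if r == row):
            log.append(("ColAck", col))
            log.append(("RESET", (col, row)))  # delayed reset clears the pixel's event
    return log

print(handshake({(1, 2), (3, 2)}))
```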
  • because a hand-shake protocol operates in the vertical and horizontal directions as soon as an event is detected, charging and discharging current flows in long control lines during each event detection, and a large loop current may be induced by the charging and discharging of these signal lines.
  • Embodiment 1 of the present invention provides an implementation of the arbiter circuit for row and column directions of an asynchronous type event-based vision sensor.
  • Fig. 4 shows an example of the arbiter circuit 201 according to an embodiment of the present invention.
  • the row line R1 is connected to the row arbiter 221, and the column lines C1 and C2 are connected to the column arbiter 211.
  • the column lines C1 and C2 and the row line R1 are pulled down, namely, connected to ground. Charging and discharging current are suppressed by pulling down the control lines.
  • these arbiters are disabled by disconnecting them from the vertical and horizontal signal lines running over the columns and the rows, namely by turning off switches 213, 215, and 223 and turning on switches 212, 214, and 222. Furthermore, when these arbiters are disconnected from the vertical and horizontal signal lines, these lines may be pulled down or up to avoid charge/discharge current induced by event detection.
  • Event data acquisition during a blanking period of a cycle of a read-out operation for a frame-based image is performed by the following procedure: (1) acquiring event data when it is in a period other than an AD conversion period, (2) obtaining a time stamp for each piece of event data based on the AD conversion period, and (3) disconnecting the peripheral circuit (the arbiter circuit 201 in Fig. 4) from the event-based pixel circuit 100 during the AD conversion period, by cutting the current path for row and column communication of event data; the event detecting function of each pixel is not disabled.
  • the event-based pixel circuit 100 continues to detect "ON event" or "OFF event" and to output "RowReq", and this "RowReq" is detected and arbitrated by the row arbiter 221 when the switches 213, 215, and 223 are turned on.
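Procedures (1) to (3) above can be sketched as a small queue model (our own illustration; the name `drain_events` and the timeline representation are hypothetical): pixels keep detecting at any time, but events raised during an AD conversion are held in the pixel and drained in the next enable window.

```python
def drain_events(timeline):
    """timeline: list of ("event", pixel), ("adc_start",) or ("adc_end",) items.
    Returns the order in which events reach the peripheral circuit."""
    pending, read_out, adc_busy = [], [], False
    for item in timeline:
        if item[0] == "adc_start":
            adc_busy = True              # "Event Trans Disable": lines disconnected
        elif item[0] == "adc_end":
            adc_busy = False
            read_out.extend(pending)     # drain events held during the conversion
            pending.clear()
        elif adc_busy:
            pending.append(item[1])      # pixel holds the detected event
        else:
            read_out.append(item[1])     # "Event Trans Enable": immediate hand-shake
    return read_out
```

Note that no event is lost; its read-out is merely deferred past the noise-sensitive AD conversion period.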
  • the waveform diagram in Fig. 5 shows an operation according to the embodiment of the present invention.
  • “Hsync” means a synchronous signal for an ADC operation for a column parallel ADC (AD conversions for all the columns are performed in parallel) in the frame-based image sensor.
  • said ADC is a single slope ADC
  • a "RAMP waveform” indicates its reference voltage waveform.
  • an analog input level is measured by a counter as the time at which the RAMP level crosses the analog input, and the count value at this time becomes the digital output.
  • two RAMP descending slopes are included in one ADC period.
  • Event-based vision data is a data array including at least an event position in an image area and its event time stamp.
  • the time stamp is stepped up based on "Hsync” .
  • the time stamp is N after the first "Hsync”
  • the time stamp is N+1 after the second "Hsync” .
  • the time stamp error caused by said alternating operations of "Event Trans Enable" and "Event Trans Disable" is negligible, because the maximum time stamp error in this operation is one cycle of "Hsync", as can be seen from Fig. 6, in which T. S. means Time Stamp.
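The time-stamping rule above can be sketched as a counter incremented on each "Hsync", with every read-out event tagged by the current value (an illustrative model; `stamp_events` and the stream representation are our own):

```python
def stamp_events(stream):
    """stream: iterable of the token "hsync" or event names, in arrival order.
    Returns (event, time_stamp) pairs."""
    stamp, out = 0, []
    for item in stream:
        if item == "hsync":
            stamp += 1                  # time stamp is stepped up on "Hsync"
        else:
            out.append((item, stamp))   # tag the event with the current stamp
    return out

print(stamp_events(["e1", "hsync", "e2", "e3", "hsync", "e4"]))
# → [('e1', 0), ('e2', 1), ('e3', 1), ('e4', 2)]
```

Because events are only tagged at read-out, two events in the same Hsync interval share a stamp, bounding the error at one Hsync cycle.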
  • FIG. 6 shows a read out operation of an event-based image and a frame-based image.
  • Waveforms of "Hsync” and ADC are shown in the upper part of Fig. 6.
  • values of the time stamp are shown over time. The value of the time stamp is incremented when "Hsync" becomes low level.
  • Vertical axis "Y" corresponds to row number. Rectangles denoted by “R.O. ” (read out) show that the read-out operation of the frame-based image sensor is performed for the corresponding row.
  • Fig. 6 shows that the read-out operation is performed from the bottom row to the top row. This read-out order is merely an example.
  • the dotted areas show the "Event Trans Disable” period, namely, event transactions for all of the rows are disabled.
  • the dotted areas range from the bottom row to the top row including the areas behind the rectangles denoted by “R.O. ” .
  • Slanted line areas show the "Event Trans Enable” period, namely, event transactions for any of the rows, on which "ON event” or "OFF event” is detected, are enabled.
  • the slanted line areas range from the bottom row to the top row including the areas behind the rectangles denoted by “R.O. ” .
  • the order of the event transactions is arbitrated by the row arbiter 220. As shown by the two-way arrow at the bottom of Fig. 6, the events that are detected during the period from the end of an "Event Trans Enable" period to the beginning of the next "Event Trans Enable" period are assigned time stamp "N+1".
  • the event detector 110 is reset by the control signal coming from the arbiter circuit 200, after a hand-shake protocol.
  • this operation allows pixels to continue to detect an event even during the "Event Trans Disable” period. Furthermore, it is possible to read out the event detecting results of each pixel in the next "Event Trans Enable” period. According to this operation, it is possible to suppress interference in the operation of the frame-based image sensor from rush current induced by asynchronous control signals.
  • The following describes Embodiment 2 of the present invention. Event detection statuses of all the pixels are checked, namely, whether an event is detected or not is checked for every pixel in every "Event Trans Enable" period.
  • each pixel can detect an event for itself even in the "Event Trans Disable” period, and this event data is output during the next "Event Trans Enable” period.
  • Fig. 7 shows examples of an event-based pixel circuit 101 and a scanner circuit 202. It is assumed that the event-based pixel circuit 101 in Fig. 7 is located at the i th column and the j th row in the event-based vision sensor.
  • the scanner circuit 202 (a peripheral circuit) includes a column arbiter/scanner 230 and a row scanner 240. Column lines C1, C2, and C3 are connected to the column arbiter/scanner 230, and the lines C1 and C2 are also connected to ground via capacitors.
  • Row line R2 is connected to the row scanner 240. Compared with the circuit shown in Fig. 3, the row line R1 and the transistors 121 and 122 are not provided.
  • the event-based pixel circuit 101 includes an event detector 110, transistors 123 and 124, AND gates 131 and 132, and a delay 140.
  • the row scanner 240 outputs "RowSel” sequentially, for example, from the bottom row to the top row. If the event detector 110 outputs "ON event” and the row scanner 240 outputs "RowSel” , the output of the AND gate 131 becomes High, the transistor 123 is turned on, and the voltage of the column line C1 of the i th column becomes Low. In this way, the scanner circuit 202 detects "ON event" at the i th column and the j th row.
  • the column arbiter/scanner 230 outputs "ColAck” on line C3, "RowSel” and “ColAck” are input to the AND gate 133, the output of the AND gate 133 becomes High, the delay 140 receives the High level signal from the AND gate 133, and delays and outputs it to a "RESET” terminal of the event detector 110.
  • the event detector 110 resets "ON event” , namely, the voltage of "ON" terminal of the event detector 110 becomes Low.
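The scanner read-out described above can be modelled as a fixed-order scan (a simplified sketch; the dictionary representation of latched events is our own): unlike the arbiter of Embodiment 1, the order is determined by "RowSel", not by arrival order.

```python
def scan_readout(event_map, num_rows):
    """event_map: dict mapping row -> set of columns whose pixel holds an event.
    Returns (col, row) pairs in scan order; scanned events are cleared."""
    out = []
    for row in range(num_rows):                  # "RowSel": bottom row to top row
        for col in sorted(event_map.get(row, ())):
            out.append((col, row))               # "ColAck" and delayed "RESET" per pixel
        event_map.pop(row, None)                 # events on this row are now cleared
    return out

print(scan_readout({0: {2}, 1: {0, 3}}, 3))
```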
  • the row scanner 240 is kept on during the "Event Trans Disable” period.
  • because a hand-shake protocol is implemented in the vertical and horizontal directions as soon as an event is detected, charging and discharging current flows in long control lines during each event detection, and a large loop current may be induced by the charging and discharging of these signal lines.
  • Embodiment 2 of the present invention provides an implementation of the scanner circuit for row and column directions of an asynchronous type event-based vision sensor.
  • Fig. 8 shows an example of the scanner circuit 203 according to an embodiment of the present invention. During the periods shown as "Event Trans Disable" in Fig. 5, the row scanner 241 is turned off and the column lines C1 and C2 are terminated. Pull-up resistors on the column lines C1 and C2 automatically pull them up.
  • Fig. 9 shows a read-out operation of an event-based image and a frame-based image.
  • Vertical axis "Y" corresponds to row number. Rectangles denoted by “R.O. ” (read out) show that the read-out operation of the frame-based image sensor is performed for the corresponding row.
  • Fig. 9 shows that the read-out operation is performed from the bottom row to the top row. This read-out order is merely an example.
  • the dotted areas show the "Event Trans Disable" period, namely, event transactions for all of the rows are disabled.
  • the dotted areas range from the bottom row to the top row including the areas behind the rectangles denoted by “R.O. ” .
  • Slanted line areas show the "Event Trans Enable” period, namely, event transactions for all the rows are performed from the bottom row to the top row, as shown by the tilt of the slanted line areas.
  • the slanted line areas range from the bottom row to the top row including the areas behind the rectangles denoted by “R.O. ” .
  • events which are detected after the top row is selected in the "Event Trans Enable” period and before the top row is selected in the next "Event Trans Enable” period, are assigned time stamp “N+1” .
  • event detection statuses of each pixel are scanned during the "Event Trans Enable" period.
  • the status of a pixel may be represented with 1 or 2 bits. Thus, it is possible to scan all of the rows during a short period of time.
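The remark about 1- or 2-bit statuses can be illustrated by packing a row of statuses into a single word, which is why a whole row can be scanned quickly (a hypothetical encoding; the bit patterns below are our own choice, not from the patent):

```python
# 2-bit status per pixel: no event / "ON event" / "OFF event"
NONE, ON, OFF = 0b00, 0b01, 0b10

def pack_row(statuses):
    """Pack per-pixel 2-bit statuses into one integer word for a row."""
    word = 0
    for i, s in enumerate(statuses):
        word |= s << (2 * i)
    return word

def unpack_row(word, width):
    """Recover the per-pixel statuses from a packed row word."""
    return [(word >> (2 * i)) & 0b11 for i in range(width)]

row = pack_row([ON, NONE, OFF, ON])
print(unpack_row(row, 4))   # → [1, 0, 2, 1]
```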
  • turning off the row scanner can suppress generation of a large loop current, and it is possible to run asynchronous operations with suppression of interference, even in noise-sensitive periods.
  • event data is acquired in the H-blank (the blanking time of the ADC periods) of the frame-based read-out for n lines, which makes it possible to reduce crosstalk noise in the ADC results of frame-based images.
  • the prior art proposed a hybrid pixel structure for acquiring a frame-based image and an event-based image, and also provided a crosstalk mitigating technique based on ADC operation periodicity.
  • the embodiments of the present invention provide techniques to expand the noise mitigating system for a system involving an asynchronous operation like an event-based vision sensor.
  • the beneficial effect of the embodiments of the present invention can be summarized as follows:
  • the embodiments address the technical problem of crosstalk noise between synchronous and asynchronous systems, e.g. a stacked hybrid pixel for a frame-based image and an event-based vision sensor.
  • the embodiments of the present invention provide alternative operations for event detection communication and A/D conversion of frame-based images.
  • the technical effect of the embodiments of the present invention is that event detection can run continuously without causing crosstalk noise in the frame-based image.

Abstract

A driving method of an event-based vision sensor for use with a circuit for a synchronous operation is provided. The method includes: detecting an event as a light illuminance change during the entire synchronous operation, except for a period during which an event-based pixel circuit detects said event and gets reset; reading said event information during a part of the synchronous operation when reading out event data; and suspending communication to read said event information between the event-based pixel circuit and a peripheral circuit of the event-based vision sensor during the other part of the synchronous operation. The method achieves suppression of interference in the operation of the frame-based image sensor from a rush current induced by asynchronous control signals.

Description

[Title established by the ISA under Rule 37.2] METHOD AND EVENT-BASED VISION SENSOR FOR ACQUIRING EVENT-BASED IMAGE DURING BLANKING TIME OF FRAME-BASED IMAGE READ-OUT

TECHNICAL FIELD
The present application relates to a method and an event-based vision sensor for acquiring an event-based image.
BACKGROUND
Conventional image data is obtained by a frame-based image sensor. Frame-based image data is light intensity data for each pixel, frame by frame. It is expected that an event-based vision sensor, which is also called a Dynamic Vision Sensor (DVS), will be applied mainly as a component of a new camera system for mobile devices because of its change detection function with low latency (high speed), low power consumption, and high dynamic range. For example, its change detection function is expected to achieve the following functions:
(1) Feature point extraction for Simultaneous Localization And Mapping (SLAM) with low latency, high dynamic range, and low power consumption, which allows mobile devices to achieve an indoor navigation system with high traceability.
(2) Fast detection of moving objects which achieves reconstruction of a high-speed movie and a high-resolution image, to compensate for motion blur and to interpolate inter-frame images that exceed a frame rate limit of a conventional frame-based image sensor.
However, event-based vision sensors cannot obtain conventional images by themselves. Accordingly, their application has been limited to machine vision, including automotive applications. Namely, the above applications require another camera that takes frame-based images. In addition, a registration process for aligning the two types of images may require a large amount of computation.
In order to align two images more accurately, a hybrid architecture was already proposed. A photodiode is shared by a pixel source follower of a frame-based image sensor and an event detector of an event-based vision sensor. Thus, this image sensor can obtain two types of images through the same optical system. This means there is no need for a registration process to align two types of images for these image sensors.
However, these techniques require implementing the pixel source followers and the event detectors close to each other. Thus, there are concerns about interference between their operations. In particular, the event detector's digital operation can induce a rush current, which may cause fluctuations in the pixel source follower output level. Furthermore, communication between the in-pixel event detector and the peripheral encoders creates a large current loop, which may produce electromagnetic noise in the pixel source follower output.
To prevent other components from interfering with the pixel source follower and an Analog to Digital Converter (ADC) in the frame-based image sensor, a synchronization technique has been proposed. It is possible to mitigate the interference in the pixel source follower and the ADC by keeping the state of the component the same in the two conversion phases of Correlated Double Sampling (CDS), namely digital output acquisition for an already-known level and an unknown level.
The operation method according to the prior art can be applied in completely synchronized systems only. Therefore, it is difficult to apply this technique in systems involving asynchronous components, like an event-based sensor output that detects a signal when a pixel detects illuminance level changes.
SUMMARY
A method and an event-based vision sensor for acquiring an event-based image during blanking time of frame-based image read-out is provided to achieve suppression of interference during the operation of the frame-based image sensor from a rush current induced by asynchronous control signals.
According to a first aspect, a driving method of an event-based vision sensor for use with a circuit for a synchronous operation is provided, where the method includes:
detecting an event as a light illuminance change during the entire synchronous operation, except for a period during which an event-based pixel circuit detects said event and gets reset;
reading said event information during a part of the synchronous operation when reading out event data; and
suspending communication to read said event information between the event-based pixel circuit and a peripheral circuit of the event-based vision sensor during the other part of the synchronous operation.
In a possible implementation of the first aspect, communication between the event-based pixel circuit and the peripheral circuit is suspended by disconnecting row and column lines from the peripheral circuit.
In a possible implementation of the first aspect, communication between the event-based pixel circuit and the peripheral circuit is suspended by turning off the peripheral circuit.
In a possible implementation of the first aspect, the method further comprises assigning a time stamp to the event data based on a clock for driving the synchronous operation.
In a possible implementation of the first aspect, the time stamp is incremented based on a synchronous signal for an Analog to Digital Converter (ADC) operation of a frame-based image sensor.
In a possible implementation of the first aspect, Correlated Double Sampling is performed during the synchronous operation, and the part of the synchronous operation begins at the end of a second descending slope and ends at the beginning of a first descending slope.
According to a second aspect, an event-based vision sensor for use with a circuit for a synchronous operation is provided, where the event-based vision sensor includes:
an event-based pixel circuit configured to detect an event as a light illuminance change during the entire synchronous operation, except for a period during which an event-based pixel circuit detects said event and gets reset; and
a peripheral circuit configured to read said event information during a part of the synchronous operation when reading out event data, and suspend communication with the event-based pixel circuit to read said event information during the other part of the synchronous operation.
In a possible implementation of the second aspect, communication with the event-based pixel circuit is suspended by disconnecting row and column lines from the peripheral circuit.
In a possible implementation of the second aspect, communication with the event-based pixel circuit is suspended by turning off the peripheral circuit.
In a possible implementation of the second aspect, the peripheral circuit is further configured to assign a time stamp to the event data based on a clock for driving the synchronous operation.
In a possible implementation of the second aspect, the time stamp is incremented based on a synchronous signal for an Analog to Digital Converter (ADC) operation of a frame-based image sensor.
In a possible implementation of the second aspect, Correlated Double Sampling is performed during the synchronous operation, and the part of the synchronous operation begins at the end of a second descending slope and ends at the beginning of a first descending slope.
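For illustration only (not part of the disclosure), the read-out window defined in this implementation can be sketched as a predicate over the phase within one ADC period. This is a hypothetical Python model; the normalized phase values and slope boundaries are illustrative assumptions:

```python
def event_readout_enabled(phase, first_slope_start, second_slope_end):
    """True during the blanking time: from the end of the second descending
    slope to the beginning of the next first descending slope."""
    # phase is normalized to [0, 1) within one ADC period; both descending
    # slopes of the CDS ramp fall inside [first_slope_start, second_slope_end).
    return not (first_slope_start <= phase < second_slope_end)

# Illustrative boundaries: the first slope starts at phase 0.1 and
# the second slope ends at phase 0.9.
print(event_readout_enabled(0.95, 0.1, 0.9))  # True  (blanking: event read-out)
print(event_readout_enabled(0.20, 0.1, 0.9))  # False (AD conversion in progress)
```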
BRIEF DESCRIPTION OF DRAWINGS
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. The accompanying drawings in the following description merely show some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative effort.
FIG. 1 shows an example of a frame-based pixel circuit;
FIG. 2 shows an example of a typical RAMP waveform used for Correlated Double Sampling (CDS) ;
FIG. 3 shows examples of an event-based pixel circuit 100 and an arbiter circuit 200;
FIG. 4 shows an example of an arbiter circuit 201 according to an embodiment of the present invention;
FIG. 5 shows an operation according to the embodiment of the present invention;
FIG. 6 shows a read-out operation of an event-based image and a frame-based image;
FIG. 7 shows examples of an event-based pixel circuit 101 and a scanner circuit 202;
FIG. 8 shows an example of a scanner circuit 203 according to an embodiment of the present invention; and
FIG. 9 shows a read-out operation of an event-based image and a frame-based image.
DESCRIPTION OF EMBODIMENTS
The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings of the embodiments of the present invention. The described embodiments are only some but not all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without requiring creative effort shall fall within the protected scope of the present invention.
The embodiments of the present invention provide operation methods and circuits for mitigating interference from an event-based operating component to a synchronized (frame-based) operating component. First of all, examples of the frame-based operating component and the event-based operating component are explained separately.
Fig. 1 shows an example of a frame-based pixel circuit. This circuit includes a typical pixel circuit and an ADC. For example, a frame-based image sensor has i x j pixels arranged in i columns and j rows. The pixel circuit includes a photodiode (PD), a transfer gate transistor (TX), a reset transistor (RST), a source follower amplifier transistor (AMP), a row select transistor (SEL), and a current source connected between the source of the SEL and ground. The ADC is connected between the source of the SEL and the current source. A pixel signal (PIXEL), which is an analog signal, is input into the inverted input of a comparator (CMP) via a capacitor that compensates for the difference between the output range of the pixel signal and the input range of the comparator, and a ramp signal (RAMP) generated by a RAMP generator is input into the non-inverted input of the CMP via a capacitor that compensates for the difference between the output range of the ramp signal and the input range of the comparator. The RAMP generator outputs a RAMP waveform and may be shared by the ADCs for pixels placed in the same column. The output of the CMP is input into a counter (CNT), and the count value is stored in a latch (not shown in Fig. 1).
Fig. 2 shows an example of a typical RAMP waveform used for Correlated Double Sampling (CDS). The solid line shows the ramp signal (RAMP), and the dashed line shows the pixel output at the reset level. The reset level and the signal level are sampled as follows: the first descending slope is used for AD conversion of the pixel output at the reset level, and the second descending slope is used for AD conversion of the pixel output at the signal level. The CNT starts counting at the beginning of the first descending slope and counts until the first descending slope crosses the dashed line; this count value corresponds to the reset level. If there is no signal from the pixel, the signal level is identical to the reset level, namely, the level of the dashed line does not change. If there is a signal from the pixel, the dashed line drops below the reset level to the level corresponding to the signal. The CNT starts counting at the beginning of the second descending slope and counts until the second descending slope crosses the dashed line; this count value corresponds to the signal level. The output level is defined as the difference between the count value corresponding to the signal level and the count value corresponding to the reset level.
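For illustration only, the counting scheme described above can be modeled in simulation. This is a hypothetical Python sketch; the ramp start level, slope per count, and pixel levels are illustrative assumptions, not values from the disclosure:

```python
def slope_count(start_level, slope_per_tick, pixel_level, max_ticks=2048):
    """Count ticks until a descending ramp crosses the held pixel level."""
    for tick in range(max_ticks):
        if start_level - tick * slope_per_tick <= pixel_level:
            return tick
    return max_ticks

def cds_output(reset_level, signal_level, start_level=1000.0, slope_per_tick=1.0):
    # First descending slope: AD conversion of the pixel output at the reset level.
    reset_count = slope_count(start_level, slope_per_tick, reset_level)
    # Second descending slope: AD conversion of the pixel output at the signal level.
    signal_count = slope_count(start_level, slope_per_tick, signal_level)
    # CDS output: the difference cancels the per-pixel reset-level offset.
    return signal_count - reset_count

# Two pixels with different reset offsets but the same 100-unit signal swing
# produce the same CDS output.
print(cds_output(900.0, 800.0))  # 100
print(cds_output(950.0, 850.0))  # 100
```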
The following describes Embodiment 1 of the present invention.
Fig. 3 shows examples of an event-based pixel circuit 100 and an arbiter circuit 200. It is assumed that the event-based pixel circuit 100 in Fig. 3 is located at the i-th column and the j-th row in the event-based vision sensor. For example, an event-based pixel circuit 100 may be provided for each pixel in the frame-based image sensor. The arbiter circuit 200 includes a column arbiter 210 and a row arbiter 220. Column lines C1, C2, and C3 are connected to the column arbiter 210, and row lines R1 and R2 are connected to the row arbiter 220. These column and row lines also have large load capacitance and may induce rush current when the signals on them toggle.
The event-based pixel circuit 100 includes an event detector 110, transistors 121 to 124, AND gates 131 to 133, and a delay 140. A photodiode may be provided for each event detector 110, or may be shared by the event detector 110 and the frame-based pixel circuit (not shown in Fig. 3). The event detector 110 converts the photodiode current to a voltage logarithmically, detects voltage variation, and outputs an "ON event" when the voltage variation exceeds a predetermined threshold voltage or an "OFF event" when the voltage variation falls below a predetermined threshold voltage. If the event detector 110 outputs "ON event" (High level), the transistor 121 is turned on, and the voltage of the row line R1 of the j-th row becomes Low level (hereinafter referred to as "RowReq"). "ON event" (High level) is maintained until the column arbiter 210 outputs "ColAck" on the column line C3 and the event detector 110 receives a reset signal. "RowReq" is detected by the row arbiter 220. The row arbiter 220 arbitrates "RowReq" from a plurality of rows and outputs "RowAck" to the row line R2 one by one. If the row arbiter 220 outputs "RowAck", "ON event" and "RowAck" are input to the AND gate 131, the output of the AND gate 131 becomes High, the transistor 123 is turned on, and the voltage of the column line C1 of the i-th column becomes Low. In this way, the arbiter circuit 200 detects "ON event" at the i-th column and the j-th row. Then, the column arbiter 210 outputs "ColAck" on the column line C3, "RowAck" and "ColAck" are input to the AND gate 133, the output of the AND gate 133 becomes High, and the delay 140 receives the High level signal from the AND gate 133 and outputs it, after a delay, to the "RESET" terminal of the event detector 110. The event detector 110 then resets "ON event", namely, the voltage of the "ON" terminal of the event detector 110 becomes Low, the output of the AND gate 131 becomes Low, the transistors 121 and 123 are turned off, and the voltages of the row line R1 and the column line C1 become High.
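For illustration only, the ordering of the hand-shake described above can be sketched as an event trace. This is a hypothetical Python model of signal ordering only; electrical behavior is not modeled, and the step labels follow the signal names in the description:

```python
def on_event_handshake(col, row):
    """Ordered hand-shake steps for one "ON event" at (col, row)."""
    return [
        ("RowReq", row),           # pixel pulls row line R1 low
        ("RowAck", row),           # row arbiter grants one row at a time
        ("ColReq", col),           # event AND RowAck pulls column line C1 low
        ("ColAck", col),           # column arbiter acknowledges on C3
        ("RESET", (col, row)),     # delayed acknowledge resets the event detector
        ("release", (col, row)),   # row and column lines return to High
    ]

trace = on_event_handshake(col=3, row=7)
print([step for step, _ in trace])
```

Each such full traversal of the long row and column lines is what charges and discharges their load capacitance, which is why the embodiments below gate the hand-shake instead of the detection.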
In the above operation of the event-based pixel circuit 100 and the arbiter circuit 200, a hand-shake protocol operates in the vertical and horizontal directions as soon as an event is detected, charging and discharging current flows in the long control lines during each event detection, and a large loop current may be induced by the charging and discharging of these signal lines.
The following Embodiment 1 of the present invention provides an implementation of the arbiter circuit for the row and column directions of an asynchronous-type event-based vision sensor. Fig. 4 shows an example of the arbiter circuit 201 according to an embodiment of the present invention. During the periods shown as "Event Trans Enable" in Fig. 5, the row line R1 is connected to the row arbiter 221, and the column lines C1 and C2 are connected to the column arbiter 211. During the periods shown as "Event Trans Disable" in Fig. 5, the column lines C1 and C2 and the row line R1 are pulled down, namely, connected to ground. Charging and discharging currents are suppressed by pulling down the control lines. According to this embodiment of the present invention, the arbiters are disabled by disconnecting them from the vertical and horizontal signal lines running over the columns and the rows, namely, by turning off switches 213, 215, and 223 and turning on switches 212, 214, and 222. Furthermore, when the arbiters are disconnected from the vertical and horizontal signal lines, these lines may be pulled down or up to avoid charge/discharge current induced by event detection.
Event data acquisition during a blanking period of a cycle of a read-out operation for a frame-based image is performed by the following procedures: (1) acquiring event data when it is in a period other than an AD conversion period, (2) obtaining a time stamp for each piece of event data based on the AD conversion period, and (3) disconnecting the peripheral circuit (the arbiter circuit 201 in Fig. 4) from the event-based pixel circuit 100 during the AD conversion period by cutting the current paths of the row and column communication of event data; the event detecting function of each pixel is not disabled. Even while the switches 213, 215, and 223 are turned off, the event-based pixel circuit 100 continues to detect "ON event" or "OFF event" and output "RowReq", and this "RowReq" is detected and arbitrated by the row arbiter 221 when the switches 213, 215, and 223 are turned on.
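For illustration only, procedure (3) can be sketched behaviorally. This is a hypothetical Python model with illustrative class and method names; it only captures that events latched while the peripheral circuit is disconnected are read out in the next enable period:

```python
class EventPixel:
    """Pixel that keeps detecting events while the arbiter is disconnected."""
    def __init__(self):
        self.pending = None          # latched polarity: "ON", "OFF", or None

    def detect(self, polarity):
        if self.pending is None:     # event held until acknowledged and reset
            self.pending = polarity

class GatedArbiter:
    """Peripheral circuit whose row/column connection can be switched off."""
    def __init__(self):
        self.connected = False       # state of the switches to the signal lines

    def read_out(self, pixels):
        if not self.connected:       # "Event Trans Disable": no line activity
            return []
        events = []
        for pos, pixel in pixels.items():
            if pixel.pending is not None:
                events.append((pos, pixel.pending))
                pixel.pending = None # hand-shake completes and resets the pixel
        return events

pixels = {(3, 7): EventPixel()}
arbiter = GatedArbiter()

pixels[(3, 7)].detect("ON")                      # event during the AD conversion period
print(arbiter.read_out(pixels))                  # [] (disconnected: no rush current)
arbiter.connected = True                         # next "Event Trans Enable" period
print(arbiter.read_out(pixels))                  # [((3, 7), 'ON')]
```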
The waveform diagram in Fig. 5 shows an operation according to the embodiment of the present invention. Herein, "Hsync" means a synchronous signal for an ADC operation of a column-parallel ADC (AD conversions for all the columns are performed in parallel) in the frame-based image sensor. Generally, said ADC is a single-slope ADC, and the "RAMP waveform" indicates its reference voltage waveform. During the descending period, a counter measures the analog input level as the time at which the RAMP level crosses the analog input, and the count value at this time becomes the digital output. For Correlated Double Sampling, two RAMP descending slopes are included in one ADC period. The RAMP waveform and the alternating "Event Trans Enable" and "Event Trans Disable" operations are shown in Fig. 5. Only during the period called "Event Trans Enable" are pixels in the event-based vision sensor allowed to communicate with the peripheral circuits for event data output. On the other hand, during the period called "Event Trans Disable", pixels are not allowed to communicate with the peripheral circuits, but are still able to detect an event, and its status is output during the next "Event Trans Enable" period.
Event-based vision data is a data array including at least an event position in the image area and its event time stamp. The time stamp is stepped up based on "Hsync". In Fig. 5, the time stamp is N after the first "Hsync", and N+1 after the second "Hsync". When the time stamp is incremented based on "Hsync", the time stamp error caused by the alternating "Event Trans Enable" and "Event Trans Disable" operations is negligible, because the maximum time stamp error in this operation is one cycle of "Hsync", as can be seen from Fig. 6, in which T. S. means Time Stamp. Fig. 6 shows a read-out operation of an event-based image and a frame-based image. Waveforms of "Hsync" and the ADC are shown in the upper part of Fig. 6. Under the waveform of the ADC, values of the time stamp are shown over time. The value of the time stamp is incremented when "Hsync" becomes low level. The vertical axis "Y" corresponds to the row number. Rectangles denoted by "R.O." (read out) show that the read-out operation of the frame-based image sensor is performed for the corresponding row. Fig. 6 shows the read-out operation performed from the bottom row to the top row; this read-out order is merely an example. The dotted areas show the "Event Trans Disable" period, namely, event transactions for all of the rows are disabled. The dotted areas range from the bottom row to the top row, including the areas behind the rectangles denoted by "R.O.". Slanted line areas (with lines slanting upward to the right) show the "Event Trans Enable" period, namely, event transactions are enabled for any of the rows on which an "ON event" or "OFF event" is detected. The slanted line areas range from the bottom row to the top row, including the areas behind the rectangles denoted by "R.O.". The order of the event transactions is arbitrated by the row arbiter 221. As shown by a two-way arrow at the bottom of Fig. 6, events detected during the period from the end of one "Event Trans Enable" period to the beginning of the next "Event Trans Enable" period are assigned time stamp "N+1".
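For illustration only, the time stamping rule can be sketched as follows. This is a hypothetical Python model; the event tuple layout and the starting value are illustrative assumptions:

```python
class HsyncTimeStamper:
    """Assigns events the time stamp current at their read-out window."""
    def __init__(self, start=0):
        self.ts = start

    def on_hsync(self):
        self.ts += 1                  # time stamp steps up once per "Hsync"

    def stamp(self, events):
        # All events read in one "Event Trans Enable" window share one time
        # stamp, so the worst-case error is one "Hsync" cycle.
        return [(x, y, pol, self.ts) for (x, y, pol) in events]

N = 5                                 # illustrative starting value
stamper = HsyncTimeStamper(start=N - 1)
stamper.on_hsync()                    # after the first "Hsync": time stamp N
print(stamper.stamp([(3, 7, "ON")]))  # [(3, 7, 'ON', 5)]
stamper.on_hsync()                    # after the second "Hsync": time stamp N + 1
print(stamper.stamp([(3, 7, "OFF")])) # [(3, 7, 'OFF', 6)]
```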
For example, it is possible to implement the above-mentioned embodiment with conventional components of event-based vision sensors by modifying the control program and adding a termination function for the horizontal and vertical signal lines that run over the entire rows and columns (e.g., by using a pull-up/down circuit).
According to this embodiment, the event detector 110 is generally reset by the control signal coming from the arbiter circuit 200 after a hand-shake protocol. Thus, this operation allows pixels to continue to detect an event even during the "Event Trans Disable" period. Furthermore, it is possible to read out the event detection result of each pixel in the next "Event Trans Enable" period. According to this operation, it is possible to suppress interference with the operation of the frame-based image sensor caused by rush current induced by asynchronous control signals.
The following describes Embodiment 2 of the present invention. In every "Event Trans Enable" period, the event detection statuses of all the pixels are checked, namely, whether or not an event has been detected is checked for every pixel. In this operation, in the same way as in Embodiment 1 above, each pixel can detect an event by itself even in the "Event Trans Disable" period, and this event data is output during the next "Event Trans Enable" period.
Fig. 7 shows examples of an event-based pixel circuit 101 and a scanner circuit 202. It is assumed that the event-based pixel circuit 101 in Fig. 7 is located at the i-th column and the j-th row in the event-based vision sensor. The scanner circuit 202 (a peripheral circuit) includes a column arbiter/scanner 230 and a row scanner 240. Column lines C1, C2, and C3 are connected to the column arbiter/scanner 230, and lines C1 and C2 are also connected to ground via capacitors. Row line R2 is connected to the row scanner 240. Compared with the circuit shown in Fig. 3, the row line R1 and the transistors 121 and 122 are not provided.
The event-based pixel circuit 101 includes an event detector 110, transistors 123 and 124, AND gates 131 and 132, and a delay 140. The row scanner 240 outputs "RowSel" sequentially, for example, from the bottom row to the top row. If the event detector 110 outputs "ON event" and the row scanner 240 outputs "RowSel", the output of the AND gate 131 becomes High, the transistor 123 is turned on, and the voltage of the column line C1 of the i-th column becomes Low. In this way, the scanner circuit 202 detects "ON event" at the i-th column and the j-th row. Thereafter, the column arbiter/scanner 230 outputs "ColAck" on line C3, "RowSel" and "ColAck" are input to the AND gate 133, the output of the AND gate 133 becomes High, and the delay 140 receives the High level signal from the AND gate 133 and outputs it, after a delay, to the "RESET" terminal of the event detector 110. The event detector 110 resets "ON event", namely, the voltage of the "ON" terminal of the event detector 110 becomes Low. The row scanner 240 is kept on during the "Event Trans Disable" period.
In the above operation of the event-based pixel circuit 101 and the scanner circuit 202, a hand-shake protocol operates in the vertical and horizontal directions as soon as an event is detected, charging and discharging current flows in the long control lines during each event detection, and a large loop current may be induced by the charging and discharging of these signal lines.
The following Embodiment 2 of the present invention provides an implementation of the scanner circuit for the row and column directions of an asynchronous-type event-based vision sensor. Fig. 8 shows an example of the scanner circuit 203 according to an embodiment of the present invention. During the periods shown as "Event Trans Disable" in Fig. 5, the row scanner 241 is turned off, and the column lines C1 and C2 are terminated: the pull-up resistors of the column lines C1 and C2 automatically pull them up.
Fig. 9 shows a read-out operation of an event-based image and a frame-based image. The vertical axis "Y" corresponds to the row number. Rectangles denoted by "R.O." (read out) show that the read-out operation of the frame-based image sensor is performed for the corresponding row. Fig. 9 shows the read-out operation performed from the bottom row to the top row; this read-out order is merely an example. The dotted areas show the "Event Trans Disable" period, namely, event transactions for all of the rows are disabled. The dotted areas range from the bottom row to the top row, including the areas behind the rectangles denoted by "R.O.". Slanted line areas (with lines slanting upward to the right) show the "Event Trans Enable" period, namely, event transactions for all the rows are performed from the bottom row to the top row, as shown by the tilt of the slanted line areas. The slanted line areas range from the bottom row to the top row, including the areas behind the rectangles denoted by "R.O.". As shown by a two-way arrow at the bottom of Fig. 9, events detected after the top row is selected in one "Event Trans Enable" period and before the top row is selected in the next "Event Trans Enable" period are assigned time stamp "N+1".
As shown in Fig. 9, the event detection status of each pixel is scanned during the "Event Trans Enable" period. Generally, the status of a pixel can be represented with 1 or 2 bits; thus, it is possible to scan all of the rows within a short period of time.
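For illustration only, since each pixel status fits in 1 or 2 bits, a scanned row can be packed into a short word. This is a hypothetical Python sketch; the 2-bit encoding (0 = no event, 1 = ON, 2 = OFF) is an illustrative assumption:

```python
def pack_row(statuses):
    """Pack per-pixel statuses (0 = none, 1 = ON, 2 = OFF) into 2 bits each."""
    word = 0
    for i, s in enumerate(statuses):
        word |= (s & 0b11) << (2 * i)
    return word

def unpack_row(word, n_pixels):
    """Recover the per-pixel statuses from a packed row word."""
    return [(word >> (2 * i)) & 0b11 for i in range(n_pixels)]

row = [1, 0, 2, 0]                   # ON at column 0, OFF at column 2
packed = pack_row(row)
print(packed)                        # 33
print(unpack_row(packed, len(row)))  # [1, 0, 2, 0]
```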
According to this embodiment, turning off the row scanner suppresses the generation of a large loop current, making it possible to run asynchronous operations with suppressed interference, even in noise-sensitive periods.
In the embodiments of the present invention, event data is acquired in the H-blank (the blanking time of the ADC periods) of the frame-based read-out for n lines, making it possible to reduce crosstalk noise in the ADC results of frame-based images.
The prior art proposed a hybrid pixel structure for acquiring a frame-based image and an event-based image, and also provided a crosstalk mitigating technique based on ADC operation periodicity. The embodiments of the present invention, on the other hand, provide techniques that expand this noise mitigating approach to a system involving an asynchronous operation, such as an event-based vision sensor. The beneficial effects of the embodiments of the present invention can be summarized as follows. The technical problem of crosstalk noise between synchronous and asynchronous systems (e.g., a stacked hybrid pixel for a frame-based image and event-based vision) is solved. The embodiments of the present invention provide alternating operations for event detection communication and A/D conversion of frame-based images. The technical effect of the embodiments of the present invention is that event detection can run continuously without introducing crosstalk noise into the frame-based image.
Disclosed above are merely exemplary embodiments of the present invention, and are certainly not intended to limit the scope of protection of the present invention. A person of ordinary skill in the art will understand that all or some of the processes that implement the foregoing embodiments and equivalent modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (12)

  1. A driving method of an event-based vision sensor for use with a circuit for a synchronous operation, comprising:
    detecting an event as a light illuminance change during the entire synchronous operation, except for a period during which an event-based pixel circuit detects said event and gets reset;
    reading said event information during a part of the synchronous operation when reading out event data; and
    suspending communication to read said event information between the event-based pixel circuit and a peripheral circuit of the event-based vision sensor during the other part of the synchronous operation.
  2. The method according to claim 1, wherein the communication between the event-based pixel circuit and the peripheral circuit is suspended by disconnecting row and column lines from the peripheral circuit.
  3. The method according to claim 1, wherein the communication between the event-based pixel circuit and the peripheral circuit is suspended by turning off the peripheral circuit.
  4. The method according to any one of claims 1 to 3, further comprising assigning a time stamp to the event data based on a clock for driving the synchronous operation.
  5. The method according to claim 4, wherein the time stamp is incremented based on a synchronous signal for an Analog to Digital Converter (ADC) operation of a frame-based image sensor.
  6. The method according to any one of claims 1 to 5, wherein Correlated Double Sampling is performed during the synchronous operation, and the part of the synchronous operation begins at the end of a second descending slope and ends at the beginning of a first descending slope.
  7. An event-based vision sensor for use with a circuit for a synchronous operation, comprising:
    an event-based pixel circuit configured to detect an event as a light illuminance change during the entire synchronous operation, except for a period during which the event-based pixel circuit detects said event and gets reset; and
    a peripheral circuit configured to read said event information during a part of the synchronous operation when reading out event data, and suspend communication with the event-based pixel circuit to read said event information during the other part of the synchronous operation.
  8. The event-based vision sensor according to claim 7, wherein the communication with the event-based pixel circuit is suspended by disconnecting row and column lines from the peripheral  circuit.
  9. The event-based vision sensor according to claim 7, wherein the communication with the event-based pixel circuit is suspended by turning off the peripheral circuit.
  10. The event-based vision sensor according to any one of claims 7 to 9, wherein the peripheral circuit is further configured to assign a time stamp to the event data based on a clock for driving the synchronous operation.
  11. The event-based vision sensor according to claim 10, wherein the time stamp is incremented based on a synchronous signal for an Analog to Digital Converter (ADC) operation of a frame-based image sensor.
  12. The event-based vision sensor according to any one of claims 7 to 11, wherein Correlated Double Sampling is performed during the synchronous operation, and the part of the synchronous operation begins at the end of a second descending slope and ends at the beginning of a first descending slope.
PCT/CN2021/118673 2021-09-16 2021-09-16 Method and event-based vision sensor for acquiring event-based image during blanking time of frame-based image read-out WO2023039784A1 (en)



