CN211702211U - Image sensor, biometric detection system, and electronic device - Google Patents


Info

Publication number
CN211702211U
Authority
CN
China
Prior art keywords
exposure
row
pixel
mos transistor
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020315246.9U
Other languages
Chinese (zh)
Inventor
区国雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd filed Critical Shenzhen Fushi Technology Co Ltd
Priority to CN202020315246.9U
Application granted
Publication of CN211702211U

Abstract

The utility model discloses an image sensor including a pixel array. The pixel array includes a plurality of pixel circuits arranged in a plurality of rows and a plurality of columns, and is divided into a plurality of exposure regions, each exposure region including one or more rows of pixel circuits. Some or all of the plurality of exposure regions have different exposure times. The utility model also discloses a biometric detection system and an electronic device.

Description

Image sensor, biometric detection system, and electronic device
Technical Field
The present invention relates to the field of optoelectronics, and more particularly to an optical image sensor and a biometric detection system.
Background
Fingerprint detection and identification are widely used for identity authentication and have become an almost indispensable function of electronic devices such as mobile phones and tablet computers. Common fingerprint detection methods include capacitive, optical, and ultrasonic fingerprint detection, of which optical fingerprint detection is currently the mainstream approach. There is strong market demand for under-display optical fingerprint detection, in which the fingerprint detection system is placed below the display screen of the electronic device so that fingerprint detection is provided while a high screen-to-body ratio is maintained. Optical fingerprint imaging requires a light source that emits a detection beam toward a finger and a light-sensing element that senses the detection beam returning from the finger. Because different areas of the photosensitive surface of the light-sensing element are at different distances from the light source, or receive light over different optical paths, the light intensity varies across the photosensitive surface when the detection beam is sensed. As a result, a fingerprint optical image generated in the usual sensing mode may suffer from severely uneven brightness, dark regions that are too dark, bright regions that are overexposed, and an incomplete fingerprint image.
SUMMARY OF THE UTILITY MODEL
In view of this, the present utility model provides an image sensor, a biometric detection system, and an electronic device with an improved biometric detection effect.
One aspect of the present application provides an image sensor including a pixel array. The pixel array includes a plurality of pixel circuits arranged in a plurality of rows and a plurality of columns; the pixel array is divided into a plurality of exposure regions, each exposure region including one or more rows of pixel circuits, and some or all of the plurality of exposure regions have different exposure times.
In some embodiments, a light source is disposed on one side of the image sensor, and the exposure time of the exposure region closer to the light source is shorter than the exposure time of the exposure region farther from the light source.
In some embodiments, the light source is configured to emit a light beam, the light beam can return from an external object after reaching the external object, and the image sensor receives the light beam returned by the external object and converts the light beam into an electrical signal when the exposure area is exposed, so as to obtain the biometric information of the external object.
In some embodiments, the light source is disposed outside of a first row of pixel circuits and/or a last row of pixel circuits of the pixel array.
In some embodiments, the pixel array includes M rows of pixel circuits, the M rows of pixel circuits are divided into K exposure regions, 1 < K ≦ M, M > 1, M, K being positive integers.
In some embodiments, for the same exposure area, the exposure time length of each row of pixel circuits in the exposure area is the same, or the exposure time length of a part of rows of pixel circuits in the exposure area is different from the exposure time length of the rest of rows.
In some embodiments, the exposure end time points of the pixel circuits in each row of the pixel array are different, the exposure start time points of the pixel circuits in different rows of the same exposure area are different, and the exposure start time points of the pixel circuits in two rows divided in different exposure areas are the same or different.
In some embodiments, the image sensor further comprises a row control circuit and a readout circuit, the row control circuit and the readout circuit are connected with the pixel circuit, the row control circuit is used for controlling the exposure time of the pixel circuit, and the readout circuit is used for reading the electric signal of the pixel circuit.
In some embodiments, the pixel circuit includes a photodiode, a first MOS transistor, a second MOS transistor, a third MOS transistor, and a fourth MOS transistor. A control terminal of the first MOS transistor is connected to the row control circuit to receive the control signal output by the row control circuit, a first terminal of the first MOS transistor is connected to the cathode of the photodiode, and a second terminal of the first MOS transistor is connected to a first terminal of the second MOS transistor. A control terminal of the second MOS transistor is connected to the row control circuit to receive the reset control signal output by the row control circuit, and a second terminal of the second MOS transistor is connected to the power supply voltage. A control terminal of the third MOS transistor is connected to the first terminal of the second MOS transistor so as to follow the change of the signal at that terminal and output it from a first terminal of the third MOS transistor, and a second terminal of the third MOS transistor is connected to the power supply voltage. A second terminal of the fourth MOS transistor is connected to the first terminal of the third MOS transistor, a control terminal of the fourth MOS transistor is connected to the row control circuit to receive the row strobe signal output by the row control circuit, and a first terminal of the fourth MOS transistor serves as the output terminal of the pixel circuit and is connected to the readout circuit.
In some embodiments, the pixel circuits of the pixel array are in a rolling exposure mode or a global exposure mode.
One aspect of the present application provides a biometric detection system, including a light source and an image sensor, the image sensor being the above-mentioned image sensor, the light source being configured to emit a detection light beam to an external object, and the image sensor being configured to receive the detection light beam returned by the external object and being configured to generate a corresponding biometric image.
In certain embodiments, the biometric detection system is used for fingerprint detection.
An aspect of the application provides an electronic device comprising an image sensor as described above, or comprising a biometric detection system as described above.
In some embodiments, the electronic device further comprises a display screen, the biometric detection system is positioned below all or part of the display screen, and the biometric detection system is capable of transmitting a detection beam to an external object and receiving a detection beam returned by the external object through all or part of the display screen.
The beneficial effect of the present application is that the pixel array of the image sensor has different exposure regions which, depending on their distance from the light source, can have different exposure times, the exposure time of an exposure region closer to the light source being shorter than that of an exposure region farther from the light source. An exposure region may include a plurality of rows of pixel circuits or only a single row of pixel circuits. By exposing different exposure regions for different lengths of time, a biometric image with uniform brightness can be obtained. The image sensor, biometric detection system, and electronic device of the present application can therefore obtain a biometric image with uniform brightness and a large effective imaging range, achieving a good biometric detection effect.
Drawings
FIG. 1 is a block schematic diagram of one embodiment of an image sensor of the present application;
FIG. 2 is a schematic diagram of a pixel circuit of FIG. 1;
FIG. 3 is a timing diagram of driving signals of the pixel circuit of FIG. 2;
FIG. 4 is a schematic view of a rolling exposure;
FIG. 5 is a schematic view of one embodiment of a biometric detection system of the present application;
FIG. 6 is a graph illustrating the relationship between the optical power at a light-sensing point of the pixel array of FIG. 5 and the distance of that point from the first edge;
FIG. 7 is a timing diagram of some of the operating signals for part of the exposure regions in one embodiment of the pixel array of FIG. 5;
FIGS. 8-9 are schematic diagrams of exposure regions and exposure times for one embodiment of the pixel array of FIG. 5.
Detailed Description
In the detailed description of the embodiments of the present invention, it will be understood that when a substrate, a sheet, a layer, or a pattern is referred to as being "on" or "under" another substrate, another sheet, another layer, or another pattern, it can be "directly" or "indirectly" on the other substrate, the other sheet, the other layer, or the other pattern, or one or more intervening layers may also be present. The thickness and size of each layer in the drawings of the specification may be exaggerated, omitted, or schematically represented for clarity. Further, the sizes of the elements in the drawings do not completely reflect actual sizes.
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without any inventive step, are within the scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Further, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject technology can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the focus of the application.
A Complementary Metal Oxide Semiconductor (CMOS) image sensor is a semiconductor device that converts an optical signal into an electrical signal. With the continuous improvement of CMOS image sensor (CIS) manufacturing processes, CMOS image sensors have gained advantages over CCD image sensors in terms of power consumption, integration, random addressing, and the like, and have become the mainstream devices in the field of solid-state image sensors.
In general, a CMOS image sensor includes a pixel array, a readout circuit coupled to the pixel array, and a digital processing circuit. The pixel array is composed of a plurality of pixel circuits distributed in an array. The pixel circuits in the same column output onto the same column signal line and are coupled through it to the readout circuit. The pixel array converts the received optical signals into analog electrical signals, and the readout circuit converts the analog electrical signals into digital electrical signals and outputs them to the digital processing circuit for subsequent processing.
The exposure modes of the pixel array of a CMOS image sensor are mainly classified into global exposure and rolling exposure. In global exposure, all pixels of the CMOS image sensor start and end exposure at the same time, after which the readout operation is performed. In rolling exposure, each column of pixels shares one readout circuit, and the rows are exposed one by one and read out one by one: the first row is read out, then the second row, and so on. For a rolling-exposure CMOS image sensor, the exposure time of each row is generally equal, and the readout time of each row is generally equal. It will be appreciated that the exposure end times of two adjacent rows differ by a time interval that can be considered greater than or equal to the readout time of the preceding row. The following embodiments of the present application are described on the basis of rolling exposure, but it should be understood that the present application is not limited to rolling exposure or global exposure.
Fig. 1 is a block diagram of an image sensor 10 according to the present application. The image sensor 10 includes a pixel array 11, a row control circuit 12, and a readout circuit 13. The pixel array 11 includes a plurality of pixel circuits 111 arranged in an array, the plurality of pixel circuits 111 being arranged in a plurality of rows and a plurality of columns. For example, but not limited to, the pixel circuits 111 are arranged along the row direction in fig. 1 to form the rows and along the column direction in fig. 1 to form the columns, the row and column directions being substantially perpendicular. The pixel array 11 collects light beams carrying biometric information of an external object and converts them into electrical signals. The row control circuit 12 is configured to control the pixel circuits 111 to enter the different stages of reset, exposure, and readout, and the readout circuit 13 may be configured to read out the electrical signal generated by a pixel circuit 111 through light sensing when that pixel circuit 111 is in the readout stage. The row control circuit 12 and the readout circuit 13 may comprise one or more integrated circuits (e.g., image processing circuitry, microprocessors, random access memory, etc.), and may be components separate from the image sensor 10 and/or form part of the image sensor 10. The image sensor 10 may be a front-side illuminated (FSI) image sensor or a back-side illuminated (BSI) image sensor. The pixel circuit 111 may include one or more photoelectric conversion elements, such as, but not limited to, a photodiode or any other suitable photosensitive element capable of generating electric charge in response to light.
Fig. 2 is a schematic circuit diagram of the pixel circuit 111. Fig. 2 shows a pixel circuit 111 with a 4T structure composed of four MOS transistors. The pixel circuit 111 may include a photodiode PD, a first MOS transistor N1, a second MOS transistor N2, a third MOS transistor N3, and a fourth MOS transistor N4. A control terminal (not numbered) of the first MOS transistor N1 is connected to the row control circuit 12 to receive the conversion control signal TX output by the row control circuit 12. A first terminal (not numbered) of the first MOS transistor N1 is connected to the cathode of the photodiode PD, and a second terminal (not numbered) of the first MOS transistor N1 is connected to a first terminal (not numbered) of the second MOS transistor N2. A control terminal (not numbered) of the second MOS transistor N2 is connected to the row control circuit 12 to receive the reset control signal RX output by the row control circuit 12, and a second terminal (not numbered) of the second MOS transistor N2 is connected to the operating voltage VDD. A control terminal (not numbered) of the third MOS transistor N3 is connected to the first terminal of the second MOS transistor N2 so as to follow the signal at that terminal and output it from a first terminal (not numbered) of the third MOS transistor N3; a second terminal (not numbered) of the third MOS transistor N3 is connected to the operating voltage VDD. A second terminal (not numbered) of the fourth MOS transistor N4 is connected to the first terminal of the third MOS transistor N3, a control terminal (not numbered) of the fourth MOS transistor N4 is connected to the row control circuit 12 to receive the row strobe signal SEL output by the row control circuit 12, and a first terminal (not numbered) of the fourth MOS transistor N4 serves as the output terminal of the pixel circuit 111 and is connected to the readout circuit 13. When the row strobe signal SEL is at a high level, all the pixel circuits of that row are connected to the corresponding readout circuits 13, and the voltage data of that row are read out in parallel by the readout circuits 13. Optionally, the first MOS transistor N1, the second MOS transistor N2, the third MOS transistor N3, and the fourth MOS transistor N4 are NMOS transistors, in which case the first terminal may be the source, the second terminal the drain, and the control terminal the gate of the NMOS transistor. Optionally, in some embodiments, the first MOS transistor N1, the second MOS transistor N2, the third MOS transistor N3, and the fourth MOS transistor N4 may be PMOS transistors. Alternatively, in some embodiments, the pixel circuit 111 may be a three-transistor active pixel (3T-APS), a four-transistor active pixel (4T-APS), a five-transistor active pixel (5T-APS), or the like, which is not limited in this application.
In the present embodiment, for convenience of description, the node at which the source of the second MOS transistor N2 is connected to the drain of the first MOS transistor N1 is referred to as node FD (the node FD is also called the floating diffusion region). According to the photoelectric effect, the intensity of the current signal generated by the photodiode PD is proportional to the intensity of the incident light. When the first MOS transistor N1 is turned on, the current signal flowing into the node FD is converted into a voltage signal, which is output to the column signal line 15 via the third MOS transistor N3 and the fourth MOS transistor N4 and quantized by the readout circuit 13. The pixel array 11 further includes a plurality of row signal lines 14 and column signal lines 15, each row signal line 14 being connected to a row of pixel circuits 111 and each column signal line 15 being connected to a column of pixel circuits 111. The row control circuit 12 supplies the row strobe signal SEL to the pixel circuits 111 of one row via a row signal line 14. When the row strobe signal SEL is at a high level, all the pixel circuits 111 in that row are connected to the readout circuit 13, and the photo-generated electrical signals of the pixel circuits 111 in that row are read out in parallel by the readout circuit 13. Specifically, the row control circuit 12 first selects the pixels of the first row; after the electrical signals generated by all the pixel circuits 111 of the first row sensing the light beam have been read out by the readout circuit 13, the row control circuit 12 selects the pixels of the second row; after the second row has been read out, the third row is selected, and so on, thereby implementing a rolling exposure mode of row-by-row exposure and row-by-row readout. Of course, in other embodiments, the readout circuit 13 may have a different configuration and read out the photo-generated electrical signals in a different way, which is not limited in this application.
Please refer to fig. 3, which is a timing diagram of some of the signals supplied to the pixel circuit 111 shown in fig. 2. Before the exposure starts, the reset control signal RX and the conversion control signal TX are asserted (go high), the first MOS transistor and the second MOS transistor are turned on, and the operating voltage VDD is supplied to the photodiode PD via the first and second MOS transistors. The reset control signal RX and the conversion control signal TX are then deasserted (go low), the first and second MOS transistors are turned off, and the photodiode PD starts sensing the light beam and accumulating charge. When the row strobe signal SEL is asserted (goes high), the fourth MOS transistor is turned on; the third MOS transistor acts as a source follower, and the voltage on the node FD is output as VOUT. The reset control signal RX is asserted to reset the voltage at node FD to the operating voltage VDD. Then the conversion control signal TX is asserted, the first MOS transistor is turned on, the photodiode PD discharges via the first MOS transistor, and the voltage signal VOUT is output from the fourth MOS transistor. The voltage signal VOUT can be regarded as the electrical signal generated by the pixel circuit 111 sensing the light beam during exposure. The voltage signal VOUT is supplied to the readout circuit 13 via the column signal line 15.
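For readers who find it easier to follow the drive sequence as a list of events, the following Python sketch models the RX/TX/SEL sequence described above for one pixel row. It is only an illustration: the signal names follow Fig. 3, but the timing constants are hypothetical placeholders rather than values from the patent.

```python
# Minimal sketch of the 4T pixel drive sequence described for Fig. 3.
# Timing constants (microseconds) are illustrative placeholders only.
T_RESET = 2.0      # width of the pre-exposure reset pulse (RX & TX high)
T_EXPOSE = 100.0   # charge integration time of the photodiode
T_PULSE = 1.0      # width of the RX / TX pulses during readout

def pixel_drive_events(t0=0.0):
    """Return (time, signal, level) events for one exposure/readout cycle."""
    events = []
    t = t0
    # 1) Reset the photodiode: RX and TX high, PD tied to VDD through N1 and N2.
    events += [(t, "RX", 1), (t, "TX", 1)]
    t += T_RESET
    events += [(t, "RX", 0), (t, "TX", 0)]   # PD starts integrating charge
    t += T_EXPOSE
    # 2) Select the row: N4 conducts, N3 source-follows the FD node onto VOUT.
    events += [(t, "SEL", 1)]
    # 3) Reset FD to VDD (reset level appears on VOUT).
    events += [(t, "RX", 1), (t + T_PULSE, "RX", 0)]
    t += 2 * T_PULSE
    # 4) Transfer the photo-generated charge to FD (signal level appears on VOUT).
    events += [(t, "TX", 1), (t + T_PULSE, "TX", 0)]
    t += 2 * T_PULSE
    events += [(t, "SEL", 0)]
    return events

if __name__ == "__main__":
    for time, sig, level in pixel_drive_events():
        print(f"t={time:7.1f} us  {sig:<3} -> {level}")
```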
Fig. 4 is a timing diagram illustrating a conventional rolling exposure operation performed by the pixel array 11 shown in fig. 1, in which the exposure time Te and the readout time Tr of the pixel circuits 111 of every row are kept uniform. Tf is the time taken by the image sensor 10 to output one frame, Te is the exposure time of one row of pixel circuits 111, and Tr is the time taken to read out the electrical signal data of one row of pixel circuits 111. In general, the readout time Tr must reach a certain length for the readout circuit 13 to read the electrical signals of the pixel circuits 111; in this scheme the readout time Tr of every row of pixel circuits 111 is the same.
In fig. 4, i is a positive integer. The pixel circuits 111 of the i-th row enter the readout stage after their exposure is completed, while the pixel circuits 111 of the (i+1)-th row are still in their exposure period. When the readout of the pixel circuits 111 of the i-th row is completed, the pixel circuits 111 of the (i+1)-th row enter the readout stage, and the pixel circuits 111 of the (i+2)-th row are in their exposure period. Fig. 4 shows the timing of three successive rows of pixels operating in this manner. However, when the image sensor 10 is used to generate a biometric image, this rolling exposure scheme may result in a poor image.
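As a quick illustration of this conventional timing (not part of the patent itself), the following Python sketch computes per-row exposure and readout windows under the stated assumptions of a uniform exposure time Te and readout time Tr; the numeric values are placeholders.

```python
# Sketch of the conventional rolling-exposure timing of Fig. 4, in which every
# row shares the same exposure time Te and the same readout time Tr.
# Te and Tr below are illustrative placeholders, not values from the patent.
def conventional_rolling_schedule(num_rows, Te, Tr, t0=0.0):
    """Return per-row (exposure_start, exposure_end, readout_end) times."""
    schedule = []
    for i in range(num_rows):
        start = t0 + i * Tr          # each row starts one readout time later
        end = start + Te             # uniform exposure length
        schedule.append((start, end, end + Tr))
    return schedule

if __name__ == "__main__":
    for row, (s, e, r) in enumerate(conventional_rolling_schedule(4, Te=100.0, Tr=10.0), 1):
        print(f"row {row}: expose {s:6.1f}-{e:6.1f}, read out until {r:6.1f}")
```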
Fig. 5 is a schematic diagram of the image sensor 10 applied to a biometric detection system 1. The biometric detection system 1 comprises the image sensor 10 and a light source 20. The biometric detection system 1 may also typically include a lens (not shown) positioned over the image sensor 10. During exposure, a light beam can be focused through the lens onto the image sensor 10. The light source 20 is used to emit a detection light beam 201 to an external object 1000 (e.g., a finger). The detection beam 201 can be transmitted after entering the external object 1000, or the detection beam 201 can be reflected by the external object 1000. For convenience of description, the detection beam 201 reflected by the external object 1000 and transmitted from the inside of the external object 1000 is collectively referred to as a detection beam returned by the external object 1000. As an example, fig. 5 shows a case where the detection beam 201 is transmitted after entering the external object 1000. The image sensor 10 can sense the detection light beam 201 returned by the external object 1000 and convert it into a corresponding electrical signal. The specific process of sensing the detection light beam 201 and converting the detection light beam into an electrical signal by the pixel circuit 111 has been described above, and the embodiments of the present application are not described in detail again.
The light source 20 is disposed at one side of the image sensor 10. Specifically, the pixel array 11 includes a first edge 101, a second edge 102, a third edge 103, and a fourth edge 104 connected end to end in sequence. The light source 20 is closer to the first edge 101 and further from the second edge 102, the third edge 103 and the fourth edge 104. In fig. 5, the first edge 101 and the third edge 103 are parallel to the row direction, and the second edge 102 and the fourth edge 104 are parallel to the column direction. The pixel circuits 111 in the first row of the pixel array 11 are farther from the light source 20, and the pixel circuits 111 in the last row of the pixel array 11 are closer to the light source 20. The light source 20 may be regarded as being disposed outside the pixel circuits 111 of the last row. Alternatively, the light source 20 may be disposed outside the first row of pixel circuits 111 or outside the last row of pixel circuits 111.
It should be noted that the descriptions of the "first edge 101, the second edge 102, the third edge 103, the fourth edge 104", "the first row", "the last row", "the row", and "the column" are only for convenience of understanding, and are not intended to limit the structure or the function. It will be appreciated that the first row may also be the last row in some embodiments, and the last row may also be the first row in some embodiments. A "row" may be a "column" in some embodiments, and a "column" may be a "row" in some embodiments.
A light-sensing point S is arbitrarily selected on the pixel array 11; the optical power of the detection beam at the light-sensing point S is denoted by P, and the perpendicular distance between the light-sensing point S and the first edge 101 is denoted by d. It will be appreciated that the larger the distance d, the smaller the optical power P at the light-sensing point S. Fig. 6 is a schematic diagram showing the relationship between the perpendicular distance d of the light-sensing point S from the first edge 101 and the magnitude of the optical power P at the light-sensing point S.
The first edge 101 is the side of the pixel array 11 close to the light source 20, and the light source 20 is disposed outside the first edge 101. The closer a pixel circuit 111 on the pixel array 11 is to the first edge 101, the closer it is to the light source 20 and, understandably, the higher the optical power of the detection beam 201 it can receive. The pixel array 11 senses the detection beam 201 returned from the external object 1000 in a rolling exposure manner. If the exposure time of every row of pixel circuits 111 were the same, pixel circuits 111 at different distances from the first edge 101 would obtain electrical signals of different magnitudes through photoelectric conversion. Relatively speaking, the electrical signals generated after exposure by the pixel circuits 111 close to the light source 20 correspond to high image brightness, and those generated by the pixel circuits 111 far from the light source 20 correspond to low image brightness. When the electrical signals generated by pixel circuits 111 at different distances from the light source 20 are used to generate a corresponding optical image, uneven brightness, or even partial overexposure or underexposure, may occur. An overexposed or underexposed portion generally cannot be used for normal biometric detection and identification, so the effective imaging area is reduced and a complete, clear biometric image (e.g., a fingerprint image) cannot be acquired; as a result, less biometric information of the external object 1000 is obtained and the biometric detection effect is poor.
In the image sensor 10 of the present application, the pixel array 11 may be divided into a plurality of exposure regions, each of which includes one or more rows of pixel circuits 111, and some or all of the plurality of exposure regions have different exposure times. The exposure time of an exposure region closer to the light source 20 is not longer than the exposure time of an exposure region farther from the light source 20. For a single pixel circuit 111, it can be considered that the product of the exposure time and the optical power received per unit time determines the brightness of the image corresponding to that pixel circuit 111. It should be noted that, for convenience of description, the exposure time of an exposure region in this application means the exposure time of the pixel circuits 111 in that exposure region.
By controlling different exposure regions to have different exposure times, in particular so that the exposure time of an exposure region farther from the light source 20 is not less than that of an exposure region closer to the light source 20, the image sensor 10 of the embodiments of the application can give the pixel circuits 111 in different exposure regions of the pixel array 11 similar or substantially the same image brightness, in contrast to the conventional case in which the pixel circuits 111 of every row have the same exposure time.
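To illustrate this brightness-balancing idea (brightness roughly equals exposure time multiplied by received optical power), the following Python sketch scales each region's exposure time inversely with an assumed optical-power profile so that every region receives approximately the same optical energy. The power model and all numbers are hypothetical assumptions for illustration only; the patent only states that the optical power decreases with distance from the first edge.

```python
# Sketch: choose per-region exposure times so that Te * P(d) is roughly equal
# for every exposure region. The power fall-off model below is an assumption
# used only for illustration; the patent only states that P decreases with d.
def relative_power(d, d0=5.0):
    """Assumed monotonically decreasing optical power vs. distance d from the first edge."""
    return 1.0 / (1.0 + d / d0)

def balanced_exposure_times(region_distances, te_min=50.0):
    """Scale exposure times so the region nearest the light source gets te_min
    and every region receives approximately the same optical energy (Te * P)."""
    powers = [relative_power(d) for d in region_distances]
    p_max = max(powers)                      # region closest to the light source
    return [te_min * p_max / p for p in powers]

if __name__ == "__main__":
    # Hypothetical region centre distances, ordered near -> far from the light source.
    distances = [1.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 15.0]
    for k, te in enumerate(balanced_exposure_times(distances), 1):
        print(f"region {k}: Te = {te:6.1f} (arbitrary units)")
```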
Referring to fig. 7, a timing diagram of some of the operating signals for part of the exposure regions of the pixel array 11 is shown. The timing diagram in fig. 7 takes three consecutive exposure regions as an example, each exposure region including three rows of pixel circuits 111. The rows of pixel circuits 111 belong respectively to the first exposure region EA1, the second exposure region EA2, and the third exposure region EA3, where EA1, EA2, and EA3 denote the three consecutive exposure regions. Different exposure regions may have different exposure time lengths. The exposure time lengths of the rows of pixel circuits 111 within the same exposure region may be equal, but the exposure start time points and exposure end time points of the rows within the same exposure region follow one another in sequence. The exposure time length of the pixel circuits 111 of a given row is determined by the interval between two pulses of the conversion control signal TX applied to that row; therefore, by controlling the pulse interval of the conversion control signal TX, the exposure time length of the pixel circuits 111 of each row can be controlled row by row.
The first exposure region EA1 may be a region of the pixel array 11 that is closer to the light source 20. The second exposure region EA2 is a region farther from the light source 20 than the first exposure region EA1, and the third exposure region EA3 is a region farther from the light source 20 than the second exposure region EA2.
The first exposure region EA1 includes the pixel circuits 111 of rows 1, 2, and 3; the corresponding row strobe signals are SEL1, SEL2, and SEL3, the reset control signals are RX1, RX2, and RX3, and the conversion control signals are TX1, TX2, and TX3. The pixel circuits 111 of the 1st, 2nd, and 3rd rows have an exposure time length T1.
The second exposure region EA2 includes the pixel circuits 111 of rows 4, 5, and 6; the corresponding row strobe signals are SEL4, SEL5, and SEL6, the reset control signals are RX4, RX5, and RX6, and the conversion control signals are TX4, TX5, and TX6. The pixel circuits 111 of the 4th, 5th, and 6th rows have an exposure time length T2.
The third exposure region EA3 includes the pixel circuits 111 of rows 7, 8, and 9; the corresponding row strobe signals are SEL7, SEL8, and SEL9, the reset control signals are RX7, RX8, and RX9, and the conversion control signals are TX7, TX8, and TX9. The pixel circuits 111 of the 7th, 8th, and 9th rows have an exposure time length T3, where 0 < T1 ≤ T2 ≤ T3. In addition, it can be seen that the 4th row and the 1st row have the same exposure start time point but different exposure end time points.
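To make the TX pulse-interval mechanism concrete, here is a minimal Python sketch for the nine-row, three-region example above. It is only an illustration: the values of T1, T2, T3, and Tr are assumptions chosen to satisfy 0 < T1 ≤ T2 ≤ T3, and the start times follow one simple convention (exposure ends staggered by one readout time), whereas the patent permits other placements, as discussed next.

```python
# Sketch: deriving the two TX pulse times per row for the 3-region, 9-row
# example of Fig. 7. A row's exposure length equals the interval between its
# two TX pulses. T1, T2, T3 and Tr below are placeholder values only.
T1, T2, T3, Tr = 30.0, 60.0, 90.0, 5.0
REGION_EXPOSURE = {1: T1, 2: T2, 3: T3}          # exposure region -> exposure time

def tx_pulses_for_row(row):
    """Return (first TX pulse, second TX pulse) times for a 1-based row index."""
    region = (row - 1) // 3 + 1                  # rows 1-3 -> EA1, 4-6 -> EA2, 7-9 -> EA3
    te = REGION_EXPOSURE[region]
    # Stagger the exposure end (second TX pulse) of successive rows by Tr so
    # that the readout circuit only ever services one row at a time.
    t_end = T3 + row * Tr
    return (t_end - te, t_end)

if __name__ == "__main__":
    for row in range(1, 10):
        t_first, t_second = tx_pulses_for_row(row)
        print(f"row {row}: TX pulses at {t_first:6.1f} and {t_second:6.1f} "
              f"(exposure {t_second - t_first:4.1f})")
```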
Alternatively, in some embodiments, the exposure start time points of the two rows of pixel circuits divided in different exposure areas may be the same or different. The exposure end time points of the pixel circuits of each row of the pixel array may be different, and the exposure start time points of the pixel circuits of different rows of the same exposure region may be different.
In the example shown in fig. 7 in particular, the exposure start times of the 1st, 4th, and 7th rows may be at any suitable point in time within one frame time. For the first exposure region EA1, the exposure start time point of the 2nd row is delayed by about one readout time from that of the 1st row, and the exposure start time point of the 3rd row is delayed by about one readout time from that of the 2nd row. For the second exposure region EA2, the exposure start time point of the 5th row is delayed by about one readout time from that of the 4th row, and the exposure start time point of the 6th row is delayed by about one readout time from that of the 5th row. For the third exposure region EA3, the exposure start time point of the 8th row is delayed by about one readout time from that of the 7th row, and the exposure start time point of the 9th row is delayed by about one readout time from that of the 8th row.
In fact, within one frame time, for the pixel circuits 111 of the row whose exposure starts earliest in each exposure region, neither the exposure start time point nor the exposure end time point need be restricted; it is only required that the exposure time length of an exposure region farther from the light source 20 be greater than or equal to that of an exposure region closer to the light source 20, and that the readout circuit read the pixel circuits 111 of only one row of the pixel array 11 at any point in time.
Therefore, the exposure end time points of the pixel circuits 111 of each row of the pixel array 11 are different, the exposure start time points of the pixel circuits 111 of different rows in the same exposure region are different, and the exposure start time points of two rows of pixel circuits 111 belonging to different exposure regions may be the same or different.
It should be understood that the signal timing of the three exposure regions shown in fig. 7 is only for ease of understanding and does not limit the embodiments of the present application. The pixel array 11 may include any number of exposure regions, and each exposure region may include one or more rows of pixel circuits 111. When an exposure region includes a plurality of rows of pixel circuits 111, those rows may be consecutive rows or rows spaced apart from one another.
Next, the pixel array 11 including 8 different exposure regions will be described with reference to fig. 8 and 9. Referring to fig. 8 and 9, the abscissa in fig. 8 represents the distance d between the different exposure areas and the first edge 101, and the ordinate represents the exposure time Te corresponding to the different exposure areas. By way of example and not limitation, fig. 8 and 9 show that the pixel array 11 has first to eighth exposure regions EA1 to EA8, distances between the first to eighth exposure regions EA1 to EA8 and the light source 20 decrease in sequence, the first to eighth exposure regions EA1 to EA8 have respective exposure times T1 to T8, and it can be seen that exposure time lengths T1 to T8 of the first to eighth exposure regions EA1 to EA8 decrease in sequence.
Optionally, in an exemplary embodiment, the pixel array 11 may include 128 rows by 128 columns of pixel circuits 111, each row being arranged along the row direction in fig. 1 and each column along the column direction in fig. 1. The pixel circuits 111 of the 1st to 128th rows are divided in order into the first to eighth exposure regions EA1 to EA8. The first exposure region EA1 includes the pixel circuits 111 of rows 1 to 16. The second exposure region EA2 includes the pixel circuits 111 of rows 17 to 32. The third exposure region EA3 includes the pixel circuits 111 of rows 33 to 48. The fourth exposure region EA4 includes the pixel circuits 111 of rows 49 to 64. The fifth exposure region EA5 includes the pixel circuits 111 of rows 65 to 80. The sixth exposure region EA6 includes the pixel circuits 111 of rows 81 to 96. The seventh exposure region EA7 includes the pixel circuits 111 of rows 97 to 112. The eighth exposure region EA8 includes the pixel circuits 111 of rows 113 to 128.
The exposure operation of the pixel array 11 is as follows:
the pixel circuits 111 of the first exposure area EA1 are exposed line by line from the first line to the 16 th line. With respect to the pixel circuits 111 of two adjacent rows in the first exposure region, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T1.
The pixel circuits 111 of the second exposure area EA2 are exposed line by line from the 17 th line to the 32 th line. With respect to the pixel circuits 111 of two adjacent rows within the second exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T2. Further, the exposure start time of the pixel circuit 111 in the 17 th row is delayed by T1 to T2+ Tr from the exposure start time of the pixel circuit 111 in the 16 th row, and the exposure end time of the pixel circuit 111 in the 17 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 in the 16 th row.
The pixel circuits 111 of the third exposure area EA3 are exposed line by line from line 33 to line 48. With respect to the pixel circuits 111 of two adjacent rows in the third exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T3. Further, the exposure start time of the pixel circuit 111 in the 33 th row is delayed by T2 to T3+ Tr from the exposure start time of the pixel circuit 111 in the 32 th row, and the exposure end time of the pixel circuit 111 in the 33 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 in the 32 th row.
The pixel circuits 111 of the fourth exposure area EA4 are exposed line by line from the 49 th line to the 64 th line. With respect to the pixel circuits 111 of two adjacent rows within the fourth exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T4. Further, the exposure start time of the pixel circuit 111 in the 49 th row is delayed from the exposure start time of the pixel circuit 111 in the 48 th row by: T3-T4+ Tr, the exposure end time of the pixel circuit 111 of the 49 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 of the 48 th row.
The pixel circuits 111 of the fifth exposure area EA5 are exposed line by line from the line 65 to the line 80. With respect to the pixel circuits 111 of two adjacent rows within the fifth exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T5. Further, the exposure start time of the pixel circuit 111 in the 65 th row is delayed from the exposure start time of the pixel circuit 111 in the 64 th row by: T4-T5+ Tr, the exposure end time of the pixel circuit 111 of the 65 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 of the 64 th row.
The pixel circuits 111 of the sixth exposure area EA6 are exposed line by line from the 81 th line to the 96 th line. With respect to the pixel circuits 111 of two adjacent rows within the sixth exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T6. Further, the exposure start time of the pixel circuit 111 in the 81 th row is delayed from the exposure start time of the pixel circuit 111 in the 80 th row by: T5-T6+ Tr, the exposure end time of the pixel circuit 111 of the 81 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 of the 80 th row.
The pixel circuits 111 of the seventh exposure area EA7 are exposed line by line from line 97 to line 112. With the pixel circuits 111 of two adjacent rows in the seventh exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the preceding row, respectively, with the exposure time length of each row being T7. Further, the exposure start time of the pixel circuit 111 in the 97 th row is delayed from the exposure start time of the pixel circuit 111 in the 96 th row by: T6-T7+ Tr, the exposure end time of the pixel circuit 111 of the 97 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 of the 96 th row.
The pixel circuits 111 of the eighth exposure area EA8 are exposed line by line from line 113 to line 128. With respect to the pixel circuits 111 of two adjacent rows within the eighth exposure area, the exposure start time and the exposure end time of the subsequent row are delayed by one readout time length Tr from those of the previous row, respectively, with the exposure time length of each row being T8. Further, the exposure start time of the pixel circuit 111 in the 113 th row is delayed from the exposure start time of the pixel circuit 111 in the 112 th row by: T7-T8+ Tr, the exposure end time of the pixel circuit 111 of the 113 th row is delayed by one readout time Tr from the exposure end time of the pixel circuit 111 of the 112 th row.
As can be seen from the above, for the pixel circuits 111 of two adjacent rows in the same exposure region, the exposure start time and the exposure end time of the later row are each delayed by one readout time Tr relative to those of the earlier row. For adjacent exposure regions, the exposure start time of the first row of pixel circuits 111 of the later exposure region is delayed by more than one readout time Tr relative to the exposure start time of the last row of pixel circuits 111 of the earlier exposure region.
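The staggering rules of the worked example above can be condensed into a short Python sketch that computes per-row exposure start and end times for the 128-row, eight-region case. This is a sketch under stated assumptions: T1 through T8 and Tr are placeholder numbers, and the first row's exposure is simply anchored at t = 0.

```python
# Sketch of the rolling schedule described for the 128-row / 8-region example.
# Within a region, consecutive rows start and end one readout time Tr apart;
# at a region boundary the start is delayed by (T_prev - T_next + Tr) so that
# exposure end times stay exactly Tr apart and readout never overlaps.
# T1..T8 and Tr are illustrative placeholders, not values from the patent.
ROWS_PER_REGION = 16
EXPOSURE_TIMES = [800, 700, 600, 500, 400, 300, 200, 100]   # T1..T8, EA1 farthest from light source
Tr = 10

def region_of(row):
    """Map a 1-based row index to its 1-based exposure-region index."""
    return (row - 1) // ROWS_PER_REGION + 1

def schedule(num_rows=128, t0=0):
    """Return per-row (start, end) exposure times following the staggering rules."""
    rows = []
    end = t0 + EXPOSURE_TIMES[0]             # row 1: starts at t0, exposes for T1
    for row in range(1, num_rows + 1):
        te = EXPOSURE_TIMES[region_of(row) - 1]
        if row > 1:
            end += Tr                        # every row ends Tr after the previous one
        rows.append((end - te, end))
    return rows

if __name__ == "__main__":
    rows = schedule()
    # Readout windows [end, end + Tr) must never overlap between rows.
    for (_, e1), (_, e2) in zip(rows, rows[1:]):
        assert e2 >= e1 + Tr
    for r in (1, 16, 17, 32, 33, 128):
        s, e = rows[r - 1]
        print(f"row {r:3d} (EA{region_of(r)}): exposure {s:6d} -> {e:6d}")
```

One can check, for instance, that the 17th row's start is later than the 16th row's start by exactly T1 - T2 + Tr, matching the description above.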
At this point, the 128 rows of pixel circuits 111 of the pixel array 11 have completed the generation and readout of the image data of one frame. In general, the image sensor 10 can generate and read out image data for several frames per second. The number of rows and columns of the pixel array 11 and the frame rate all affect the sizes of the exposure time length Te and the readout time length Tr; this is familiar to those skilled in the art and is not described in detail in this application.
In the above embodiment, the readout circuit 13 reads out the electrical signal data of the pixel circuits 111 of only one row at any one time, and the electrical signals of pixel circuits 111 in different rows are not read out simultaneously. After the electrical signals of the pixel circuits 111 of the previous row have been read by the readout circuit 13, the readout circuit 13 begins to read the electrical signals of the pixel circuits 111 of the next row. The pixel circuits 111 of each row spend the same length of time, namely the readout time Tr, in the readout phase. In this way, the electrical signal data read by the readout circuit 13 can be kept highly accurate. Of course, in some embodiments, the pixel circuits 111 of different rows may also be read by the readout circuit 13 at the same time, or the pixel circuits 111 of different rows may have different readout times; the present application is not limited in this respect. In a modified embodiment, the pixel array 11 may also be exposed sequentially in order from the eighth exposure region EA8 to the first exposure region EA1.
Optionally, in some embodiments, the pixel array 11 may be divided into different exposure regions, and when the pixel array 11 performs exposure, the pixel circuits 111 in the same exposure region may have the same exposure time length, and the pixel circuits 111 in different exposure regions may have different exposure time lengths.
Optionally, in some embodiments, different rows of the same exposure region of the pixel array 11 may have the same or different exposure time lengths, and some rows of different regions may have the same or different exposure time lengths; overall, however, the beam energy received by an exposure region farther from the light source 20 is approximately equal to the beam energy received by an exposure region closer to the light source 20, so that the brightness of the image obtained after exposure and imaging is generally uniform, with no obvious overexposure of bright portions or underexposure of dark portions. The light beam described here refers primarily to the detection beam emitted by the light source 20, which may be, for example but not limited to, visible light and/or near-infrared light.
Alternatively, in some embodiments, the pixel array 11 may be exposed row by row starting from the first row, or row by row starting from the last row, or in an interlaced manner, or exposure region by exposure region, or by different numbers of rows of pixel circuits 111 at a time, or by selecting one or more specific rows of the as-yet-unexposed pixel circuits 111 for each exposure, and so on. The present application is not limited in this respect.
Optionally, in some embodiments, the pixel array 11 includes M rows by N columns of pixel circuits 111, and the M rows of pixel circuits 111 are divided into K exposure regions, where 1 < K ≤ M, M > 1, and M and K are positive integers. Among the K exposure regions, the exposure time of an exposure region closer to the light source 20 is not longer than the exposure time of an exposure region farther from the light source 20. It can be understood that the greater the number of exposure regions, with each exposure region having its own exposure time, the better the optical imaging effect obtained by the pixel array 11. In particular, when each row of pixel circuits 111 forms its own exposure region, that is, when K equals M, the exposure time of every row of pixel circuits 111 can be different, and the optical imaging effect of the pixel array 11 is even better.
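As a small illustrative helper (not a routine defined by the patent), the following Python sketch divides M rows into K exposure regions while checking the stated constraint 1 < K ≤ M; it also handles the case where M is not an exact multiple of K.

```python
# Sketch: divide M rows into K exposure regions (1 < K <= M), allowing the
# regions to hold different numbers of rows when M is not a multiple of K.
def divide_rows_into_regions(M, K):
    """Return a list of K lists of 1-based row indices."""
    if not (1 < K <= M):
        raise ValueError("requires 1 < K <= M with M > 1")
    base, extra = divmod(M, K)
    regions, row = [], 1
    for k in range(K):
        size = base + (1 if k < extra else 0)
        regions.append(list(range(row, row + size)))
        row += size
    return regions

if __name__ == "__main__":
    for k, rows in enumerate(divide_rows_into_regions(M=128, K=8), 1):
        print(f"EA{k}: rows {rows[0]}-{rows[-1]}")
```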
Although in the above embodiments each exposure region includes one or more rows of pixel circuits 111, so that the exposure regions can be regarded as being partitioned by rows, it should be understood that in some possible embodiments the exposure regions may instead be partitioned by columns; that is, one exposure region may include one or more columns of pixel circuits 111. The present application is not limited in this respect.
Optionally, in some embodiments, the light source 20 may include or be one or more of a light-emitting diode (LED), a vertical-cavity surface-emitting laser (VCSEL), a laser diode (LD), or a light-emitting array thereof. The detection beam 201 emitted by the light source 20 may be visible light and/or invisible light. In some embodiments, for example and without limitation, the detection beam 201 may be near-infrared light with a wavelength in the range of 780 nanometers to 2000 nanometers.
Optionally, in some embodiments, the light source 20 is configured to emit a beam of near-infrared light that is directed onto the finger through at least part of the display screen. Near-infrared light striking the finger can be reflected by the finger and/or transmitted out after entering the interior of the finger. For convenience of description, the near-infrared light transmitted from the finger and the reflected near-infrared light are collectively referred to as the beam returned from the finger. The pixel array 11 can receive the light beam returned from the finger through at least part of the display screen and convert it into a corresponding electrical signal to acquire fingerprint feature information of the finger. Of course, the present application is not limited to fingerprint detection; the biometric features detected may include, but are not limited to, fingerprints, palm prints, toe prints, other biometric prints, facial features, and the like, and the finger may equally be a palm, a toe, a face, or any other external object with a recognizable biometric feature. Alternatively, the near-infrared light may be a light beam with a wavelength in the range of 780 nanometers to 2000 nanometers. Of course, in other possible embodiments of the present application, the light source 20 may be used to emit visible and/or invisible light for acquiring optical image information of a fingerprint or other biometric feature.
Compared with the prior art, the pixel array 11 of the image sensor 10 of the present application has different exposure regions, which may have different exposure times according to the distance from the light source 20, and the exposure time of the exposure region closer to the light source 20 is shorter than the exposure time of the exposure region farther from the light source 20. Further, the exposure region may include a plurality of rows of pixel circuits 111, or the exposure region may include only one row of pixel circuits. The biological characteristic image with more uniform brightness can be obtained by exposing different exposure areas for different time lengths. Therefore, the method and the device can obtain optical imaging with uniform brightness and a large effective imaging range, and have a good biological feature detection effect.
It should be noted that some or all of the embodiments of the present application, as well as some or all of the modifications, replacements, alterations, splits, combinations, extensions, and the like of those embodiments, are considered to be covered by the inventive concept of the present application and to fall within its scope of protection.
Any reference in this application to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature or structure is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature or structure in connection with other ones of the embodiments.
The orientations and positional relationships indicated by "length", "width", "upper", "lower", "left", "right", "front", "rear", "back", "front", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, which may appear in the present specification, are based on the orientations and positional relationships shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. Like reference numbers and letters refer to like items in the figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance. In the description of the present invention, "plurality" or "a plurality" means at least two or two unless explicitly specifically defined otherwise. In the description of the present invention, it should be further noted that, unless otherwise explicitly stated or limited, the terms "disposed," "mounted," and "connected" are to be interpreted broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; either directly or indirectly through intervening media, or may be interconnected between two elements. The specific meaning of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The above description is only for the specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present invention, and all should be covered within the protection scope of the present invention. The terms used in the following claims should not be construed to limit the novel features to the specific embodiments disclosed in the specification. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. An image sensor, characterized by comprising a pixel array, wherein the pixel array comprises a plurality of pixel circuits arranged in a plurality of rows and a plurality of columns, the pixel array is divided into a plurality of exposure regions, each exposure region comprises one or more rows of pixel circuits, and some or all of the plurality of exposure regions have exposure times different from each other.
2. The image sensor of claim 1, wherein a light source is provided at one side of the image sensor, and an exposure region closer to the light source has a shorter exposure time than an exposure region farther from the light source.
3. The image sensor of claim 2, wherein the light source is configured to emit a light beam that can reach an external object and return from the external object, and wherein, while an exposure region is being exposed, the image sensor receives the light beam returned by the external object and converts it into an electrical signal so as to obtain biometric information of the external object.
4. The image sensor of claim 2, wherein the light source is disposed outside of a first row of pixel circuits and/or a last row of pixel circuits of the pixel array.
5. The image sensor of claim 1, wherein the pixel array includes M rows of pixel circuits, the M rows of pixel circuits being divided into K exposure regions, 1 < K ≦ M, M > 1, M, K being positive integers.
6. The image sensor of claim 1, wherein, within the same exposure region, the exposure duration of each row of pixel circuits is the same, or the exposure duration of the pixel circuits in some rows of the exposure region differs from that of the pixel circuits in the remaining rows.
7. The image sensor of claim 1, wherein the exposure end time points of the pixel circuits of different rows of the pixel array are different from one another, the exposure start time points of the pixel circuits of different rows within the same exposure region are different from one another, and the exposure start time points of the pixel circuits of two rows belonging to different exposure regions are the same or different.
8. The image sensor of claim 1, further comprising a row control circuit and a readout circuit, the row control circuit and the readout circuit being connected to the pixel circuits, the row control circuit being configured to control the exposure time of the pixel circuits, and the readout circuit being configured to read the electrical signals of the pixel circuits.
9. The image sensor of claim 8, wherein the pixel circuit comprises a photodiode, a first MOS transistor, a second MOS transistor, a third MOS transistor, and a fourth MOS transistor; a control terminal of the first MOS transistor is connected to the row control circuit to receive the control signal output by the row control circuit, a first terminal of the first MOS transistor is connected to a cathode of the photodiode, and a second terminal of the first MOS transistor is connected to a first terminal of the second MOS transistor; a control terminal of the second MOS transistor is connected to the row control circuit to receive the reset control signal output by the row control circuit, and a second terminal of the second MOS transistor is connected to the power supply voltage; a control terminal of the third MOS transistor is connected to the first terminal of the second MOS transistor so as to follow changes of the signal at the first terminal of the second MOS transistor and output a corresponding signal from a first terminal of the third MOS transistor, and a second terminal of the third MOS transistor is connected to the power supply voltage; a second terminal of the fourth MOS transistor is connected to the first terminal of the third MOS transistor, a control terminal of the fourth MOS transistor is connected to the row control circuit to receive the row strobe signal output by the row control circuit, and a first terminal of the fourth MOS transistor serves as the output terminal of the pixel circuit and is connected to the readout circuit.
10. The image sensor of claim 1, wherein the pixel circuits of the pixel array employ either a rolling exposure mode or a global exposure mode.
11. A biometric detection system comprising a light source and an image sensor, the image sensor being as claimed in any one of claims 1 to 10, the light source being adapted to emit a detection beam towards an external object, and the image sensor being adapted to receive the detection beam returned by the external object and to generate a corresponding biometric image.
12. The biometric detection system of claim 11, wherein the biometric detection system is used for fingerprint detection.
13. An electronic device comprising an image sensor according to any of claims 1 to 10, or comprising a biometric detection system according to claim 11 or 12.
14. The electronic device of claim 13, further comprising a display screen, wherein the biometric detection system is positioned below all or a portion of the display screen, and wherein the biometric detection system is capable of transmitting a detection beam to an external object and receiving a detection beam back from the external object through all or a portion of the display screen.
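By way of illustration only, and not as part of the claims, the following minimal sketch shows one way the exposure scheme of claims 1, 2, 5 and 7 could be modeled in software: an array of M rows is partitioned into K exposure regions, rows nearer an assumed light source beside the first row receive shorter exposure times, and the exposure end points of successive rows are staggered as in a rolling-shutter readout. All function names, parameters and numeric values are hypothetical and do not come from the patent text.

```python
# Illustrative sketch only: hypothetical names and values, not the patented implementation.
from dataclasses import dataclass


@dataclass
class RowExposure:
    row: int            # row index; row 0 is assumed nearest the light source
    region: int         # exposure region the row belongs to
    exposure_us: float  # exposure duration assigned to this row's region
    start_us: float     # exposure start time point of this row
    end_us: float       # exposure end time point of this row


def plan_rolling_exposure(m_rows: int, k_regions: int,
                          base_exposure_us: float, step_us: float,
                          row_readout_us: float) -> list:
    """Partition M rows into K exposure regions and derive per-row exposure windows.

    Regions farther from the light source get longer exposures; exposure end
    points are staggered row by row, as in a rolling-shutter readout.
    """
    assert m_rows > 1 and 1 < k_regions <= m_rows
    rows_per_region = -(-m_rows // k_regions)        # ceiling division
    longest = base_exposure_us + (k_regions - 1) * step_us
    plan = []
    for row in range(m_rows):
        region = row // rows_per_region              # region 0 is nearest the light source
        exposure = base_exposure_us + region * step_us
        end = longest + row * row_readout_us         # every row ends exposure at a different time
        plan.append(RowExposure(row, region, exposure, end - exposure, end))
    return plan


if __name__ == "__main__":
    for rp in plan_rolling_exposure(m_rows=8, k_regions=4,
                                    base_exposure_us=100.0, step_us=50.0,
                                    row_readout_us=20.0):
        print(rp)
```

With these hypothetical numbers, the region nearest the light source is exposed for 100 µs and the farthest for 250 µs, while each row finishes its exposure 20 µs after the previous one.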
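Similarly, the four-transistor pixel of claim 9 can be summarized with a rough behavioral model. The sketch below is an assumption-laden simplification (idealized voltages, hypothetical class and parameter names) intended only to make the reset, transfer, source-follower and row-select sequence concrete; it is not a circuit-level description of the patented pixel.

```python
# Rough behavioral model only: idealized, hypothetical names, not a circuit-level description.
from typing import Optional


class FourTransistorPixel:
    """Behavioral stand-in for the pixel of claim 9: transfer (first MOS), reset
    (second MOS), source follower (third MOS) and row select (fourth MOS)."""

    def __init__(self, vdd: float = 3.3, sf_gain: float = 0.85):
        self.vdd = vdd                 # power supply voltage
        self.sf_gain = sf_gain         # source-follower gain, below unity in practice
        self.photodiode_charge = 0.0   # charge integrated on the photodiode
        self.floating_node = vdd       # node shared by the first, second and third MOS

    def reset(self) -> None:
        """Reset control signal on the second MOS pulls the shared node to VDD."""
        self.floating_node = self.vdd

    def expose(self, photocurrent: float, exposure_time: float) -> None:
        """Integrate photocurrent on the photodiode during the exposure window."""
        self.photodiode_charge += photocurrent * exposure_time

    def transfer(self, conversion_gain: float = 1.0) -> None:
        """Control signal on the first MOS moves the photodiode charge to the shared
        node, lowering its voltage in proportion to the collected charge."""
        self.floating_node -= conversion_gain * self.photodiode_charge
        self.photodiode_charge = 0.0

    def read(self, row_strobe: bool) -> Optional[float]:
        """Row strobe on the fourth MOS gates the source-follower output (third MOS)
        to the readout circuit; the pixel drives no output when unselected."""
        return self.sf_gain * self.floating_node if row_strobe else None


if __name__ == "__main__":
    pixel = FourTransistorPixel()
    pixel.reset()
    pixel.expose(photocurrent=0.02, exposure_time=10.0)  # arbitrary units
    pixel.transfer()
    print(pixel.read(row_strobe=True))                   # readout of the exposed pixel
```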
CN202020315246.9U 2020-03-14 2020-03-14 Image sensor, biometric detection system, and electronic device Active CN211702211U (en)

Priority Applications (1)

Application Number: CN202020315246.9U
Priority Date: 2020-03-14
Filing Date: 2020-03-14
Title: Image sensor, biometric detection system, and electronic device

Publications (1)

Publication Number: CN211702211U
Publication Date: 2020-10-16

Family ID: 72780832

Family Applications (1)

Application Number: CN202020315246.9U
Status: Active
Publication: CN211702211U (en)

Country Status (1)

Country: CN
Publication: CN211702211U (en)

Legal Events

Date Code Title Description
GR01 Patent grant