CN110574367A - Image sensor and image sensing method


Info

Publication number: CN110574367A
Application number: CN201980001909.XA
Authority: CN (China)
Prior art keywords: pixel, pixels, filter layer, control unit, exposure
Legal status: Pending
Original language: Chinese (zh)
Inventors: 王晗, 杨红明, 涂娇姣
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 - Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 - Arrangement of colour filter arrays [CFA]; Filter mosaics

Abstract

The embodiments of this application disclose an image sensor and an image sensing method. In the sensor, an infrared cut-off filter layer is coated on the microlenses corresponding to the red, green and blue pixels, so that infrared light cannot enter those pixels. The red, green and blue pixels therefore sense only red, green and blue light respectively, and the IR component is removed from the visible-light sensing result, so that the R, G, B and IR components are sensed independently and the sensing performance of the sensor is greatly improved.

Description

Image sensor and image sensing method
Technical Field
The present application relates to the field of image processing, and in particular to an image sensor and an image sensing method.
Background
The color filter array (CFA) of a conventional Bayer red-green-blue sensor (RGB sensor) contains three components: R, G and B. Unlike a conventional Bayer RGB sensor, the CFA of a red-green-blue-infrared (RGBIR) sensor contains four components: R, G, B and IR. Fig. 1 shows the light-sensing characteristic curves of the pixels in an RGBIR sensor. Only the IR filter layer transmits infrared light alone; the R filter layer transmits both red light and infrared light, the G filter layer transmits both green light and infrared light, and the B filter layer transmits both blue light and infrared light. Even with these filter layers, the IR component in the visible light cannot be completely stripped off, so the sensing results of the R, G and B pixels of the photosensitive device all contain a certain amount of IR signal. Because of this IR component, the color information of the image signal obtained by the sensor is inaccurate, and under certain illumination conditions the sensing performance of existing RGBIR sensors is unsatisfactory.
At present, more and more application scenarios rely on both visible-light and infrared-light signals, for example liveness detection, night-time video surveillance and color/black-and-white dynamic fusion. These scenarios place higher demands on the color accuracy and dynamic range of images, so improving the sensing performance of the sensor is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of this application provide an image sensor and an image sensing method, so that the R, G, B and IR components can be sensed independently and the sensing performance is greatly improved.
A first aspect of the present application provides an image sensor comprising a pixel array and a filter layer. The pixel array includes red pixels, green pixels, blue pixels and infrared pixels, and each pixel corresponds to one microlens. The filter layer includes an infrared cut-off filter layer, which is coated on the microlenses corresponding to the red, green and blue pixels and which cuts off optical signals with wavelengths greater than a first preset wavelength; such optical signals include infrared light.
In the image sensor provided by the embodiments of this application, the infrared cut-off filter layer coated on the microlenses corresponding to the red, green and blue pixels blocks IR light from entering the visible-light pixels, so that the IR component is removed from the sensing result of the visible-light pixels, the colors of the sensing result are more accurate, and the sensing performance of the sensor is improved. Furthermore, because the infrared cut-off filter layer is applied to the microlenses by a coating process, no complex mechanical structure needs to be added, and the structure of the pixel under the microlens is not changed. A relatively simple and stable internal pixel structure helps control problems such as the influence of the chief ray angle (CRA) on imaging, so the sensing performance is improved while the pixel structure remains stable.
In one possible embodiment, the first preset wavelength is 650 nm. In this case the infrared cut-off filter layer cuts off light with wavelengths beyond the visible range, ensuring that infrared light of all wavelength ranges cannot enter the red, green and blue pixels.
In one possible embodiment, the filter layer further comprises a red filter layer, a green filter layer, a blue filter layer and an infrared filter layer. The infrared filter layer is coated on the microlenses corresponding to the infrared pixels and passes only infrared light in a specific wavelength range. The red filter layer is coated on the microlenses corresponding to the red pixels, the green filter layer on the microlenses corresponding to the green pixels, and the blue filter layer on the microlenses corresponding to the blue pixels. The red filter layer passes only red light and infrared light in a first wavelength range, the green filter layer passes only green light and infrared light in a second wavelength range, and the blue filter layer passes only blue light and infrared light in a third wavelength range; the wavelengths of the infrared light in the first, second and third wavelength ranges are all greater than the first preset wavelength.
In the image sensor provided by the embodiments of this application, the red filter layer and the infrared cut-off filter layer are coated over the red pixels, so that the IR component is filtered out of the red pixels' sensing result and the red pixels sense only R light. Correspondingly, the green filter layer and the infrared cut-off filter layer are coated over the green pixels, and the blue filter layer and the infrared cut-off filter layer over the blue pixels, so that the IR component is filtered out of their sensing results, the green pixels sense only G light, and the blue pixels sense only B light. The infrared filter layer is coated over the infrared pixels so that the IR pixels sense only IR light. Together, this greatly improves the color accuracy of the sensing result obtained by the RGBIR sensor.
It should be understood that the red filter layer is above or below the infrared light cut filter layer; the green filter layer is arranged above or below the infrared light cut-off filter layer; the blue filter layer is arranged above or below the infrared light cut-off filter layer. The coating sequence of the infrared light cut-off filter layer, the red filter layer, the green filter layer and the blue filter layer on the micro lens is not limited in the embodiment of the application.
In the image sensor provided by the embodiments of this application, the red filter layer and the infrared cut-off filter layer are coated on the microlenses of the red pixels, the green filter layer and the infrared cut-off filter layer on the microlenses of the green pixels, the blue filter layer and the infrared cut-off filter layer on the microlenses of the blue pixels, and the infrared filter layer on the microlenses of the infrared pixels. The stacking order of the infrared cut-off filter layer relative to the red, green and blue filter layers on the microlenses is not limited: the red, green and blue filter layers may each be coated on top of the infrared cut-off filter layer, or the infrared cut-off filter layer may be coated on top of the red, green and blue filter layers, as long as the light passes through both the infrared cut-off filter layer and the filter layer of the corresponding visible-light component before reaching the microlens.
In one optional case, the infrared cut-off filter layer is coated on the microlenses while the red, green and blue filter layers are coated on the inner side of the microlenses or arranged inside the red, green and blue pixels respectively; in another optional case, the red, green and blue filter layers are coated on the microlenses while the infrared cut-off filter layer is coated on the inner side of the microlenses or arranged inside the red, green and blue pixels.
In a possible embodiment, the sensor further includes an optical filter that filters out ultraviolet light as well as infrared light with wavelengths greater than a second preset wavelength, where the second preset wavelength is greater than the first preset wavelength and greater than any wavelength in the specific wavelength range. Light passes through the optical filter, the filter layer and the microlens in sequence before reaching the pixel array.
In the image sensor provided by the embodiments of this application, the optical filter removes from natural light the longer-wavelength far-infrared light and the shorter-wavelength ultraviolet light, preventing far-infrared and ultraviolet light from affecting the light-sensing characteristics of the photosensitive devices.
In one possible embodiment, the sensor further comprises a charge readout module, each pixel in the array of pixels comprising a photosensitive device; the photosensitive device is used for converting light into electric charge; the charge readout module outputs the charges accumulated by the photosensitive device to obtain a photosensitive result.
In one possible embodiment, the sensor further comprises: and the logic control circuit is used for independently controlling the exposure time of a visible light pixel and the exposure time of the infrared light pixel respectively, wherein the visible light pixel comprises the red pixel, the green pixel and the blue pixel.
In existing RGBIR sensors, the exposure of the RGB visible-light components and the IR component is controlled jointly, so exposure imbalance easily occurs when the illumination conditions are not ideal and the dynamic range of the sensor is poor. In the image sensor provided by the embodiments of this application, the exposure times of the visible-light pixels and the IR pixels are controlled independently. For example, when the infrared light is too strong and the visible light is too weak, the exposure time of the visible-light pixels can be increased and the exposure time of the IR pixels reduced, so that the two exposures tend toward balance. This avoids the exposure imbalance that easily appears when either infrared light or visible light dominates, raises the dynamic range of the sensor, and meets user requirements on indicators such as sharpness and signal-to-noise ratio.
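As an illustration of this independent control, the following is a minimal sketch in Python (not taken from the patent; the function names, the linear signal model and the numeric limits are assumptions) of how the two exposure times might be rebalanced from the previous frame's per-channel statistics:

```python
# Minimal sketch (assumption, not the patent's control logic): rebalance the
# visible-light and IR exposure times independently when one channel dominates,
# using the mean signal level of each channel from the previous frame.

def balance_exposures(t_vis_us, t_ir_us, mean_vis, mean_ir,
                      target=0.5, t_min_us=100, t_max_us=33_000):
    """Scale each channel's exposure time toward a common target signal level."""
    def adjust(t_us, mean_level):
        if mean_level <= 0:
            return t_max_us                    # channel fully dark: expose as long as allowed
        scaled = t_us * (target / mean_level)  # assumed linear model: signal ~ exposure time
        return min(max(scaled, t_min_us), t_max_us)
    return adjust(t_vis_us, mean_vis), adjust(t_ir_us, mean_ir)

# Example: infrared too strong (mean 0.9), visible too weak (mean 0.2)
new_vis, new_ir = balance_exposures(8_000, 8_000, mean_vis=0.2, mean_ir=0.9)
# new_vis = 20000 us (longer visible exposure), new_ir ~ 4444 us (shorter IR exposure)
```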
In one possible implementation, the logic control circuit includes a first control line and a second control line; the visible-light pixels in the pixel array are coupled to the first control line, and the infrared pixels are coupled to the second control line. The logic control circuit is specifically configured to control the exposure start time of the visible-light pixels through the first control line and the exposure start time of the infrared pixels through the second control line.
In one possible implementation, the logic control circuit is further configured to: and controlling the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
Illustratively, the first control line outputs a first control signal, the second control line outputs a second control signal, the visible light pixel starts to be exposed when a first effective transition edge of the first control signal arrives, and the infrared light pixel starts to be exposed when a second effective transition edge of the second control signal arrives; by setting the arrival time of the first effective jumping edge and the second effective jumping edge, the exposure time of the visible light pixel and the exposure time of the infrared light pixel meet a preset proportion.
In the image sensor provided by the embodiments of this application, the exposure times of the visible-light and infrared-light signals can be made to satisfy a preset ratio by setting the arrival times of the active transition edges of their control signals. For example, when the ratio of visible-light exposure time to infrared-light exposure time is 2:1, the exposure result has better sharpness and a higher signal-to-noise ratio; the control signal of the visible-light pixels transitions first and that of the infrared pixels transitions later, and the time difference between the two transition points is chosen so that the two exposure times satisfy the preset ratio. Precisely setting the ratio of visible-light to infrared-light exposure time therefore gives finer control over the sensing behaviour of the sensor. Illustratively, the active transition edge may be a falling edge of a high-level signal, a rising edge of a low-level signal, a rising edge of a high-level signal, a falling edge of a low-level signal, and so on.
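A minimal sketch of how the two active-edge times could be chosen for a desired ratio, assuming both exposures end at a shared readout instant (the names and time unit are illustrative, not from the patent):

```python
# Minimal sketch (assumption): derive the start times (active transition edges)
# of the visible-light and IR control signals so that the exposure times meet a
# preset ratio, with both exposures ending at a common readout time.

def edge_offsets(readout_time_us: float, ir_exposure_us: float, ratio: float = 2.0):
    """Return (visible_start_us, ir_start_us) so that t_visible : t_ir == ratio : 1."""
    visible_exposure_us = ir_exposure_us * ratio
    visible_start_us = readout_time_us - visible_exposure_us  # earlier edge, longer exposure
    ir_start_us = readout_time_us - ir_exposure_us            # later edge, shorter exposure
    return visible_start_us, ir_start_us

# Example: a 2:1 visible-to-IR ratio with readout at 10 ms and a 3 ms IR exposure
vis_t0, ir_t0 = edge_offsets(readout_time_us=10_000, ir_exposure_us=3_000, ratio=2.0)
# vis_t0 = 4000 us (6 ms visible exposure), ir_t0 = 7000 us (3 ms IR exposure)
```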
In one possible embodiment, the sensor further comprises: and the logic control circuit is used for independently controlling the exposure time of the red pixel, the green pixel, the blue pixel and the infrared pixel.
In the image sensor provided by the embodiments of this application, the exposure times of the four components R, G, B and IR are controlled independently. When a scene calls for strong sensing of the R and G components and weaker sensing of the B and IR components, the exposure times of the four components can be controlled flexibly to strengthen the R and G responses and weaken the B and IR responses, so that the final sensing result better matches the scene requirements. This further raises the dynamic range of the sensor and provides a sensing result whose sharpness and signal-to-noise ratio better meet customer requirements.
In one possible implementation, the logic control circuit includes: a first control line, a second control line, a third control line, and a fourth control line, a red pixel in the pixel array coupled to the first control line, a green pixel in the pixel array coupled to the second control line, a blue pixel in the pixel array coupled to the third control line, an infrared pixel in the pixel array coupled to the fourth control line; the logic control circuit is specifically configured to: controlling an exposure start time of the red pixel based on the first control line; controlling an exposure start time of the green pixel based on the second control line; controlling an exposure start time of the blue pixel based on the third control line; and controlling the exposure starting time of the infrared light pixel based on the fourth control line.
In one possible implementation, the logic control circuit is further configured to control the exposure times of the red, green, blue and infrared pixels to satisfy a preset ratio based on the first, second, third and fourth control lines.
The image sensor provided by the embodiment of the application can preset the exposure time of R, G, B and IR components to meet the preset proportion so as to realize fine control on the photosensitive effect of the sensor.
Illustratively, the first control line outputs a first control signal, the second control line outputs a second control signal, the third control line outputs a third control signal, and the fourth control line outputs a fourth control signal. The red pixels start to be exposed when the first active transition edge of the first control signal arrives, the green pixels when the second active transition edge of the second control signal arrives, the blue pixels when the third active transition edge of the third control signal arrives, and the infrared pixels when the fourth active transition edge of the fourth control signal arrives. By setting the arrival times of the first, second, third and fourth active transition edges, the exposure times of the R, G, B and IR components are made to satisfy the preset ratio.
In one possible embodiment, the sensor further comprises a row coordinate control line, a column coordinate control line and an exposure start control line. Each pixel in the pixel array is coupled to a respective row coordinate control line and column coordinate control line, and the exposure start control line comprises a plurality of branches, each branch corresponding to one pixel. When the control signals output on the row coordinate control line and the column coordinate control line of a target pixel are both at an active level, the branch of the exposure start control line corresponding to that target pixel outputs a control signal that controls its exposure start time; the target pixel is any pixel in the pixel array.
With the image sensor provided by the embodiments of this application, the exposure time of each pixel can be controlled independently. In scenes that require enhancing a target region, the exposure time can be increased for the pixels of that region only, which further improves the flexibility of the sensor and better satisfies user requirements on the sensing result.
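A minimal sketch of the row/column gating described above, assuming active-high levels and a rectangular target region (the names and array size are invented for illustration):

```python
# Minimal sketch (assumption, not the patent's circuit): a pixel's exposure-start
# branch fires only when both its row-coordinate and column-coordinate control
# lines are at the active level, so a rectangular target region can be given a
# longer exposure than the rest of the array.

def start_exposure(row_active, col_active, x, y):
    """True if the exposure-start branch of pixel (x, y) should fire now."""
    return row_active[y] and col_active[x]

rows, cols = 8, 8
row_active = [2 <= y <= 5 for y in range(rows)]   # enable target rows 2..5
col_active = [3 <= x <= 6 for x in range(cols)]   # enable target columns 3..6

region = [(x, y) for y in range(rows) for x in range(cols)
          if start_exposure(row_active, col_active, x, y)]
# 'region' lists the 16 pixels whose exposure starts at this instant; the other
# pixels start later and therefore receive a shorter exposure.
```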
In one possible embodiment, the sensor further comprises an exposure end control signal that uniformly controls the exposure end time of all pixels in the pixel array.
In one possible implementation, the logic control circuit uses a first control variable x and a second control variable y. When x and y satisfy the coordinate condition of a visible-light pixel, the reset signal of the logic control circuit is output to the first control line as the first control signal; when x and y satisfy the coordinate condition of an IR pixel, the reset signal is output to the second control line as the second control signal.
In one possible implementation, the logic control circuit uses a first control variable x and a second control variable y. When x and y satisfy the coordinate condition of an R pixel, the reset signal of the logic control circuit is output to the first control line as the first control signal; when x and y satisfy the coordinate condition of a G pixel, it is output to the second control line as the second control signal; when x and y satisfy the coordinate condition of a B pixel, it is output to the third control line as the third control signal; and when x and y satisfy the coordinate condition of an IR pixel, it is output to the fourth control line as the fourth control signal.
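A minimal sketch of such a coordinate condition, assuming one possible 2X2 RGBIR ordering (R and G on the even rows, B and IR on the odd rows); the actual orderings are those of Figs. 2a and 2b, and the line names are invented:

```python
# Minimal sketch (assumption): route the logic control circuit's reset signal to
# the per-component control line whose coordinate condition the pixel (x, y)
# satisfies, for one possible 2x2 RGBIR ordering.

def control_line_for(x: int, y: int) -> str:
    if y % 2 == 0:
        return "line1_R" if x % 2 == 0 else "line2_G"
    return "line3_B" if x % 2 == 0 else "line4_IR"

def route_reset(x: int, y: int, reset: bool) -> dict:
    """Drive only the control line selected by the coordinate condition of (x, y)."""
    lines = {"line1_R": False, "line2_G": False, "line3_B": False, "line4_IR": False}
    lines[control_line_for(x, y)] = reset
    return lines

# Example: under this ordering, pixel (1, 1) is an IR pixel, so only line4_IR fires
assert route_reset(1, 1, True)["line4_IR"] is True
```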
A second aspect of the present application provides an image sensing method applied to an image sensor. The sensor comprises an infrared cut-off filter layer, microlenses and a pixel array; the pixel array includes red pixels, green pixels, blue pixels and infrared pixels, and each pixel corresponds to one microlens. The infrared cut-off filter layer is coated on the microlenses corresponding to the red, green and blue pixels. The method comprises: light reaches the red, green and blue pixels through the infrared cut-off filter layer and the microlenses, and the infrared cut-off filter layer cuts off infrared light so that it cannot enter the red, green or blue pixels.
In one possible embodiment, the sensor further comprises a red filter layer, a green filter layer, a blue filter layer and an infrared filter layer. The red filter layer is coated on the microlenses corresponding to the red pixels, the green filter layer on the microlenses corresponding to the green pixels, the blue filter layer on the microlenses corresponding to the blue pixels, and the infrared filter layer on the microlenses corresponding to the infrared pixels. The method specifically comprises: light passes through the infrared filter layer and the microlens in sequence to reach the infrared pixel; light passes through the infrared cut-off filter layer, the red filter layer and the microlens in sequence to reach the red pixel; light passes through the infrared cut-off filter layer, the green filter layer and the microlens in sequence to reach the green pixel; and light passes through the infrared cut-off filter layer, the blue filter layer and the microlens in sequence to reach the blue pixel. Alternatively, light passes through the red filter layer, the infrared cut-off filter layer and the microlens in sequence to reach the red pixel; through the green filter layer, the infrared cut-off filter layer and the microlens in sequence to reach the green pixel; and through the blue filter layer, the infrared cut-off filter layer and the microlens in sequence to reach the blue pixel. The infrared filter layer passes only infrared light in a specific wavelength range, the red filter layer passes only red light and infrared light in a first wavelength range, the green filter layer passes only green light and infrared light in a second wavelength range, and the blue filter layer passes only blue light and infrared light in a third wavelength range; the infrared light cut off by the infrared cut-off filter layer includes the infrared light in the first, second and third wavelength ranges.
The embodiments of this application do not limit the stacking order of the infrared cut-off filter layer and the red, green and blue filter layers on the microlenses: the red, green and blue filter layers may each be coated on top of the infrared cut-off filter layer, or the infrared cut-off filter layer may be coated on top of the red, green and blue filter layers, as long as the light passes through both the infrared cut-off filter layer and the filter layer of the corresponding visible-light component before reaching the microlens.
In a possible embodiment, the sensor further includes an optical filter, and the light mentioned above is the light obtained after the original natural light passes through this optical filter. The optical filter filters out ultraviolet light and far-infrared light, where the wavelength of the far-infrared light is greater than the wavelengths in the specific range that the infrared filter layer allows to pass. The method further comprises: the original natural light passes through the optical filter to obtain the light.
In one possible embodiment, the sensor further comprises a charge readout module and each pixel in the pixel array comprises a photosensitive device. The method further comprises: the photosensitive device converts light into electric charge, and the accumulated charge is output through the charge readout module to obtain the sensing result.
In one possible embodiment, the method further comprises: controlling an exposure start time of a visible light pixel including the red pixel, the green pixel, and the blue pixel based on the first control line; and controlling the exposure starting time of the infrared light pixel based on the second control line.
In one possible embodiment, the method further comprises: controlling the exposure times of the visible-light pixels and the infrared pixels to satisfy a preset ratio based on the first control line and the second control line.
In one possible embodiment, the method further comprises: controlling an exposure start time of the red pixel based on a first control line; controlling an exposure start time of the green pixel based on a second control line; controlling an exposure start time of the blue pixel based on a third control line; the exposure start time of the infrared light pixel is controlled based on the fourth control line.
In one possible embodiment, the method further comprises: controlling the exposure times of the red, green, blue and infrared pixels to satisfy a preset ratio based on the first, second, third and fourth control lines.
In one possible embodiment, each pixel in the sensor is coupled to a respective row coordinate control line and column coordinate control line and corresponds to one branch of the exposure start control line. The method further comprises: when the control signals output on the row coordinate control line and the column coordinate control line of a target pixel are both at an active level, the branch of the exposure start control line corresponding to that target pixel outputs a control signal, and the exposure start time of the target pixel is controlled based on that control signal; the target pixel is any pixel in the pixel array.
A third aspect of the present application provides an apparatus for independent exposure, the apparatus comprising: the exposure control device comprises at least two control units, wherein each control unit is used for correspondingly controlling the exposure starting time of one type of pixels in a pixel array of a sensor, and the pixel array of the sensor comprises at least two types of pixels.
In existing sensors containing multiple types of pixels, the exposure times of the different pixel types are controlled jointly, so exposure imbalance easily occurs when the illumination conditions are not ideal, the exposure control is inflexible, and the dynamic range of the sensor is poor. The apparatus provided by this application can control the exposure times of the different pixel types in the sensor independently, which raises the dynamic range and signal-to-noise ratio of the sensor. For example, the apparatus may be a control unit or logic control circuit independent of the sensor, and the corresponding product form may be a processor or a chip product containing a processor.
In one possible embodiment, the apparatus further comprises: the pixel array.
The device may be a sensor comprising a control unit.
In one possible embodiment, the sensor is an RGBIR sensor and the at least two types of pixels comprise visible-light pixels and IR pixels, the visible-light pixels comprising R pixels, G pixels and B pixels. Alternatively, the at least two types of pixels comprise R pixels, G pixels, B pixels and IR pixels, and the at least two control units comprise a first control unit, a second control unit, a third control unit and a fourth control unit; the first control unit controls the exposure start time of the R pixels, the second control unit that of the G pixels, the third control unit that of the B pixels, and the fourth control unit that of the IR pixels.
In one possible embodiment, the sensor is an RGBW sensor and the at least two types of pixels comprise visible-light pixels and W pixels, the visible-light pixels comprising R pixels, G pixels and B pixels, and the at least two control units comprise a first control unit and a second control unit; the first control unit controls the exposure start time of the visible-light pixels and the second control unit that of the W pixels. Alternatively, the at least two types of pixels comprise R pixels, G pixels, B pixels and W pixels, and the at least two control units comprise a first control unit, a second control unit, a third control unit and a fourth control unit; the first control unit controls the exposure start time of the R pixels, the second control unit that of the G pixels, the third control unit that of the B pixels, and the fourth control unit that of the W pixels.
In one possible embodiment, the sensor is an RCCB sensor and the at least two types of pixels comprise visible-light pixels and C pixels, the visible-light pixels comprising R pixels and B pixels, and the at least two control units comprise a first control unit and a second control unit; the first control unit controls the exposure start time of the visible-light pixels and the second control unit that of the C pixels. Alternatively, the at least two types of pixels comprise R pixels, B pixels and C pixels, and the at least two control units comprise a first control unit, a second control unit and a third control unit; the first control unit controls the exposure start time of the R pixels, the second control unit that of the B pixels, and the third control unit that of the C pixels.
In one possible embodiment, the exposure times of the at least two types of pixels are controlled to satisfy a preset ratio based on the at least two control units.
In one possible embodiment, the apparatus further comprises an exposure end control unit that uniformly controls the exposure end time of all pixels in the pixel array.
A fourth aspect of the present application provides a method of independent exposure, the method being applied to a sensor comprising at least two types of pixels, the at least two types of pixels comprising a first type of pixel and a second type of pixel, the method comprising: controlling an exposure start time of the first type of pixel based on a first control unit; the exposure start time of the second type of pixel is controlled based on a second control unit.
In one possible embodiment, the method further comprises: controlling the exposure time of each of the at least two types of pixels so that the exposure times satisfy a preset ratio.
Illustratively, the exposure times of the visible-light pixels and the IR pixels are controlled to satisfy a preset ratio based on the first and second control units; or the exposure times of the R, G, B and IR pixels are controlled to satisfy a preset ratio based on the first, second, third and fourth control units; or the exposure times of the visible-light pixels and the W pixels are controlled to satisfy a preset ratio based on the first and second control units; or the exposure times of the R, G, B and W pixels are controlled to satisfy a preset ratio based on the first, second, third and fourth control units; or the exposure times of the visible-light pixels and the C pixels are controlled to satisfy a preset ratio based on the first and second control units; or the exposure times of the R, B and C pixels are controlled to satisfy a preset ratio based on the first, second and third control units.
In one possible embodiment, the sensor is an RGBIR sensor, the first type of pixels are visible-light pixels and the second type of pixels are IR pixels, the visible-light pixels comprising R, G and B pixels; or the sensor is an RGBW sensor, the first type of pixels are visible-light pixels and the second type of pixels are W pixels, the visible-light pixels comprising R, G and B pixels; or the sensor is an RCCB sensor, the first type of pixels are visible-light pixels and the second type of pixels are C pixels, the visible-light pixels comprising R and B pixels.
In one possible embodiment, the at least two types of pixels further comprise: a third type of pixel; the method further comprises the following steps: the exposure start time of the third type of pixel is controlled based on a third control unit.
In one possible embodiment, the sensor is an RCCB sensor, the first type of pixel is an R pixel, the second type of pixel is a B pixel, and the third type of pixel is a C pixel. The method specifically comprises: controlling the exposure start time of the R pixels based on the first control unit; controlling the exposure start time of the B pixels based on the second control unit; and controlling the exposure start time of the C pixels based on the third control unit.
In one possible embodiment, the at least two types of pixels further comprise: a third type of pixel and a fourth type of pixel, the method further comprising: controlling an exposure start time of the third type of pixel based on a third control unit; the exposure start time of the fourth type of pixel is controlled based on a fourth control unit.
In one possible embodiment, the sensor is an RGBIR sensor, the first type of pixels are R pixels, the second type are G pixels, the third type are B pixels and the fourth type are IR pixels. The method specifically comprises: controlling the exposure start time of the R pixels based on the first control unit; the G pixels based on the second control unit; the B pixels based on the third control unit; and the IR pixels based on the fourth control unit. Alternatively, the sensor is an RGBW sensor, the first type of pixels are R pixels, the second type are G pixels, the third type are B pixels and the fourth type are W pixels. The method specifically comprises: controlling the exposure start time of the R pixels based on the first control unit; the G pixels based on the second control unit; the B pixels based on the third control unit; and the W pixels based on the fourth control unit.
In one possible embodiment, the method further comprises: the exposure end time of all the pixels in the pixel array is uniformly controlled based on an exposure end control unit.
A fifth aspect of the present application provides a computer-readable storage medium having stored therein instructions which, when run on a computer or processor, cause the computer or processor to perform the method of the fourth aspect or any of its possible embodiments.
A sixth aspect of the present application provides a computer program product comprising instructions which, when run on a computer or processor, cause the computer or processor to perform the method as set forth in the fourth aspect or any one of its possible embodiments.
Drawings
FIG. 1 is a diagram illustrating a photosensitive characteristic curve of each pixel of an exemplary RGBIR sensor according to an embodiment of the present disclosure;
FIG. 2a is a schematic diagram of an exemplary 2X2 array ordered RGBIR sensor;
FIG. 2b is a schematic diagram of another exemplary 2X2 array ordered RGBIR sensor;
FIG. 3a is a schematic diagram of an exemplary 4X4 array ordered RGBIR sensor;
FIG. 3b is a schematic diagram of another exemplary 4X4 array ordered RGBIR sensor;
FIG. 4 is a schematic structural diagram of an exemplary RGBIR sensor provided in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of another exemplary RGBIR sensor provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of another exemplary RGBIR sensor provided in an embodiment of the present application;
FIG. 7a is a schematic diagram of an exemplary 2X2 array ordered RGBIR control connection provided by an embodiment of the present application;
FIG. 7b is a schematic diagram of an exemplary 4X4 array ordered RGBIR control connection provided by an embodiment of the present application;
FIG. 8 is a timing diagram of exemplary control signals provided by an embodiment of the present application;
FIG. 9a is a schematic diagram of another exemplary 2X2 array ordered RGBIR control connection provided by an embodiment of the present application;
FIG. 9b is a schematic diagram of another exemplary 4X4 array ordered RGBIR control connection provided by an embodiment of the present application;
FIG. 10 is a timing diagram of exemplary control signals provided by embodiments of the present application;
FIG. 11 is a schematic diagram of an exemplary sensor in which the exposure time of each pixel can be independently controlled;
FIG. 12 is a timing diagram of exemplary control signals provided by an embodiment of the present application;
FIG. 13 is a graph of the light-sensing characteristics of each pixel of the RGBIR sensor provided in an embodiment of the present application;
FIG. 14a is a schematic diagram of an exemplary 2X2 array ordered RGBW sensor;
FIG. 14b is a schematic diagram of an exemplary 2X2 array ordered RCCB sensor;
FIG. 15 is a block diagram of an exemplary control unit for independent exposure of an RCCB sensor according to an embodiment of the present application;
FIG. 16 is a diagram illustrating a hardware architecture of an exemplary apparatus for independent exposure according to an embodiment of the present disclosure;
FIG. 17 is a flowchart illustrating an exemplary image sensing method according to an embodiment of the application;
FIG. 18 is a flowchart illustrating an exemplary method for independently controlling exposure time according to an embodiment of the present application.
Detailed Description
the terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such as a list of steps or elements. A method, system, article, or apparatus is not necessarily limited to those steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such process, system, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
Users place ever higher requirements on the image quality of imaging devices. In the field of security surveillance, for example, the images acquired by monitoring equipment must both satisfy the subjective expectations of the human eye and support recognition by various machines; in mobile-phone photography, whether of portraits or landscapes, ever higher sharpness and color reproduction are expected. The quality of the image obtained by an imaging device depends on the sensitivity of the image sensor and on the lighting conditions of the scene being photographed. Moreover, more and more application scenarios rely on visible-light and infrared-light signals at the same time, such as night-time video surveillance, liveness detection and color/black-and-white dynamic fusion; imaging with visible light alone does not meet these requirements, and infrared assistance is needed. Existing RGBIR sensors, however, cannot separate the visible-light signal from the IR signal: the sensing result of the visible-light signal contains a certain amount of IR component and its color information is inaccurate. In addition, the exposure of the visible-light components and the IR component is controlled jointly, so the sensor is prone to exposure imbalance and has a poor dynamic range.
The embodiments of this application provide an RGBIR image sensor that senses visible light and IR light independently, strips the IR component from the sensing result of the visible-light signal, and improves the color accuracy of the sensor's sensing result.
Two exemplary 2X2 array ordered RGBIR sensors are shown in Figs. 2a and 2b, and two exemplary 4X4 array ordered RGBIR sensors are shown in Figs. 3a and 3b. Each cell in the figures represents a pixel: R denotes a red pixel, G a green pixel, B a blue pixel and IR an infrared pixel. A 2X2 array ordering means that the minimal repeating unit of the RGBIR arrangement is a 2X2 array containing all four components R, G, B and IR; a 4X4 array ordering means that the minimal repeating unit is a 4X4 array containing all components. It should be understood that other 2X2 and 4X4 array orderings are possible, and the embodiments of this application do not limit the arrangement of the RGBIR sensor. A minimal illustration of tiling such a repeating unit is given below.
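The sketch below is a Python illustration under the assumption of one possible 2X2 unit; the actual orderings are those shown in Figs. 2a, 2b, 3a and 3b:

```python
# Minimal sketch (assumption): build a CFA by tiling its minimal repeating unit.
# The 2x2 unit below is only one possible RGBIR ordering.

import numpy as np

unit_2x2 = np.array([["R", "G"],
                     ["IR", "B"]])

def tile_cfa(unit, rows, cols):
    """Repeat the minimal unit to cover a rows x cols pixel array."""
    reps_y = -(-rows // unit.shape[0])   # ceiling division
    reps_x = -(-cols // unit.shape[1])
    return np.tile(unit, (reps_y, reps_x))[:rows, :cols]

cfa = tile_cfa(unit_2x2, 8, 8)
# Every 2x2 block of 'cfa' aligned to the unit grid contains all four components R, G, B and IR
```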
Fig. 4 is a schematic structural diagram of an exemplary RGBIR sensor provided in an embodiment of the present application.
The RGBIR sensor includes a filter layer, microlenses 403, a pixel array 404 and a charge readout module 405. The filter layer includes the infrared cut-off filter layer 401, the red filter layer 402R, the green filter layer 402G, the blue filter layer 402B and the infrared light filter layer 402IR; the pixel array 404 includes red pixels R, green pixels G, blue pixels B and infrared pixels IR.
The infrared cut-off filter layer 401, which may also be referred to as an IR-cut, cuts off optical signals with wavelengths greater than a first preset wavelength, and these signals include infrared light. Illustratively, the first preset wavelength is 650 nm, so the infrared cut-off filter layer 401 cuts off optical signals with wavelengths greater than 650 nm, which include infrared signals. Typical visible-light wavelengths lie roughly between 430 nm and 650 nm, and the typical wavelengths of the infrared light sensed by the IR pixels lie roughly between 850 nm and 920 nm; the IR-cut therefore blocks signals above 650 nm, so that infrared light in the 850 nm to 920 nm range cannot enter the red, green and blue pixels. The light-sensing characteristic of a red pixel through the red filter layer 402R is shown by the thin black solid line R in Fig. 1: the red pixel has two sensitivity peaks, one near 650 nm (red light) and one near 850 nm (IR light). The characteristic of a green pixel through the green filter layer 402G is shown by the short dashed line G: it peaks near 550 nm (green light) and near 850 nm (IR light). The characteristic of a blue pixel through the blue filter layer 402B is shown by the dotted line B: it peaks near 450 nm (blue light) and near 850 nm (IR light). The characteristic of an IR pixel through the infrared light filter layer 402IR is shown by the long dashed line IR: it has a single sensitivity peak near 850 nm (or 910 nm). From this it follows that the red filter layer 402R transmits both red light and IR light in a first wavelength range, the green filter layer 402G transmits both green light and IR light in a second wavelength range, and the blue filter layer 402B transmits both blue light and IR light in a third wavelength range. The first, second and third wavelength ranges may be the same or different, and the wavelengths of the infrared light in all three ranges are greater than the first preset wavelength. The infrared light filter layer 402IR transmits only IR light in a specific wavelength range; this range may be 850 nm to 920 nm, or any specific wavelength within or near that range. For example, an IR pixel may mainly sense IR light at 850 nm or at 910 nm, and it may sense infrared light of any specific wavelength in or near the 850 nm to 920 nm range; this application does not limit the choice.
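To make the combined effect of the filter stack concrete, here is a minimal sketch with an idealized hard-edged pass/stop model; the band edges are illustrative values taken from the typical wavelengths above, not measured filter curves:

```python
# Minimal sketch (assumption): idealized pass/stop model of the filter stack.
# Wavelengths are in nm; real filters have gradual roll-offs, and the exact band
# edges below are illustrative.

def passes_ir_cut(wl):   # infrared cut-off layer 401: blocks everything above 650 nm
    return wl <= 650

def passes_red(wl):      # red filter layer 402R: red band plus an IR band
    return 620 <= wl <= 680 or 850 <= wl <= 920

def passes_ir(wl):       # infrared light filter layer 402IR: IR band only
    return 850 <= wl <= 920

def red_pixel_sees(wl):  # IR-cut stacked with the red filter layer
    return passes_ir_cut(wl) and passes_red(wl)

# The stack keeps the red response but strips the IR peak near 850 nm
assert red_pixel_sees(650) and not red_pixel_sees(850)
```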
The micro-lens 403 is a tiny convex lens device on each photosensitive pixel of the sensor, and is used for concentrating the input light into each photosensitive pixel.
The infrared cut-off filter layer 401 is coated on the microlenses corresponding to the red, green and blue pixels, so that light above 650 nm cannot enter those pixels.
The microlens corresponding to a red pixel is additionally coated with the red filter layer 402R, so that only red light near 650 nm enters the red pixel and the red pixel senses only red light.
The microlens corresponding to a green pixel is additionally coated with the green filter layer 402G, so that only green light near 550 nm enters the green pixel and the green pixel senses only green light.
The microlens corresponding to a blue pixel is additionally coated with the blue filter layer 402B, so that only blue light near 450 nm enters the blue pixel and the blue pixel senses only blue light.
The microlens corresponding to an infrared pixel is coated with the infrared light filter layer 402IR, so that only near-infrared light around 850 nm or 910 nm enters the infrared pixel and the infrared pixel senses only IR light.
In the embodiments of this application, the infrared cut-off filter layer coated on the microlenses corresponding to the red, green and blue pixels blocks the IR light that would otherwise reach those pixels, removing the IR component from the sensing result of the visible-light pixels, making the colors of the sensing result more accurate and improving the sensing performance of the sensor. Furthermore, because the infrared cut-off filter layer is applied to the microlenses by a coating process, no complex mechanical structure needs to be added, and the structure of the pixel under the microlens is unchanged; under the microlens there is only a photosensitive device such as a photodiode. This relatively simple and stable internal pixel structure helps control problems such as the influence of the chief ray angle (CRA) on imaging; with the filter layers coated on the microlenses, the sensing performance of the sensor is improved while the pixel structure remains stable.
The inside of a pixel is not a smooth wall; its inner wall carries a number of protrusions. If the incidence angle of the light deviates from the chief ray path of the microlens, part of the light is blocked by these protrusions and the sensing performance of the sensor decreases. The CRA of a pixel located at the optical center of the sensor is 0 degrees, and the CRA grows the further a pixel lies from the optical center. In general, if the offset distance of a pixel from the center of the image is taken as the abscissa and its CRA as the ordinate, the relation between offset distance and CRA is a linear function; this is referred to as the CRA behaving consistently. To make the sensor follow this rule, the position of each pixel's microlens is finely adjusted according to the pixel's position: the microlens of a pixel at the optical center sits directly above the pixel, the microlens of a pixel away from the optical center is not directly above it, and the further a pixel is from the optical center the larger the microlens shift. If the structure inside the pixel under the microlens is complex, the CRA easily becomes inconsistent and the approach of fine-tuning the microlens position on the pixel surface may no longer be applicable. The added filter layers of the sensor provided in the embodiments of this application are coated on the microlenses and do not change the internal structure of the pixel; the inside of the pixel remains simple and stable, so the sensing performance is improved without affecting the consistency of the sensor's CRA behaviour.
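A minimal numeric sketch of the linear CRA rule and the proportional microlens shift just described; the slope and maximum-shift values are illustrative assumptions, not sensor specifications:

```python
# Minimal sketch (assumption): CRA grows linearly with the pixel's offset from
# the optical center, and the microlens is shifted further for pixels further
# from the center. Slope and shift values are illustrative only.

def cra_degrees(offset_mm: float, slope_deg_per_mm: float = 5.0) -> float:
    """CRA is 0 at the optical center and increases linearly with offset."""
    return slope_deg_per_mm * offset_mm

def microlens_shift_um(offset_mm: float, max_offset_mm: float = 3.0,
                       max_shift_um: float = 1.2) -> float:
    """Shift the microlens proportionally to the pixel's distance from the center."""
    return max_shift_um * min(offset_mm / max_offset_mm, 1.0)

# A pixel 2 mm from the optical center: CRA of 10 degrees, microlens shifted 0.8 um
assert cra_degrees(2.0) == 10.0
assert abs(microlens_shift_um(2.0) - 0.8) < 1e-9
```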
Each pixel in the pixel array 404 includes a photosensitive device, for example a photodiode, used to convert an optical signal into an electrical signal or into electric charge.
The charge readout module 405 reads out the charge accumulated by the photosensitive devices and outputs it to a subsequent image processing circuit or image processor. The charge readout module acts like a buffer area: the charge accumulated by a photosensitive device is transferred into it and temporarily held, and the charge signal of the corresponding pixel is output under the control of a readout signal.
It should be understood that in the sensor shown in Fig. 4 the infrared cut-off filter layer is coated on top of the red, green and blue filter layers. In an alternative case the red, green and blue filter layers may each be coated on top of the infrared cut-off filter layer, as in the RGBIR sensor shown in Fig. 5; the other parts of the sensor in Fig. 5 are the same as in Fig. 4 and are not described again. The image sensor provided by the embodiments of this application does not limit the stacking order in which the infrared cut-off filter layer and the red, green and blue filter layers are coated on the microlenses.
In an alternative case, only the infrared cut-off filter layer is coated on the microlenses, with the red filter layer fabricated inside the red pixels, the green filter layer inside the green pixels, the blue filter layer inside the blue pixels and the infrared light filter layer inside the IR pixels.
In another alternative case, only the red, green, blue and infrared light filter layers are coated on the microlenses, and the infrared cut-off filter layer is fabricated inside the red, green and blue pixels.
Fig. 6 is a block diagram of another exemplary RGBIR sensor provided in an embodiment of the present application.
The sensor includes an optical filter 601, a filter layer, microlenses 604, a pixel array 605 and a charge readout module 606. The filter layer includes the infrared cut-off filter layer 602, the red filter layer 603R, the green filter layer 603G, the blue filter layer 603B and the infrared light filter layer 603IR; the pixel array 605 includes red pixels R, green pixels G, blue pixels B and infrared pixels IR.
The optical filter 601 is configured to filter out ultraviolet light and infrared light with a wavelength greater than a second preset wavelength, so that visible light and part of the infrared light pass through the optical filter. The second preset wavelength is greater than the first preset wavelength and greater than any wavelength in the specific wavelength range passed by the infrared light filter layer. In an optional case, infrared light with a wavelength greater than the second preset wavelength may be referred to as far infrared light; the wavelength of far infrared light is greater than the wavelength of the infrared light that the infrared light filter layer allows to pass. Illustratively, visible light has a wavelength of about 430 nm to 650 nm, and the IR pixels are sensitive to infrared light with a typical wavelength of about 850 nm to 920 nm. The second preset wavelength may be, for example, 900 nm, 920 nm, or any wavelength between 850 nm and 950 nm. Illustratively, the optical filter may be an all-pass filter or a dual-pass filter. One exemplary all-pass filter is configured to filter out ultraviolet light with a wavelength of less than 400 nm and infrared light with a wavelength of greater than 900 nm. One exemplary dual-pass filter is used to pass only visible light and infrared light in the range of 800 nm to 900 nm, in which case the dual-pass filter is equivalent to filtering out infrared light above 900 nm; in an alternative case, the dual-pass filter is used to pass only visible light and infrared light in the range of 900 nm to 950 nm, in which case it is equivalent to filtering out infrared light above 950 nm. It should be understood that the wavelength of the infrared light filtered out by the all-pass filter and the wavelength of the infrared light that the dual-pass filter allows to pass may be designed as required, and this is not limited in the embodiments of the present application. The optical filter 601 prevents long-wavelength far infrared light and ultraviolet light from affecting the photosensitive characteristics of the photosensitive devices.
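As an illustration only, the following Python sketch models the two example pass bands described in this paragraph; the 400 nm, 430 nm to 650 nm, 800 nm to 900 nm and 900 nm boundaries are the example values quoted above, not fixed parameters of the sensor, and the function names are invented here.

def all_pass_filter(wavelength_nm):
    # Example all-pass filter: blocks ultraviolet light below 400 nm and
    # far infrared light above 900 nm, and passes everything in between.
    return 400 <= wavelength_nm <= 900

def dual_pass_filter(wavelength_nm):
    # Example dual-pass filter: passes only visible light (about 430-650 nm)
    # and infrared light in the 800-900 nm band.
    return 430 <= wavelength_nm <= 650 or 800 <= wavelength_nm <= 900

# Far infrared light at 950 nm is rejected by both example filters,
# while 850 nm infrared light (sensed by the IR pixels) passes both.
print(all_pass_filter(950), dual_pass_filter(950))   # False False
print(all_pass_filter(850), dual_pass_filter(850))   # True True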
The filter layers are the same as those of the sensors shown in fig. 4 and fig. 5: the infrared light cut-off filter layer 602, the red filter layer 603R, the green filter layer 603G, the blue filter layer 603B and the infrared light filter layer 603IR are all coated on the micro lenses, and the upper and lower positional relationship between the infrared light cut-off filter layer 602 and the filter layers 603R, 603G and 603B of the visible light components is not limited.
Light passes through the optical filter, the filter layers and the micro lenses in sequence and then enters the pixel array. The wavelength of the infrared light filtered out by the optical filter is therefore greater than the wavelength of the infrared light that the infrared light filter layer among the filter layers allows to pass, and the infrared light cut-off filter layer filters out light with wavelengths longer than those of visible light, thereby preventing infrared light from entering the red pixels, the blue pixels and the green pixels.
For the micro lenses 604, the pixel array 605 and the charge readout module 606, please refer to the description of the corresponding parts of the embodiment of fig. 4, which is not repeated here.
Further, embodiments of the present application also provide a sensor capable of independently controlling the exposure times of the visible light pixels and the infrared light pixels, as shown in fig. 7a and fig. 7b, where fig. 7a is an exemplary control connection diagram of an RGBIR sensor with a 2X2 array arrangement and fig. 7b is an exemplary control connection diagram of an RGBIR sensor with a 4X4 array arrangement.
The sensor includes a pixel array 710 and a logic control circuit 720.
The pixel array 710 is a pixel array in a sensor as shown in any one of the embodiments of fig. 4-6, such as the pixel array 404, the pixel array 504, or the pixel array 605.
The logic control circuit 720 is configured to independently control the exposure times of the visible light pixels and the infrared light pixels, respectively, where the visible light pixels include the red pixels, the green pixels and the blue pixels. Specifically, the logic control circuit 720 includes a first control line and a second control line, or equivalently two independent control circuits: a first control circuit and a second control circuit. The visible light pixels R, G, B in the pixel array 710 are coupled to the first control line, and the IR pixels in the pixel array 710 are coupled to the second control line. It should be understood that the control lines having the same name in fig. 7a and fig. 7b are the same line or are connected to each other; for example, the first control line on the pixel array side and the first control line of the logic control circuit are the same line or are connected to each other, and the second control line on the pixel array side and the second control line of the logic control circuit are the same line or are connected to each other.
If y is the row index of a pixel in the pixel array and x is the column index of the pixel, with 0 ≤ x ≤ M-1 and 0 ≤ y ≤ N-1, where M is the total number of columns of the pixel array and N is the total number of rows of the pixel array, then for the RGBIR sensor with the 2X2 array arrangement shown in fig. 7a,
The coordinates of the visible light pixels satisfy the following condition: y%2 = 0, or y%2 = 1 and x%2 = 1,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 0.
For the RGBIR sensor with the 4X4 array arrangement shown in fig. 7b,
The coordinates of the visible light pixels satisfy the following condition: y%2 = 0, or y%2 = 1 and x%2 = 0,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 1.
here, "%" is a remainder operation, "y% 2 ═ 0" indicates that the remainder of y divided by 2 is 0, and "x% 2 ═ 1" indicates that the remainder of x divided by 2 is 1. Here, "%" is a remainder operation, "y% 2 ═ 0" indicates that the remainder of y divided by 2 is 0, and "x% 2 ═ 1" indicates that the remainder of x divided by 2 is 1.
Pixels at the visible light pixel positions are coupled to the first control line, and pixels at the IR pixel positions are coupled to the second control line. It should be understood that when the array arrangement of the RGBIR pixels is different, the coordinate conditions of the respective RGBIR pixels change accordingly, and therefore the connection between the logic control circuit and the pixel array needs to be designed according to the arrangement of the sensor.
For the RGBIR sensor with the 2X2 array arrangement shown in fig. 2b,
The coordinates of the visible light pixels satisfy the following condition: y%2 = 0, or y%2 = 1 and x%2 = 0,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 1.
For the RGBIR sensor with the 4X4 array arrangement shown in fig. 3b,
The coordinates of the visible light pixels satisfy the following condition: y%2 = 1, or y%2 = 0 and x%2 = 1,
The coordinates of the IR pixels satisfy the following condition: y%2 = 0 and x%2 = 0.
The coordinate conditions satisfied by the coordinates of each pixel of the sensors in other arrangements are not listed.
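Purely as an illustration of these coordinate tests (the function names are invented for this sketch and are not part of the embodiment), the control line that a pixel at column x and row y would be coupled to under the fig. 7a and fig. 7b conditions can be written as:

def control_line_fig7a(x, y):
    # Fig. 7a (2X2 arrangement): IR pixels sit at odd rows and even columns;
    # every other position holds a visible light pixel (R, G or B).
    if y % 2 == 1 and x % 2 == 0:
        return "second control line (IR pixel)"
    return "first control line (visible light pixel)"

def control_line_fig7b(x, y):
    # Fig. 7b (4X4 arrangement): IR pixels sit at odd rows and odd columns.
    if y % 2 == 1 and x % 2 == 1:
        return "second control line (IR pixel)"
    return "first control line (visible light pixel)"

# Example: the pixel at (x=0, y=1) is an IR pixel in fig. 7a
# but a visible light pixel in fig. 7b.
print(control_line_fig7a(0, 1))
print(control_line_fig7b(0, 1))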
The first control line outputs a first control signal and the second control line outputs a second control signal; the first control signal is used to control the exposure start time of the visible light pixels, and the second control signal is used to control the exposure start time of the infrared light pixels. The first control signal and the second control signal are independent of each other, so the exposure start times of the visible light pixels and the infrared light pixels may differ. Illustratively, the visible light pixels begin to be exposed when the first active transition edge of the first control signal arrives, and the infrared light pixels begin to be exposed when the second active transition edge of the second control signal arrives. The active transition edges of the first control signal and the second control signal may both be falling edges, may both be rising edges, or one may be a falling edge and the other a rising edge. Fig. 8 is a timing diagram of exemplary control signals, in which the active transition edges of the first control signal and the second control signal are both falling edges; in an alternative case, the first control signal and the second control signal may both be derived from a system reset signal of the logic control circuit. As shown in fig. 8, the first control signal and the second control signal are both active-high signals: when the falling edge of the first control signal arrives, the visible light pixels start to be exposed, and when the falling edge of the second control signal arrives, the infrared light pixels start to be exposed.
Optionally, the logic control circuit 720 further includes a reset signal; the reset signal may be a system clock signal, and the first control signal and the second control signal may both be derived from the reset signal. Illustratively, the logic control circuit 720 internally includes a logic operation circuit, which may contain logic operations such as AND, OR, NOT and XOR. The logic operation circuit has three inputs, namely the variable x, the variable y and the reset signal, and two outputs, namely the first control line and the second control line. If the variable x and the variable y satisfy the coordinate condition of the visible light pixels, the reset signal is connected to the output of the first control line; if the variable x and the variable y satisfy the coordinate condition of the infrared light pixels, the reset signal is connected to the output of the second control line.
Optionally, the logic control circuit 720 further includes:
an exposure end control line, used to uniformly control the exposure stop time of all the pixels in the pixel array.
The exposure end control line outputs an exposure end signal; the exposure end signal may be an active-high or an active-low signal, and the exposure end time point may be the falling edge of the high level or the rising edge of the low level. In the control signal timing diagram shown in fig. 8, the exposure end control signal is an active-high signal, and when its falling edge arrives, all the pixels in the pixel array stop exposing. That is, the exposure start times of the visible light pixels and of the IR pixels in the pixel array are independently controlled by the first control line and the second control line, respectively, while the exposure end time is uniformly controlled by the exposure end control line. As shown in fig. 8, the exposure time of the visible light pixels is the time difference between the falling edge of the first control signal and the falling edge of the exposure end control signal, namely the first exposure time, and the exposure time of the IR pixels is the time difference between the falling edge of the second control signal and the falling edge of the exposure end control signal, namely the second exposure time. In this way, the exposure times of the visible light pixels and the IR pixels are independently controlled.
In a low-illumination scene, if the RGBIR sensor is used for exposure directly, the signal-to-noise ratio of the photosensitive result is low because little visible light energy is available. If an IR lamp is used for supplementary lighting, the IR light is stronger than the visible light; with the same exposure time, the visible light energy that can be captured is much less than the IR light energy, and if the amount of visible light information is forcibly increased, the IR light is over-exposed, and an image with unbalanced exposure loses a large amount of effective information. If the visible light and the IR light are exposed separately, the exposure time of the IR light can be reduced and the exposure time of the visible light relatively extended, which effectively increases the detail information contained in the photosensitive result of the visible light signal.
In an exemplary embodiment, the exposure times of the visible light pixels and the infrared light pixels can be made to satisfy a preset ratio by controlling the arrival times of the first active transition edge of the first control signal and the second active transition edge of the second control signal. For example, when the ratio of the exposure time of the visible light signal to the exposure time of the infrared light signal is 2:1, the exposure result has better definition and a higher signal-to-noise ratio; in this case the control signal of the visible light signal transitions first, the control signal of the infrared light signal transitions later, and the time difference between the two transition points is set so that the exposure times of the visible light signal and the infrared light signal satisfy the preset ratio.
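As a minimal numerical sketch of this idea (assuming, purely for illustration, falling-edge transitions at the computed instants and a common exposure end time; the variable names are invented here), the transition instants of the two control signals can be derived from the desired ratio as follows:

def start_times(t_end, ir_exposure, ratio_visible_to_ir=2.0):
    # The exposure of both pixel groups ends at the common falling edge of
    # the exposure end control signal at time t_end. Placing the falling
    # edges of the first and second control signals at the instants below
    # makes the visible exposure time equal to ratio_visible_to_ir times
    # the IR exposure time.
    t_start_ir = t_end - ir_exposure
    t_start_visible = t_end - ratio_visible_to_ir * ir_exposure
    return t_start_visible, t_start_ir

# Example: exposure ends at t = 10 ms and the IR pixels are exposed for 2 ms,
# so the visible light pixels must start 4 ms before the end (ratio 2:1).
print(start_times(10.0, 2.0))   # (6.0, 8.0)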
In the image sensor provided by the embodiments of the present application, the exposure times of the visible light pixels and the IR pixels are controlled independently. For example, when the infrared light is too strong and the visible light is too weak, the exposure time of the visible light can be increased and the exposure time of the infrared light reduced, so that the exposures of visible light and infrared light tend toward balance. This avoids the exposure imbalance that easily occurs when either the infrared light or the visible light dominates, increases the dynamic range of the sensor's photosensitivity, and meets user requirements on indexes such as definition and signal-to-noise ratio. Furthermore, by precisely setting the ratio between the exposure times of the visible light signal and the infrared light signal, the photosensitive effect of the sensor can be controlled even more finely.
Optionally, the logic control circuit 720 further includes:
a charge transfer control line, used to control the time point at which the charges accumulated by the photosensitive devices of the pixel array are transferred to the charge readout module. The charge transfer control line outputs a charge transfer control signal, which may be an active-high or an active-low signal. In the control signal timing diagram shown in fig. 8, the charge transfer control signal is an active-high signal; when its falling edge arrives, the accumulated charges are transferred from the photosensitive devices to the charge readout module. In an alternative case, the exposure end control signal is reset again after the charge transfer control signal is reset.
It should be understood that the functions of the logic control circuit may also be implemented by software code running on a processor, or partly by hardware circuits and partly by software modules. Illustratively, the sensor may comprise a pixel array and a control unit, where the control unit is a software module running on the processor. The control unit comprises a first control unit and a second control unit for independently controlling the exposure start times of the visible light pixels and the IR pixels, respectively; the control unit further includes an exposure end control unit for uniformly controlling the exposure end time of each pixel in the pixel array. The control unit further comprises a charge transfer control unit and a reset unit, where the reset unit is used to provide the reset signal and the function of the charge transfer control unit is similar to that of the charge transfer control line, which is not repeated here.
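A possible software realization of such a control unit, given here only as a hedged sketch under the assumption that it simply records control events (the class and method names are invented for illustration and are not defined by this embodiment), could look as follows:

class ExposureControlUnit:
    # Illustrative control unit running as a software module: it issues the
    # exposure start events for the two pixel groups, a common exposure end
    # event and the charge transfer event.
    def __init__(self):
        self.events = []   # recorded (time, event) pairs standing in for hardware signals

    def start_visible(self, t):
        self.events.append((t, "first control: start exposing R, G and B pixels"))

    def start_ir(self, t):
        self.events.append((t, "second control: start exposing IR pixels"))

    def end_exposure(self, t):
        self.events.append((t, "exposure end: stop all pixels together"))

    def transfer_charge(self, t):
        self.events.append((t, "charge transfer: move charges to the readout module"))

# Example sequence following the fig. 8 timing: the visible light pixels start
# first, the IR pixels start later, all pixels stop together and the charges
# are then transferred to the charge readout module.
unit = ExposureControlUnit()
unit.start_visible(6.0)
unit.start_ir(8.0)
unit.end_exposure(10.0)
unit.transfer_charge(10.5)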
Further, embodiments of the present application also provide a sensor capable of independently controlling the exposure times of the four RGBIR components, as shown in fig. 9a and fig. 9b, where fig. 9a is an exemplary control connection diagram of an RGBIR sensor with a 2X2 array arrangement and fig. 9b is an exemplary control connection diagram of an RGBIR sensor with a 4X4 array arrangement.
the sensor includes a pixel array 910 and a logic control circuit 920.
The pixel array 910 is a pixel array in a sensor as shown in any one of the embodiments of fig. 4 to 6, such as the pixel array 404, the pixel array 504 or the pixel array 605.
The logic control circuit 920 is configured to independently control the exposure times of the red pixels, the green pixels, the blue pixels and the infrared light pixels. Specifically, the logic control circuit 920 includes a first control line, a second control line, a third control line and a fourth control line, or equivalently four independent control circuits: a first control circuit, a second control circuit, a third control circuit and a fourth control circuit. The red pixels in the pixel array are coupled to the first control line, the green pixels to the second control line, the blue pixels to the third control line and the infrared light pixels to the fourth control line. It should be understood that the control lines having the same name in fig. 9a and fig. 9b are the same line or are connected to each other; for example, the first control line on the pixel array side and the first control line of the logic control circuit are the same line, the fourth control line on the pixel array side and the fourth control line of the logic control circuit are the same line, and so on.
If y is the row index of a pixel in the pixel array and x is the column index of the pixel, with 0 ≤ x ≤ M-1 and 0 ≤ y ≤ N-1, where M is the total number of columns of the pixel array and N is the total number of rows of the pixel array, then for the RGBIR sensor with the 2X2 array arrangement shown in fig. 9a,
The coordinates of the red pixels satisfy the following condition: y%2 = 0 and x%2 = 0,
The coordinates of the green pixels satisfy the following condition: y%2 = 0 and x%2 = 1,
The coordinates of the blue pixels satisfy the following condition: y%2 = 1 and x%2 = 1,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 0.
For the RGBIR sensor with the 4X4 array arrangement shown in fig. 9b,
The coordinates of the red pixels satisfy the following condition: y%4 = 0 and x%4 = 0, or y%4 = 2 and x%4 = 2;
The coordinates of the green pixels satisfy the following condition: y%2 != x%2,
The coordinates of the blue pixels satisfy the following condition: y%4 = 0 and x%4 = 2, or y%4 = 2 and x%4 = 0,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 1.
Here "%" is the remainder operation: "y%4 = 0" indicates that the remainder of y divided by 4 is 0, "x%4 = 2" indicates that the remainder of x divided by 4 is 2, and "!=" means not equal.
The coordinates of the pixels coupled to the first control line satisfy the coordinate condition of the red pixels, the coordinates of the pixels coupled to the second control line satisfy the coordinate condition of the green pixels, the coordinates of the pixels coupled to the third control line satisfy the coordinate condition of the blue pixels, and the coordinates of the pixels coupled to the fourth control line satisfy the coordinate condition of the infrared light pixels. It should be understood that when the array arrangement of the RGBIR pixels is different, the coordinate conditions of the respective RGBIR pixels change accordingly, and therefore the connection between the logic control circuit and the pixel array needs to be designed according to the arrangement of the sensor. For the RGBIR sensor with the 2X2 array arrangement shown in fig. 2b, the coordinates of the red pixels satisfy the following condition:
y%2 = 1 and x%2 = 0,
The coordinates of the green pixels satisfy the following condition: y%2 = 0 and x%2 = 0,
The coordinates of the blue pixels satisfy the following condition: y%2 = 0 and x%2 = 1,
The coordinates of the IR pixels satisfy the following condition: y%2 = 1 and x%2 = 1.
For the RGBIR sensor with the 4X4 array arrangement shown in fig. 3b, the coordinates of the red pixels satisfy the following condition: y%4 = 1 and x%4 = 3, or y%4 = 3 and x%4 = 1;
The coordinates of the green pixels satisfy the following condition:
y%2 != x%2,
The coordinates of the blue pixels satisfy the following condition:
y%4 = 1 and x%4 = 1, or y%4 = 3 and x%4 = 3,
The coordinates of the IR pixels satisfy the following condition: y%2 = 0 and x%2 = 0.
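Only as an illustration of the fig. 9b conditions listed above (the function name is invented for this sketch and is not part of the embodiment), the component of a pixel at column x and row y in that 4X4 arrangement, and hence the control line it is coupled to, can be derived as follows:

def component_fig9b(x, y):
    # Each branch mirrors one of the coordinate conditions given above for
    # the fig. 9b arrangement.
    if y % 2 != x % 2:
        return "G"   # green pixels, coupled to the second control line
    if y % 2 == 1 and x % 2 == 1:
        return "IR"  # infrared light pixels, coupled to the fourth control line
    if (y % 4, x % 4) in ((0, 0), (2, 2)):
        return "R"   # red pixels, coupled to the first control line
    return "B"       # blue pixels, coupled to the third control line

# Print the resulting 4X4 pattern row by row.
for y in range(4):
    print(" ".join(component_fig9b(x, y) for x in range(4)))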
The first control line outputs a first control signal, the second control line outputs a second control signal, the third control line outputs a third control signal and the fourth control line outputs a fourth control signal; the first control signal is used to control the exposure start time of the red pixels, the second control signal that of the green pixels, the third control signal that of the blue pixels, and the fourth control signal that of the infrared light pixels. The first to fourth control signals are independent of each other, so the exposure start times of the four RGBIR components may differ. Illustratively, the red pixels begin to be exposed when the first active transition edge of the first control signal arrives, the green pixels when the second active transition edge of the second control signal arrives, the blue pixels when the third active transition edge of the third control signal arrives, and the IR pixels when the fourth active transition edge of the fourth control signal arrives. The first to fourth control signals may all be active-high signals, and their active transition edges may all be falling edges, may all be rising edges, or some may be falling edges while the others are rising edges. Fig. 10 is a timing diagram of exemplary control signals, in which the first to fourth control signals are all active-high and their active transition edges are all falling edges. As shown in fig. 10, the red pixels start to be exposed when the falling edge of the first control signal arrives, the green pixels when the falling edge of the second control signal arrives, the blue pixels when the falling edge of the third control signal arrives, and the IR pixels when the falling edge of the fourth control signal arrives.
Optionally, the logic control circuit 920 further includes a reset signal; the reset signal may be a system clock signal, and the first to fourth control signals may all be derived from the reset signal. Illustratively, the logic control circuit 920 internally includes a logic operation circuit, which may contain logic operations such as AND, OR, NOT and XOR. The logic operation circuit has three inputs, namely the variable x, the variable y and the reset signal, and four outputs, namely the first to fourth control lines. If the variable x and the variable y satisfy the coordinate condition of the red pixels, the reset signal is connected to the output of the first control line; if they satisfy the coordinate condition of the green pixels, the reset signal is connected to the output of the second control line; if they satisfy the coordinate condition of the blue pixels, the reset signal is connected to the output of the third control line; and if they satisfy the coordinate condition of the infrared light pixels, the reset signal is connected to the output of the fourth control line. It should be understood that when the array arrangement of the RGBIR pixels is different, the coordinate conditions of the respective RGBIR pixels change accordingly, and therefore the logic operation circuit inside the logic control circuit needs to be adjusted according to the arrangement of the pixel array.
Optionally, the logic control circuit 920 further includes:
an exposure end control line, used to uniformly control the exposure stop time of all the pixels in the pixel array.
The exposure end control line outputs an exposure end signal; the exposure end signal may be an active-high or an active-low signal, and the exposure end time point may be the falling edge of the high level or the rising edge of the low level. In the control signal timing diagram shown in fig. 10, the exposure end control signal is an active-high signal, and when its falling edge arrives, all the pixels in the pixel array stop exposing. That is, the exposure start times of the four RGBIR pixel components in the pixel array are independently controlled by the first to fourth control lines, while the exposure end time is controlled by the exposure end control line. For example, in the control signal timing diagram shown in fig. 10, the exposure time of the R pixels is the time difference between the falling edge of the first control signal and the falling edge of the exposure end control signal, namely the first exposure time, and the exposure times of the G pixels, the B pixels and the IR pixels are the second, third and fourth exposure times, respectively. In this way, the exposure times of the four RGBIR components are independently controlled.
In an exemplary embodiment, the exposure times of the four RGBIR components can be made to satisfy a preset ratio by controlling the arrival times of the first to fourth active transition edges of the first to fourth control signals.
Optionally, the logic control circuit 920 further includes: a charge transfer control line, used to control when the charges accumulated by the photosensitive devices of the pixel array are transferred to the charge readout module. The charge transfer control line outputs a charge transfer control signal, which may be an active-high or an active-low signal. The charge transfer control signal shown in fig. 10 is the same as that shown in fig. 8.
It should be understood that the functions of the logic control circuit may also be implemented by software code running on a processor, or partly by hardware circuits and partly by software modules. Illustratively, the sensor may comprise a pixel array and a control unit, where the control unit is a software module running on the processor. The control unit comprises a first control unit, a second control unit, a third control unit and a fourth control unit for independently controlling the exposure start times of the R, G, B and IR components, respectively; the control unit further includes an exposure end control unit for uniformly controlling the exposure end times of the four components. The control unit further comprises a charge transfer control unit and a reset unit, where the reset unit is used to provide a reset signal and the function of the charge transfer control unit is similar to that of the charge transfer control line, which is not repeated here.
In the image sensor provided by the embodiments of the present application, the exposure times of the R, G, B and IR components are controlled independently of one another, which further increases the dynamic range of the sensor's photosensitivity. In scenes where a strong photosensitive result is required for the R and G components while the B and IR results are to be reduced, the exposure times of the four components can be controlled flexibly to enhance the photosensitive effect of the R and G components and weaken that of the B and IR components, so that the final photosensitive result better matches the scene requirements or the definition and signal-to-noise ratio required by customers. Further, the exposure times of the R, G, B and IR components can be preset to satisfy a preset ratio in order to achieve fine control of the sensor's photosensitive effect.
Further, the embodiments of the present application also provide a sensor in which the exposure time of each pixel can be independently controlled; as shown in fig. 11, the sensor includes a pixel array 1110 and a logic control circuit 1120.
the pixel array 1110 is a pixel array in a sensor as shown in any one of the embodiments of fig. 4 to 6, such as the pixel array 404, the pixel array 504 or the pixel array 605.
The logic control circuit 1120 includes a row coordinate control circuit and a column coordinate control circuit, or row coordinate control line and column coordinate control line, with each pixel in the pixel array coupled to a respective row coordinate control line and column coordinate control line.
The logic control circuit 1120 further comprises an exposure start control line, which outputs a reset signal to a target pixel and controls the exposure start time of the target pixel based on the reset signal when the row coordinate control signal output by the row coordinate line of the target pixel and the column coordinate control signal output by its column coordinate line are both active signals. Illustratively, the exposure start control line has a plurality of branches and each pixel is coupled to one branch; when the row coordinate control signal and the column coordinate control signal of the target pixel both meet the requirement, the branch corresponding to the target pixel outputs an active control signal. The column coordinate control line and the row coordinate control line are equivalent to a switch whose input is the reset signal and whose output is the exposure start control line: when the signals on the column coordinate control line and the row coordinate control line are both active, the switch is turned on and the reset signal can be output to the target pixel through the exposure start control line to control the exposure of the target pixel. Illustratively, when the signals on the column coordinate control line and the row coordinate control line are both active and an active transition edge of the reset signal arrives, the target pixel is controlled to start exposure. If either of the signals on the column coordinate control line and the row coordinate control line does not meet the requirement, the switch is turned off and the exposure start control line outputs no control signal. Because each pixel in the pixel array has its own row coordinate line and column coordinate line, the exposure time of each pixel can be controlled independently; for example, the signals on the row coordinate line and the column coordinate line of a pixel that requires emphasized exposure can be set to active signals earlier, so that the exposure time of that pixel is extended.
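As a purely illustrative sketch of this gating behaviour (the function name and the way signals are represented are invented here and are not part of the embodiment), the per-pixel exposure start can be modelled as follows:

def exposure_start(row_ctrl_active, col_ctrl_active, reset_edge_time):
    # The row and column coordinate control lines act as a switch on the
    # reset signal: only when both are active does the active transition
    # edge of the reset signal reach the pixel and start its exposure.
    if row_ctrl_active and col_ctrl_active:
        return reset_edge_time   # the pixel starts exposing at this instant
    return None                  # switch is off: no exposure start signal

# Example echoing fig. 12: two pixels are gated at different times but stop
# together at the common exposure end, so they accumulate different exposure times.
t_end = 10.0
t1 = exposure_start(True, True, 4.0)   # first pixel starts at t = 4.0
t2 = exposure_start(True, True, 7.0)   # second pixel starts at t = 7.0
print(t_end - t1, t_end - t2)          # first and second exposure times: 6.0 3.0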
optionally, the logic control circuit 1120 further includes: the exposure end control line is used to uniformly control the exposure end time of each pixel in the pixel array, and specifically, reference may be made to the exposure end control lines of the logic control circuit 720 and the logic control circuit 920, which is not described herein again.
Optionally, the logic control circuit 1120 further includes: a charge transfer control line, used to control when the charges accumulated in the photosensitive devices are transferred to the charge readout module; reference may be made to the charge transfer control lines of the logic control circuit 720 and the logic control circuit 920, which are not described again here.
It should be understood that the functions of the logic control circuit may also be implemented by software code running on a processor, or partly by hardware circuits and partly by software modules. Illustratively, the sensor may include a pixel array and a control unit, where the control unit is a software module running on a processor and includes a row control unit, a column control unit and an exposure start control unit. The row control unit and the column control unit are used to indicate the row coordinate and the column coordinate of a pixel, respectively, and the exposure start control unit is used to output an active control signal to control the exposure start time of a target pixel when the row control unit and the column control unit of the target pixel both meet the requirement.
The sensor provided by the embodiments of the present application can control the exposure start time of each pixel according to the state of the control signals on that pixel's row coordinate control line and column coordinate control line, while the exposure end time is uniformly controlled by the exposure end control line, so the exposure time of each pixel can be different. Further, by setting the time at which the row coordinate control signal and the column coordinate control signal of a pixel both become active signals, the exposure time of each pixel can be made to satisfy a preset ratio. In scenes where the pixels of a target area need to be enhanced, the exposure time of only the pixels in that target area can be increased, which further improves the flexibility of the sensor's photosensitivity and better meets user requirements on the photosensitive result.
Fig. 12 shows an exemplary timing diagram of the control signals, using two pixels as an example to illustrate how the exposure start control signal controls the exposure start time of a pixel. The signals in the timing diagram are all active-high; it should be understood that each control signal may also be active-low.
The first pixel is coupled to a first row coordinate control line and a first column coordinate control line, whose signals are row coordinate control signal 1 and column coordinate control signal 1; the second pixel is coupled to a second row coordinate control line and a second column coordinate control line, whose signals are row coordinate control signal 2 and column coordinate control signal 2. When row coordinate control signal 1 and column coordinate control signal 1 of the first pixel are both at a high level, the exposure start control signal of the first pixel is active; specifically, the reset signal is used as the exposure start control signal, and when its falling edge arrives the first pixel is controlled to start exposure. When row coordinate control signal 2 and column coordinate control signal 2 of the second pixel are both at a high level, the exposure start control signal of the second pixel is active; again the reset signal is used as the exposure start control signal, and when its falling edge arrives the second pixel is controlled to start exposure. When the falling edge of the exposure end control signal arrives, the first pixel and the second pixel both stop exposing. The exposure time of the first pixel is thus the first exposure time, and the exposure time of the second pixel is the second exposure time. It should be understood that the exposure start control signal of the first pixel and that of the second pixel may be two different branches of the same signal: when the coordinate control signals of the first pixel meet the requirement, the branch corresponding to the first pixel outputs an active control signal, and when the coordinate control signals of the second pixel meet the requirement, the branch corresponding to the second pixel outputs an active control signal.
Fig. 13 is a graph of the photosensitive characteristics of each pixel of the RGBIR sensor provided in the embodiments of the present application. The abscissa is the wavelength of light in nm and the ordinate is the photosensitive intensity. The thin solid line is the photosensitive characteristic curve of the R pixel, the short dashed line is that of the G pixel, the dotted line is that of the B pixel and the long dashed line is that of the IR pixel. As can be seen from fig. 13, the R pixel has a photosensitive intensity peak only near 650 nm (red light), the G pixel only near 550 nm (green light), the B pixel only near 450 nm (blue light) and the IR pixel only near 850 nm (910 nm in some cases) in the infrared. Compared with the existing RGBIR sensor, the RGBIR sensor provided in the embodiments of the present application removes the IR component from the photosensitive results of the R, G and B pixels, so that the R pixel senses only red light, the G pixel only green light and the B pixel only blue light, which improves the color accuracy of the sensor's photosensitive result.
The sensor provided by the embodiments of the present application can be used in security monitoring cameras for residential communities, intelligent transportation electronic-eye devices, video cameras, mobile phones and other terminal devices with imaging, photographing or video recording functions.
Fig. 14a is a schematic diagram of an exemplary RGBW (red green blue white) sensor with a 2X2 array arrangement, and fig. 14b is a schematic diagram of an exemplary RCCB (red clear clear blue) sensor with a 2X2 array arrangement. Each cell in the figures represents a pixel. In the RGBW sensor, R represents a red pixel, G represents a green pixel, B represents a blue pixel and W represents a white pixel. In the RCCB sensor, R represents a red pixel, B represents a blue pixel and C represents a clear pixel; the RCCB sensor replaces the G pixels of an RGB sensor with C pixels. The wavelength range that a C pixel allows to pass is between 400 nm and 657 nm, so the clear pixel passes a large amount of light, and the C pixel is compatible with current demosaic algorithms.
The embodiments of the present application also provide an RGBW sensor in which the exposure times of the visible light pixels and the W pixels can be controlled independently. The exposure control of the RGBW sensor may be implemented by a logic control circuit or a control unit, where the control unit may be a software module running on a processor. The control logic for independent exposure of the RGBW sensor is similar to that for independent exposure of the RGBIR sensor, with the IR pixels replaced by W pixels. For the logic control circuit of the RGBW sensor with independent exposure of the visible light pixels and the W pixels, reference may be made to the logic control circuit of the RGBIR sensor shown in fig. 7a, which is not described again here.
Correspondingly, the embodiments of the present application also provide an RGBW sensor capable of independently controlling the exposure times of the four RGBW components. The exposure control may be implemented by a logic control circuit or a control unit, where the control unit may be a software module running on a processor. The control logic for independent exposure of the four RGBW components is similar to that for independent exposure of the four RGBIR components; for the logic control circuit of the RGBW sensor with four independently exposed components, reference may be made to the logic control circuit of the RGBIR sensor shown in fig. 9a, which is not described again here.
The embodiments of the present application also provide an RCCB sensor in which the exposure times of the visible light pixels and the C pixels can be controlled independently. The exposure control of the RCCB sensor may be performed by a logic control circuit or a control unit; in this embodiment of the application, the logic control circuit of the RCCB sensor refers to the logic control circuit 720, where the first control line is used to control the R pixels and the B pixels, and the second control line is used to control the two C pixels.
Correspondingly, the embodiments of the present application further provide an RCCB sensor in which the exposure times of the three R, B and C components are independently controllable. The exposure control of the RCCB sensor can be performed by a logic control circuit or a control unit; the control unit is taken as an example here. As shown in fig. 15, which shows the control unit for independent exposure of an exemplary RCCB sensor provided in this embodiment of the present application, the control unit 1500 includes a first control unit 1510, a second control unit 1520 and a third control unit 1530, where the first control unit 1510 is configured to control the exposure start time of the R pixels, the second control unit 1520 is configured to control the exposure start time of the B pixels, and the third control unit 1530 is configured to control the exposure start time of the C pixels. The control unit 1500 further includes an exposure end control unit 1540 for uniformly controlling the exposure end time of all the pixels in the pixel array. The control unit 1500 may further include a charge transfer control unit 1550 and a reset unit 1560; for their functions, reference is made to the descriptions of the charge transfer control line and the reset signal for the logic control circuits 720 and 920, which are not repeated here. The control unit may illustratively be a software module running on a general-purpose processor or a dedicated processor.
The embodiments of the present application provide an independent exposure apparatus for controlling the exposure time of the pixel array of a sensor. The apparatus comprises at least two control units, each of which is used to control the exposure start time of one type of pixel in the pixel array of the sensor, where the pixel array of the sensor controlled by the apparatus comprises at least two types of pixels.
It should be understood that the independent exposure apparatus may be regarded as a control apparatus independent of the sensor, for example a general-purpose processor or a dedicated processor, or as independently solidified hardware logic or a hardware circuit. For example, the independent exposure apparatus can be regarded as the logic control circuits in fig. 7a and 7b, fig. 9a and 9b and fig. 11, or the control unit in fig. 15.
fig. 16 is a schematic diagram of a hardware architecture of an independent exposure apparatus according to an embodiment of the present application. It should be understood that the logic control circuits in fig. 7a, 7b, 9a, 9b and 11 can be implemented by software modules running on the independent exposure apparatus as shown in fig. 16. The control unit shown in fig. 15 may also be implemented by a software module running on the exposure control apparatus shown in fig. 16.
The exposure control apparatus 1600 includes: at least one central processing unit (CPU), at least one memory, a microcontroller (MCU), a receiving interface, a transmitting interface and the like. Optionally, the exposure control apparatus 1600 further includes a dedicated video or graphics processor, a graphics processing unit (GPU) and the like.
Alternatively, the CPU may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor; alternatively, the CPU may be a processor group including a plurality of processors coupled to each other via one or more buses. In an alternative case, the exposure control may be performed partly by software code running on a general-purpose CPU or MCU and partly by hardware logic, or it may be implemented entirely by software code running on a general-purpose CPU or MCU. Optionally, the memory may be a nonvolatile memory, such as an embedded multimedia card (EMMC), a universal flash storage (UFS) or a read-only memory (ROM), or another type of static storage device capable of storing static information and instructions; or a volatile memory, such as a random access memory (RAM), or another type of dynamic storage device capable of storing information and instructions; or an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, blu-ray discs, digital versatile discs and the like), a magnetic disk storage medium or other magnetic storage device, or any other computer-readable storage medium that can be used to carry or store program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The receiving interface may be a data input interface of the processor chip.
In one possible embodiment, the independent exposure apparatus further comprises: a pixel array. In this case, the independent exposure apparatus includes at least two types of pixels; that is, the independent exposure apparatus may be a sensor including a control unit or a logic control circuit, or in other words a sensor whose exposure can be independently controlled. Illustratively, the independent exposure apparatus can be an RGBIR sensor, an RGBW sensor, an RCCB sensor or the like with independently controlled exposure.
It should be understood that in an alternative case the visible light pixels are grouped into one type of pixel, i.e. the R pixels, G pixels and B pixels together form one type, while the IR pixels, W pixels or C pixels are considered another type. For example, the RGBIR sensor includes two types of pixels: visible light pixels and IR pixels; the RGBW sensor includes two types of pixels: visible light pixels and W pixels; and the RCCB sensor includes two types of pixels: visible light pixels and C pixels.
In another alternative case, each pixel component is considered one type of pixel; for example, the RGBIR sensor includes the types R, G, B and IR, the RGBW sensor includes the types R, G, B and W, and the RCCB sensor includes the types R, B and C.
In one possible embodiment, the sensor is an RGBIR sensor, which can achieve independent exposure of the visible light pixels and the IR pixels, or independent exposure of each of the four R, G, B and IR components.
for an RGBIR sensor in which visible light pixels and IR pixels are exposed independently, the at least two control units include: a first control unit and a second control unit; the first control unit is used for controlling the exposure starting time of the visible light pixel; the second control unit is used for controlling the exposure start time of the IR pixels.
For an R, G, B, IR four component RGBIR sensor with independent exposures, at least two control units include: a first control unit, a second control unit, a third control unit and a fourth control unit; the first control unit is used for controlling the exposure starting time of the R pixel; the second control unit is used for controlling the exposure starting time of the G pixel; the third control unit is used for controlling the exposure starting time of the B pixel; the fourth control unit is used for controlling the exposure start time of the IR pixels.
In one possible embodiment, the sensor is an RGBW sensor, and the RGBW sensor can realize independent exposure of the visible light pixel and the W pixel respectively, and can also realize R, G, B, W four-component independent exposure respectively.
For an RGBW sensor in which a visible light pixel and a W pixel are independently exposed, respectively, the at least two control units include: a first control unit and a second control unit; the first control unit is used for controlling the exposure starting time of the visible light pixel; the second control unit is used for controlling the exposure starting time of the W pixel.
For an R, G, B, W four component RGBW sensor with independent exposures, the at least two control units include: a first control unit, a second control unit, a third control unit and a fourth control unit; the first control unit is used for controlling the exposure starting time of the R pixel; the second control unit is used for controlling the exposure starting time of the G pixel; the third control unit is used for controlling the exposure starting time of the B pixel; the fourth control unit is used for controlling the exposure starting time of the W pixel.
In one possible implementation, the sensor is an RCCB sensor, and the RCCB sensor can realize independent exposure of visible light pixels and C pixels respectively, and can also realize R, B, C independent exposure of three components respectively.
For an RCCB sensor in which a visible light pixel and a C pixel are independently exposed, respectively, the at least two control units include: a first control unit and a second control unit; the first control unit is used for controlling the exposure starting time of the visible light pixel; the second control unit is used for controlling the exposure starting time of the C pixel.
For an R, B, C RCCB sensor with three components each exposed independently, at least two control units include: a first control unit, a second control unit and a third control unit; the first control unit is used for controlling the exposure starting time of the R pixel; the second control unit is used for controlling the exposure starting time of the B pixel; the third control unit is used for controlling the exposure starting time of the C pixel.
In a possible embodiment, the independent exposure apparatus may further control the exposure times of the at least two types of pixels to satisfy a preset ratio based on the at least two control units. Illustratively, the exposure times of the visible light pixels and the IR pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, G, B and IR pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit, the third control unit and the fourth control unit; or the exposure times of the visible light pixels and the W pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, G, B and W pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit, the third control unit and the fourth control unit; or the exposure times of the visible light pixels and the C pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, B and C pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit and the third control unit.
In one possible embodiment, the independent exposure apparatus further comprises: an exposure end control unit, used to uniformly control the exposure end time of all the pixels in the pixel array.
The present application also provides an image sensitization method. Fig. 17 is a schematic flow chart of an exemplary image sensitization method provided in an embodiment of the present application. The method is applied to an image sensor comprising: an infrared light cut-off filter layer, micro lenses and a pixel array, where the pixel array comprises red pixels, green pixels, blue pixels and infrared light pixels, and one micro lens is located above each pixel in the pixel array; the infrared light cut-off filter layer is coated on the micro lenses corresponding to the red pixels, the green pixels and the blue pixels, respectively. The method comprises the following steps:
1701. Original light from the natural scene passes through an optical filter to obtain first light;
The optical filter is used to filter out ultraviolet light and far infrared light, where far infrared light is infrared light with a longer wavelength; for example, the infrared light with a wavelength greater than the second preset wavelength mentioned in the foregoing embodiments may be referred to as far infrared light. The wavelength of the far infrared light is greater than the wavelengths in the specific wavelength range that the subsequent infrared light filter layer allows to pass. For the optical filter, reference may be made to the description of the optical filter on the device side, which is not repeated here.
1702. The first light reaches the infrared light pixel through the infrared light filter layer and the micro lens;
1703. The first light reaches the red pixel through the infrared light cut-off filter layer, the red filter layer and the micro lens;
1704. the first light reaches the green pixel through the infrared light cut-off filter layer, the green filter layer and the micro lens;
1705. The first light reaches the blue pixel through the infrared light cut-off filter layer, the blue filter layer and the micro lens;
It should be understood that the step numbers 1702 to 1705 do not limit the execution order of the method; in general, steps 1702 to 1705 are executed simultaneously, or they may not be strictly simultaneous but have some time difference, which is not limited in the embodiments of the present application.
The infrared light cut off by the infrared light cut-off filter layer includes: infrared light in the first wavelength range, infrared light in the second wavelength range and infrared light in the third wavelength range.
Since the infrared light passed by the red filter layer, the green filter layer and the blue filter layer lies within the wavelength range of the infrared light cut off by the infrared light cut-off filter layer, the infrared light cut-off filter layer cuts off the infrared light entering the R pixels, the G pixels and the B pixels, so that the R pixels, the G pixels and the B pixels can only sense red light, green light and blue light, respectively.
It should be appreciated that step 1701 is optional; the original light may also enter the filter layers and the micro lenses directly without passing through the optical filter. The infrared light cut-off filter layer may be on the red filter layer, the green filter layer and the blue filter layer, or the red filter layer, the green filter layer and the blue filter layer may be on the infrared light cut-off filter layer; this is not limited in the embodiments of the present application.
1706. The photosensitive device in the pixel converts the light entering the pixel into electric charge;
1707. The accumulated charges are output through the charge readout module to obtain the photosensitive result.
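Purely for illustration of steps 1701 to 1707 (the function below is a toy model with invented names, not part of the claimed method), the effect of the filter stack on each pixel type can be sketched as follows:

def sensed_components(pixel_type, light_components):
    # light_components is a set such as {"R", "G", "B", "IR"} describing the
    # first light obtained after the optical filter (step 1701). Each pixel
    # type then sees only what its own filter stack passes (steps 1702-1705):
    # the infrared light cut-off layer removes IR for the R, G and B pixels,
    # and the colour filter of each pixel keeps only its own colour component.
    if pixel_type == "IR":
        return light_components & {"IR"}
    return light_components & {pixel_type}   # "R", "G" or "B" only, IR removed

# Example: after the optical filter the first light still contains R, G, B and IR.
first_light = {"R", "G", "B", "IR"}
for p in ("R", "G", "B", "IR"):
    print(p, "pixel senses", sensed_components(p, first_light))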
In one possible embodiment, the method further comprises:
controlling an exposure start time of a visible light pixel including the red pixel, the green pixel, and the blue pixel based on the first control line;
And controlling the exposure starting time of the infrared light pixel based on the second control line.
In the method, the visible light pixels and the infrared light pixels can be exposed independently, so that the photosensitive effect of the sensor is improved.
in one possible embodiment, the method further comprises: and controlling the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
In one possible embodiment, the method further comprises:
Controlling an exposure start time of the red pixel based on a first control line;
Controlling an exposure start time of the green pixel based on a second control line;
Controlling an exposure start time of the blue pixel based on a third control line;
the exposure start time of the infrared light pixel is controlled based on the fourth control line.
In this method, the four pixel components can each be exposed independently, which improves the photosensitive effect of the sensor.
In one possible embodiment, the method further comprises: and controlling the exposure time of the red pixel, the green pixel and the blue pixel to meet a preset proportion.
In one possible embodiment, each pixel in the sensor is coupled to its own row coordinate control line and column coordinate control line and corresponds to one branch of the exposure start control line, and the method further comprises: when the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both at active levels, the branch of the exposure start control line corresponding to the target pixel outputs a control signal, and the exposure start time of the target pixel is controlled based on that control signal, where the target pixel is any pixel in the pixel array.
In this method, the exposure time can be controlled individually for each pixel.
In one possible embodiment, the method further comprises: controlling the exposure end time of all pixels in the pixel array based on an exposure end control line.
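The per-pixel variant can be pictured as an AND gate per pixel: the exposure-start branch of a pixel fires only when both its row and column control lines are at the effective level, while a single exposure-end line stops all pixels together. The sketch below is only a software analogue of that logic under those assumptions; array sizes and signal values are arbitrary.

```python
# Software analogue of per-pixel exposure-start gating with a global exposure end.
# The row/column "effective level" is modelled as a boolean; sizes are arbitrary.

class PixelArray:
    def __init__(self, rows: int, cols: int):
        self.exposing = [[False] * cols for _ in range(rows)]

    def drive(self, row_active: list[bool], col_active: list[bool]) -> None:
        """A branch of the exposure start line fires when row AND column are effective."""
        for r, row_ok in enumerate(row_active):
            for c, col_ok in enumerate(col_active):
                if row_ok and col_ok:
                    self.exposing[r][c] = True

    def end_exposure(self) -> None:
        """The exposure end control line stops every pixel at the same time."""
        for row in self.exposing:
            for c in range(len(row)):
                row[c] = False

array = PixelArray(rows=2, cols=2)
array.drive(row_active=[True, False], col_active=[False, True])
print(array.exposing)   # only pixel (0, 1) has started exposing
array.end_exposure()
```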
Fig. 18 is a flowchart of an exemplary method for independently controlling exposure time according to an embodiment of the present application. The method is applied to a sensor including at least two types of pixels, the at least two types of pixels including a first type of pixel and a second type of pixel, and the method includes:
1801. Controlling an exposure start time of the first type of pixels based on the first control unit;
1802. Controlling an exposure start time of the second type of pixels based on the second control unit.
Illustratively, the sensor may be an RGBIR sensor; correspondingly, the first type of pixels are visible light pixels, which include R, G, and B pixels, and the second type of pixels are IR pixels. Alternatively, the sensor may be an RGBW sensor, where the first type of pixels are visible light pixels including R, G, and B pixels and the second type of pixels are W pixels. Alternatively, the sensor may be an RCCB sensor, where the first type of pixels are visible light pixels including R and B pixels and the second type of pixels are C pixels. The first control unit and the second control unit are independent of each other, so that the exposure start times of the first type of pixels and the second type of pixels are controlled independently. It should be understood that the first control unit and the second control unit may be implemented by hardware logic circuits, or by software modules running on a processor.
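One way to read steps 1801-1802 is as a small table that maps each supported sensor type to its pixel groups, with one control unit per group. The grouping below mirrors the examples in the text (RGBIR, RGBW, RCCB), but the class itself is only an illustrative software stand-in for a hardware logic circuit or processor module; the timing values are invented.

```python
# Illustrative grouping of pixel types per sensor, one control unit per group.
# The ControlUnit here is a software stand-in for hardware logic, not the patent's circuit.

PIXEL_GROUPS = {
    "RGBIR": {"first": ["R", "G", "B"], "second": ["IR"]},  # visible vs. infrared
    "RGBW":  {"first": ["R", "G", "B"], "second": ["W"]},   # visible vs. white
    "RCCB":  {"first": ["R", "B"],      "second": ["C"]},   # visible vs. clear
}

class ControlUnit:
    def __init__(self, name: str, pixels: list[str]):
        self.name, self.pixels = name, pixels

    def start_exposure(self, t_ms: float) -> str:
        return f"{self.name} unit starts {self.pixels} at t={t_ms} ms"

def build_control_units(sensor_type: str) -> list[ControlUnit]:
    """Create one independent control unit per pixel group of the given sensor type."""
    groups = PIXEL_GROUPS[sensor_type]
    return [ControlUnit(name, pixels) for name, pixels in groups.items()]

for unit in build_control_units("RGBIR"):
    print(unit.start_exposure(t_ms=0.0 if unit.name == "first" else 12.0))
```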
In one possible embodiment, the at least two types of pixels further comprise a third type of pixel, and the method further comprises: controlling an exposure start time of the third type of pixels based on a third control unit.
Illustratively, the sensor is an RCCB sensor, the first type of pixels are R pixels, the second type of pixels are B pixels, and the third type of pixels are C pixels; the method specifically comprises: controlling an exposure start time of the R pixels based on the first control unit; controlling an exposure start time of the B pixels based on the second control unit; and controlling an exposure start time of the C pixels based on the third control unit.
In one possible embodiment, the at least two types of pixels further comprise a third type of pixel and a fourth type of pixel, and the method further comprises:
Controlling an exposure start time of the third type of pixel based on a third control unit;
The exposure start time of the fourth type of pixel is controlled based on a fourth control unit.
Illustratively, the sensor is an RGBIR sensor, the first type of pixels are R pixels, the second type of pixels are G pixels, the third type of pixels are B pixels, and the fourth type of pixels are IR pixels; the method specifically comprises:
Controlling an exposure start time of the R pixel based on the first control unit;
Controlling an exposure start time of the G pixel based on the second control unit;
Controlling an exposure start time of the B pixel based on the third control unit;
Controlling an exposure start time of the IR pixel based on the fourth control unit; or,
The sensor is an RGBW sensor, the first type of pixel is an R pixel, the second type of pixel is a G pixel, the third type of pixel is a B pixel, and the fourth type of pixel is a W pixel; the method specifically comprises the following steps:
Controlling an exposure start time of the R pixel based on the first control unit;
Controlling an exposure start time of the G pixel based on the second control unit;
Controlling an exposure start time of the B pixel based on the third control unit;
Controlling an exposure start time of the W pixel based on the fourth control unit.
Optionally, the method further includes:
Controlling the exposure end time of all pixels based on an exposure end control unit.
Optionally, the method further includes: controlling an exposure time of each type of the at least two types of pixels to satisfy a preset ratio.
Illustratively, the exposure times of the visible light pixels and the IR pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, G, B, and IR pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit, the third control unit, and the fourth control unit. Alternatively, the exposure times of the visible light pixels and the W pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, G, B, and W pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit, the third control unit, and the fourth control unit. Alternatively, the exposure times of the visible light pixels and the C pixels are controlled to satisfy a preset ratio based on the first control unit and the second control unit; or the exposure times of the R, B, and C pixels are controlled to satisfy a preset ratio based on the first control unit, the second control unit, and the third control unit.
The exposure start times of the different types of pixels are controlled independently, while the exposure end time is controlled uniformly, so that the exposure times of the pixels can satisfy the preset ratio simply by setting different exposure start times for the different pixels.
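The ratio logic in the preceding paragraphs reduces to simple arithmetic once the end time is shared: pick a unit exposure, multiply it by each channel's ratio factor, and subtract from the common end instant to get each start time. The sketch below works this through for a hypothetical 4:2:2:1 ratio for R:G:B:IR; the ratio and the time base are illustrative assumptions only.

```python
# Start times that realise a preset exposure-time ratio under a shared end instant.
# The 4:2:2:1 ratio and the time base are assumptions for illustration.

EXPOSURE_END_MS = 20.0
UNIT_EXPOSURE_MS = 2.0
RATIO = {"R": 4, "G": 2, "B": 2, "IR": 1}   # hypothetical preset ratio

start_times = {
    channel: EXPOSURE_END_MS - factor * UNIT_EXPOSURE_MS
    for channel, factor in RATIO.items()
}
exposure_times = {ch: EXPOSURE_END_MS - t0 for ch, t0 in start_times.items()}

print(start_times)      # {'R': 12.0, 'G': 16.0, 'B': 16.0, 'IR': 18.0}
print(exposure_times)   # {'R': 8.0, 'G': 4.0, 'B': 4.0, 'IR': 2.0} -> 4:2:2:1
```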
optionally, the method further includes: the charge accumulated in the photosensitive device is transferred to the charge readout module based on the charge transfer control unit.
Embodiments of the present application also provide a computer-readable storage medium, which stores instructions that, when executed on a computer or a processor, cause the computer or the processor to perform some or all of the steps of any one of the methods for independent exposure control provided by the embodiments of the present application.
Embodiments of the present application also provide a computer program product containing instructions which, when run on a computer or a processor, cause the computer or the processor to perform some or all of the steps of any of the methods for independent exposure control provided by the embodiments of the present application.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (36)

1. An image sensor, characterized in that the sensor comprises: a filter layer, micro lenses, and a pixel array, wherein the pixel array comprises red pixels, green pixels, blue pixels, and infrared light pixels, and each pixel corresponds to one micro lens;
the filter layer comprises an infrared light cut-off filter layer, the infrared light cut-off filter layer is coated on the micro lenses corresponding to the red pixels, the green pixels, and the blue pixels respectively, the infrared light cut-off filter layer is used for cutting off optical signals with wavelengths greater than a first preset wavelength, and the optical signals with wavelengths greater than the first preset wavelength include infrared light.
2. The sensor of claim 1, wherein the filter layers further comprise a red filter layer, a green filter layer, a blue filter layer, and an infrared light filter layer;
the micro lenses corresponding to the infrared light pixels are coated with the infrared light filter layer, and the infrared light filter layer can pass infrared light in a specific wavelength range;
The red filter layer is further coated on the micro lens corresponding to the red pixel, the green filter layer is further coated on the micro lens corresponding to the green pixel, and the blue filter layer is further coated on the micro lens corresponding to the blue pixel;
The red filter layer can only pass red light and infrared light in a first wavelength range, the green filter layer can only pass green light and infrared light in a second wavelength range, the blue filter layer can only pass blue light and infrared light in a third wavelength range, and the wavelengths of the infrared light in the first wavelength range, the infrared light in the second wavelength range and the infrared light in the third wavelength range are all larger than the first preset wavelength.
3. The sensor according to claim 2, further comprising an optical filter for filtering out ultraviolet light and infrared light having a wavelength greater than a second preset wavelength, wherein the second preset wavelength is greater than the first preset wavelength and greater than any wavelength in the specific wavelength range;
and the light rays sequentially pass through the optical filter, the filter layer and the micro lens to reach the pixel array.
4. A sensor according to any one of claims 1 to 3, further comprising a charge readout module, each pixel in the array of pixels comprising a photosensitive device;
The photosensitive device is used for converting light rays into electric charges;
And the charge readout module outputs the charges accumulated by the photosensitive device to obtain a photosensitive result.
5. The sensor of any one of claims 1 to 4, further comprising: a logic control circuit, configured to independently control an exposure time of visible light pixels and an exposure time of the infrared light pixels, respectively, wherein the visible light pixels comprise the red pixels, the green pixels, and the blue pixels.
6. The sensor of claim 5, wherein the logic control circuit comprises: a first control line and a second control line, visible light pixels in the pixel array coupled to the first control line, infrared light pixels in the pixel array coupled to the second control line; the logic control circuit is specifically configured to:
Controlling an exposure start time of the visible light pixels based on the first control line;
controlling an exposure start time of the infrared light pixel based on the second control line.
7. The sensor of claim 6,
the logic control circuit is further configured to: control the exposure times of the visible light pixels and the infrared light pixels to satisfy a preset ratio based on the first control line and the second control line.
8. The sensor of any one of claims 1 to 4, further comprising: a logic control circuit, configured to independently control exposure times of the red pixels, the green pixels, the blue pixels, and the infrared light pixels.
9. The sensor of claim 8, wherein the logic control circuit comprises: a first control line, a second control line, a third control line, and a fourth control line, a red pixel in the pixel array coupled to the first control line, a green pixel in the pixel array coupled to the second control line, a blue pixel in the pixel array coupled to the third control line, an infrared pixel in the pixel array coupled to the fourth control line; the logic control circuit is specifically configured to:
Controlling an exposure start time of the red pixel based on the first control line;
Controlling an exposure start time of the green pixel based on the second control line;
Controlling an exposure start time of the blue pixel based on the third control line;
controlling an exposure start time of the infrared light pixel based on the fourth control line.
10. The sensor of claim 9,
the logic control circuit is further configured to: control exposure times of the red, green, blue, and infrared pixels to satisfy a preset ratio based on the first, second, third, and fourth control lines.
11. The sensor of any one of claims 1 to 4, further comprising: a row coordinate control line, a column coordinate control line, and an exposure start control line; each pixel in the pixel array is coupled to a respective row coordinate control line and column coordinate control line, the exposure start control line includes a plurality of branches, and each branch corresponds to one pixel;
when the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both effective levels, the branch output control signal of the exposure start control line corresponding to the target pixel controls the exposure start time of the target pixel, and the target pixel is any one pixel in the pixel array.
12. A method of image sensitization, the method being applied to an image sensor, the sensor comprising: an infrared light cut-off filter layer, micro lenses, and a pixel array, wherein the pixel array comprises red pixels, green pixels, blue pixels, and infrared light pixels, and each pixel in the pixel array corresponds to one micro lens; the infrared light cut-off filter layer is coated on the micro lenses corresponding to the red pixels, the green pixels, and the blue pixels respectively, and the method comprises:
Light rays reach the red pixel, the green pixel and the blue pixel through the infrared light cut-off filter layer and the micro lens;
the infrared light cut-off filter layer is used for cutting off infrared light, so that the infrared light cannot enter the red pixel, the green pixel and the blue pixel.
13. The method of claim 12, wherein the sensor further comprises a red filter layer, a green filter layer, a blue filter layer, and an infrared light filter layer; the red filter layer is coated on the micro lenses corresponding to the red pixels, the green filter layer is coated on the micro lenses corresponding to the green pixels, the blue filter layer is coated on the micro lenses corresponding to the blue pixels, and the infrared light filter layer is coated on the micro lenses corresponding to the infrared light pixels, wherein the method specifically comprises:
The light rays sequentially pass through the infrared light filter layer and the micro lens to reach the infrared light pixels;
The light rays sequentially pass through the infrared light cut-off filter layer, the red filter layer and the micro lens to reach the red pixels;
the light rays sequentially pass through the infrared light cut-off filter layer, the green filter layer and the micro lens to reach the green pixels;
The light rays sequentially pass through the infrared light cut-off filter layer, the blue filter layer, and the micro lens to reach the blue pixels; or,
The light rays sequentially pass through the red filter layer, the infrared light cut-off filter layer and the micro lens to reach the red pixels;
the light rays sequentially pass through the green filter layer, the infrared light cut-off filter layer and the micro lens to reach the green pixels;
the light rays sequentially pass through the blue filter layer, the infrared light cut-off filter layer and the micro lens to reach the blue pixels;
wherein the infrared light filter layer is configured to pass only infrared light within a specific wavelength range, the red light filter layer is configured to pass only red light and infrared light within a first wavelength range, the green light filter layer is configured to pass only green light and infrared light within a second wavelength range, and the blue light filter layer is configured to pass only blue light and infrared light within a third wavelength range;
the infrared light cut by the infrared light cut filter layer includes: infrared light within the first wavelength range, infrared light within the second wavelength range, and infrared light within the third wavelength range.
14. The method according to claim 12 or 13, wherein the sensor further comprises an optical filter, the light rays are obtained after the original natural light passes through the optical filter, the optical filter is used for filtering out ultraviolet light and far infrared light, and the wavelength of the far infrared light is greater than the wavelength of the infrared light in the specific wavelength range that the infrared light filter layer allows to pass; before the steps of claim 12 or 13, the method further comprises:
passing the original natural light through the optical filter to obtain the light rays.
15. The method of any one of claims 12 to 14, wherein the sensor further comprises a charge readout module, each pixel in the pixel array comprising a photosensitive device, the method further comprising:
The photosensitive device converts light into electric charges;
and outputting the accumulated charge through the charge readout module to obtain a photosensitive result.
16. The method of any one of claims 12 to 15, further comprising:
Controlling an exposure start time of visible light pixels including the red pixels, the green pixels, and the blue pixels based on the first control line;
controlling an exposure start time of the infrared light pixel based on the second control line.
17. The method of claim 16, further comprising:
Controlling the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
18. The method of any one of claims 12 to 15, further comprising:
Controlling an exposure start time of the red pixel based on a first control line;
Controlling an exposure start time of the green pixel based on a second control line;
controlling an exposure start time of the blue pixel based on a third control line;
controlling an exposure start time of the infrared light pixel based on a fourth control line.
19. A method according to any one of claims 12 to 15, wherein each pixel in the sensor is coupled to a respective row coordinate control line and column coordinate control line and corresponds to a branch of an exposure start control line, the method further comprising:
When the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both effective levels, the branch of the exposure start control line corresponding to the target pixel outputs a control signal, and the exposure start time of the target pixel is controlled based on the control signal, wherein the target pixel is any one pixel in the pixel array.
20. An apparatus for independent exposure, the apparatus comprising: at least two control units, wherein each control unit is configured to control an exposure start time of one corresponding type of pixel in a pixel array of a sensor, and the pixel array of the sensor comprises at least two types of pixels.
21. The apparatus of claim 20, further comprising: the pixel array.
22. The apparatus of claim 20 or 21, wherein the sensor is an RGBIR sensor,
the at least two types of pixels include: visible light pixels and IR pixels, the visible light pixels comprising: R pixels, G pixels, and B pixels, and the at least two control units include: a first control unit and a second control unit, wherein the first control unit is used for controlling an exposure start time of the visible light pixels and the second control unit is used for controlling an exposure start time of the IR pixels;
Alternatively, the at least two types of pixels include: r pixel, B pixel, G pixel and IR pixel, the at least two control units include: a first control unit, a second control unit, a third control unit and a fourth control unit;
The first control unit is used for controlling the exposure starting time of the R pixel;
The second control unit is used for controlling the exposure starting time of the G pixel;
The third control unit is used for controlling the exposure starting time of the B pixel;
The fourth control unit is for controlling an exposure start time of the IR pixel.
23. The apparatus of claim 20 or 21, wherein the sensor is an RGBW sensor,
The at least two types of pixels include: a visible light pixel and a W pixel, the visible light pixel including: r pixel, G pixel, B pixel, two at least control unit include: a first control unit and a second control unit;
the first control unit is used for controlling the exposure starting time of the visible light pixel;
The second control unit is used for controlling the exposure starting time of the W pixel; or
The at least two types of pixels include: r pixel, B pixel, G pixel and W pixel, the at least two control units include: a first control unit, a second control unit, a third control unit and a fourth control unit;
The first control unit is used for controlling the exposure starting time of the R pixel;
The second control unit is used for controlling the exposure starting time of the G pixel;
The third control unit is used for controlling the exposure starting time of the B pixel;
The fourth control unit is configured to control an exposure start time of the W pixel.
24. The apparatus of claim 20 or 21, wherein the sensor is an RCCB sensor,
The at least two types of pixels include: a visible light pixel and a C pixel, the visible light pixel including: r pixels and B pixels, the at least two control units including: a first control unit and a second control unit;
The first control unit is used for controlling the exposure starting time of the visible light pixel;
the second control unit is used for controlling the exposure starting time of the C pixel; or
the at least two types of pixels include: r pixel, B pixel and C pixel, the at least two control units include: a first control unit, a second control unit and a third control unit;
The first control unit is used for controlling the exposure starting time of the R pixel;
the second control unit is used for controlling the exposure starting time of the B pixel;
The third control unit is used for controlling the exposure starting time of the C pixel.
25. The apparatus according to any one of claims 20 to 24, wherein exposure times of the at least two types of pixels are controlled to satisfy a preset ratio based on the at least two control units.
26. The apparatus of any one of claims 20 to 25, further comprising: an exposure end control unit, configured to uniformly control an exposure end time of all the pixels in the pixel array.
27. A method of independent exposure for use with a sensor comprising at least two types of pixels, the at least two types of pixels comprising a first type of pixel and a second type of pixel, the method comprising:
Controlling an exposure start time of the first type of pixel based on a first control unit;
controlling an exposure start time of the second type of pixel based on a second control unit.
28. The method of claim 27, further comprising:
Controlling an exposure time of each type of pixels of the at least two types of pixels to satisfy a preset ratio.
29. The method of claim 27 or 28, wherein the sensor is an RGBIR sensor, the first type of pixels are visible light pixels, the second type of pixels are IR pixels, and the visible light pixels comprise R pixels, G pixels, and B pixels; or,
the sensor is an RGBW sensor, the first type of pixels are visible light pixels, the second type of pixels are W pixels, and the visible light pixels comprise R pixels, G pixels, and B pixels; or,
the sensor is an RCCB sensor, the first type of pixels are visible light pixels, the second type of pixels are C pixels, and the visible light pixels comprise R pixels and B pixels.
30. The method of claim 27 or 28, wherein the at least two types of pixels further comprise: a third type of pixel; the method further comprises the following steps:
Controlling an exposure start time of the third type of pixel based on a third control unit.
31. The method of claim 30, wherein the sensor is an RCCB sensor, the first type of pixel is an R pixel, the second type of pixel is a B pixel, and the third type of pixel is a C pixel; the method specifically comprises the following steps:
Controlling an exposure start time of the R pixel based on the first control unit;
Controlling an exposure start time of the B pixel based on the second control unit;
controlling an exposure start time of the C pixel based on the third control unit.
32. The method of claim 27 or 28, wherein the at least two types of pixels further comprise: a third type of pixel and a fourth type of pixel, the method further comprising:
Controlling an exposure start time of the third type of pixel based on a third control unit;
controlling an exposure start time of the fourth type of pixel based on a fourth control unit.
33. The method of claim 32, wherein the sensor is an RGBIR sensor, the first type of pixel is an R pixel, the second type of pixel is a G pixel, the third type of pixel is a B pixel, and the fourth type of pixel is an IR pixel; the method specifically comprises the following steps:
Controlling an exposure start time of the R pixel based on the first control unit;
Controlling an exposure start time of the G pixels based on the second control unit;
Controlling an exposure start time of the B pixel based on the third control unit;
Controlling an exposure start time of the IR pixel based on the fourth control unit; or,
The sensor is an RGBW sensor, the first type of pixels are R pixels, the second type of pixels are G pixels, the third type of pixels are B pixels, and the fourth type of pixels are W pixels; the method specifically comprises the following steps:
Controlling an exposure start time of the R pixel based on the first control unit;
Controlling an exposure start time of the G pixels based on the second control unit;
controlling an exposure start time of the B pixel based on the third control unit;
Controlling an exposure start time of the W pixel based on the fourth control unit.
34. The method of any one of claims 27 to 33, further comprising: uniformly controlling an exposure end time of all the pixels in the pixel array based on an exposure end control unit.
35. A computer-readable storage medium having stored therein instructions which, when run on a computer or processor, cause the computer or processor to perform the method of any one of claims 27-34.
36. A computer program product comprising instructions which, when run on a computer or processor, cause the computer or processor to perform the method of any of claims 27-34.
CN201980001909.XA 2019-07-31 2019-07-31 Image sensor and image sensitization method Pending CN110574367A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/098481 WO2021016900A1 (en) 2019-07-31 2019-07-31 Image sensor and image photosensing method

Publications (1)

Publication Number Publication Date
CN110574367A true CN110574367A (en) 2019-12-13

Family

ID=68786091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980001909.XA Pending CN110574367A (en) 2019-07-31 2019-07-31 Image sensor and image sensitization method

Country Status (2)

Country Link
CN (1) CN110574367A (en)
WO (1) WO2021016900A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113973181A (en) * 2021-11-30 2022-01-25 维沃移动通信有限公司 Image sensor, camera module and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878690A (en) * 2015-12-14 2017-06-20 比亚迪股份有限公司 The imaging method of imageing sensor, imaging device and electronic equipment
CN107360405A (en) * 2016-05-09 2017-11-17 比亚迪股份有限公司 Imaging sensor, imaging method and imaging device

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014142259A1 (en) * 2013-03-14 2014-09-18 富士フイルム株式会社 Solid-state imaging element, manufacturing method for same, curable composition for forming infrared cutoff filter, and camera module
CN105990378A (en) * 2014-10-06 2016-10-05 采钰科技股份有限公司 Image sensors and methods of forming the same
CN107709942A (en) * 2015-06-26 2018-02-16 索尼公司 Check equipment, sensor device, sensitivity control device, inspection method and program
JP2018045011A (en) * 2016-09-13 2018-03-22 富士フイルム株式会社 Infrared absorbent, composition, film, optical filter, laminate, solid state imaging device, image display apparatus, and infrared sensor
CN106454148A (en) * 2016-11-15 2017-02-22 天津大学 CMOS image sensor pixel structure based on blocked independent exposure and control method thereof
CN108347551A (en) * 2017-01-25 2018-07-31 芯视达系统公司 The image sensing device of high dynamic range
CN108419062A (en) * 2017-02-10 2018-08-17 杭州海康威视数字技术股份有限公司 Image co-registration equipment and image interfusion method
WO2018225517A1 (en) * 2017-06-07 2018-12-13 ソニーセミコンダクタソリューションズ株式会社 Information processing device and method
CN109981940A (en) * 2017-11-30 2019-07-05 普里露尼库斯股份有限公司 Solid-state imaging apparatus, the method for driving solid-state imaging apparatus and electronic equipment
CN109951646A (en) * 2017-12-20 2019-06-28 杭州海康威视数字技术股份有限公司 Image interfusion method, device, electronic equipment and computer readable storage medium
CN108965654A (en) * 2018-02-11 2018-12-07 浙江宇视科技有限公司 Double spectrum camera systems and image processing method based on single-sensor
CN109922286A (en) * 2019-03-21 2019-06-21 思特威(上海)电子科技有限公司 Cmos image sensor and its imaging method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021174425A1 (en) * 2020-03-03 2021-09-10 华为技术有限公司 Image sensor and image sensitization method
CN115462066A (en) * 2020-03-03 2022-12-09 华为技术有限公司 Image sensor and image sensing method
CN111447423A (en) * 2020-03-25 2020-07-24 浙江大华技术股份有限公司 Image sensor, imaging apparatus, and image processing method
CN114095672A (en) * 2020-07-31 2022-02-25 北京小米移动软件有限公司 Imaging system, method and electronic device
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium
CN113038046A (en) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor
WO2022199413A1 (en) * 2021-03-23 2022-09-29 北京灵汐科技有限公司 Pixel sensing array and visual sensor
CN113038046B (en) * 2021-03-23 2023-07-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor

Also Published As

Publication number Publication date
WO2021016900A1 (en) 2021-02-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191213