WO2021016900A1 - Image sensor and image light-sensing method

Image sensor and image light-sensing method

Info

Publication number
WO2021016900A1
WO2021016900A1 (PCT/CN2019/098481; CN2019098481W)
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
pixel
control
filter layer
infrared light
Prior art date
Application number
PCT/CN2019/098481
Other languages
English (en)
Chinese (zh)
Inventor
王晗
杨红明
涂娇姣
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to CN201980001909.XA (CN110574367A)
Priority to PCT/CN2019/098481 (WO2021016900A1)
Publication of WO2021016900A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 — Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/70 — Circuitry for compensating brightness variation in the scene
    • H04N23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 — Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 — Arrangement of colour filter arrays [CFA]; Filter mosaics

Definitions

  • This application relates to the field of image processing, and in particular to an image sensor and a method for image light-sensing.
  • The color filter array (CFA) of a traditional Bayer red-green-blue (RGB) sensor contains three components: R, G, and B.
  • Unlike the traditional Bayer RGB sensor, the CFA of a red-green-blue-infrared (RGBIR) sensor contains four components: R, G, B, and IR. Figure 1 shows the photosensitive characteristic curve of each pixel in an RGBIR sensor: only the IR filter layer transmits infrared light alone, while the R filter layer transmits both red and infrared light, the G filter layer transmits both green and infrared light, and the B filter layer transmits both blue and infrared light. Therefore, even with these filter layers, the IR component cannot be completely stripped from the visible light.
  • As a result, the light-sensing results obtained by the R, G, and B pixels of the photosensitive device all contain a certain amount of IR component signal. Because of this IR component, the color information of the image signal obtained by the sensor is not accurate, and under certain lighting conditions the light-sensing performance of existing RGBIR sensors is unsatisfactory.
  • The embodiments of the present application provide an image sensor and an image light-sensing method, so that the R, G, B, and IR components can be sensed independently, which greatly improves the light-sensing effect.
  • The first aspect of the present application provides an image sensor. The sensor includes a filter layer, a plurality of micro lenses, and a pixel array; the pixel array includes red pixels, green pixels, blue pixels, and infrared light pixels, and each pixel corresponds to a micro lens. The filter layer includes an infrared light cut filter layer, and the micro lenses corresponding to the red pixels, green pixels, and blue pixels are each coated with the infrared light cut filter layer. The infrared light cut filter layer is used to cut off optical signals with a wavelength greater than a first preset wavelength, and such optical signals include infrared light.
  • Coating the infrared light cut filter layer on the micro lenses corresponding to the red, green, and blue pixels prevents IR light from entering the visible light pixels and removes the IR component signal from their light-sensing results; the color of the light-sensing result is therefore more accurate, and the light-sensing effect of the sensor is improved.
  • The embodiment of the present application coats the infrared light cut filter layer onto the micro lens using coating technology: on the one hand, no complicated mechanical structure needs to be added; on the other hand, the pixel itself under the micro lens is not changed. The relatively simple and stable internal structure of the pixel helps control the chief ray angle (CRA) and other factors that affect imaging, so the light-sensing effect of the sensor is improved while the pixel structure remains stable.
  • the first preset wavelength is 650nm.
  • The infrared cut filter layer cuts off all light with a wavelength beyond the visible range, ensuring that infrared light in any wavelength range cannot enter the red, green, or blue pixels.
  • The filter layer further includes a red filter layer, a green filter layer, a blue filter layer, and an infrared filter layer. The micro lens corresponding to the infrared light pixel is coated with the infrared filter layer, which passes only infrared light in a specific wavelength range. The micro lens corresponding to the red pixel is also coated with the red filter layer, the micro lens corresponding to the green pixel is also coated with the green filter layer, and the micro lens corresponding to the blue pixel is also coated with the blue filter layer. The red filter layer passes only red light and infrared light in a first wavelength range, the green filter layer passes only green light and infrared light in a second wavelength range, and the blue filter layer passes only blue light and infrared light in a third wavelength range; the wavelengths of the infrared light in the first, second, and third wavelength ranges are all greater than the first preset wavelength.
  • A red filter layer and an infrared light cut filter layer are coated over the red pixels to filter out the IR component in their light-sensing result, so that the red pixels receive only R light. Likewise, the green pixels are coated with a green filter layer and an infrared cut filter layer, and the blue pixels with a blue filter layer and an infrared cut filter layer, filtering out the IR component so that the green pixels receive only G light and the blue pixels receive only B light. The infrared filter layer coated over the infrared light pixels ensures that the IR pixels receive only IR light, which greatly improves the color accuracy of the light-sensing results obtained by the RGBIR sensor.
  • The red filter layer may be above or below the infrared light cut filter layer, and the same holds for the green filter layer and the blue filter layer.
  • the embodiment of the present application does not limit the coating sequence of the infrared light cut filter layer, the red filter layer, the green filter layer, and the blue filter layer on the micro lens.
  • The micro lens of a red pixel is coated with a red filter layer and an infrared light cut filter layer; the micro lens of a green pixel is coated with a green filter layer and an infrared light cut filter layer; the micro lens of a blue pixel is coated with a blue filter layer and an infrared light cut filter layer; and the micro lens of an infrared light pixel is coated with an infrared filter layer. The positional relationship between the infrared light cut filter layer and the red, green, and blue filter layers coated on the micro lens is not limited: the red, green, and blue filter layers may each be coated on top of the infrared cut filter layer, or the infrared cut filter layer may be coated on top of each of them, as long as the light passes through both the infrared cut filter layer and the corresponding visible-light filter layer before reaching the micro lens.
  • In an optional case, the infrared cut filter layer is coated on the micro lens while the red, green, and blue filter layers are coated on the inner side of the micro lens or are respectively fabricated in the red, green, and blue pixels; in another optional case, the red, green, and blue filter layers are coated on the micro lens while the infrared cut filter layer is coated on the inner side of the micro lens or fabricated inside the red, green, and blue pixels.
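  • The following is a minimal, illustrative sketch (not part of the application) of how the per-pixel filter stacks described above could be represented in software; the mapping name and the chosen layer order are assumptions, since the application allows either order.

    # Hypothetical per-pixel filter stacks for an RGBIR sensor, assuming the
    # IR-cut layer sits above the colour filter layer (either order is allowed).
    # Light traverses the list from first to last before reaching the pixel.
    FILTER_STACK = {
        "R":  ["ir_cut", "red_filter"],    # red pixel: IR-cut + red filter
        "G":  ["ir_cut", "green_filter"],  # green pixel: IR-cut + green filter
        "B":  ["ir_cut", "blue_filter"],   # blue pixel: IR-cut + blue filter
        "IR": ["ir_pass_filter"],          # IR pixel: infrared filter only
    }

    def layers_seen_by(pixel_type):
        """Return the coating layers light crosses before reaching the pixel."""
        return FILTER_STACK[pixel_type]

    print(layers_seen_by("G"))  # ['ir_cut', 'green_filter']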
  • The sensor further includes a filter for filtering out ultraviolet light and infrared light with a wavelength greater than a second preset wavelength, where the second preset wavelength is greater than both the first preset wavelength and any wavelength within the specific wavelength range; light passes sequentially through the filter, the filter layer, and the micro lens to reach the pixel array.
  • the filter can filter out the far-infrared light with a longer wavelength and the ultraviolet light with a shorter wavelength in natural light, so as to prevent the far-infrared light and ultraviolet light from affecting the photosensitive characteristics of the photosensitive device.
  • The sensor further includes a charge readout module, and each pixel in the pixel array includes a photosensitive device. The photosensitive device is used to convert light into electric charge, and the charge readout module outputs the charge accumulated by the photosensitive device to obtain the light-sensing result.
  • The sensor further includes a logic control circuit for independently controlling the exposure times of the visible light pixels and the infrared light pixels, where the visible light pixels include the red pixels, the green pixels, and the blue pixels.
  • In existing RGBIR sensors, the exposure of the RGB visible light components and the IR component is controlled jointly, so exposure imbalance is prone to occur when lighting conditions are not ideal and the dynamic range of the existing RGBIR sensor is poor.
  • In the embodiment of the present application, the exposure times of visible light pixels and IR pixels are independently controlled. For example, when infrared light is too strong and visible light is too weak, the exposure time of visible light can be increased and the exposure time of infrared light reduced so that the two tend toward balance. This avoids exposure imbalance when either infrared light or visible light is the dominant component, and improves the dynamic range of the sensor's light sensing to meet the user's requirements for clarity and signal-to-noise ratio.
  • The logic control circuit includes a first control line and a second control line; the visible light pixels in the pixel array are coupled to the first control line, and the infrared light pixels in the pixel array are coupled to the second control line. The logic control circuit is specifically configured to control the exposure start time of the visible light pixels based on the first control line and the exposure start time of the infrared light pixels based on the second control line.
  • the logic control circuit is further configured to control the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
  • The first control line outputs a first control signal, and the second control line outputs a second control signal. When the first effective transition edge of the first control signal arrives, the visible light pixels start to be exposed; when the second effective transition edge of the second control signal arrives, the infrared light pixels start to be exposed. By setting the arrival times of the first and second effective transition edges, the exposure times of the visible light pixels and the infrared light pixels can be made to meet a preset ratio.
  • The image sensor provided in the embodiments of the present application can therefore set the arrival times of the effective transition edges of the respective control signals of the visible light pixels and the infrared light pixels so that their exposure times meet the preset ratio. For example, when the ratio of the exposure time of the visible light signal to that of the infrared light signal is 2:1, the exposure result is sharper and the signal-to-noise ratio is higher; in that case the control signal of the visible light pixels transitions first and the control signal of the infrared light pixels transitions later, and the time difference between the two transition points makes the two exposure times meet the preset ratio (a sketch of this timing calculation is given below).
  • The effective transition edge may be the falling edge of a high-level signal, the rising edge of a low-level signal, the rising edge of a high-level signal, the falling edge of a low-level signal, and so on.
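  • The sketch below is a rough, illustrative calculation (not from the application) of where to place the two effective transition edges so that the visible-light and IR exposure windows, both terminated by a common exposure-end edge, meet a chosen ratio; the function name, time unit, and the 2:1 default are assumptions.

    def start_edges_for_ratio(exposure_end_us, ir_exposure_us,
                              ratio_visible_to_ir=2.0):
        """Place the effective transition edges of the first (visible light) and
        second (IR) control signals so that both exposures end at the same
        exposure-end edge and their durations meet the preset ratio."""
        visible_exposure_us = ratio_visible_to_ir * ir_exposure_us
        visible_start_us = exposure_end_us - visible_exposure_us  # earlier edge
        ir_start_us = exposure_end_us - ir_exposure_us            # later edge
        return visible_start_us, ir_start_us

    # Example: exposure ends at t = 10000 us, IR exposed for 3000 us, ratio 2:1.
    print(start_edges_for_ratio(10000.0, 3000.0))  # (4000.0, 7000.0)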
  • The sensor further includes a logic control circuit for independently controlling the exposure times of the red pixels, the green pixels, the blue pixels, and the infrared light pixels.
  • With the exposure times of the four components R, G, B, and IR controlled independently, the dynamic range of the sensor's light sensing is further improved, and light-sensing results whose sharpness or signal-to-noise ratio better matches customer needs can be provided.
  • The logic control circuit includes a first control line, a second control line, a third control line, and a fourth control line; the red pixels in the pixel array are coupled to the first control line, the green pixels to the second control line, the blue pixels to the third control line, and the infrared light pixels to the fourth control line. The logic control circuit is specifically configured to control the exposure start time of the red pixels based on the first control line, of the green pixels based on the second control line, of the blue pixels based on the third control line, and of the infrared light pixels based on the fourth control line.
  • The logic control circuit is further configured to control, based on the first, second, third, and fourth control lines, the exposure times of the red, green, blue, and infrared light pixels so that they meet a preset ratio.
  • the exposure time of the four components of R, G, B, and IR may be preset to meet the preset ratio, so as to achieve fine control of the sensor's light-sensing effect.
  • The first control line outputs a first control signal, the second control line outputs a second control signal, the third control line outputs a third control signal, and the fourth control line outputs a fourth control signal. When the first effective transition edge of the first control signal arrives, the red pixels start to be exposed; when the second effective transition edge of the second control signal arrives, the green pixels start to be exposed; when the third effective transition edge of the third control signal arrives, the blue pixels start to be exposed; and when the fourth effective transition edge of the fourth control signal arrives, the infrared light pixels start to be exposed.
  • The sensor further includes a row coordinate control line, a column coordinate control line, and an exposure start control line. Each pixel in the pixel array is coupled to its own row coordinate control line and column coordinate control line, and the exposure start control line includes multiple branches, each branch corresponding to one pixel. When the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both at the effective level, the branch of the exposure start control line corresponding to the target pixel outputs a control signal to control the exposure start time of the target pixel; the target pixel is any pixel in the pixel array.
  • In this way the exposure time of each pixel can be independently controlled. In scenes where the pixels in a target area need to be enhanced, the exposure time of only those pixels can be increased, which further improves the flexibility of the sensor's exposure and further satisfies users' needs for the light-sensing results (a sketch of this per-pixel gating is given below).
  • The sensor further includes an exposure end control signal for uniformly controlling the exposure end time of all pixels in the pixel array.
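  • The sketch below illustrates, under stated assumptions, the per-pixel gating just described: a pixel's branch of the exposure start control line is asserted only when both its row and column coordinate control lines output the effective level, while a single exposure end signal stops all pixels together. The signal and set names are hypothetical.

    def exposure_start_branch(row_signal, col_signal):
        """A pixel's exposure-start branch is asserted only when both its row
        and column coordinate control lines output the effective level."""
        return row_signal and col_signal

    # Example: enhance only a rectangular target area by asserting its rows/columns.
    rows_active = {2, 3}        # hypothetical target rows
    cols_active = {5, 6, 7}     # hypothetical target columns

    def pixel_starts_exposure(x, y):
        return exposure_start_branch(y in rows_active, x in cols_active)

    print(pixel_starts_exposure(6, 3))  # True  -> this pixel starts its exposure
    print(pixel_starts_exposure(0, 0))  # False -> this pixel is not gated on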
  • The logic control circuit includes a first control variable x and a second control variable y. When x and y meet the coordinate condition of the visible light pixels, the reset signal of the logic control circuit is output to the first control line as the first control signal; when x and y meet the coordinate condition of the IR pixels, the reset signal of the logic control circuit is output to the second control line as the second control signal.
  • In the four-control-line case, the logic control circuit likewise includes a first control variable x and a second control variable y. When x and y meet the coordinate condition of the red pixels, the reset signal of the logic control circuit is output to the first control line as the first control signal; when they meet the coordinate condition of the green pixels, it is output to the second control line as the second control signal; when they meet the coordinate condition of the blue pixels, it is output to the third control line as the third control signal; and when they meet the coordinate condition of the infrared light pixels, it is output to the fourth control line as the fourth control signal.
  • The second aspect of the present application provides an image light-sensing method applied to an image sensor. The sensor includes an infrared light cut filter layer, a plurality of micro lenses, and a pixel array; the pixel array includes red pixels, green pixels, blue pixels, and infrared light pixels, and each pixel in the pixel array corresponds to a micro lens. The infrared light cut filter layer is coated on the micro lenses corresponding to the red pixels, the green pixels, and the blue pixels.
  • The method includes: light passes through the infrared light cut filter layer and the micro lens to reach the red pixels, the green pixels, and the blue pixels, where the infrared light cut filter layer is used to cut off infrared light so that infrared light cannot enter the red, green, or blue pixels.
  • The sensor further includes a red filter layer, a green filter layer, a blue filter layer, and an infrared filter layer. The red filter layer is coated on the micro lens corresponding to the red pixel, the green filter layer on the micro lens corresponding to the green pixel, the blue filter layer on the micro lens corresponding to the blue pixel, and the infrared filter layer on the micro lens corresponding to the infrared light pixel.
  • The method specifically includes: light passes sequentially through the infrared filter layer and the micro lens to reach the infrared light pixel; light passes sequentially through the infrared light cut filter layer, the red filter layer, and the micro lens to reach the red pixel; light passes sequentially through the infrared light cut filter layer, the green filter layer, and the micro lens to reach the green pixel; and light passes sequentially through the infrared light cut filter layer, the blue filter layer, and the micro lens to reach the blue pixel. Alternatively, the light may pass through the color filter layer before the infrared light cut filter layer, as described below.
  • The embodiment of the application does not limit the positional relationship between the infrared cut filter layer and the red, green, and blue filter layers coated on the micro lens: the red, green, and blue filter layers may each be coated on the infrared light cut filter layer, or the infrared light cut filter layer may be coated on each of the red, green, and blue filter layers, as long as the light passes through both the infrared cut filter layer and the corresponding visible-light filter layer before reaching the micro lens.
  • The sensor further includes a filter, and the light is obtained from natural light after it passes through the filter; the filter is used to filter out ultraviolet light and far-infrared light, where the wavelength of the far-infrared light is greater than the wavelengths in the specific wavelength range passed by the infrared filter layer. The method further includes: natural light passes through the filter to obtain the light.
  • The sensor further includes a charge readout module, and each pixel in the pixel array includes a photosensitive device. The method further includes: the photosensitive device converts light into electric charge, and the charge readout module outputs the accumulated electric charge to obtain the light-sensing result.
  • The method further includes: controlling the exposure start time of the visible light pixels based on the first control line, where the visible light pixels include the red pixels, the green pixels, and the blue pixels; and controlling the exposure start time of the infrared light pixels based on the second control line.
  • the method further includes: controlling the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
  • The method further includes: controlling the exposure start time of the red pixels based on the first control line; controlling the exposure start time of the green pixels based on the second control line; controlling the exposure start time of the blue pixels based on the third control line; and controlling the exposure start time of the infrared light pixels based on the fourth control line.
  • The method further includes: controlling, based on the first, second, third, and fourth control lines, the exposure times of the red, green, blue, and infrared light pixels so that they meet a preset ratio.
  • Each pixel in the sensor is coupled to its own row coordinate control line and column coordinate control line, and each pixel corresponds to one branch of the exposure start control line. The method further includes: when the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both at the effective level, the branch of the exposure start control line corresponding to the target pixel outputs a control signal, and the exposure start time of the target pixel is controlled based on that control signal; the target pixel is any pixel in the pixel array.
  • A third aspect of the present application provides an independent exposure device. The device includes at least two control units, each of which is used to control the exposure start time of one type of pixel in the pixel array of a sensor, where the pixel array of the sensor includes at least two types of pixels.
  • When the exposure times of different types of pixels are controlled jointly, exposure imbalance is prone to occur when lighting conditions are not ideal, exposure control is inflexible, and the dynamic range of the sensor's exposure is relatively poor.
  • the device provided in the present application can independently control the exposure time of different types of pixels in the sensor, which improves the dynamic range and signal-to-noise ratio of the sensor.
  • the device is a control unit or logic control circuit independent of the sensor, and the corresponding product form may be a processor or a chip product containing a processor.
  • the device further includes: the pixel array.
  • the device may be a sensor including a control unit.
  • In an optional case, the sensor is an RGBIR sensor. The at least two types of pixels may include visible light pixels and IR pixels, where the visible light pixels include R pixels, G pixels, and B pixels; or the at least two types of pixels may include R pixels, G pixels, B pixels, and IR pixels. In the latter case, the at least two control units include a first control unit, a second control unit, a third control unit, and a fourth control unit, which are used to control the exposure start times of the R pixels, the G pixels, the B pixels, and the IR pixels, respectively.
  • In another optional case, the sensor is an RGBW sensor. The at least two types of pixels may include visible light pixels and W pixels, where the visible light pixels include R pixels, G pixels, and B pixels; in this case the at least two control units include a first control unit for controlling the exposure start time of the visible light pixels and a second control unit for controlling the exposure start time of the W pixels. Alternatively, the at least two types of pixels include R pixels, G pixels, B pixels, and W pixels; in this case the at least two control units include a first, a second, a third, and a fourth control unit, which control the exposure start times of the R, G, B, and W pixels, respectively.
  • In another optional case, the sensor is an RCCB sensor. The at least two types of pixels may include visible light pixels and C pixels, where the visible light pixels include R pixels and B pixels; in this case the at least two control units include a first control unit for controlling the exposure start time of the visible light pixels and a second control unit for controlling the exposure start time of the C pixels. Alternatively, the at least two types of pixels include R pixels, B pixels, and C pixels; in this case the at least two control units include a first, a second, and a third control unit, which control the exposure start times of the R, B, and C pixels, respectively.
  • the device further includes: an exposure end control unit, configured to uniformly control the exposure end time of all pixels in the pixel array.
  • The fourth aspect of the present application provides an independent exposure method, which is applied to a sensor including at least two types of pixels, the at least two types of pixels including a first type of pixel and a second type of pixel. The method includes: controlling the exposure start time of the first type of pixel based on a first control unit; and controlling the exposure start time of the second type of pixel based on a second control unit.
  • the method further includes: controlling the exposure time of each of the at least two types of pixels to meet a preset ratio.
  • For example, the exposure times of the visible light pixels and the IR pixels are controlled to meet the preset ratio based on the first control unit and the second control unit; or the exposure times of the R, G, B, and IR pixels are controlled to meet the preset ratio based on the first, second, third, and fourth control units.
  • Similarly, the exposure times of the visible light pixels and the W pixels may be controlled to meet the preset ratio based on the first and second control units; or the exposure times of the R, G, B, and W pixels based on the first, second, third, and fourth control units; or the exposure times of the visible light pixels and the C pixels based on the first and second control units; or the exposure times of the R, B, and C pixels based on the first, second, and third control units.
  • The sensor may be an RGBIR sensor, in which case the first type of pixels are visible light pixels, the second type of pixels are IR pixels, and the visible light pixels include R pixels, G pixels, and B pixels; or an RGBW sensor, in which case the first type of pixels are visible light pixels, the second type of pixels are W pixels, and the visible light pixels include R pixels, G pixels, and B pixels; or an RCCB sensor, in which case the first type of pixels are visible light pixels, the second type of pixels are C pixels, and the visible light pixels include R pixels and B pixels.
  • the at least two types of pixels further include: a third type of pixels; the method further includes: controlling the exposure start time of the third type of pixels based on the third control unit.
  • When the sensor is an RCCB sensor, the first type of pixels are R pixels, the second type of pixels are B pixels, and the third type of pixels are C pixels. The method specifically includes: controlling the exposure start time of the R pixels based on the first control unit; controlling the exposure start time of the B pixels based on the second control unit; and controlling the exposure start time of the C pixels based on the third control unit.
  • The at least two types of pixels may further include a third type of pixel and a fourth type of pixel, and the method further includes: controlling the exposure start time of the third type of pixel based on a third control unit, and controlling the exposure start time of the fourth type of pixel based on a fourth control unit.
  • When the sensor is an RGBIR sensor, the first type of pixels are R pixels, the second type of pixels are G pixels, the third type of pixels are B pixels, and the fourth type of pixels are IR pixels. The method specifically includes: controlling the exposure start time of the R pixels based on the first control unit; of the G pixels based on the second control unit; of the B pixels based on the third control unit; and of the IR pixels based on the fourth control unit. Alternatively, the sensor is an RGBW sensor, where the first type of pixels are R pixels, the second type are G pixels, the third type are B pixels, and the fourth type are W pixels; the method specifically includes controlling the exposure start time of the R pixels based on the first control unit, of the G pixels based on the second control unit, of the B pixels based on the third control unit, and of the W pixels based on the fourth control unit.
  • the method further includes: uniformly controlling the exposure end time of all pixels in the pixel array based on the exposure end control unit.
  • The fifth aspect of the present application provides a computer-readable storage medium with instructions stored therein which, when run on a computer or processor, cause the computer or processor to execute the method in the fourth aspect or any one of its possible implementations.
  • The sixth aspect of the present application provides a computer program product containing instructions which, when run on a computer or processor, causes the computer or processor to execute the method in the fourth aspect or any one of its possible implementations.
  • FIG. 1 is a schematic diagram of the photosensitive characteristic curve of each pixel of an exemplary RGBIR sensor according to an embodiment of the application;
  • Figure 2a is a schematic diagram of an exemplary 2X2 array of RGBIR sensors
  • Fig. 2b is a schematic diagram of another exemplary 2X2 array sorted RGBIR sensor
  • Figure 3a is a schematic diagram of an exemplary 4X4 array of RGBIR sensors
  • Figure 3b is a schematic diagram of another exemplary 4X4 array of RGBIR sensors
  • Fig. 4 is a schematic structural diagram of an exemplary RGBIR sensor provided by an embodiment of the application.
  • Fig. 5 is a schematic structural diagram of another exemplary RGBIR sensor provided by an embodiment of the application.
  • Fig. 6 is a schematic structural diagram of another exemplary RGBIR sensor provided by an embodiment of the application.
  • FIG. 7a is a schematic diagram of an exemplary 2X2 array sorting RGBIR control connection provided by an embodiment of the application.
  • FIG. 7b is a schematic diagram of an exemplary 4X4 array sorting RGBIR control connection provided by an embodiment of the application.
  • FIG. 8 is a timing diagram of an exemplary control signal provided by an embodiment of the application.
  • FIG. 9a is a schematic diagram of another exemplary 2X2 array sorting RGBIR control connection provided by an embodiment of the application.
  • Fig. 9b is another exemplary RGBIR control connection diagram of 4X4 array sorting provided by an embodiment of the application.
  • FIG. 10 is a timing diagram of an exemplary control signal provided by an embodiment of the application.
  • FIG. 11 is an exemplary schematic diagram, provided by an embodiment of the present application, of a sensor in which the exposure time of each pixel can be independently controlled;
  • FIG. 12 is a timing diagram of an exemplary control signal provided by an embodiment of the application.
  • FIG. 13 is a graph of photosensitive characteristics of each pixel of an RGBIR sensor provided by an embodiment of the application.
  • Figure 14a is a schematic diagram of an exemplary 2X2 array of RGBW sensors
  • Figure 14b is a schematic diagram of an exemplary 2X2 array of RCCB sensors
  • FIG. 15 is a framework diagram of an exemplary independent exposure control unit of an RCCB sensor provided by an embodiment of the application.
  • FIG. 16 is a schematic diagram of the hardware architecture of an exemplary independent exposure apparatus provided by an embodiment of the application.
  • FIG. 17 is a schematic flowchart of an exemplary image light-sensing method according to an embodiment of the application.
  • FIG. 18 is a schematic flowchart of an exemplary method for independently controlling exposure time provided by an embodiment of the application.
  • At least one (item) refers to one or more, and “multiple” refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean: only A, only B, or both A and B, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects are in an “or” relationship.
  • "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or of multiple items.
  • For example, at least one of a, b, or c can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can each be single or multiple.
  • the quality of the image obtained by the imaging device is related to the light sensitivity of the image sensor and the light condition of the shooting scene. Moreover, more and more application scenarios need to be based on both visible light signals and infrared light signals, such as night video surveillance, live detection, and color-black and white dynamic fusion technology.
  • the imaging results of only visible light do not meet the requirements and require infrared light assistance;
  • the existing RGBIR sensor cannot achieve the separation of visible light signal and IR signal.
  • The light-sensing result of the visible light signal contains a certain amount of IR component, so the color information is not accurate; moreover, the exposure of the visible light components and the IR component is controlled jointly, so the sensor is prone to exposure imbalance and its light-sensing dynamic range is poor.
  • the embodiments of the present application provide an RGBIR image sensor, which can realize independent light-sensing of visible light and IR light, strip the IR component signal in the light-sensing result of the visible light signal, and improve the color accuracy of the light-sensing result of the sensor.
  • FIGS 2a and 2b show two exemplary 2X2 arrays of RGBIR sensors
  • Figures 3a and 3b show two exemplary 4X4 arrays of RGBIR sensors.
  • In these figures, each grid represents a pixel: R represents a red pixel, G a green pixel, B a blue pixel, and IR an infrared light pixel.
  • 2X2 array sorting means that the smallest repeating unit of the RGBIR four-component arrangement is a 2X2 array, and the 2X2 unit contains all of the R, G, B, and IR components; 4X4 array sorting means that the smallest repeating unit of the arrangement is a 4X4 array, and the 4X4 unit contains all of the components. It should be understood that other 2X2-sorted and 4X4-sorted RGBIR arrangements are also possible, and the embodiment of the present application does not limit the arrangement of the RGBIR sensor.
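  • As a concrete illustration only (the application does not fix the layout), one possible 2X2 repeating unit containing all four components could be written as the small matrix below; a 4X4-sorted sensor tiles a larger unit in the same way.

    # One hypothetical 2x2 RGBIR repeating unit; other orderings are equally valid.
    UNIT_2X2 = [
        ["R",  "G"],
        ["IR", "B"],
    ]

    def cfa_component(x, y, unit=UNIT_2X2):
        """Component of the pixel at column x, row y when the unit tiles the array."""
        n = len(unit)
        return unit[y % n][x % n]

    print(cfa_component(3, 2))  # 'G' for this hypothetical layout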
  • FIG. 4 is a schematic structural diagram of an exemplary RGBIR sensor provided by an embodiment of this application.
  • the RGBIR sensor includes a filter layer, a microlens 403, a pixel array 404, and a charge readout module 405.
  • The filter layer includes an infrared cut filter layer 401, a red filter layer 402R, a green filter layer 402G, a blue filter layer 402B, and an infrared filter layer 402IR; the pixel array 404 includes red pixels R, green pixels G, blue pixels B, and infrared light pixels IR.
  • the infrared light cut filter layer 401 may also be referred to as IR-Cut, and is used to cut off optical signals with a wavelength greater than a first preset wavelength, and the optical signals with a wavelength greater than the first preset wavelength include infrared optical signals.
  • Exemplarily, the first preset wavelength is 650 nm: the infrared light cut filter layer 401 is used to cut off optical signals with a wavelength greater than 650 nm, and such optical signals include infrared light. The typical wavelength of visible light is about 430 nm to 650 nm, and the typical wavelength of the infrared light sensed by the IR pixel is about 850 nm to 920 nm; IR-Cut therefore cuts off light signals with a wavelength greater than 650 nm, so that infrared light in the range of about 850 nm to 920 nm cannot enter the red, green, or blue pixels.
  • The photosensitive characteristic of light passing through the red filter layer 402R into the red pixel is shown by the thin black solid line R in Figure 1: the red pixel has two photosensitive intensity peaks, near 650 nm (red light) and near 850 nm (IR light). The characteristic of the green filter layer 402G is shown by the short dashed line G in Figure 1: the green pixel has peaks near 550 nm (green light) and 850 nm (IR light). The characteristic of the blue filter layer 402B is shown by the dotted line B in Figure 1: the blue pixel has peaks near 450 nm (blue light) and 850 nm (IR light). The characteristic of the infrared filter layer 402IR is shown by the long dashed line IR in Figure 1: the IR pixel has only one photosensitive intensity peak, near 850 nm (910 nm). From this it follows that the red filter layer 402R transmits both red light and IR light in the first wavelength range, the green filter layer 402G transmits both green light and IR light in the second wavelength range, and the blue filter layer 402B transmits both blue light and IR light in the third wavelength range. The first, second, and third wavelength ranges may be the same or different, and the wavelengths of the infrared light in all three ranges are greater than the first preset wavelength.
  • The infrared filter layer 402IR transmits only IR light in a specific wavelength range. The specific wavelength range may be 850 nm to 920 nm, or it may be centered on any particular wavelength in or near that range; accordingly, the IR pixel may mainly sense 850 nm IR light, mainly sense 910 nm IR light, or sense infrared light of any particular wavelength in or near the 850 nm to 920 nm range. The embodiment of the present application does not limit this.
  • the micro lens 403 is a tiny convex lens device on each photosensitive pixel of the sensor, which is used to concentrate the input light into each photosensitive pixel.
  • the micro lenses corresponding to the red pixels, green pixels, and blue pixels are respectively coated with an infrared cut filter layer 401, so light exceeding 650 nm cannot enter the red pixels, green pixels, and blue pixels.
  • the micro lens corresponding to the red pixel is also coated with a red filter layer 402R, so only red light near 650 nm enters the red pixel, and the red pixel can only receive red light.
  • the micro lens corresponding to the green pixel is also coated with a green filter layer 402G, so only the green light near 550 nm enters the green pixel, and the green pixel can only receive green light.
  • The micro lens corresponding to the blue pixel is also coated with a blue filter layer 402B, so that only blue light near 450 nm enters the blue pixel, and the blue pixel can receive only blue light.
  • the micro lens corresponding to the infrared light pixel is coated with an infrared light filter layer 402IR, so that only near-infrared light near 850 nm or 910 nm enters the infrared light pixel, and the infrared light pixel can only receive IR light.
  • An infrared light cut filter layer is coated on the micro lenses corresponding to the red, green, and blue pixels, which cuts off the IR light reaching these pixels and removes the IR component signal from the light-sensing results of the visible light pixels; the color of the light-sensing result is therefore more accurate, and the light-sensing effect of the sensor is improved.
  • The embodiment of the present application coats the infrared light cut filter layer onto the micro lens using coating technology: on the one hand, no complicated mechanical structure needs to be added; on the other hand, the pixel itself under the micro lens is not changed. The pixels under the micro lens contain only photosensitive devices such as photodiodes, and the relatively simple and stable internal structure of the pixels helps control the chief ray angle (CRA) and other factors that affect imaging. Because the filter layers are coated on the micro lens, the light-sensing effect of the sensor is improved while the structure of the pixel itself remains stable.
  • The inner wall of a pixel is not smooth; it has some protrusions. If the incident angle of the light deviates from the chief ray path of the micro lens, part of the light is blocked by the protrusions on the inner wall of the pixel, and the light-sensing performance of the sensor decreases.
  • The CRA of the pixel located at the optical center of the sensor is 0 degrees, and the CRA of a pixel increases the farther it is from the optical center. If the offset distance of a pixel from the center is taken as the abscissa and its CRA as the ordinate, the relationship between the offset distance and the CRA is a linear function. To make the sensor conform to this law of consistent CRA behavior, the position of each pixel's micro lens needs to be fine-tuned according to the pixel's position: the micro lens of a pixel at the optical center sits directly above that pixel, while the micro lens of a pixel away from the optical center is not directly above it, and the farther the pixel is from the optical center, the larger the offset of its micro lens. If the internal structure of the pixel under the micro lens is complicated, CRA behavior easily becomes inconsistent, and the method of fine-tuning the micro lens position on the pixel surface may no longer apply.
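  • A minimal sketch of the linear relationship described above follows, assuming an arbitrary slope; the actual slope is a design parameter of a given lens and sensor pair, not a value given in the application.

    def cra_degrees(offset_from_center_mm, degrees_per_mm=5.0):
        """CRA grows linearly with a pixel's offset from the optical center;
        the pixel at the optical center has a CRA of 0 degrees. The slope used
        here is an illustrative assumption."""
        return degrees_per_mm * offset_from_center_mm

    print(cra_degrees(0.0))  # 0.0 degrees at the optical center
    print(cra_degrees(2.4))  # 12.0 degrees for a pixel 2.4 mm off-center (with this slope)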
  • The filter layers added by the sensor provided in the embodiment of the application are coated on the micro lens and do not change the internal structure of the pixel. The internal structure of the pixel remains simple and stable, and the light-sensing effect of the sensor is improved without affecting its CRA behavior.
  • Each pixel in the pixel array 404 includes a photosensitive device, for example, the photosensitive device may be a photodiode, which is used to convert an optical signal into an electrical signal or convert an optical signal into an electric charge.
  • the charge readout module 405 is used to read out the charge accumulated by the photosensitive device and output it to the subsequent image processing circuit or image processor.
  • the charge readout module is similar to a buffer area, the charge accumulated by the photosensitive device is transferred and temporarily buffered in the charge readout module, and the charge signal of the corresponding pixel is output under the control of the readout signal.
  • In the sensor shown in FIG. 4, the infrared cut filter layer is coated on top of the red filter layer, the green filter layer, and the blue filter layer. In an optional case, the red, green, and blue filter layers may instead be coated on top of the infrared cut filter layer, as in the RGBIR sensor shown in Fig. 5; the other parts of the sensor shown in Fig. 5 are the same as in Fig. 4 and are not repeated here.
  • the image sensor provided in the embodiments of the present application does not limit the positional relationship of the infrared cut filter layer and the red filter layer, the green filter layer and the blue filter layer coated on the micro lens.
  • In an optional case, only the infrared cut filter layer is coated on the micro lens, while the red filter layer is fabricated in the red pixels, the green filter layer in the green pixels, the blue filter layer in the blue pixels, and the infrared filter layer in the IR pixels. In another optional case, the red filter layer, the green filter layer, the blue filter layer, and the infrared filter layer are coated on the micro lens, while the infrared cut filter layer is fabricated inside the red pixels, green pixels, and blue pixels.
  • FIG. 6 is a schematic structural diagram of another exemplary RGBIR sensor provided in an embodiment of this application. The sensor includes a filter 601, a filter layer, micro lenses 604, a pixel array 605, and a charge readout module 606. The filter layer includes an infrared cut filter layer 602, a red filter layer 603R, a green filter layer 603G, a blue filter layer 603B, and an infrared filter layer 603IR; the pixel array 605 includes red pixels R, green pixels G, blue pixels B, and infrared light pixels IR.
  • the filter 601 is used to filter ultraviolet light and infrared light with a wavelength greater than the second preset wavelength, so that visible light and part of the infrared light pass through the filter.
  • the second preset wavelength is greater than the first preset wavelength and any wavelength in the specific wavelength range passed by the infrared light filter layer.
  • the infrared light with a wavelength greater than the second preset wavelength may be referred to as far-infrared light, and the wavelength of the far-infrared light is greater than the wavelength of the infrared light allowed by the infrared light filter layer.
  • the visible light rays have a wavelength of about 430 nm to 650 nm, and the typical wavelength range of the infrared light rays of the IR pixel is about 850 nm to 920 nm.
  • the second preset wavelength may be, for example, 900 nm, or 920 nm, or may also be any wavelength between 850 nm and 950 nm.
  • the filter may be an all-pass filter or a dual-pass filter.
  • An exemplary all-pass filter filters out ultraviolet light with a wavelength below 400 nm and infrared light with a wavelength above 900 nm. An exemplary dual-pass filter passes only visible light and infrared light in the range of 800 nm to 900 nm, which is equivalent to filtering out infrared light above 900 nm; in an optional case, the dual-pass filter passes only visible light and infrared light in the range of 900 nm to 950 nm, which is equivalent to filtering out infrared light above 950 nm. It should be understood that the wavelengths filtered out by the all-pass filter and passed by the dual-pass filter can be designed as required, which is not limited in the embodiment of the present application.
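  • The sketch below simply checks, for the example pass-bands given above, whether a given wavelength survives the all-pass or dual-pass filter; the cut-off values are the example figures from this paragraph and can of course be chosen differently in a real design.

    def passes_all_pass(wavelength_nm):
        """Example all-pass filter: blocks UV below 400 nm and IR above 900 nm."""
        return 400.0 <= wavelength_nm <= 900.0

    def passes_dual_pass(wavelength_nm):
        """Example dual-pass filter: visible light plus IR in 800-900 nm only."""
        visible = 430.0 <= wavelength_nm <= 650.0
        near_ir = 800.0 <= wavelength_nm <= 900.0
        return visible or near_ir

    for wl in (380, 550, 700, 850, 950):
        print(wl, passes_all_pass(wl), passes_dual_pass(wl))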
  • the filter 601 can prevent the long-wavelength far-infrared light and ultraviolet light from affecting the photosensitive characteristics of the photosensitive device.
  • The filter layer is the same as in the sensors shown in FIGS. 4 and 5: the infrared cut filter layer 602, the red filter layer 603R, the green filter layer 603G, the blue filter layer 603B, and the infrared filter layer 603IR are all coated on the micro lens, and the vertical order of the infrared cut filter layer 602 and the visible-light filter layers 603R-603B is not limited.
  • The wavelength of the infrared light filtered out by the filter is greater than the wavelength of the infrared light passed by the infrared filter layer in the filter layer, and the infrared cut filter layer filters out all light beyond the visible range, thereby preventing infrared light from entering the red, blue, and green pixels.
  • For the pixel array 605 and the charge readout module 606, refer to the description of the embodiment corresponding to FIG. 4, which is not repeated here.
  • An embodiment of the present application also provides a sensor capable of independently controlling the exposure times of visible light pixels and infrared light pixels, as shown in FIG. 7a and FIG. 7b, where FIG. 7a is an exemplary schematic diagram of the RGBIR control connection with 2X2 array sorting and FIG. 7b is an exemplary schematic diagram of the RGBIR control connection with 4X4 array sorting.
  • the sensor includes a pixel array 710 and a logic control circuit 720.
  • the pixel array 710 is the pixel array in the sensor as shown in any of the embodiments in FIGS. 4 to 6, for example, the pixel array 404, the pixel array 504 or the pixel array 605.
  • the logic control circuit 720 is used to independently control the exposure time of the visible light pixels and the infrared light pixels.
  • the visible light pixels include red pixels, green pixels, and blue pixels.
  • the logic control circuit 720 includes a first control line and a second control line, or in other words, includes two independent control circuits: a first control circuit and a second control circuit.
  • the visible light pixels R, G, and B in the pixel array 710 are coupled to the first control line, and the IR pixels in the pixel array 710 are coupled to the second control line.
  • The control lines with the same name in FIGS. 7a and 7b are the same line or are connected to each other: the first control line on the pixel array side and the first control line of the logic control circuit are the same line or are connected, and the second control line on the pixel array side and the second control line of the logic control circuit are the same line or are connected.
  • The coordinate conditions are expressed with the remainder operation %: a pixel whose coordinates satisfy the visible light condition is coupled to the first control line, and a pixel whose coordinates satisfy the IR condition is coupled to the second control line. It should be understood that when the RGBIR array is arranged differently, the coordinate conditions of the R, G, B, and IR pixels change accordingly, so the connection between the logic control circuit and the pixel array needs to be designed according to the arrangement of the sensor.
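  • A minimal sketch of such a coordinate condition follows, assuming the hypothetical 2X2 layout used earlier in which the IR pixel occupies the (even column, odd row) position; the actual conditions depend entirely on the chosen arrangement.

    def is_ir_position(x, y):
        """Hypothetical coordinate condition for a 2x2-sorted RGBIR array;
        % is the remainder operation."""
        return x % 2 == 0 and y % 2 == 1

    def route_reset(x, y, reset):
        """Route the logic control circuit's reset signal to the control line
        matching the pixel's coordinate condition."""
        if is_ir_position(x, y):
            return {"first_control_line": False, "second_control_line": reset}
        return {"first_control_line": reset, "second_control_line": False}

    print(route_reset(0, 1, True))  # IR position -> second control line
    print(route_reset(1, 0, True))  # visible-light position -> first control line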
  • The first control line outputs a first control signal, and the second control line outputs a second control signal; the first control signal is used to control the exposure start time of the visible light pixels, and the second control signal is used to control the exposure start time of the infrared light pixels.
  • the first control signal and the second control signal are independent of each other, so the exposure start time of the visible light pixel and the infrared light pixel may be different. Exemplarily, when the first effective transition edge of the first control signal arrives, the visible light pixel starts to expose, and when the second effective transition edge of the second control signal arrives, the infrared light pixel starts to expose.
  • the effective transition edges of the first control signal and the second control signal can be both falling edges or rising edges, or one can be a falling edge and the other can be a rising edge.
  • The embodiment of the present application does not limit the effective transition edge of the control signal. FIG. 8 shows an exemplary control signal timing diagram in which the effective transition edges of the first control signal and the second control signal are both falling edges. In an optional case, the first control signal and the second control signal can both be derived from the system reset signal of the logic control circuit. As shown in Figure 8, the first control signal and the second control signal are both high-level active signals: when the falling edge of the first control signal arrives, the visible light pixels start to be exposed, and when the falling edge of the second control signal arrives, the infrared light pixels start to be exposed.
  • the logic control circuit 720 further includes a reset signal.
  • the reset signal may be a system clock signal, and the first control signal and the second control signal may both be obtained from the reset signal.
  • the logic control circuit 720 includes a logic operation circuit.
  • the logic operation circuit may include logic operations such as AND, OR, NOT, XOR, etc.
  • the logic operation circuit includes three inputs: a variable x, a variable y, and a reset signal.
  • the logic operation circuit includes two output terminals: a first control line and a second control line.
  • If the variable x and the variable y meet the coordinate condition of a visible light pixel, the reset signal is connected to the output terminal of the first control line; if the variable x and the variable y meet the coordinate condition of an infrared light pixel, the reset signal is connected to the output terminal of the second control line.
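  • As a rough, non-limiting illustration of this routing behaviour, the following Python sketch gates a reset signal onto either the first or the second control line according to a coordinate condition. The parity condition used for the infrared light pixel position is an assumption for one possible 2X2 arrangement and is not taken from the figures; the real condition depends on the CFA layout.

```python
# Illustrative sketch only: it mimics the behaviour described for the logic
# operation circuit, assuming a 2X2 RGBIR arrangement in which the IR pixel
# occupies the position with x % 2 == 1 and y % 2 == 1 (this parity condition
# is an assumption, not taken from the patent text).

def is_ir_pixel(x: int, y: int) -> bool:
    """Coordinate condition of the infrared light pixel (assumed arrangement)."""
    return x % 2 == 1 and y % 2 == 1

def route_reset(x: int, y: int, reset: bool) -> dict:
    """Route the reset signal to the first or second control line."""
    first_line = reset and not is_ir_pixel(x, y)   # visible light pixels (R, G, B)
    second_line = reset and is_ir_pixel(x, y)      # infrared light pixels
    return {"first_control_line": first_line, "second_control_line": second_line}

# Example: with reset asserted, pixel (0, 0) is driven through the first
# control line and pixel (1, 1) through the second control line.
print(route_reset(0, 0, True))  # {'first_control_line': True, 'second_control_line': False}
print(route_reset(1, 1, True))  # {'first_control_line': False, 'second_control_line': True}
```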
  • the logic control circuit 720 further includes:
  • the exposure end control line is used to uniformly control the exposure stop time of all pixels in the pixel array.
  • the exposure end control line outputs the exposure end signal.
  • the exposure end signal can be a high-level active signal or a low-level active signal.
  • The exposure end time point can be the falling edge of a high-level signal or the rising edge of a low-level signal. In the control signal timing chart shown in FIG. 8, the exposure end control signal is a high-level active signal.
  • The exposure time of the visible light pixel is the time difference between the falling edge of the first control signal and the falling edge of the exposure end control signal: the first exposure time. The exposure time of the IR pixel is the time difference between the falling edge of the second control signal and the falling edge of the exposure end control signal: the second exposure time. Therefore, the exposure times of the visible light pixels and the IR pixels are controlled independently.
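  • The relationship between the transition edges and the resulting exposure times amounts to a simple subtraction; the following sketch is illustrative only, with arbitrary example timestamps rather than values taken from the figures.

```python
# Minimal sketch: exposure time = exposure-end falling edge minus the falling
# edge of the corresponding start-control signal. Timestamps are arbitrary
# example values in microseconds, not taken from the patent figures.

t_first_falling = 100.0    # falling edge of the first control signal (visible pixels start)
t_second_falling = 160.0   # falling edge of the second control signal (IR pixels start)
t_end_falling = 220.0      # falling edge of the exposure end control signal (all pixels stop)

first_exposure_time = t_end_falling - t_first_falling    # visible light pixels: 120.0
second_exposure_time = t_end_falling - t_second_falling  # infrared light pixels: 60.0

print(first_exposure_time, second_exposure_time)
```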
  • In a low-light scene, if the RGBIR sensor is used directly for exposure, the visible light energy is small and the signal-to-noise ratio of the light-sensing result is low. If an IR lamp is used as fill light, the IR light is much stronger than the visible light, so under the same exposure time the visible light energy that can be captured is much less than the IR light energy; if the amount of visible light information is forcibly increased, the IR light will be overexposed, and an image with unbalanced exposure loses a lot of effective information. If visible light and IR light are exposed separately, the exposure time of the IR light can be reduced and the exposure time of the visible light relatively prolonged, which effectively improves the detail information contained in the light-sensing result of the visible light signal.
  • In an optional case, the exposure times of the visible light pixels and the infrared light pixels meet a preset ratio. For example, when the ratio of the exposure time of the visible light signal to that of the infrared light signal is 2:1, the definition of the exposure result is better and the signal-to-noise ratio is higher; to achieve this, the control signal of the visible light signal transitions first and the control signal of the infrared light signal transitions later, and the time difference between the two transition points makes the exposure time of the visible light signal and the exposure time of the infrared light signal meet the preset ratio.
  • In the embodiment of the present application, the exposure times of the visible light pixels and the IR pixels are controlled independently. For example, when the infrared light is too strong and the visible light is too weak, the exposure time for visible light can be increased and the exposure time for infrared light reduced, so that the exposure of visible light and infrared light tends to be balanced. This avoids exposure imbalance when either infrared light or visible light is the dominant component, and improves the dynamic range of the sensor's light sensing so as to meet the user's requirements for sharpness and signal-to-noise ratio. Further, by accurately setting the exposure time ratio of the visible light signal to the infrared light signal, the light-sensing effect of the sensor can be controlled more finely.
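  • As an illustration of setting such a ratio, the following sketch computes when each start-control signal should transition given a common exposure end time; the 2:1 ratio and the numeric values are example assumptions, not values from the disclosure.

```python
# Sketch under assumptions: given a common exposure-end time and a desired
# visible:IR exposure-time ratio, compute when each start-control signal
# should transition. The 2:1 ratio and the time values are example values.

def start_times(t_end: float, ir_exposure: float, ratio: float = 2.0):
    """Return (visible_start, ir_start) so that visible:IR exposure = ratio:1."""
    visible_exposure = ratio * ir_exposure
    return t_end - visible_exposure, t_end - ir_exposure

t_end = 300.0                       # falling edge of the exposure end control signal
vis_start, ir_start = start_times(t_end, ir_exposure=80.0, ratio=2.0)
print(vis_start, ir_start)          # 140.0 220.0 -> visible exposes 160, IR exposes 80
```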
  • the logic control circuit 720 further includes:
  • the charge transfer control line is used to control the time point when the charge accumulated by the photosensitive device of the pixel array is transferred to the charge readout module.
  • the charge transfer control signal is output in the charge transfer control line, and the charge transfer control signal can be a high-level effective signal or a low-level effective signal. In the control signal timing diagram shown in FIG. 8, the charge transfer control signal is a high-level effective signal. When the falling edge of the charge transfer control signal arrives, the accumulated charge is transferred from the photosensitive device to the charge readout module. In an optional situation, after the charge transfer control signal is reset, the exposure end control signal is reset again.
  • the functions of the logic control circuit may also be implemented by software codes running on the processor, or the functions of the logic control circuit may be partially implemented by hardware circuits and partially implemented by software modules.
  • the sensor may include a pixel array and a control unit.
  • the control unit is a software module running on a processor.
  • the control unit includes a first control unit and a second control unit for independently controlling the exposure start times of the visible light pixels and the IR pixels; the control unit also includes an exposure end control unit for uniformly controlling the exposure end time of each pixel in the pixel array.
  • the control unit also includes a charge transfer control unit and a reset unit.
  • the reset unit is used to provide the above reset signal.
  • the function of the charge transfer control unit is similar to the charge transfer control line, which will not be repeated here.
  • An embodiment of the present application also provides a sensor capable of independently controlling the exposure times of the four RGBIR components, as shown in FIG. 9a and FIG. 9b, where FIG. 9a is a schematic diagram of the control connections of an exemplary RGBIR sensor arranged in a 2X2 array, and FIG. 9b is a schematic diagram of the control connections of an exemplary RGBIR sensor arranged in a 4X4 array.
  • the sensor includes a pixel array 910 and a logic control circuit 920.
  • the pixel array 910 is the pixel array in the sensor as shown in any of the embodiments in FIGS. 4 to 6, such as the pixel array 404, the pixel array 504 or the pixel array 605.
  • the logic control circuit 920 is used to independently control the exposure time of the red pixel, the green pixel, the blue pixel and the infrared light pixel.
  • the logic control circuit 920 includes a first control line, a second control line, a third control line, and a fourth control line, or in other words, includes four independent control circuits: a first control circuit, a second control circuit, a third control circuit, and a fourth control circuit.
  • the red pixels in the pixel array are coupled to the first control line
  • the green pixels are coupled to the second control line
  • the blue pixels are coupled to the third control line
  • the infrared light pixels are coupled to the fourth control line. It should be understood that the control lines with the same name in FIGS. 9a and 9b are the same control line or are connected to each other: the first control line on the pixel array side and the first control line of the logic control circuit are the same line, the fourth control line on the pixel array side is the same line as the fourth control line of the logic control circuit, and so on.
  • The coordinates of the pixels coupled to the first control line meet the coordinate condition of the red pixels, the coordinates of the pixels coupled to the second control line meet the coordinate condition of the green pixels, the coordinates of the pixels coupled to the third control line meet the coordinate condition of the blue pixels, and the coordinates of the pixels coupled to the fourth control line meet the coordinate condition of the infrared light pixels. It should be understood that when the RGBIR arrays are arranged differently, the respective coordinate conditions of the RGBIR pixels change accordingly, so the connection between the logic control circuit and the pixel array needs to be designed according to the arrangement of the sensor. For the RGBIR sensor arranged in a 2X2 array as shown in FIG. 2b, the coordinates of the red pixels satisfy a condition expressed in terms of the remainders x%2 and y%2; one possible arrangement is sketched below.
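  • The specific coordinate expressions of FIG. 2b are not reproduced in this text. Purely as an illustration, the sketch below assumes one possible 2X2 RGBIR tiling and shows how each pixel coordinate would be mapped to one of the four control lines; the assumed positions of R, G, B, and IR are hypothetical.

```python
# Hedged sketch: the exact coordinate conditions of FIG. 2b are not reproduced
# here. This assumes one possible 2X2 RGBIR tiling -- R at (x even, y even),
# G at (x odd, y even), IR at (x even, y odd), B at (x odd, y odd) -- purely
# to illustrate how a pixel is assigned to one of the four control lines.

def pixel_type(x: int, y: int) -> str:
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "IR" if x % 2 == 0 else "B"

CONTROL_LINE = {"R": 1, "G": 2, "B": 3, "IR": 4}

# Example: list the control line driving each pixel of one 2x2 tile.
for y in range(2):
    for x in range(2):
        t = pixel_type(x, y)
        print(f"pixel ({x},{y}): {t} -> control line {CONTROL_LINE[t]}")
```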
  • the first control line outputs the first control signal
  • the second control line outputs the second control signal
  • the third control line outputs the third control signal
  • the fourth control line outputs the fourth control signal.
  • the first control signal is used to control the exposure start time of the red pixel
  • the second control signal is used to control the exposure start time of the green pixel
  • the third control signal is used to control the exposure start time of the blue pixel
  • the fourth control signal is used to control the exposure start time of the infrared light pixel.
  • the first to fourth control signals are independent of each other, so the exposure start time of the four components of RGBIR can be different.
  • Exemplarily, when the first valid transition edge of the first control signal arrives, the red pixel begins to be exposed; when the second valid transition edge of the second control signal arrives, the green pixel begins to be exposed; when the third valid transition edge of the third control signal arrives, the blue pixel begins to be exposed; and when the fourth valid transition edge of the fourth control signal arrives, the IR pixel begins to be exposed.
  • the first control signal to the fourth control signal can all be high-level effective signals
  • the valid transition edges of the first control signal to the fourth control signal can all be falling edges, all rising edges, or partly falling edges and partly rising edges; the embodiment of the present application does not limit the effective transition edge of the control signal.
  • FIG. 10 shows an exemplary control signal timing diagram.
  • the first control signal to the fourth control signal are all high-level effective signals, and the valid transition edges of the first control signal to the fourth control signal are all falling edges.
  • In an optional case, the first control signal to the fourth control signal can be obtained from the system reset signal of the logic control circuit. As shown in FIG. 10, when the falling edge of the first control signal arrives, the red pixels begin to be exposed; when the falling edge of the second control signal arrives, the green pixels begin to be exposed; when the falling edge of the third control signal arrives, the blue pixels begin to be exposed; and when the falling edge of the fourth control signal arrives, the IR pixels begin to be exposed.
  • the logic control circuit 920 further includes a reset signal, the reset signal may be a system clock signal, and the first control signal to the fourth control signal may all be obtained from the reset signal.
  • the logic control circuit 920 includes a logic operation circuit.
  • the logic operation circuit may include logical operations such as AND, OR, NOT, XOR, etc.
  • the logic operation circuit includes three inputs: a variable x, a variable y, and a reset signal.
  • the logic operation circuit includes four output terminals: the first control line to the fourth control line.
  • If the variable x and the variable y meet the coordinate condition of the red pixel, the reset signal is connected to the output terminal of the first control line; if the variable x and the variable y meet the coordinate condition of the green pixel, the reset signal is connected to the output terminal of the second control line; if the variable x and the variable y meet the coordinate condition of the blue pixel, the reset signal is connected to the output terminal of the third control line; and if the variable x and the variable y meet the coordinate condition of the infrared light pixel, the reset signal is connected to the output terminal of the fourth control line.
  • the logic control circuit 920 further includes:
  • the exposure end control line is used to uniformly control the exposure stop time of all pixels in the pixel array.
  • the exposure end control line outputs the exposure end signal.
  • the exposure end signal can be a high-level active signal or a low-level active signal.
  • the exposure end time point can be the falling edge of a high-level signal or the rising edge of a low-level signal.
  • the exposure end control signal is a high-level active signal.
  • The exposure time of the R pixel is the time difference between the falling edge of the first control signal and the falling edge of the exposure end control signal: the first exposure time; the exposure times of the G pixel, the B pixel, and the IR pixel are, respectively, the second exposure time, the third exposure time, and the fourth exposure time. Therefore, the exposure times of the four RGBIR components are controlled independently.
  • Further, the arrival times of the first effective transition edge of the first control signal through the fourth effective transition edge of the fourth control signal can be controlled so that the exposure times of the four RGBIR components meet a preset ratio.
  • the logic control circuit 920 further includes a charge transfer control line for controlling when to transfer the charge accumulated by the photosensitive device of the pixel array to the charge readout module.
  • the charge transfer control signal is output in the charge transfer control line, and the charge transfer control signal can be a high-level effective signal or a low-level effective signal.
  • the charge transfer control signal shown in FIG. 10 is the same as that shown in FIG. 8.
  • the functions of the logic control circuit may also be implemented by software codes running on the processor, or the functions of the logic control circuit may be partially implemented by hardware circuits and partially implemented by software modules.
  • the sensor may include a pixel array and a control unit, the control unit is a software module running on a processor, and the control unit includes a first control unit, a second control unit, a third control unit, and a fourth control unit, It is used to independently control the exposure start time of the four components of R, G, B and IR; the control unit also includes an exposure end control unit for uniformly controlling the exposure end time of the four components of the pixel.
  • the control unit also includes a charge transfer control unit and a reset unit. The reset unit is used to provide a reset signal.
  • the function of the charge transfer control unit is similar to the charge transfer control line, which will not be repeated here.
  • In the embodiment of the present application, the exposure times of the four components R, G, B, and IR are controlled independently, which further improves the dynamic range of the sensor and allows the light-sensing effect of each component to be adjusted, so that the final light-sensing result better meets the requirements of the scene or the customer's sharpness and signal-to-noise ratio requirements.
  • the exposure time of the four components of R, G, B, and IR may be preset to meet a preset ratio, so as to achieve fine control of the photosensitive effect of the sensor.
  • an embodiment of the present application also provides a sensor in which the exposure time of each pixel can be independently controlled.
  • the sensor includes a pixel array 1110 and a logic control circuit 1120.
  • the pixel array 1110 is the pixel array in the sensor as shown in any of the embodiments in FIGS. 4 to 6, such as the pixel array 404, the pixel array 504 or the pixel array 605.
  • the logic control circuit 1120 includes a row coordinate control circuit and a column coordinate control circuit, or in other words, includes a row coordinate control line and a column coordinate control line, and each pixel in the pixel array is coupled to its own row coordinate control line and column coordinate control line.
  • the logic control circuit 1120 also includes a reset signal and an exposure start control line.
  • When the row coordinate control signal output on the row coordinate line of the target pixel and the column coordinate control signal output on the column coordinate line are both valid signals, the exposure start control line outputs the reset signal to the target pixel, and the exposure start time of the target pixel is controlled based on the reset signal.
  • the exposure start control line has multiple branches, and each pixel is coupled to one branch.
  • the branch corresponding to the target pixel outputs an effective control signal .
  • In other words, the column coordinate control line and the row coordinate control line are equivalent to a switch whose input is the reset signal and whose output is the exposure start control line. When the signals on the column coordinate control line and the row coordinate control line are both valid, the switch is turned on, the reset signal is output to the target pixel through the exposure start control line, and the exposure of the target pixel is controlled: when the valid transition edge of the reset signal arrives, the target pixel starts to expose. If either the column coordinate control line or the row coordinate control line carries a signal that does not meet the requirement, the switch is turned off and the exposure start control line outputs no control signal.
  • Since each pixel in the pixel array has its own corresponding row coordinate line and column coordinate line, the exposure time of each pixel can be independently controlled; for example, the signals on the row coordinate line and the column coordinate line of pixels that need priority exposure can be preferentially set as valid signals, thereby extending the exposure time of the key exposure pixels.
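  • A minimal sketch of this per-pixel gating, assuming active-high row and column coordinate control signals, is given below.

```python
# Minimal sketch of the per-pixel gating described above: the reset signal is
# passed to a pixel's exposure-start branch only while both its row and column
# coordinate control signals are valid (active high in this sketch).

def exposure_start_branch(row_signal: bool, col_signal: bool, reset: bool) -> bool:
    """The 'switch': conducts the reset signal only when row AND column are valid."""
    return reset and row_signal and col_signal

# A pixel whose row line is valid but whose column line is not receives nothing.
print(exposure_start_branch(True, True, True))   # True  -> this pixel starts exposing
print(exposure_start_branch(True, False, True))  # False -> no control signal output
```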
  • the logic control circuit 1120 further includes: an exposure end control line for uniformly controlling the exposure end time of each pixel in the pixel array.
  • the logic control circuit 1120 further includes: a charge transfer control line for controlling when the charge accumulated in the photosensitive device is transferred to the charge readout module.
  • the functions of the logic control circuit may also be implemented by software codes running on the processor, or the functions of the logic control circuit may be partially implemented by hardware circuits and partially implemented by software modules.
  • the sensor may include a pixel array and a control unit, the control unit being a software module running on a processor; the control unit includes a row control unit, a column control unit, and an exposure start control unit, where the row control unit and the column control unit are used to indicate the row coordinate and the column coordinate of a pixel, and the exposure start control unit is used to output an effective control signal to control the exposure start time of the target pixel when the row control unit and the column control unit of the target pixel both meet the requirements.
  • The sensor provided in the embodiments of the present application can control the exposure start time of each pixel according to the control signals on that pixel's row coordinate control line and column coordinate control line, while the exposure end time is uniformly controlled by the exposure end control line, so the exposure time of each pixel can be different. Further, the time at which the row coordinate control signal and the column coordinate control signal corresponding to a pixel both become valid can be set so that the exposure time of each pixel meets a preset ratio. In scenes where pixels in a target area need to be enhanced, only the exposure time of the pixels in the target area can be increased, which further improves the flexibility of the sensor's light sensing and better satisfies the user's requirements for the light-sensing result.
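  • As a rough illustration of enhancing a target area, the following sketch builds a per-pixel exposure-start map in which pixels inside an assumed region of interest start earlier and therefore expose longer; the array size, region, and time values are example assumptions.

```python
# Illustrative sketch only: build a per-pixel exposure-start map so that pixels
# inside a target region start earlier (and therefore expose longer) than the
# rest of the array. The array size, region, and times are example values.

def start_time_map(width, height, region, t_start_long, t_start_short):
    """region = (x0, y0, x1, y1); an earlier start means a longer exposure."""
    x0, y0, x1, y1 = region
    return [[t_start_long if (x0 <= x <= x1 and y0 <= y <= y1) else t_start_short
             for x in range(width)] for y in range(height)]

t_end = 300.0
starts = start_time_map(4, 4, region=(1, 1, 2, 2), t_start_long=100.0, t_start_short=200.0)
exposures = [[t_end - s for s in row] for row in starts]
print(exposures)  # central 2x2 region exposes for 200, the rest for 100
```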
  • FIG. 12 shows an exemplary control signal timing diagram.
  • FIG. 12 uses two pixels as an example to illustrate the control of the exposure start control signal on the pixel exposure start time.
  • the signals in the timing diagram are all high-level effective, and it should be understood that each control signal may also be low-level effective.
  • the first pixel is coupled to the first row coordinate control line and the first column coordinate control line.
  • the signal in the first row coordinate control line is row coordinate control signal 1
  • the signal in the first column coordinate control line is column coordinate control signal 1.
  • the second pixel is coupled to the second row coordinate control line and the second column coordinate control line, the signal in the second row coordinate control line is row coordinate control signal 2, and the signal in the second column coordinate control line is the column coordinate control signal 2.
  • When the row coordinate control signal 1 and the column coordinate control signal 1 of the first pixel are both high, the exposure start control signal of the first pixel is valid: the reset signal is used as the exposure start control signal, and when the falling edge of the reset signal arrives, the first pixel is controlled to start exposure. When the row coordinate control signal 2 and the column coordinate control signal 2 of the second pixel are both high, the exposure start control signal of the second pixel is valid: the reset signal is used as the exposure start control signal, and when the falling edge of the reset signal arrives, the second pixel is controlled to start exposure. When the falling edge of the exposure end control signal arrives, both the first pixel and the second pixel stop exposure. Thus the exposure time of the first pixel is the first exposure time, and the exposure time of the second pixel is the second exposure time.
  • The exposure start control signal of the first pixel and the exposure start control signal of the second pixel may be two different branches of the same signal: when the row and column coordinate control signals of the first pixel are both valid, the branch corresponding to the first pixel outputs an effective control signal, and when the row and column coordinate control signals of the second pixel are both valid, the branch corresponding to the second pixel outputs an effective control signal.
  • FIG. 13 is a graph of the photosensitive characteristic curves of the pixels of the RGBIR sensor provided by the embodiment of this application.
  • the abscissa is the wavelength of the light
  • the unit is nm
  • the ordinate is the sensitivity of light.
  • the thin solid line is the photosensitive characteristic curve of the R pixel
  • the short dashed line is the photosensitive characteristic curve of the G pixel
  • the dotted line is the photosensitive characteristic curve of the B pixel
  • the long dashed line is the photosensitive characteristic curve of the IR pixel.
  • the R pixel only has a photosensitive intensity peak near 650nm in red light
  • the G pixel only has a photosensitive intensity peak near 550nm in green light
  • the B pixel only has a photosensitive intensity peak near blue light at 450nm.
  • IR pixels only have a peak in infrared light.
  • Compared with the existing RGBIR sensor, the RGBIR sensor provided in the embodiments of the present application removes the IR component from the R, G, and B pixel light-sensing results, so that the R pixel receives only red light, the G pixel receives only green light, and the B pixel receives only blue light, which improves the color accuracy of the sensor's light-sensing results.
  • the sensors provided in the embodiments of the present application can be used in community security surveillance cameras, intelligent transportation electronic eye devices, video cameras, cameras, mobile phones, and other terminal devices with imaging, photographing, or video recording functions.
  • FIG. 14a is a schematic diagram of an exemplary RGBW (red green blue white) sensor arranged in a 2X2 array.
  • FIG. 14b is a schematic diagram of an exemplary RCCB (red clear clear blue) sensor arranged in a 2X2 array.
  • each grid represents a pixel.
  • R represents a red pixel
  • G represents a green pixel
  • B represents a blue pixel
  • W represents a white light pixel.
  • RCCB sensor R represents red pixels
  • B represents blue pixels
  • C represents clear pixels.
  • the RCCB sensor replaces G pixels in RGB sensors with C pixels.
  • The wavelength range allowed to pass by the C pixel is between 400nm and 657nm; the amount of light that a clear pixel can pass is large, and the C pixel matches the current demosaic algorithm.
  • the embodiment of the present application also provides an RGBW sensor capable of independently controlling the exposure time of visible light pixels and W pixels.
  • the exposure control of the RGBW sensor can be implemented by a logic control circuit or a control unit, which can be a software module running on a processor.
  • the control logic of the independent exposure of the RGBW sensor is similar to the control logic of the independent exposure of the RGBIR sensor, and the IR pixels can be replaced with W pixels.
  • the logic control circuit for the independent exposure of visible light pixels and W pixels of the RGBW sensor refers to the logic control circuit of the RGBIR sensor as shown in FIG. 7a, which will not be repeated here.
  • an embodiment of the present application also provides an RGBW sensor capable of independently controlling the exposure time of the four components of RGBW.
  • the exposure control can be implemented by a logic control circuit or a control unit, and the control unit can be a software module running on a processor.
  • the control logic of RGBW sensor 4-component independent exposure is similar to the control logic of RGBIR sensor 4-component independent exposure.
  • For the logic control circuit of the 4-component independently exposed RGBW sensor, refer to the logic control circuit of the RGBIR sensor shown in FIG. 9a, which will not be repeated here.
  • the embodiment of the present application also provides an RCCB sensor capable of independently controlling the exposure time of visible light pixels and C pixels.
  • the exposure control of the RCCB sensor can be completed by a logic control circuit or a control unit.
  • For the logic control circuit of the RCCB sensor in the embodiment of this application, refer to the logic control circuit 720, where the first control line is used to control the R pixels and the B pixels, and the second control line is used to control the two C pixels.
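  • The following sketch illustrates this mapping under an assumed 2X2 RCCB tiling (R and B on one diagonal, the two C pixels on the other); the actual positions depend on the sensor's CFA arrangement.

```python
# Sketch under an assumed 2X2 RCCB tiling (R and B on one diagonal, the two C
# pixels on the other), mapping each pixel to the control line described above:
# the first control line drives the R and B pixels, the second drives the C pixels.

def rccb_control_line(x: int, y: int) -> int:
    on_rb_diagonal = (x % 2) == (y % 2)        # assumed positions of R and B
    return 1 if on_rb_diagonal else 2          # 1: R/B pixels, 2: C pixels

for y in range(2):
    for x in range(2):
        print(f"pixel ({x},{y}) -> control line {rccb_control_line(x, y)}")
```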
  • the embodiment of the present application also provides an RCCB sensor with independently controllable exposure times of the three components of R, B, and C.
  • the exposure control of the RCCB sensor can be completed by a logic control circuit or a control unit; here, the control unit is taken as an example for explanation.
  • the control unit 1500 includes a first control unit 1510, a second control unit 1520, and a third control unit 1530, where the first control unit 1510 is used to control the exposure start time of the R pixels, the second control unit 1520 is used to control the exposure start time of the B pixels, and the third control unit 1530 is used to control the exposure start time of the C pixels.
  • the control unit 1500 also includes an exposure end control unit 1540 for uniformly controlling the exposure end time of all pixels in the pixel array.
  • the control unit 1500 may also include a charge transfer control unit 1550 and a reset unit 1560.
  • the control unit may be a software module running on a general-purpose processor or a special-purpose processor.
  • the embodiment of the present application provides an independent exposure device for controlling the exposure time of the pixel array of the sensor.
  • the device includes at least two control units, each of which is used to control the exposure start time of a corresponding type of pixel in the pixel array of the sensor, and the pixel array of the sensor controlled by the device includes at least two types of pixels.
  • the independent exposure device can be regarded as a control device independent of the sensor, for example, it can be a general-purpose processor or a dedicated processor, or can be regarded as an independently solidified hardware logic or hardware circuit.
  • the independent exposure device can be considered as the logic control circuit in Fig. 7a and Fig. 7b, Fig. 9a and Fig. 9b, Fig. 11 or the control unit in Fig. 15.
  • FIG. 16 is a schematic diagram of the hardware architecture of an independent exposure apparatus provided by an embodiment of this application. It should be understood that the logic control circuits in FIGS. 7a, 7b, 9a, 9b, and 11 described above can all be implemented by software modules running on the independent exposure device shown in FIG. 16, and the control unit shown in FIG. 15 can also be implemented by a software module running on the exposure control device shown in FIG. 16.
  • the exposure control device 1600 includes: at least one central processing unit (Central Processing Unit, CPU), at least one memory, a microcontroller (Microcontroller Unit, MCU), a receiving interface, a transmitting interface, and the like.
  • the exposure control device 1600 further includes: a dedicated video or graphics processor, and a graphics processing unit (Graphics Processing Unit, GPU), etc.
  • the CPU may be a single-CPU processor or a multi-CPU processor; optionally, the CPU may be a processor group composed of multiple processors that are coupled to each other through one or more buses.
  • the exposure control can be partly completed by software codes running on a general-purpose CPU or MCU, and partly completed by hardware logic circuits; or it can also be completely completed by software codes running on a general-purpose CPU or MCU.
  • The memory 302 may be a non-volatile memory, such as an Embedded MultiMedia Card (EMMC), Universal Flash Storage (UFS), or Read-Only Memory (ROM), or other types of static storage devices that can store static information and instructions; it may also be a volatile memory, such as Random Access Memory (RAM), or other types of dynamic storage devices that can store information and instructions; it may also be an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other computer-readable storage medium that can be used to carry or store program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the receiving interface may be a data input interface of the processor chip.
  • the independent exposure device further includes a pixel array.
  • In an optional case, the independent exposure device includes at least two types of pixels; that is, the independent exposure device can be a sensor including a control unit or a logic control circuit, or in other words, the independent exposure device can be a sensor that independently controls exposure.
  • the independent exposure device may be an RGBIR sensor, an RGBW sensor, an RCCB sensor, etc. that independently control exposure.
  • In an optional case, visible light pixels are classified as one type of pixel, that is, R pixels, G pixels, and B pixels are classified as one type of pixel, and IR pixels, W pixels, or C pixels are considered another type of pixel.
  • RGBIR sensors include two types of pixels: visible light pixels and IR pixels
  • RGBW sensors include two types of pixels: visible light pixels and W pixels
  • RCCB sensors include two types of pixels: visible light pixels and C pixels.
  • each pixel component is considered to be a type of pixel.
  • the RGBIR sensor includes four types of pixels: R, G, B, and IR
  • the RGBW sensor includes four types of pixels: R, G, B, and W
  • the RCCB sensor includes three types of pixels: R, B and C.
  • the sensor is an RGBIR sensor.
  • the RGBIR sensor can realize the independent exposure of visible light pixels and IR pixels, and can also realize independent exposure of the four components of R, G, B, and IR.
  • the at least two control units include: a first control unit and a second control unit; the first control unit is used to control the exposure start time of the visible light pixels; the second control unit Used to control the exposure start time of IR pixels.
  • At least two control units include: a first control unit, a second control unit, a third control unit, and a fourth control unit; the first control unit is used for Control the exposure start time of R pixels; the second control unit is used to control the exposure start time of G pixels; the third control unit is used to control the exposure start time of B pixels; the fourth control unit is used to control the exposure of IR pixels Start time.
  • the sensor is an RGBW sensor.
  • the RGBW sensor can realize independent exposure of visible light pixels and W pixels, or can realize independent exposure of the four components of R, G, B, and W.
  • the at least two control units include: a first control unit and a second control unit; the first control unit is used to control the exposure start time of the visible light pixels; the second control unit Used to control the exposure start time of W pixels.
  • At least two control units include: a first control unit, a second control unit, a third control unit and a fourth control unit; the first control unit is used for Control the exposure start time of R pixels; the second control unit is used to control the exposure start time of G pixels; the third control unit is used to control the exposure start time of B pixels; the fourth control unit is used to control the exposure of W pixels Start time.
  • the sensor is an RCCB sensor.
  • the RCCB sensor can realize the independent exposure of visible light pixels and C pixels respectively, and can also realize independent exposure of the three components of R, B, and C.
  • the at least two control units include: a first control unit and a second control unit; the first control unit is used to control the exposure start time of the visible light pixels; the second control unit Used to control the exposure start time of C pixels.
  • At least two control units include: a first control unit, a second control unit and a third control unit; the first control unit is used to control the start of exposure of the R pixel Time; the second control unit is used to control the exposure start time of the B pixel; the third control unit is used to control the exposure start time of the C pixel.
  • the independent exposure device may also control the exposure time of the at least two types of pixels to meet a preset ratio based on at least two control units.
  • For example, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the IR pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, the third control unit, and the fourth control unit, the exposure times of the R, G, B, and IR pixels are controlled to meet the preset ratio.
  • Or, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the W pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, the third control unit, and the fourth control unit, the exposure times of the R, G, B, and W pixels are controlled to meet a preset ratio; or, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the C pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, and the third control unit, the exposure times of the R, B, and C pixels are controlled to meet the preset ratio.
  • the independent exposure device further includes: an exposure end control unit for uniformly controlling the exposure end time of all pixels in the pixel array.
  • This application also provides a method for image sensitivity.
  • FIG. 17 is a schematic flowchart of an exemplary image light-sensing method provided by an embodiment of this application.
  • The method is applied to an image sensor, the sensor including an infrared light cut filter layer, a plurality of micro lenses, and a pixel array; the pixel array includes red pixels, green pixels, blue pixels, and infrared light pixels, and each pixel in the pixel array corresponds to a micro lens; the infrared light cut filter layer is respectively coated on the micro lenses corresponding to the red pixels, the green pixels, and the blue pixels. The method includes:
  • the original light in nature passes through the filter to obtain the first light
  • the filter is used to filter out ultraviolet light and far-infrared light.
  • Far-infrared light is infrared light with a longer wavelength.
  • the infrared light with a wavelength greater than the second preset wavelength mentioned in the foregoing embodiment can be referred to as far-infrared light.
  • the wavelength of the far-infrared light is greater than the wavelength of the infrared light in the specific wavelength range allowed by the subsequent infrared light filter layer.
  • For the filter, please refer to the description of the filter on the device side, which will not be repeated here.
  • the first light passes through the infrared light filter layer and the micro lens to reach the infrared light pixel;
  • the first light passes through the infrared light cut filter layer, the red filter layer and the micro lens to reach the red pixel.
  • the first light passes through the infrared light cut filter layer, the green filter layer and the micro lens to reach the green pixel.
  • the first light reaches the blue pixel through the infrared cut filter layer, the blue filter layer and the micro lens.
  • steps 1702-1705 do not limit the execution order of the method. Steps 1702-1705 can usually be executed synchronously, or the steps may not be executed strictly synchronously, but there are some time differences between them. The embodiments of this application do not limit this.
  • The infrared light filter layer allows only infrared light in a specific wavelength range to pass; the red filter layer is used to pass only red light and infrared light in a first wavelength range; the green filter layer is used to pass only green light and infrared light in a second wavelength range; the blue filter layer is used to pass only blue light and infrared light in a third wavelength range; and the infrared light cut off by the infrared light cut filter layer includes the infrared light in the first wavelength range, the infrared light in the second wavelength range, and the infrared light in the third wavelength range.
  • In other words, the infrared light cut filter layer cuts off the infrared light that would otherwise enter the R pixel, the G pixel, and the B pixel, so that the R pixel, the G pixel, and the B pixel can receive only R light, G light, and B light respectively.
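  • The following sketch is a conceptual summary of the filter stack seen by each pixel type, not a radiometric model; the layer names are shorthand for the elements described above.

```python
# Conceptual sketch of the filter stack described above, not a radiometric model.
# It only indicates which light component reaches which pixel type after the
# front filter, the IR-cut layer, and the color (or IR) filter layer.

STACK = {
    "R":  ["front filter", "IR-cut layer", "red filter"],    # passes red only
    "G":  ["front filter", "IR-cut layer", "green filter"],  # passes green only
    "B":  ["front filter", "IR-cut layer", "blue filter"],   # passes blue only
    "IR": ["front filter", "IR filter"],                     # passes a narrow IR band only
}

REACHES_PIXEL = {"R": "red light", "G": "green light", "B": "blue light",
                 "IR": "infrared light in the specific wavelength range"}

for ptype, layers in STACK.items():
    print(f"{ptype} pixel: {' -> '.join(layers)} -> {REACHES_PIXEL[ptype]}")
```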
  • step 1701 is an optional step, and the original light from nature may not pass through the filter but directly enter the filter layer and the micro lens.
  • the infrared cut filter layer can be on top of the red filter layer, the green filter layer and the blue filter layer; the red filter layer, the green filter layer and the blue filter layer can also be on the infrared cut filter layer Above, the embodiment of the present application does not limit this.
  • the photosensitive device in the pixel converts the light entering the pixel into electric charge
  • Optionally, the method further includes: controlling the exposure start time of the visible light pixels based on the first control line, and controlling the exposure start time of the infrared light pixels based on the second control line.
  • visible light pixels and infrared light pixels can be independently exposed, which improves the light-sensing effect of the sensor.
  • the method further includes: controlling the exposure time of the visible light pixel and the infrared light pixel to meet a preset ratio based on the first control line and the second control line.
  • Optionally, the method further includes: controlling the exposure start time of the red pixels based on the first control line, the exposure start time of the green pixels based on the second control line, the exposure start time of the blue pixels based on the third control line, and the exposure start time of the infrared light pixels based on the fourth control line.
  • the four pixel components can be independently exposed, which improves the sensitivity of the sensor.
  • the method further includes: controlling the exposure times of the red pixels, the green pixels, the blue pixels, and the infrared light pixels to meet a preset ratio.
  • Optionally, each pixel in the sensor is coupled to a respective row coordinate control line and column coordinate control line, and each pixel corresponds to a branch of the exposure start control line. The method further includes: when the control signals output by the row coordinate control line and the column coordinate control line of a target pixel are both effective levels, the branch of the exposure start control line corresponding to the target pixel outputs a control signal, and the exposure start time of the target pixel is controlled based on this control signal, where the target pixel is any pixel in the pixel array.
  • each pixel can individually control the exposure time.
  • the method further includes: controlling the exposure end time of all pixels in the pixel array based on the exposure end control line.
  • A schematic flowchart of an exemplary method for independently controlling exposure time is described below.
  • the method is applied to a sensor including at least two types of pixels.
  • The at least two types of pixels include a first type of pixel and a second type of pixel. The method includes: controlling the exposure start time of the first type of pixels based on a first control unit, and controlling the exposure start time of the second type of pixels based on a second control unit.
  • the sensor may be an RGBIR sensor.
  • the first type of pixels are visible light pixels, the visible light pixels include R, G, and B pixels, and the second type of pixels are IR pixels.
  • the sensor may be an RGBW sensor.
  • the first type of pixels are visible light pixels, the visible light pixels include R, G, and B pixels, and the second type of pixels are W pixels.
  • the sensor may be an RCCB sensor.
  • the first type of pixels are visible light pixels, the visible light pixels include R and B pixels, and the second type of pixels are C pixels.
  • the first control unit and the second control unit are independent of each other, so the exposure start time of the first type pixel and the second type pixel are independently controlled. It should be understood that the first control unit and the second control unit may be implemented by a hardware logic circuit, or may be implemented by a software module running on a processor.
  • the at least two types of pixels further include: a third type of pixels; the method further includes: controlling the exposure start time of the third type of pixels based on the third control unit.
  • For example, the sensor is an RCCB sensor, the first type of pixels are R pixels, the second type of pixels are B pixels, and the third type of pixels are C pixels; the method specifically includes: controlling the exposure start time of the R pixels based on the first control unit; controlling the exposure start time of the B pixels based on the second control unit; and controlling the exposure start time of the C pixels based on the third control unit.
  • Optionally, the at least two types of pixels further include a third type of pixels and a fourth type of pixels, and the method further includes: controlling the exposure start time of the third type of pixels based on the third control unit, and controlling the exposure start time of the fourth type of pixels based on the fourth control unit.
  • For example, the sensor is an RGBIR sensor, the pixels of the first type are R pixels, the pixels of the second type are G pixels, the pixels of the third type are B pixels, and the pixels of the fourth type are IR pixels; the method specifically includes: controlling the exposure start time of the R pixels based on the first control unit; controlling the exposure start time of the G pixels based on the second control unit; controlling the exposure start time of the B pixels based on the third control unit; and controlling the exposure start time of the IR pixels based on the fourth control unit.
  • the sensor is an RGBW sensor, the pixels of the first type are R pixels, the pixels of the second type are G pixels, the pixels of the third type are B pixels, and the pixels of the fourth type are W pixels;
  • The method specifically includes: controlling the exposure start time of the R pixels based on the first control unit; controlling the exposure start time of the G pixels based on the second control unit; controlling the exposure start time of the B pixels based on the third control unit; and controlling the exposure start time of the W pixels based on the fourth control unit.
  • the method further includes: controlling the exposure time of each of the at least two types of pixels to meet a preset ratio.
  • For example, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the IR pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, the third control unit, and the fourth control unit, the exposure times of the R, G, B, and IR pixels are controlled to meet the preset ratio.
  • Or, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the W pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, the third control unit, and the fourth control unit, the exposure times of the R, G, B, and W pixels are controlled to meet a preset ratio; or, based on the first control unit and the second control unit, the exposure times of the visible light pixels and the C pixels are controlled to meet the preset ratio; or, based on the first control unit, the second control unit, and the third control unit, the exposure times of the R, B, and C pixels are controlled to meet the preset ratio.
  • the exposure start time of different types of pixels is independently controlled, and the exposure end time is uniformly controlled. Therefore, the exposure time between the pixels can be set to meet a preset ratio by setting the exposure start time of different pixels.
  • the method further includes: transferring the charge accumulated in the photosensitive device to the charge readout module based on the charge transfer control unit.
  • the embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores instructions that, when run on a computer or processor, cause the computer or the processor to execute part or all of the steps of any independent exposure control method provided in the embodiments of the present application.
  • the embodiments of the present application also provide a computer program product containing instructions which, when run on a computer or processor, cause the computer or the processor to execute part or all of the steps of any independent exposure control method provided in the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The embodiments of the present invention relate to an image sensor and an image light-sensing method. The sensor comprises a filter layer, a plurality of micro lenses, and a pixel array; the pixel array comprises red pixels, green pixels, blue pixels, and infrared pixels, and the micro lenses corresponding to the red pixels, the green pixels, and the blue pixels are coated with an infrared cut filter layer, so that infrared light cannot enter the red pixels, the green pixels, or the blue pixels, and the red pixels, the green pixels, and the blue pixels can respectively sense only red light, green light, and blue light. This filters the IR component out of the visible light light-sensing result, allows the R, G, B, and IR components to be sensed independently, and thereby significantly improves the light-sensing effect of the sensor.
PCT/CN2019/098481 2019-07-31 2019-07-31 Capteur d'image et procédé de photodétection d'image WO2021016900A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980001909.XA CN110574367A (zh) 2019-07-31 2019-07-31 一种图像传感器和图像感光的方法
PCT/CN2019/098481 WO2021016900A1 (fr) 2019-07-31 2019-07-31 Capteur d'image et procédé de photodétection d'image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/098481 WO2021016900A1 (fr) 2019-07-31 2019-07-31 Capteur d'image et procédé de photodétection d'image

Publications (1)

Publication Number Publication Date
WO2021016900A1 true WO2021016900A1 (fr) 2021-02-04

Family

ID=68786091

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/098481 WO2021016900A1 (fr) 2019-07-31 2019-07-31 Capteur d'image et procédé de photodétection d'image

Country Status (2)

Country Link
CN (1) CN110574367A (fr)
WO (1) WO2021016900A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113973181A (zh) * 2021-11-30 2022-01-25 维沃移动通信有限公司 图像传感器、摄像模组和电子设备

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021174425A1 (fr) * 2020-03-03 2021-09-10 华为技术有限公司 Capteur d'image et procédé de sensibilisation d'image
CN111447423A (zh) * 2020-03-25 2020-07-24 浙江大华技术股份有限公司 图像传感器、摄像装置及图像处理方法
CN114095672A (zh) * 2020-07-31 2022-02-25 北京小米移动软件有限公司 成像系统、方法及电子设备
CN112363180A (zh) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 一种成像测距传感器、方法、系统及存储介质
WO2022199413A1 (fr) * 2021-03-23 2022-09-29 北京灵汐科技有限公司 Réseau de détection de pixels et capteur visuel
CN113038046B (zh) * 2021-03-23 2023-07-25 北京灵汐科技有限公司 像素传感阵列和视觉传感器

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878690A (zh) * 2015-12-14 2017-06-20 比亚迪股份有限公司 图像传感器的成像方法、成像装置和电子设备
CN107360405A (zh) * 2016-05-09 2017-11-17 比亚迪股份有限公司 图像传感器、成像方法和成像装置

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105190374A (zh) * 2013-03-14 2015-12-23 富士胶片株式会社 固体摄像元件及其制造方法、红外光截止滤波器形成用硬化性组合物、照相机模块
US9666620B2 (en) * 2014-10-06 2017-05-30 Visera Technologies Company Limited Stacked filter and image sensor containing the same
WO2016208415A1 (fr) * 2015-06-26 2016-12-29 ソニー株式会社 Appareil d'inspection, appareil de détection, appareil de commande de sensibilité, procédé d'inspection, et programme
JP6760805B2 (ja) * 2016-09-13 2020-09-23 富士フイルム株式会社 赤外線吸収剤、組成物、膜、光学フィルタ、積層体、固体撮像素子、画像表示装置および赤外線センサ
CN106454148B (zh) * 2016-11-15 2019-07-12 天津大学 分块独立曝光cmos图像传感器像素结构及其控制方法
US10638054B2 (en) * 2017-01-25 2020-04-28 Cista System Corp. System and method for visible and infrared high dynamic range sensing
CN111988587B (zh) * 2017-02-10 2023-02-07 杭州海康威视数字技术股份有限公司 图像融合设备和图像融合方法
EP3637045B1 (fr) * 2017-06-07 2023-05-24 Sony Semiconductor Solutions Corporation Dispositif et procédé de traitement d'informations
JP7280681B2 (ja) * 2017-11-30 2023-05-24 ブリルニクス シンガポール プライベート リミテッド 固体撮像装置、固体撮像装置の駆動方法、および電子機器
CN112788249B (zh) * 2017-12-20 2022-12-06 杭州海康威视数字技术股份有限公司 图像融合方法、装置、电子设备及计算机可读存储介质
CN108965654B (zh) * 2018-02-11 2020-12-25 浙江宇视科技有限公司 基于单传感器的双光谱摄像机系统和图像处理方法
CN109922286A (zh) * 2019-03-21 2019-06-21 思特威(上海)电子科技有限公司 Cmos图像传感器及其成像方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878690A (zh) * 2015-12-14 2017-06-20 比亚迪股份有限公司 图像传感器的成像方法、成像装置和电子设备
CN107360405A (zh) * 2016-05-09 2017-11-17 比亚迪股份有限公司 图像传感器、成像方法和成像装置

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113973181A (zh) * 2021-11-30 2022-01-25 维沃移动通信有限公司 图像传感器、摄像模组和电子设备

Also Published As

Publication number Publication date
CN110574367A (zh) 2019-12-13

Similar Documents

Publication Publication Date Title
WO2021016900A1 (fr) Capteur d'image et procédé de photodétection d'image
US8471939B2 (en) Image sensor having multiple sensing layers
WO2021174425A1 (fr) Capteur d'image et procédé de sensibilisation d'image
US9521319B2 (en) Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US8478123B2 (en) Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
CN103220534B (zh) 图像撷取装置与方法
US9681057B2 (en) Exposure timing manipulation in a multi-lens camera
US9077916B2 (en) Improving the depth of field in an imaging system
US9729806B2 (en) Imaging systems with phase detection pixels
TWI567958B (zh) 雙像素大小彩色影像感測器及其製造方法
WO2016139134A1 (fr) Métadonnées de champ lumineux
WO2021073141A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et dispositif photographique
US20050146634A1 (en) Cameras, optical systems, imaging methods, and optical filter configuration methods
WO2020155739A1 (fr) Capteur d'image, procédé d'acquisition de données d'image à partir d'un capteur d'image, et dispositif de caméra
US9497427B2 (en) Method and apparatus for image flare mitigation
CN212628124U (zh) 暗景全彩功能图像传感器及其成像装置
US7813545B2 (en) Backlit subject detection in an image
JP7447947B2 (ja) 電子機器
US11418697B2 (en) Image sensor and photographing apparatus including the same
US11671714B1 (en) Motion based exposure control
US20230353881A1 (en) Methods and systems for shift estimation for one or more output frames
KR20230111379A (ko) 이미지 센서 및 이미징 장치
CN117676353A (zh) 一种图像信号处理方法及装置
JP2017188788A (ja) 電子機器、再生装置、記録プログラム、再生プログラムおよび撮像システム。

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020521459

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19939100

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19939100

Country of ref document: EP

Kind code of ref document: A1