WO2021212763A1 - High dynamic range image processing system and method, electronic device, and readable storage medium - Google Patents

High dynamic range image processing system and method, electronic device, and readable storage medium

Info

Publication number
WO2021212763A1
WO2021212763A1 (PCT/CN2020/119963)
Authority
WO
WIPO (PCT)
Prior art keywords
color
image
intermediate image
image data
dynamic range
Prior art date
Application number
PCT/CN2020/119963
Other languages
English (en)
French (fr)
Inventor
Yang Xin (杨鑫)
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd.
Publication of WO2021212763A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/133 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and computer-readable storage media.
  • a camera may be provided in an electronic device such as a mobile phone to realize a photographing function.
  • an image sensor for receiving light may be provided in the camera.
  • the image sensor may be provided with a filter array.
  • the embodiments of the present application provide a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and a computer-readable storage medium.
  • the embodiment of the present application provides a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, an image fusion module, a high dynamic range image processing module and an image processor.
  • the image sensor includes a pixel array including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels.
  • the pixel array includes a minimum repeating unit, each of the minimum repeating units includes a plurality of sub-units, and each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array is exposed at a first exposure time to obtain a first original image
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array is exposed at a second exposure time to obtain a second original image
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image fusion module is used for fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and for fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data, where both the first color intermediate image and the second color intermediate image include multiple color image pixels, and the multiple color image pixels are arranged in a Bayer array.
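  • the fusion step above can be illustrated with a minimal Python sketch. The 2x2 subunit layout, the equal-weight blend of the colour and panchromatic means, and the name `fuse_subunits` are illustrative assumptions; this text does not disclose the actual fusion kernel.

```python
import numpy as np

def fuse_subunits(raw, pan_mask):
    """Fuse one panchromatic/colour raw frame into a colour intermediate image.

    Illustrative rule only: each 2x2 subunit holds some panchromatic (W)
    samples (pan_mask == True) and some single-colour samples; the fused
    colour image pixel is an equal-weight blend of the two subunit means.
    """
    h, w = raw.shape
    out = np.zeros((h // 2, w // 2))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            block = raw[y:y + 2, x:x + 2]
            mask = pan_mask[y:y + 2, x:x + 2]
            colour = block[~mask].mean()   # single-colour samples of the subunit
            pan = block[mask].mean()       # panchromatic samples of the subunit
            out[y // 2, x // 2] = 0.5 * colour + 0.5 * pan  # assumed blend
    return out
```

  • each 2x2 subunit collapses to one colour image pixel, so tiling the subunits in R/G/G/B order yields the Bayer-arranged intermediate image described above.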
  • the image processor is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain a first color intermediate image after color conversion and a second color intermediate image after color conversion.
  • the high dynamic range image processing module is used for performing high dynamic range processing on the color converted first color intermediate image and the color converted second color intermediate image to obtain a color high dynamic range image.
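  • the two-frame high dynamic range processing above can be sketched as follows. The saturation threshold, the linear weighting, and the names `merge_hdr`/`sat` are assumptions for illustration; the application does not specify the merge rule.

```python
import numpy as np

def merge_hdr(long_img, short_img, t_long, t_short, sat=250.0):
    """Merge a long- and a short-exposure frame into one HDR frame (sketch).

    Brightness-align the short frame by the exposure ratio, then use
    long-frame pixels where they are unsaturated and aligned short-frame
    pixels where the long frame clips. `sat` is an assumed 8-bit
    saturation level, not a value from this application.
    """
    ratio = t_long / t_short
    aligned_short = short_img * ratio               # brightness alignment
    w = np.clip((sat - long_img) / sat, 0.0, 1.0)   # weight -> 0 as long frame clips
    return w * long_img + (1.0 - w) * aligned_short
```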
  • the embodiments of the present application provide a high dynamic range image processing method.
  • the high dynamic range image processing method is used in a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, the image sensor includes a pixel array, the pixel array includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels;
  • the pixel array includes a minimum repeating unit, each of the minimum repeating units includes a plurality of sub-units, and each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels;
  • the high dynamic range image processing method includes: exposing the pixel array, wherein the pixel array is exposed at a first exposure time to obtain a first original image, and the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array is exposed at a second exposure time to obtain a second original image
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time;
  • fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and fusing the second color original image data and the second panchromatic original image data into a second color intermediate image that contains only second color intermediate image data, where both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array;
  • performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain a first color intermediate image after color conversion and a second color intermediate image after color conversion; and
  • performing high dynamic range processing on the first color intermediate image after color conversion and the second color intermediate image after color conversion to obtain a color high dynamic range image.
  • the embodiment of the present application provides an electronic device.
  • the electronic device includes a lens, a housing, and the above-mentioned high dynamic range image processing system.
  • the lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
  • the embodiments of the present application provide a non-volatile computer-readable storage medium containing a computer program.
  • when the computer program is executed by a processor, the processor is caused to execute the above-mentioned high dynamic range image processing method.
  • FIG. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a pixel array according to an embodiment of the present application.
  • FIG. 3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the arrangement of the smallest repeating unit in a pixel array according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of the arrangement of the smallest repeating unit in the pixel array according to the embodiment of the present application.
  • FIG. 7 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an original image output by an image sensor according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a way for an image sensor to output raw image data according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of yet another way for an image sensor to output raw image data according to an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a color intermediate image according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of yet another color intermediate image according to an embodiment of the present application.
  • FIG. 16 is a schematic diagram of another high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 17 is a schematic diagram of black level correction according to an embodiment of the present application.
  • FIG. 18 is a schematic diagram of lens shading correction according to an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a dead pixel compensation process according to an embodiment of the present application.
  • FIGS. 20 to 23 are schematic diagrams of demosaicing according to an embodiment of the present application.
  • FIG. 24 is a schematic diagram of the mapping relationship between Vout and Vin in a tone mapping process according to an embodiment of the present application.
  • FIG. 25 is a schematic diagram of a brightness alignment process according to an embodiment of the present application.
  • FIG. 26 is a schematic diagram of an original image output by another image sensor according to an embodiment of the present application.
  • FIG. 27 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 28 is a schematic flowchart of a high dynamic range image processing method according to an embodiment of the present application.
  • FIG. 29 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor according to an embodiment of the present application.
  • the high dynamic range image processing system 100 includes an image sensor 10, an image fusion module 20, a high dynamic range image processing module 30 and an image processor 40.
  • the image sensor 10 includes a pixel array 11, and the pixel array 11 includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels.
  • the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, each minimum repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array 11 is exposed at a first exposure time to obtain a first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed at a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image fusion module 20 is used for fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only the first color intermediate image data, and for fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only the second color intermediate image data.
  • Both the first color intermediate image and the second color intermediate image include multiple color image pixels, and the multiple color image pixels are arranged in a Bayer array.
  • the image processor 40 is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain the first color intermediate image after color conversion and the second color intermediate image after color conversion.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the first color intermediate image after color conversion and the second color intermediate image after color conversion to obtain a color high dynamic range image.
  • the pixel array 11 is exposed at a third exposure time to obtain a third original image, where the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time; wherein the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the image fusion module 20 is also used for fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data.
  • the image processor 40 is configured to perform color conversion processing on the third color intermediate image to obtain a color-converted third color intermediate image.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image.
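  • the three-frame case can be sketched by extending the two-frame merge. The weighting curve, the saturation threshold, and the name `merge_three` are illustrative assumptions, not the algorithm disclosed by this application.

```python
import numpy as np

def merge_three(imgs, times, sat=250.0):
    """Merge three exposures (e.g. long/medium/short) into one HDR frame.

    Sketch only: each frame is brightness-aligned to the longest exposure
    and averaged with weights that fall toward zero as that frame
    saturates.
    """
    t_ref = max(times)
    acc = np.zeros_like(imgs[0], dtype=float)
    wsum = np.zeros_like(imgs[0], dtype=float)
    for img, t in zip(imgs, times):
        aligned = img * (t_ref / t)                    # brightness alignment
        w = np.clip((sat - img) / sat, 1e-3, 1.0)      # down-weight clipped pixels
        acc += w * aligned
        wsum += w
    return acc / wsum
```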
  • each color original image data is generated by a single single-color photosensitive pixel
  • each panchromatic original image data is generated by a single panchromatic photosensitive pixel
  • the image sensor 10 outputs multiple original image data, and the output mode includes alternately outputting one color original image data and one panchromatic original image data.
  • each color original image data is jointly generated by multiple single-color photosensitive pixels in the same subunit, and each panchromatic original image data is jointly generated by multiple panchromatic photosensitive pixels in the same subunit; the output mode of the image sensor 10 for outputting a plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of panchromatic original image data.
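  • the two output orderings above can be modelled with a small Python sketch. The function name `interleave` and the flat-list representation of the data stream are assumptions for illustration; nothing here models an actual sensor interface.

```python
def interleave(color_data, pan_data, group=1):
    """Emit raw image data in the alternating order described above.

    group=1 models the per-pixel mode (one colour datum, then one
    panchromatic datum); group>1 models the binned mode, where several
    colour data from a run of subunits are followed by the corresponding
    panchromatic data.
    """
    out = []
    for i in range(0, len(color_data), group):
        out.extend(color_data[i:i + group])   # colour original image data
        out.extend(pan_data[i:i + group])     # panchromatic original image data
    return out
```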
  • the image processor 40 includes an image pre-processing module 41 and an image post-processing module 42.
  • the image preprocessing module 41 is configured to: perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; and perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image.
  • the image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the color-converted first color intermediate image and the color-converted second color intermediate image.
  • the image processor 40 includes an image pre-processing module 41 and an image post-processing module 42.
  • the image preprocessing module 41 is configured to perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image; and perform image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image.
  • the image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
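  • two of the listed preprocessing steps can be sketched minimally in Python. The black level of 64 (typical for 10-bit sensors) and the caller-supplied shading gain map are assumptions; dead pixel compensation is omitted for brevity.

```python
import numpy as np

def preprocess(img, black_level=64.0, shading_gain=None):
    """Sketch of black level correction and lens shading correction.

    Black level correction subtracts the sensor's dark offset and clamps
    at zero; lens shading correction multiplies by a per-pixel gain map
    (larger toward the image corners), here supplied by the caller.
    """
    out = np.clip(img.astype(float) - black_level, 0.0, None)
    if shading_gain is not None:
        out *= shading_gain  # brighten vignetted regions
    return out
```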
  • the high dynamic range image processing system 100 further includes a storage module 50.
  • the storage module 50 is used to store the color-converted image and transmit the color-converted image to the high dynamic range image processing.
  • the module 30 performs high dynamic range image processing to obtain a color high dynamic range image.
  • the image fusion module 20 is integrated in the image sensor 10.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the high dynamic range image processing method includes:
  • exposing the pixel array 11, wherein the pixel array 11 is exposed at a first exposure time to obtain a first original image, and the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time; the pixel array 11 is exposed at a second exposure time to obtain a second original image, and the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time; wherein the first exposure time is not equal to the second exposure time;
  • first color original image data and the first panchromatic original image data are fused into a first color intermediate image containing only the first color intermediate image data
  • second color original image data and the second panchromatic original image data are fused into a second color intermediate image containing only the second color intermediate image data
  • both the first color intermediate image and the second color intermediate image include a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array
  • the high dynamic range image processing method further includes: the pixel array is exposed at a third exposure time to obtain a third original image, and the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time.
  • the third color original image data and the third panchromatic original image data are fused into a third color intermediate image that contains only the third color intermediate image data.
  • the third color intermediate image contains multiple color image pixels, and the multiple color image pixels are arranged in a Bayer array.
  • the third color intermediate image is subjected to color conversion processing to obtain a third color intermediate image after color conversion; and the step of performing high dynamic range processing on the first color intermediate image after color conversion and the second color intermediate image after color conversion to obtain a color high dynamic range image includes: performing high dynamic range processing on the first color intermediate image after color conversion, the second color intermediate image after color conversion, and the third color intermediate image after color conversion to obtain a color high dynamic range image.
  • the high dynamic range image processing method further includes: each color original image data is generated by a single single-color photosensitive pixel, each panchromatic original image data is generated by a single panchromatic photosensitive pixel, and the output mode of the image sensor 10 (shown in FIG. 1) for outputting multiple original image data includes alternately outputting one color original image data and one panchromatic original image data.
  • each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit
  • each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit.
  • the output mode of the image sensor 10 (shown in FIG. 1) for outputting a plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of full-color original image data.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain The preprocessed second color intermediate image.
  • the step of performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain the first color intermediate image after color conversion and the second color intermediate image after color conversion includes: performing color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the first color intermediate image after color conversion and the second color intermediate image after color conversion.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image; and performing image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image. The step of performing color conversion processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the color-converted first, second, and third color intermediate images includes: performing color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, dead pixel compensation, demosaicing, color correction, and global tone mapping.
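  • the global tone mapping step listed above applies one fixed curve that maps every input value Vin to an output value Vout, compressing the high dynamic range for display. A power-law curve is assumed here for illustration; the curve of FIG. 24 is not specified in this text.

```python
import numpy as np

def global_tone_map(vin, gamma=1 / 2.2):
    """Global tone mapping sketch: Vout = clip(Vin)^gamma on [0, 1].

    Every pixel goes through the same curve (hence "global"); the
    gamma value is an assumed display-oriented choice.
    """
    vin = np.clip(vin, 0.0, 1.0)
    return vin ** gamma
```

  • the curve is monotonic, so pixel ordering is preserved while midtones are lifted relative to a linear mapping.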
  • the high dynamic range image processing system includes a storage module; the color-converted image is stored in the storage module, then obtained from the storage module and subjected to high dynamic range image processing to obtain a color high dynamic range image.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and a high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 includes an image sensor 10, an image fusion module 20, a high dynamic range image processing module 30 and an image processor 40.
  • the image sensor 10 includes a pixel array 11, and the pixel array 11 includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels.
  • the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, each minimum repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array 11 is exposed at a first exposure time to obtain a first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed at a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image fusion module 20 is used for fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only the first color intermediate image data, and for fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only the second color intermediate image data.
  • Both the first color intermediate image and the second color intermediate image include multiple color image pixels, and the multiple color image pixels are arranged in a Bayer array.
  • the image processor 40 is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain the first color intermediate image after color conversion and the second color intermediate image after color conversion.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the first color intermediate image after color conversion and the second color intermediate image after color conversion to obtain a color high dynamic range image.
  • the pixel array 11 is exposed at a third exposure time to obtain a third original image, where the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time; wherein the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the image fusion module 20 is also used for fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data.
  • the image processor 40 is configured to perform color conversion processing on the third color intermediate image to obtain a color-converted third color intermediate image.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image.
  • each color original image data is generated by a single single-color photosensitive pixel
  • each panchromatic original image data is generated by a single panchromatic photosensitive pixel
• the image sensor 10 outputs multiple original image data, and the output mode includes alternately outputting one color original image data and one panchromatic original image data.
• each color original image data is generated jointly by multiple single-color photosensitive pixels in the same subunit, and each panchromatic original image data is generated jointly by multiple panchromatic photosensitive pixels in the same subunit; in this case, the output mode of the image sensor 10 for outputting multiple original image data includes alternately outputting multiple color original image data and multiple panchromatic original image data.
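The alternating output order described above can be sketched as a simple row serializer. This is an illustrative model only, assuming each row holds tagged color (C) and panchromatic (W) data; the function name and data representation are not from the patent.

```python
# Hypothetical sketch of the alternating output mode: for a row in which
# color (C) and panchromatic (W) data alternate, the sensor emits one
# color datum and one panchromatic datum in turn.

def serialize_row(row):
    """row: list of (kind, value) tuples, kind in {'C', 'W'}.
    Returns the values in alternating color/panchromatic output order."""
    colors = [v for k, v in row if k == 'C']
    pans = [v for k, v in row if k == 'W']
    out = []
    for c, w in zip(colors, pans):
        out.extend([c, w])   # one color datum, then one panchromatic datum
    return out

row = [('W', 10), ('C', 1), ('W', 11), ('C', 2)]
stream = serialize_row(row)   # [1, 10, 2, 11]
```

After one row's data are streamed out, the next row is serialized the same way, matching the row-by-row readout described later in the text.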
  • the image processor 40 includes an image pre-processing module 41 and an image post-processing module 42.
• the image preprocessing module 41 is configured to perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image, and perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image.
• the image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the color-converted first color intermediate image and the color-converted second color intermediate image.
  • the image processor 40 includes an image pre-processing module 41 and an image post-processing module 42.
• the image preprocessing module 41 is configured to perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image; and perform image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image.
• the image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
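Two of the preprocessing steps named above can be sketched as follows. This is an illustrative example, not the patent's implementation: the black level, white level, and the choice of same-plane neighbours for dead pixel compensation are assumptions.

```python
import numpy as np

def black_level_correct(raw, black_level, white_level=1023):
    """Subtract the sensor's black level and rescale to the full range."""
    corrected = np.clip(raw.astype(np.float32) - black_level, 0, None)
    return corrected * white_level / (white_level - black_level)

def dead_pixel_compensate(raw, dead_mask):
    """Replace marked dead pixels with the mean of same-colour-plane
    neighbours two pixels away (a common choice for Bayer data)."""
    out = raw.astype(np.float32).copy()
    h, w = raw.shape
    for y, x in zip(*np.nonzero(dead_mask)):
        neigh = [out[yy, xx]
                 for yy, xx in ((y - 2, x), (y + 2, x), (y, x - 2), (y, x + 2))
                 if 0 <= yy < h and 0 <= xx < w]
        out[y, x] = sum(neigh) / len(neigh)
    return out

# usage: a stuck-high pixel in an otherwise flat patch
raw = np.full((5, 5), 100.0)
raw[2, 2] = 1023.0                       # stuck-high dead pixel
mask = np.zeros((5, 5), dtype=bool)
mask[2, 2] = True
fixed = dead_pixel_compensate(raw, mask)

blc = black_level_correct(np.array([[64.0, 564.0]]), black_level=64)
```

Lens shading correction, the remaining step, would typically multiply the frame by a per-pixel gain map calibrated against a flat-field capture.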
  • the high dynamic range image processing system 100 further includes a storage module 50.
• the storage module 50 is used to store the color-converted images and transmit them to the high dynamic range image processing module 30 for high dynamic range processing to obtain a color high dynamic range image.
  • the image fusion module 20 is integrated in the image sensor 10.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
• when the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method of any one of the foregoing embodiments.
  • the high dynamic range image processing system 100 of the embodiment of the present application performs fusion algorithm processing on the multi-frame original image output by the image sensor 10 through the image fusion module 20 in advance to obtain a multi-frame color intermediate image with image pixels arranged in a Bayer array.
• the multi-frame color intermediate image can be processed by the image processor 40, which solves the problem that an image processor cannot directly process images in which the image pixels are arranged in a non-Bayer array.
  • FIG. 2 is a schematic diagram of the image sensor 10 in the embodiment of the present application.
  • the image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14 and a horizontal driving unit 15.
  • the image sensor 10 may adopt a complementary metal oxide semiconductor (CMOS, Complementary Metal Oxide Semiconductor) photosensitive element or a charge-coupled device (CCD, Charge-coupled Device) photosensitive element.
  • the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (ie, arranged in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4) .
  • Each photosensitive pixel 110 converts light into electric charge according to the intensity of light incident thereon.
  • the vertical driving unit 12 includes a shift register and an address decoder.
  • the vertical drive unit 12 includes readout scanning and reset scanning functions.
  • the readout scan refers to sequentially scanning the unit photosensitive pixels 110 line by line, and reading signals from these unit photosensitive pixels 110 line by line.
  • the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14.
• the reset scan is used to reset charges; the photocharge of the photoelectric conversion element is discarded so that accumulation of new photocharge can begin.
  • the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing.
• the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated; in this way, the signals of the photosensitive pixels 110 in one row are obtained.
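The correlated double sampling just described reduces to a per-pixel subtraction. The sketch below illustrates it for one row; the function name and the digital-number (DN) values are illustrative only.

```python
def correlated_double_sample(reset_levels, signal_levels):
    """Correlated double sampling for one photosensitive-pixel row:
    the column circuit samples each pixel's reset level and signal
    level, and their difference cancels the per-pixel reset offset."""
    return [sig - rst for rst, sig in zip(reset_levels, signal_levels)]

# per-column reset offsets cancel because both samples share them
reset_levels = [12, 15, 11, 14]       # reset level per column (DN)
signal_levels = [112, 215, 61, 314]   # signal level per column (DN)
pixel_values = correlated_double_sample(reset_levels, signal_levels)  # [100, 200, 50, 300]
```

Because the reset level is sampled from the same floating diffusion node just before charge transfer, the subtraction suppresses reset (kTC) noise and fixed-pattern offsets together.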
  • the column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
  • the horizontal driving unit 15 includes a shift register and an address decoder.
  • the horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14, and is sequentially output.
• the control unit 13 configures timing signals according to the operation mode, and uses the various timing signals to control the vertical driving unit 12, the column processing unit 14, and the horizontal driving unit 15 to work together.
  • FIG. 3 is a schematic diagram of a photosensitive pixel 110 in an embodiment of the present application.
  • the photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a micro lens 113. Along the light-receiving direction of the photosensitive pixel 110, the microlens 113, the filter 112, and the pixel circuit 111 are arranged in sequence.
  • the microlens 113 is used for condensing light
  • the filter 112 is used for passing light of a certain waveband and filtering out the light of other wavebands.
  • the pixel circuit 111 is used to convert the received light into electrical signals, and provide the generated electrical signals to the column processing unit 14 shown in FIG. 2.
  • FIG. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 in an embodiment of the present application.
• the pixel circuit 111 in FIG. 4 can be applied to each photosensitive pixel 110 (shown in FIG. 3) in the pixel array 11 shown in FIG. 2.
  • the working principle of the pixel circuit 111 will be described below with reference to FIGS. 2 to 4.
• the pixel circuit 111 includes a photoelectric conversion element 1111 (for example, a photodiode), an exposure control circuit (for example, a transfer transistor 1112), a reset circuit (for example, a reset transistor 1113), an amplification circuit (for example, an amplifying transistor 1114), and a selection circuit (for example, a selection transistor 1115).
  • the transfer transistor 1112, the reset transistor 1113, the amplifying transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.
  • the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected to the ground, for example.
  • the photodiode converts the received light into electric charge.
  • the cathode of the photodiode is connected to the floating diffusion unit FD via an exposure control circuit (for example, a transfer transistor 1112).
  • the floating diffusion unit FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
  • the exposure control circuit is a transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112.
• when a pulse of an active level (for example, a VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on.
  • the transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.
  • the drain of the reset transistor 1113 is connected to the pixel power supply VPIX.
• the source of the reset transistor 1113 is connected to the floating diffusion unit FD.
• when a pulse of an effective reset level is transmitted to the gate of the reset transistor 1113 via the reset line, the reset transistor 1113 is turned on.
• the reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
  • the gate of the amplifying transistor 1114 is connected to the floating diffusion unit FD.
  • the drain of the amplifying transistor 1114 is connected to the pixel power supply VPIX.
• after the floating diffusion unit FD is reset by the reset transistor 1113, the amplifying transistor 1114 outputs the reset level through the output terminal OUT via the selection transistor 1115; after the charge of the photodiode is transferred by the transfer transistor 1112, the amplifying transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
  • the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114.
  • the source of the selection transistor 1115 is connected to the column processing unit 14 in FIG. 2 through the output terminal OUT.
• when a pulse of an effective level is transmitted to the gate of the selection transistor 1115 through a selection line, the selection transistor 1115 is turned on.
  • the signal output by the amplifying transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.
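The readout sequence of the pixel circuit above can be modelled as: reset the floating diffusion (FD), sample the reset level, transfer the photodiode charge, then sample the signal level. The sketch below is a hypothetical behavioural model; the class, method names, and digital-number units are illustrative, not from the patent.

```python
class PixelCircuit:
    """Toy behavioural model of the 4T pixel readout described above."""

    def __init__(self, vpix=1000):
        self.vpix = vpix        # pixel power supply level (DN)
        self.fd = 0             # floating diffusion level (DN)
        self.photodiode = 0     # accumulated photocharge (DN)

    def expose(self, charge):
        self.photodiode += charge      # photodiode converts light to charge

    def reset(self):
        self.fd = self.vpix            # reset transistor ties FD to VPIX

    def transfer(self):
        self.fd -= self.photodiode     # charge transfer lowers the FD level
        self.photodiode = 0

    def read_out(self):
        """One readout cycle: reset level first, then signal level."""
        self.reset()
        reset_level = self.fd          # amplifier outputs reset level via OUT
        self.transfer()
        signal_level = self.fd         # amplifier outputs signal level via OUT
        return reset_level, signal_level

px = PixelCircuit()
px.expose(250)
reset_level, signal_level = px.read_out()
pixel_value = reset_level - signal_level   # CDS difference: 250
```

The final subtraction is exactly the correlated double sampling performed by the column processing unit 14: the pixel value equals the transferred photocharge, with the reset level cancelled out.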
  • the pixel structure of the pixel circuit 111 in the embodiment of the present application is not limited to the structure shown in FIG. 4.
  • the pixel circuit 111 may also have a three-transistor pixel structure, in which the functions of the amplifying transistor 1114 and the selecting transistor 1115 are performed by one transistor.
  • the exposure control circuit is not limited to the way of a single transfer transistor 1112, and other electronic devices or structures with the function of controlling the conduction of the control terminal can be used as the exposure control circuit in the embodiment of the present application.
• using a single transfer transistor 1112 as the exposure control circuit is simple to implement, low in cost, and easy to control.
• FIGS. 5 to 10 are schematic diagrams of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the pixel array 11 (shown in FIG. 2) according to some embodiments of the present application.
  • the photosensitive pixels 110 include two types, one is a full-color photosensitive pixel W, and the other is a color photosensitive pixel.
• FIGS. 5 to 10 only show the arrangement of a plurality of photosensitive pixels 110 in a minimum repeating unit. The minimum repeating unit shown in FIGS. 5 to 10 is copied multiple times in rows and columns to form the pixel array 11. Each minimum repeating unit is composed of multiple full-color photosensitive pixels W and multiple color photosensitive pixels, and each minimum repeating unit includes multiple subunits.
  • Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels W.
  • the full-color photosensitive pixel W and the color photosensitive pixel in each sub-unit are alternately arranged.
• multiple photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category; or, multiple photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • FIG. 5 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit of an embodiment of the application.
• the smallest repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and the subunits are 4 photosensitive pixels 110 in 2 rows and 2 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner in FIG. 5), and two second type subunits UB are arranged In the second diagonal direction D2 (for example, the direction where the upper right corner and the lower left corner are connected in FIG. 5).
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • first diagonal direction D1 may also be a direction connecting the upper right corner and the lower left corner
  • second diagonal direction D2 may also be a direction connecting the upper left corner and the lower right corner
• the term "direction" here does not denote a single orientation; it can be understood as the concept of a "straight line" indicating the arrangement, and the line extends in both directions from its two ends.
  • the explanation of the first diagonal direction D1 and the second diagonal direction D2 in FIGS. 6 to 10 is the same as here.
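The FIG. 5 layout can be built programmatically. Since the arrangement matrices from the figures are not reproduced in this text, the 4×4 unit below is a reconstruction consistent with the description (2×2 subunits; UA upper-left and UC lower-right on the first diagonal; two UB subunits on the second diagonal; W alternating with the colour pixel inside every subunit) and should be treated as an assumption, not the figure itself.

```python
import numpy as np

# Reconstructed minimal repeating unit consistent with the FIG. 5
# description: W = full-color pixel; A, B, C = first, second, third
# color photosensitive pixels.
UNIT = np.array([
    ['W', 'A', 'W', 'B'],
    ['A', 'W', 'B', 'W'],
    ['W', 'B', 'W', 'C'],
    ['B', 'W', 'C', 'W'],
])

def build_pixel_array(rows, cols):
    """Tile the minimal repeating unit over a rows x cols pixel array,
    as the text describes copying it multiple times in rows and columns."""
    reps = (rows // 4 + 1, cols // 4 + 1)
    return np.tile(UNIT, reps)[:rows, :cols]

arr = build_pixel_array(8, 8)
```

In this reconstruction half of all photosensitive pixels are full-color pixels W, and W and colour pixels alternate along every row and column, matching the alternation stated in the text.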
  • FIG. 6 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in a minimum repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 36 photosensitive pixels 110 in 6 rows and 6 columns, and the sub-units are 9 photosensitive pixels 110 in 3 rows and 3 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 7 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
• the minimum repeating unit is 64 photosensitive pixels 110 in 8 rows and 8 columns, and the subunits are 16 photosensitive pixels 110 in 4 rows and 4 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 8 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
• the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 8 is roughly the same as that shown in FIG. 5, except that the alternating order of the full-color photosensitive pixels W and the single-color photosensitive pixels in the second-type subunit UB in the lower left corner of FIG. 8 is inconsistent with that in the second-type subunit UB in the lower left corner of FIG. 5, and the alternating order in the third-type subunit UC in the lower right corner of FIG. 8 is likewise inconsistent with that in the third-type subunit UC in the lower right corner of FIG. 5.
• Specifically, in the second-type subunit UB in the lower left corner of FIG. 5, the alternating order of the photosensitive pixels 110 in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., second-color photosensitive pixel B), and in the second row it is single-color photosensitive pixel (i.e., second-color photosensitive pixel B), full-color photosensitive pixel W; whereas in the second-type subunit UB in the lower left corner of FIG. 8, the alternating order in the first row is single-color photosensitive pixel (i.e., second-color photosensitive pixel B), full-color photosensitive pixel W, and in the second row it is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., second-color photosensitive pixel B).
• Likewise, in the third-type subunit UC in the lower right corner of FIG. 5, the alternating order in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C), and in the second row it is single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W; whereas in the third-type subunit UC in the lower right corner of FIG. 8, the alternating order in the first row is single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W, and in the second row it is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C).
• Moreover, within FIG. 8 itself, the alternating orders in different subunits are not consistent: in the first-type subunit UA, the alternating order in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., first-color photosensitive pixel A), and in the second row it is single-color photosensitive pixel (i.e., first-color photosensitive pixel A), full-color photosensitive pixel W; while in the third-type subunit UC, the alternating order in the first row is single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W, and in the second row it is full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C).
• That is to say, in the same minimum repeating unit, the alternating order of the full-color photosensitive pixels W and the color photosensitive pixels in different subunits may be the same (as shown in FIG. 5) or inconsistent (as shown in FIG. 8).
  • FIG. 9 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all The third color photosensitive pixel C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 10 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
• C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • a plurality of photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all The third color photosensitive pixel C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
• multiple photosensitive pixels 110 in the same row in some subunits may be photosensitive pixels 110 of the same category, while the multiple photosensitive pixels 110 in the same column in the remaining subunits are photosensitive pixels 110 of the same category.
• the first color photosensitive pixel A may be a red photosensitive pixel R, the second color photosensitive pixel B may be a green photosensitive pixel G, and the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
• the first color photosensitive pixel A may be a red photosensitive pixel R, the second color photosensitive pixel B may be a yellow photosensitive pixel Y, and the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
• the first color photosensitive pixel A may be a magenta photosensitive pixel M, the second color photosensitive pixel B may be a cyan photosensitive pixel Cy, and the third color photosensitive pixel C may be a yellow photosensitive pixel Y.
  • the response band of the full-color photosensitive pixel W may be the visible light band (for example, 400 nm-760 nm).
  • the full-color photosensitive pixel W is provided with an infrared filter to filter out infrared light.
• the response wavelength bands of the full-color photosensitive pixel W may be the visible light band and the near-infrared band (for example, 400nm-1000nm), matching the response band of the photoelectric conversion element 1111 (shown in FIG. 4) in the image sensor 10 (shown in FIG. 1).
• the full-color photosensitive pixel W may not be provided with a filter, or may be provided with a filter that passes light of all wavelength bands.
• in that case, the response band of the full-color photosensitive pixel W is determined by the response band of the photoelectric conversion element 1111, that is, the two match.
  • the embodiments of the present application include, but are not limited to, the above-mentioned waveband range.
• the following embodiments all describe the first single-color photosensitive pixel A as the red photosensitive pixel R, the second single-color photosensitive pixel B as the green photosensitive pixel G, and the third single-color photosensitive pixel C as the blue photosensitive pixel Bu.
  • the control unit 13 controls the pixel array 11 to expose.
  • the pixel array 11 is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels W exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
• the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels W exposed at the second exposure time; wherein the first exposure time is not equal to the second exposure time.
  • the pixel array 11 performs two exposures. For example, as shown in FIG. 11, in the first exposure, the pixel array 11 is exposed for a first exposure time L (for example, a long exposure time) to obtain a first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time L and first full-color original image data generated by the panchromatic photosensitive pixels W exposed at the first exposure time L.
  • the pixel array 11 is exposed for a second exposure time S (for example, a short exposure time) to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time S and second full-color original image data generated by the panchromatic photosensitive pixels W exposed at the second exposure time S. It should be noted that the pixel array 11 may also perform a short exposure first, and then perform a long exposure, which is not limited here.
  • the image sensor 10 can output multiple original image data generated by the pixel array 11, and the multiple original image data form an original image.
• each color original image data in each frame of the original image (the first original image, the second original image, and the third original image below) is generated by a single single-color photosensitive pixel, and each full-color original image data is generated by a single full-color photosensitive pixel W; the image sensor 10 may output the multiple original image data by alternately outputting one color original image data and one full-color original image data.
• after the pixel array 11 is exposed, each single-color photosensitive pixel generates one color original image data corresponding to that single-color photosensitive pixel, and each panchromatic photosensitive pixel W generates one panchromatic original image data corresponding to that panchromatic photosensitive pixel W.
• for a plurality of photosensitive pixels 110 in the same row, the original image data they generate are output as follows: one color original image data and one full-color original image data are output alternately. After all the original image data of one row have been output, the original image data of the next row are output.
• that is, the output mode of the multiple original image data of each row is: one color original image data and one full-color original image data are output alternately.
  • the image sensor 10 sequentially outputs a plurality of original image data, and the plurality of original image data forms an original image.
• the alternate output of one color original image data and one full-color original image data can take the following two forms: (1) first output one color original image data, then output one full-color original image data; (2) first output one full-color original image data, then output one color original image data.
  • the specific alternating sequence is related to the arrangement of the full-color photosensitive pixels W and the color photosensitive pixels in the pixel array 11.
• when the photosensitive pixel 110 in row 0, column 0 of the pixel array 11 is a single-color photosensitive pixel, the alternating sequence is (1); when the photosensitive pixel 110 in row 0, column 0 of the pixel array 11 is a full-color photosensitive pixel W, the alternating sequence is (2).
  • the output mode of the original image data will be described below by taking FIG. 12 as an example.
• assuming the pixel array 11 includes 8*8 photosensitive pixels 110 and the photosensitive pixel 110 in the 0th row and 0th column of the pixel array 11 is a full-color photosensitive pixel W, then after the pixel array 11 is exposed, the image sensor 10 first outputs the full-color original image data generated by the full-color photosensitive pixel p00 in the 0th row and 0th column.
• the image pixel P00 corresponding to this full-color original image data is located in the 0th row and 0th column of the original image.
  • the image sensor 10 outputs the color original image data generated by the color photosensitive pixel p01 in the 0th row and 1st column.
  • the image pixel P01 corresponding to the color original image data is located in the 0th row and 1st column of the original image ...;
  • the image sensor 10 outputs color original image data generated by the color photosensitive pixel p07 in the 0th row and 7th column, and the image pixel P07 corresponding to the color original image data is located in the 0th row and 7th column of the original image. So far, the original image data generated by the eight photosensitive pixels 110 in the 0th row of the pixel array 11 are all output.
  • the image sensor 10 sequentially outputs the original image data generated by the eight photosensitive pixels 110 in the first row of the pixel array 11; subsequently, the image sensor 10 sequentially outputs the original image data generated by the eight photosensitive pixels 110 in the second row of the pixel array 11; It can be deduced by analogy until the image sensor 10 outputs the full-color original image data generated by the full-color photosensitive pixel p77 in the seventh row and seventh column.
• the original image data generated by the plurality of photosensitive pixels 110 form an original image, wherein the position of the image pixel corresponding to the original image data generated by each photosensitive pixel 110 in the original image corresponds to the position of that photosensitive pixel 110 in the pixel array 11.
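The row-major, alternating readout just described can be sketched as follows. This is an illustrative sketch, not code from the patent; it assumes the checkerboard layout in which the pixel at row 0, column 0 is a panchromatic pixel W (so the alternating sequence is case (2)).

```python
# Sketch of the high-resolution output order: row by row, alternating one
# panchromatic (W) and one color original image data per pixel.
# Assumption: W pixels sit where (row + column) is even, as in FIG. 12.

def readout_order(rows=8, cols=8):
    """Yield (row, col, kind) in the order the sensor outputs raw data."""
    for r in range(rows):
        for c in range(cols):
            kind = "W" if (r + c) % 2 == 0 else "color"
            yield r, c, kind

order = list(readout_order())
assert order[0] == (0, 0, "W")      # p00 is output first
assert order[1] == (0, 1, "color")  # then p01, alternating along the row
assert order[8] == (1, 0, "color")  # row 1 starts only after row 0 finishes
```

The image pixel position in the original image simply mirrors the photosensitive pixel position in the array, so no reordering is needed when assembling the frame.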
  • each color original image data in each frame of original image (the first original image, the second original image, and the third original image below) is generated by multiple single-color photosensitive pixels in the same subunit.
  • Each panchromatic original image data is generated by multiple panchromatic photosensitive pixels W in the same subunit.
• the image sensor 10 outputs the plurality of original image data in a mode in which a plurality of color original image data and a plurality of panchromatic original image data are output alternately.
• multiple single-color photosensitive pixels in the same sub-unit jointly generate one color original image data corresponding to that sub-unit, and multiple full-color photosensitive pixels W in the same sub-unit jointly generate one full-color original image data corresponding to that sub-unit; that is, one sub-unit corresponds to one color original image data and one full-color original image data.
• the original image data corresponding to the multiple subunits are output as follows: the multiple color original image data corresponding to the multiple subunits in the same row alternate with the multiple full-color original image data, where the multiple color original image data are output in sequence and the multiple panchromatic original image data are output in sequence. After all the original image data of one row have been output, the original image data of the next row are output.
• that is, the output mode of the multiple original image data of each row is: multiple color original image data and multiple full-color original image data are output alternately.
  • the image sensor 10 sequentially outputs a plurality of original image data, and the plurality of original image data forms an original image.
  • the alternate output of multiple color original image data and multiple full-color original image data may include the following two types: (1) First output multiple color original image data one after another, and then output multiple full-color original image data one after another. Image data; (2) First output multiple full-color original image data one after another, and then output multiple color original image data one after another.
  • the specific alternating sequence is related to the arrangement of the full-color photosensitive pixels W and the color photosensitive pixels in the pixel array 11.
• when the photosensitive pixel 110 in row 0, column 0 of the pixel array 11 is a single-color photosensitive pixel, the alternating sequence is (1); when the photosensitive pixel 110 in row 0, column 0 of the pixel array 11 is a full-color photosensitive pixel W, the alternating sequence is (2).
  • the output mode of the original image data will be described below by taking FIG. 13 as an example. Please refer to FIG. 1, FIG. 2, FIG. 3, and FIG. 13, assuming that the pixel array 11 includes 8*8 photosensitive pixels 110.
  • the full-color photosensitive pixel p00, the full-color photosensitive pixel p11, the color photosensitive pixel p01, and the color photosensitive pixel p10 in the pixel array 11 constitute a subunit U1;
  • the full-color photosensitive pixel p02, the full-color photosensitive pixel p13, the color photosensitive pixel p03 and the color photosensitive pixel Pixel p12 constitutes subunit U2;
• full-color photosensitive pixel p04, full-color photosensitive pixel p15, color photosensitive pixel p05, and color photosensitive pixel p14 constitute sub-unit U3; full-color photosensitive pixel p06, full-color photosensitive pixel p17, color photosensitive pixel p07, and color photosensitive pixel p16 constitute sub-unit U4; and so on.
• the image sensor 10 first outputs the panchromatic original image data jointly generated by the panchromatic photosensitive pixel p00 and the panchromatic photosensitive pixel p11 in the subunit U1; the image pixel P00 corresponding to this panchromatic original image data is located in the 0th row and 0th column of the original image.
• subsequently, the image sensor 10 outputs the panchromatic original image data jointly generated by the panchromatic photosensitive pixel p02 and the panchromatic photosensitive pixel p13 in the subunit U2, whose corresponding image pixel P01 is located in the 0th row and 1st column of the original image; then the panchromatic original image data jointly generated by the panchromatic photosensitive pixel p04 and the panchromatic photosensitive pixel p15 in the subunit U3, whose corresponding image pixel P02 is located in the 0th row and 2nd column of the original image; and finally the panchromatic original image data jointly generated by the panchromatic photosensitive pixel p06 and the panchromatic photosensitive pixel p17 in the subunit U4, whose corresponding image pixel P03 is located in the 0th row and 3rd column of the original image. So far, the multiple panchromatic original image data corresponding to the multiple subunits in the 0th row have been output.
• the image sensor 10 then outputs the color original image data jointly generated by the color photosensitive pixel p01 and the color photosensitive pixel p10 in the subunit U1; the image pixel P10 corresponding to this color original image data is located in the 1st row and 0th column of the original image. Subsequently, the image sensor 10 outputs the color original image data jointly generated by the color photosensitive pixel p03 and the color photosensitive pixel p12 in the subunit U2, whose corresponding image pixel P11 is located in the 1st row and 1st column of the original image; then the color original image data jointly generated by the color photosensitive pixel p05 and the color photosensitive pixel p14 in the subunit U3.
  • the image pixel P12 corresponding to the color original image data is located in the first row and second column of the original image;
  • the sensor 10 then outputs the color original image data jointly generated by the color photosensitive pixel p07 and the color photosensitive pixel p16 in the subunit U4, and the image pixel P13 corresponding to the color original image data is located in the first row and third column of the original image. So far, the multiple color original image data corresponding to the multiple subunits in the first row have also been output.
• subsequently, the image sensor 10 outputs the multiple full-color original image data and multiple color original image data corresponding to the multiple sub-units in the second row; their output mode is the same as that of the multiple full-color original image data and multiple color original image data corresponding to the multiple sub-units in the first row, and will not be repeated here.
• by analogy, until the image sensor 10 has output the multiple full-color original image data and multiple color original image data corresponding to the multiple subunits in the last row. In this way, the original image data generated by the plurality of photosensitive pixels 110 form a frame of original image.
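The subunit-based (low-resolution) output above can be sketched as follows. This is an illustrative sketch under stated assumptions: the two W pixels of each 2x2 subunit sit on its diagonal (p00/p11 style), row 0, column 0 of the array is a W pixel, and the two same-kind values of a subunit are simply summed; the patent does not specify how the two values are combined.

```python
# Sketch of the low-resolution output order: per subunit row, all panchromatic
# data are output first, then all color data (case (2), W pixel at row 0, col 0).

def binned_readout(raw):
    """raw: 8x8 list of numbers; W pixels sit where (row + col) is even."""
    out = []
    rows, cols = len(raw), len(raw[0])
    for sr in range(0, rows, 2):                          # subunit rows
        for sc in range(0, cols, 2):                      # panchromatic data
            out.append(raw[sr][sc] + raw[sr + 1][sc + 1])  # e.g. p00 + p11
        for sc in range(0, cols, 2):                      # color data
            out.append(raw[sr][sc + 1] + raw[sr + 1][sc])  # e.g. p01 + p10
    return out

raw = [[10 * r + c for c in range(8)] for r in range(8)]
data = binned_readout(raw)
assert data[0] == raw[0][0] + raw[1][1]  # P00: subunit U1's two W pixels
assert data[4] == raw[0][1] + raw[1][0]  # P10: subunit U1's two color pixels
assert len(data) == 32                   # 4 subunit rows x 8 values per row
```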
• the first original image and the second original image are transmitted to the image fusion module 20 for image fusion processing to obtain a first color intermediate image and a second color intermediate image.
• specifically, the image fusion module 20 fuses the first color original image data and the first panchromatic original image data in the first original image to obtain a first color intermediate image containing only first color intermediate image data, and fuses the second color original image data and the second panchromatic original image data in the second original image to obtain a second color intermediate image containing only second color intermediate image data. The first color intermediate image and the second color intermediate image each contain multiple color image pixels, and the multiple color image pixels are arranged in a Bayer array.
• the color intermediate image obtained after the image fusion module 20 fuses the color original image data and the panchromatic original image data includes a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
• the resolution of the color intermediate image is the same as the resolution of the pixel array 11.
• when the image sensor 10 is working in the high-resolution mode, the original image data can be output by alternately outputting one color original image data and one full-color original image data.
• when the image sensor 10 works in the low-resolution mode, the original image data can be output by alternately outputting multiple color original image data and multiple full-color original image data.
• when the ambient brightness is high, the image sensor 10 can work in the high-resolution mode, which helps improve the clarity of the finally acquired image; when the ambient brightness is low, the image sensor 10 can work in the low-resolution mode, which helps increase the brightness of the finally acquired image.
  • the image fusion module 20 may be integrated in the image sensor 10, may also be integrated in the image processor 40, or may be separately provided outside the image sensor 10 and the image processor 40.
  • the high dynamic range image processing system 100 also includes an image processor 40.
• the image processor 40 includes an image preprocessing module 41. After the image fusion module 20 obtains the first color intermediate image and the second color intermediate image, the two images are transmitted to the image preprocessing module 41 for image preprocessing.
  • the image preprocessing module 41 performs image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image, and performs preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image.
  • image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
• for example, image preprocessing may include only black level correction; or lens shading correction and dead pixel compensation; or black level correction and lens shading correction; or black level correction, lens shading correction, and dead pixel compensation.
  • the black level correction process may be that the image preprocessing module 41 subtracts a fixed value from each pixel value (that is, each color intermediate image data) on the basis of obtaining the color intermediate image fused by the image fusion module 20.
  • the fixed value corresponding to the pixel value of each color channel can be the same or different.
  • the first color intermediate image has the pixel values of the red channel, the pixel values of the green channel, and the pixel values of the blue channel.
• the image preprocessing module 41 performs black level correction on the first color intermediate image by subtracting a fixed value of 5 from every pixel value in the first color intermediate image, thereby obtaining the black-level-corrected first color intermediate image.
• because the image sensor 10 adds a fixed offset of 5 (or another value) before the ADC input so that the output pixel values lie between 5 (or that other value) and 255, black level correction fully preserves the dark-area details of the image obtained by the image sensor 10 and the high dynamic range image processing system 100 of the embodiments of the present application while neither raising nor lowering the image's pixel values, which is beneficial to improving image quality.
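A minimal sketch of the black level correction just described: the fixed value (5, matching the example above) is subtracted from every pixel value. Clamping at zero and using a single offset for all channels are simplifications for illustration; as noted above, the offset per color channel may differ.

```python
# Black level correction sketch: subtract the sensor's fixed offset from every
# pixel value. The offset of 5 matches the example in the text.
BLACK_LEVEL = 5

def black_level_correct(pixels, offset=BLACK_LEVEL):
    return [[max(p - offset, 0) for p in row] for row in pixels]

img = [[5, 130, 255],
       [64, 5, 20]]
out = black_level_correct(img)
assert out == [[0, 125, 250], [59, 0, 15]]
```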
• Lens shading is the phenomenon in which shadows appear around the periphery of the lens due to uneven optical refraction of the lens; that is, the light intensity received at the center of the image area is inconsistent with that received at the periphery.
• the lens shading correction process may be that the image preprocessing module 41 divides the image to be processed into a grid on the basis of the black-level-corrected first color intermediate image and the black-level-corrected second color intermediate image, and then corrects the lens shading of the image using bilinear interpolation according to the compensation coefficients of each grid area and its neighbors.
• alternatively, the image preprocessing module 41 may directly divide the first color intermediate image and the second color intermediate image into grids, and then correct the lens shading of the images using bilinear interpolation according to the compensation coefficients of each grid area and its neighbors.
  • the following takes the lens shading correction of the first color intermediate image as an example for description.
• the image preprocessing module 41 divides the first color intermediate image (that is, the image to be processed) into sixteen equal grids, each of which has a preset compensation coefficient. Then, the image preprocessing module 41 performs shading correction on the image by bilinear interpolation according to the compensation coefficients of each grid area and its neighbors.
  • R2 is the pixel value in the dashed frame in the first color intermediate image after lens shading correction
  • R1 is the pixel value in the dashed frame in the first color intermediate image as shown in the figure.
• R2 = R1 * k1
  • k1 is obtained by bilinear interpolation of the compensation coefficients 1.10, 1.04, 1.05, and 1.09 of the grid adjacent to the R1 pixel.
• assume the coordinates of the image are (x, y), where x is counted from the first image pixel on the left toward the right, y is counted from the first image pixel at the top downward, and both x and y are natural numbers, as indicated by the labels on the edges of the image.
  • f(x, y) represents the compensation value of the coordinate (x, y) in each grid compensation coefficient graph.
  • f(0.75,0.75) is the compensation coefficient value corresponding to R1 in each grid compensation coefficient graph.
  • the compensation coefficient of each grid has been preset before the image preprocessing module 41 performs lens shading correction.
• the compensation coefficient of each grid can be determined as follows: (1) place the lens 300 in a closed device with constant, uniform light intensity and color temperature, and shoot a pure gray target with uniform brightness distribution in the closed device to obtain a grayscale image; (2) divide the grayscale image into grids (for example, 16 grids) to obtain the grayscale image divided into different grid areas; (3) calculate the compensation coefficient of each grid area of the grayscale image. After the compensation coefficients of the lens 300 are determined, the high dynamic range image processing system 100 of the present application presets them in the image preprocessing module 41.
• when the image preprocessing module 41 in the high dynamic range image processing system 100 performs lens shading correction on an image, it retrieves the compensation coefficients, and uses bilinear interpolation to correct the lens shading of the image according to the compensation coefficient of each grid area.
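The gain lookup in "R2 = R1 * k1" can be sketched as below, bilinearly interpolating the four neighboring grid compensation coefficients from the example (1.10, 1.04, 1.05, 1.09) at the sample coordinate (0.75, 0.75). Which coefficient maps to which corner is an assumption for illustration; the patent only states that k1 is obtained by bilinear interpolation of these four values.

```python
# Lens shading correction sketch: bilinearly interpolate the grid compensation
# coefficients to get the per-pixel gain k1, then apply R2 = R1 * k1.

def bilinear(c00, c10, c01, c11, fx, fy):
    """Interpolate four corner coefficients at fractional position (fx, fy)."""
    top = c00 * (1 - fx) + c10 * fx
    bottom = c01 * (1 - fx) + c11 * fx
    return top * (1 - fy) + bottom * fy

# coefficients and coordinate from the example above; corner order is assumed
k1 = bilinear(1.10, 1.04, 1.05, 1.09, 0.75, 0.75)
R1 = 100
R2 = R1 * k1                      # shading-corrected pixel value
assert 1.04 <= k1 <= 1.10         # interpolation stays within the coefficients
```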
• the photosensitive pixels 110 on the pixel array 11 of the image sensor 10 may have process defects, or errors may occur when converting optical signals into electrical signals, resulting in incorrect pixel values in the image. These defective pixels appear on the output image as dead pixels, so the image needs dead pixel compensation.
• Dead pixel compensation may include the following steps: (1) establish a 3×3 matrix of same-color pixels with the pixel to be detected as the center pixel; (2) taking the pixels surrounding the center pixel as reference points, determine whether the difference between the color value of the center pixel and each surrounding pixel is greater than a first threshold; if so, the center pixel is a dead pixel; if not, it is a normal pixel; (3) perform bilinear interpolation on a center pixel determined to be a dead pixel to obtain the corrected pixel value.
• dead pixel compensation is described below using the first color intermediate image (which may be the uncorrected first color intermediate image, the corrected first color intermediate image, etc.) as an example.
• assume R1 is the pixel to be detected. The image preprocessing module 41 takes R1 as the center pixel and establishes a 3×3 matrix of pixels of the same color as R1's photosensitive pixel, obtaining the second image in FIG. 19. Taking the pixels surrounding the center pixel R1 as reference points, it determines whether the difference between the color value of the center pixel R1 and each surrounding pixel is greater than the first threshold Q (Q is preset in the image preprocessing module 41). If so, the center pixel R1 is a dead pixel; if not, the center pixel R1 is a normal pixel.
• if R1 is a dead pixel, bilinear interpolation is performed on R1 to obtain the corrected pixel value R1' (the figure shows the case where R1 is a dead pixel), yielding the third image in FIG. 19.
• the image preprocessing module 41 of the embodiments of the present application can perform dead pixel compensation on the image, which helps the high dynamic range image processing system 100 eliminate defective images caused by process defects of the photosensitive pixels 110, or by errors in converting optical signals into electrical signals, during imaging, thereby improving the accuracy of the pixel values of the target image formed by the high dynamic range image processing system 100 and giving the embodiments of the present application a better imaging effect.
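The dead pixel compensation steps above can be sketched as follows. Several details are illustrative assumptions: the threshold Q = 60 is arbitrary, "differs from every surrounding pixel by more than Q" is one plausible reading of the detection rule, and the average of the four edge-adjacent neighbors stands in for the bilinear interpolation the text describes.

```python
# Dead pixel compensation sketch: detect a center pixel that differs from all
# of its 3x3 same-color neighbors by more than Q, and replace it with the
# average of its four horizontal/vertical neighbors.

def compensate_center(win, q=60):
    """win: 3x3 list of same-color pixel values; returns the corrected center."""
    center = win[1][1]
    neighbors = [win[r][c] for r in range(3) for c in range(3)
                 if (r, c) != (1, 1)]
    is_dead = all(abs(center - n) > q for n in neighbors)
    if not is_dead:
        return center
    # simple stand-in for bilinear interpolation of the dead pixel
    return (win[0][1] + win[2][1] + win[1][0] + win[1][2]) // 4

good = [[100, 102, 99], [101, 103, 98], [97, 100, 102]]
bad = [[100, 102, 99], [101, 255, 98], [97, 100, 102]]
assert compensate_center(good) == 103                     # normal pixel kept
assert compensate_center(bad) == (102 + 100 + 101 + 98) // 4  # dead pixel fixed
```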
  • the image processor 40 also includes an image post-processing module 42.
• the image post-processing module 42 performs color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image.
• color conversion converts an image from one color space (for example, the RGB color space) to another color space (for example, the YUV color space) so as to support a wider range of application scenarios or a more efficient transmission format.
• converting the image from the RGB color space to the YUV color space through color conversion allows subsequent image processing in the high dynamic range image processing system 100 of the embodiments of the present application to compress the chrominance information of the image, which reduces the amount of image data without affecting the viewing effect, thereby improving the transmission efficiency of the image.
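An RGB-to-YUV conversion of the kind this color conversion step performs can be sketched as below. The BT.601 full-range coefficients are one common convention; the patent does not specify which YUV variant is used.

```python
# RGB -> YUV sketch using BT.601 coefficients (one common convention).
# Y carries luminance; U and V carry chrominance, which can later be
# compressed (e.g. subsampled) without much visible impact.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v

y, u, v = rgb_to_yuv(255, 255, 255)   # pure white
assert abs(y - 255) < 0.5             # full luminance
assert abs(u) < 0.5 and abs(v) < 0.5  # neutral colors carry no chrominance
```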
• the image post-processing module 42 may first perform first-type image post-processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain a first color intermediate image and a second color intermediate image after first-type image post-processing.
• the image post-processing module 42 then performs second-type image post-processing, such as color conversion, on the first color intermediate image and the second color intermediate image after first-type image post-processing, to obtain, for example, a color-converted first color intermediate image and a color-converted second color intermediate image.
  • the first type of image post-processing includes at least one of demosaicing, color correction, and global tone mapping.
  • the first type of image post-processing includes only demosaicing; or, the first type of image post-processing includes demosaicing and color correction; or, the first type of image post-processing includes demosaicing, color correction, and global tone mapping.
• since each image pixel cell of the first color intermediate image and the second color intermediate image in the embodiments of the present application carries a single color and no optical information of other colors, the first color intermediate image and the second color intermediate image need to be demosaiced.
• the image post-processing module 42 can demosaic the first color intermediate image and the second color intermediate image directly; or, the image post-processing module 42 can demosaic them on the basis of the dead-pixel-compensated first color intermediate image and second color intermediate image. In the following, the demosaicing of the first color intermediate image is taken as an example.
• demosaicing includes the following steps: (1) decompose the first color intermediate image into a first red intermediate image, a first green intermediate image, and a first blue intermediate image; as shown in FIG. 20, some image pixel cells in the resulting first red intermediate image, first green intermediate image, and first blue intermediate image have no pixel value. (2) Interpolate the first red intermediate image, the first green intermediate image, and the first blue intermediate image separately using bilinear interpolation. As shown in FIG. 21, the image post-processing module 42 uses bilinear interpolation to interpolate the first blue intermediate image: the image pixel Bu1 to be interpolated in FIG. 21 is bilinearly interpolated from the surrounding blue image pixels to obtain its interpolated pixel value.
  • the image post-processing module 42 uses a bilinear interpolation method to perform interpolation processing on the first green intermediate image.
  • the image pixel G1 to be interpolated in FIG. 22 performs bilinear interpolation according to the four image pixels G2, G3, G4, and G5 around G1 to obtain the interpolated image pixel G1' of G1.
• all blank image pixels to be interpolated in the first image of FIG. 22 are traversed and their pixel values completed using bilinear interpolation to obtain the interpolated first green intermediate image.
  • the image post-processing module 42 may use a bilinear interpolation method to perform interpolation processing on the first red intermediate image to obtain an interpolated first red intermediate image.
• the image post-processing module 42 demosaics the color image, which allows the embodiments of the present application to complete an image in which each pixel holds only a single color channel value into a color image with multiple color channels, so that the image color is fully presented on the basis of single-color photosensitive pixel hardware.
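The per-channel interpolation in the demosaicing steps above can be sketched as follows, for one missing green sample G1 filled in from its four surrounding green pixels G2..G5, as in the FIG. 22 example (the simple four-neighbor average is the bilinear-interpolation case for a sample centered among its neighbors).

```python
# Demosaicing interpolation sketch: a missing same-color sample is filled in
# from its four surrounding samples of that color, as for G1 from G2..G5.

def interpolate_missing(g2, g3, g4, g5):
    """Fill a missing color sample from its four surrounding same-color pixels."""
    return (g2 + g3 + g4 + g5) / 4

g1 = interpolate_missing(80, 84, 78, 82)
assert g1 == 81.0
```

Repeating this for every blank cell of the red, green, and blue planes yields the three fully populated channel images that are then recombined into the multi-channel color image.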
• color correction may specifically be correcting each color channel value of the first color intermediate image and the second color intermediate image (which may be the demosaiced first color intermediate image and the demosaiced second color intermediate image) once with a color correction matrix, so as to correct the image color, as follows:
  • the color correction matrix (CCM) is preset in the image post-processing module 42.
  • the color correction matrix can be specifically:
  • the image post-processing module 42 traverses all pixels in the image and performs color correction through the above color correction matrix to obtain a color-corrected image.
• the color correction in the embodiments of the present application helps eliminate severe color casts caused by colored light sources in an image or video frame and color distortion of people or objects, so that the high dynamic range image processing system 100 of the embodiments of the present application can recover the original colors of the image and improve its visual effect.
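Applying a color correction matrix (CCM) per pixel can be sketched as below. The matrix values are placeholders for illustration only; the actual CCM is preset in the image post-processing module 42 and is not reproduced in the text above.

```python
# Color correction sketch: multiply each pixel's RGB vector by a 3x3 CCM.
# Placeholder matrix; each row sums to 1.0 so neutral grays are preserved.
CCM = [[1.50, -0.30, -0.20],
       [-0.25, 1.40, -0.15],
       [-0.10, -0.35, 1.45]]

def color_correct(rgb, ccm=CCM):
    return [sum(ccm[i][j] * rgb[j] for j in range(3)) for i in range(3)]

out = color_correct([100, 100, 100])
# neutral gray maps to itself under this example CCM (rows sum to 1.0)
assert all(abs(c - 100) < 1e-9 for c in out)
```

Traversing all pixels of the image with this multiplication produces the color-corrected image described above.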
• the high dynamic range image processing system 100 of the embodiments of the present application performs nonlinear mapping for the global tone mapping of the image, with the slope of the mapping relationship in the lower-gray-value interval greater than the slope in the higher-gray-value interval, as shown in FIG. 24. This is beneficial for distinguishing pixels with different gray values in the lower-gray-value interval, where most pixels are distributed, thereby giving the high dynamic range image processing system 100 of the embodiments of the present application a better imaging effect.
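A global tone mapping curve with the property just described, steeper at low gray values than at high ones, can be sketched as below. A gamma-style curve is an illustrative stand-in; the patent only shows the curve's shape in FIG. 24, not a formula.

```python
# Global tone mapping sketch: a nonlinear curve whose slope is larger in the
# low-gray-value interval than in the high-gray-value interval (gamma < 1).

def tone_map(v, gamma=0.45, v_max=255.0):
    return v_max * (v / v_max) ** gamma

# slope near the dark end exceeds slope near the bright end
dark_slope = tone_map(11) - tone_map(10)
bright_slope = tone_map(241) - tone_map(240)
assert dark_slope > bright_slope
assert tone_map(0) == 0 and tone_map(255) == 255.0  # endpoints preserved
```

The gamma value 0.45 is arbitrary here; any curve that is concave over the gray range gives the described behavior of spreading out the densely populated dark tones.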
• in other embodiments, after the image fusion module 20 obtains the first color intermediate image and the second color intermediate image, they may be transmitted directly to the image post-processing module 42 for color conversion processing without image preprocessing, to obtain the color-converted first color intermediate image and the color-converted second color intermediate image; or, after the image fusion module 20 obtains the first color intermediate image and the second color intermediate image and transmits them directly to the image post-processing module 42, the image post-processing module 42 may first perform the first type of image post-processing on the two images and then perform color conversion to obtain the color-converted first color intermediate image and the color-converted second color intermediate image, which is not limited here.
  • the high dynamic range image processing system 100 also includes a storage module 50. The storage module 50 is used to store the images color-converted by the image post-processing module 42 in the image processor 40, and to transmit the color-converted images to the high dynamic range image processing module 30 for high dynamic range image processing to obtain a color high dynamic range image.
  • the image post-processing module 42 in the image processor 40 performs color conversion processing on the first color intermediate image and the second color intermediate image in sequence. After the image post-processing module 42 completes the color conversion of the first color intermediate image, it transmits the resulting color-converted first color intermediate image to the storage module 50 for storage; after it completes the color conversion of the second color intermediate image, it transmits the resulting color-converted second color intermediate image to the storage module 50 for storage. Once the storage module 50 has stored all the images color-converted by the image post-processing module 42 (that is, the color-converted first color intermediate image and the color-converted second color intermediate image), it transmits all the stored images to the high dynamic range image processing module 30.
  • the image post-processing module 42 may also perform color conversion processing on the second color intermediate image first and then on the first color intermediate image, or perform color conversion processing on the first color intermediate image and the second color intermediate image at the same time; this is not limited here. Regardless of the order in which the image post-processing module 42 performs the color conversion processing, the storage module 50 transmits the two images to the high dynamic range image processing module 30 only after it has stored both the color-converted first color intermediate image and the color-converted second color intermediate image.
  • after obtaining the color-converted first color intermediate image and the color-converted second color intermediate image, the high dynamic range image processing module 30 performs high dynamic range processing on the two images to obtain a color high dynamic range image. Specifically, referring to FIG. 25, assume that the pixel value V1 of the image pixel P12 (the image pixel marked with a dashed circle in the color-converted first color intermediate image in FIG. 25) is greater than the first preset threshold V0; that is, the image pixel P12 is an overexposed image pixel. The high dynamic range image processing unit 31 expands a predetermined area centered on the overexposed image pixel P12, for example, the 3*3 area shown in FIG. 25. The high dynamic range image processing unit 31 then searches the 3*3 predetermined area for an intermediate image pixel whose pixel value is smaller than the first preset threshold V0, such as the image pixel P21 in FIG. 25 (the image pixel marked with a dotted circle in the color-converted first color intermediate image, whose pixel value V2 is less than the first preset threshold V0); this image pixel P21 is the intermediate image pixel. The high dynamic range image processing unit 31 then finds, in the color-converted second color intermediate image, the image pixels corresponding to the overexposed image pixel P12 and to the intermediate image pixel P21 respectively, that is, the image pixel P1'2' and the image pixel P2'1' marked in FIG. 25, and uses their pixel values to correct the pixel value of the overexposed image pixel P12.
  • the high dynamic range image processing unit 31 performs this brightness alignment process on each overexposed image pixel in the color-converted first color intermediate image to obtain the first color intermediate image after color conversion and brightness alignment. Since the pixel values of the overexposed image pixels have been corrected, the pixel value of each image pixel in the first color intermediate image after color conversion and brightness alignment is more accurate.
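One consistent reading of the brightness alignment above (an assumption, since the passage does not spell out the formula) is to scale the corresponding pixel value V1' from the second image by the exposure ratio V2/V2' observed on the non-overexposed pixel pair:

```python
def align_brightness(v1_prime, v2, v2_prime):
    """Estimate the corrected value of the overexposed pixel P12.

    v1_prime: value V1' of the corresponding pixel P1'2' in the second image
    v2:       value V2 of the non-overexposed neighbor P21 in the first image
    v2_prime: value V2' of the corresponding pixel P2'1' in the second image

    The ratio v2 / v2_prime estimates the local brightness ratio between
    the two exposures; applying it to V1' reconstructs what P12 would have
    read had it not saturated (sketch under the stated assumption).
    """
    return v1_prime * (v2 / v2_prime)
```

For example, if V2 = 200, V2' = 100 and V1' = 150, the corrected value of P12 would be 300, above the saturation threshold the sensor could record directly.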
  • the high dynamic range image processing module 30 can then merge the color-converted and brightness-aligned image with the other image(s) to obtain a high dynamic range image.
  • the high dynamic range image processing module 30 first performs motion detection on the first color intermediate image after color conversion and brightness alignment to identify whether it contains a motion blur area. If there is no motion blur area, the first color intermediate image after color conversion and brightness alignment and the color-converted second color intermediate image are merged directly to obtain a color high dynamic range image.
  • the fusion of the two intermediate images at this time follows these principles: (1) in the first color intermediate image after color conversion and brightness alignment, the pixel values of the image pixels in overexposed areas are replaced directly with the pixel values of the corresponding image pixels in the color-converted second color intermediate image; (2) in the first color intermediate image after color conversion and brightness alignment, the pixel value of an image pixel in an underexposed area is the long-exposure pixel value divided by the coefficient K1, where K1 is the average of K2 and K3, K2 is the ratio of the long-exposure pixel value to the medium-exposure pixel value, and K3 is the ratio of the long-exposure pixel value to the short-exposure pixel value; (3) in the first color intermediate image after color conversion and brightness alignment, the pixel values of image pixels in areas that are neither underexposed nor overexposed are kept unchanged.
  • if a motion blur area exists, the fusion of the two intermediate images must follow the three principles above and also principle (4): in the first color intermediate image after color conversion and brightness alignment, the pixel values of the image pixels in the motion blur area are replaced directly with the pixel values of the corresponding image pixels in the color-converted second color intermediate image.
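The four fusion principles can be sketched per pixel as follows. The boolean area flags and the `fuse_pixel` name are illustrative stand-ins, and principle (3) is assumed here to keep the pixel value unchanged:

```python
def fuse_pixel(long_px, second_px, overexposed, underexposed,
               k2, k3, in_motion_blur=False):
    """Fuse one pixel of the brightness-aligned long-exposure image with
    the color-converted second intermediate image.

    k2: ratio of long- to medium-exposure pixel value
    k3: ratio of long- to short-exposure pixel value
    """
    if in_motion_blur:            # principle (4): take the second image
        return second_px
    if overexposed:               # principle (1): take the second image
        return second_px
    if underexposed:              # principle (2): divide by K1 = avg(K2, K3)
        k1 = (k2 + k3) / 2.0
        return long_px / k1
    return long_px                # principle (3): keep as-is (assumed)
```

Applied over the whole image, this yields the color high dynamic range image described above.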
  • the high dynamic range image processing system 100 of the embodiment of the present application performs high dynamic range processing through the high dynamic range image processing module 30 by first performing brightness alignment on an image and then merging the brightness-aligned image with the other images to obtain the high dynamic range image. This gives the target image formed by the high dynamic range image processing system 100 a larger dynamic range and thus a better imaging effect.
  • the pixel array 11 may also be exposed for a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixel exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixel W exposed at the third exposure time.
  • the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the pixel array 11 performs three exposures to obtain a first original image, a second original image, and a third original image, respectively.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at a first exposure time L and first full-color original image data generated by panchromatic photosensitive pixels W exposed at a first exposure time L .
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time M and second full-color original image data generated by the panchromatic photosensitive pixels W exposed at the second exposure time M.
  • the third original image includes third color original image data generated by the single-color photosensitive pixel exposed at the third exposure time S and third full-color original image data generated by the panchromatic photosensitive pixel W exposed at the third exposure time S.
  • the image fusion module 20 can fuse the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, fuse the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data, and fuse the third color original image data and the third panchromatic original image data into a third color intermediate image containing only third color intermediate image data.
  • the specific implementation is the same as the specific implementation in the embodiment described in FIG. 14 and FIG. 15, and will not be repeated here.
  • the image preprocessing module 41 may perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image, perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image, and perform image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image.
  • the specific implementation manner is the same as the implementation manner of image preprocessing in any one of the embodiments described in FIG. 17 to FIG. 19, and will not be repeated here.
  • the image post-processing module 42 performs color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image; alternatively, the image post-processing module 42 performs image post-processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the first color intermediate image, the second color intermediate image, and the third color intermediate image after image post-processing; or the image post-processing module 42 directly performs color conversion processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
  • the specific color conversion processing process is the same as that in the foregoing embodiments, and will not be repeated here.
  • the high dynamic range image processing module 30 performs high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image .
  • the specific implementation of the high dynamic range processing is the same as the specific implementation of fusing the preprocessed first color intermediate image and the preprocessed second color intermediate image into a color high dynamic range image in the embodiment described in FIG. 25, and will not be repeated here.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments.
  • the lens 300 and the high dynamic range image processing system 100 are combined with the housing 200.
  • the lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
  • the electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, smart glasses, or a smart helmet), a drone, a head-mounted display device, etc., which is not limited here.
  • the electronic device 1000 of the embodiment of the present application uses the image fusion module 20 provided in the high dynamic range image processing system 100 to apply fusion algorithm processing to the multi-frame original images output by the image sensor 10, so as to obtain multi-frame color intermediate images whose image pixels are arranged in a Bayer array. In this way, the multi-frame color intermediate images can be processed by the image processor 40, which solves the problem that an image processor cannot directly process images whose image pixels are arranged in a non-Bayer array.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes minimal repeating units. Each minimal repeating unit contains multiple subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
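As a concrete illustration (an assumed example, not one of the claimed arrangements of FIGS. 5 to 10), a 4*4 minimal repeating unit of four 2*2 subunits might interleave panchromatic pixels W with single-color pixels so that the two types alternate in every row and column:

```python
import numpy as np

# Assumed 4x4 minimal repeating unit: each 2x2 subunit holds one single
# color (R, G or B) on one diagonal and panchromatic W on the other.
unit = np.array([
    ["W", "R", "W", "G"],
    ["R", "W", "G", "W"],
    ["W", "G", "W", "B"],
    ["G", "W", "B", "W"],
])

def is_alternating(u):
    """Check that W and color pixels alternate along every row and column."""
    w = (u == "W")
    rows_ok = all(w[i, j] != w[i, j + 1] for i in range(4) for j in range(3))
    cols_ok = all(w[i, j] != w[i + 1, j] for i in range(3) for j in range(4))
    return rows_ok and cols_ok
```

Tiling such a unit across rows and columns yields a pixel array in which half of the photosensitive pixels are panchromatic.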
  • High dynamic range image processing methods include:
  • exposing the pixel array 11, wherein the pixel array 11 is exposed for a first exposure time to obtain a first original image, the first original image including first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed for the first exposure time; and the pixel array is exposed for a second exposure time to obtain a second original image, the second original image including second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second full-color original image data generated by the full-color photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time;
  • fusing the first color original image data and the first full-color original image data into a first color intermediate image containing only first color intermediate image data, and fusing the second color original image data and the second full-color original image data into a second color intermediate image containing only second color intermediate image data, wherein both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array;
  • performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image; and
  • performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
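The steps above can be sketched as a small orchestration function; every callable here is a hypothetical stand-in for the corresponding module in the text (pixel array exposure, image fusion module, image processor, and high dynamic range image processing module):

```python
def hdr_pipeline(expose, fuse, color_convert, hdr_merge, t1, t2):
    """Run the four-step method: expose at two exposure times, fuse each
    raw image's color + panchromatic data into a Bayer intermediate image,
    color-convert both, then merge them into a color HDR image."""
    raw1, raw2 = expose(t1), expose(t2)          # step 1: two exposures
    mid1, mid2 = fuse(raw1), fuse(raw2)          # step 2: fusion to Bayer
    conv1, conv2 = color_convert(mid1), color_convert(mid2)  # step 3
    return hdr_merge(conv1, conv2)               # step 4: HDR merge

# Stub usage showing the data flow (stubs stand in for real modules):
result = hdr_pipeline(lambda t: t,
                      lambda x: 2 * x,
                      lambda x: x + 1,
                      lambda a, b: (a, b),
                      1, 2)
```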
  • the high dynamic range image processing method further includes: exposing the pixel array for a third exposure time to obtain a third original image, the third original image including third color original image data generated by the single-color photosensitive pixels exposed for the third exposure time and third full-color original image data generated by the full-color photosensitive pixels exposed for the third exposure time, wherein the third exposure time is not equal to the first exposure time and not equal to the second exposure time; fusing the third color original image data and the third full-color original image data into a third color intermediate image containing only third color intermediate image data, the third color intermediate image including a plurality of color image pixels arranged in a Bayer array; and performing color conversion processing on the third color intermediate image to obtain a color-converted third color intermediate image. The step of performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image includes: performing high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image.
  • in the high dynamic range image processing method, each color original image data is generated by a single single-color photosensitive pixel and each full-color original image data is generated by a single panchromatic photosensitive pixel, and the output mode in which the image sensor 10 (shown in FIG. 1) outputs the plurality of original image data includes alternately outputting one color original image data and one full-color original image data.
  • each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit
  • each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit.
  • the output mode of the image sensor 10 (shown in FIG. 1) for outputting a plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of full-color original image data.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain The preprocessed second color intermediate image.
  • the step of performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain the first color intermediate image after color conversion and the second color intermediate image after color conversion includes: preprocessing the first color intermediate image The image and the preprocessed second color intermediate image are subjected to color conversion processing to obtain a first color intermediate image after color conversion and a second color intermediate image after color conversion.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image; and performing image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image. The step of performing color conversion processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image includes: performing color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, dead pixel compensation, demosaicing, color correction, and global tone mapping.
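As an example of the first of these preprocessing steps, black level correction subtracts the sensor's dark offset from each raw value and clips the result; the offset 64 and the 10-bit white level 1023 are assumed example values, not values from the embodiments:

```python
import numpy as np

def black_level_correct(raw, black_level=64, white_level=1023):
    """Black level correction: subtract the sensor's black offset so a
    fully dark pixel reads 0, then clip to the valid range."""
    out = raw.astype(np.int64) - black_level
    return np.clip(out, 0, white_level - black_level)
```

Values at or below the black level map to 0, and the remaining levels are shifted down by the offset.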
  • the high dynamic range image processing system includes a storage module; the color-converted images are stored in the storage module, then obtained from the storage module and subjected to high dynamic range image processing to obtain a color high dynamic range image.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program. When the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method of any one of the foregoing embodiments, for example to perform the following steps:
  • exposing the pixel array 11, wherein the pixel array 11 is exposed for a first exposure time to obtain a first original image including first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed for the first exposure time, and the pixel array 11 is exposed for a second exposure time to obtain a second original image including second color original image data and second full-color original image data, the first exposure time being not equal to the second exposure time;
  • fusing the first color original image data and the first full-color original image data into a first color intermediate image containing only first color intermediate image data, and fusing the second color original image data and the second full-color original image data into a second color intermediate image containing only second color intermediate image data;
  • performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image; and
  • performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.


Abstract

A high dynamic range image processing system (100) and method, an electronic device (1000), and a readable storage medium (400). The high dynamic range image processing system (100) includes an image sensor (10), an image fusion module (20), a high dynamic range image processing module (30), and an image processor (40). The pixel array (11) in the image sensor (10) is exposed to obtain multi-frame original images. The image fusion module (20) fuses the original images to obtain intermediate images. The high dynamic range image processing module (30) performs high dynamic range processing on the color intermediate images that have undergone color conversion processing by the image processor (40).

Description

High dynamic range image processing system and method, electronic device, and readable storage medium
Priority Information
This application claims priority to and the benefit of Chinese patent application No. 202010310641.2, filed with the China National Intellectual Property Administration on April 20, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system, a high dynamic range image processing method, an electronic device, and a computer-readable storage medium.
Background
An electronic device such as a mobile phone may be provided with a camera to realize a photographing function. An image sensor for receiving light may be provided in the camera, and a filter array may be provided in the image sensor.
Summary
Embodiments of the present application provide a high dynamic range image processing system, a high dynamic range image processing method, an electronic device, and a computer-readable storage medium.
An embodiment of the present application provides a high dynamic range image processing system. The high dynamic range image processing system includes an image sensor, an image fusion module, a high dynamic range image processing module, and an image processor. The image sensor includes a pixel array including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, the color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels. The pixel array includes minimal repeating units; each minimal repeating unit contains a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The pixel array is exposed for a first exposure time to obtain a first original image, which includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time. The pixel array is exposed for a second exposure time to obtain a second original image, which includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time. The image fusion module is configured to fuse the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and to fuse the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data; both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array. The image processor is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image. The high dynamic range image processing module is configured to perform high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
An embodiment of the present application provides a high dynamic range image processing method, used in a high dynamic range image processing system. The high dynamic range image processing system includes an image sensor; the image sensor includes a pixel array including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, the color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels; the pixel array includes minimal repeating units, each minimal repeating unit contains a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The high dynamic range image processing method includes: exposing the pixel array, wherein the pixel array is exposed for a first exposure time to obtain a first original image, which includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time, and the pixel array is exposed for a second exposure time to obtain a second original image, which includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time; fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data, wherein both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array; performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image; and performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
An embodiment of the present application provides an electronic device. The electronic device includes a lens, a housing, and the high dynamic range image processing system described above. The lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
An embodiment of the present application provides a non-volatile computer-readable storage medium containing a computer program. When the computer program is executed by a processor, the processor is caused to execute the high dynamic range image processing method described above.
Additional aspects and advantages of the embodiments of the present application will be given in part in the following description, and will in part become apparent from the following description or be learned through practice of the present application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the description of the embodiments in conjunction with the following drawings, in which:
FIG. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a pixel array according to an embodiment of the present application;
FIG. 3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application;
FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the arrangement of a minimal repeating unit in a pixel array according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the arrangement of a minimal repeating unit in another pixel array according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the arrangement of a minimal repeating unit in yet another pixel array according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the arrangement of a minimal repeating unit in yet another pixel array according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the arrangement of a minimal repeating unit in yet another pixel array according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the arrangement of a minimal repeating unit in yet another pixel array according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an original image output by an image sensor according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a manner in which an image sensor outputs original image data according to an embodiment of the present application;
FIG. 13 is a schematic diagram of another manner in which an image sensor outputs original image data according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a color intermediate image according to an embodiment of the present application;
FIG. 15 is a schematic diagram of another color intermediate image according to an embodiment of the present application;
FIG. 16 is a schematic diagram of another high dynamic range image processing system according to an embodiment of the present application;
FIG. 17 is a schematic diagram of black level correction according to an embodiment of the present application;
FIG. 18 is a schematic diagram of lens shading correction according to an embodiment of the present application;
FIG. 19 is a schematic diagram of dead pixel compensation according to an embodiment of the present application;
FIGS. 20 to 23 are schematic diagrams of demosaicing according to an embodiment of the present application;
FIG. 24 is a schematic diagram of the mapping relationship between Vout and Vin in tone mapping processing according to an embodiment of the present application;
FIG. 25 is a schematic diagram of brightness alignment processing according to an embodiment of the present application;
FIG. 26 is a schematic diagram of an original image output by another image sensor according to an embodiment of the present application;
FIG. 27 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 28 is a schematic flowchart of a high dynamic range image processing method according to an embodiment of the present application;
FIG. 29 is a schematic diagram of the interaction between a non-volatile computer-readable storage medium and a processor according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are only used to explain the embodiments of the present application, and should not be construed as limiting the embodiments of the present application.
Referring to FIG. 1 and FIG. 2, an embodiment of the present application provides a high dynamic range image processing system 100. The high dynamic range image processing system 100 includes an image sensor 10, an image fusion module 20, a high dynamic range image processing module 30, and an image processor 40. The image sensor 10 includes a pixel array 11, which includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels; the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels. The pixel array 11 includes minimal repeating units; each minimal repeating unit contains a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The pixel array 11 is exposed for a first exposure time to obtain a first original image, which includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image, which includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time. The image fusion module 20 is configured to fuse the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and to fuse the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data; both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array. The image processor 40 is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image. The high dynamic range image processing module 30 is configured to perform high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
Referring to FIG. 1 and FIG. 2, in some embodiments, the pixel array 11 is exposed for a third exposure time to obtain a third original image, which includes third color original image data generated by the single-color photosensitive pixels exposed for the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the third exposure time, wherein the third exposure time is not equal to the first exposure time and not equal to the second exposure time. The image fusion module 20 is further configured to fuse the third color original image data and the third panchromatic original image data into a third color intermediate image containing only third color intermediate image data. The image processor 40 is configured to perform color conversion processing on the third color intermediate image to obtain a color-converted third color intermediate image. The high dynamic range image processing module 30 is configured to perform high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image.
Referring to FIG. 12, in some embodiments, each color original image data is generated by a single single-color photosensitive pixel, each panchromatic original image data is generated by a single panchromatic photosensitive pixel, and the manner in which the image sensor 10 outputs the plurality of original image data includes alternately outputting one color original image data and one panchromatic original image data.
Referring to FIG. 13, in some embodiments, each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit, each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit, and the manner in which the image sensor 10 outputs the plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of panchromatic original image data.
Referring to FIG. 16, in some embodiments, the image processor 40 includes an image preprocessing module 41 and an image post-processing module 42. The image preprocessing module 41 is configured to perform image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image, and to perform image preprocessing on the second color intermediate image to obtain a preprocessed second color intermediate image. The image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the color-converted first color intermediate image and the color-converted second color intermediate image.
Referring to FIG. 16, in some embodiments, the image processor 40 includes an image preprocessing module 41 and an image post-processing module 42. The image preprocessing module 41 is configured to perform image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image, on the second color intermediate image to obtain a preprocessed second color intermediate image, and on the third color intermediate image to obtain a preprocessed third color intermediate image. The image post-processing module 42 is configured to perform color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
In some embodiments, the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
Referring to FIG. 16, in some embodiments, the high dynamic range image processing system 100 further includes a storage module 50. The storage module 50 is configured to store the color-converted images and to transmit the color-converted images to the high dynamic range image processing module 30 for high dynamic range image processing, so as to obtain a color high dynamic range image.
In some embodiments, the image fusion module 20 is integrated in the image sensor 10.
Referring to FIG. 2 and FIG. 28, the present application provides a high dynamic range image processing method. The high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100. The high dynamic range image processing system 100 may include the image sensor 10. The image sensor 10 includes a pixel array 11 including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels; the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels. The pixel array 11 includes minimal repeating units; each minimal repeating unit contains multiple subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The high dynamic range image processing method includes:
01: exposing the pixel array 11, wherein the pixel array 11 is exposed for a first exposure time to obtain a first original image, which includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time; and the pixel array is exposed for a second exposure time to obtain a second original image, which includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time;
02: fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data, wherein both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array;
03: performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image;
04: performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
In some embodiments, the high dynamic range image processing method further includes: exposing the pixel array for a third exposure time to obtain a third original image, which includes third color original image data generated by the single-color photosensitive pixels exposed for the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the third exposure time, wherein the third exposure time is not equal to the first exposure time and not equal to the second exposure time; fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only third color intermediate image data, the third color intermediate image including a plurality of color image pixels arranged in a Bayer array; and performing color conversion processing on the third color intermediate image to obtain a color-converted third color intermediate image. The step of performing high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image includes: performing high dynamic range processing on the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image to obtain a color high dynamic range image.
In some embodiments, in the high dynamic range image processing method, each color original image data is generated by a single single-color photosensitive pixel and each panchromatic original image data is generated by a single panchromatic photosensitive pixel, and the manner in which the image sensor 10 (shown in FIG. 1) outputs the plurality of original image data includes alternately outputting one color original image data and one panchromatic original image data.
In some embodiments, each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit, and each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit. The manner in which the image sensor 10 (shown in FIG. 1) outputs the plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of panchromatic original image data.
In some embodiments, the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image, and performing image preprocessing on the second color intermediate image to obtain a preprocessed second color intermediate image. The step of performing color conversion processing on the first color intermediate image and the second color intermediate image to obtain the color-converted first color intermediate image and the color-converted second color intermediate image includes: performing color conversion processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the color-converted first color intermediate image and the color-converted second color intermediate image.
In some embodiments, the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain a preprocessed second color intermediate image; and performing image preprocessing on the third color intermediate image to obtain a preprocessed third color intermediate image. The step of performing color conversion processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image includes: performing color conversion processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the color-converted first color intermediate image, the color-converted second color intermediate image, and the color-converted third color intermediate image.
In some embodiments, the image preprocessing includes at least one of black level correction, lens shading correction, dead pixel compensation, demosaicing, color correction, and global tone mapping.
In some embodiments, the high dynamic range image processing system includes a storage module; the color-converted images are stored in the storage module, then obtained from the storage module and subjected to high dynamic range image processing to obtain a color high dynamic range image.
Referring to FIG. 27, the present application also provides an electronic device 1000. The electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100. The high dynamic range image processing system 100 includes the image sensor 10, the image fusion module 20, the high dynamic range image processing module 30, and the image processor 40. The image sensor 10 includes a pixel array 11, which includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels; the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels. The pixel array 11 includes minimal repeating units; each minimal repeating unit contains a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The pixel array 11 is exposed for a first exposure time to obtain a first original image, which includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image, which includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time, wherein the first exposure time is not equal to the second exposure time. The image fusion module 20 is configured to fuse the first color original image data and the first panchromatic original image data into a first color intermediate image containing only first color intermediate image data, and to fuse the second color original image data and the second panchromatic original image data into a second color intermediate image containing only second color intermediate image data; both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array. The image processor 40 is configured to perform color conversion processing on the first color intermediate image and the second color intermediate image to obtain a color-converted first color intermediate image and a color-converted second color intermediate image. The high dynamic range image processing module 30 is configured to perform high dynamic range processing on the color-converted first color intermediate image and the color-converted second color intermediate image to obtain a color high dynamic range image.
Referring to FIG. 29, the present application also provides a non-volatile computer-readable storage medium 400 containing a computer program. When the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method of any one of the above embodiments.
The high dynamic range image processing system 100 of the embodiment of the present application uses the image fusion module 20 to apply fusion algorithm processing in advance to the multi-frame original images output by the image sensor 10, so as to obtain multi-frame color intermediate images whose image pixels are arranged in a Bayer array. In this way, the multi-frame color intermediate images can be processed by the image processor 40, which solves the problem that an image processor cannot directly process images whose image pixels are arranged in a non-Bayer array.
FIG. 2 is a schematic diagram of the image sensor 10 in an embodiment of the present application. The image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14, and a horizontal driving unit 15.
For example, the image sensor 10 may use a complementary metal oxide semiconductor (CMOS) photosensitive element or a charge-coupled device (CCD) photosensitive element.
For example, the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (i.e., in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4). Each photosensitive pixel 110 converts light into charge according to the intensity of the light incident on it.
For example, the vertical driving unit 12 includes a shift register and an address decoder. The vertical driving unit 12 provides readout scanning and reset scanning functions. Readout scanning refers to sequentially scanning the unit photosensitive pixels 110 row by row and reading signals from these unit photosensitive pixels 110 row by row. For example, the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14. Reset scanning is used to reset the charge: the photocharge of the photoelectric conversion element is discarded so that accumulation of new photocharge can begin.
For example, the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing. In CDS processing, the reset level and the signal level output by each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated. Thus, the signals of the photosensitive pixels 110 in one row are obtained. The column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
例如,水平驱动单元15包括移位寄存器和地址译码器。水平驱动单元15顺序逐列扫描像素阵列11。通过水平驱动单元15执行的选择扫描操作,每一感光像素列被列处理单元14顺序地处理,并且被顺序输出。
例如,控制单元13根据操作模式配置时序信号,利用多种时序信号来控制垂直驱动单元12、列处理单元14和水平驱动单元15协同工作。
图3是本申请实施方式中一种感光像素110的示意图。感光像素110包括像素电路111、滤光片112、及微透镜113。沿感光像素110的收光方向,微透镜113、滤光片112、及像素电路111依次设置。微透镜113用于汇聚光线,滤光片112用于供某一波段的光线通过并过滤掉其余波段的光线。像素电路111用于将接收到的光线转换为电信号,并将生成的电信号提供给图2所示的列处理单元14。
图4是本申请实施方式中一种感光像素110的像素电路111的示意图。图4中像素电路111可应用在图2所示的像素阵列11内的每个感光像素110(图3所示)中。下面结合图2至图4对像素电路111的工作原理进行说明。
如图4所示,像素电路111包括光电转换元件1111(例如,光电二极管)、曝光控制电路(例如,转移晶体管1112)、复位电路(例如,复位晶体管1113)、放大电路(例如,放大晶体管1114)和选择电路(例如,选择晶体管1115)。在本申请的实施例中,转移晶体管1112、复位晶体管1113、放大晶体管1114和选择晶体管1115例如是MOS管,但不限于此。
例如,光电转换元件1111包括光电二极管,光电二极管的阳极例如连接到地。光电二极管将所接收的光转换为电荷。光电二极管的阴极经由曝光控制电路(例如,转移晶体管1112)连接到浮动扩散单元FD。浮动扩散单元FD与放大晶体管1114的栅极、复位晶体管1113的源极连接。
例如,曝光控制电路为转移晶体管1112,曝光控制电路的控制端TG为转移晶体管1112的栅极。当有效电平(例如,VPIX电平)的脉冲通过曝光控制线传输到转移晶体管1112的栅极时,转移晶体管1112导通。转移晶体管1112将光电二极管光电转换的电荷传输到浮动扩散单元FD。
例如,复位晶体管1113的漏极连接到像素电源VPIX。复位晶体管1113的源极连接到浮动扩散单元FD。在电荷被从光电二极管转移到浮动扩散单元FD之前,有效复位电平的脉冲经由复位线传输到复位晶体管1113的栅极,复位晶体管1113导通。复位晶体管1113将浮动扩散单元FD复位到像素电源VPIX。
例如,放大晶体管1114的栅极连接到浮动扩散单元FD。放大晶体管1114的漏极连接到像素电源VPIX。在浮动扩散单元FD被复位晶体管1113复位之后,放大晶体管1114经由选择晶体管1115通过输出端OUT输出复位电平。在光电二极管的电荷被转移晶体管1112转移之后,放大晶体管1114经由选择晶体管1115通过输出端OUT输出信号电平。
例如,选择晶体管1115的漏极连接到放大晶体管1114的源极。选择晶体管1115的源极通过输出端OUT连接到图2中的列处理单元14。当有效电平的脉冲通过选择线被传输到选择晶体管1115的栅极时,选择晶体管1115导通。放大晶体管1114输出的信号通过选择晶体管1115传输到列处理单元14。
需要说明的是,本申请实施例中像素电路111的像素结构并不限于图4所示的结构。例如,像素电路111也可以具有三晶体管像素结构,其中放大晶体管1114和选择晶体管1115的功能由一个晶体管完成。例如,曝光控制电路也不局限于单个转移晶体管1112的方式,其它具有控制端控制导通功能的电子器件或结构均可以作为本申请实施例中的曝光控制电路,本申请实施方式中的单个转移晶体管1112的实施方式简单、成本低、易于控制。
图5至图10是本申请某些实施方式的像素阵列11(图2所示)中的感光像素110(图3所示)的排布示意图。感光像素110包括两类,一类为全色感光像素W,另一类为彩色感光像素。图5至图10仅示出了一个最小重复单元中的多个感光像素110的排布。对图5至图10所示的最小重复单元在行和列上多次复制,即可形成像素阵列11。每个最小重复单元均由多个全色感光像素W和多个彩色感光像素组成。每个最小重复单元包括多个子单元。每个子单元内包括多个单颜色感光像素和多个全色感光像素W。其中,图5至图8所示的最小重复单元中,每个子单元中的全色感光像素W和彩色感光像素交替设置。图9和图10所示的最小重复单元中,每个子单元中,同一行的多个感光像素110为同一类别的感光像素110;或者,同一列的多个感光像素110为同一类别的感光像素110。
具体地,例如,图5为本申请一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为4行4列16个感光像素110,子单元为2行2列4个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000001
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
例如,如图5所示,对于每个子单元,全色感光像素W和单颜色感光像素交替设置。
例如,如图5所示,子单元的类别包括三类。其中,第一类子单元UA包括多个全色感光像素W和多个第一颜色感光像素A;第二类子单元UB包括多个全色感光像素W和多个第二颜色感光像素B;第三类子单元UC包括多个全色感光像素W和多个第三颜色感光像素C。每个最小重复单元包括四个子单元,分别为一个第一类子单元UA、两个第二类子单元UB及一个第三类子单元UC。其中,一个第一类子单元UA与一个第三类子单元UC设置在第一对角线方向D1(例如图5中左上角和右下角连接的方向),两个第二类子单元UB设置在第二对角线方向D2(例如图5中右上角和左下角连接的方向)。第一对角线方向D1与第二对角线方向D2不同。例如,第一对角线和第二对角线垂直。
需要说明的是,在其他实施方式中,第一对角线方向D1也可以是右上角和左下角连接的方向,第二对角线方向D2也可以是左上角和右下角连接的方向。另外,这里的“方向”并非单一指向,可以理解为指示排布的“直线”的概念,可以有直线两端的双向指向。下文图6至图10中对第一对角线方向D1及第二对角线方向D2的解释与此处相同。
再例如,图6为本申请另一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为6行6列36个感光像素110,子单元为3行3列9个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000002
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
例如,如图6所示,对于每个子单元,全色感光像素W和单颜色感光像素交替设置。
例如,如图6所示,子单元的类别包括三类。其中,第一类子单元UA包括多个全色感光像素W和多个第一颜色感光像素A;第二类子单元UB包括多个全色感光像素W和多个第二颜色感光像素B;第三类子单元UC包括多个全色感光像素W和多个第三颜色感光像素C。每个最小重复单元包括四个子单元,分别为一个第一类子单元UA、两个第二类子单元UB及一个第三类子单元UC。其中,一个第一类子单元UA与一个第三类子单元UC设置在第一对角线方向D1,两个第二类子单元UB设置在第二对角线方向D2。第一对角线方向D1与第二对角线方向D2不同。例如,第一对角线和第二对角线垂直。
再例如,图7为本申请又一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为8行8列64个感光像素110,子单元为4行4列16个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000003
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
例如,如图7所示,对于每个子单元,全色感光像素W和单颜色感光像素交替设置。
例如,如图7所示,子单元的类别包括三类。其中,第一类子单元UA包括多个全色感光像素W和多个第一颜色感光像素A;第二类子单元UB包括多个全色感光像素W和多个第二颜色感光像素B;第三类子单元UC包括多个全色感光像素W和多个第三颜色感光像素C。每个最小重复单元包括四个子单元,分别为一个第一类子单元UA、两个第二类子单元UB及一个第三类子单元UC。其中,一个第一类子单元UA与一个第三类子单元UC设置在第一对角线方向D1,两个第二类子单元UB设置在第二对角线方向D2。第一对角线方向D1与第二对角线方向D2不同。例如,第一对角线和第二对角线垂直。
具体地,例如,图8为本申请再一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为4行4列16个感光像素110,子单元为2行2列4个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000004
Figure PCTCN2020119963-appb-000005
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
图8所示的最小重复单元中感光像素110的排布与图5所示的最小重复单元中感光像素110的排布大致相同,其不同之处在于,图8中位于左下角的第二类子单元UB中的全色感光像素W与单颜色感光像素的交替顺序与图5中位于左下角的第二类子单元UB中的全色感光像素W与单颜色感光像素的交替顺序不一致,并且,图8中的第三类子单元UC中的全色感光像素W与单颜色感光像素的交替顺序与图5中位于右下角的第三类子单元UC中的全色感光像素W与单颜色感光像素的交替顺序也不一致。具体地,图5中位于左下角的第二类子单元UB中,第一行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第二颜色感光像素B),第二行的感光像素110的交替顺序为单颜色感光像素(即第二颜色感光像素B)、全色感光像素W;而图8中位于左下角的第二类子单元UB中,第一行的感光像素110的交替顺序为单颜色感光像素(即第二颜色感光像素B)、全色感光像素W,第二行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第二颜色感光像素B)。图5中位于右下角的第三类子单元UC中,第一行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第三颜色感光像素C),第二行的感光像素110的交替顺序为单颜色感光像素(即第三颜色感光像素C)、全色感光像素W;而图8中位于右下角的第三类子单元UC中,第一行的感光像素110的交替顺序为单颜色感光像素(即第三颜色感光像素C)、全色感光像素W,第二行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第三颜色感光像素C)。
如图8所示,图8中的第一类子单元UA中的全色感光像素W与单颜色感光像素的交替顺序与第三类子单元UC中的全色感光像素W与单颜色感光像素的交替顺序不一致。具体地,图8所示的第一类子单元UA中,第一行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第一颜色感光像素A),第二行的感光像素110的交替顺序为单颜色感光像素(即第一颜色感光像素A)、全色感光像素W;而图8所示的第三类子单元UC中,第一行的感光像素110的交替顺序为单颜色感光像素(即第三颜色感光像素C)、全色感光像素W,第二行的感光像素110的交替顺序为全色感光像素W、单颜色感光像素(即第三颜色感光像素C)。也即是说,同一最小重复单元中,不同子单元内的全色感光像素W与彩色感光像素的交替顺序可以是一致的(如图5所示),也可以是不一致的(如图8所示)。
再例如,图9为本申请还一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为4行4列16个感光像素110,子单元为2行2列4个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000006
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
例如,如图9所示,对于每个子单元,同一行的多个感光像素110为同一类别的感光像素110。其中,同一类别的感光像素110包括:(1)均为全色感光像素W;(2)均为第一颜色感光像素A;(3)均为第二颜色感光像素B;(4)均为第三颜色感光像素C。
例如,如图9所示,子单元的类别包括三类。其中,第一类子单元UA包括多个全色感光像素W和多个第一颜色感光像素A;第二类子单元UB包括多个全色感光像素W和多个第二颜色感光像素B;第三类子单元UC包括多个全色感光像素W和多个第三颜色感光像素C。每个最小重复单元包括四个子单元,分别为一个第一类子单元UA、两个第二类子单元UB及一个第三类子单元UC。其中,一个第一类子单元UA与一个第三类子单元UC设置在第一对角线方向D1,两个第二类子单元UB设置在第二对角线方向D2。第一对角线方向D1与第二对角线方向D2不同。例如,第一对角线和第二对角线垂直。
再例如,图10为本申请还一个实施例的最小重复单元中感光像素110(图3所示)的排布示意图。其中,最小重复单元为4行4列16个感光像素110,子单元为2行2列4个感光像素110。排布方式为:
Figure PCTCN2020119963-appb-000007
W表示全色感光像素;A表示多个彩色感光像素中的第一颜色感光像素;B表示多个彩色感光像素中的第二颜色感光像素;C表示多个彩色感光像素中的第三颜色感光像素。
例如,如图10所示,对于每个子单元,同一列的多个感光像素110为同一类别的感光像素110。其中,同一类别的感光像素110包括:(1)均为全色感光像素W;(2)均为第一颜色感光像素A;(3)均 为第二颜色感光像素B;(4)均为第三颜色感光像素C。
例如,如图10所示,子单元的类别包括三类。其中,第一类子单元UA包括多个全色感光像素W和多个第一颜色感光像素A;第二类子单元UB包括多个全色感光像素W和多个第二颜色感光像素B;第三类子单元UC包括多个全色感光像素W和多个第三颜色感光像素C。每个最小重复单元包括四个子单元,分别为一个第一类子单元UA、两个第二类子单元UB及一个第三类子单元UC。其中,一个第一类子单元UA与一个第三类子单元UC设置在第一对角线方向D1,两个第二类子单元UB设置在第二对角线方向D2。第一对角线方向D1与第二对角线方向D2不同。例如,第一对角线和第二对角线垂直。
例如,在其他实施方式中,同一最小重复单元中,也可以是部分子单元内的同一行的多个感光像素110为同一类别的感光像素110,其余部分子单元内的同一列的多个感光像素110为同一类别的感光像素110。
例如,如图5至图10所示的最小重复单元中,第一颜色感光像素A可以为红色感光像素R;第二颜色感光像素B可以为绿色感光像素G;第三颜色感光像素C可以为蓝色感光像素Bu。
例如,如图5至图10所示的最小重复单元中,第一颜色感光像素A可以为红色感光像素R;第二颜色感光像素B可以为黄色感光像素Y;第三颜色感光像素C可以为蓝色感光像素Bu。
例如,如图5至图10所示的最小重复单元中,第一颜色感光像素A可以为品红色感光像素M;第二颜色感光像素B可以为青色感光像素Cy;第三颜色感光像素C可以为黄色感光像素Y。
需要说明的是,在一些实施例中,全色感光像素W的响应波段可为可见光波段(例如,400nm-760nm)。例如,全色感光像素W上设置有红外滤光片,以实现红外光的滤除。在另一些实施例中,全色感光像素W的响应波段为可见光波段和近红外波段(例如,400nm-1000nm),与图像传感器10(图1所示)中的光电转换元件1111(图4所示)的响应波段相匹配。例如,全色感光像素W可以不设置滤光片或者设置可供所有波段的光线通过的滤光片,全色感光像素W的响应波段由光电转换元件1111的响应波段确定,即两者相匹配。本申请的实施例包括但不局限于上述波段范围。
为了方便说明,以下实施例均以第一颜色感光像素A为红色感光像素R,第二颜色感光像素B为绿色感光像素G,第三颜色感光像素C为蓝色感光像素Bu进行说明。
请参阅图1、图2、图3、图4及图11,在某些实施方式中,控制单元13控制像素阵列11曝光。其中,像素阵列11以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素W生成的第一全色原始图像数据。像素阵列11以第二曝光时间曝光得到第二原始图像。第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素W生成的第二全色原始图像数据;其中,第一曝光时间不等于第二曝光时间。
具体地,像素阵列11进行两次曝光。例如,如图11所示,在第一次曝光中,像素阵列11以第一曝光时间L(例如表示长曝光时间)曝光得到第一原始图像。第一原始图像包括以第一曝光时间L曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间L曝光的全色感光像素W生成的第一全色原始图像数据。在第二次曝光中,像素阵列11以第二曝光时间S(例如表示短曝光时间)曝光得到第二原始图像。第二原始图像包括以第二曝光时间S曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间S曝光的全色感光像素W生成的第二全色原始图像数据。需要说明的是,像素阵列11也可以先进行短曝光,再进行长曝光,在此不作限制。
像素阵列11曝光完毕后,图像传感器10可以输出像素阵列11生成的多个原始图像数据,多个原始图像数据形成原始图像。
在一个例子中,每帧原始图像(第一原始图像、第二原始图像及下文的第三原始图像)中的每个彩色原始图像数据均由单个单颜色感光像素生成,每个全色原始图像数据均由单个全色感光像素W生成,图像传感器10输出多个原始图像数据的输出方式可以为一个彩色原始图像数据与一个全色原始图像数据交替输出。
具体地,像素阵列11曝光后,每一个单颜色感光像素生成一个与该单颜色感光像素对应的彩色原始图像数据,每一个全色感光像素W生成一个与该全色感光像素W对应的全色原始图像数据。并且,对于处于同一行的多个感光像素110而言,该多个感光像素110生成的原始图像数据的输出方式为:一个彩色原始图像数据与一个全色原始图像数据交替输出。在同一行的多个原始图像数据输出完毕后,再输出下一行的多个原始图像数据,每一行的多个原始图像数据的输出方式均为一个彩色原始图像数据与一个全色原始图像数据交替输出。如此,图像传感器10依次输出多个原始图像数据,多个原始图像数据形成一张原始图像。需要说明的是,一个彩色原始图像数据和一个全色原始图像数据交替输出可以包括以下两种:(1)先输出一个彩色原始图像数据,再输出一个全色原始图像数据;(2)先输出一个全色原始图像数据,再输出一个彩色原始图像数据。具体的交替顺序与像素阵列11中的全色感光像素W与彩色感光像素的排布相关。当处于像素阵列11的第0行第0列的感光像素110为彩色感光像素时,则交替顺序为(1);当处于像素阵列11的第0行第0列的感光像素110为全色感光像素W时,则交替顺序为(2)。
下面以图12为例对原始图像数据的输出方式做说明。请结合图1、图2、图3及图12,假设像素阵列11包括8*8个感光像素110,且像素阵列11的第0行第0列的感光像素110为全色感光像素W,则当像素阵列11曝光完成后,图像传感器10先输出第0行第0列的全色感光像素p00生成的全色原始图像数据,该全色原始图像数据对应的图像像素P00位于原始图像的第0行第0列;随后,图像传感器10再输出第0行第1列的彩色感光像素p01生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P01位于原始图像的第0行第1列;…;图像传感器10输出第0行第7列的彩色感光像素p07生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P07位于原始图像的第0行第7列。至此,像素阵列11第0行内8个感光像素110生成的原始图像数据均被输出。随后,图像传感器10再依次输出像素阵列11第1行内8个感光像素110生成的原始图像数据;随后,图像传感器10再依次输出像素阵列11第2行内8个感光像素110生成的原始图像数据;以此类推,直至图像传感器10输出第7行第7列的全色感光像素p77生成的全色原始图像数据为止。如此,多个感光像素110生成的原始图像数据形成一帧原始图像,其中,每个感光像素110生成的原始图像数据对应的图像像素在原始图像中的位置与该感光像素110在像素阵列11中的位置相对应。
在另一个例子中,每帧原始图像(第一原始图像、第二原始图像及下文的第三原始图像)中的每个彩色原始图像数据由同一子单元中的多个单颜色感光像素共同生成,每个全色原始图像数据由同一子单元中的多个全色感光像素W共同生成,图像传感器10输出多个原始图像数据的输出方式包括多个彩色原始图像数据与多个全色原始图像数据交替输出。
具体地,像素阵列11曝光后,同一个子单元中的多个单颜色感光像素共同生成一个与该子单元对应的彩色原始图像数据,同一个子单元中的多个全色感光像素W共同生成一个与该子单元对应的全色原始图像数据,也即,一个子单元对应有一个彩色原始图像数据及一个全色原始图像数据。并且,对于处于同一行的多个子单元而言,该多个子单元对应的原始图像数据的输出方式为:同一行的多个子单元对应的多个彩色原始图像数据与多个全色原始图像数据交替输出,其中,多个彩色原始图像数据的输出方式为多个彩色原始图像依次接连输出;多个全色原始图像数据的输出方式为多个全色原始图像数据依次接连输出。在同一行的多个原始图像数据输出完毕后,再输出下一行的多个原始图像数据,每一行的多个原始图像数据的输出方式均为多个彩色原始图像数据与多个全色原始图像数据交替输出。如此,图像传感器10依次输出多个原始图像数据,多个原始图像数据形成一张原始图像。需要说明的是,多个彩色原始图像数据和多个全色原始图像数据交替输出可以包括以下两种:(1)先依次接连输出多个彩色原始图像数据,再依次接连输出多个全色原始图像数据;(2)先依次接连输出多个全色原始图像数据,再依次接连输出多个彩色原始图像数据。具体的交替顺序与像素阵列11中的全色感光像素W与彩色感光像素的排布相关。当处于像素阵列11的0行0列的感光像素110为彩色感光像素时,则交替顺序为(1);当处于像素阵列11的0行0列的感光像素110为全色感光像素W时,则交替顺序为(2)。
下面以图13为例对原始图像数据的输出方式做说明。请结合图1、图2、图3及图13,假设像素阵列11包括8*8个感光像素110。像素阵列11中的全色感光像素p00、全色感光像素p11、彩色感光像素p01及彩色感光像素p10构成子单元U1;全色感光像素p02、全色感光像素p13、彩色感光像素p03及彩色感光像素p12构成子单元U2;全色感光像素p04、全色感光像素p15、彩色感光像素p05及彩色感光像素p14构成子单元U3;全色感光像素p06、全色感光像素p17、彩色感光像素p07及彩色感光像素p16构成子单元U4,其中,子单元U1、子单元U2、子单元U3及子单元U4位于同一行。由于像素阵列11的第0行0列的感光像素110为全色感光像素W,则当像素阵列11曝光完成后,图像传感器10先输出子单元U1中全色感光像素p00和全色感光像素p11共同生成的全色原始图像数据,该全色原始图像数据对应的图像像素P00位于原始图像的第0行第0列;随后,图像传感器10再输出子单元U2中全色感光像素p02和全色感光像素p13共同生成的全色原始图像数据,该全色原始图像数据对应的图像像素P01位于原始图像的第0行第1列;随后,图像传感器10再输出子单元U3中全色感光像素p04和全色感光像素p15共同生成的全色原始图像数据,该全色原始图像数据对应的图像像素P02位于原始图像的第0行第2列;随后,图像传感器10再输出子单元U4中全色感光像素p06和全色感光像素p17共同生成的全色原始图像数据,该全色原始图像数据对应的图像像素P03位于原始图像的第0行第3列。至此,处于第一行的多个子单元对应的多个全色原始图像数据均已输出。随后,图像传感器10先输出子单元U1中彩色感光像素p01和彩色感光像素p10共同生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P10位于原始图像的第1行第0列;随后,图像传感器10再输出子单元U2中彩色感光像素p03和彩色感光像素p12共同生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P11位于原始图像的第1行第1列;随后,图像传感器10再输出子单元U3中彩色感光像素p05和彩色感光 像素p14共同生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P12位于原始图像的第1行第2列;随后,图像传感器10再输出子单元U4中彩色感光像素p07和彩色感光像素p16共同生成的彩色原始图像数据,该彩色原始图像数据对应的图像像素P13位于原始图像的第1行第3列。至此,处于第一行的多个子单元对应的多个彩色原始图像数据也均已输出。随后,图像传感器10再对处于第二行的多个子单元对应的多个全色原始图像数据及多个彩色原始图像数据进行输出,处于第二行的多个子单元对应的多个全色原始图像数据及多个彩色原始图像数据的输出方式与处于第一行的多个子单元对应的多个全色原始图像数据及多个彩色原始图像数据的输出方式相同,在此不再赘述。以此类推,直至图像传感器10输出完处于第四行的多个子单元对应的多个全色原始图像数据及多个彩色原始图像数据为止。如此,多个感光像素110生成的原始图像数据形成一帧原始图像。
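同一子单元内多个同色感光像素共同生成一个原始图像数据的过程,可以用如下Python代码示意(以求和方式合并仅为示意性假设,实际的合并方式由传感器硬件决定,函数名为说明而设):

```python
def bin_subunit(values):
    """将同一子单元中多个感光像素的读数合并为一个原始图像数据(此处以求和示意)。"""
    return sum(values)

# 子单元U1中两个全色感光像素p00与p11的读数合并为一个全色原始图像数据(对应图像像素P00)
p00_data = bin_subunit([120, 130])  # -> 250
```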
请参阅图1、图2及图11,图像传感器10输出第一原始图像及第二原始图像后,将第一原始图像及第二原始图像传输至图像融合模块20进行图像融合处理以获得第一彩色中间图像及第二彩色中间图像。具体地,图像融合模块20对第一原始图像中的第一彩色原始图像数据及第一全色原始图像数据进行融合,以获得仅包含第一彩色中间图像数据的第一彩色中间图像,并对第二原始图像中的第二彩色原始图像数据及第二全色原始图像数据进行融合,以获得仅包含第二彩色中间图像数据的第二彩色中间图像,第一彩色中间图像和第二彩色中间图像均包含多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布。
具体地,当图像传感器10输出多个原始图像数据的输出方式为一个彩色原始图像数据与一个全色原始图像数据交替输出时,如图14所示,图像融合模块20融合彩色原始图像数据及全色原始图像数据后得到的彩色中间图像包括多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布。并且,该彩色中间图像的分辨率与像素阵列11的分辨率相同。
当图像传感器10输出多个原始图像数据的输出方式包括多个彩色原始图像数据与多个全色原始图像数据交替输出时,如图15所示,图像融合模块20融合彩色原始图像数据及全色原始图像数据后得到的彩色中间图像包括多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布。并且,该彩色中间图像的分辨率与像素阵列11的分辨率相同。
在某些实施方式中,在图像传感器10工作在高分辨率模式下时,可以采用一个彩色原始图像数据与一个全色原始图像数据交替输出的方式进行原始图像数据的输出。在图像传感器10工作在低分辨率模式下时,可以采用多个彩色原始图像数据与多个全色原始图像数据交替输出的方式进行原始图像数据的输出。示例地,在环境亮度较高时图像传感器10可以工作在高分辨率模式,有利于提升最终获取的图像的清晰度;在环境亮度较低时图像传感器10可以工作在低分辨率模式,有利于提升最终获取的图像的亮度。
需要说明的是,图像融合模块20可以集成在图像传感器10中,也可以集成在图像处理器40中,还可以单独设置在图像传感器10及图像处理器40之外。
高动态范围图像处理系统100还包括图像处理器40。请参阅图16,图像处理器40包括图像预处理模块41,图像融合模块20获得第一彩色中间图像及第二彩色中间图像后,将这两张图像传输至图像预处理模块41进行图像预处理。图像预处理模块41对第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像,对第二彩色中间图像进行预处理以获得预处理后的第二彩色中间图像。
需要说明的是,图像预处理包括黑电平校正、镜头阴影校正、坏点补偿中的至少一种。例如,图像预处理仅包括黑电平校正;或者,图像预处理包括镜头阴影校正和坏点补偿;或者,图像预处理包括黑电平校正处理和镜头阴影校正;或者,图像预处理包括黑电平校正、镜头阴影校正和坏点补偿。
图像传感器10采集的信息需经过一系列转换才能生成原始图像。以8bit数据为例,单个图像像素的有效值是0~255,但是实际图像传感器10中的模数转换芯片的精度可能无法将电压值很小的一部分转换出来,便容易造成生成图像的暗部细节的损失。黑电平校正的过程可以是,图像预处理模块41在获得图像融合模块20融合后的彩色中间图像的基础上,将每个像素值(即每个彩色中间图像数据)减去一个固定值。各颜色通道的像素值对应的固定值可以是一样的,也可以是不一样的。以图像预处理模块41对第一彩色中间图像进行黑电平校正为例,第一彩色中间图像中具有红色通道的像素值、绿色通道的像素值和蓝色通道的像素值。请参阅图17,图像预处理模块41对第一彩色中间图像进行黑电平校正,第一彩色中间图像中所有的像素值均减去固定值5,从而得到经过黑电平校正的第一彩色中间图像。同时图像传感器10在ADC的输入之前加上一个固定的偏移量5(或者其他数值),使输出的像素值在5(或者其他数值)~255之间,配合黑电平校正,能使得本申请实施方式的图像传感器10和高动态范围图像处理系统100得到的图像的暗部的细节完全保留的同时,不增大或减小图像的像素值,有利于提高成像质量。
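上述黑电平校正过程可以用如下Python代码示意(仅为示意性草图:固定偏移量5沿用上文示例,函数与变量名为说明而设,并非本申请的实际实现):

```python
def black_level_correction(pixels, offset=5):
    """对每个像素值减去固定黑电平偏移量, 并将结果下限限制为0(示意实现)。"""
    return [[max(p - offset, 0) for p in row] for row in pixels]

# 示例: 8bit像素值减去黑电平偏移量5
image = [[5, 10, 255], [6, 5, 100]]
corrected = black_level_correction(image)
# corrected == [[0, 5, 250], [1, 0, 95]]
```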
镜头阴影是由于镜头对光线的折射不均匀导致的镜头周围出现阴影的情况,即影像区的中心和四周接收到的光强程度不一致的现象。镜头阴影校正的过程可以是,图像预处理模块41在经过黑电平校正的第一彩色中间图像及经过黑电平校正的第二彩色中间图像的基础上,将被处理图像进行网格划分,再通过各网格区域自身及其邻近网格的补偿系数,采用双线性插值方法对图像进行镜头阴影校正。镜头阴影校正的过程也可以是,图像预处理模块41直接对第一彩色中间图像及第二彩色中间图像进行网格划分,再通过各网格区域自身及其邻近网格的补偿系数,采用双线性插值方法对图像进行镜头阴影校正。下文以对第一彩色中间图像进行镜头阴影校正为例进行说明,如图18所示,图像预处理模块41将第一彩色中间图像(即被处理图像)进行划分,均等地分为十六个网格,十六个网格中每个网格具有一预设好的补偿系数。然后,图像预处理模块41根据各网格区域自身及其邻近网格的补偿系数,通过双线性插值方法对图像进行阴影校正。R2为图示的经过镜头阴影校正的第一彩色中间图像中虚线框内的像素值,R1为图示的第一彩色中间图像中的虚线框内像素值。R2=R1*k1,k1由R1像素邻近的网格的补偿系数1.10、1.04、1.05和1.09进行双线性插值获得。设图像的坐标记为(x,y),x从左第一个图像像素开始往右计数,y从上第一个图像像素开始往下计数,x和y均为自然数,如图像边上的标识所示。例如,R1的坐标为(3,3),则R1在各网格补偿系数图中的坐标应为(0.75,0.75)。f(x,y)表示各网格补偿系数图中坐标为(x,y)的补偿值。则f(0.75,0.75)为R1在各网格补偿系数图中对应的补偿系数值。双线性插值的插值公式可以为f(i+u,j+v)=(1-u)(1-v)f(i,j)+(1-u)vf(i,j+1)+u(1-v)f(i+1,j)+uvf(i+1,j+1),其中,x=i+u,i为x的整数部分,u为x的小数部分,j为y的整数部分,v为y的小数部分。则有f(0.75,0.75)=0.25*0.25*f(0,0)+0.25*0.75*f(0,1)+0.75*0.25*f(1,0)+0.75*0.75*f(1,1)=0.0625*1.11+0.1875*1.10+0.1875*1.09+0.5625*1.03。各网格的补偿系数在图像预处理模块41进行镜头阴影校正之前已经预先设置。各网格的补偿系数可由如下方法确定:(1)将镜头300置于光线强度和色温恒定且均一的密闭装置内,并使镜头300在该密闭装置内正对亮度分布均匀的纯灰色的目标对象拍摄得到灰度图像;(2)将灰度图像进行网格划分(例如划分为16个网格),得到划分为不同网格区域的灰度图像;(3)计算灰度图像的不同网格区域的补偿系数。确定了镜头300的补偿系数之后,本申请的高动态范围图像处理系统100将该补偿系数预先设置在图像预处理模块41中,当高动态范围图像处理系统100中的图像预处理模块41对图像进行镜头阴影校正时,该补偿系数被获取,图像预处理模块41再根据各网格区域的补偿系数,采用双线性插值方法对图像进行镜头阴影校正。
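上述双线性插值公式可以用如下Python代码示意并验证(网格补偿系数f(0,0)=1.11、f(0,1)=1.10、f(1,0)=1.09、f(1,1)=1.03沿用上文示例,二维列表的索引约定为假设):

```python
def bilerp(f, x, y):
    """按公式 f(i+u,j+v)=(1-u)(1-v)f(i,j)+(1-u)v*f(i,j+1)+u(1-v)*f(i+1,j)+uv*f(i+1,j+1) 插值。"""
    i, u = int(x), x - int(x)
    j, v = int(y), y - int(y)
    return ((1 - u) * (1 - v) * f[i][j] + (1 - u) * v * f[i][j + 1]
            + u * (1 - v) * f[i + 1][j] + u * v * f[i + 1][j + 1])

grid = [[1.11, 1.10],   # f(0,0), f(0,1)
        [1.09, 1.03]]   # f(1,0), f(1,1)
k1 = bilerp(grid, 0.75, 0.75)
# k1 即 0.0625*1.11 + 0.1875*1.10 + 0.1875*1.09 + 0.5625*1.03
```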
图像传感器10的像素阵列11上的感光像素110可能存在工艺上的缺陷,或在光信号转化为电信号的过程中出现错误,从而造成图像上的像素信息错误,导致图像中的像素值不准确,这些有缺陷的像素表现在输出的图像上即为图像坏点。由于图像坏点可能存在,因此需要对图像进行坏点补偿。坏点补偿可以包括如下步骤:(1)以待检测像素点为中心像素点,建立由相同颜色的感光像素对应的像素点组成的3×3像素矩阵;(2)以所述中心像素点的周围像素点为参考点,判断所述中心像素点的色值与所述周围像素点的差值是否均大于第一阈值,如果是,则该中心像素点为坏点,如果否,则该中心像素点为正常点;(3)对判定为坏点的中心像素点进行双线性插值得到校正后的像素值。请参阅图19,下面以对第一彩色中间图像(可以是未校正过的第一彩色中间图像,或者经过校正的第一彩色中间图像等)进行坏点补偿为例进行说明,图19中的第一张图中的R1为待检测像素点,图像预处理模块41以R1为中心像素点建立与R1的感光像素相同颜色的像素点的3×3像素矩阵,得到图19中的第二张图。并以中心像素点R1的周围像素点为参考点,判断中心像素点R1的色值与所述周围像素点的差值是否均大于第一阈值Q(Q在图像预处理模块41中预设)。如果是,则该中心像素点R1为坏点,如果否,则该中心像素点R1为正常点。如果R1是坏点,则对R1进行双线性插值得到校正后的像素值R1'(图中展示的为R1是坏点的情况),得到图19中的第三张图。本申请实施方式的图像预处理模块41可以对图像进行坏点补偿,有利于消除高动态范围图像处理系统100的成像过程中,由于感光像素110存在工艺上的缺陷,或光信号转化为电信号的过程中出现错误而产生的图像坏点,进而提高高动态范围图像处理系统100形成的目标图像的像素值的准确性,从而使得本申请实施方式具有更好的成像效果。
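上述坏点检测与补偿步骤可以用如下Python代码示意(3×3同色像素矩阵以二维列表表示;坏点的双线性插值此处取上下左右四邻点均值这一特例,阈值取值与函数名均为说明性假设):

```python
def correct_dead_pixel(m, q):
    """m为3x3同色像素矩阵; 若中心点与全部8个邻点的差值均大于阈值q,
    判定为坏点, 用上下左右四邻点的均值替换; 否则保持原值。"""
    center = m[1][1]
    neighbors = [m[r][c] for r in range(3) for c in range(3) if (r, c) != (1, 1)]
    if all(abs(center - n) > q for n in neighbors):  # 与所有邻点差值均大于阈值 -> 坏点
        return (m[0][1] + m[2][1] + m[1][0] + m[1][2]) / 4.0
    return center

patch = [[10, 12, 11],
         [13, 200, 12],
         [11, 10, 13]]
fixed = correct_dead_pixel(patch, 50)  # 中心点200被判定为坏点, 替换为(12+10+13+12)/4 = 11.75
```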
请参阅图16,图像处理器40还包括图像后处理模块42,图像后处理模块42对预处理后的第一彩色中间图像及预处理后的第二彩色中间图像进行色彩转换处理,以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像。色彩转换处理是将图像由一个色彩空间(例如RGB色彩空间)转换成另一个色彩空间(例如YUV色彩空间)从而具有更广泛的应用场景或者具有更高效率的传输格式。在具体的实施例中,色彩转换处理的步骤可以为对图像中的所有像素值的R、G和B通道像素值进行如下公式转换得到Y、U和V通道像素值:(1)Y=0.30R+0.59G+0.11B;(2)U=0.493(B-Y);(3)V=0.877(R-Y);从而将该图像由RGB色彩空间转换为YUV色彩空间。由于YUV色彩空间中的亮度信号Y和色度信号U和V是分离的,并且人眼对亮度的敏感超过色度,色彩转换处理将图像由RGB色彩空间转换为YUV色彩空间有利于本申请实施方式的高动态范围图像处理系统100后续的其他图像处理对图像进行色度信息的压缩,在不影响图像观看效果的同时,能减小图像的信息量,从而提高图像的传输效率。
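上文RGB色彩空间到YUV色彩空间的转换公式可以用如下Python代码示意(系数直接取自文中公式(1)至(3),函数名为说明而设):

```python
def rgb_to_yuv(r, g, b):
    """按文中公式: Y = 0.30R + 0.59G + 0.11B; U = 0.493(B - Y); V = 0.877(R - Y)。"""
    y = 0.30 * r + 0.59 * g + 0.11 * b
    u = 0.493 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

y, u, v = rgb_to_yuv(255, 0, 0)  # 纯红色: y约为76.5, v为正(红色差), u为负
```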
在一些实施例中,图像后处理模块42可以先对预处理后的第一彩色中间图像及预处理后的第二彩色中间图像进行第一类图像后处理,以获得第一类图像后处理之后的第一彩色中间图像和第一类图像后处理之后的第二彩色中间图像。图像后处理模块42再对第一类图像后处理之后的第一彩色中间图像及第一类图像后处理之后的第二彩色中间图像进行第二类图像后处理,如色彩转换,以获得第二类图像后处理之后的第一彩色中间图像及第二类图像后处理之后的第二彩色中间图像,如获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像。
需要说明的是,第一类图像后处理包括去马赛克、色彩校正和全局色调映射中的至少一种。例如,第一类图像后处理只包括去马赛克;或者,第一类图像后处理包括去马赛克和色彩校正;或者,第一类图像后处理包括去马赛克、色彩校正和全局色调映射。
由于本申请实施方式的第一彩色中间图像及第二彩色中间图像的每个图像像素格中均为单颜色图像像素,没有其他颜色的光学信息,因此需要对第一彩色中间图像及第二彩色中间图像进行去马赛克。图像后处理模块42可以直接对第一彩色中间图像及第二彩色中间图像进行去马赛克;或者,图像后处理模块42可以在经过坏点处理后的第一彩色中间图像及第二彩色中间图像的基础上进行去马赛克。以下以对第一彩色中间图像进行去马赛克为例进行说明。去马赛克包括如下步骤:(1)将第一彩色中间图像分解成第一红色中间图像、第一绿色中间图像和第一蓝色中间图像,如图20所示,所得的第一红色中间图像、第一绿色中间图像和第一蓝色中间图像中部分图像像素格没有像素值。(2)采用双线性插值方法分别对第一红色中间图像、第一绿色中间图像和第一蓝色中间图像进行插值处理。如图21所示,图像后处理模块42采用双线性插值方法对第一蓝色中间图像进行插值处理。图21的待插值图像像素Bu1根据Bu1周围的四个图像像素Bu2、Bu3、Bu4和Bu5进行双线性插值,得到Bu1的插值像素Bu1'。图21的第一张图中的所有空白处的待插值图像像素均遍历地采用该双线性插值的方式补全像素值,得到插值后的第一蓝色中间图像。如图22所示,图像后处理模块42采用双线性插值方法对第一绿色中间图像进行插值处理。图22的待插值图像像素G1根据G1周围的四个图像像素G2、G3、G4和G5进行双线性插值,得到G1的插值图像像素G1'。图22的第一张图中的所有空白处的待插值图像像素均遍历地采用该双线性插值的方式补全像素值,得到插值后的第一绿色中间图像。与之类似地,图像后处理模块42可以采用双线性插值方法对第一红色中间图像进行插值处理,得到插值后的第一红色中间图像。(3)将插值后的第一红色中间图像、插值后的第一绿色中间图像和插值后的第一蓝色中间图像重新合成为一张图像,该图像中每个图像像素均具有3个颜色通道的值,如图23所示。图像后处理模块42对彩色图像进行去马赛克,有利于本申请实施方式将具有单颜色通道像素值的彩色图像补全为具有多个颜色通道的彩色图像,从而在单颜色感光像素的硬件基础上保持图像色彩的完整呈现。
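上述对单颜色通道中缺失像素的插值可以用如下Python代码示意(以待插值位置周围同色像素求平均为例,缺失值以None表示,边界处理从简,均属示意性假设):

```python
def interpolate_missing(plane):
    """plane为含None(缺失值)的单颜色通道; 用上下左右四邻域中存在的同色像素求均值补全缺失位置(示意实现)。"""
    h, w = len(plane), len(plane[0])
    out = [row[:] for row in plane]
    for y in range(h):
        for x in range(w):
            if plane[y][x] is None:
                nb = [plane[j][i] for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                      if 0 <= j < h and 0 <= i < w and plane[j][i] is not None]
                out[y][x] = sum(nb) / len(nb)
    return out

plane = [[None, 10],
         [20, None]]
filled = interpolate_missing(plane)
# filled[0][0] == (10 + 20) / 2 == 15.0
```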
色彩校正具体可以为利用一个色彩校正矩阵对第一彩色中间图像及第二彩色中间图像(可以为经过去马赛克的第一彩色中间图像及经过去马赛克的第二彩色中间图像)的各图像像素的各颜色通道值进行一次校正,从而实现对图像色彩的校正。如下所示:
Figure PCTCN2020119963-appb-000008
其中,色彩校正矩阵(Color Correction Matrix,CCM)在图像后处理模块42中预设。例如,色彩校正矩阵具体可以为:
Figure PCTCN2020119963-appb-000009
图像后处理模块42通过对图像中的所有像素遍历地通过以上色彩矫正矩阵进行色彩校正,可以得到经过色彩校正的图像。本申请实施方式中色彩校正有利于消除图像或视频帧中因为有色光源等造成的颜色严重偏差、图像中人或物体颜色失真的问题,使得本申请实施方式的高动态范围图像处理系统100能够恢复图像原始色彩,提高了图像的视觉效果。
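色彩校正矩阵对单个像素的作用可以用如下Python代码示意(矩阵数值为任意构造的示例,并非文中预设的CCM;每行之和为1时灰色像素不偏色):

```python
def apply_ccm(rgb, ccm):
    """对单个像素的[R, G, B]向量左乘3x3色彩校正矩阵, 得到校正后的[R', G', B']。"""
    return [sum(ccm[i][k] * rgb[k] for k in range(3)) for i in range(3)]

ccm = [[1.5, -0.3, -0.2],
       [-0.2, 1.6, -0.4],
       [-0.1, -0.3, 1.4]]   # 示例矩阵: 每行之和为1
grey = apply_ccm([100, 100, 100], ccm)  # 各通道输出仍约为100, 灰色不偏色
```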
色调映射处理可以包括如下步骤:(1)把第一彩色中间图像及第二彩色中间图像(可以为经过色彩校正的第一彩色中间图像及经过色彩校正的第二彩色中间图像)的灰度值归一化到区间[0,1]内,记归一化后的灰度值为Vin;(2)设Vout=Y(Vin),Vout和Vin之间的映射关系可以如图24所示;(3)把Vout乘上255(当设定输出图像的灰度值为256阶时,乘上255,在其他设定时,可以为其他数值)后再四舍五入取整数,得到色调映射处理后的图像。本申请实施方式的高动态范围图像处理系统100对图像的色调映射处理并非线性的映射,而是在灰度值较低的区间的映射关系的斜率大于在灰度值较高的区间的映射关系的斜率,如图24所示,有利于提高灰度值较低的区间内不同灰度值的像素点的区分度,而大部分像素都分布在灰度值较低的区间,因而使得本申请实施方式的高动态范围图像处理系统100具有更好的成像效果。
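上述"归一化—映射—乘255取整"的色调映射步骤可以用如下Python代码示意(映射曲线取Vout=Vin^0.5的幂函数,作为"低灰度区斜率更大"曲线的一个假设示例,并非图24的实际曲线):

```python
def tone_map(gray, max_in=255):
    """归一化到[0,1], 经低灰度区斜率更大的曲线 Vout = Vin**0.5 映射, 再乘255四舍五入取整。"""
    v_in = gray / max_in
    v_out = v_in ** 0.5
    return round(v_out * 255)

# 低灰度值被显著提升, 高灰度值变化较小, 增强暗部区分度
tone_map(16)   # -> 64
tone_map(64)   # -> 128
tone_map(255)  # -> 255
```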
需要说明的是,在一些实施例中,图像融合模块20获得第一彩色中间图像及第二彩色中间图像后,可以不需要经过图像预处理直接传输至图像后处理模块42进行色彩转换处理,以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;或者,图像融合模块20获得第一彩色中间图像及第二彩色中间图像后,直接传输至图像后处理模块42,图像后处理模块42也可以对第一彩色中间图像及第二彩色中间图像进行第一类图像后处理之后再进行色彩转换,以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像,在此不作限制。
请参阅图16,高动态范围图像处理系统100还包括存储模块50,存储模块50用于存储图像处理器40中的图像后处理模块42进行色彩转换处理后的图像,并将色彩转换后的图像传输至高动态范围图像处理模块30进行高动态范围图像处理,以获得彩色高动态范围图像。具体地,图像处理器40中的图像后处理模块42依次对第一彩色中间图像及第二彩色中间图像进行色彩转换处理,图像后处理模块42对第一彩色中间图像完成色彩转换处理后,将获得的色彩转换后的第一彩色中间图像传输至存储模块50进行存储;图像后处理模块42对第二彩色中间图像完成色彩转换处理后,将获得的色彩转换后的第二彩色中间图像传输至存储模块50进行存储。当存储模块50内存储有图像后处理模块42进行色彩转换处理后的所有图像后(即当存储模块50内存储有色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像),存储模块50将存储的所有图像(即色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像)传输至高动态范围图像处理模块30。
需要说明的是,图像后处理模块42也可以先对第二彩色中间图像进行色彩转换处理后,再对第一彩色中间图像进行色彩转换处理;图像后处理模块42也可同时对第一彩色中间图像及第二彩色中间图像进行色彩转换处理,在此不作限制。无论图像后处理模块42采用何种方式对第一彩色中间图像及第二彩色中间图像进行色彩转换处理,存储模块50只有在存储有色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像后,才将这两张图像传输至高动态范围图像处理模块30。
高动态范围图像处理模块30在获取到色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像后对这两张图像进行高动态范围处理以获得高动态范围图像。具体地,请结合图25,假设图像像素P12(图25中色彩转换后的第一彩色中间图像内标记有虚线圆圈的图像像素)的像素值V1大于第一预设阈值V0,即图像像素P12为过曝图像像素P12,则高动态范围图像处理单元31以过曝图像像素P12为中心扩展一个预定区域,例如,图25所示的3*3区域。当然,在其他实施例中,也可以是4*4区域、5*5区域、10*10区域等,在此不作限制。随后,高动态范围图像处理单元31在3*3的预定区域内寻找像素值小于第一预设阈值V0的中间图像像素,例如图25中的图像像素P21(图25中色彩转换后的第一彩色中间图像内标记有点画线圆圈的图像像素)的像素值V2小于第一预设阈值V0,则图像像素P21即为中间图像像素P21。随后,高动态范围图像处理单元31在色彩转换后的第二彩色中间图像中寻找与过曝图像像素P12及中间图像像素P21分别对应的图像像素,即图像像素P1’2’(图25中色彩转换后的第二彩色中间图像内标记有虚线圆圈的图像像素)和图像像素P2’1’(图25中色彩转换后的第二彩色中间图像内标记有点画线圆圈的图像像素),其中,图像像素P1’2’与过曝图像像素P12对应,图像像素P2’1’与中间图像像素P21对应,图像像素P1’2’的像素值为V3,图像像素P2’1’的像素值为V4。随后,根据V1’/V3=V2/V4来计算出V1’,并利用V1’的值来替换掉V1的值。由此,即可计算出过曝图像像素P12的实际像素值。高动态范围图像处理单元31对色彩转换且亮度对齐的第一彩色中间图像中的每一个过曝图像像素均执行这一亮度对齐的处理过程,即可得到色彩转换且亮度对齐后的第一彩色中间图像。由于色彩转换且亮度对齐后的第一彩色中间图像中的过曝图像像素的像素值经过了修正,色彩转换且亮度对齐后的第一彩色中间图像中的每个图像像素的像素值均较为准确。高动态范围处理过程中,在获取到色彩转换且亮度对齐后的色彩转换后的第一彩色中间图像后,高动态范围图像处理模块30可以对色彩转换且亮度对齐后的图像和同类图像进行融合以得到高动态的图像。具体地,高动态范围图像处理模块30首先对色彩转换且亮度对齐后的第一彩色中间图像进行运动检测,以识别色彩转换且亮度对齐后的第一彩色中间图像中是否存在运动模糊区域。若色彩转换且亮度对齐后的第一彩色中间图像中不存在运动模糊区域,则直接融合色彩转换且亮度对齐后的第一彩色中间图像及色彩转换后的第二彩色中间图像以得到彩色高动态范围图像。若色彩转换且亮度对齐后的第一彩色中间图像中存在运动模糊区域,则将色彩转换且亮度对齐后的第一彩色中间图像中的运动模糊区域剔除,只融合色彩转换后的第二彩色中间图像和色彩转换且亮度对齐后的第一彩色中间图像中除运动模糊区域以外的区域以得到彩色高动态范围图像。具体地,在融合色彩转换且亮度对齐后的第一彩色中间图像及色彩转 换后的第二彩色中间图像时,若色彩转换且亮度对齐后的第一彩色中间图像中不存在运动模糊区域,则此时两张中间图像的融合遵循以下原则:(1)色彩转换且亮度对齐后的第一彩色中间图像中,过曝区域的图像像素的像素值直接替换为色彩转换后的第二彩色中间图像中对应于该过曝区域的图像像素的像素值;(2)色彩转换且亮度对齐后的第一彩色中间图像中,欠曝区域的图像像素的像素值为:长曝光像素值除以系数K1,系数K1为K2和K3的平均数;K2为长曝光像素值和中曝光像素值的比例,K3为长曝光像素值和短曝光像素值的比例;(3)色彩转换且亮度对齐后的第一彩色中间图像中,未欠曝也未过曝区域的图像像素的像素值为:长曝光像素值除以系数K1。若色彩转换且亮度对齐后的第一彩色中间图像中存在运动模糊区域,则此时两张中间图像的融合除了遵循上述三个原则外,还需要遵循第(4)个原则:色彩转换且亮度对齐后的第一彩色中间图像中,运动模糊区域的图像像素的像素值直接替换为色彩转换后的第二彩色中间图像中对应于该运动模糊区域的图像像素的像素值。本申请实施方式的高动态范围图像处理系统100通过高动态范围图像处理模块30对图像进行高动态范围处理,先对图像进行亮度对齐处理,再对亮度对齐后的图像与其他图像进行融合,得到高动态的图像,使得高动态范围图像处理系统100形成的目标图像具有更大的动态范围,进而具有更好的成像效果。
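上述亮度对齐中过曝图像像素实际像素值的计算(由V1'/V3=V2/V4得V1'=V3×V2/V4)可以用如下Python代码示意(像素取值为任意示例,函数名为说明而设):

```python
def align_overexposed(v2, v3, v4):
    """由比例关系 V1'/V3 = V2/V4 求过曝像素的对齐像素值 V1' = V3 * V2 / V4。"""
    return v3 * v2 / v4

# 例: 中间图像像素像素值V2=200, 第二彩色中间图像中对应像素值V3=180、V4=120
v1_aligned = align_overexposed(200, 180, 120)
# v1_aligned == 300.0, 超出255, 即恢复出的过曝像素真实亮度
```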
在某些实施方式中,像素阵列11还可以以第三曝光时间曝光得到第三原始图像。第三原始图像包括以第三曝光时间曝光的单颜色感光像素生成的第三彩色原始图像数据和以第三曝光时间曝光的全色感光像素W生成的第三全色原始图像数据。其中,第三曝光时间不等于第一曝光时间,第三曝光时间不等于第二曝光时间。
具体地,请参阅图26,像素阵列11进行三次曝光,以分别得到第一原始图像、第二原始图像和第三原始图像。其中,第一原始图像包括以第一曝光时间L曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间L曝光的全色感光像素W生成的第一全色原始图像数据。第二原始图像包括以第二曝光时间M曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间M曝光的全色感光像素W生成的第二全色原始图像数据。第三原始图像包括以第三曝光时间S曝光的单颜色感光像素生成的第三彩色原始图像数据和以第三曝光时间S曝光的全色感光像素W生成的第三全色原始图像数据。
图像融合模块20可以将第一彩色原始图像数据与第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,并将第二彩色原始图像数据与第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,并将第三彩色原始图像数据与第三全色原始图像数据融合为仅包含第三彩色中间图像数据的第三彩色中间图像。具体实施方式与图14及图15所述实施例中具体实施方式相同,在此不作赘述。
图像预处理模块41可以对第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;对第二彩色中间图像进行预处理以获得预处理后的第二彩色中间图像,并对第三彩色中间图像进行预处理以获得预处理后的第三彩色中间图像。具体实施方式与图17至图19任一所述实施例中的图像预处理的实施方式相同,在此不作赘述。
图像后处理模块42对预处理后的第一彩色中间图像、预处理后的第二彩色中间图像及预处理后的第三彩色中间图像进行色彩转换处理,以获得色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像;或者,图像后处理模块42对预处理后的第一彩色中间图像、预处理后的第二彩色中间图像及预处理后的第三彩色中间图像进行图像后处理,以获得图像后处理之后的第一彩色中间图像、图像后处理之后的第二彩色中间图像及图像后处理之后的第三彩色中间图像;或者,图像后处理模块42直接对第一彩色中间图像、第二彩色中间图像及第三彩色中间图像进行色彩转换处理,以获得色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像。具体的色彩转换处理过程与上述实施例中色彩转换处理过程相同,在此不作赘述。
高动态范围图像处理模块30对色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像进行高动态范围处理,以获得彩色高动态范围图像。具体的高动态范围处理的实施方法与图25所述的实施例中将预处理后的第一彩色中间图像及预处理后的第二彩色中间图像融合为彩色高动态范围图像的具体实施方式相同,在此不作赘述。
请参阅图27,本申请还提供一种电子设备1000。本申请实施方式的电子设备1000包括镜头300、壳体200及上述任意一项实施方式的高动态范围图像处理系统100。镜头300、高动态范围图像处理系统100与壳体200结合。镜头300与高动态范围图像处理系统100的图像传感器10配合成像。
电子设备1000可以是手机、平板电脑、笔记本电脑、智能穿戴设备(例如智能手表、智能手环、智能眼镜、智能头盔)、无人机、头显设备等,在此不作限制。
本申请实施方式的电子设备1000通过在高动态范围图像处理系统100中设置的图像融合模块20对图像传感器10输出的多帧原始图像事先进行融合算法处理,以得到图像像素呈拜耳阵列排布的多帧彩色中间图像。如此,多帧彩色中间图像可以被图像处理器40处理,解决了图像处理器不能直接对图像像素呈非拜耳阵列排布的图像进行处理的问题。
请参阅图2及图28,本申请提供一种高动态范围图像处理方法。本申请实施方式的高动态范围图像处理方法用于高动态范围图像处理系统100。高动态范围图像处理系统100可以包括图像传感器10。图像传感器10包括像素阵列11。像素阵列11包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列11包括最小重复单元。每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。高动态范围图像处理方法包括:
01:像素阵列11曝光,其中,像素阵列11以第一曝光时间曝光得到第一原始图像,第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据;像素阵列11以第二曝光时间曝光得到第二原始图像,第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据;其中,第一曝光时间不等于第二曝光时间;
02:将第一彩色原始图像数据与第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,将第二彩色原始图像数据与第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,第一彩色中间图像和第二彩色中间图像均包含多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布;
03:对第一彩色中间图像及第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;
04:对色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
在某些实施方式中,高动态范围图像处理方法还包括:像素阵列11以第三曝光时间曝光得到第三原始图像,第三原始图像包括以第三曝光时间曝光的单颜色感光像素生成的第三彩色原始图像数据和以第三曝光时间曝光的全色感光像素生成的第三全色原始图像数据,其中,第三曝光时间不等于第一曝光时间,第三曝光时间不等于第二曝光时间;将第三彩色原始图像数据与第三全色原始图像数据融合为仅包含第三彩色中间图像数据的第三彩色中间图像,第三彩色中间图像包含多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布;及对第三彩色中间图像进行色彩转换处理以获得色彩转换后的第三彩色中间图像。步骤对色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像,包括:对色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
在某些实施方式中,高动态范围图像处理方法还包括:每个彩色原始图像数据由单个单颜色感光像素生成,每个全色原始图像数据由单个全色感光像素生成,图像传感器10(图1所示)输出多个原始图像数据的输出方式包括一个彩色原始图像数据与一个全色原始图像数据交替输出。
在某些实施方式中,每个彩色原始图像数据由同一子单元中的多个单颜色感光像素共同生成,每个全色原始图像数据由同一子单元中的多个全色感光像素共同生成。图像传感器10(图1所示)输出多个原始图像数据的输出方式包括多个彩色原始图像数据与多个全色原始图像数据交替输出。
在某些实施方式中,高动态范围图像处理方法还包括:对第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;对第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像。步骤对第一彩色中间图像及第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像,包括:对预处理后的第一彩色中间图像及预处理后的第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像。
在某些实施方式中,高动态范围图像处理方法还包括:对第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;对第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;对第三彩色中间图像进行图像预处理以获得预处理后的第三彩色中间图像。步骤对第一彩色中间图像、第二彩色中间图像及第三彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像,包括:对预处理后的第一彩色中间图像、预处理后的第二彩色中间图像及预处理后的第三彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像。
在某些实施方式中,图像预处理包括黑电平校正、镜头阴影校正、坏点补偿、去马赛克、色彩校正和全局色调映射中的至少一种。
在某些实施方式中,高动态范围图像处理系统包括存储模块,高动态范围图像处理方法还包括:将色彩转换后的图像存储至存储模块,并从存储模块获取色彩转换后的图像,对色彩转换后的图像进行高动态范围图像处理,以获得彩色高动态范围图像。
请参阅图29,本申请还提供一种包含计算机程序的非易失性计算机可读存储介质400。该计算机程序被处理器60执行时,使得处理器60执行上述任意一个实施方式的高动态范围图像处理方法。
例如,请参阅图1、图2、图28及图29,计算机程序被处理器60执行时,使得处理器60执行以下步骤:
像素阵列11曝光,其中,像素阵列11以第一曝光时间曝光得到第一原始图像,第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据;像素阵列11以第二曝光时间曝光得到第二原始图像,第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据;其中,第一曝光时间不等于第二曝光时间;
将第一彩色原始图像数据与第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,将第二彩色原始图像数据与第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,第一彩色中间图像和第二彩色中间图像均包含多个彩色图像像素,多个彩色图像像素呈拜耳阵列排布;
对第一彩色中间图像及第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;
对色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
再例如,请参阅图29,计算机程序被处理器60执行时,使得处理器60执行以下步骤:
对第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;
对第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;
对预处理后的第一彩色中间图像及预处理后的第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像。
在本说明书的描述中,参考术语“一个实施方式”、“一些实施方式”、“示意性实施方式”、“示例”、“具体示例”或“一些示例”等的描述意指结合所述实施方式或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施方式或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施方式或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。
尽管上面已经示出和描述了本申请的实施方式,可以理解的是,上述实施方式是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施方式进行变化、修改、替换和变型。

Claims (24)

  1. 一种高动态范围图像处理系统,其特征在于,包括图像传感器、图像融合模块、高动态范围图像处理模块及图像处理器;
    所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素,所述彩色感光像素具有比所述全色感光像素更窄的光谱响应,所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元,每个所述子单元包括多个单颜色感光像素及多个全色感光像素;
    所述像素阵列以第一曝光时间曝光得到第一原始图像,所述第一原始图像包括以所述第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以所述第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,所述第二原始图像包括以所述第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以所述第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,所述第一曝光时间不等于所述第二曝光时间;
    所述图像融合模块用于将所述第一彩色原始图像数据与所述第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,将所述第二彩色原始图像数据与所述第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,所述第一彩色中间图像和所述第二彩色中间图像均包含多个彩色图像像素,多个所述彩色图像像素呈拜耳阵列排布;
    所述图像处理器用于对所述第一彩色中间图像及所述第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;
    所述高动态范围图像处理模块用于对所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
  2. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述像素阵列以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间;
    所述图像融合模块还用于将所述第三彩色原始图像数据与所述第三全色原始图像数据融合为仅包含第三彩色中间图像数据的第三彩色中间图像;
    所述图像处理器用于对所述第三彩色中间图像进行色彩转换处理以获得色彩转换后的第三彩色中间图像;
    所述高动态范围图像处理模块用于对所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像进行高动态范围处理以获得所述彩色高动态范围图像。
  3. 根据权利要求1或2所述的高动态范围图像处理系统,其特征在于,每个彩色原始图像数据由单个所述单颜色感光像素生成,每个全色原始图像数据由单个所述全色感光像素生成,所述图像传感器输出多个原始图像数据的输出方式包括一个所述彩色原始图像数据与一个所述全色原始图像数据交替输出;或
    每个彩色原始图像数据由同一所述子单元中的多个所述单颜色感光像素共同生成,每个全色原始图像数据由同一所述子单元中的多个所述全色感光像素共同生成,所述图像传感器输出多个原始图像数据的输出方式包括多个所述彩色原始图像数据与多个所述全色原始图像数据交替输出。
  4. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述图像处理器包括图像预处理模块,所述图像预处理模块用于:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;及
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;及
    图像后处理模块,所述图像后处理模块用于对所述预处理后的第一彩色中间图像及所述预处理后的第二彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像。
  5. 根据权利要求2所述的高动态范围图像处理系统,其特征在于,所述图像处理器包括:
    图像预处理模块,所述图像预处理模块用于:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;及
    对所述第三彩色中间图像进行图像预处理以获得预处理后的第三彩色中间图像;及
    图像后处理模块,所述图像后处理模块用于对所述预处理后的第一彩色中间图像、所述预处理后的第二彩色中间图像及所述预处理后的第三彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像。
  6. 根据权利要求4或5所述的高动态范围图像处理系统,其特征在于,所述图像预处理包括黑电平校正、镜头阴影校正、坏点补偿中的至少一种。
  7. 根据权利要求4或5所述的高动态范围图像处理系统,其特征在于,所述高动态范围图像处理系统还包括存储模块,所述存储模块用于存储色彩转换后的图像,并将色彩转换后的所述图像传输至所述高动态范围图像处理模块进行高动态范围图像处理,以获得所述彩色高动态范围图像。
  8. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述图像融合模块集成在所述图像传感器中。
  9. 一种高动态范围图像处理方法,用于高动态范围图像处理系统,其特征在于,所述高动态范围图像处理系统包括图像传感器,所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素,所述彩色感光像素具有比所述全色感光像素更窄的光谱响应,所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元,每个所述子单元包括多个单颜色感光像素及多个全色感光像素;所述高动态范围图像处理方法包括:
    所述像素阵列曝光,其中,所述像素阵列以第一曝光时间曝光得到第一原始图像,所述第一原始图像包括以所述第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以所述第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,所述第二原始图像包括以所述第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以所述第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,所述第一曝光时间不等于所述第二曝光时间;
    将所述第一彩色原始图像数据与所述第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,将所述第二彩色原始图像数据与所述第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,所述第一彩色中间图像和所述第二彩色中间图像均包含多个彩色图像像素,多个所述彩色图像像素呈拜耳阵列排布;
    对所述第一彩色中间图像及所述第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;及
    对所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
  10. 根据权利要求9所述的高动态范围图像处理方法,其特征在于,所述高动态范围图像处理方法还包括:
    所述像素阵列以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间;
    将所述第三彩色原始图像数据与所述第三全色原始图像数据融合为仅包含第三彩色中间图像数据的第三彩色中间图像,所述第三彩色中间图像包含多个彩色图像像素,多个所述彩色图像像素呈拜耳阵列排布;
    对所述第三彩色中间图像进行色彩转换处理以获得色彩转换后的第三彩色中间图像;及
    所述对所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像,包括:
    对所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像进行高动态范围处理以获得所述彩色高动态范围图像。
  11. 根据权利要求9或10所述的高动态范围图像处理方法,其特征在于,每个彩色原始图像数据由单个所述单颜色感光像素生成,每个全色原始图像数据由单个所述全色感光像素生成,所述图像传感器输出多个原始图像数据的输出方式包括一个所述彩色原始图像数据与一个所述全色原始图像数据交替输出;或
    每个彩色原始图像数据由同一所述子单元中的多个所述单颜色感光像素共同生成,每个全色原始图像数据由同一所述子单元中的多个所述全色感光像素共同生成,所述图像传感器输出多个原始图像数据的输出方式包括多个所述彩色原始图像数据与多个所述全色原始图像数据交替输出。
  12. 根据权利要求9所述的高动态范围图像处理方法,其特征在于,所述高动态范围图像处理方法还包括:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;及
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;
    所述对所述第一彩色中间图像及所述第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像,包括:
    对所述预处理后的第一彩色中间图像及所述预处理后的第二彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像。
  13. 根据权利要求10所述的高动态范围图像处理方法,其特征在于,所述高动态范围图像处理方法还包括:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;及
    对所述第三彩色中间图像进行图像预处理以获得预处理后的第三彩色中间图像;
    所述对所述第一彩色中间图像、所述第二彩色中间图像及所述第三彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像、色彩转换后的第二彩色中间图像及色彩转换后的第三彩色中间图像,包括:
    对所述预处理后的第一彩色中间图像、所述预处理后的第二彩色中间图像及所述预处理后的第三彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像。
  14. 根据权利要求12或13所述的高动态范围图像处理方法,其特征在于,所述图像预处理包括黑电平校正、镜头阴影校正、坏点补偿中的至少一种。
  15. 根据权利要求12或13所述的高动态范围图像处理方法,其特征在于,所述高动态范围图像处理系统包括存储模块,所述高动态范围图像处理方法还包括:
    将色彩转换后的所述图像存储至所述存储模块;及
    从所述存储模块获取色彩转换后的所述图像并对色彩转换后的所述图像进行高动态范围图像处理,以获得所述彩色高动态范围图像。
  16. 一种电子设备,其特征在于,包括:
    镜头;
    壳体;及
    高动态范围图像处理系统,所述镜头、所述高动态范围图像处理系统与所述壳体结合,所述镜头与所述高动态范围图像处理系统的图像传感器配合成像,所述高动态范围图像处理系统包括图像传感器、图像融合模块、高动态范围图像处理模块及图像处理器;
    所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素,所述彩色感光像素具有比所述全色感光像素更窄的光谱响应,所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元,每个所述子单元包括多个单颜色感光像素及多个全色感光像素;
    所述像素阵列以第一曝光时间曝光得到第一原始图像,所述第一原始图像包括以所述第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以所述第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,所述第二原始图像包括以所述第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以所述第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,所述第一曝光时间不等于所述第二曝光时间;
    所述图像融合模块用于将所述第一彩色原始图像数据与所述第一全色原始图像数据融合为仅包含第一彩色中间图像数据的第一彩色中间图像,将所述第二彩色原始图像数据与所述第二全色原始图像数据融合为仅包含第二彩色中间图像数据的第二彩色中间图像,所述第一彩色中间图像和所述第二彩色中间图像均包含多个彩色图像像素,多个所述彩色图像像素呈拜耳阵列排布;
    所述图像处理器用于对所述第一彩色中间图像及所述第二彩色中间图像进行色彩转换处理以获得色彩转换后的第一彩色中间图像及色彩转换后的第二彩色中间图像;
    所述高动态范围图像处理模块用于对所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像进行高动态范围处理以获得彩色高动态范围图像。
  17. 根据权利要求16所述的电子设备,其特征在于,所述像素阵列以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间;
    所述图像融合模块还用于将所述第三彩色原始图像数据与所述第三全色原始图像数据融合为仅包含第三彩色中间图像数据的第三彩色中间图像;
    所述图像处理器用于对所述第三彩色中间图像进行色彩转换处理以获得色彩转换后的第三彩色中间图像;
    所述高动态范围图像处理模块用于对所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像进行高动态范围处理以获得所述彩色高动态范围图像。
  18. 根据权利要求16或17所述的电子设备,其特征在于,每个彩色原始图像数据由单个所述单颜色感光像素生成,每个全色原始图像数据由单个所述全色感光像素生成,所述图像传感器输出多个原始图像数据的输出方式包括一个所述彩色原始图像数据与一个所述全色原始图像数据交替输出;或
    每个彩色原始图像数据由同一所述子单元中的多个所述单颜色感光像素共同生成,每个全色原始图像数据由同一所述子单元中的多个所述全色感光像素共同生成,所述图像传感器输出多个原始图像数据的输出方式包括多个所述彩色原始图像数据与多个所述全色原始图像数据交替输出。
  19. 根据权利要求16所述的电子设备,其特征在于,所述图像处理器包括图像预处理模块,所述图像预处理模块用于:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;及
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;及
    图像后处理模块,所述图像后处理模块用于对所述预处理后的第一彩色中间图像及所述预处理后的第二彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像及所述色彩转换后的第二彩色中间图像。
  20. 根据权利要求17所述的电子设备,其特征在于,所述图像处理器包括:
    图像预处理模块,所述图像预处理模块用于:
    对所述第一彩色中间图像进行图像预处理以获得预处理后的第一彩色中间图像;
    对所述第二彩色中间图像进行图像预处理以获得预处理后的第二彩色中间图像;及
    对所述第三彩色中间图像进行图像预处理以获得预处理后的第三彩色中间图像;及
    图像后处理模块,所述图像后处理模块用于对所述预处理后的第一彩色中间图像、所述预处理后的第二彩色中间图像及所述预处理后的第三彩色中间图像进行色彩转换处理以获得所述色彩转换后的第一彩色中间图像、所述色彩转换后的第二彩色中间图像及所述色彩转换后的第三彩色中间图像。
  21. The electronic device according to claim 19 or 20, wherein the image pre-processing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
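Of the pre-processing operations listed in claim 21, black level correction is the simplest to sketch: subtract the sensor's dark-signal offset and rescale so that full-scale input still maps to full-scale output. The 10-bit levels used below (black level 64, white level 1023) are typical example values, not values taken from the claims:

```python
def black_level_correct(img, black_level=64, white_level=1023):
    """Subtract the sensor's black (dark-signal) offset from every pixel,
    clamp at zero, and rescale the remaining range back to full scale."""
    scale = white_level / (white_level - black_level)
    return [[max(p - black_level, 0) * scale for p in row] for row in img]
```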
  22. The electronic device according to claim 19 or 20, wherein the high dynamic range image processing system further comprises a storage module, the storage module being configured to store the color-converted images and to transmit the color-converted images to the high dynamic range image processing module for high dynamic range image processing to obtain the color high dynamic range image.
  23. The electronic device according to claim 16, wherein the image fusion module is integrated in the image sensor.
  24. A non-volatile computer-readable storage medium containing a computer program, wherein, when the computer program is executed by a processor, the processor is caused to perform the high dynamic range image processing method according to any one of claims 9 to 15.
PCT/CN2020/119963 2020-04-20 2020-10-09 High dynamic range image processing system and method, electronic device and readable storage medium WO2021212763A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010310641.2A CN111491111B (zh) 2020-04-20 2020-04-20 High dynamic range image processing system and method, electronic device and readable storage medium
CN202010310641.2 2020-04-20

Publications (1)

Publication Number Publication Date
WO2021212763A1 true WO2021212763A1 (zh) 2021-10-28

Family

ID=71812941

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/119963 WO2021212763A1 (zh) 2020-04-20 2020-10-09 High dynamic range image processing system and method, electronic device and readable storage medium

Country Status (2)

Country Link
CN (1) CN111491111B (zh)
WO (1) WO2021212763A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140808A (zh) * 2021-11-03 2022-03-04 浪潮软件集团有限公司 Electronic official document recognition method based on a domestic CPU and operating system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111491111B (zh) * 2020-04-20 2021-03-26 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111586375B (zh) * 2020-05-08 2021-06-11 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111970459B (zh) * 2020-08-12 2022-02-18 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111970460B (zh) * 2020-08-17 2022-05-20 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111899178B (zh) * 2020-08-18 2021-04-16 Oppo广东移动通信有限公司 Image processing method, image processing system, electronic device and readable storage medium
CN112019775B (zh) * 2020-09-04 2023-03-24 成都微光集电科技有限公司 Dead pixel detection and correction method and device
CN112019758B (zh) * 2020-10-16 2021-01-08 湖南航天捷诚电子装备有限责任公司 Method for using an airborne binocular head-mounted night vision device, and night vision device
CN112702543B (zh) * 2020-12-28 2021-09-17 Oppo广东移动通信有限公司 Image processing method, image processing system, electronic device and readable storage medium
CN116744120B (zh) * 2022-09-15 2024-04-12 荣耀终端有限公司 Image processing method and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104170376A (zh) * 2012-03-27 2014-11-26 索尼公司 Image processing device, imaging apparatus, image processing method, and program
US20150146067A1 (en) * 2013-11-25 2015-05-28 Samsung Electronics Co., Ltd. Pixel array and image sensor including same
CN105409205A (zh) * 2013-07-23 2016-03-16 索尼公司 Imaging device, imaging method, and program
CN106412407A (zh) * 2016-11-29 2017-02-15 广东欧珀移动通信有限公司 Control method, control device, and electronic device
CN111491111A (zh) * 2020-04-20 2020-08-04 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111586375A (zh) * 2020-05-08 2020-08-25 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3079353B1 (en) * 2013-12-04 2021-03-03 Sony Semiconductor Solutions Corporation Image processing device, image processing method, electronic apparatus, and program
CN103873781B (zh) * 2014-03-27 2017-03-29 成都动力视讯科技股份有限公司 Wide dynamic range camera implementation method and device
US9344639B2 (en) * 2014-08-12 2016-05-17 Google Technology Holdings LLC High dynamic range array camera



Also Published As

Publication number Publication date
CN111491111B (zh) 2021-03-26
CN111491111A (zh) 2020-08-04

Similar Documents

Publication Publication Date Title
WO2021212763A1 (zh) High dynamic range image processing system and method, electronic device and readable storage medium
WO2021208593A1 (zh) High dynamic range image processing system and method, electronic device and storage medium
WO2021196554A1 (zh) Image sensor, processing system and method, electronic device and storage medium
WO2021196553A1 (zh) High dynamic range image processing system and method, electronic device and readable storage medium
WO2021223364A1 (zh) High dynamic range image processing system and method, electronic device and readable storage medium
CN112261391B (zh) Image processing method, camera assembly and mobile terminal
WO2021179806A1 (zh) Image acquisition method, imaging apparatus, electronic device and readable storage medium
WO2022007215A1 (zh) Image acquisition method, camera assembly and mobile terminal
CN111970460B (zh) High dynamic range image processing system and method, electronic device and readable storage medium
CN112738493B (zh) Image processing method, image processing apparatus, electronic device and readable storage medium
US20230247308A1 Image processing method, camera assembly and mobile terminal
CN111970461B (zh) High dynamic range image processing system and method, electronic device and readable storage medium
CN111970459B (zh) High dynamic range image processing system and method, electronic device and readable storage medium
CN112822475B (zh) Image processing method, image processing apparatus, terminal and readable storage medium
CN111835971B (zh) Image processing method, image processing system, electronic device and readable storage medium
CN114073068B (zh) Image acquisition method, camera assembly and mobile terminal
CN111031297B (zh) Image sensor, control method, camera assembly and mobile terminal
CN112235485B (zh) Image sensor, image processing method, imaging apparatus, terminal and readable storage medium
US20220279108A1 Image sensor and mobile terminal
CN112738494B (zh) Image processing method, image processing system, terminal device and readable storage medium
WO2022141743A1 (zh) Image processing method, image processing system, electronic device and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20932316

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20932316

Country of ref document: EP

Kind code of ref document: A1