WO2021196553A1 - High dynamic range image processing system and method, electronic device, and readable storage medium - Google Patents

High dynamic range image processing system and method, electronic device, and readable storage medium

Info

Publication number
WO2021196553A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
high dynamic
color
original image
dynamic range
Prior art date
Application number
PCT/CN2020/119959
Other languages
English (en)
Chinese (zh)
Inventor
杨鑫 (Yang Xin)
Original Assignee
Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2021196553A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 23/81 - Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N 23/84 - Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/88 - Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Definitions

  • This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and a non-volatile computer-readable storage medium.
  • a camera may be provided in an electronic device such as a mobile phone to realize a photographing function.
  • An image sensor for receiving light can be set in the camera.
  • the image sensor may be provided with a filter array.
  • the embodiments of the present application provide a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and a non-volatile computer-readable storage medium.
  • the embodiment of the present application provides a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, an image fusion module and a high dynamic range image processing module.
  • The image sensor includes a pixel array. The pixel array includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels. The pixel array includes minimal repeating units; each minimal repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels.
  • When the pixel array in the image sensor is exposed, for the plurality of photosensitive pixels in the same sub-unit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one panchromatic photosensitive pixel is exposed for a third exposure time that is less than the first exposure time.
  • The first color information generated by the single-color photosensitive pixels exposed for the first exposure time forms a first color original image.
  • The second color information generated by the single-color photosensitive pixels exposed for the second exposure time forms a second color original image.
  • The panchromatic photosensitive pixels exposed for the third exposure time generate a first panchromatic original image.
  • The image fusion module and the high dynamic range image processing module are used to perform fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain a first high dynamic range image.
  • the first high dynamic range image includes a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
  • the first high dynamic range image is processed by an image processor to obtain a second high dynamic range image.
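The exposure scheme above is what makes high dynamic range recovery possible: where the long (first) exposure clips, a brightness-aligned short exposure still holds usable detail. The single-channel sketch below illustrates only that idea; the function name, saturation threshold, and alignment-by-exposure-ratio choice are assumptions for illustration, not the patent's actual fusion algorithm.

```python
import numpy as np

def fuse_hdr(long_img, short_img, t_long, t_short, sat=0.95, full_scale=1.0):
    """Fuse a long-exposure and a short-exposure capture of the same scene.

    Brightness-align the short exposure by the exposure-time ratio, then
    use it wherever the long exposure is saturated (clipped highlights).
    """
    ratio = t_long / t_short                  # exposure-time ratio
    aligned_short = short_img * ratio         # brightness alignment
    saturated = long_img >= sat * full_scale  # long exposure clipped here
    return np.where(saturated, aligned_short, long_img)

# Toy example: the third pixel clips in the long exposure but not the short one.
long_img = np.array([0.2, 0.5, 1.0])    # 1.0 = clipped
short_img = np.array([0.05, 0.125, 0.4])
hdr = fuse_hdr(long_img, short_img, t_long=4.0, t_short=1.0)
```

In the toy example the fused value for the clipped pixel comes from the aligned short exposure, extending the representable range past the sensor's clip point.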
  • the embodiments of the present application provide a high dynamic range image processing method.
  • the high dynamic range image processing method is used in a high dynamic range image processing system.
  • The high dynamic range image processing system includes an image sensor. The image sensor includes a pixel array, the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the full-color photosensitive pixels.
  • the pixel array includes a minimum repeating unit, each of the minimum repeating units includes a plurality of sub-units, and each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • The high dynamic range image processing method includes: controlling the exposure of the pixel array, wherein, for a plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time that is less than the first exposure time.
  • The first color information generated by the single-color photosensitive pixels exposed for the first exposure time forms a first color original image.
  • The second color information generated by the single-color photosensitive pixels exposed for the second exposure time forms a second color original image.
  • The panchromatic photosensitive pixels exposed for the third exposure time generate a first panchromatic original image.
  • performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain a first high dynamic range image.
  • the first high dynamic range image includes a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
  • the first high dynamic range image is processed by an image processor to obtain a second high dynamic range image.
  • the embodiment of the present application provides an electronic device.
  • the electronic device includes a lens, a housing, and the above-mentioned high dynamic range image processing system.
  • the lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
  • the embodiments of the present application provide a non-volatile computer-readable storage medium containing a computer program.
  • When the computer program is executed by a processor, the processor is caused to execute the above-mentioned high dynamic range image processing method.
  • FIG. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a pixel array according to an embodiment of the present application.
  • FIG. 3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the arrangement of the smallest repeating unit in a pixel array according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an original image output by an image sensor according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an image fusion processing principle according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of another image fusion processing principle according to an embodiment of the present application.
  • FIG. 14 is a schematic diagram of brightness alignment processing according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a high dynamic range processing principle according to an embodiment of the present application.
  • FIG. 16 is a schematic diagram of another high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 17 is a schematic diagram of lens shading correction processing according to an embodiment of the present application.
  • FIG. 18 is a schematic diagram of another high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 19 is a schematic diagram of another high dynamic range processing principle according to an embodiment of the present application.
  • FIG. 20 is a schematic diagram of an original image output by another image sensor according to an embodiment of the present application.
  • FIG. 21 is a schematic diagram of another high dynamic range processing principle according to an embodiment of the present application.
  • FIG. 22 is a schematic diagram of another image fusion processing principle according to an embodiment of the present application.
  • FIG. 23 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 24 is a schematic flowchart of a method for acquiring a high dynamic range image according to an embodiment of the present application.
  • FIG. 25 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor according to an embodiment of the present application.
  • the high dynamic range image processing system 100 includes an image sensor 10, an image fusion module 20 and a high dynamic range image processing module 30.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, each minimum repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
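For intuition about the pixel-array terms, the sketch below tiles one hypothetical minimal repeating unit: four 2×2 sub-units, each holding two panchromatic (W) pixels and two single-color pixels. This specific arrangement is an assumption for illustration; the patent's FIGS. 5 to 10 show the actual variants.

```python
# One hypothetical 4x4 minimal repeating unit: four 2x2 sub-units,
# each containing two panchromatic (W) pixels and two single-color pixels.
MINIMAL_REPEATING_UNIT = [
    ["W", "R", "W", "G"],
    ["R", "W", "G", "W"],
    ["W", "G", "W", "B"],
    ["G", "W", "B", "W"],
]

def tile_pixel_array(unit, rows, cols):
    """Tile the minimal repeating unit to build a full pixel-array layout."""
    n = len(unit)
    return [[unit[r % n][c % n] for c in range(cols)] for r in range(rows)]

layout = tile_pixel_array(MINIMAL_REPEATING_UNIT, 8, 8)
```

Tiling by modular indexing is exactly what "minimal repeating unit" means: the full array is this block repeated in both directions.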
  • When the pixel array 11 in the image sensor 10 is exposed, for the plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time that is less than the first exposure time.
  • the first color information generated by the single-color photosensitive pixels exposed at the first exposure time obtains the first color original image
  • the second color information generated by the single-color photosensitive pixels exposed at the second exposure time obtains the second color original image
  • the full-color photosensitive pixels exposed at the third exposure time generate a first full-color original image.
  • The image fusion module 20 and the high dynamic range image processing module 30 are used to perform fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image.
  • the first high dynamic range image includes a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
  • the first high dynamic range image is processed by the image processor 40 to obtain a second high dynamic range image.
  • Part of the panchromatic photosensitive pixels in the same subunit are exposed for a fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed for the third exposure time; the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time.
  • The image fusion module 20 is used to fuse the first color original image and the second panchromatic original image into a first intermediate image, and to fuse the second color original image and the first panchromatic original image into a second intermediate image; the second panchromatic information generated by the panchromatic photosensitive pixels exposed for the fourth exposure time forms the second panchromatic original image.
  • the high dynamic range image processing module 30 is used to fuse the first intermediate image and the second intermediate image into a first high dynamic range image.
  • the high dynamic range image processing module 30 includes a high dynamic range image processing unit 31 and a brightness mapping unit 33.
  • the high dynamic range image processing unit 31 is used for fusing the first intermediate image and the second intermediate image into a third high dynamic range image.
  • the brightness mapping unit 33 is configured to perform brightness mapping on the third high dynamic range image to obtain the first high dynamic range image.
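Brightness mapping compresses the extended range of the third high dynamic range image into the range downstream units expect. The sketch below uses a simple global Reinhard-style curve as a stand-in; the patent does not specify this particular curve, so every detail here is an illustrative assumption.

```python
import numpy as np

def brightness_map(hdr, out_max=255.0):
    """Compress an HDR image's brightness range into [0, out_max].

    Normalizes by the scene mean, then applies the smooth roll-off
    x / (1 + x) so highlights compress gracefully. This global curve is
    only an illustrative choice, not the patent's specified mapping.
    """
    x = hdr / (hdr.mean() + 1e-8)   # normalize around the mean brightness
    mapped = x / (1.0 + x)          # smooth roll-off of highlights
    return mapped / mapped.max() * out_max

hdr = np.array([0.1, 1.0, 10.0, 100.0])
ldr = brightness_map(hdr)
```

The mapping is monotone, so relative ordering of brightness is preserved while the 1000:1 input range is squeezed into the output range.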
  • the high dynamic range image processing module 30 includes a high dynamic range image processing unit 31, a lens shading correction unit 37 and a statistical unit 35.
  • the high dynamic range image processing unit 31 is used for fusing the first intermediate image and the second intermediate image into a third high dynamic range image;
  • the lens shading correction unit 37 is used for correcting the third high dynamic range image to obtain a high dynamic range corrected image;
  • the statistical unit 35 is used to process the high dynamic range correction image to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
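The statistical unit's job is to reduce an image to compact statistics that the image processor 40 consumes for its exposure and white-balance loops. The sketch below computes two typical examples, mean luma for automatic exposure and gray-world gains for automatic white balance; these particular statistics are assumptions for illustration, not the ones the patent prescribes.

```python
import numpy as np

def compute_3a_statistics(rgb):
    """Reduce an RGB image to simple AE/AWB statistics.

    mean_luma feeds automatic exposure; the gray-world gains feed
    automatic white balance. Both are common illustrative choices.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mean_luma = float((0.299 * r + 0.587 * g + 0.114 * b).mean())
    # Gray-world assumption: scale R and B so their means match G's mean.
    awb_gains = (float(g.mean() / r.mean()), 1.0, float(g.mean() / b.mean()))
    return {"mean_luma": mean_luma, "awb_gains": awb_gains}

img = np.full((4, 4, 3), [0.4, 0.5, 0.25])  # slightly warm, dim patch
stats = compute_3a_statistics(img)
```

The image processor would compare `mean_luma` against a target to adjust exposure times, and apply `awb_gains` per channel to neutralize the color cast.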
  • The high dynamic range image processing module 30 includes a statistical unit 35. The statistical unit 35 is used to process the first intermediate image and the second intermediate image to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • Part of the panchromatic photosensitive pixels in the same subunit are exposed for a fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed for the third exposure time; the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time.
  • The high dynamic range image processing module 30 is used to fuse the first color original image and the second color original image into a first high dynamic color original image, and to fuse the first panchromatic original image and the second panchromatic original image into a first high dynamic panchromatic original image.
  • the image fusion module 20 is used for fusing the first high dynamic color original image and the first high dynamic full color original image into a first high dynamic range image.
  • the high dynamic range image processing module 30 further includes a high dynamic range image processing unit 31 and a brightness mapping unit 33.
  • The high dynamic range image processing unit 31 is used to fuse the first color original image and the second color original image into a second high dynamic color original image, and to fuse the first panchromatic original image and the second panchromatic original image into a second high dynamic panchromatic original image;
  • The brightness mapping unit 33 is used to perform brightness mapping on the second high dynamic color original image to obtain the first high dynamic color original image, and to perform brightness mapping on the second high dynamic panchromatic original image to obtain the first high dynamic panchromatic original image.
  • the high dynamic range image processing module 30 further includes a high dynamic range image processing unit 31, a lens shading correction unit 37 and a statistics unit 35.
  • The high dynamic range image processing unit 31 is used to fuse the first color original image and the second color original image into a second high dynamic color original image, and to fuse the first panchromatic original image and the second panchromatic original image into a second high dynamic panchromatic original image;
  • The lens shading correction unit 37 is used to correct the second high dynamic color original image to obtain a high dynamic color corrected image, and to correct the second high dynamic panchromatic original image to obtain a high dynamic panchromatic corrected image;
  • The statistical unit 35 is used to process the high dynamic color corrected image and the high dynamic panchromatic corrected image to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • All panchromatic photosensitive pixels in the same subunit are exposed for the third exposure time; the high dynamic range image processing module 30 is used to fuse the first color original image and the second color original image into a first high dynamic color original image; and the image fusion module 20 is used to fuse the first high dynamic color original image and the first panchromatic original image into a first high dynamic range image.
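One common way to fuse a color image with a co-sited panchromatic image is luminance transfer: scale each color pixel so its brightness follows the W measurement, which gathers more light. The sketch below shows that ratio-based technique as an assumed stand-in for the image fusion module's algorithm, which this excerpt does not detail.

```python
import numpy as np

def fuse_color_with_panchromatic(color, panchromatic, eps=1e-6):
    """Refine a color image using a co-sited panchromatic (W) image.

    Computes a rough luminance from the color channels, then applies a
    per-pixel gain so the fused luminance tracks the cleaner, brighter
    panchromatic measurement. Ratio-based luminance transfer is one
    common technique, not necessarily the patent's exact algorithm.
    """
    luma = color.mean(axis=-1, keepdims=True)      # rough luminance of color image
    gain = panchromatic[..., None] / (luma + eps)  # per-pixel luminance transfer
    return color * gain

color = np.array([[[0.2, 0.4, 0.1]]])  # one pixel; mean luma ~0.233
pan = np.array([[0.35]])               # brighter, cleaner W measurement
fused = fuse_color_with_panchromatic(color, pan)
```

Because the gain is applied uniformly to R, G, and B, the chromaticity of each pixel is preserved while its brightness inherits the panchromatic channel's higher signal-to-noise ratio.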
  • the high dynamic range image processing module 30 further includes a high dynamic range image processing unit 31 and a brightness mapping unit 33.
  • The high dynamic range image processing unit 31 is used to fuse the first color original image and the second color original image into a second high dynamic color original image;
  • The brightness mapping unit 33 is used to perform brightness mapping on the second high dynamic color original image to obtain the first high dynamic color original image.
  • the high dynamic range image processing module 30 further includes a high dynamic range image processing unit 31, a lens shading correction unit 37 and a statistics unit 35.
  • the high dynamic range image processing unit 31 is used for fusing the first color original image and the second color original image into a second high dynamic color original image.
  • the lens shading correction unit 37 is used to correct the second high dynamic color original image to obtain a high dynamic color corrected image.
  • the statistical unit 35 is used to process the high dynamic color correction image and the first full-color original image to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • both the image fusion module 20 and the high dynamic range image processing module 30 are integrated in the image sensor.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 includes an image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • The high dynamic range image processing method includes:
  • The first high dynamic range image includes a plurality of color image pixels, the plurality of color image pixels are arranged in a Bayer array, and the first high dynamic range image is processed by the image processor to obtain the second high dynamic range image.
  • Part of the panchromatic photosensitive pixels in the same subunit are exposed for a fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed for the third exposure time; the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time, and the second panchromatic information generated by the panchromatic photosensitive pixels exposed for the fourth exposure time forms a second panchromatic original image.
  • Performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second panchromatic original image into a first intermediate image; fusing the second color original image and the first panchromatic original image into a second intermediate image; and fusing the first intermediate image and the second intermediate image into the first high dynamic range image.
  • Fusing the first intermediate image and the second intermediate image into the first high dynamic range image includes: fusing the first intermediate image and the second intermediate image into a third high dynamic range image; and performing brightness mapping on the third high dynamic range image to obtain the first high dynamic range image.
  • the high dynamic range image processing method further includes: fusing the first intermediate image and the second intermediate image into a third high dynamic range image; obtaining a high dynamic range corrected image from the third high dynamic range image; And processing the high dynamic range correction image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
  • the high dynamic range image processing method further includes: processing the first intermediate image and the second intermediate image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing .
  • Part of the panchromatic photosensitive pixels in the same subunit are exposed for the fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed for the third exposure time; the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time, and the second panchromatic information generated by the panchromatic photosensitive pixels exposed for the fourth exposure time forms the second panchromatic original image.
  • Performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second color original image into a first high dynamic color original image; fusing the first panchromatic original image and the second panchromatic original image into a first high dynamic panchromatic original image; and fusing the first high dynamic color original image and the first high dynamic panchromatic original image into the first high dynamic range image.
  • Fusing the first color original image and the second color original image into the first high dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into the first high dynamic panchromatic original image, includes: fusing the first color original image and the second color original image into a second high dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a second high dynamic panchromatic original image; and performing brightness mapping on the second high dynamic color original image to obtain the first high dynamic color original image, and performing brightness mapping on the second high dynamic panchromatic original image to obtain the first high dynamic panchromatic original image.
  • The high dynamic range image processing method includes: fusing the first color original image and the second color original image into a second high dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a second high dynamic panchromatic original image.
  • The high dynamic color corrected image and the high dynamic panchromatic corrected image are processed to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • All panchromatic photosensitive pixels in the same subunit are exposed for the third exposure time. In this case, performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second color original image into a first high dynamic color original image; and fusing the first high dynamic color original image and the first panchromatic original image into the first high dynamic range image.
  • Fusing the first color original image and the second color original image into the first high dynamic color original image includes: fusing the first color original image and the second color original image into a second high dynamic color original image; and performing brightness mapping on the second high dynamic color original image to obtain the first high dynamic color original image.
  • The high dynamic range image processing method includes: fusing the first color original image and the second color original image into a second high dynamic color original image; correcting the second high dynamic color original image to obtain a high dynamic color corrected image; and processing the high dynamic color corrected image and the first full-color original image to obtain statistical data, which is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 described in any one of the above embodiments.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
  • When the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the foregoing embodiments.
  • the high dynamic range image processing system 100 of the embodiment of the present application performs fusion algorithm processing and high dynamic range processing on the panchromatic original image and color original image output by the image sensor 10 through the image fusion module 20 and the high dynamic range image processing module 30.
  • The first high dynamic range image is input into the image processor 40 to complete subsequent processing, thereby solving the problem that the image processor 40 cannot directly process images whose image pixels are arranged in a non-Bayer array.
  • FIG. 2 is a schematic diagram of the image sensor 10 in the embodiment of the present application.
  • the image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14 and a horizontal driving unit 15.
  • the image sensor 10 may adopt a complementary metal oxide semiconductor (CMOS, Complementary Metal Oxide Semiconductor) photosensitive element or a charge-coupled device (CCD, Charge-coupled Device) photosensitive element.
  • The pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (i.e., arranged in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4).
  • Each photosensitive pixel 110 converts light into electric charge according to the intensity of light incident thereon.
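The accumulated charge grows roughly linearly with incident intensity and exposure time until the pixel's full-well capacity clips it, which is why the first, second, and third exposure times above yield differently scaled measurements of the same scene. A toy model (the units and full-well value are arbitrary assumptions):

```python
def accumulated_charge(intensity, exposure_time, full_well=10000.0):
    """Toy photoelectric-conversion model: charge grows linearly with
    incident light intensity and exposure time until the full-well
    capacity clips it (the saturation that HDR fusion works around)."""
    return min(intensity * exposure_time, full_well)

# A bright pixel saturates at the long exposure but not at the short one.
long_q = accumulated_charge(intensity=500.0, exposure_time=30.0)   # clipped
short_q = accumulated_charge(intensity=500.0, exposure_time=10.0)  # in range
```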
  • the vertical driving unit 12 includes a shift register and an address decoder.
  • the vertical drive unit 12 includes readout scanning and reset scanning functions.
  • the readout scan refers to sequentially scanning the unit photosensitive pixels 110 line by line, and reading signals from these unit photosensitive pixels 110 line by line.
  • the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14.
  • The reset scan resets the charge: the photocharge of the photoelectric conversion element is discarded, so that accumulation of new photocharge can be started.
  • the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing.
  • the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated.
  • the signals of the photosensitive pixels 110 in a row are obtained.
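The level-difference computation described above (correlated double sampling) can be sketched as follows; the array-based formulation is illustrative, and the subtraction order (signal level minus reset level) is the conventional one, which the text leaves implicit:

```python
import numpy as np

def correlated_double_sample(reset_levels, signal_levels):
    """Correlated double sampling: subtract each pixel's reset level
    from its signal level so the fixed reset offset cancels out,
    leaving only the photo-generated part of the signal."""
    reset_levels = np.asarray(reset_levels, dtype=np.int32)
    signal_levels = np.asarray(signal_levels, dtype=np.int32)
    return signal_levels - reset_levels

# One row of photosensitive pixels: every signal level contains the same
# kind of reset offset, so the difference isolates the exposure signal.
reset = [100, 102, 98, 101]
signal = [160, 180, 98, 151]
print(correlated_double_sample(reset, signal))  # [60 78  0 50]
```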
  • the column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
  • the horizontal driving unit 15 includes a shift register and an address decoder.
  • the horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14, and is sequentially output.
  • control unit 13 configures timing signals according to the operation mode, and uses various timing signals to control the vertical driving unit 12, the column processing unit 14 and the horizontal driving unit 15 to work together.
  • FIG. 3 is a schematic diagram of a photosensitive pixel 110 in an embodiment of the present application.
  • the photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a micro lens 113. Along the light-receiving direction of the photosensitive pixel 110, the microlens 113, the filter 112, and the pixel circuit 111 are arranged in sequence.
  • the microlens 113 is used for condensing light
  • the filter 112 is used for passing light of a certain waveband and filtering out the light of other wavebands.
  • the pixel circuit 111 is used to convert the received light into electrical signals, and provide the generated electrical signals to the column processing unit 14 shown in FIG. 2.
  • FIG. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 in an embodiment of the present application.
  • the pixel circuit 111 in FIG. 4 can be applied to each photosensitive pixel 110 (shown in FIG. 3) in the pixel array 11 shown in FIG. 2.
  • the working principle of the pixel circuit 111 will be described below with reference to FIGS. 2 to 4.
  • the pixel circuit 111 includes a photoelectric conversion element 1111 (for example, a photodiode), an exposure control circuit (for example, a transfer transistor 1112), a reset circuit (for example, a reset transistor 1113), an amplification circuit (for example, an amplification transistor 1114), and a selection circuit (for example, a selection transistor 1115).
  • the transfer transistor 1112, the reset transistor 1113, the amplifying transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.
  • the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected to the ground, for example.
  • the photodiode converts the received light into electric charge.
  • the cathode of the photodiode is connected to the floating diffusion unit FD via an exposure control circuit (for example, a transfer transistor 1112).
  • the floating diffusion unit FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
  • the exposure control circuit is a transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112.
  • when a pulse of an active level (for example, a VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on.
  • the transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.
  • the drain of the reset transistor 1113 is connected to the pixel power supply VPIX.
  • the source of the reset transistor 1113 is connected to the floating diffusion unit FD.
  • when a pulse of an effective reset level is transmitted to the gate of the reset transistor 1113 via the reset line, the reset transistor 1113 is turned on.
  • the reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
  • the gate of the amplifying transistor 1114 is connected to the floating diffusion unit FD.
  • the drain of the amplifying transistor 1114 is connected to the pixel power supply VPIX.
  • after the floating diffusion unit FD is reset by the reset transistor 1113, the amplifying transistor 1114 outputs the reset level through the output terminal OUT via the selection transistor 1115; after the charge of the photodiode is transferred by the transfer transistor 1112, the amplifying transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
  • the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114.
  • the source of the selection transistor 1115 is connected to the column processing unit 14 in FIG. 2 through the output terminal OUT.
  • when a pulse of an effective level is transmitted to the gate of the selection transistor 1115 via a selection line, the selection transistor 1115 is turned on.
  • the signal output by the amplifying transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.
  • the pixel structure of the pixel circuit 111 in the embodiment of the present application is not limited to the structure shown in FIG. 4.
  • the pixel circuit 111 may also have a three-transistor pixel structure, in which the functions of the amplifying transistor 1114 and the selecting transistor 1115 are performed by one transistor.
  • the exposure control circuit is not limited to the way of a single transfer transistor 1112, and other electronic devices or structures with the function of controlling the conduction of the control terminal can be used as the exposure control circuit in the embodiment of the present application.
  • the implementation of the transfer transistor 1112 is simple, low in cost, and easy to control.
  • FIGS. 5 to 10 are schematic diagrams of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the pixel array 11 (shown in FIG. 2) according to some embodiments of the present application.
  • the photosensitive pixels 110 include two types, one is a full-color photosensitive pixel W, and the other is a color photosensitive pixel.
  • FIGS. 5 to 10 only show the arrangement of a plurality of photosensitive pixels 110 in one minimum repeating unit. The smallest repeating unit shown in FIGS. 5 to 10 is copied multiple times in rows and columns to form the pixel array 11. Each minimum repeating unit is composed of multiple full-color photosensitive pixels W and multiple color photosensitive pixels. Each minimum repeating unit includes multiple subunits.
  • Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels W.
  • the full-color photosensitive pixel W and the color photosensitive pixel in each sub-unit are alternately arranged.
  • multiple photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category; or, multiple photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • FIG. 5 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit of an embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110
  • the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner in FIG. 5), and two second type subunits UB are arranged in the second diagonal direction D2 (for example, the direction connecting the upper right corner and the lower left corner in FIG. 5).
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • first diagonal direction D1 may also be a direction connecting the upper right corner and the lower left corner
  • second diagonal direction D2 may also be a direction connecting the upper left corner and the lower right corner
  • the "direction" here is not a single pointing; it can be understood as a "straight line" indicating the arrangement, and the line may extend in both directions.
  • the explanation of the first diagonal direction D1 and the second diagonal direction D2 in FIGS. 6 to 10 is the same as here.
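The minimum-repeating-unit description above (a 4x4 unit of 2x2 subunits, full-color pixels W alternating with one color per subunit, UA and UC on one diagonal, the two UB on the other) can be sketched in code. The concrete 4x4 layout below is one plausible reading consistent with the text, not a reproduction of FIG. 5 itself:

```python
import numpy as np

# One 4x4 minimum repeating unit consistent with the description:
# subunit UA (W/A) upper left, UC (W/C) lower right on the first
# diagonal; the two UB (W/B) subunits on the second diagonal.
# (This concrete layout is an assumption for illustration.)
UNIT = np.array([
    ["W", "A", "W", "B"],
    ["A", "W", "B", "W"],
    ["W", "B", "W", "C"],
    ["B", "W", "C", "W"],
])

def tile_pixel_array(unit, unit_rows, unit_cols):
    """Copy the minimum repeating unit in rows and columns to form
    the pixel array 11, as described above."""
    return np.tile(unit, (unit_rows, unit_cols))

array = tile_pixel_array(UNIT, 2, 2)   # an 8x8 pixel array
print(array.shape)                     # (8, 8)

# Each 2x2 subunit alternates W with a single color, so it holds
# exactly two panchromatic pixels W:
for r in range(0, array.shape[0], 2):
    for c in range(0, array.shape[1], 2):
        assert (array[r:r + 2, c:c + 2] == "W").sum() == 2
```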
  • FIG. 6 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in a minimum repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 36 photosensitive pixels 110 in 6 rows and 6 columns, and the sub-units are 9 photosensitive pixels 110 in 3 rows and 3 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 7 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the minimum repeating unit is 8 rows and 8 columns and 64 photosensitive pixels 110
  • the sub-units are 4 rows and 4 columns and 16 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 8 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 8 is roughly the same as the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 5, except that the alternating order of the full-color photosensitive pixels W and the single-color photosensitive pixels in the second type subunit UB in the lower left corner of FIG. 8 is inconsistent with that in the second type subunit UB in the lower left corner of FIG. 5, and the alternating order of the full-color photosensitive pixels W and the single-color photosensitive pixels in the third type subunit UC in the lower right corner of FIG. 8 is inconsistent with that in the third type subunit UC in the lower right corner of FIG. 5.
  • specifically, in the second type subunit UB in the lower left corner of FIG. 5, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., second-color photosensitive pixel B), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (i.e., second-color photosensitive pixel B), full-color photosensitive pixel W; whereas in the second type subunit UB in the lower left corner of FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (i.e., second-color photosensitive pixel B), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., second-color photosensitive pixel B).
  • similarly, in the third type subunit UC in the lower right corner of FIG. 5, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W; whereas in the third type subunit UC in the lower right corner of FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C).
  • in addition, in FIG. 8, the alternating order in the first type subunit UA is inconsistent with the alternating order in the third type subunit UC: in the first type subunit UA shown in FIG. 8, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., first-color photosensitive pixel A), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (i.e., first-color photosensitive pixel A), full-color photosensitive pixel W; whereas in the third type subunit UC shown in FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (i.e., third-color photosensitive pixel C), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (i.e., third-color photosensitive pixel C). That is to say, in the same minimum repeating unit, the alternating order of full-color photosensitive pixels W and color photosensitive pixels in different subunits can be the same (as shown in FIG. 5) or inconsistent (as shown in FIG. 8).
  • FIG. 9 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all third-color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 10 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • a plurality of photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all third-color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in a first diagonal direction D1
  • two second type subunits UB are arranged in a second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • multiple photosensitive pixels 110 in the same row in some sub-units may be photosensitive pixels 110 of the same category, while multiple photosensitive pixels 110 in the same column in the remaining sub-units are photosensitive pixels 110 of the same category.
  • the first color photosensitive pixel A may be a red photosensitive pixel R; the second color photosensitive pixel B may be a green photosensitive pixel G; and the third color photosensitive pixel C may be Blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a red photosensitive pixel R; the second color photosensitive pixel B may be a yellow photosensitive pixel Y; and the third color photosensitive pixel C may be Blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a magenta photosensitive pixel M; the second color photosensitive pixel B may be a cyan photosensitive pixel Cy; and the third color photosensitive pixel C may be a yellow photosensitive pixel Y.
  • the response band of the full-color photosensitive pixel W may be the visible light band (for example, 400 nm-760 nm).
  • the full-color photosensitive pixel W is provided with an infrared filter to filter out infrared light.
  • the response wavebands of the full-color photosensitive pixel W are the visible and near-infrared wavebands (for example, 400 nm-1000 nm), matching the response waveband of the photoelectric conversion element 1111 (shown in FIG. 4) in the image sensor 10 (shown in FIG. 1).
  • the full-color photosensitive pixel W may not be provided with a filter, or may be provided with a filter that passes light of all wavebands.
  • the response waveband of the full-color photosensitive pixel W is determined by the response waveband of the photoelectric conversion element 1111, that is, the two match.
  • the embodiments of the present application include, but are not limited to, the above-mentioned waveband range.
  • the control unit 13 controls the pixel array 11 to expose.
  • at least one single-color photosensitive pixel is exposed with a first exposure time
  • at least one single-color photosensitive pixel is exposed with a second exposure time less than the first exposure time
  • at least one full-color photosensitive pixel W is exposed with a third exposure time that is less than or equal to the first exposure time.
  • the plurality of single-color photosensitive pixels exposed at the first exposure time in the pixel array 11 may generate first color information
  • the plurality of single-color photosensitive pixels exposed at the second exposure time may generate second color information, and the plurality of panchromatic photosensitive pixels W exposed at the third exposure time may generate panchromatic information.
  • the first color information may form a first color original image.
  • the second color information can form a second color original image.
  • Panchromatic information can generate a panchromatic original image.
  • part of the panchromatic photosensitive pixels W in the same subunit is exposed at the fourth exposure time, and the remaining panchromatic photosensitive pixels W are exposed at the third exposure time.
  • the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time.
  • for example, one single-color photosensitive pixel is exposed for the first exposure time (for example, the long exposure time L shown in FIG. 11), one single-color photosensitive pixel is exposed for the second exposure time (for example, the short exposure time S shown in FIG. 11), one full-color photosensitive pixel W is exposed for the third exposure time (for example, the short exposure time S shown in FIG. 11), and one full-color photosensitive pixel W is exposed for the fourth exposure time (for example, the long exposure time L shown in FIG. 11).
  • the exposure process of the pixel array 11 may be: (1) the photosensitive pixels 110 exposed at the first exposure time, the photosensitive pixels 110 exposed at the second exposure time, the photosensitive pixels 110 exposed at the third exposure time, and the photosensitive pixels 110 exposed at the fourth exposure time are exposed sequentially (the exposure order of the four is not limited), and the exposure times of the four do not overlap; or (2) the four are exposed sequentially (the exposure order of the four is not limited), and the exposure times of the four partially overlap; or (3) the exposure times of all the photosensitive pixels 110 exposed with a shorter exposure time fall within the exposure time of the photosensitive pixels 110 exposed with the longest exposure time; for example, the exposure time of all the single-color photosensitive pixels exposed at the second exposure time is within the exposure time of all the single-color photosensitive pixels exposed at the first exposure time.
  • the image sensor 10 can output four original images, which are: (1) the first color original image, composed of the first color information generated by the multiple single-color photosensitive pixels exposed with the long exposure time L (first exposure time); (2) the second color original image, composed of the second color information generated by the multiple single-color photosensitive pixels exposed with the short exposure time S (second exposure time); (3) the first panchromatic original image, composed of the first panchromatic information generated by the multiple panchromatic photosensitive pixels W exposed with the short exposure time S (third exposure time); (4) the second panchromatic original image, composed of the second panchromatic information generated by the multiple panchromatic photosensitive pixels W exposed with the long exposure time L (fourth exposure time).
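As a rough sketch of the readout described above, the following splits one mosaicked capture into the four original images. The checkerboard panchromatic mask and the row-wise long/short exposure assignment are assumptions for illustration; the text fixes only which exposure time feeds which original image:

```python
import numpy as np

H, W = 4, 4
raw = np.arange(H * W, dtype=np.float32).reshape(H, W)  # stand-in readout

# Assumed masks: panchromatic pixels W on one checkerboard, color pixels
# on the other; within each class, long (L) / short (S) alternate by row.
is_panchromatic = (np.indices((H, W)).sum(axis=0) % 2) == 0
is_long = (np.indices((H, W))[0] % 2) == 0

first_color  = np.where(~is_panchromatic &  is_long, raw, np.nan)  # color, L
second_color = np.where(~is_panchromatic & ~is_long, raw, np.nan)  # color, S
first_pan    = np.where( is_panchromatic & ~is_long, raw, np.nan)  # W, S
second_pan   = np.where( is_panchromatic &  is_long, raw, np.nan)  # W, L

# Every photosensitive pixel lands in exactly one of the four images.
count = sum(np.count_nonzero(~np.isnan(img))
            for img in (first_color, second_color, first_pan, second_pan))
assert count == H * W
```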
  • the image fusion module 20 performs fusion processing on the first color original image and the second panchromatic original image to obtain a first intermediate image, and performs fusion processing on the second color original image and the first panchromatic original image to obtain a second intermediate image.
  • the image fusion module 20 first separates the color and brightness of the first color original image to obtain a color-brightness separated image.
  • LIT stands for brightness and CLR stands for color.
  • the image fusion module 20 can convert the first color original image in RGB space into a color-brightness separated image in YCrCb space.
  • Y in YCrCb is the brightness LIT in the color-brightness separated image
  • Cr and Cb in YCrCb are the color CLR in the color-brightness separated image.
  • the image fusion module 20 can also convert the first color original image in RGB space into a color-brightness separated image in Lab space.
  • L in Lab is the brightness LIT in the color-brightness separated image
  • a and b in Lab are the color CLR in the color-brightness separated image.
  • the LIT+CLR in the color-brightness separated image shown in FIG. 12 does not mean that the pixel value of each image pixel is formed by adding LIT and CLR; it only indicates that the pixel value of each image pixel is composed of LIT and CLR.
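The color-brightness separation described above can be sketched with a standard RGB to YCbCr conversion and its inverse. The BT.601 luma weights below are one concrete choice (an assumption); the embodiment only requires converting the RGB color original image into a YCrCb (or Lab) color-brightness separated image:

```python
import numpy as np

KR, KG, KB = 0.299, 0.587, 0.114  # BT.601 luma weights (assumed choice)

def rgb_to_ycbcr(rgb):
    """Color-brightness separation: Y is the brightness LIT, and the
    chroma pair (Cb, Cr), centred on 128, is the color CLR."""
    rgb = np.asarray(rgb, dtype=np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = KR * r + KG * g + KB * b
    cb = 128.0 + 0.5 * (b - y) / (1.0 - KB)
    cr = 128.0 + 0.5 * (r - y) / (1.0 - KR)
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycbcr):
    """Exact inverse, used after the brightness LIT has been corrected."""
    ycbcr = np.asarray(ycbcr, dtype=np.float64)
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1] - 128.0, ycbcr[..., 2] - 128.0
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG        # recover G from the Y definition
    return np.stack([r, g, b], axis=-1)

pixel = np.array([[200.0, 120.0, 40.0]])                      # one RGB pixel
assert np.allclose(ycbcr_to_rgb(rgb_to_ycbcr(pixel)), pixel)  # round trip
```

Because the inverse is derived algebraically from the same luma weights, the round trip is exact up to floating-point precision.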
  • the image fusion module 20 fuses the brightness of the color-brightness separated image and the brightness of the second full-color original image.
  • the pixel value of each panchromatic image pixel W in the second panchromatic original image is the brightness value of that panchromatic image pixel.
  • the image fusion module 20 can add the LIT of each image pixel in the color-brightness separated image to the pixel value W of the panchromatic image pixel at the corresponding position in the second panchromatic original image to obtain a brightness-corrected pixel value.
  • the image fusion module 20 forms a brightness-corrected color-brightness separated image according to a plurality of brightness-corrected pixel values, and then uses color space conversion to convert the brightness-corrected color-brightness separated image into a first intermediate image.
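A minimal sketch of the brightness-correction step above, assuming the LIT channel and the co-located panchromatic value are simply summed and clipped (the 10-bit clipping ceiling is an assumption; the text does not specify the output range):

```python
import numpy as np

def fuse_brightness(color_brightness_img, panchromatic_img, max_val=1023.0):
    """Brightness correction as described: add the panchromatic pixel
    value W to the brightness LIT of the co-located image pixel,
    leaving the color channels CLR untouched."""
    fused = color_brightness_img.astype(np.float64)
    fused[..., 0] = np.clip(fused[..., 0] + panchromatic_img, 0.0, max_val)
    return fused

ycbcr = np.array([[[300.0, 140.0, 110.0]]])  # LIT, Cb, Cr of one pixel
w = np.array([[250.0]])                      # co-located panchromatic value
out = fuse_brightness(ycbcr, w)
assert out[0, 0, 0] == 550.0                 # brightness raised by W
assert out[0, 0, 1] == 140.0 and out[0, 0, 2] == 110.0  # color unchanged
```

After this per-pixel correction, the brightness-corrected color-brightness separated image would be converted back to RGB (e.g., with the inverse color-space transform) to form the first intermediate image.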
  • the image fusion module 20 performs fusion processing on the second color original image and the first panchromatic original image to obtain a second intermediate image.
  • the acquisition process of the second intermediate image is the same as the acquisition process of the first intermediate image, and will not be repeated here.
  • the image fusion module 20 can also use other methods to perform fusion processing, which is not limited here.
  • the fusion processing of the color original image and the full-color original image by the image fusion module 20 can increase the brightness of the intermediate image obtained after fusion.
  • the first color original image is composed of first color information generated by multiple single-color photosensitive pixels exposed with a long exposure time L
  • the second panchromatic original image is likewise composed of the second panchromatic information generated by the plurality of panchromatic photosensitive pixels W exposed with the long exposure time L; therefore, the exposure time corresponding to all the image pixels in the first intermediate image obtained by fusing the first color original image and the second panchromatic original image is the long exposure time L.
  • the second color original image is composed of second color information generated by a plurality of single-color photosensitive pixels exposed with the short exposure time S, and the first panchromatic original image is composed of first panchromatic information generated by a plurality of panchromatic photosensitive pixels W also exposed with the short exposure time S; therefore, the exposure time corresponding to all image pixels in the second intermediate image obtained by fusing the second color original image and the first panchromatic original image is the short exposure time S.
  • the high dynamic range image processing module 30 includes a high dynamic range image processing unit 31 and a brightness mapping unit 33.
  • the high dynamic range image processing unit 31 is used to fuse the first intermediate image and the second intermediate image into a third high dynamic range image;
  • the brightness mapping unit 33 is used to perform brightness mapping on the third high dynamic range image to obtain the first high dynamic range image.
  • the process of the high dynamic range image processing unit 31 fusing the first intermediate image and the second intermediate image may include brightness alignment processing.
  • the high dynamic range image processing unit 31 performing brightness alignment processing on the first intermediate image and the second intermediate image includes the following steps: (1) identifying overexposed image pixels in the first intermediate image with pixel values greater than a first preset threshold; (2) for each overexposed image pixel, expanding a predetermined area centered on the overexposed image pixel; (3) finding, in the predetermined area, an intermediate image pixel with a pixel value less than the first preset threshold; (4) correcting the pixel value of the overexposed image pixel using the intermediate image pixel and the second intermediate image; (5) updating the first intermediate image with the corrected pixel values of the overexposed image pixels to obtain the brightness-aligned first intermediate image.
  • the high dynamic range image processing unit 31 expands a predetermined area with the overexposed image pixel P12 as the center, for example, the 3*3 area shown in FIG. 14.
  • it may also be a 4*4 area, a 5*5 area, a 10*10 area, etc., which is not limited here.
  • the high dynamic range image processing unit 31 searches for an intermediate image pixel with a pixel value less than the first preset threshold V0 in the 3*3 predetermined area, such as image pixel P21 in FIG. 14 (marked with a dotted circle in the first intermediate image in FIG. 14). If the pixel value V2 of image pixel P21 is less than the first preset threshold V0, image pixel P21 serves as the intermediate image pixel. Subsequently, the high dynamic range image processing unit 31 searches the second intermediate image for the image pixels corresponding to the overexposed image pixel P12 and the intermediate image pixel P21 respectively, that is, image pixel P1'2' and image pixel P2'1' (marked in the second intermediate image in FIG. 14).
  • the high dynamic range image processing unit 31 performs this brightness alignment process on each overexposed image pixel in the first intermediate image to obtain the first intermediate image after brightness alignment. Since the pixel value of the overexposed image pixel in the first intermediate image after brightness alignment is corrected, the pixel value of each image pixel in the first intermediate image after brightness alignment is relatively accurate.
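The brightness-alignment steps (1)-(5) above can be sketched as follows. The exact correction formula is not spelled out in the text; the sketch assumes the common ratio-based form V1 = V1' × V2 / V2', where V1' and V2' are the second-image values at the positions of the overexposed pixel and the found intermediate pixel. That formula, and the function name, are assumptions for illustration.

```python
import numpy as np

def align_brightness(first, second, v0, radius=1):
    """first, second: H x W intermediate images; v0: first preset threshold;
    radius=1 gives the 3*3 predetermined area from the example above."""
    out = first.astype(np.float64).copy()
    h, w = first.shape
    for i in range(h):
        for j in range(w):
            if first[i, j] <= v0:
                continue  # step (1): only overexposed pixels are corrected
            # steps (2)-(3): search the surrounding area for a pixel below v0
            for ni in range(max(0, i - radius), min(h, i + radius + 1)):
                for nj in range(max(0, j - radius), min(w, j + radius + 1)):
                    if (ni, nj) != (i, j) and first[ni, nj] < v0 and second[ni, nj] > 0:
                        # step (4): assumed correction V1 = V1' * V2 / V2'
                        out[i, j] = second[i, j] * first[ni, nj] / second[ni, nj]
                        break
                else:
                    continue
                break
    return out  # step (5): brightness-aligned first intermediate image
```

In practice this would be vectorized; the nested loops simply make the per-pixel logic of steps (1)-(5) explicit.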
  • the high dynamic range image processing unit 31 may fuse the brightness-aligned first intermediate image and the second intermediate image to obtain the third high dynamic range image.
  • the high dynamic range image processing unit 31 first performs motion detection on the brightness-aligned first intermediate image to identify whether a motion blur area exists in it. If there is no motion blur area, the brightness-aligned first intermediate image and the second intermediate image are directly fused to obtain the third high dynamic range image. If there is a motion blur area, the motion blur area is eliminated: only all areas of the second intermediate image and the areas of the brightness-aligned first intermediate image outside the motion blur area are fused to obtain the third high dynamic range image.
  • the fusion of the two intermediate images at this time follows the following principles: (1) in the brightness-aligned first intermediate image, the pixel value of an image pixel in an overexposed area is directly replaced with the pixel value of the corresponding image pixel in the second intermediate image; (2) in the brightness-aligned first intermediate image, the pixel value of an image pixel in an underexposed area becomes the long-exposure pixel value divided by the long-short exposure ratio; (3) in the brightness-aligned first intermediate image, the pixel value of an image pixel in an area that is neither underexposed nor overexposed likewise becomes the long-exposure pixel value divided by the long-short exposure ratio.
  • if there is a motion blur area, the fusion of the two intermediate images must follow the above three principles and also principle (4): in the brightness-aligned first intermediate image, the pixel value of an image pixel in the motion blur area is directly replaced with the pixel value of the corresponding image pixel in the second intermediate image.
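A minimal sketch of fusion principles (1)-(3) follows, assuming the fused output is kept on the short-exposure scale (so principles (2) and (3) divide the long-exposure value VL by the long-short exposure ratio). The threshold parameter and function name are illustrative assumptions.

```python
import numpy as np

def fuse_hdr(vl, vs, v_over, ratio):
    """vl: brightness-aligned long-exposure (first) intermediate image;
    vs: short-exposure (second) intermediate image;
    v_over: overexposure threshold; ratio: long/short exposure ratio L/S."""
    vl = vl.astype(np.float64)
    out = vl / ratio          # principles (2) and (3): VL divided by L/S
    over = vl >= v_over       # principle (1): overexposed areas
    out[over] = vs[over]      # take the short-exposure values directly
    return out
```

Principle (4) would add one more masked replacement for the motion blur area, using a motion mask in place of the overexposure mask.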
  • because the long-exposure pixel value VL is captured with more light, the signal-to-noise ratio of the computed value VS' (VL divided by the long-short exposure ratio) will be greater than the signal-to-noise ratio of the directly captured short-exposure value VS.
  • the dynamic range of the obtained image can be increased, and the imaging effect of the image can be improved.
  • the high dynamic range image processing unit 31 may also use other methods to fuse the brightness-aligned first intermediate image and the second intermediate image to obtain the third high dynamic range image.
  • for example, the high dynamic range image processing unit 31 may also perform motion blur detection on both the brightness-aligned first intermediate image and the second intermediate image, eliminate any detected motion blur to obtain a motion-blur-free first intermediate image and a motion-blur-free second intermediate image, and then fuse these two intermediate images to obtain the third high dynamic range image; this is not limited here.
  • the high dynamic range image processing unit 31 transmits the third high dynamic range image to the brightness mapping unit 33.
  • the brightness mapping unit 33 subjects the third high dynamic range image to brightness mapping processing to obtain the first high dynamic range image.
  • the bit width of the data of each image pixel in the first high dynamic range image is smaller than the bit width of the data of each image pixel in the third high dynamic range image.
  • a third high dynamic range image with a bit width of 16 bits can be obtained.
  • the brightness mapping unit 33 may perform brightness mapping processing on the third high dynamic range image with a bit width of 16 bits to obtain the first high dynamic range image with a bit width of 10 bits.
  • the third high dynamic range image with a bit width of 16 bits may also be subjected to brightness mapping processing to obtain a first high dynamic range image with a bit width of 12 bits, which is not limited here. In this way, the brightness mapping process reduces the data volume of the high dynamic range image, thereby avoiding the problem that the image processor 40 cannot process a high dynamic range image with an excessive data volume, and helping to increase the speed at which the image processor 40 processes high dynamic range images.
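The brightness-mapping curve itself is not specified above; the sketch below uses a simple global gamma-style curve purely to illustrate reducing the bit width from 16 bits to 10 bits. The curve, exponent, and function name are assumptions for illustration only.

```python
import numpy as np

def map_brightness(img16, out_bits=10, in_bits=16):
    """Map a 16-bit high dynamic range image down to out_bits (default 10),
    compressing highlights with an assumed gamma-style curve."""
    x = img16.astype(np.float64) / ((1 << in_bits) - 1)  # normalize to [0, 1]
    y = np.power(x, 1 / 2.2)                             # compress highlights
    out_max = (1 << out_bits) - 1
    return np.clip(np.round(y * out_max), 0, out_max).astype(np.uint16)
```

Passing `out_bits=12` gives the 12-bit variant mentioned above; either way the output fits in the smaller bit width the image processor 40 expects.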
  • the high dynamic range image processing unit 31 can transmit the first high dynamic range image to the image processor 40 for subsequent processing such as black level correction, demosaicing, color conversion, lens shading correction, dead pixel compensation, and global tone mapping, to obtain the second high dynamic range image.
  • the multiple color image pixels in the first high dynamic range image are arranged in a Bayer array, and the pixel value of each image pixel contains information of only one color channel, while the pixel value of each image pixel in the second high dynamic range image contains information of every color channel.
  • the high dynamic range image processing module 30 further includes a statistical unit 35, which is used to process the first intermediate image and the second intermediate image to obtain statistical data.
  • the statistical unit 35 provides the statistical data to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • the image processor 40 can perform at least one of automatic exposure processing and automatic white balance processing according to the statistical data.
  • the image processor 40 performs automatic exposure processing based on statistical data; or, the image processor 40 performs automatic white balance processing based on statistical data; or, the image processor 40 performs automatic exposure processing and automatic white balance processing based on statistical data.
  • the image processor 40 can perform automatic exposure and automatic white balance processing according to the statistical data, which is beneficial to improve the quality of the image finally output by the image processor 40.
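The statistics themselves are not defined above; a common choice, sketched here purely as an assumption, is the mean luminance (for automatic exposure) and gray-world per-channel means (for automatic white balance gains).

```python
import numpy as np

def compute_statistics(img):
    """img: H x W x 3 RGB image. Returns (mean_luma, awb_gains): mean
    luminance feeds automatic exposure; gray-world channel means feed
    automatic white balance gains normalized to green (assumed scheme)."""
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel means R, G, B
    awb_gains = means[1] / means             # gains that equalize channels
    return float(means.mean()), awb_gains
```

The image processor 40 would then adjust exposure toward a target mean and multiply each channel by its gain, which is one standard way such statistical data is consumed.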
  • the high dynamic range image processing module 30 further includes a lens shading correction unit 37, which is used to correct the third high dynamic range image to obtain a high dynamic range corrected image. Specifically, after the high dynamic range image processing unit 31 fuses the first intermediate image and the second intermediate image into the third high dynamic range image, the lens shading correction unit 37 performs lens shading correction processing on the third high dynamic range image to obtain the high dynamic range corrected image. The specific process of the lens shading correction processing is shown in FIG. 17: the lens shading correction unit 37 divides the third high dynamic range image equally into sixteen grids, and each of the sixteen grids has a preset compensation coefficient.
  • the lens shading correction unit 37 performs shading correction on the image by bilinear interpolation, using the compensation coefficients of the grid in which each pixel lies and of its adjacent grids.
  • R2 is the pixel value in the dotted frame in the third high dynamic range image that has undergone lens shading correction processing
  • R1 is the corresponding pixel value in the dotted frame in the third high dynamic range image before lens shading correction processing.
  • R2 = R1 * k1
  • k1 is obtained by bilinear interpolation of the compensation coefficients 1.10, 1.04, 1.05, and 1.09 of the grids adjacent to the pixel R1.
  • the coordinates of a pixel are denoted (x, y), where x is counted rightward from the first pixel on the left, y is counted downward from the first pixel on the top, and both x and y are natural numbers, as indicated by the markings on the edges of the image.
  • assuming the coordinates of R1 are (3, 3), the corresponding coordinates of R1 in the grid compensation coefficient map are (0.75, 0.75).
  • f(x, y) represents the compensation value of the coordinate (x, y) in each grid compensation coefficient graph.
  • the compensation coefficient of each grid is set in advance before the lens shading correction unit 37 performs lens shading correction processing.
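The bilinear interpolation of grid compensation coefficients described above can be sketched as follows, using the four example coefficients 1.10, 1.04, 1.05, and 1.09 and the grid coordinates (0.75, 0.75) of R1. The function name and 2x2 coefficient layout are illustrative assumptions.

```python
import numpy as np

def shading_gain(coeffs, gx, gy):
    """coeffs[row, col]: preset compensation coefficients of the grids;
    (gx, gy): fractional grid coordinates of the pixel, e.g. (0.75, 0.75)
    for R1 above. Returns the bilinearly interpolated gain k, so that the
    corrected pixel value is R2 = R1 * k."""
    x0, y0 = int(np.floor(gx)), int(np.floor(gy))
    x1 = min(x0 + 1, coeffs.shape[1] - 1)
    y1 = min(y0 + 1, coeffs.shape[0] - 1)
    fx, fy = gx - x0, gy - y0
    top = coeffs[y0, x0] * (1 - fx) + coeffs[y0, x1] * fx  # interpolate in x
    bot = coeffs[y1, x0] * (1 - fx) + coeffs[y1, x1] * fx
    return top * (1 - fy) + bot * fy                       # interpolate in y
```

With the four example coefficients arranged as a 2x2 grid, this yields k1 = 1.07375 at (0.75, 0.75), and R2 = R1 * 1.07375.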
  • the lens shading correction unit 37 transmits the high dynamic range correction image to the statistics unit 35.
  • the statistical unit 35 is configured to process the high dynamic range correction image to obtain statistical data, and provide the statistical data to the image processor 40 for automatic exposure processing and/or automatic white balance processing, that is, the statistical data is provided to the image processor 40 To perform at least one of automatic exposure processing and automatic white balance processing.
  • in this way, the quality of the image obtained when the image processor 40 performs automatic exposure processing and/or automatic white balance processing based on the statistical data is higher. It should be noted that both the image fusion module 20 and the high dynamic range image processing module 30 are integrated in the image sensor 10.
  • in summary, the high dynamic range image processing system 100 shown in FIG. 16 first fuses the color original images and the panchromatic original images through the image fusion module 20 to obtain the first intermediate image and the second intermediate image, and then performs high dynamic range processing on the two intermediate images through the high dynamic range image processing module 30 to obtain the first high dynamic range image. Since the multiple color image pixels in the first high dynamic range image are arranged in a Bayer array, the first high dynamic range image can be directly processed by the image processor 40.
  • the image sensor 10 obtains the first color original image, the second color original image, the first full color original image, and the second full color original image
  • the four images are then transmitted to the high dynamic range image processing module 30.
  • the high dynamic range image processing module 30 fuses the first color original image and the second color original image into a first high dynamic color original image, and fuses the first panchromatic original image and the second panchromatic original image into a first high dynamic panchromatic original image.
  • the high dynamic range image processing module 30 transmits the first high dynamic full color original image and the first high dynamic color original image to the image fusion module 20 for fusion processing to finally obtain the first high dynamic range image.
  • the image sensor 10 obtains the first color original image, the second color original image, the first panchromatic original image, and the second panchromatic original image
  • the high dynamic range image processing unit 31 in the high dynamic range image processing module 30 fuses the first color original image and the second color original image into a second high dynamic color original image Image, fusing the first panchromatic original image and the second panchromatic original image into a second high dynamic panchromatic original image.
  • the specific fusion process is the same as the specific process of fusing the first intermediate image and the second intermediate image into the third high dynamic range image in the embodiment shown in FIG. 16, and will not be repeated here.
  • the brightness mapping unit 33 is configured to perform brightness mapping on the second high dynamic color original image to obtain a first high dynamic color original image with a smaller amount of data, and to perform brightness mapping on the second high dynamic panchromatic original image to obtain a first high dynamic panchromatic original image with a smaller amount of data. The specific process is the same as the process of brightness-mapping the third high dynamic range image into the first high dynamic range image in the embodiment shown in FIG. 16, and will not be repeated here.
  • the lens shading correction unit 37 is used to correct the second high dynamic color original image to obtain a high dynamic color corrected image, and correct the second high dynamic panchromatic original image to obtain a high dynamic full color corrected image.
  • the specific correction process is the same as the process of performing lens shading correction on the third high dynamic range image in the embodiment shown in FIG. 16 and FIG. 17, and will not be repeated here.
  • the statistical unit 35 is used to process the high dynamic color corrected image and the high dynamic panchromatic corrected image to obtain statistical data, and transmit the statistical data to the image processor 40, so that the image processor 40 can perform at least one of automatic exposure processing and automatic white balance processing based on the statistical data.
  • the statistical unit 35 can also directly process the first color original image, the second color original image, the first full-color original image, and the second full-color original image to obtain statistical data, and transmit the statistical data to the image processor 40 , So that the image processor 40 can perform at least one of automatic exposure and automatic white balance processing according to the statistical information.
  • after the high dynamic range image processing module 30 obtains the first high dynamic color original image and the first high dynamic panchromatic original image, the two images are transmitted to the image fusion module 20 for fusion processing to obtain the first high dynamic range image.
  • the specific process by which the image fusion module 20 fuses the first high dynamic color original image and the first high dynamic panchromatic original image into the first high dynamic range image is the same as the process of fusing the first color original image and the second panchromatic original image into the first intermediate image in the embodiment shown in FIG. 12, and will not be repeated here.
  • in summary, the high dynamic range image processing system 100 shown in FIG. 18 first fuses the color original images and the panchromatic original images through the high dynamic range image processing module 30 to obtain the first high dynamic color original image and the first high dynamic panchromatic original image, and then fuses these two images through the image fusion module 20 to obtain the first high dynamic range image. Since the multiple color image pixels in the first high dynamic range image are arranged in a Bayer array, the first high dynamic range image can be directly processed by the image processor 40.
  • all panchromatic photosensitive pixels W in the pixel array 11 are exposed at the third exposure time. The third exposure time may be greater than the second exposure time, so that all panchromatic photosensitive pixels W are exposed with the medium exposure time M; or the third exposure time may be equal to the first exposure time, so that all panchromatic photosensitive pixels W are exposed with the long exposure time L.
  • the third exposure time can also be equal to or less than the second exposure time, so that the panchromatic photosensitive pixels W are exposed with the short exposure time; this is not limited here.
  • in the following description, the case where the third exposure time is greater than the second exposure time, that is, all panchromatic photosensitive pixels W are exposed with the medium exposure time M, is taken as an example.
  • in each subunit, one single-color photosensitive pixel is exposed for the first exposure time (for example, the long exposure time L shown in FIG. 20), one single-color photosensitive pixel is exposed for the second exposure time (for example, the short exposure time S shown in FIG. 20), and the two panchromatic photosensitive pixels W are both exposed for the third exposure time (for example, the medium exposure time M shown in FIG. 20).
  • the exposure process of the pixel array 11 may be: (1) the photosensitive pixels 110 exposed at the first exposure time, the photosensitive pixels 110 exposed at the second exposure time, and the photosensitive pixels 110 exposed at the third exposure time are exposed sequentially (the order of the three is not limited), and their exposure periods do not overlap; (2) the photosensitive pixels 110 exposed at the first exposure time, the photosensitive pixels 110 exposed at the second exposure time, and the photosensitive pixels 110 exposed at the third exposure time are exposed sequentially (the order of the three is not limited), and their exposure periods partially overlap; (3) the exposure periods of all photosensitive pixels 110 exposed with shorter exposure times fall within the exposure period of the photosensitive pixels 110 exposed with the longest exposure time; for example, the exposure period of all single-color photosensitive pixels exposed at the second exposure time falls within the exposure period of all panchromatic photosensitive pixels W exposed at the third exposure time, and the exposure period of all panchromatic photosensitive pixels W exposed at the third exposure time falls within the exposure period of all single-color photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 may adopt exposure method (3), which can shorten the overall exposure time required by the pixel array 11 and is beneficial to increasing the frame rate of the image.
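The frame-rate benefit of exposure method (3) can be seen from the total capture time: with nested exposure windows the array is busy only for the longest exposure, while the sequential method (1) costs the sum of all three. The helper below is a simplified model (it ignores readout time) introduced purely for illustration.

```python
def total_capture_time(l, m, s, scheme):
    """Total exposure time of the pixel array for long L, medium M, short S.
    scheme 1: sequential, non-overlapping exposures -> L + M + S.
    scheme 3: shorter exposures nested inside the longest one -> max(L, M, S).
    (scheme 2, partial overlap, falls between the two extremes.)"""
    return l + m + s if scheme == 1 else max(l, m, s)
```

For example, with L = 30 ms, M = 10 ms, S = 5 ms, method (3) needs 30 ms per frame where method (1) needs 45 ms.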
  • after exposure, the image sensor 10 can output three original images: (1) the first color original image, composed of first color information generated by multiple single-color photosensitive pixels exposed with the long exposure time L (first exposure time); (2) the second color original image, composed of second color information generated by multiple single-color photosensitive pixels exposed with the short exposure time S (second exposure time); (3) the first panchromatic original image, composed of first panchromatic information generated by a plurality of panchromatic photosensitive pixels W exposed with the medium exposure time M (third exposure time).
  • the image sensor 10 first transmits the first color original image and the second color original image to the high dynamic range image processing module 30 for high dynamic range processing to obtain the first high dynamic color original image, and then transmits the first high dynamic color original image and the first panchromatic original image to the image fusion module 20 for fusion algorithm processing to obtain the first high dynamic range image.
  • specifically, the image sensor 10 transmits the first color original image, the second color original image, and the first panchromatic original image to the high dynamic range image processing module 30.
  • the high dynamic range image processing unit 31 in the high dynamic range image processing module 30 fuses the first color original image and the second color original image into a second high dynamic color original image. The specific fusion process is the same as the process of fusing the first intermediate image and the second intermediate image into the third high dynamic range image in the embodiment shown in FIG. 15, and will not be repeated here.
  • the brightness mapping unit 33 is configured to perform brightness mapping on the second high dynamic color original image to obtain the first high dynamic color original image with a small amount of data.
  • the specific process is the same as the specific process of mapping the brightness of the third high dynamic range image to the first high dynamic range image in the embodiment shown in FIG. 16, and will not be repeated here.
  • the lens shading correction unit 37 is used to correct the second high dynamic color original image to obtain a high dynamic color corrected image.
  • the specific correction process is the same as the process of performing lens shading correction on the third high dynamic range image in the embodiments shown in FIG. 16 and FIG. 17, and will not be repeated here.
  • the statistical unit 35 is used to process the high dynamic color corrected image to obtain statistical data and transmit the statistical data to the image processor 40, so that the image processor 40 can perform at least one of automatic exposure processing and automatic white balance processing based on the statistical data.
  • the statistical unit 35 can also directly process the first color original image, the second color original image, and the first full-color original image to obtain statistical data, and transmit the statistical data to the image processor 40 so that the image processor 40 can At least one of automatic exposure and automatic white balance processing is performed based on the statistical information.
  • the high dynamic range image processing module 30 transmits the first high dynamic color original image and the first panchromatic original image to the image fusion module 20 for fusion processing to obtain the first high dynamic range image.
  • the first panchromatic original image obtained by the image sensor 10 includes a plurality of panchromatic image pixels W and a plurality of empty image pixels N (NULL), where an empty image pixel is neither a panchromatic image pixel nor a color image pixel.
  • the position of the empty image pixel N in the first full-color original image can be regarded as there is no image pixel in that position, or the pixel value of the empty image pixel can be regarded as zero.
  • each subunit of the pixel array 11 includes two panchromatic image pixels W and two color image pixels (color image pixel A, color image pixel B, or color image pixel C).
  • the first panchromatic original image also has a subunit corresponding to each subunit in the pixel array 11. Each subunit of the first panchromatic original image includes two panchromatic image pixels W and two empty image pixels N, and the positions of the two empty image pixels N correspond to the positions of the two color image pixels in the corresponding subunit of the pixel array 11.
  • the image fusion module 20 may further process the first full-color original image to obtain a full-color intermediate image.
  • each sub-unit includes a plurality of empty image pixels N and a plurality of panchromatic image pixels.
  • some sub-units include two empty image pixels N and two panchromatic image pixels W.
  • the image fusion module 20 may use the pixel values of all panchromatic image pixels W in each subunit (which also contains the empty image pixels N) to form the pixel value of the panchromatic large pixel W of that subunit, thereby obtaining the panchromatic intermediate image.
  • the resolution of the panchromatic intermediate image at this time is the same as the resolution of the first high dynamic color original image, so as to facilitate the fusion of the panchromatic intermediate image and the first high dynamic color original image.
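Forming the panchromatic intermediate image can be sketched as 2x2 binning of each subunit. Summing the subunit's W values (with the empty pixels N contributing 0) is an assumption, since the text above only says the subunit's W pixel values are used as the large pixel; averaging would work equally well.

```python
import numpy as np

def panchromatic_intermediate(pan_raw):
    """pan_raw: H x W first panchromatic original image, with empty image
    pixels N stored as 0. Each 2x2 subunit is collapsed into one panchromatic
    large pixel W, halving the resolution in each direction."""
    h, w = pan_raw.shape
    blocks = pan_raw.reshape(h // 2, 2, w // 2, 2)  # split into 2x2 subunits
    return blocks.sum(axis=(1, 3))                  # combine W values per subunit
```

The result has the same resolution as the first high dynamic color original image, matching the condition stated above for the subsequent fusion.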
  • the specific fusion process of the panchromatic intermediate image and the first high dynamic color original image is the same as the process of fusing the first color original image and the second panchromatic original image into the first intermediate image in the embodiment shown in FIG. 12, and will not be repeated here.
  • in summary, the high dynamic range image processing system 100 shown in FIG. 18 first fuses the color original images through the high dynamic range image processing module 30 to obtain the first high dynamic color original image, and then fuses the first high dynamic color original image and the first panchromatic original image through the image fusion module 20 to obtain the first high dynamic range image. Since the multiple color image pixels in the first high dynamic range image are arranged in a Bayer array, the first high dynamic range image can be directly processed by the image processor 40.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 described in any one of the above embodiments.
  • the lens 300 and the high dynamic range image processing system 100 are combined with the housing 200.
  • the lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
  • the electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, a smart glasses, a smart helmet), a drone, a head-mounted display device, etc., which are not limited here.
  • the electronic device 1000 of the embodiment of the present application performs fusion algorithm processing and high dynamic range processing on the panchromatic original images and color original images output by the image sensor 10 through the image fusion module 20 and the high dynamic range image processing module 30, to obtain a first high dynamic range image whose image pixels are arranged in a Bayer array; the first high dynamic range image is then input to the image processor for subsequent processing, thereby solving the problem that the image processor 40 cannot directly process images whose image pixels are arranged in a non-Bayer array.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 includes an image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • High dynamic range image processing methods include:
  • the first high dynamic range image includes a plurality of color image pixels arranged in a Bayer array, and the first high dynamic range image is processed by the image processor to obtain the second high dynamic range image.
  • in some embodiments, part of the panchromatic photosensitive pixels in the same subunit are exposed at a fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed at the third exposure time, where the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time; the panchromatic photosensitive pixels exposed at the fourth exposure time generate second panchromatic information to obtain the second panchromatic original image. Performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, the first panchromatic original image, and the second panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second panchromatic original image into a first intermediate image, fusing the second color original image and the first panchromatic original image into a second intermediate image, and fusing the first intermediate image and the second intermediate image into the first high dynamic range image.
  • fusing the first intermediate image and the second intermediate image into the first high dynamic range image includes: fusing the first intermediate image and the second intermediate image into a third high dynamic range image; and performing brightness mapping on the third high dynamic range image to obtain the first high dynamic range image.
  • the high dynamic range image processing method further includes: fusing the first intermediate image and the second intermediate image into a third high dynamic range image; correcting the third high dynamic range image to obtain a high dynamic range corrected image; and processing the high dynamic range corrected image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
  • the high dynamic range image processing method further includes: processing the first intermediate image and the second intermediate image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
  • part of the panchromatic photosensitive pixels in the same subunit are exposed at the fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed at the third exposure time; the fourth exposure time is less than or equal to the first exposure time and greater than the third exposure time, and the second panchromatic information generated by the panchromatic photosensitive pixels exposed at the fourth exposure time is used to obtain a second panchromatic original image.
  • performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second color original image into a first high-dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a first high-dynamic panchromatic original image; and fusing the first high-dynamic color original image and the first high-dynamic panchromatic original image into the first high dynamic range image.
  • fusing the first color original image and the second color original image into a first high-dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a first high-dynamic panchromatic original image, includes: fusing the first color original image and the second color original image into a second high-dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a second high-dynamic panchromatic original image; and performing brightness mapping on the second high-dynamic color original image to obtain the first high-dynamic color original image, and performing brightness mapping on the second high-dynamic panchromatic original image to obtain the first high-dynamic panchromatic original image.
  • the high dynamic range image processing method includes: fusing the first color original image and the second color original image into a second high-dynamic color original image, and fusing the first panchromatic original image and the second panchromatic original image into a second high-dynamic panchromatic original image; correcting the second high-dynamic color original image and the second high-dynamic panchromatic original image to obtain a high-dynamic color correction image and a high-dynamic panchromatic correction image; and processing the high-dynamic color correction image and the high-dynamic panchromatic correction image to obtain statistical data, and the statistical data is provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • all panchromatic photosensitive pixels in the same subunit are exposed at the third exposure time; performing fusion algorithm processing and high dynamic range processing on the first color original image, the second color original image, and the first panchromatic original image to obtain the first high dynamic range image includes: fusing the first color original image and the second color original image into a first high-dynamic color original image; and fusing the first high-dynamic color original image and the first panchromatic original image into the first high dynamic range image.
  • fusing the first color original image and the second color original image into a first high-dynamic color original image includes: fusing the first color original image and the second color original image into a second high-dynamic color original image; and performing brightness mapping on the second high-dynamic color original image to obtain the first high-dynamic color original image.
  • the high dynamic range image processing method includes: fusing the first color original image and the second color original image into a second high-dynamic color original image; correcting the second high-dynamic color original image to obtain a high-dynamic color correction image; and processing the high-dynamic color correction image and the first panchromatic original image to obtain statistical data, which are provided to the image processor 40 for automatic exposure processing and/or automatic white balance processing.
  • the specific implementation process of the high dynamic range image processing method of any one of the foregoing embodiments is the same as the specific implementation process of the aforementioned high dynamic range image processing system 100 to obtain a high dynamic range image, and will not be further described here.
  • the present application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
  • when the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the foregoing embodiments.
  • the pixel array 11 is exposed, wherein, for a plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed with a first exposure time, at least one single-color photosensitive pixel is exposed with a second exposure time that is less than the first exposure time, and at least one panchromatic photosensitive pixel is exposed with a third exposure time that is less than the first exposure time; wherein the first color information generated by the single-color photosensitive pixels exposed at the first exposure time is used to obtain a first color original image, the second color information generated by the single-color photosensitive pixels exposed at the second exposure time is used to obtain a second color original image, and the panchromatic photosensitive pixels exposed at the third exposure time generate a first panchromatic original image; and
  • the first high dynamic range image includes a plurality of color image pixels arranged in a Bayer array, and the first high dynamic range image is processed by the image processor to obtain a second high dynamic range image.
  • when the computer program is executed by the processor 60, the processor 60 is caused to perform the following steps:
  • the first high dynamic color original image and the first panchromatic original image are fused into a first high dynamic range image.
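The exposure-fusion and brightness-mapping steps recited in the claims above can be sketched in a few lines. This is a minimal illustration, not the claimed implementation: the hard saturation threshold (`sat_frac`), the fixed exposure ratio, and the logarithmic tone curve are assumptions made for the example.

```python
import numpy as np

def fuse_exposures(long_img, short_img, ratio, sat_frac=0.9):
    """Fuse a long-exposure and a short-exposure frame into one
    high-dynamic-range frame: pixels that are (nearly) saturated in the
    long exposure are replaced by the short exposure scaled up by the
    exposure-time ratio; all other pixels keep the lower-noise
    long-exposure value."""
    long_f = np.asarray(long_img, dtype=np.float64)
    short_f = np.asarray(short_img, dtype=np.float64)
    saturated = long_f >= sat_frac * long_f.max()
    return np.where(saturated, short_f * ratio, long_f)

def brightness_map(hdr, bit_depth=10):
    """Compress the fused frame back to a fixed output bit depth with a
    simple global logarithmic tone curve."""
    mapped = np.log1p(hdr) / np.log1p(float(hdr.max()))   # scale to 0..1
    return np.round(mapped * (2 ** bit_depth - 1)).astype(np.uint16)

# First (long-exposure) and second (short-exposure) readouts of the same
# scene, with an exposure-time ratio of 4.
long_img = np.array([[100.0, 1000.0],
                     [500.0, 1000.0]])
short_img = np.array([[25.0, 400.0],
                      [125.0, 400.0]])
hdr = fuse_exposures(long_img, short_img, ratio=4.0)  # recovers values up to 1600
out = brightness_map(hdr)                             # 10-bit brightness-mapped result
```

The same `fuse_exposures` step stands in for both the color-image fusion and the panchromatic fusion described in the claims; a real pipeline would operate per color plane of the Bayer mosaic and use a smooth blending weight rather than a hard saturation mask.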

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

High dynamic range image processing system (100), high dynamic range image processing method, electronic device (1000), and computer-readable storage medium. The high dynamic range image processing system (100) comprises an image sensor (10), an image fusion module (20), and a high dynamic range image processing module (30). A pixel array (11) in the image sensor (10) is exposed. The image fusion module (20) and the high dynamic range image processing module (30) perform high dynamic range processing and fusion algorithm processing on a first color original image, a second color original image, and a panchromatic original image, so as to obtain a first high dynamic range image.
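The exposure scheme summarized in the abstract — single-color pixels read out at two exposure times and panchromatic pixels at a third — can be modeled with a small sketch. The 2×2 subunit layout, the exposure labels `T1`–`T4`, and the helper names are illustrative assumptions, not the array actually claimed.

```python
import numpy as np

# Illustrative 2x2 subunit: W = panchromatic pixel, C = single-color pixel.
SUBUNIT = np.array([["W", "C"],
                    ["C", "W"]])
# Per-pixel exposure labels: T1 long color, T2 short color, T3 short
# panchromatic, T4 an intermediate panchromatic exposure (T3 <= T4 <= T1).
EXPOSURES = np.array([["T3", "T1"],
                      ["T2", "T4"]])

def split_originals(raw):
    """Split one mosaicked sensor readout into the per-exposure 'original
    images'; pixels that do not belong to a given image are marked NaN."""
    h, w = raw.shape
    kinds = np.tile(SUBUNIT, (h // 2, w // 2))
    times = np.tile(EXPOSURES, (h // 2, w // 2))
    def select(kind, t):
        out = np.full(raw.shape, np.nan)
        mask = (kinds == kind) & (times == t)
        out[mask] = raw[mask]
        return out
    return {
        "first_color": select("C", "T1"),         # long-exposure color
        "second_color": select("C", "T2"),        # short-exposure color
        "first_panchromatic": select("W", "T3"),  # short-exposure panchromatic
        "second_panchromatic": select("W", "T4"),
    }

raw = np.arange(16, dtype=np.float64).reshape(4, 4)  # toy 4x4 readout
originals = split_originals(raw)
```

A demosaicing/fusion stage would then interpolate the NaN holes in each plane before the high-dynamic-range fusion step.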
PCT/CN2020/119959 2020-04-03 2020-10-09 High dynamic range image processing system and method, electronic device, and readable storage medium WO2021196553A1 (fr)
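Several claims also route statistical data to the image processor for automatic exposure and automatic white balance. Below is a minimal sketch of such a statistics pass, assuming gray-world white balance and Rec. 601 luma weights (neither choice is specified by the application):

```python
import numpy as np

def hdr_statistics(color_planes):
    """Reduce an HDR-corrected image to the statistics an ISP typically
    consumes: a mean-luminance figure for automatic exposure and
    per-channel gains for automatic white balance."""
    r, g, b = (np.asarray(p, dtype=np.float64) for p in color_planes)
    luma = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 luma weights
    means = {"r": r.mean(), "g": g.mean(), "b": b.mean()}
    # Gray-world assumption: gains normalize each channel mean to green.
    gains = {c: means["g"] / m for c, m in means.items()}
    return {"ae_mean_luma": luma.mean(), "awb_gains": gains}

# Toy uniform planes: a bluish cast that AWB gains should correct.
r = np.full((2, 2), 50.0)
g = np.full((2, 2), 100.0)
b = np.full((2, 2), 200.0)
stats = hdr_statistics((r, g, b))
```

In practice the statistics would be computed per tile of the corrected image rather than globally, but the reduction per tile is the same.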

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010259292.6A CN111479071B (zh) 2020-04-03 2020-04-03 High dynamic range image processing system and method, electronic device, and readable storage medium
CN202010259292.6 2020-04-03

Publications (1)

Publication Number Publication Date
WO2021196553A1 true WO2021196553A1 (fr) 2021-10-07

Family

ID=71749629

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/119959 WO2021196553A1 (fr) 2020-04-03 2020-10-09 High dynamic range image processing system and method, electronic device, and readable storage medium

Country Status (2)

Country Link
CN (1) CN111479071B (fr)
WO (1) WO2021196553A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111479071B (zh) * 2020-04-03 2021-05-07 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN111970459B (zh) * 2020-08-12 2022-02-18 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN111970461B (zh) * 2020-08-17 2022-03-22 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN111970460B (zh) * 2020-08-17 2022-05-20 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN116349239A (zh) * 2020-11-24 2023-06-27 Oppo广东移动通信有限公司 Color imaging system
KR20220084578A (ko) * 2020-12-14 2022-06-21 에스케이하이닉스 주식회사 Image sensing device
CN114697537B (zh) * 2020-12-31 2024-05-10 浙江清华柔性电子技术研究院 Image acquisition method, image sensor, and computer-readable storage medium
CN112887571B (zh) * 2021-01-27 2022-06-10 维沃移动通信有限公司 Image sensor, camera module, and electronic device
CN113676635B (zh) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 High dynamic range image generation method and apparatus, electronic device, and storage medium
CN115883974B (zh) * 2023-03-08 2023-05-30 淄博凝眸智能科技有限公司 HDR image generation method, system, and readable medium based on block-wise exposure

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578065A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 High dynamic range image generation method, photographing apparatus, and terminal
CN107635118A (zh) * 2014-09-19 2018-01-26 豪威科技股份有限公司 Color filter array, image sensor, and method for reducing spectral crosstalk
US20180309949A1 (en) * 2017-04-21 2018-10-25 Dartmouth College Quanta Image Sensor with Polarization-Sensitive Jots
CN110740272A (zh) * 2019-10-31 2020-01-31 Oppo广东移动通信有限公司 Image acquisition method, camera assembly, and mobile terminal
CN111479071A (zh) * 2020-04-03 2020-07-31 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7688368B2 (en) * 2006-01-27 2010-03-30 Eastman Kodak Company Image sensor with improved light sensitivity
US8045024B2 (en) * 2009-04-15 2011-10-25 Omnivision Technologies, Inc. Producing full-color image with reduced motion blur
US8237831B2 (en) * 2009-05-28 2012-08-07 Omnivision Technologies, Inc. Four-channel color filter array interpolation
US8203615B2 (en) * 2009-10-16 2012-06-19 Eastman Kodak Company Image deblurring using panchromatic pixels

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107635118A (zh) * 2014-09-19 2018-01-26 豪威科技股份有限公司 Color filter array, image sensor, and method for reducing spectral crosstalk
CN105578065A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 High dynamic range image generation method, photographing apparatus, and terminal
US20180309949A1 (en) * 2017-04-21 2018-10-25 Dartmouth College Quanta Image Sensor with Polarization-Sensitive Jots
CN110740272A (zh) * 2019-10-31 2020-01-31 Oppo广东移动通信有限公司 Image acquisition method, camera assembly, and mobile terminal
CN111479071A (zh) * 2020-04-03 2020-07-31 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium

Also Published As

Publication number Publication date
CN111479071B (zh) 2021-05-07
CN111479071A (zh) 2020-07-31

Similar Documents

Publication Publication Date Title
WO2021196553A1 (fr) High dynamic range image processing system and method, electronic device, and readable storage medium
WO2021179806A1 (fr) Image acquisition method, imaging apparatus, electronic device, and readable storage medium
WO2021196554A1 (fr) Image sensor, image processing system and method, electronic device, and storage medium
WO2021208593A1 (fr) High dynamic range image processing system and method, electronic device, and storage medium
WO2021212763A1 (fr) High dynamic range image processing system and method, electronic device, and readable storage medium
WO2022088311A1 (fr) Image processing method, camera assembly, and mobile terminal
WO2022007469A1 (fr) Image acquisition method, camera assembly, and mobile terminal
WO2021223364A1 (fr) High dynamic range image processing system and method, electronic device, and readable storage medium
WO2021179805A1 (fr) Image sensor, camera assembly, mobile terminal, and image acquisition method
WO2021103818A1 (fr) Image sensor, control method, camera assembly, and mobile terminal
WO2022036817A1 (fr) Image processing method, image processing system, electronic device, and readable storage medium
US11902674B2 (en) Image acquisition method, camera assembly, and mobile terminal
WO2021159944A1 (fr) Image sensor, camera assembly, and mobile terminal
CN112738493B (zh) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN111970459B (zh) High dynamic range image processing system and method, electronic device, and readable storage medium
CN112822475B (zh) Image processing method, image processing apparatus, terminal, and readable storage medium
WO2022088310A1 (fr) Image processing method, camera assembly, and mobile terminal
CN111031297B (zh) Image sensor, control method, camera assembly, and mobile terminal
WO2021046691A1 (fr) Image collection method, camera assembly, and mobile terminal
WO2022141743A1 (fr) Image processing method, image processing system, electronic device, and readable storage medium
US20220279108A1 (en) Image sensor and mobile terminal
CN112235485B (zh) Image sensor, image processing method, imaging device, terminal, and readable storage medium
CN112738494B (zh) Image processing method, image processing system, terminal device, and readable storage medium
CN114424517B (zh) Image sensor, control method, camera assembly, and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20929197

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20929197

Country of ref document: EP

Kind code of ref document: A1