WO2021223364A1 - High dynamic range image processing system and method, electronic device, and readable storage medium - Google Patents


Info

Publication number
WO2021223364A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
image
dynamic range
high dynamic
image data
Prior art date
Application number
PCT/CN2020/119957
Other languages
English (en)
Chinese (zh)
Inventor
杨鑫 (Yang Xin)
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Publication of WO2021223364A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/84: Camera processing pipelines; Components thereof for processing colour signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265: Mixing

Definitions

  • This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and a non-volatile computer-readable storage medium.
  • a camera may be provided in an electronic device such as a mobile phone to realize a photographing function.
  • An image sensor for receiving light can be set in the camera.
  • the image sensor may be provided with a filter array.
  • the embodiments of the present application provide a high dynamic range image processing system, a high dynamic range image processing method, electronic equipment, and a non-volatile computer-readable storage medium.
  • the embodiment of the present application provides a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, an image fusion module and a high dynamic range image processing module.
  • the image sensor includes a pixel array, the pixel array includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels, the pixel array includes a smallest repeating unit, each smallest repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array is exposed at a first exposure time to obtain a first original image;
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed at the first exposure time;
  • the pixel array is exposed at a second exposure time to obtain a second original image;
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the full-color photosensitive pixels exposed at the second exposure time, the first exposure time being unequal to the second exposure time.
  • the image fusion module is used for fusing the first color original image data and the first full-color original image data into a first color intermediate image containing only first color intermediate image data, and for fusing the second color original image data and the second full-color original image data into a second color intermediate image containing only second color intermediate image data.
  • Both the first color intermediate image and the second color intermediate image include a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
  • the high dynamic range image processing module is configured to perform high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain a first color high dynamic range image.
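The application does not disclose the fusion algorithm itself. As a loose illustration of the fusion module's role, the following Python/NumPy sketch (a hypothetical stand-in, not the claimed method) uses the panchromatic (W) samples as a luminance guide to rescale the co-located color samples into a single color intermediate image:

```python
import numpy as np

def fuse_to_intermediate(color, panchromatic):
    """Toy stand-in for the image fusion module: use the panchromatic (W)
    samples as a luminance guide to rescale the co-located color samples,
    yielding a single color intermediate image. The actual fusion
    algorithm is not specified in this application."""
    gain = panchromatic / (panchromatic.mean() + 1e-6)  # relative W luminance
    return color * gain

# 4x4 toy data: uniform color and W planes from the same exposure
color = np.full((4, 4), 100.0)
w = np.full((4, 4), 200.0)
fused = fuse_to_intermediate(color, w)  # uniform scene -> values unchanged
```

In the claimed system the resulting intermediate image additionally has its color image pixels laid out in a Bayer array, so a standard image processor can consume it.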
  • the embodiments of the present application provide a high dynamic range image processing method.
  • the high dynamic range image processing method is used in a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, the image sensor includes a pixel array, the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels, and the color photosensitive pixels have a narrower spectral response than the full-color photosensitive pixels;
  • the pixel array includes a minimum repeating unit, each of the minimum repeating units includes a plurality of sub-units, and each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels;
  • the high dynamic range image processing method includes: exposing the pixel array, wherein the pixel array is exposed at a first exposure time to obtain a first original image including the first color original image data and the first full-color original image data, and is exposed at a second exposure time to obtain a second original image including the second color original image data and the second full-color original image data, the first exposure time being unequal to the second exposure time;
  • fusing the first color original image data and the first full-color original image data into a first color intermediate image, and fusing the second color original image data and the second full-color original image data into a second color intermediate image, where both the first color intermediate image and the second color intermediate image include a plurality of color image pixels arranged in a Bayer array; and performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain a first color high dynamic range image.
  • the embodiment of the present application provides an electronic device.
  • the electronic device includes a lens, a housing, and the above-mentioned high dynamic range image processing system.
  • the lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
  • the embodiments of the present application provide a non-volatile computer-readable storage medium containing a computer program; when the computer program is executed by a processor, the processor is caused to execute the above-mentioned high dynamic range image processing method.
  • FIG. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a pixel array according to an embodiment of the present application.
  • FIG. 3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the arrangement of the smallest repeating unit in a pixel array according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the arrangement of the smallest repeating unit in another pixel array according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an original image output by an image sensor according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an image sensor output mode according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of another image sensor output mode according to an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a color intermediate image according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of yet another color intermediate image according to an embodiment of the present application.
  • FIG. 16 is a schematic diagram of another high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 17 is a schematic diagram of black level correction according to an embodiment of the present application.
  • FIG. 18 is a schematic diagram of lens shading correction according to an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a dead pixel compensation process according to an embodiment of the present application.
  • FIG. 20 is a schematic diagram of a brightness alignment process according to an embodiment of the present application.
  • FIGS. 21 to 24 are schematic diagrams of demosaicing according to an embodiment of the present application.
  • FIG. 25 is a schematic diagram of the mapping relationship between Vout and Vin in the tone mapping process of the embodiment of the present application.
  • FIG. 26 is a schematic diagram of an original image output by another image sensor according to an embodiment of the present application.
  • FIG. 27 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 28 is a schematic flowchart of a method for acquiring a high dynamic range image according to an embodiment of the present application.
  • FIG. 29 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor according to an embodiment of the present application.
  • the high dynamic range image processing system 100 includes an image sensor 10, an image fusion module 20 and a high dynamic range image processing module 30.
  • the image sensor 10 includes a pixel array 11, and the pixel array 11 includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels.
  • the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, each minimum repeating unit includes a plurality of sub-units, and each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array 11 is exposed at a first exposure time to obtain a first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and full-color photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed at a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by single-color photosensitive pixels exposed at the second exposure time and full-color photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image fusion module 20 is used for fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only the first color intermediate image data, and for fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only the second color intermediate image data.
  • Both the first color intermediate image and the second color intermediate image include a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image.
  • the pixel array 11 is exposed at a third exposure time to obtain a third original image;
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time; the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the image fusion module 20 is also used for fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data; the third color intermediate image includes a plurality of color image pixels arranged in a Bayer array.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image.
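The high dynamic range processing step is named but its merge rule is not specified. A common approach, sketched here purely as an assumption, is to brightness-align each frame by its exposure ratio and then average only the unsaturated samples per pixel:

```python
import numpy as np

def hdr_fuse(images, exposure_times, sat=250.0):
    """Illustrative exposure merge: scale every frame to the longest
    exposure (brightness alignment), then average only well-exposed
    (unsaturated) samples per pixel. Not the application's algorithm."""
    t_ref = max(exposure_times)
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        mask = img < sat                      # drop clipped samples
        acc += np.where(mask, img * (t_ref / t), 0.0)
        weight += mask
    return acc / np.maximum(weight, 1)

long_e = np.array([[255.0, 80.0]])   # first pixel clipped in the long frame
short_e = np.array([[60.0, 20.0]])   # same scene at 1/4 the exposure time
hdr = hdr_fuse([long_e, short_e], [4.0, 1.0])  # -> [[240., 80.]]
```

The clipped pixel is recovered from the short exposure alone, which is the point of combining frames taken at unequal exposure times.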
  • each color original image data is generated by a single single color photosensitive pixel
  • each panchromatic original image data is generated by a single panchromatic photosensitive pixel
  • the image sensor 10 outputs multiple original image data, and the output mode includes alternately outputting one color original image data and one full-color original image data.
  • alternatively, each color original image data is jointly generated by multiple single-color photosensitive pixels in the same subunit, and each panchromatic original image data is jointly generated by multiple panchromatic photosensitive pixels in the same subunit; in this case, the output mode of the image sensor 10 for outputting the multiple original image data includes alternately outputting a plurality of color original image data and a plurality of full-color original image data.
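The two readout modes described above can be modeled abstractly. In this hypothetical sketch, `group=1` corresponds to alternating one color datum with one panchromatic datum, while a larger `group` corresponds to alternating blocks of per-subunit data:

```python
def alternate_output(color_data, panchromatic_data, group=1):
    """Interleave color and panchromatic original image data in blocks
    of `group` values, mimicking the alternating output modes described
    for the image sensor. Purely a toy readout model."""
    out = []
    for i in range(0, len(color_data), group):
        out.extend(color_data[i:i + group])
        out.extend(panchromatic_data[i:i + group])
    return out

# single-datum alternation (per-pixel data)
seq = alternate_output(['C0', 'C1'], ['W0', 'W1'])        # C0 W0 C1 W1
# block alternation (per-subunit data)
seq2 = alternate_output(['C0', 'C1', 'C2', 'C3'],
                        ['W0', 'W1', 'W2', 'W3'], group=2)
```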
  • the high dynamic range image processing system 100 further includes an image processor 40.
  • the image processor 40 includes an image preprocessing module 41, which is configured to perform image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image, and to perform image preprocessing on the second color intermediate image to obtain a preprocessed second color intermediate image.
  • the high dynamic range image processing module 30 is configured to perform high dynamic range processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the first color high dynamic range image.
  • the high dynamic range image processing system 100 further includes an image processor 40.
  • the image processor 40 includes an image preprocessing module 41, which is configured to perform image preprocessing on the first color intermediate image to obtain a preprocessed first color intermediate image, on the second color intermediate image to obtain a preprocessed second color intermediate image, and on the third color intermediate image to obtain a preprocessed third color intermediate image.
  • the high dynamic range image processing module 30 is used to perform high dynamic range processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the first color high dynamic range image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
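As a concrete reading of two of these steps, the sketch below applies black level subtraction followed by an optional lens-shading gain map; the black level of 64 and the gain values are illustrative assumptions, not figures from the application:

```python
import numpy as np

def preprocess(raw, black_level=64.0, shading_gain=None):
    """Black level correction: subtract the sensor's dark offset and clip
    at zero. Lens shading correction: multiply by a per-pixel gain map
    (typically larger toward the image corners). Parameter values are
    illustrative only."""
    img = np.clip(raw.astype(np.float64) - black_level, 0.0, None)
    if shading_gain is not None:
        img = img * shading_gain
    return img

corrected = preprocess(np.array([[64.0, 100.0]]))   # -> [[0., 36.]]
shaded = preprocess(np.array([[164.0]]), shading_gain=np.array([[2.0]]))
```

Dead pixel compensation, the third listed step, would typically replace a defective sample with a neighborhood median and is omitted here for brevity.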
  • the high dynamic range image processing system 100 further includes an image processor 40, and the image processor 40 further includes an image post-processing module 42, which is used to perform image post-processing on the first color high dynamic range image to obtain a second color high dynamic range image.
  • the image post-processing includes at least one of demosaicing, color correction, global tone mapping, and color conversion.
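Of the post-processing steps listed, global tone mapping (the Vout/Vin relationship of FIG. 25) is the easiest to sketch. The curve below is a common Reinhard-style choice assumed for illustration; the application shows a mapping curve but does not commit to this formula:

```python
import numpy as np

def global_tone_map(vin):
    """Global tone mapping: compress a high dynamic range input Vin into
    Vout in [0, 1) with one monotonic curve applied to every pixel.
    Reinhard-style Vout = Vin / (1 + Vin), used here as an assumption."""
    return vin / (1.0 + vin)

vout = global_tone_map(np.array([0.0, 1.0, 3.0]))  # -> [0., 0.5, 0.75]
```

Because the curve is applied identically to every pixel, it preserves the global ordering of brightness while compressing highlights, which is what distinguishes global from local tone mapping.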
  • the high dynamic range image processing system 100 further includes a storage module 50.
  • the storage module 50 is used to store the image processed by the image preprocessing module 41 and transmit the preprocessed image to the high dynamic range image processing module 30, which performs high dynamic range processing to obtain the first color high dynamic range image.
  • the image fusion module 20 is integrated in the image sensor 10.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • High dynamic range image processing methods include:
  • exposing the pixel array 11, where the pixel array 11 is exposed at a first exposure time to obtain a first original image, and the first original image includes the first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and the first full-color original image data generated by the full-color photosensitive pixels exposed at the first exposure time; the pixel array is exposed at a second exposure time to obtain a second original image, and the second original image includes the second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and the second panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time; the first exposure time is not equal to the second exposure time;
  • fusing the first color original image data and the first panchromatic original image data into a first color intermediate image containing only the first color intermediate image data, and fusing the second color original image data and the second panchromatic original image data into a second color intermediate image containing only the second color intermediate image data;
  • both the first color intermediate image and the second color intermediate image include a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array; and
  • performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain a first color high dynamic range image.
  • the pixel array is exposed at a third exposure time to obtain a third original image
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time; the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the high dynamic range image processing method further includes: fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data; the third color intermediate image includes a plurality of color image pixels arranged in a Bayer array.
  • the step of performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image includes: performing high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image.
  • each color original image data is generated by a single single-color photosensitive pixel
  • each panchromatic original image data is generated by a single panchromatic photosensitive pixel.
  • the output mode of the image sensor 10 for outputting a plurality of original image data includes alternately outputting one color original image data and one full-color original image data.
  • each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit
  • each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit.
  • the output manner of the image sensor 10 outputting a plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of panchromatic original image data.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain The preprocessed second color intermediate image.
  • the step of performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image includes: preprocessing the first color intermediate image and the preprocessed second color intermediate image Perform high dynamic range processing to obtain the first color high dynamic range image.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain Preprocessed second color intermediate image; image preprocessing is performed on the third color intermediate image to obtain the preprocessed third color intermediate image.
  • the step of performing high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image includes: performing high dynamic range processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the first color high dynamic range image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
  • the high dynamic range image processing method further includes: performing image post-processing on the first color high dynamic range image to obtain a second color high dynamic range image.
  • the image post-processing includes at least one of demosaicing, color correction, global tone mapping, and color conversion.
  • the high dynamic range image processing system includes a storage module.
  • the high dynamic range image processing method also includes: storing the preprocessed image in the storage module; and obtaining the preprocessed image from the storage module and performing high dynamic range processing on the preprocessed image to obtain the first color high dynamic range image.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program; when the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method of any one of the foregoing embodiments.
  • the high dynamic range image processing system 100 of the embodiment of the present application performs fusion algorithm processing on the multi-frame original image output from the image sensor through the image fusion module 20 in advance to obtain a multi-frame color intermediate image with image pixels arranged in a Bayer array.
  • the multi-frame color intermediate images can then be processed by the image processor 40, which solves the problem that the image processor 40 cannot directly process images whose pixels are arranged in a non-Bayer array.
  • FIG. 2 is a schematic diagram of the image sensor 10 in the embodiment of the present application.
  • the image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14 and a horizontal driving unit 15.
  • the image sensor 10 may adopt a complementary metal oxide semiconductor (CMOS, Complementary Metal Oxide Semiconductor) photosensitive element or a charge-coupled device (CCD, Charge-coupled Device) photosensitive element.
  • the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (ie, arranged in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4) .
  • Each photosensitive pixel 110 converts light into electric charge according to the intensity of light incident thereon.
  • the vertical driving unit 12 includes a shift register and an address decoder.
  • the vertical drive unit 12 includes readout scanning and reset scanning functions.
  • the readout scan refers to sequentially scanning the unit photosensitive pixels 110 line by line, and reading signals from these unit photosensitive pixels 110 line by line.
  • the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14.
  • the reset scan is used to reset the charge, and the photocharge of the photoelectric conversion element is discarded, so that the accumulation of new photocharge can be started.
  • the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing.
  • the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated.
  • the signals of the photosensitive pixels 110 in a row are obtained.
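The arithmetic of correlated double sampling is simply the level difference described above; a one-line model:

```python
def correlated_double_sample(reset_level, signal_level):
    """CDS: the column processing unit reads the reset level and the
    signal level of a pixel and outputs their difference, which cancels
    the pixel's reset (fixed-pattern) offset."""
    return signal_level - reset_level

pixel_value = correlated_double_sample(12, 140)  # -> 128
```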
  • the column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
  • the horizontal driving unit 15 includes a shift register and an address decoder.
  • the horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14, and is sequentially output.
  • the control unit 13 configures timing signals according to the operation mode, and uses various timing signals to control the vertical driving unit 12, the column processing unit 14, and the horizontal driving unit 15 to work together.
  • FIG. 3 is a schematic diagram of a photosensitive pixel 110 in an embodiment of the present application.
  • the photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a micro lens 113. Along the light-receiving direction of the photosensitive pixel 110, the microlens 113, the filter 112, and the pixel circuit 111 are arranged in sequence.
  • the microlens 113 is used for condensing light
  • the filter 112 is used for passing light of a certain waveband and filtering out the light of other wavebands.
  • the pixel circuit 111 is used to convert the received light into electrical signals, and provide the generated electrical signals to the column processing unit 14 shown in FIG. 2.
  • FIG. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 in an embodiment of the present application.
  • the pixel circuit 111 in FIG. 4 can be applied to each photosensitive pixel 110 (shown in FIG. 3) in the pixel array 11 shown in FIG. 2.
  • the working principle of the pixel circuit 111 will be described below with reference to FIGS. 2 to 4.
  • the pixel circuit 111 includes a photoelectric conversion element 1111 (for example, a photodiode), an exposure control circuit (for example, a transfer transistor 1112), a reset circuit (for example, a reset transistor 1113), an amplification circuit (for example, an amplification transistor 1114), and a selection circuit (for example, a selection transistor 1115).
  • the transfer transistor 1112, the reset transistor 1113, the amplifying transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.
  • the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected to the ground, for example.
  • the photodiode converts the received light into electric charge.
  • the cathode of the photodiode is connected to the floating diffusion unit FD via an exposure control circuit (for example, a transfer transistor 1112).
  • the floating diffusion unit FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
  • the exposure control circuit is a transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112.
  • when a pulse of an active level (for example, a VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on.
  • the transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.
  • the drain of the reset transistor 1113 is connected to the pixel power supply VPIX.
  • the source of the reset transistor 1113 is connected to the floating diffusion unit FD.
  • when a pulse of an effective reset level is transmitted to the gate of the reset transistor 1113 via the reset line, the reset transistor 1113 is turned on.
  • the reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
  • the gate of the amplifying transistor 1114 is connected to the floating diffusion unit FD.
  • the drain of the amplifying transistor 1114 is connected to the pixel power supply VPIX.
  • after the floating diffusion unit FD is reset by the reset transistor 1113, the amplifying transistor 1114 outputs the reset level through the output terminal OUT via the selection transistor 1115; after the charge of the photodiode is transferred by the transfer transistor 1112, the amplifying transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
  • the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114.
  • the source of the selection transistor 1115 is connected to the column processing unit 14 in FIG. 2 through the output terminal OUT.
  • when a pulse of an active level is transmitted to the gate of the selection transistor 1115 through the selection line, the selection transistor 1115 is turned on.
  • the signal output by the amplifying transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.
  • the pixel structure of the pixel circuit 111 in the embodiment of the present application is not limited to the structure shown in FIG. 4.
  • the pixel circuit 111 may also have a three-transistor pixel structure, in which the functions of the amplifying transistor 1114 and the selecting transistor 1115 are performed by one transistor.
  • the exposure control circuit is not limited to the way of a single transfer transistor 1112, and other electronic devices or structures with the function of controlling the conduction of the control terminal can be used as the exposure control circuit in the embodiment of the present application.
  • the implementation of the transfer transistor 1112 is simple, low in cost, and easy to control.
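The read-out sequence described above (reset FD, read the reset level, transfer the photodiode charge, read the signal level) can be sketched as a small simulation. Everything here, including the class name, the linear charge model, and the numeric values, is an illustrative assumption rather than the patent's actual circuit behavior:

```python
# Hypothetical sketch of the 4T pixel read-out sequence:
# reset -> reset-level read -> charge transfer -> signal-level read.

class FourTransistorPixel:
    def __init__(self):
        self.photodiode_charge = 0.0  # charge accumulated by the photodiode
        self.fd_level = 0.0           # level on the floating diffusion unit FD
        self.vpix = 1.0               # pixel power supply VPIX (normalized)

    def expose(self, light, time):
        # the photodiode converts received light into charge during exposure
        self.photodiode_charge += light * time

    def reset(self):
        # reset transistor 1113 resets FD to the pixel power supply VPIX
        self.fd_level = self.vpix

    def read_reset_level(self):
        # amplifying transistor 1114 outputs the reset level via selection transistor 1115
        return self.fd_level

    def transfer(self):
        # transfer transistor 1112 moves the photodiode charge onto FD,
        # lowering the FD level proportionally (illustrative linear model)
        self.fd_level -= self.photodiode_charge
        self.photodiode_charge = 0.0

    def read_signal_level(self):
        return self.fd_level

# correlated double sampling: collected charge = reset level - signal level
pix = FourTransistorPixel()
pix.expose(light=0.5, time=0.6)
pix.reset()
r = pix.read_reset_level()
pix.transfer()
s = pix.read_signal_level()
print(round(r - s, 6))  # → 0.3, the charge actually collected
```

Subtracting the signal level from the reset level is what makes the two-step read-out useful: offsets common to both reads cancel.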
  • FIGS. 5 to 10 are schematic diagrams of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the pixel array 11 (shown in FIG. 2) according to some embodiments of the present application.
  • the photosensitive pixels 110 include two types, one is a full-color photosensitive pixel W, and the other is a color photosensitive pixel.
  • FIGS. 5 to 10 only show the arrangement of a plurality of photosensitive pixels 110 in a minimum repeating unit. The smallest repeating unit shown in FIGS. 5 to 10 is copied multiple times in rows and columns to form the pixel array 11. Each minimum repeating unit is composed of multiple full-color photosensitive pixels W and multiple color photosensitive pixels. Each minimum repeating unit includes multiple subunits.
  • Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels W.
  • the full-color photosensitive pixel W and the color photosensitive pixel in each sub-unit are alternately arranged.
  • multiple photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category; or, multiple photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • FIG. 5 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit of an embodiment of the application.
  • the smallest repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns
  • the subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner in FIG. 5), and two second type subunits UB are arranged in the second diagonal direction D2 (for example, the direction connecting the upper right corner and the lower left corner in FIG. 5).
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • the first diagonal direction D1 may also be the direction connecting the upper right corner and the lower left corner
  • the second diagonal direction D2 may also be the direction connecting the upper left corner and the lower right corner
  • the "direction" here is not a single pointing direction, but can be understood as the concept of a "straight line" indicating the arrangement; the straight line can point both ways at its two ends.
  • the explanation of the first diagonal direction D1 and the second diagonal direction D2 in FIGS. 6 to 10 is the same as here.
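The tiling of a minimal repeating unit into the pixel array 11, as described above, can be sketched as follows. The concrete 4x4 W/A/B/C matrix used here is an assumption consistent with the description of FIG. 5 (subunit UA in the upper left, subunit UC in the lower right, two subunits UB on the other diagonal, W and color pixels alternating), not a verbatim copy of the figure:

```python
# Illustrative 4x4 minimal repeating unit tiled into a larger pixel array.
# The exact alternation within each 2x2 subunit is an assumption.

MIN_UNIT = [
    ["W", "A", "W", "B"],  # subunit UA (left 2x2), subunit UB (right 2x2)
    ["A", "W", "B", "W"],
    ["W", "B", "W", "C"],  # subunit UB (left 2x2), subunit UC (right 2x2)
    ["B", "W", "C", "W"],
]

def build_pixel_array(rows, cols):
    """Replicate the 4x4 minimal repeating unit in rows and columns."""
    return [[MIN_UNIT[r % 4][c % 4] for c in range(cols)] for r in range(rows)]

array = build_pixel_array(8, 8)

# every 2x2 subunit contains exactly two full-color pixels W,
# matching "full-color and color photosensitive pixels alternately arranged"
for r in range(0, 8, 2):
    for c in range(0, 8, 2):
        sub = [array[r][c], array[r][c + 1], array[r + 1][c], array[r + 1][c + 1]]
        assert sub.count("W") == 2

print(array[0])  # first row of the tiled array
```

The same tiling function works for the 6x6 and 8x8 minimal repeating units of FIGS. 6 and 7 by swapping in a different `MIN_UNIT` matrix.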
  • FIG. 6 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in a minimum repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 36 photosensitive pixels 110 in 6 rows and 6 columns, and the sub-units are 9 photosensitive pixels 110 in 3 rows and 3 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type of subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in a first diagonal direction D1
  • two second type subunits UB are arranged in a second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 7 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the minimum repeating unit is 64 photosensitive pixels 110 in 8 rows and 8 columns
  • the subunit is 16 photosensitive pixels 110 in 4 rows and 4 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 8 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and the subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 8 is roughly the same as the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 5. The differences are: (1) the alternating order of full-color photosensitive pixels W and single-color photosensitive pixels in the second type subunit UB in the lower left corner of FIG. 8 is inconsistent with the alternating order of full-color photosensitive pixels W and single-color photosensitive pixels in the second type subunit UB in the lower left corner of FIG. 5; and (2) the alternating order of full-color photosensitive pixels W and single-color photosensitive pixels in the third type subunit UC in the lower right corner of FIG. 8 is inconsistent with the alternating order in the third type subunit UC in the lower right corner of FIG. 5.
  • specifically, in the second type subunit UB in the lower left corner of FIG. 5, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, second color photosensitive pixel B), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (that is, second color photosensitive pixel B), full-color photosensitive pixel W; whereas in the second type subunit UB in the lower left corner of FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (that is, second color photosensitive pixel B), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, second color photosensitive pixel B).
  • similarly, in the third type subunit UC in the lower right corner of FIG. 5, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, third color photosensitive pixel C), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (that is, third color photosensitive pixel C), full-color photosensitive pixel W; whereas in the third type subunit UC in the lower right corner of FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (that is, third color photosensitive pixel C), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, third color photosensitive pixel C).
  • in addition, in FIG. 8, the alternating order of photosensitive pixels 110 in different subunits of the same minimum repeating unit is not entirely consistent. For example, in the first type subunit UA shown in FIG. 8, the photosensitive pixels 110 in the first row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, first color photosensitive pixel A), and the photosensitive pixels 110 in the second row alternate as single-color photosensitive pixel (that is, first color photosensitive pixel A), full-color photosensitive pixel W; whereas in the third type subunit UC shown in FIG. 8, the photosensitive pixels 110 in the first row alternate as single-color photosensitive pixel (that is, third color photosensitive pixel C), full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row alternate as full-color photosensitive pixel W, single-color photosensitive pixel (that is, third color photosensitive pixel C).
  • that is to say, in the same minimum repeating unit, the alternating order of full-color photosensitive pixels W and color photosensitive pixels in different subunits may be consistent (as shown in FIG. 5) or inconsistent (as shown in FIG. 8).
  • FIG. 9 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and the subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first color photosensitive pixels A; (3) all second color photosensitive pixels B; or (4) all third color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 10 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and the subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • a plurality of photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first color photosensitive pixels A; (3) all second color photosensitive pixels B; or (4) all third color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • in the same minimum repeating unit, multiple photosensitive pixels 110 in the same row in some subunits may be photosensitive pixels 110 of the same category, while multiple photosensitive pixels 110 in the same column in the remaining subunits are photosensitive pixels 110 of the same category.
  • the first color photosensitive pixel A may be a red photosensitive pixel R, the second color photosensitive pixel B may be a green photosensitive pixel G, and the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a red photosensitive pixel R, the second color photosensitive pixel B may be a yellow photosensitive pixel Y, and the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a magenta photosensitive pixel M, the second color photosensitive pixel B may be a cyan photosensitive pixel Cy, and the third color photosensitive pixel C may be a yellow photosensitive pixel Y.
  • the response band of the full-color photosensitive pixel W may be the visible light band (for example, 400 nm-760 nm).
  • the full-color photosensitive pixel W is provided with an infrared filter to filter out infrared light.
  • the response band of the full-color photosensitive pixel W may also be the visible light band and the near-infrared band (for example, 400 nm-1000 nm), matching the response band of the photoelectric conversion element 1111 (shown in FIG. 4) in the image sensor 10 (shown in FIG. 1).
  • the full-color photosensitive pixel W may not be provided with a filter or a filter that can pass light of all wavelength bands.
  • the response band of the full-color photosensitive pixel W is determined by the response band of the photoelectric conversion element 1111; that is, the two match.
  • the embodiments of the present application include, but are not limited to, the above-mentioned waveband range.
  • the following embodiments all describe the first single-color photosensitive pixel A as the red photosensitive pixel R, the second single-color photosensitive pixel B as the green photosensitive pixel G, and the third single-color photosensitive pixel as the blue photosensitive pixel Bu.
  • the control unit 13 controls the exposure of the pixel array 11.
  • the pixel array 11 is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels W exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels W exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the pixel array 11 performs two exposures. For example, as shown in FIG. 11, in the first exposure, the pixel array 11 is exposed for a first exposure time L (for example, a long exposure time) to obtain a first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time L and first full-color original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time L.
  • the pixel array 11 is exposed for a second exposure time S (for example, a short exposure time) to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time S and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time S. It should be noted that the pixel array 11 may also perform a short exposure first, and then perform a long exposure, which is not limited here.
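The two-exposure scheme above, in which the same pixel array is exposed once with a long exposure time L and once with a short exposure time S to produce the first and second original images, can be sketched with a toy linear sensor model. The saturation level, scene values, and exposure times are hypothetical:

```python
# Minimal sketch of the long/short two-exposure scheme.
# The sensor model (linear response up to a full-well limit) is an assumption.

def expose(scene, exposure_time, full_well=255):
    """Simulate one exposure: response is linear until the pixel saturates."""
    return [[min(full_well, px * exposure_time) for px in row] for row in scene]

scene = [[10, 200], [2000, 50]]  # hypothetical scene radiance values

first_original  = expose(scene, exposure_time=1.0)  # first exposure time L (long)
second_original = expose(scene, exposure_time=0.1)  # second exposure time S (short)

# the long exposure clips the bright pixel, while the short exposure keeps it,
# which is why both frames are needed to cover a high dynamic range
print(first_original[1][0], second_original[1][0])
```

Conversely, the long-exposure frame preserves the dark pixels that the short exposure barely records, which is the information the later fusion stage combines.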
  • the image sensor 10 can output multiple original image data generated by the pixel array 11, and the multiple original image data form an original image.
  • each color original image data in each frame of the original image (the first original image, the second original image, and the third original image) is generated by a single single-color photosensitive pixel
  • each full-color original image data is generated by a single full-color photosensitive pixel W
  • the image sensor 10 outputs multiple original image data in an output manner that alternately outputs one color original image data and one full-color original image data.
  • after the pixel array 11 is exposed, each single-color photosensitive pixel generates one color original image data corresponding to that single-color photosensitive pixel, and each panchromatic photosensitive pixel W generates one full-color original image data corresponding to that panchromatic photosensitive pixel W.
  • for a plurality of photosensitive pixels 110 in the same row, the original image data generated by these photosensitive pixels are output as follows: one color original image data and one full-color original image data are output alternately. After all the original image data of one row have been output, the original image data of the next row are output.
  • the output mode of the multiple original image data of each row is the same: one color original image data and one full-color original image data are output alternately.
  • the image sensor 10 sequentially outputs a plurality of original image data, and the plurality of original image data forms an original image.
  • the alternate output of a color original image data and a full-color original image data can include the following two types: (1) output a color original image data first, and then output a full-color original image data; (2) output first A full-color original image data, and then output a color original image data.
  • the specific alternating sequence is related to the arrangement of the full-color photosensitive pixels W and the color photosensitive pixels in the pixel array 11.
  • when the photosensitive pixel 110 in row 0 and column 0 of the pixel array 11 is a color photosensitive pixel, the alternating sequence is (1); when the photosensitive pixel 110 in row 0 and column 0 of the pixel array 11 is a full-color photosensitive pixel W, the alternating sequence is (2).
  • the output mode of the original image data will be described below by taking FIG. 12 as an example.
  • the pixel array 11 includes 8*8 photosensitive pixels 110, and the photosensitive pixel 110 in the 0th row and 0th column of the pixel array 11 is a full-color photosensitive pixel W, then when the pixel array 11 After the exposure is completed, the image sensor 10 first outputs the full-color original image data generated by the full-color photosensitive pixel p00 in the 0th row and 0th column.
  • the image pixel P00 corresponding to the full-color original image data is located in the 0th row and 0th column of the original image.
  • the image sensor 10 then outputs the color original image data generated by the color photosensitive pixel p01 in the 0th row and the 1st column, and the image pixel P01 corresponding to the color original image data is located in the 0th row and the 1st column of the original image; ...;
  • the image sensor 10 outputs color original image data generated by the color photosensitive pixel p07 in the 0th row and 7th column, and the image pixel P07 corresponding to the color original image data is located in the 0th row and 7th column of the original image. So far, the original image data generated by the eight photosensitive pixels 110 in the 0th row of the pixel array 11 are all output.
  • the image sensor 10 sequentially outputs the original image data generated by the eight photosensitive pixels 110 in the first row of the pixel array 11; subsequently, the image sensor 10 sequentially outputs the original image data generated by the eight photosensitive pixels 110 in the second row of the pixel array 11; It can be deduced by analogy until the image sensor 10 outputs the full-color original image data generated by the full-color photosensitive pixel p77 in the seventh row and seventh column.
  • the original image data generated by the plurality of photosensitive pixels 110 form an original image, wherein the position in the original image of the image pixel corresponding to the original image data generated by each photosensitive pixel 110 corresponds to the position of that photosensitive pixel 110 in the pixel array 11.
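The full-resolution output order walked through above (FIG. 12) can be sketched as follows; the checkerboard placement of full-color and color pixels, with a full-color pixel W at row 0, column 0, is taken from the example, and everything else is illustrative:

```python
# Sketch of the row-by-row, pixel-alternating output order of FIG. 12:
# one full-color datum and one color datum alternate within each row,
# and image pixel Pij lands at the same position as photosensitive pixel pij.

ROWS, COLS = 8, 8

def pixel_kind(r, c):
    # checkerboard of full-color (W) and color pixels, W at (0, 0)
    return "W" if (r + c) % 2 == 0 else "color"

# stream the data in the order the sensor outputs them: row 0 first,
# then row 1, and so on, left to right within each row
output_stream = [(r, c, pixel_kind(r, c)) for r in range(ROWS) for c in range(COLS)]

# the first datum output is the full-color datum of p00, the second the color
# datum of p01, and the last the full-color datum of p77
print(output_stream[0], output_stream[1], output_stream[-1])
```

Because the stream order matches the array order, reassembling the original image is just reshaping the stream back into an 8x8 grid.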
  • each color original image data in each frame of the original image (the first original image, the second original image, and the third original image) is generated by multiple single-color photosensitive pixels in the same subunit.
  • each full-color original image data is jointly generated by multiple full-color photosensitive pixels W in the same subunit, and the image sensor 10 outputs the multiple original image data in a manner in which multiple color original image data and multiple full-color original image data are output alternately.
  • multiple single-color photosensitive pixels in the same subunit jointly generate one color original image data corresponding to that subunit, and multiple full-color photosensitive pixels W in the same subunit jointly generate one full-color original image data corresponding to that subunit; that is, one subunit corresponds to one color original image data and one full-color original image data.
  • for a plurality of subunits in the same row, the corresponding original image data are output as follows: the multiple color original image data and the multiple full-color original image data corresponding to these subunits are output alternately, wherein the multiple color original image data are output in sequence and the multiple full-color original image data are output in sequence. After all the original image data of one row of subunits have been output, the original image data of the next row of subunits are output.
  • the output mode of the multiple original image data of each row of subunits is the same: multiple color original image data and multiple full-color original image data are output alternately.
  • the image sensor 10 sequentially outputs a plurality of original image data, and the plurality of original image data forms an original image.
  • the alternate output of multiple color original image data and multiple full-color original image data may include the following two types: (1) First output multiple color original image data one after another, and then output multiple full-color original image data one after another. Image data; (2) First output multiple full-color original image data one after another, and then output multiple color original image data one after another.
  • the specific alternating sequence is related to the arrangement of the full-color photosensitive pixels W and the color photosensitive pixels in the pixel array 11.
  • when the photosensitive pixel 110 in row 0 and column 0 of the pixel array 11 is a color photosensitive pixel, the alternating sequence is (1); when the photosensitive pixel 110 in row 0 and column 0 of the pixel array 11 is a full-color photosensitive pixel W, the alternating sequence is (2).
  • the output mode of the original image data will be described below by taking FIG. 13 as an example.
  • the pixel array 11 includes 8*8 photosensitive pixels 110.
  • the full-color photosensitive pixel p00, the full-color photosensitive pixel p11, the color photosensitive pixel p01, and the color photosensitive pixel p10 in the pixel array 11 constitute a subunit U1;
  • the full-color photosensitive pixel p02, the full-color photosensitive pixel p13, the color photosensitive pixel p03, and the color photosensitive pixel p12 constitute a subunit U2;
  • the full-color photosensitive pixel p04, the full-color photosensitive pixel p15, the color photosensitive pixel p05, and the color photosensitive pixel p14 constitute a subunit U3;
  • the full-color photosensitive pixel p06, the full-color photosensitive pixel p17, the color photosensitive pixel p07, and the color photosensitive pixel p16 constitute a subunit U4.
  • after the exposure of the pixel array 11 is completed, the image sensor 10 first outputs the panchromatic original image data jointly generated by the full-color photosensitive pixel p00 and the full-color photosensitive pixel p11 in the subunit U1; the image pixel P00 corresponding to this panchromatic original image data is located in the 0th row and 0th column of the original image.
  • subsequently, the image sensor 10 outputs the panchromatic original image data jointly generated by the full-color photosensitive pixel p02 and the full-color photosensitive pixel p13 in the subunit U2; the image pixel P01 corresponding to this panchromatic original image data is located in the 0th row and 1st column of the original image.
  • subsequently, the image sensor 10 outputs the panchromatic original image data jointly generated by the full-color photosensitive pixel p04 and the full-color photosensitive pixel p15 in the subunit U3; the image pixel P02 corresponding to this panchromatic original image data is located in the 0th row and 2nd column of the original image.
  • subsequently, the image sensor 10 outputs the panchromatic original image data jointly generated by the full-color photosensitive pixel p06 and the full-color photosensitive pixel p17 in the subunit U4; the image pixel P03 corresponding to this panchromatic original image data is located in the 0th row and 3rd column of the original image. So far, the multiple panchromatic original image data corresponding to the multiple subunits in the first row have all been output.
  • next, the image sensor 10 outputs the color original image data jointly generated by the color photosensitive pixel p01 and the color photosensitive pixel p10 in the subunit U1; the image pixel P10 corresponding to this color original image data is located in the 1st row and 0th column of the original image. Subsequently, the image sensor 10 outputs the color original image data jointly generated by the color photosensitive pixel p03 and the color photosensitive pixel p12 in the subunit U2; the image pixel P11 corresponding to this color original image data is located in the 1st row and 1st column of the original image. Subsequently, the image sensor 10 outputs the color original image data jointly generated by the color photosensitive pixel p05 and the color photosensitive pixel p14 in the subunit U3.
  • the image pixel P12 corresponding to the color original image data is located in the first row and second column of the original image;
  • the sensor 10 then outputs the color original image data jointly generated by the color photosensitive pixel p07 and the color photosensitive pixel p16 in the subunit U4, and the image pixel P13 corresponding to the color original image data is located in the first row and third column of the original image. So far, the multiple color original image data corresponding to the multiple subunits in the first row have also been output.
• subsequently, the image sensor 10 outputs the multiple full-color original image data and multiple color original image data corresponding to the multiple sub-units in the second row; the output mode of these data is the same as that of the multiple full-color original image data and multiple color original image data corresponding to the multiple subunits in the first row, and will not be repeated here. This continues until the image sensor 10 has output the multiple full-color original image data and multiple color original image data corresponding to the multiple subunits in the fourth row. In this way, the original image data generated by the plurality of photosensitive pixels 110 forms a frame of original image.
• the image fusion module 20 fuses the first color original image data and the first panchromatic original image data in the first original image to obtain a first color intermediate image containing only the first color intermediate image data, and fuses the second color original image data and the second panchromatic original image data in the second original image to obtain a second color intermediate image containing only the second color intermediate image data.
• the color intermediate image obtained after the image fusion module 20 fuses the color original image data and the panchromatic original image data includes a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
• the resolution of the color intermediate image is the same as the resolution of the pixel array 11.
• when the image sensor 10 is working in the high-resolution mode, the original image data can be output by alternately outputting one color original image data and one full-color original image data; when the image sensor 10 is working in the low-resolution mode, the original image data can be output by alternately outputting multiple color original image data and multiple full-color original image data.
• when the ambient brightness is high, the image sensor 10 can work in the high-resolution mode, which is beneficial to improving the clarity of the finally acquired image; when the ambient brightness is low, the image sensor 10 can work in the low-resolution mode, which is beneficial to increasing the brightness of the finally acquired image.
  • the image fusion module 20 may be integrated in the image sensor 10, may also be integrated in the image processor 40, or may be separately provided outside the image sensor 10 and the image processor 40.
  • the high dynamic range image processing system 100 also includes an image processor 40.
• the image processor 40 includes an image preprocessing module 41. After the image fusion module 20 obtains the first color intermediate image and the second color intermediate image, the two images are transmitted to the image preprocessing module 41 for image preprocessing.
  • the image preprocessing module 41 may perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image, and perform preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image.
  • image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
• image preprocessing may include only black level correction; or, image preprocessing may include lens shading correction and dead pixel compensation; or, image preprocessing may include black level correction and lens shading correction; or, image preprocessing may include black level correction, lens shading correction, and dead pixel compensation.
  • the black level correction process may be that the image preprocessing module 41 subtracts a fixed value from each pixel value (that is, each color intermediate image data) on the basis of obtaining the color intermediate image fused by the image fusion module 20.
  • the fixed value corresponding to the pixel value of each color channel can be the same or different.
  • the first color intermediate image has the pixel values of the red channel, the pixel values of the green channel, and the pixel values of the blue channel.
• the image preprocessing module 41 performs black level correction on the first color intermediate image by subtracting a fixed value of 5 from all the pixel values in the first color intermediate image, thereby obtaining the black-level-corrected first color intermediate image.
• the image sensor 10 adds a fixed offset of 5 (or another value) before the ADC input, so that the output pixel value lies between 5 (or another value) and 255. Black level correction ensures that the details of the dark parts of the image obtained by the image sensor 10 and the high dynamic range image processing system 100 of the embodiments of the application are completely preserved while the pixel values of the image are neither artificially increased nor decreased, which is beneficial to improving the image quality.
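The black level correction step described above can be sketched as follows. The RGGB channel layout and the per-channel offset values are illustrative assumptions, not fixed by the embodiment:

```python
def black_level_correct(image, offsets):
    """Subtract a per-channel black-level offset from a Bayer image.

    image:   list of rows of pixel values; an RGGB layout is assumed
             for illustration (R at even row/even col, B at odd/odd,
             G elsewhere).
    offsets: dict channel -> fixed value to subtract; the fixed values
             may be the same for all channels or differ per channel.
    """
    def channel(y, x):
        if y % 2 == 0 and x % 2 == 0:
            return "R"
        if y % 2 == 1 and x % 2 == 1:
            return "B"
        return "G"

    # Clamp at 0 so pixels never go negative after the subtraction.
    return [
        [max(0, v - offsets[channel(y, x)]) for x, v in enumerate(row)]
        for y, row in enumerate(image)
    ]

# Every pixel of a uniform 100-valued image drops to 95 with offset 5.
bayer = [[100] * 4 for _ in range(4)]
corrected = black_level_correct(bayer, {"R": 5, "G": 5, "B": 5})
```

With the sensor's fixed offset of 5, subtracting the same 5 restores the true signal range while the clamp keeps dark-part values from underflowing.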
• Lens shading is the phenomenon in which the image is darker around its edges than at its center, caused by uneven optical refraction of the lens; that is, the intensity of the received light is inconsistent between the center and the periphery of the image area.
• the process of lens shading correction may be as follows: on the basis of the black-level-corrected first color intermediate image and the black-level-corrected second color intermediate image, the image preprocessing module 41 divides the processed image into grids, and then corrects the image for lens shading by the bilinear interpolation method according to the compensation coefficients of each grid area and its adjacent areas.
• the following takes the lens shading correction of the first color intermediate image as an example. As shown in FIG. 18, the image preprocessing module 41 divides the first color intermediate image (that is, the processed image) into sixteen equal grids, each of which has a preset compensation coefficient. Then, the image preprocessing module 41 performs shading correction on the image by the bilinear interpolation method according to the compensation coefficients of each grid area and its adjacent areas.
• R2 is the pixel value in the dashed frame in the first color intermediate image after lens shading correction, and R1 is the pixel value in the dashed frame in the first color intermediate image in the illustration.
• the coordinates of an image pixel are (x, y), where x is counted from the first image pixel from left to right, y is counted from the first image pixel from top to bottom, and both x and y are natural numbers, as shown by the marks on the edge of the image. The coordinates of R1 in the image are (3, 3), so the coordinates of R1 in the grid compensation coefficient map are (0.75, 0.75).
• f(x, y) represents the compensation coefficient at coordinate (x, y) in the grid compensation coefficient map, so f(0.75, 0.75) is the compensation coefficient corresponding to R1, and the corrected pixel value is R2 = R1 × f(0.75, 0.75).
• the compensation coefficient of each grid is preset before the image preprocessing module 41 performs lens shading correction.
• the compensation coefficient of each grid can be determined as follows: (1) place the lens 300 in a closed device with constant and uniform light intensity and color temperature, and point the lens 300 at a pure gray target with uniform brightness distribution in the closed device to shoot and obtain a grayscale image; (2) divide the grayscale image into grids (for example, sixteen grids) to obtain a grayscale image divided into different grid areas; (3) calculate the compensation coefficient of each grid area of the grayscale image. After the compensation coefficients of the lens 300 are determined, they are set in the image preprocessing module 41 of the high dynamic range image processing system 100 in advance.
• when the image preprocessing module 41 in the high dynamic range image processing system 100 performs lens shading correction on an image, it obtains these compensation coefficients and uses the bilinear interpolation method to correct the image according to the compensation coefficients of each grid area.
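As a rough sketch of the grid-based correction described above, the per-pixel gain can be bilinearly interpolated from the preset grid compensation coefficients. The grid size, cell size, and coefficient values below are assumptions for illustration:

```python
def shading_gain(gx, gy, grid):
    """Bilinearly interpolate a compensation coefficient at grid-space
    coordinates (gx, gy) from a 2-D table of per-grid coefficients."""
    x0, y0 = int(gx), int(gy)
    x1 = min(x0 + 1, len(grid[0]) - 1)
    y1 = min(y0 + 1, len(grid) - 1)
    fx, fy = gx - x0, gy - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
    bot = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def lens_shading_correct(image, grid, cell):
    """R2 = R1 * f(x/cell, y/cell): scale every pixel by the gain
    interpolated from the grid compensation coefficient map. `cell`
    is the pixel size of one grid cell, so a pixel at (3, 3) with
    cell = 4 uses grid coordinates (0.75, 0.75), as in the example."""
    return [
        [v * shading_gain(x / cell, y / cell, grid)
         for x, v in enumerate(row)]
        for y, row in enumerate(image)
    ]

# Uniform gain of 2.0 simply doubles every pixel value.
grid = [[2.0] * 3 for _ in range(3)]
img = [[10] * 8 for _ in range(8)]
out = lens_shading_correct(img, grid, 4)
```

In practice the coefficients increase toward the grid cells at the image periphery so that the darker edges are boosted back to the brightness of the center.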
• the photosensitive pixels 110 on the pixel array 11 of the image sensor 10 may have process defects, or errors may occur in the process of converting optical signals into electrical signals, resulting in incorrect image pixel information and inaccurate pixel values in the image; these defective image pixels appear on the output image as dead pixels. Since dead pixels may exist, the image needs dead pixel compensation.
• dead pixel compensation may include the following steps: (1) a 3×3 pixel matrix of photosensitive pixels of the same color is established with the pixel to be detected as the center pixel; (2) taking the pixels surrounding the center pixel as reference points, determine whether the difference between the color value of the center pixel and that of each surrounding pixel is greater than a first threshold; if so, the center pixel is a dead pixel, and if not, the center pixel is a normal pixel; (3) perform bilinear interpolation on a center pixel determined to be a dead pixel to obtain the corrected pixel value.
• the following takes dead pixel compensation of the first color intermediate image (which may be the uncorrected first color intermediate image, the corrected first color intermediate image, etc.) as an example.
• R1 is the pixel to be detected. The image preprocessing module 41 uses R1 as the center pixel to establish a 3×3 pixel matrix of pixels whose photosensitive pixels have the same color as that of R1, obtaining the second image in FIG. 19. Taking the pixels surrounding the center pixel R1 as reference points, it determines whether the difference between the color value of the center pixel R1 and that of each surrounding pixel is greater than the first threshold Q (Q is preset in the image preprocessing module 41). If so, the center pixel R1 is a dead pixel; if not, the center pixel R1 is a normal pixel.
• if R1 is a dead pixel, bilinear interpolation is performed on R1 to obtain the corrected pixel value R1' (the case where R1 is a dead pixel is shown in the figure), yielding the third image in FIG. 19.
• the image preprocessing module 41 of the embodiments of the present application can perform dead pixel compensation on the image, which helps the high dynamic range image processing system 100 eliminate, during imaging, defects caused by process flaws in the photosensitive pixels 110 or by errors in converting optical signals into electrical signals, thereby improving the accuracy of the pixel values of the target image formed by the high dynamic range image processing system 100 and giving the embodiments of the present application a better imaging effect.
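A minimal sketch of steps (1)-(3) on a single-colour plane might look like the following. Treating the centre as dead only when it differs from all eight neighbours by more than Q is one plausible reading of the detection step:

```python
def dead_pixel_correct(plane, y, x, q):
    """Detect and correct one pixel in a single-colour 2-D plane.

    The 3x3 matrix of same-colour pixels around (y, x) must exist.
    If the centre differs from all eight neighbours by more than the
    threshold q, it is treated as a dead pixel and replaced; at the
    centre of the matrix, bilinear interpolation reduces to the mean
    of the four up/down/left/right neighbours.
    """
    centre = plane[y][x]
    neighbours = [plane[y + dy][x + dx]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                  if (dy, dx) != (0, 0)]
    if all(abs(centre - n) > q for n in neighbours):
        return (plane[y - 1][x] + plane[y + 1][x]
                + plane[y][x - 1] + plane[y][x + 1]) / 4
    return centre

# A 200-valued centre surrounded by 10s is corrected to 10.0.
plane = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]
fixed = dead_pixel_correct(plane, 1, 1, 50)
```

A centre pixel that stays within the threshold of at least one neighbour is returned unchanged, matching the "normal pixel" branch of the description.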
  • the high dynamic range image processing system 100 further includes a storage module 50.
• the storage module 50 is used to store the images preprocessed by the image preprocessing module 41 and transmit the preprocessed images to the high dynamic range image processing module 30 for high dynamic range processing to obtain the first color high dynamic range image.
• the image preprocessing module 41 preprocesses the first color intermediate image and the second color intermediate image in sequence. After the image preprocessing module 41 performs image preprocessing on the first color intermediate image, it transmits the obtained preprocessed first color intermediate image to the storage module 50 for storage. After the image preprocessing module 41 completes image preprocessing on the second color intermediate image, it likewise transmits the obtained preprocessed second color intermediate image to the storage module 50 for storage.
• when the storage module 50 has stored all the images preprocessed by the image preprocessing module 41 (that is, when the storage module 50 stores both the preprocessed first color intermediate image and the preprocessed second color intermediate image), the storage module 50 transmits all the stored images to the high dynamic range image processing module 30.
• the image preprocessing module 41 can also preprocess the second color intermediate image first and then preprocess the first color intermediate image; the image preprocessing module 41 can also perform image preprocessing on the first color intermediate image and the second color intermediate image at the same time, which is not limited here. Regardless of the order in which the image preprocessing module 41 preprocesses the first color intermediate image and the second color intermediate image, the storage module 50 transmits the two images to the high dynamic range image processing module 30 only after it has stored both the preprocessed first color intermediate image and the preprocessed second color intermediate image.
• after obtaining the preprocessed first color intermediate image and the preprocessed second color intermediate image, the high dynamic range image processing module 30 performs high dynamic range processing on the two images to obtain the first color high dynamic range image.
• the pixel value V1 of the image pixel P12 (the image pixel marked with a dashed circle in the preprocessed first color intermediate image in FIG. 20) is greater than the first preset threshold V0; that is, the image pixel P12 is overexposed.
• the high dynamic range image processing unit 31 expands a predetermined area centered on the overexposed image pixel P12, for example, the 3×3 area shown in FIG. 20, and searches within this 3×3 predetermined area for an intermediate image pixel with a pixel value smaller than the first preset threshold V0, such as the image pixel P21 in FIG. 20 (the image pixel marked with a dotted circle in the first color intermediate image in FIG. 20); since its pixel value V2 is less than the first preset threshold V0, the image pixel P21 serves as the intermediate image pixel P21.
• the high dynamic range image processing unit 31 then searches the preprocessed second color intermediate image for the image pixels corresponding to the overexposed image pixel P12 and the intermediate image pixel P21 respectively, namely the image pixel P1'2' (the image pixel marked with a dashed circle in the preprocessed second color intermediate image in FIG. 20) and the image pixel P2'1' (the image pixel marked with a dotted circle in the preprocessed second color intermediate image in FIG. 20). The image pixel P1'2' corresponds to the overexposed image pixel P12, the image pixel P2'1' corresponds to the intermediate image pixel P21, the pixel value of the image pixel P1'2' is V3, and the pixel value of the image pixel P2'1' is V4. From V2, V3, and V4, the actual pixel value of the overexposed image pixel P12 can be calculated.
• the high dynamic range image processing unit 31 performs this brightness alignment process on each overexposed image pixel in the preprocessed first color intermediate image to obtain the preprocessed and brightness-aligned first color intermediate image.
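The text does not spell out the brightness alignment formula. One reconstruction consistent with the quantities V2, V3, and V4 above scales the co-located medium-exposure value V3 by the local exposure ratio V2/V4; the formula is an assumption, as sketched below:

```python
def align_overexposed(v1, v0, v2, v3, v4):
    """Estimate the true value of an over-exposed long-exposure pixel.

    v1: over-exposed long-exposure pixel value (v1 > v0)
    v0: first preset threshold (saturation level)
    v2: a nearby non-saturated long-exposure pixel value (v2 < v0)
    v3: medium-exposure pixel co-located with the over-exposed pixel
    v4: medium-exposure pixel co-located with the nearby pixel

    Assumed reconstruction: scale v3 by the local long/medium exposure
    ratio v2 / v4 observed at the nearby, unsaturated position.
    """
    if v1 <= v0:
        return v1  # not saturated: keep the original value
    return v3 * v2 / v4

# With V2=200, V3=180, V4=150 the estimated true value is
# 180 * 200 / 150 = 240.
estimate = align_overexposed(255, 254, 200, 180, 150)
```

The ratio v2/v4 acts as a locally measured gain between the long and medium exposures, so the saturated pixel is replaced by what the long exposure would have recorded had it not clipped.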
• the high dynamic range image processing module 30 may fuse the preprocessed and brightness-aligned first color intermediate image with the preprocessed second color intermediate image to obtain a high dynamic range image. Specifically, the high dynamic range image processing module 30 first performs motion detection on the preprocessed and brightness-aligned first color intermediate image to identify whether there is a motion blur area in it.
• if there is no motion blur area, the preprocessed and brightness-aligned first color intermediate image and the preprocessed second color intermediate image are directly merged to obtain the first color high dynamic range image. If there is a motion blur area, the motion blur area in the preprocessed and brightness-aligned first color intermediate image is eliminated, and only the preprocessed second color intermediate image and the portion of the preprocessed and brightness-aligned first color intermediate image outside the motion blur area are merged to obtain the first color high dynamic range image.
• the fusion of the two intermediate images follows the following principles: (1) in the preprocessed and brightness-aligned first color intermediate image, the pixel values of the image pixels in the overexposed area are directly replaced with the pixel values of the corresponding image pixels in the preprocessed second color intermediate image; (2) in the preprocessed and brightness-aligned first color intermediate image, the pixel value of an image pixel in the underexposed area is the long-exposure pixel value divided by the coefficient K1, where the coefficient K1 is the average of K2 and K3, K2 is the ratio of the long-exposure pixel value to the medium-exposure pixel value, and K3 is the ratio of the long-exposure pixel value to the short-exposure pixel value; (3) in the preprocessed and brightness-aligned first color intermediate image, the pixel values of the image pixels in the remaining areas are kept unchanged.
• in this case, the fusion of the two intermediate images not only follows the above three principles but also needs to follow principle (4): in the preprocessed and brightness-aligned first color intermediate image, the pixel values of the image pixels in the motion blur area are directly replaced with the pixel values of the image pixels corresponding to the motion blur area in the preprocessed second color intermediate image.
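The fusion principles can be sketched per pixel as follows. The thresholds and the `fuse_pixel` helper are illustrative, and keeping well-exposed pixels unchanged for principle (3) is an assumption:

```python
def fuse_pixel(long_v, mid_v, short_v, v0, u0, in_motion_blur):
    """Per-pixel sketch of fusion principles (1)-(4).

    long_v:  brightness-aligned long-exposure (first image) value
    mid_v:   medium-exposure (second image) value, assumed nonzero
    short_v: short-exposure value (when a third frame exists), nonzero
    v0/u0:   assumed over-/under-exposure thresholds
    """
    if in_motion_blur:                 # principle (4): motion blur area
        return mid_v
    if long_v >= v0:                   # principle (1): over-exposed area
        return mid_v
    if long_v <= u0:                   # principle (2): under-exposed area
        k2 = long_v / mid_v            # K2: long / medium ratio
        k3 = long_v / short_v          # K3: long / short ratio
        k1 = (k2 + k3) / 2             # K1 is the average of K2 and K3
        return long_v / k1
    return long_v                      # principle (3), assumed: keep as-is
```

Running this rule over every pixel of the two (or three) intermediate images yields the fused first color high dynamic range image.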
• the high dynamic range image processing system 100 of the embodiments of the present application performs high dynamic range processing on the image through the high dynamic range image processing module 30: it first performs brightness alignment on the image, and then merges the brightness-aligned image with other images to obtain the high dynamic range image, so that the target image formed by the high dynamic range image processing system 100 has a larger dynamic range and thus a better imaging effect.
• the image processor 40 further includes an image post-processing module 42 for performing image post-processing on the first color high dynamic range image to obtain the second color high dynamic range image.
  • the image post-processing includes at least one of demosaicing, color correction, global tone mapping, and color conversion.
• image post-processing may include only global tone mapping; or, image post-processing may include global tone mapping and color conversion; or, image post-processing may include color correction, global tone mapping, and color conversion; or, image post-processing may include demosaicing, color correction, global tone mapping, and color conversion.
• since each image pixel grid of the first color high dynamic range image in the embodiments of the present application contains a single-color image pixel with no optical information of the other colors, it is necessary to demosaic the first color high dynamic range image.
• the demosaicing includes the following steps: (1) decompose the first color high dynamic range image into a first red high dynamic range image, a first green high dynamic range image, and a first blue high dynamic range image; as shown in FIG. 21, in the resulting first red, first green, and first blue high dynamic range images, some image pixel grids have no pixel values; (2) the first red high dynamic range image, the first green high dynamic range image, and the first blue high dynamic range image are each interpolated using the bilinear interpolation method.
• the image post-processing module 42 uses the bilinear interpolation method to perform interpolation processing on the first blue high dynamic range image. For example, the image pixel Bu1 to be interpolated in FIG. 22 is bilinearly interpolated from the four image pixels Bu2, Bu3, Bu4, and Bu5 around Bu1 to obtain the interpolated image pixel Bu1'. All the image pixels to be interpolated in the blanks of the first image of FIG. 22 are traversed using the bilinear interpolation method to fill in the pixel values and obtain the interpolated first blue high dynamic range image.
• the image post-processing module 42 uses the bilinear interpolation method to perform interpolation processing on the first green high dynamic range image. For example, the image pixel G1 to be interpolated in FIG. 23 is bilinearly interpolated from the four pixels G2, G3, G4, and G5 around G1 to obtain the interpolated pixel G1'. All the image pixels to be interpolated in the blanks of the first image of FIG. 23 are traversed using the bilinear interpolation method to fill in the pixel values and obtain the interpolated first green high dynamic range image.
  • the image post-processing module 42 may use a bilinear interpolation method to perform interpolation processing on the first red high dynamic range image to obtain an interpolated first red high dynamic range image.
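On a single decomposed colour plane, the bilinear fill-in described above reduces at each empty grid cell to averaging the available 4-connected neighbours, e.g.:

```python
def interpolate_plane(plane):
    """Fill the empty cells (None) of a single-colour plane with the
    average of the available 4-connected neighbours, which is what
    bilinear interpolation reduces to on a regular sampling grid."""
    h, w = len(plane), len(plane[0])
    out = [row[:] for row in plane]
    for y in range(h):
        for x in range(w):
            if plane[y][x] is not None:
                continue  # already sampled by the sensor
            neigh = [plane[ny][nx]
                     for ny, nx in ((y - 1, x), (y + 1, x),
                                    (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w
                     and plane[ny][nx] is not None]
            out[y][x] = sum(neigh) / len(neigh)
    return out

# The two missing corners are each filled with (8 + 4) / 2 = 6.0.
blue = [[None, 8], [4, None]]
filled = interpolate_plane(blue)
```

Running this over the red, green, and blue planes and restacking them yields an image with all three channels at every pixel.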
• the image post-processing module 42 performs demosaicing on the color image, which enables the embodiments of the present application to complete a color image having pixel values of a single color channel into a color image having multiple color channels, so that the complete color of the image can be presented on the basis of single-color photosensitive pixel hardware.
• the color correction may specifically use a color correction matrix to correct each color channel value of each image pixel of the first color high dynamic range image (which may be the demosaiced first color high dynamic range image), thereby realizing correction of the image color, as follows:
  • the color correction matrix (CCM) is preset in the image post-processing module 42.
  • the color correction matrix may specifically be:
  • the image post-processing module 42 traverses all image pixels in the image and performs color correction through the above color correction matrix to obtain a color-corrected image.
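Applying a CCM to one pixel is a 3×3 matrix multiply. The coefficients below are illustrative (each row sums to 1, preserving grays), not the values preset in the image post-processing module 42:

```python
def apply_ccm(rgb, ccm):
    """Correct one pixel: [R', G', B'] = CCM x [R, G, B] (3x3 multiply)."""
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in ccm)

# Illustrative matrix only: the off-diagonal negative terms reduce
# channel crosstalk while each row sums to 1 so grays stay neutral.
CCM = [[1.50, -0.30, -0.20],
       [-0.25, 1.40, -0.15],
       [-0.10, -0.40, 1.50]]
gray = apply_ccm((100, 100, 100), CCM)
```

Traversing all image pixels with `apply_ccm`, as the module 42 does, produces the color-corrected image.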
• the color correction in the embodiments of the present application helps eliminate serious color deviation caused by colored light sources in an image or video frame and color distortion of people or objects in the image, so that the high dynamic range image processing system 100 of the embodiments of the present application can recover the original colors of the image and improve its visual effect.
• in a high dynamic range image, the binary bit depth of the gray values is often higher than 8 bits (the gray values of ordinary grayscale images are generally 8-bit), while the gray scale of many displays is only 8 bits; therefore, tone mapping of the high dynamic range image is required, which gives the high dynamic range image higher compatibility so that it can be displayed on a conventional monitor.
• high dynamic range images generally have a very uneven distribution of gray values: only a few image pixels are bright, and most image pixels lie in the interval of lower gray values. For this reason, the high dynamic range image processing system 100 of the embodiments of the present application performs nonlinear tone mapping of the image, in which the slope of the mapping relationship in the interval of lower gray values is greater than the slope of the mapping relationship in the interval of higher gray values, as shown in FIG. 25, which helps preserve the details of the darker parts of the image.
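A simple global tone curve with this property is a gamma (square-root) mapping; the 10-bit input depth and the gamma value are assumed for illustration:

```python
def tone_map(v, in_max=1023, out_max=255, gamma=0.5):
    """Map a high-bit-depth gray value to 8 bits with a concave curve:
    the mapping slope is larger in the low-gray interval than in the
    high-gray interval, as FIG. 25 describes. A 10-bit input and a
    square-root (gamma = 0.5) curve are assumed for illustration."""
    return round(out_max * (v / in_max) ** gamma)
```

Because the curve is concave, a span of 100 input levels near black consumes far more of the 8-bit output range than the same span near white, which is exactly the slope relationship the embodiment describes.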
• the high dynamic range image processing system 100 can perform color conversion on the first color high dynamic range image (which may be the tone-mapped first color high dynamic range image), converting the image from one color space (for example, the RGB color space) to another color space (for example, the YUV color space) so as to obtain a wider range of application scenarios or a more efficient transmission format.
• the subsequent image processing of the high dynamic range image processing system 100 can then compress the chrominance information of the image, which reduces the amount of image information without affecting the viewing effect of the image, thereby improving the transmission efficiency of the image.
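A common RGB-to-YUV conversion uses BT.601-style full-range coefficients (one typical choice; the embodiment does not fix a particular matrix):

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel to YUV using BT.601-style
    coefficients (one common choice; the embodiment does not fix a
    particular conversion matrix)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128   # blue-difference chroma
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128    # red-difference chroma
    return y, u, v

# White maps to maximum luma with neutral (128) chroma, so later
# chroma subsampling barely affects the viewing effect.
white = rgb_to_yuv(255, 255, 255)
```

Separating luma from chroma is what makes the subsequent chrominance compression possible: the U and V planes can be subsampled while the Y plane keeps full resolution.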
• in some embodiments, after the image fusion module 20 obtains the first color intermediate image and the second color intermediate image, they are directly transmitted to the high dynamic range image processing module 30 for high dynamic range processing to obtain a third color high dynamic range image. The third color high dynamic range image is then transmitted to the image processor 40 to undergo image preprocessing and image post-processing in sequence, finally obtaining the second color high dynamic range image.
• alternatively, after the image fusion module 20 directly transmits the obtained first color intermediate image and second color intermediate image to the high dynamic range image processing module 30 for high dynamic range processing, the resulting third color high dynamic range image may also enter the image processor 40 directly and undergo only image post-processing, finally obtaining the second color high dynamic range image.
  • the pixel array 11 may also be exposed for a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixel exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixel W exposed at the third exposure time.
  • the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the pixel array 11 performs three exposures to obtain a first original image, a second original image, and a third original image, respectively.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at a first exposure time L and first full-color original image data generated by panchromatic photosensitive pixels W exposed at a first exposure time L .
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time M and second full-color original image data generated by the panchromatic photosensitive pixels W exposed at the second exposure time M.
  • the third original image includes third color original image data generated by the single-color photosensitive pixel exposed at the third exposure time S and third full-color original image data generated by the panchromatic photosensitive pixel W exposed at the third exposure time S.
• the image fusion module 20 can fuse the first color original image data and the first panchromatic original image data into a first color intermediate image containing only the first color intermediate image data, fuse the second color original image data and the second panchromatic original image data into a second color intermediate image containing only the second color intermediate image data, and fuse the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data. The specific implementation is the same as that of fusing the first color original image data and the first panchromatic original image data into the first color intermediate image in the embodiments described in FIG. 14 and FIG. 15, and will not be repeated here.
• the image preprocessing module 41 may perform image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image, perform image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image, and perform image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image. The specific implementation is the same as that of the image preprocessing described in any of the embodiments of FIGS. 17 to 19, and will not be repeated here.
• the high dynamic range image processing module 30 can perform high dynamic range processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the first color high dynamic range image. Alternatively, the high dynamic range image processing module 30 may directly perform high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image.
  • the specific implementation method of high dynamic range processing is the same as the specific implementation described above for fusing the preprocessed first color intermediate image and the preprocessed second color intermediate image into the first color high dynamic range image. This will not be repeated here.
• the image post-processing module 42 can perform image post-processing on the first color high dynamic range image to obtain the second color high dynamic range image. The specific implementation is the same as the image post-processing described above, and will not be repeated here.
• the pixel array 11 may also be exposed more times, such as four, five, six, ten, or twenty times, so as to obtain more original images.
• the image fusion module 20 and the high dynamic range image processing module 30 then perform fusion algorithm processing and high dynamic range processing on all the original images to obtain the first color high dynamic range image.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments.
  • the lens 300 and the high dynamic range image processing system 100 are combined with the housing 200.
  • the lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
  • the electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, a smart glasses, a smart helmet), a drone, a head-mounted display device, etc., which are not limited here.
  • the electronic device 1000 of the embodiment of the present application uses the image fusion module 20 provided in the high dynamic range image processing system 100 to perform fusion algorithm processing in advance on the multiple original images output by the image sensor 10, so as to obtain multiple frames of color intermediate images whose image pixels are arranged in a Bayer array. The multi-frame color intermediate images can then be processed by the image processor 40, which solves the problem that the image processor 40 cannot directly process images whose image pixels are arranged in a non-Bayer array.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes minimal repeating units. Each minimal repeating unit contains multiple subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels.
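As an illustration of such a repeating unit (the patent defines the exact layouts elsewhere; the 4×4 pattern below, with each 2×2 subunit holding two panchromatic `W` pixels and two pixels of a single color, is an assumed example):

```python
# An assumed 4x4 minimal repeating unit: four 2x2 subunits, each mixing
# two panchromatic (W) pixels with two pixels of a single color (R, G, B).
MIN_REPEATING_UNIT = [
    ["W", "R", "W", "G"],
    ["R", "W", "G", "W"],
    ["W", "G", "W", "B"],
    ["G", "W", "B", "W"],
]

def subunit(unit, row, col):
    """Return the 2x2 subunit at subunit coordinates (row, col)."""
    r, c = row * 2, col * 2
    return [unit[r][c:c + 2], unit[r + 1][c:c + 2]]
```

Every subunit in this assumed layout contains both kinds of photosensitive pixel, which is what lets the fusion step pair color samples with co-located luminance samples.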
  • High dynamic range image processing methods include:
  • exposing the pixel array 11, where the pixel array 11 is exposed for a first exposure time to obtain a first original image, the first original image including first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time; and the pixel array 11 is exposed for a second exposure time to obtain a second original image, the second original image including second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time; wherein the first exposure time is not equal to the second exposure time;
  • the first color original image data and the first panchromatic original image data are fused into a first color intermediate image containing only first color intermediate image data;
  • the second color original image data and the second panchromatic original image data are fused into a second color intermediate image containing only second color intermediate image data;
  • both the first color intermediate image and the second color intermediate image include a plurality of color image pixels, and the plurality of color image pixels are arranged in a Bayer array.
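The two-exposure capture above can be modeled roughly as scaling scene radiance by exposure time and clipping at the sensor's saturation level; the `expose` helper and the clipping value are assumptions for illustration, not part of the patent:

```python
import numpy as np

def expose(radiance, t, full_well=255.0):
    """Simulate exposing the pixel array for time t: the signal scales
    with exposure time and clips at the sensor's full-well/ADC limit."""
    return np.clip(np.asarray(radiance, dtype=np.float64) * t, 0.0, full_well)

scene = np.array([10.0, 120.0])          # scene radiance in arbitrary units
first_original = expose(scene, t=1.0)    # short exposure preserves highlights
second_original = expose(scene, t=4.0)   # long exposure lifts shadows, clips highlights
```

The complementary failure modes of the two frames (noisy shadows in the short one, clipped highlights in the long one) are what the later high dynamic range processing exploits.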
  • the pixel array is exposed for a third exposure time to obtain a third original image, the third original image including third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time; wherein the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • the high dynamic range image processing method further includes: fusing the third color original image data and the third panchromatic original image data into a third color intermediate image containing only the third color intermediate image data, the third color intermediate image including a plurality of color image pixels arranged in a Bayer array.
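One way to picture the fusion of color and panchromatic data into a Bayer-arranged color intermediate image is a luminance-guided blend. The scheme below is an assumption for illustration and not the patent's actual fusion algorithm; `fuse_to_bayer` and `gain` are invented names:

```python
import numpy as np

def fuse_to_bayer(color, panchromatic, gain=0.5):
    """Blend panchromatic luminance detail into color samples that are
    assumed to already sit on a Bayer (RGGB) grid; the output therefore
    keeps the Bayer arrangement and contains only color intermediate data."""
    color = np.asarray(color, dtype=np.float64)
    pan = np.asarray(panchromatic, dtype=np.float64)
    detail = pan / pan.mean()  # relative luminance from the W pixels
    return np.clip(color * ((1.0 - gain) + gain * detail), 0.0, 255.0)
```

With a flat panchromatic plane the blend leaves the color samples unchanged; local variations in the W channel modulate the co-located color samples.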
  • the step of performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image includes: performing high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image.
  • each color original image data is generated by a single single-color photosensitive pixel
  • each panchromatic original image data is generated by a single panchromatic photosensitive pixel.
  • the output mode of the image sensor 10 for outputting a plurality of original image data includes alternately outputting one color original image data and one panchromatic original image data.
  • each color original image data is jointly generated by a plurality of single-color photosensitive pixels in the same subunit
  • each panchromatic original image data is jointly generated by a plurality of panchromatic photosensitive pixels in the same subunit.
  • the output manner of the image sensor 10 outputting a plurality of original image data includes alternately outputting a plurality of color original image data and a plurality of panchromatic original image data.
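A sketch of the per-subunit binned readout and the alternating output order described above, under the assumption of 2×2 subunits (the helper names are illustrative):

```python
def bin_subunits(raw, pattern):
    """Sum same-kind pixels within each 2x2 subunit, yielding one color
    value and one panchromatic value per subunit (binned readout)."""
    out = []
    for r in range(0, len(raw), 2):
        for c in range(0, len(raw[0]), 2):
            color = pan = 0
            for dr in range(2):
                for dc in range(2):
                    if pattern[r + dr][c + dc] == "W":
                        pan += raw[r + dr][c + dc]
                    else:
                        color += raw[r + dr][c + dc]
            out.append((color, pan))
    return out

def interleave(binned):
    """Emit the binned values alternately: color, panchromatic, color, ..."""
    stream = []
    for color, pan in binned:
        stream.extend([color, pan])
    return stream
```

This mirrors the text above: each color original image data is generated jointly by several single-color pixels of one subunit, each panchromatic datum jointly by the subunit's W pixels, and the sensor alternates the two kinds of data in its output stream.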
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image.
  • the step of performing high dynamic range processing on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image includes: performing high dynamic range processing on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the first color high dynamic range image.
  • the high dynamic range image processing method further includes: performing image preprocessing on the first color intermediate image to obtain the preprocessed first color intermediate image; performing image preprocessing on the second color intermediate image to obtain the preprocessed second color intermediate image; performing image preprocessing on the third color intermediate image to obtain the preprocessed third color intermediate image.
  • the step of performing high dynamic range processing on the first color intermediate image, the second color intermediate image, and the third color intermediate image to obtain the first color high dynamic range image includes: performing high dynamic range processing on the preprocessed first color intermediate image, the preprocessed second color intermediate image, and the preprocessed third color intermediate image to obtain the first color high dynamic range image.
  • the image preprocessing includes at least one of black level correction, lens shading correction, and dead pixel compensation.
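A minimal sketch of the three listed preprocessing steps, in an assumed order (black level subtraction, then a shading gain, then dead pixel compensation by neighbor averaging); the parameter values are illustrative, not from the patent:

```python
import numpy as np

def preprocess(img, black_level=16.0, shading_gain=None, dead_pixels=()):
    """Assumed order: black level correction, lens shading correction,
    then dead pixel compensation by averaging the 4-neighborhood."""
    img = np.asarray(img, dtype=np.float64)
    img = np.clip(img - black_level, 0.0, None)      # black level correction
    if shading_gain is not None:
        img = img * shading_gain                     # lens shading correction
    for r, c in dead_pixels:                         # dead pixel compensation
        neighbors = [img[rr, cc]
                     for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                     if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]]
        img[r, c] = sum(neighbors) / len(neighbors)
    return img
```

Any subset of the three corrections can be applied, matching the "at least one of" wording above.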
  • the high dynamic range image processing method further includes: performing image post-processing on the first color high dynamic range image to obtain a second color high dynamic range image.
  • the image post-processing includes at least one of demosaicing, color correction, global tone mapping, and color conversion.
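Of the listed post-processing steps, global tone mapping is the easiest to sketch. The Reinhard-style operator below is a common choice and is an assumption here, not the patent's specified method:

```python
import numpy as np

def global_tone_map(luminance, white=None):
    """Reinhard-style global operator: L / (1 + L), compressing an
    unbounded radiance range into [0, 1). With `white` set, a luminance
    of exactly `white` maps to 1.0 (controlled highlight burn-out)."""
    lum = np.asarray(luminance, dtype=np.float64)
    if white is None:
        return lum / (1.0 + lum)
    return lum * (1.0 + lum / white**2) / (1.0 + lum)
```

Being a single curve applied to every pixel, it is "global" in the sense used above, as opposed to local operators that adapt per region.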
  • the high dynamic range image processing system includes a storage module.
  • the high dynamic range image processing method also includes: storing the preprocessed image in the storage module; and obtaining the preprocessed image from the storage module and performing high dynamic range processing on the preprocessed image to obtain the first color high dynamic range image.
  • the specific process of processing images by the high dynamic range image processing method of the embodiment of the present application is the same as the process of processing images by the high dynamic range image processing system 100 shown in FIG. 1, and will not be repeated here.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
  • when the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method of any one of the foregoing embodiments.
  • the pixel array 11 is exposed, where the pixel array 11 is exposed for a first exposure time to obtain a first original image, and the first original image includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time.
  • the first color original image data and the first panchromatic original image data are fused into a first color intermediate image containing only first color intermediate image data, and the second color original image data and the second panchromatic original image data are fused into a second color intermediate image containing only second color intermediate image data.
  • High dynamic range processing is performed on the first color intermediate image and the second color intermediate image to obtain the first color high dynamic range image.
  • when the computer program is executed by the processor 60, the processor 60 is caused to perform the following steps:
  • High dynamic range processing is performed on the preprocessed first color intermediate image and the preprocessed second color intermediate image to obtain the first color high dynamic range image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The invention relates to a high dynamic range image processing system and method (100), an electronic device (1000), and a readable storage medium (400). The high dynamic range image processing system (100) comprises an image sensor (10), an image fusion module (20), and a high dynamic range image processing module (30). The image sensor (10) obtains an original image. The image fusion module (20) and the high dynamic range image processing module (30) are configured to perform fusion algorithm processing and high dynamic range processing on the original image so as to obtain a first color high dynamic range image.
PCT/CN2020/119957 2020-05-08 2020-10-09 High dynamic range image processing system and method, electronic device and readable storage medium WO2021223364A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010384531.0A CN111586375B (zh) 2020-05-08 2020-05-08 High dynamic range image processing system and method, electronic device and readable storage medium
CN202010384531.0 2020-05-08

Publications (1)

Publication Number Publication Date
WO2021223364A1 true WO2021223364A1 (fr) 2021-11-11

Family

ID=72112041

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/119957 WO2021223364A1 (fr) 2020-05-08 2020-10-09 High dynamic range image processing system and method, electronic device and readable storage medium

Country Status (2)

Country Link
CN (1) CN111586375B (fr)
WO (1) WO2021223364A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114615438A (zh) * 2022-03-07 2022-06-10 江西合力泰科技有限公司 Method for compensating black spots on the surface of a camera chip
CN115118881A (zh) * 2022-06-24 2022-09-27 维沃移动通信有限公司 Signal processing circuit, image sensor, electronic device, and image processing method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111491111B (zh) * 2020-04-20 2021-03-26 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111586375B (zh) * 2020-05-08 2021-06-11 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN112351172B (zh) * 2020-10-26 2021-09-17 Oppo广东移动通信有限公司 Image processing method, camera assembly, and mobile terminal
WO2022222112A1 (fr) * 2021-04-22 2022-10-27 深圳市大疆创新科技有限公司 Data processing method, image sensor, image processor, and electronic device
CN118175441A (zh) * 2024-04-24 2024-06-11 荣耀终端有限公司 Image sensor, image processing method, electronic device, storage medium, and product

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101248659A (zh) * 2005-08-23 2008-08-20 伊斯曼柯达公司 Capturing images under varying lighting conditions
US20110090378A1 (en) * 2009-10-16 2011-04-21 Sen Wang Image deblurring using panchromatic pixels
CN102197641A (zh) * 2008-10-25 2011-09-21 全视科技有限公司 Improving defective color and panchromatic color filter array images
CN110740272A (zh) * 2019-10-31 2020-01-31 Oppo广东移动通信有限公司 Image acquisition method, camera assembly, and mobile terminal
CN111491111A (zh) * 2020-04-20 2020-08-04 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111586375A (zh) * 2020-05-08 2020-08-25 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8274715B2 (en) * 2005-07-28 2012-09-25 Omnivision Technologies, Inc. Processing color and panchromatic pixels
TW200820123A (en) * 2006-10-20 2008-05-01 Primax Electronics Ltd Method and system of generating high dynamic range image corresponding to specific scene
US7844127B2 (en) * 2007-03-30 2010-11-30 Eastman Kodak Company Edge mapping using panchromatic pixels
JP2012257193A (ja) * 2011-05-13 2012-12-27 Sony Corp 画像処理装置、撮像装置、および画像処理方法、並びにプログラム
US9071765B2 (en) * 2012-12-28 2015-06-30 Nvidia Corporation System, method, and computer program product implementing an image processing pipeline for high-dynamic range images
US9413992B2 (en) * 2013-05-20 2016-08-09 Omnivision Technologies, Inc. High dynamic range image sensor with full resolution recovery
JP2015011320A (ja) * 2013-07-02 2015-01-19 キヤノン株式会社 撮像装置及びその制御方法
CN107493431A (zh) * 2017-08-31 2017-12-19 努比亚技术有限公司 一种图像拍摄合成方法、终端及计算机可读存储介质
CN108288253B (zh) * 2018-01-08 2020-11-27 厦门美图之家科技有限公司 Hdr图像生成方法及装置
CN109360163A (zh) * 2018-09-26 2019-02-19 深圳积木易搭科技技术有限公司 一种高动态范围图像的融合方法及融合系统


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114615438A (zh) * 2022-03-07 2022-06-10 江西合力泰科技有限公司 Method for compensating black spots on the surface of a camera chip
CN114615438B (zh) * 2022-03-07 2023-09-15 江西合力泰科技有限公司 Method for compensating black spots on the surface of a camera chip
CN115118881A (zh) * 2022-06-24 2022-09-27 维沃移动通信有限公司 Signal processing circuit, image sensor, electronic device, and image processing method

Also Published As

Publication number Publication date
CN111586375B (zh) 2021-06-11
CN111586375A (zh) 2020-08-25

Similar Documents

Publication Publication Date Title
WO2021212763A1 High dynamic range image processing system and method, electronic device, and readable storage medium
WO2021196554A1 Image sensor, image processing system and method, electronic device, and storage medium
WO2021208593A1 High dynamic range image processing system and method, electronic device, and storage medium
WO2021223364A1 High dynamic range image processing system and method, electronic device, and readable storage medium
WO2021196553A1 High dynamic range image processing system and method, electronic device, and readable storage medium
US20230017746A1 Image acquisition method, electronic device, and non-transitory computer-readable storage medium
US11812164B2 Pixel-interpolation based image acquisition method, camera assembly, and mobile terminal
WO2022007215A1 Image acquisition method, camera assembly, and mobile terminal
CN111970460B High dynamic range image processing system and method, electronic device, and readable storage medium
CN112738493B Image processing method, image processing apparatus, electronic device, and readable storage medium
CN114073068B Image acquisition method, camera assembly, and mobile terminal
CN111970461B High dynamic range image processing system and method, electronic device, and readable storage medium
CN112822475B Image processing method, image processing apparatus, terminal, and readable storage medium
CN111970459B High dynamic range image processing system and method, electronic device, and readable storage medium
WO2022088310A1 Image processing method, camera assembly, and mobile terminal
CN111835971B Image processing method, image processing system, electronic device, and readable storage medium
CN111031297B Image sensor, control method, camera assembly, and mobile terminal
CN112738494B Image processing method, image processing system, terminal device, and readable storage medium
WO2022141743A1 Image processing method, image processing system, electronic device, and readable storage medium
CN112235485B Image sensor, image processing method, imaging apparatus, terminal, and readable storage medium
US20220279108A1 (en) Image sensor and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20934854

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20934854

Country of ref document: EP

Kind code of ref document: A1