CN111970459B - High dynamic range image processing system and method, electronic device, and readable storage medium


Info

Publication number
CN111970459B
CN111970459B
Authority
CN
China
Prior art keywords
image
color
dynamic range
high dynamic
range image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010808125.2A
Other languages
Chinese (zh)
Other versions
CN111970459A (en)
Inventor
杨鑫
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010808125.2A priority Critical patent/CN111970459B/en
Publication of CN111970459A publication Critical patent/CN111970459A/en
Application granted granted Critical
Publication of CN111970459B publication Critical patent/CN111970459B/en

Classifications

    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/84: Camera processing pipelines; components thereof for processing colour signals
    • H04N23/843: Demosaicing, e.g. interpolating colour pixel values
    • H04N23/88: Processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N25/58: Control of the dynamic range involving two or more exposures
    • H04N25/611: Correction of chromatic aberration
    • H04N9/646: Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/92
    • G06T2207/10024: Color image
    • G06T2207/10144: Varying exposure
    • G06T2207/20208: High dynamic range [HDR] image processing
    • G06T2207/20221: Image fusion; image merging

Abstract

The application discloses a high dynamic range image processing system and method, an electronic device, and a readable storage medium. The high dynamic range image processing system comprises an image sensor, a high dynamic range image processing module, an image processor, and an image fusion module. When the pixel array in the image sensor is exposed, within the same subunit at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time. The high dynamic range image processing module performs high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image; the image processor processes the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image; and the image fusion module fuses the first color high dynamic range image and the second color high dynamic range image to obtain a target image.

Description

High dynamic range image processing system and method, electronic device, and readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a high dynamic range image processing system, a high dynamic range image processing method, an electronic device, and a non-volatile computer-readable storage medium.
Background
Electronic equipment such as a mobile phone may be provided with a camera to implement a photographing function. An image sensor for receiving light can be arranged in the camera, and a filter array may be disposed in the image sensor. To improve the quality of the captured image, panchromatic photosensitive pixels are usually added to the filter array; the parameters of the image processor then need to be changed to process the image signal output by the image sensor, which increases cost and design difficulty and is unfavorable for the mass production of products.
Disclosure of Invention
The embodiment of the application provides a high dynamic range image processing system, a high dynamic range image processing method, an electronic device and a non-volatile computer readable storage medium.
The embodiment of the application provides a high dynamic range image processing system. The high dynamic range image processing system comprises an image sensor, a high dynamic range image processing module, an image processor, and an image fusion module. The image sensor includes a pixel array comprising a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels. The pixel array includes minimal repeating units, each minimal repeating unit includes a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The pixel array in the image sensor is exposed; among the plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time less than the first exposure time, and at least one panchromatic photosensitive pixel is exposed for a third exposure time less than the first exposure time. First color information generated by the single-color photosensitive pixels exposed for the first exposure time is used to obtain a first color original image, second color information generated by the single-color photosensitive pixels exposed for the second exposure time is used to obtain a second color original image, and the panchromatic photosensitive pixels exposed for the third exposure time are used to generate a first panchromatic original image.
The high dynamic range image processing module is used for performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels and the plurality of first color image pixels are arranged in a Bayer array. The image processor is configured to process the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image. The image fusion module is configured to perform fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image.
The embodiment of the application provides a high dynamic range image processing method. The high dynamic range image processing method is used in a high dynamic range image processing system. The high dynamic range image processing system includes an image sensor. The image sensor includes a pixel array comprising a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels. The pixel array includes minimal repeating units, each minimal repeating unit includes a plurality of subunits, and each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels. The high dynamic range image processing method includes exposing the pixel array, wherein, for the plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time less than the first exposure time, and at least one panchromatic photosensitive pixel is exposed for a third exposure time less than the first exposure time. First color information generated by the single-color photosensitive pixels exposed for the first exposure time is used to obtain a first color original image, second color information generated by the single-color photosensitive pixels exposed for the second exposure time is used to obtain a second color original image, and the panchromatic photosensitive pixels exposed for the third exposure time are used to generate a first panchromatic original image.
The high dynamic range image processing method further includes: performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels and the plurality of first color image pixels are arranged in a Bayer array; processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image; and performing fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image.
The embodiment of the application provides an electronic device. The electronic device comprises a lens, a housing, and the high dynamic range image processing system described above. The lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
The present embodiments provide a non-transitory computer-readable storage medium containing a computer program. The computer program, when executed by a processor, causes the processor to perform the high dynamic range image processing method described above.
In the high dynamic range image processing system, the high dynamic range image processing method, the electronic device, and the non-volatile computer-readable storage medium according to the embodiments of the present application, the high dynamic range image processing module and the image processor perform high dynamic range image processing and image processing on the full-color raw image and the color raw images output by the image sensor to obtain a first color high dynamic range image arranged in a Bayer array and a second color high dynamic range image including full-color information, and the image fusion module fuses the first color high dynamic range image and the second color high dynamic range image to obtain a target image with a high dynamic range. Thus, the image quality of the high dynamic range image can be improved, and the image can be processed without changing the parameters of the image processor.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a pixel array according to an embodiment of the present disclosure;
FIG. 3 is a schematic cross-sectional view of a light-sensitive pixel according to an embodiment of the present application;
FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present disclosure;
FIG. 5 is a schematic layout diagram of a minimum repeating unit in a pixel array according to an embodiment of the present disclosure;
FIG. 6 is a schematic layout diagram of a minimum repeating unit in yet another pixel array according to an embodiment of the present disclosure;
FIG. 7 is a schematic layout diagram of a minimum repeating unit in yet another pixel array according to an embodiment of the present disclosure;
FIG. 8 is a schematic layout diagram of a minimum repeating unit in a pixel array according to another embodiment of the present disclosure;
FIG. 9 is a schematic layout diagram of a minimum repeating unit in a pixel array according to another embodiment of the present disclosure;
FIG. 10 is a schematic diagram illustrating an arrangement of minimum repeating units in a pixel array according to another embodiment of the present disclosure;
FIG. 11 is a schematic diagram of an original image output by an image sensor according to an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of yet another high dynamic range image processing system in accordance with an embodiment of the present application;
FIG. 13 is a schematic diagram of a luminance alignment process according to an embodiment of the present application;
FIG. 14 is a schematic illustration of a high dynamic range image processing principle of an embodiment of the present application;
FIG. 15 is a schematic view of a lens shading correction process according to an embodiment of the present application;
FIGS. 16 to 17 are schematic diagrams of a method for acquiring a second color high dynamic range image according to an embodiment of the present application;
FIG. 18 is a schematic diagram of yet another high dynamic range image processing system in accordance with an embodiment of the present application;
FIGS. 19 to 24 are schematic diagrams of a fusion algorithm according to embodiments of the present application;
FIG. 25 is a schematic diagram of a raw image output by yet another image sensor according to an embodiment of the present application;
FIG. 26 is a schematic diagram of yet another high dynamic range image processing principle of an embodiment of the present application;
FIG. 27 is a schematic diagram of another embodiment of the present application for obtaining a second color high dynamic range image;
FIG. 28 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 29 is a schematic flow chart of a high dynamic range image acquisition method according to an embodiment of the present application;
FIG. 30 is a schematic diagram of an interaction between a non-volatile computer-readable storage medium and a processor according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1, the present disclosure provides a high dynamic range image processing system 100. The high dynamic range image processing system 100 includes an image sensor 10, a high dynamic range image processing module 20, an image processor 30, and an image fusion module 40. The image sensor 10 includes a pixel array 11. The pixel array 11 includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels. The pixel array 11 includes minimal repeating units, each of which includes a plurality of subunits, each of which includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array 11 in the image sensor 10 is exposed; for the plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time that is less than the first exposure time. The first color information generated by the single-color photosensitive pixels exposed for the first exposure time is used to obtain a first color original image, the second color information generated by the single-color photosensitive pixels exposed for the second exposure time is used to obtain a second color original image, and the full-color photosensitive pixels exposed for the third exposure time are used to generate a first full-color original image.
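The splitting of one sensor readout into the three raw images can be sketched in code. The exposure-assignment mask below is a hypothetical layout chosen only to satisfy the stated constraints (each 2 x 2 subunit holds at least one long-exposure and one short-exposure single-color pixel plus panchromatic pixels); the actual assignment is sensor-specific.

```python
# Hypothetical exposure-assignment mask for one 4 x 4 minimal repeating unit:
# 'L' = single-color pixel, first (long) exposure time
# 'S' = single-color pixel, second (short) exposure time
# 'W' = panchromatic pixel, third exposure time
# Each 2 x 2 subunit contains one L, one S, and two W pixels; the concrete
# layout here is illustrative only.
EXPOSURE_UNIT = [
    "WLWL",
    "SWSW",
    "WLWL",
    "SWSW",
]

TAGS = {"L": "first_color", "S": "second_color", "W": "panchromatic"}

def split_by_exposure(mosaic):
    """Separate one sensor readout (a list of rows) into the three raw
    images described in the text; unselected positions stay None."""
    h, w = len(mosaic), len(mosaic[0])
    out = {name: [[None] * w for _ in range(h)] for name in TAGS.values()}
    for y in range(h):
        for x in range(w):
            tag = EXPOSURE_UNIT[y % 4][x % 4]
            out[TAGS[tag]][y][x] = mosaic[y][x]
    return out
```

Each of the three sparse images can then be fed to the downstream modules independently.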
The high dynamic range image processing module 20 is configured to perform high dynamic range image processing on the first color raw image and the second color raw image to obtain a first color high dynamic range image, where the first color high dynamic range image includes a plurality of first color image pixels and the plurality of first color image pixels are arranged in a Bayer array. The image processor 30 is configured to process the first color high dynamic range image and the first full-color raw image to obtain a second color high dynamic range image. The image fusion module 40 is configured to perform fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image.
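One common way a module like this can merge a long-exposure and a short-exposure color image is to keep the long exposure where it is unsaturated and substitute the ratio-scaled short exposure where it clips. The per-pixel sketch below illustrates that general idea only; the 10-bit full scale and the exposure ratio are assumed values, not parameters given by the patent.

```python
def merge_hdr(long_px, short_px, ratio, full_scale=1023):
    """Per-pixel HDR merge sketch: clipped long-exposure values are
    replaced by the short exposure scaled by the exposure-time ratio,
    extending the representable dynamic range."""
    if long_px >= full_scale:       # long exposure is saturated here
        return short_px * ratio     # recover the highlight from the short exposure
    return long_px

# Exposure ratio 4 (e.g. first exposure 40 ms, second exposure 10 ms):
assert merge_hdr(500, 130, 4) == 500    # unsaturated: keep the long exposure
assert merge_hdr(1023, 300, 4) == 1200  # saturated: use the scaled short exposure
```

Note that merged values may exceed the sensor's native full scale; that extended range is exactly what the downstream modules operate on.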
The high dynamic range image processing system 100 according to the embodiment of the present application performs high dynamic range image processing and image processing on the full-color raw image and the color raw image output by the image sensor 10 through the high dynamic range image processing module 20 and the image processor 30 to obtain a first color high dynamic range image arranged in a Bayer array and a second color high dynamic range image including panchromatic information, and then the image fusion module 40 fuses the first color high dynamic range image and the second color high dynamic range image to obtain a target image with a high dynamic range. In this way, the image quality of the high dynamic range image can be improved, and the image can be processed without changing the parameters of the image processor 30.
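The final fusion stage can only be sketched generically here: the patent details its fusion algorithm with reference to FIGS. 19 to 24, so the per-pixel weighted blend below is a placeholder illustration, not the actual method.

```python
def fuse(first_hdr, second_hdr, weight=0.5):
    """Placeholder fusion: per-pixel weighted blend of the first color
    high dynamic range image with the second one (which carries the
    panchromatic information). Illustrative only."""
    return [(1 - weight) * a + weight * b
            for a, b in zip(first_hdr, second_hdr)]

target = fuse([0, 2, 4], [2, 4, 6])   # -> [1.0, 3.0, 5.0]
```

A real fusion would typically weight by local detail or luminance rather than a constant, but the data flow (two HDR images in, one target image out) is the same.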
The present application is further described below with reference to the accompanying drawings.
Fig. 2 is a schematic diagram of the image sensor 10 in the embodiment of the present application. The image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14, and a horizontal driving unit 15.
For example, the image sensor 10 may employ a Complementary Metal Oxide Semiconductor (CMOS) photosensitive element or a Charge-coupled Device (CCD) photosensitive element.
For example, the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in fig. 3) two-dimensionally arranged in an array form (i.e., arranged in a two-dimensional matrix form), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in fig. 4). Each photosensitive pixel 110 converts light into an electric charge according to the intensity of light incident thereon.
For example, the vertical driving unit 12 includes a shift register and an address decoder, and has readout scanning and reset scanning functions. Readout scanning refers to sequentially scanning the unit photosensitive pixels 110 row by row and reading signals from them row by row. For example, the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14. Reset scanning resets the charges: the photocharges of the photoelectric conversion elements are discarded so that accumulation of new photocharges can begin.
The signal processing performed by the column processing unit 14 is, for example, correlated double sampling (CDS) processing. In CDS processing, the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated; the signals of the photosensitive pixels 110 in one row are thereby obtained. The column processing unit 14 may also have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
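The CDS step described above amounts to subtracting the two samples taken from each pixel, which removes offsets common to both samples. A tiny sketch (the sample values are arbitrary):

```python
def correlated_double_sample(reset_level, signal_level):
    """Correlated double sampling: output the difference between the
    reset-level and signal-level samples of one photosensitive pixel."""
    return reset_level - signal_level

# An offset shared by both samples (e.g. a per-column amplifier offset)
# cancels in the difference:
offset = 37
assert correlated_double_sample(1000 + offset, 700 + offset) == 300
```

This is why CDS suppresses reset noise and fixed-pattern offsets: anything identical in the two samples drops out of the difference.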
The horizontal driving unit 15 includes, for example, a shift register and an address decoder. The horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Each photosensitive pixel column is sequentially processed by the column processing unit 14 by a selective scanning operation performed by the horizontal driving unit 15, and is sequentially output.
For example, the control unit 13 configures timing signals according to the operation mode, and controls the vertical driving unit 12, the column processing unit 14, and the horizontal driving unit 15 to cooperatively operate using a variety of timing signals.
Fig. 3 is a schematic diagram of a photosensitive pixel 110 according to an embodiment of the present disclosure. The photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a microlens 113. The microlens 113, the filter 112, and the pixel circuit 111 are sequentially disposed along the light receiving direction of the photosensitive pixel 110. The micro-lens 113 is used for converging light, and the optical filter 112 is used for allowing light of a certain wavelength band to pass through and filtering light of other wavelength bands. The pixel circuit 111 is configured to convert the received light into an electric signal and supply the generated electric signal to the column processing unit 14 shown in fig. 2.
Fig. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 according to an embodiment of the disclosure. The pixel circuit 111 of fig. 4 may be implemented in each photosensitive pixel 110 (shown in fig. 3) in the pixel array 11 shown in fig. 2. The operation principle of the pixel circuit 111 is described below with reference to fig. 2 to 4.
As shown in fig. 4, the pixel circuit 111 includes a photoelectric conversion element 1111 (e.g., a photodiode), an exposure control circuit (e.g., a transfer transistor 1112), a reset circuit (e.g., a reset transistor 1113), an amplification circuit (e.g., an amplification transistor 1114), and a selection circuit (e.g., a selection transistor 1115). In the embodiment of the present application, the transfer transistor 1112, the reset transistor 1113, the amplification transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.
The photoelectric conversion element 1111 includes, for example, a photodiode, and an anode of the photodiode is connected to, for example, ground. The photodiode converts the received light into electric charges. The cathode of the photodiode is connected to the floating diffusion FD via an exposure control circuit (e.g., transfer transistor 1112). The floating diffusion FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
For example, the exposure control circuit is a transfer transistor 1112, and the control terminal TG of the exposure control circuit is a gate of the transfer transistor 1112. When a pulse of an effective level (for example, VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on. The transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.
For example, the drain of the reset transistor 1113 is connected to the pixel power supply VPIX. The source of the reset transistor 1113 is connected to the floating diffusion FD. Before the electric charges are transferred from the photodiode to the floating diffusion FD, a pulse of an active reset level is transmitted to the gate of the reset transistor 1113 via the reset line, and the reset transistor 1113 is turned on. The reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
For example, the gate of the amplification transistor 1114 is connected to the floating diffusion FD. The drain of the amplification transistor 1114 is connected to the pixel power supply VPIX. After the floating diffusion FD is reset by the reset transistor 1113, the amplification transistor 1114 outputs a reset level through the output terminal OUT via the selection transistor 1115. After the charge of the photodiode is transferred by the transfer transistor 1112, the amplification transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
For example, the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114. The source of the selection transistor 1115 is connected to the column processing unit 14 in fig. 2 through the output terminal OUT. When a pulse of an effective level is transmitted to the gate of the selection transistor 1115 through a selection line, the selection transistor 1115 is turned on. The signal output from the amplification transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.
It should be noted that the pixel structure of the pixel circuit 111 in the embodiment of the present application is not limited to the structure shown in fig. 4. For example, the pixel circuit 111 may also have a three-transistor pixel structure in which the functions of the amplification transistor 1114 and the selection transistor 1115 are performed by one transistor. Likewise, the exposure control circuit is not limited to a single transfer transistor 1112; other electronic devices or structures whose conduction can be controlled via a control terminal may serve as the exposure control circuit. Using a single transfer transistor 1112 is, however, simple, low-cost, and easy to control.
Fig. 5-10 are schematic diagrams illustrating the arrangement of photosensitive pixels 110 (shown in fig. 3) in the pixel array 11 (shown in fig. 2) according to some embodiments of the present disclosure. The photosensitive pixels 110 are of two types: full-color photosensitive pixels W and color photosensitive pixels. Fig. 5 to 10 show only the arrangement of the plurality of photosensitive pixels 110 in one minimal repeating unit. The pixel array 11 can be formed by repeating the minimal repeating unit shown in fig. 5 to 10 multiple times in rows and columns. Each minimal repeating unit is composed of a plurality of full-color photosensitive pixels W and a plurality of color photosensitive pixels, and includes a plurality of subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels W. In the minimal repeating units shown in fig. 5 to 8, the full-color photosensitive pixels W and the color photosensitive pixels in each subunit are alternately disposed. In the minimal repeating units shown in fig. 9 and 10, the plurality of photosensitive pixels 110 in the same row of a subunit may be of the same category; alternatively, the plurality of photosensitive pixels 110 in the same column may be of the same category.
Specifically, for example, fig. 5 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in the minimal repeating unit according to an embodiment of the present application.
The minimal repeating unit comprises 16 photosensitive pixels 110 arranged in 4 rows and 4 columns, and each sub-unit comprises 4 photosensitive pixels 110 arranged in 2 rows and 2 columns. The arrangement is as follows:
W A W B
A W B W
W B W C
B W C W
(arrangement of fig. 5, reconstructed from the surrounding description; the original drawing is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
For example, as shown in fig. 5, the full-color photosensitive pixels W and the single-color photosensitive pixels are alternately arranged for each sub-unit.
For example, as shown in FIG. 5, the categories of subunits include three categories. The first-type subunit UA comprises a plurality of full-color photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-type sub-unit UB includes a plurality of full-color photosensitive pixels W and a plurality of second-color photosensitive pixels B; the third type of sub-unit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit comprises four subunits, namely a first subunit UA, two second subunits UB and a third subunit UC. Wherein, a first sub-unit UA and a third sub-unit UC are arranged in a first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner in fig. 5), and two second sub-units UB are arranged in a second diagonal direction D2 (for example, the direction connecting the upper right corner and the lower left corner in fig. 5). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal line and the second diagonal line intersect, such as being perpendicular.
In other embodiments, the first diagonal direction D1 may be a direction connecting an upper right corner and a lower left corner, and the second diagonal direction D2 may be a direction connecting an upper left corner and a lower right corner. In addition, the "direction" herein is not a single direction, and may be understood as a concept of "straight line" indicating arrangement, and may have a bidirectional direction of both ends of the straight line. The following explanations of the first diagonal direction D1 and the second diagonal direction D2 in fig. 6 to 10 are the same as here.
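As a sketch, the tiling of the minimal repeating unit into the pixel array 11 can be expressed in a few lines of Python. The 4 × 4 matrix is the fig. 5 layout as described above; the helper name `tile_pixel_array` and the use of NumPy are illustrative assumptions, not part of the application:

```python
import numpy as np

# Fig. 5 minimal repeating unit: 'W' marks a panchromatic photosensitive
# pixel; 'A', 'B', 'C' mark the first, second, and third color
# photosensitive pixels. Sub-units UA/UC sit on the first diagonal and
# the two UB sub-units on the second diagonal.
MINIMAL_UNIT = np.array([
    ['W', 'A', 'W', 'B'],
    ['A', 'W', 'B', 'W'],
    ['W', 'B', 'W', 'C'],
    ['B', 'W', 'C', 'W'],
])

def tile_pixel_array(rows_units, cols_units, unit=MINIMAL_UNIT):
    """Repeat the minimal repeating unit in rows and columns to form
    (a patch of) the layout of the pixel array 11."""
    return np.tile(unit, (rows_units, cols_units))

array = tile_pixel_array(2, 2)   # an 8x8 patch of the pixel array
```

Note that in any such tiling exactly half of the photosensitive pixels are panchromatic W pixels, which is what gives the array its extra sensitivity.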
For another example, fig. 6 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in a minimal repeating unit according to another embodiment of the present disclosure. The minimal repeating unit comprises 36 photosensitive pixels 110 arranged in 6 rows and 6 columns, and each sub-unit comprises 9 photosensitive pixels 110 arranged in 3 rows and 3 columns. The arrangement is as follows:
(the arrangement matrix of fig. 6 appears as a drawing in the original publication and is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
For example, as shown in fig. 6, the full-color photosensitive pixels W and the single-color photosensitive pixels are alternately arranged for each sub-unit.
For example, as shown in FIG. 6, the categories of subunits include three categories. The first-type subunit UA comprises a plurality of full-color photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-type sub-unit UB includes a plurality of full-color photosensitive pixels W and a plurality of second-color photosensitive pixels B; the third type of sub-unit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit comprises four subunits, namely a first subunit UA, two second subunits UB and a third subunit UC. Wherein, a first sub-unit UA and a third sub-unit UC are arranged in a first diagonal direction D1, and two second sub-units UB are arranged in a second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal line and the second diagonal line intersect, such as being perpendicular.
For another example, fig. 7 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in a minimal repeating unit according to another embodiment of the present application. The minimal repeating unit comprises 64 photosensitive pixels 110 arranged in 8 rows and 8 columns, and each sub-unit comprises 16 photosensitive pixels 110 arranged in 4 rows and 4 columns. The arrangement is as follows:
(the arrangement matrix of fig. 7 appears as a drawing in the original publication and is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
For example, as shown in fig. 7, the full-color photosensitive pixels W and the single-color photosensitive pixels are alternately arranged for each sub-unit.
For example, as shown in FIG. 7, the categories of subunits include three categories. The first-type subunit UA comprises a plurality of full-color photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-type sub-unit UB includes a plurality of full-color photosensitive pixels W and a plurality of second-color photosensitive pixels B; the third type of sub-unit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit comprises four subunits, namely a first subunit UA, two second subunits UB and a third subunit UC. Wherein, a first sub-unit UA and a third sub-unit UC are arranged in a first diagonal direction D1, and two second sub-units UB are arranged in a second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal line and the second diagonal line intersect, such as being perpendicular.
Specifically, for example, fig. 8 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in a minimal repeating unit according to still another embodiment of the present application. The minimal repeating unit comprises 16 photosensitive pixels 110 arranged in 4 rows and 4 columns, and each sub-unit comprises 4 photosensitive pixels 110 arranged in 2 rows and 2 columns. The arrangement is as follows:
W A W B
A W B W
B W C W
W B W C
(arrangement of fig. 8, reconstructed from the surrounding description; the original drawing is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
The arrangement of the photosensitive pixels 110 in the minimal repeating unit shown in fig. 8 is substantially the same as that shown in fig. 5, except that the alternation order of the panchromatic photosensitive pixels W and the single-color photosensitive pixels differs in two sub-units: the second-type sub-unit UB at the lower left corner and the third-type sub-unit UC at the lower right corner. Specifically, in the second-type sub-unit UB at the lower left corner of fig. 5, the photosensitive pixels 110 in the first row alternate as a full-color photosensitive pixel W and a single-color photosensitive pixel (i.e., a second-color photosensitive pixel B), and those in the second row alternate as a single-color photosensitive pixel (B) and a full-color photosensitive pixel W; in the second-type sub-unit UB at the lower left corner of fig. 8, the photosensitive pixels 110 in the first row alternate as a single-color photosensitive pixel (B) and a full-color photosensitive pixel W, and those in the second row alternate as a full-color photosensitive pixel W and a single-color photosensitive pixel (B).
In the third-type sub-unit UC at the lower right corner of fig. 5, the photosensitive pixels 110 in the first row alternate as a full-color photosensitive pixel W and a single-color photosensitive pixel (i.e., a third-color photosensitive pixel C), and those in the second row alternate as a single-color photosensitive pixel (C) and a full-color photosensitive pixel W; in the third-type sub-unit UC at the lower right corner of fig. 8, the photosensitive pixels 110 in the first row alternate as a single-color photosensitive pixel (C) and a full-color photosensitive pixel W, and those in the second row alternate as a full-color photosensitive pixel W and a single-color photosensitive pixel (C).
As shown in fig. 8, the alternating order of the full-color photosensitive pixels W and the single-color photosensitive pixels in the first-type sub-unit UA in fig. 8 does not coincide with the alternating order of the full-color photosensitive pixels W and the single-color photosensitive pixels in the third-type sub-unit UC. Specifically, in the first type subunit UA shown in fig. 8, the photosensitive pixels 110 in the first row are sequentially and alternately a full-color photosensitive pixel W and a single-color photosensitive pixel (i.e., first-color photosensitive pixel a), and the photosensitive pixels 110 in the second row are sequentially and alternately a single-color photosensitive pixel (i.e., first-color photosensitive pixel a) and a full-color photosensitive pixel W; in the third sub-unit UC shown in fig. 8, the photosensitive pixels 110 in the first row are alternately arranged as a single-color photosensitive pixel (i.e., the third-color photosensitive pixel C) and a full-color photosensitive pixel W, and the photosensitive pixels 110 in the second row are alternately arranged as a full-color photosensitive pixel W and a single-color photosensitive pixel (i.e., the third-color photosensitive pixel C). That is, the alternating order of the full-color photosensitive pixels W and the color photosensitive pixels in different sub-units in the same minimal repeating unit may be uniform (as shown in fig. 5) or non-uniform (as shown in fig. 8).
For another example, fig. 9 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in a minimal repeating unit according to another embodiment of the present application. The minimal repeating unit comprises 16 photosensitive pixels 110 arranged in 4 rows and 4 columns, and each sub-unit comprises 4 photosensitive pixels 110 arranged in 2 rows and 2 columns. The arrangement is as follows:
(the arrangement matrix of fig. 9 appears as a drawing in the original publication and is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
For example, as shown in fig. 9, for each sub-unit, the plurality of photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category. Here, "the same category" means the pixels are: (1) all panchromatic photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; or (4) all third-color photosensitive pixels C.
For example, as shown in FIG. 9, the categories of subunits include three categories. The first-type subunit UA comprises a plurality of full-color photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-type sub-unit UB includes a plurality of full-color photosensitive pixels W and a plurality of second-color photosensitive pixels B; the third type of sub-unit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit comprises four subunits, namely a first subunit UA, two second subunits UB and a third subunit UC. Wherein, a first sub-unit UA and a third sub-unit UC are arranged in a first diagonal direction D1, and two second sub-units UB are arranged in a second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal line and the second diagonal line intersect, such as being perpendicular.
For another example, fig. 10 is a schematic layout diagram of the photosensitive pixels 110 (shown in fig. 3) in a minimal repeating unit according to another embodiment of the present application. The minimal repeating unit comprises 16 photosensitive pixels 110 arranged in 4 rows and 4 columns, and each sub-unit comprises 4 photosensitive pixels 110 arranged in 2 rows and 2 columns. The arrangement is as follows:
(the arrangement matrix of fig. 10 appears as a drawing in the original publication and is not reproduced here)
w denotes a full-color photosensitive pixel; a denotes a first color-sensitive pixel of the plurality of color-sensitive pixels; b denotes a second color-sensitive pixel of the plurality of color-sensitive pixels; c denotes a third color-sensitive pixel of the plurality of color-sensitive pixels.
For example, as shown in fig. 10, for each sub-unit, the plurality of photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category. Here, "the same category" means the pixels are: (1) all panchromatic photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; or (4) all third-color photosensitive pixels C.
For example, as shown in FIG. 10, the categories of subunits include three categories. The first-type subunit UA comprises a plurality of full-color photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-type sub-unit UB includes a plurality of full-color photosensitive pixels W and a plurality of second-color photosensitive pixels B; the third type of sub-unit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit comprises four subunits, namely a first subunit UA, two second subunits UB and a third subunit UC. Wherein, a first sub-unit UA and a third sub-unit UC are arranged in a first diagonal direction D1, and two second sub-units UB are arranged in a second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal line and the second diagonal line intersect, such as being perpendicular.
For example, in other embodiments, in the same minimum repeating unit, the plurality of photosensitive pixels 110 in the same row in some sub-units may be photosensitive pixels 110 in the same category, and the plurality of photosensitive pixels 110 in the same column in the remaining sub-units may be photosensitive pixels 110 in the same category.
For example, as shown in the minimum repeating unit of fig. 5 to 10, the first color-sensitive pixel a may be a red-sensitive pixel R; the second color sensitive pixel B may be a green sensitive pixel G; the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
For example, as shown in the minimum repeating unit of fig. 5 to 10, the first color-sensitive pixel a may be a red-sensitive pixel R; the second color photosensitive pixel B may be a yellow photosensitive pixel Y; the third color photosensitive pixel C may be a blue photosensitive pixel Bu.
For example, as shown in the minimum repeating unit of fig. 5 to 10, the first color-sensitive pixel a may be a magenta-sensitive pixel M; the second color photosensitive pixel B may be a cyan photosensitive pixel Cy; the third color photosensitive pixel C may be a yellow photosensitive pixel Y.
It is noted that in some embodiments, the response band of the full-color photosensitive pixel W may be the visible band (e.g., 400 nm-760 nm). For example, an infrared filter may be disposed on the panchromatic photosensitive pixel W to filter out infrared light. In other embodiments, the response band of the panchromatic photosensitive pixel W covers the visible and near-infrared bands (e.g., 400 nm-1000 nm), matching the response band of the photoelectric conversion element 1111 (shown in fig. 4) in the image sensor 10 (shown in fig. 1). For example, the full-color photosensitive pixel W may have no filter, or a filter that passes light of all wavelength bands; its response band is then determined by, and thus matches, the response band of the photoelectric conversion element 1111. Embodiments of the present application include, but are not limited to, the above band ranges.
For convenience of description, the following embodiments will be described with the first single-color photosensitive pixel A being a red photosensitive pixel R, the second single-color photosensitive pixel B being a green photosensitive pixel G, and the third single-color photosensitive pixel C being a blue photosensitive pixel Bu.
Referring to fig. 1 to 3, fig. 5 and fig. 11, in some embodiments, the control unit 13 controls the exposure of the pixel array 11. For the plurality of photosensitive pixels 110 in the same sub-unit, at least one single-color photosensitive pixel is exposed with a first exposure time, at least one single-color photosensitive pixel is exposed with a second exposure time less than the first exposure time, and at least one full-color photosensitive pixel W is exposed with a third exposure time less than or equal to the first exposure time. The plurality of single-color photosensitive pixels in the pixel array 11 exposed with the first exposure time may generate first color information, the plurality of single-color photosensitive pixels exposed with the second exposure time may generate second color information, and the plurality of panchromatic photosensitive pixels W exposed with the third exposure time may generate panchromatic information. The first color information may form a first color original image, the second color information may form a second color original image, and the panchromatic information may form a panchromatic original image.
In one embodiment, as shown in fig. 11, all of the panchromatic photosensitive pixels W in the pixel array 11 are exposed with the third exposure time. The third exposure time may be greater than the second exposure time, so that all the panchromatic photosensitive pixels W are exposed with the medium exposure time M; alternatively, the third exposure time may be equal to the first exposure time, so that all the panchromatic photosensitive pixels W are exposed with the long exposure time L; the third exposure time may also be less than or equal to the second exposure time, so that the panchromatic photosensitive pixels W are exposed with the short exposure time S, which is not limited herein. The following takes as an example the case in which the third exposure time is greater than the second exposure time, i.e., all the panchromatic photosensitive pixels W are exposed with the medium exposure time M. Specifically, for the plurality (4 in fig. 11) of photosensitive pixels 110 (shown in fig. 3) in each sub-unit, one single-color photosensitive pixel is exposed with the first exposure time (e.g., the long exposure time L in fig. 11), one single-color photosensitive pixel is exposed with the second exposure time (e.g., the short exposure time S in fig. 11), and both panchromatic photosensitive pixels W are exposed with the third exposure time (e.g., the medium exposure time M in fig. 11).
It should be noted that, in some embodiments, the exposure of the pixel array 11 may proceed in any of the following ways: (1) the photosensitive pixels 110 exposed with the first exposure time, those exposed with the second exposure time, and those exposed with the third exposure time are exposed one after another (in any order), with their exposure periods not overlapping; (2) the three groups are exposed one after another (in any order), with their exposure periods partially overlapping; (3) the exposure periods of all the photosensitive pixels 110 exposed with a shorter exposure time fall within the exposure period of the photosensitive pixels 110 exposed with the longest exposure time; for example, the exposure periods of all the single-color photosensitive pixels exposed with the second exposure time, and of all the panchromatic photosensitive pixels W exposed with the third exposure time, fall within the exposure period of the single-color photosensitive pixels exposed with the first exposure time. In the embodiment of the present application, the pixel array 11 adopts the third exposure mode, which can shorten the overall exposure time required by the pixel array 11 and is beneficial to increasing the image frame rate.
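The exposure assignment described above for the fig. 11 scheme can be sketched as follows. Which of the two color pixels in a sub-unit receives the long exposure is not fixed by the text, so placing the long-exposed color pixel in the upper row of each 2 × 2 sub-unit is an assumption for illustration:

```python
# Sketch of the fig. 11 exposure assignment for one minimal repeating
# unit: panchromatic W pixels get the medium exposure M (third exposure
# time); in each 2x2 sub-unit, one color pixel gets the long exposure L
# (first exposure time) and the other the short exposure S (second
# exposure time).
LAYOUT = [
    ['W', 'A', 'W', 'B'],
    ['A', 'W', 'B', 'W'],
    ['W', 'B', 'W', 'C'],
    ['B', 'W', 'C', 'W'],
]

def exposure_map(layout):
    """Return the per-pixel exposure label ('L', 'M', or 'S')."""
    times = [['M'] * len(row) for row in layout]   # default: W pixels -> M
    for r, row in enumerate(layout):
        for c, kind in enumerate(row):
            if kind != 'W':
                # assumed convention: color pixel in the upper row of the
                # 2x2 sub-unit -> long exposure, lower row -> short
                times[r][c] = 'L' if r % 2 == 0 else 'S'
    return times

emap = exposure_map(LAYOUT)
```

With this convention every sub-unit contains exactly one long-exposed color pixel, one short-exposed color pixel, and two medium-exposed W pixels, as the text requires.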
After the exposure of the pixel array 11 is completed, the image sensor 10 can output three original images, which are: (1) a first color original image composed of first color information generated by a plurality of single-color photosensitive pixels exposed with a long exposure time L (first exposure time); (2) a second color original image composed of second color information generated by a plurality of single-color photosensitive pixels exposed with a short exposure time S (second exposure time); (3) a first full-color original image is composed of first full-color information generated by the plurality of full-color photosensitive pixels W exposed at the medium exposure time M (third exposure time).
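Gathering the three original images from a single readout can then be sketched as below; the layout and exposure assignment are the same illustrative assumptions as above, and `split_readout` is a hypothetical helper name:

```python
import numpy as np

LAYOUT = [
    ['W', 'A', 'W', 'B'],
    ['A', 'W', 'B', 'W'],
    ['W', 'B', 'W', 'C'],
    ['B', 'W', 'C', 'W'],
]
# Assumed exposure labels matching LAYOUT (W -> M; color pixels L or S).
EXPOSURES = [
    ['M', 'L', 'M', 'L'],
    ['S', 'M', 'S', 'M'],
    ['M', 'L', 'M', 'L'],
    ['S', 'M', 'S', 'M'],
]

def split_readout(raw, layout, exposures):
    """Separate one readout into the three original images: long-exposed
    color pixels, short-exposed color pixels, and medium-exposed W
    pixels. Non-contributing positions are left as NaN."""
    first_color = np.full(raw.shape, np.nan)
    second_color = np.full(raw.shape, np.nan)
    panchromatic = np.full(raw.shape, np.nan)
    for (r, c), kind in np.ndenumerate(np.array(layout)):
        if kind == 'W':
            panchromatic[r, c] = raw[r, c]
        elif exposures[r][c] == 'L':
            first_color[r, c] = raw[r, c]
        else:
            second_color[r, c] = raw[r, c]
    return first_color, second_color, panchromatic

raw = np.arange(16.0).reshape(4, 4)
fc, sc, pan = split_readout(raw, LAYOUT, EXPOSURES)
```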
Referring to fig. 1, after acquiring a first color original image, a second color original image and a first full-color original image, an image sensor 10 transmits the first color original image and the second color original image to a high dynamic range image processing module 20 for high dynamic range image processing to obtain a first color high dynamic range image.
It should be noted that the high dynamic range image processing module 20 may be integrated in the image sensor 10, integrated in the image processor 30, or provided separately from both the image sensor 10 and the image processor 30.
Referring to fig. 12, in some embodiments, the high dynamic range image processing module 20 includes a high dynamic range image processing unit 21 and a luminance mapping unit 23. The high dynamic range image processing unit 21 is configured to perform high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image; the luminance mapping unit 23 is configured to perform luminance mapping on the third color high dynamic range image to obtain the first color high dynamic range image.
Specifically, referring to fig. 13, the high dynamic range image processing performed by the high dynamic range image processing unit 21 on the first color original image and the second color original image may include a luminance alignment process. The luminance alignment of the first color original image against the second color original image includes the steps of: (1) identifying the overexposed image pixels in the first color original image whose pixel values are greater than a first preset threshold; (2) for each overexposed image pixel, expanding a predetermined area centered on that pixel; (3) searching the predetermined area for a first color original image pixel whose pixel value is smaller than the first preset threshold; (4) correcting the pixel value of the overexposed image pixel by using the found first color original image pixel and the corresponding pixels of the second color original image; (5) updating the first color original image with the corrected pixel values of the overexposed image pixels to obtain a luminance-aligned first color original image. For example, referring to fig. 13, assume that the pixel value V1 of the image pixel P12 (the image pixel marked with a dashed circle in the first color original image in fig. 13) is greater than the first preset threshold V0, i.e., the image pixel P12 is an overexposed image pixel. The high dynamic range image processing unit 21 then expands a predetermined region centered on the overexposed image pixel P12, for example the 3 × 3 region shown in fig. 13. Of course, in other embodiments the predetermined region may also be a 4 × 4 region, a 5 × 5 region, a 10 × 10 region, etc., which is not limited herein.
Subsequently, the high dynamic range image processing unit 21 searches the 3 × 3 predetermined area for a first color original image pixel whose pixel value is smaller than the first preset threshold V0. For example, if the pixel value V2 of the image pixel P21 in fig. 13 (the image pixel marked with a dotted circle in the first color original image) is smaller than V0, the image pixel P21 is such a first color original image pixel. The high dynamic range image processing unit 21 then locates, in the second color original image, the image pixels corresponding to the overexposed image pixel P12 and to the first color original image pixel P21: the image pixel P1'2' (marked with a dashed circle in the second color original image in fig. 13), with pixel value V3, corresponds to the overexposed image pixel P12, and the image pixel P2'1' (marked with a dotted circle in the second color original image in fig. 13), with pixel value V4, corresponds to the first color original image pixel P21. V1' is then calculated from V1'/V3 = V2/V4, that is, V1' = V3 × V2/V4, and the value of V1 is replaced with V1'. In this way the actual pixel value of the overexposed image pixel P12 can be estimated. The high dynamic range image processing unit 21 performs this luminance alignment for every overexposed image pixel in the first color original image, thereby obtaining the luminance-aligned first color original image. Because the pixel values of the overexposed image pixels have been corrected, the pixel value of each image pixel in the luminance-aligned first color original image is more accurate.
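Steps (1)-(5) of the luminance alignment, together with the correction V1' = V3 × V2/V4, can be sketched as follows. Single-channel arrays, a hypothetical threshold, and a 3 × 3 search window are assumed, and `luminance_align` is an illustrative helper name:

```python
import numpy as np

def luminance_align(long_img, short_img, v0, half=1):
    """Correct overexposed pixels in the long-exposure color image: for
    each pixel with value V1 > v0, find a neighbor with value V2 < v0
    inside a (2*half+1) x (2*half+1) window and replace V1 with
    V1' = V3 * V2 / V4, where V3 and V4 are the co-located values in the
    short-exposure color image."""
    out = long_img.astype(np.float64).copy()
    h, w = long_img.shape
    for r in range(h):
        for c in range(w):
            if long_img[r, c] <= v0:
                continue                      # not overexposed, keep as-is
            v3 = short_img[r, c]              # short value under P12
            for rr in range(max(r - half, 0), min(r + half + 1, h)):
                for cc in range(max(c - half, 0), min(c + half + 1, w)):
                    v2 = long_img[rr, cc]     # candidate well-exposed P21
                    v4 = short_img[rr, cc]    # short value under P21
                    if v2 < v0 and v4 > 0:
                        out[r, c] = v3 * v2 / v4   # V1'/V3 = V2/V4
                        break
                else:
                    continue
                break
    return out

long_img = np.array([[100.0, 250.0],
                     [120.0, 130.0]])
short_img = np.array([[25.0, 60.0],
                      [30.0, 32.0]])
aligned = luminance_align(long_img, short_img, v0=200.0)
# the overexposed pixel 250.0 is corrected to 60 * 100 / 25 = 240.0
```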
During the high dynamic range image processing, after acquiring the luminance-aligned first color original image and the second color original image, the high dynamic range image processing unit 21 may fuse them to obtain a third color high dynamic range image. Referring to fig. 14, the high dynamic range image processing unit 21 first performs motion detection on the luminance-aligned first color original image to identify whether it contains a motion-blurred area. If no motion-blurred area exists, the luminance-aligned first color original image and the second color original image are fused directly to obtain the third color high dynamic range image. If a motion-blurred area exists, that area is excluded, and only all areas of the second color original image and the areas of the luminance-aligned first color original image other than the motion-blurred area are fused to obtain the third color high dynamic range image.
Specifically, when the luminance-aligned first color original image and the second color original image are fused and no motion-blurred area exists in the former, the fusion follows these principles: (1) in the luminance-aligned first color original image, the pixel values of the image pixels in the overexposed area are directly replaced by the pixel values of the corresponding image pixels in the second color original image; (2) in the luminance-aligned first color original image, the pixel values of the image pixels in the underexposed area are taken as the long-exposure pixel value divided by the long-to-short pixel value ratio; (3) likewise, in the areas that are neither underexposed nor overexposed, the pixel values are the long-exposure pixel value divided by the long-to-short pixel value ratio. If a motion-blurred area does exist in the luminance-aligned first color original image, the fusion additionally follows a fourth principle: (4) in the luminance-aligned first color original image, the pixel values of the image pixels in the motion-blurred area are directly replaced by the pixel values of the corresponding image pixels in the second color original image.
In the underexposed areas and in the areas that are neither underexposed nor overexposed, the pixel values of the image pixels are the long-exposure pixel value divided by the long-to-short pixel value ratio, i.e., VS' = VL/(VL/VS), where VL denotes the long-exposure pixel value, VS denotes the short-exposure pixel value, and VS' denotes the calculated pixel value of the image pixels in those areas. The signal-to-noise ratio of VS' is greater than that of VS.
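A minimal sketch of the four fusion principles, assuming single-channel arrays and treating the long-to-short ratio and the overexposure threshold as given inputs (the names `fuse_hdr`, `over_thresh`, and `ratio` are illustrative):

```python
import numpy as np

def fuse_hdr(long_aligned, short_img, over_thresh, ratio, motion_mask=None):
    """Fuse the luminance-aligned long-exposure color image with the
    short-exposure color image. `ratio` is the long-to-short pixel value
    ratio VL/VS; `motion_mask` marks motion-blurred pixels, if any."""
    fused = long_aligned / ratio                 # principles (2) and (3)
    over = long_aligned > over_thresh
    fused[over] = short_img[over]                # principle (1): overexposed
    if motion_mask is not None:
        fused[motion_mask] = short_img[motion_mask]  # principle (4)
    return fused

long_img = np.array([[400.0, 100.0]])
short_img = np.array([[100.0, 30.0]])
fused = fuse_hdr(long_img, short_img, over_thresh=300.0, ratio=4.0)
# the overexposed pixel is taken from the short image;
# the well-exposed pixel becomes 100 / 4 = 25.0
```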
The high dynamic range image processing unit 21 performs high dynamic range image processing on the color original image, and can improve the dynamic range of the obtained image and improve the imaging effect of the image. Of course, the high dynamic range image processing unit 21 may also use other methods to fuse the first color original image and the second color original image after brightness alignment to obtain a third color high dynamic range image. For example, the high dynamic range image processing unit 21 may further perform motion blur detection on the first color original image and the second color original image after brightness alignment, and perform motion blur elimination on motion blur areas existing on the detected first color original image and second color original image, so as to obtain a first color original image after motion blur elimination and a second color original image after motion blur elimination. After obtaining the first color original image without the motion blur and the second color original image without the motion blur, the high dynamic range image processing unit 21 performs fusion on the first color original image without the motion blur and the second color original image without the motion blur to obtain a third color high dynamic range image with a high dynamic range, which is not limited herein.
The high dynamic range image processing unit 21, after obtaining the third color high dynamic range image, transmits the third color high dynamic range image to the luminance mapping unit 23. The luminance mapping unit 23 subjects the third color high dynamic range image to luminance mapping processing to obtain a first color high dynamic range image. The bit width of the data of each image pixel in the first color high dynamic range image is smaller than the bit width of the data of each image pixel in the third color high dynamic range image.
Illustratively, after the first color original image and the second color original image, each with a data bit width of 10 bits, are subjected to high dynamic range image processing by the high dynamic range image processing unit 21, a third color high dynamic range image with a bit width of 16 bits can be obtained. The luminance mapping unit 23 can then perform luminance mapping processing on the 16-bit third color high dynamic range image to obtain a first color high dynamic range image with a bit width of 10 bits. Of course, in some embodiments, the luminance mapping process may instead produce a first color high dynamic range image with a bit width of 12 bits, which is not limited herein. In this way, the luminance mapping processing reduces the data size of the high dynamic range image, which avoids the situation in which the image processor 30 cannot process a high dynamic range image whose data size is too large, and helps increase the speed at which the image processor 30 processes the high dynamic range image.
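The patent does not disclose the mapping curve used by the luminance mapping unit 23, so the sketch below stands in a simple gamma-style tone curve purely to illustrate the bit-width reduction from 16 bits to a narrower width; the function name and curve are assumptions.

```python
import numpy as np

def luminance_map(hdr16, out_bits=10):
    """Compress a 16-bit HDR image to a narrower bit width.

    A placeholder gamma curve substitutes for the undisclosed mapping; the
    point of the sketch is the 16-bit -> out_bits range compression.
    """
    in_max = np.float64(2 ** 16 - 1)
    out_max = np.float64(2 ** out_bits - 1)
    normalized = hdr16.astype(np.float64) / in_max   # scale into [0, 1]
    mapped = normalized ** (1.0 / 2.2)               # placeholder tone curve
    return np.clip(np.round(mapped * out_max), 0, out_max).astype(np.uint16)
```

A 12-bit output, as mentioned above, is obtained simply by calling `luminance_map(img, out_bits=12)`.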
Referring to fig. 12, the high dynamic range image processing module 20 further includes a statistical unit 25, and the statistical unit 25 is configured to process the first color original image and the second color original image to obtain statistical data. After acquiring the statistical data, the statistical unit 25 supplies the statistical data to the image processor 30 to perform automatic exposure processing and/or automatic white balance processing. That is, the image processor 30 may perform at least one of the automatic exposure process and the automatic white balance process based on the statistical data after receiving the statistical data. For example, the image processor 30 performs automatic exposure processing based on the statistical data; alternatively, the image processor 30 performs automatic white balance processing based on the statistical data; alternatively, the image processor 30 performs automatic exposure processing and automatic white balance processing based on the statistical data. Thus, the image processor 30 can perform automatic exposure and automatic white balance processing according to the statistical data, which is beneficial to improving the quality of the image finally output by the image processor 30.
Referring to fig. 12, the high dynamic range image processing module 20 further includes a lens shading correction unit 27, and the lens shading correction unit 27 is configured to correct the third color high dynamic range image to obtain a color high dynamic range corrected image. Specifically, after the high dynamic range image processing unit 21 fuses the first color original image and the second color original image into the third color high dynamic range image, the lens shading correction unit 27 performs lens shading correction processing on the third color high dynamic range image to obtain a color high dynamic range corrected image. As shown in fig. 15, the lens shading correction unit 27 divides the third color high dynamic range image into sixteen grids, and each of the sixteen grids has a preset compensation coefficient. The lens shading correction unit 27 then performs shading correction on the image by bilinear interpolation, using the compensation coefficients of the grid region in which each pixel is located and of the adjacent grid regions. R2 is a pixel value within the broken-line frame in the illustrated third color high dynamic range image after lens shading correction, and R1 is the corresponding pixel value within the broken-line frame in the illustrated first color original image. R2 = R1 × k1, where k1 is obtained by bilinear interpolation from the compensation coefficients 1.10, 1.04, 1.105, and 1.09 of the grids adjacent to the R1 pixel. Let the coordinates of the image be (x, y), where x counts from the first pixel on the left to the right, y counts from the first pixel on the top to the bottom, and x and y are natural numbers, as indicated by the marks on the edges of the image. For example, if the coordinates of R1 are (3, 3), then the coordinates of R1 in the grid compensation coefficient map are (0.75, 0.75). f(x, y) denotes the compensation value at coordinates (x, y) in the grid compensation coefficient map.
Then f(0.75, 0.75) is the compensation coefficient value of R1 in the grid compensation coefficient map: f(0.75, 0.75) = 0.25 × 0.25 × f(0,0) + 0.25 × 0.75 × f(0,1) + 0.75 × 0.25 × f(1,0) + 0.75 × 0.75 × f(1,1) = 0.0625 × 1.11 + 0.1875 × 1.10 + 0.1875 × 1.09 + 0.5625 × 1.03. The compensation coefficient of each grid is set in advance, before the lens shading correction unit 27 performs the lens shading correction process.
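The bilinear interpolation of the grid compensation coefficients can be written out directly. The sketch below reproduces the worked example, with the four corner coefficients 1.11, 1.10, 1.09, and 1.03 taken from the computation above; the function name is illustrative.

```python
def bilinear_gain(f00, f01, f10, f11, x, y):
    """Bilinear interpolation of four neighbouring grid compensation
    coefficients at fractional grid coordinates (x, y) in [0, 1]."""
    return ((1 - x) * (1 - y) * f00 + (1 - x) * y * f01
            + x * (1 - y) * f10 + x * y * f11)

# Worked example from the text: R1 sits at grid coordinates (0.75, 0.75),
# so the weights are 0.0625, 0.1875, 0.1875, and 0.5625.
k1 = bilinear_gain(1.11, 1.10, 1.09, 1.03, 0.75, 0.75)
# The corrected pixel is then R2 = R1 * k1.
```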
The lens shading correction unit 27, after obtaining the color high dynamic range corrected image, transmits it to the statistical unit 25. The statistical unit 25 is configured to process the color high dynamic range corrected image to obtain statistical data, and to supply the statistical data to the image processor 30 for at least one of automatic exposure processing and automatic white balance processing.
Since the lens shading correction is performed on the third color high dynamic range image first, and the shading-corrected high dynamic range image is then processed to obtain the statistical data, the influence of lens shading is avoided, so that the image quality obtained by the image processor 30 through automatic exposure processing and/or automatic white balance processing based on these statistical data is higher. It should be noted that the image fusion module 40 and the high dynamic range image processing module 20 are integrated in the image sensor 10.
The high dynamic range image processing module 20 performs high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, and then transmits the first color high dynamic range image to the image processor 30. The image processor 30 is configured to process the first color high dynamic range image and the first full color raw image to obtain a second color high dynamic range image.
In some embodiments, the image processor 30 adds the pixel values of the plurality of color image pixels on the first color high dynamic range image to the pixel values of the corresponding panchromatic image pixels W in the first panchromatic original image to obtain the pixel values of the corresponding color image pixels of the second color high dynamic range image.
Specifically, referring to fig. 11 and 16, the first panchromatic original image obtained by the image sensor 10 includes a plurality of panchromatic image pixels W and a plurality of null image pixels N, where a null image pixel is neither a panchromatic image pixel W nor a color image pixel; the position of a null image pixel N in the first panchromatic original image may be regarded as containing no image pixel, or the pixel value of the null image pixel may be regarded as zero. Comparing the pixel array 11 with the first panchromatic original image, it can be seen that each sub-unit in the pixel array 11 includes two panchromatic image pixels W and two color image pixels (color image pixel A, color image pixel B, or color image pixel C). The first panchromatic original image likewise has one sub-unit corresponding to each sub-unit in the pixel array 11; each such sub-unit includes two panchromatic image pixels W and two null image pixels N, the null image pixels N occupying the positions of the two color image pixels in the corresponding sub-unit of the pixel array 11. The image processor 30 may further process the first panchromatic original image to obtain a panchromatic intermediate image. Illustratively, each sub-unit includes a plurality of null image pixels N and a plurality of panchromatic image pixels W; in particular, some sub-units include two null image pixels N and two panchromatic image pixels W. The image processor 30 may take the pixel values of all panchromatic image pixels W in such a sub-unit as the panchromatic large pixel W of that sub-unit, thereby obtaining the panchromatic intermediate image.
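A minimal NumPy sketch of this step follows, assuming each sub-unit is a 2 × 2 block whose two W pixel values are summed into one large pixel (the text says the W values are "taken as" the large pixel; summing is one plausible reading, averaging another). Function and argument names are illustrative.

```python
import numpy as np

def panchromatic_intermediate(first_panchromatic, null_mask):
    """Collapse each 2x2 sub-unit into one panchromatic large pixel W.

    null_mask is True where a null image pixel N sits; nulls are treated
    as zero, as the text allows.
    """
    img = np.where(null_mask, 0, first_panchromatic).astype(np.float64)
    h, w = img.shape
    # Sum the two W pixels (nulls contribute zero) in every 2x2 sub-unit;
    # the result has half the width and half the height of the input.
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```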
The resolution of the full-color intermediate image at this time is the same as that of the first color high dynamic range image in order to facilitate image processing of the full-color intermediate image and the first color high dynamic range image.
After the image processor 30 obtains the panchromatic intermediate image and the first color high dynamic range image, the image processor 30 obtains the pixel values of the panchromatic image pixels W on the panchromatic intermediate image and the pixel values of the color image pixels on the first color high dynamic range image, and adds the pixel value of each color image pixel to the pixel value of the corresponding panchromatic image pixel W to obtain the pixel value of the corresponding color image pixel of the second color high dynamic range image. For example, referring to fig. 17, the image processor 30 acquires the pixel value of the first-color image pixel A (generated by exposing the first-color photosensitive pixel A) disposed on the 0 th row and 0 th column of the first color high dynamic range image, and acquires the pixel value of the panchromatic image pixel W at the corresponding position on the panchromatic intermediate image, that is, the pixel value of the panchromatic image pixel W disposed on the 0 th row and 0 th column of the panchromatic intermediate image; it adds the two pixel values to obtain the pixel value of a new first-color image pixel A', and places the new first-color image pixel A' on the 0 th row and 0 th column of the second color high dynamic range image. The image processor 30 then obtains the pixel value of the next color image pixel and repeats the above steps until all color image pixels in the first color high dynamic range image are processed, thus obtaining the second color high dynamic range image.
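The pixel-wise addition described above reduces to a single array operation once the two images have the same resolution. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def add_panchromatic(color_hdr, panchromatic_inter):
    """Build the second color HDR image: each color pixel value (e.g. A at
    row 0, column 0) plus the panchromatic pixel W at the same position
    gives the new pixel value A'. Equal resolutions are assumed, as the
    text requires."""
    assert color_hdr.shape == panchromatic_inter.shape
    return color_hdr.astype(np.float64) + panchromatic_inter.astype(np.float64)
```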
In some embodiments, processing the first color high dynamic range image and the first panchromatic original image to obtain the second color high dynamic range image further includes: processing the brightness-aligned first panchromatic original image and the first color high dynamic range image to obtain the second color high dynamic range image. In this way, the pixel values of the image pixels in the brightness-aligned first panchromatic original image are more accurate. Specifically, the image processor 30 further processes the first panchromatic original image to obtain a panchromatic intermediate image; after obtaining the panchromatic intermediate image and the first color high dynamic range image, the image processor 30 performs brightness alignment on the panchromatic intermediate image and the first color high dynamic range image to obtain a brightness-aligned panchromatic intermediate image. After the image processor 30 obtains the brightness-aligned panchromatic intermediate image and the first color high dynamic range image, the image processor 30 adds the pixel values of the color image pixels on the first color high dynamic range image to the pixel values of the corresponding panchromatic image pixels W on the brightness-aligned panchromatic intermediate image to obtain the pixel values of the corresponding color image pixels of the second color high dynamic range image.
It should be noted that the embodiment of the image processor 30 for further processing the first full-color original image to obtain a full-color intermediate image is the same as the embodiment of fig. 16 for further processing the first full-color original image to obtain a full-color intermediate image; the specific implementation of the brightness alignment processing on the full-color intermediate image and the first color high dynamic range image is the same as the specific implementation of the brightness alignment processing on the first color original image and the second color original image shown in fig. 13, and is not described herein again.
Referring to fig. 12, the image processor 30 is further configured to perform image processing on the first color high dynamic range image and the second color high dynamic range image respectively to obtain a processed first color high dynamic range image and a processed second color high dynamic range image. The image processing includes: at least one of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing.
Referring to fig. 18, the high dynamic range image processing system 100 further includes a storage module 50. The storage module 50 is configured to store the images processed by the image processor 30 and to transmit the processed images to the image fusion module 40 for fusion algorithm processing to obtain the target image.
Specifically, the image processor 30 processes the first color high dynamic range image and the second color high dynamic range image in sequence: the image processor 30 processes the first color high dynamic range image and transmits the resulting processed first color high dynamic range image to the storage module 50 for storage; the image processor 30 then processes the second color high dynamic range image and transmits the resulting processed second color high dynamic range image to the storage module 50 for storage. When all the images processed by the image processor 30 are stored in the storage module 50 (i.e., when both the processed first color high dynamic range image and the processed second color high dynamic range image are stored in the storage module 50), the storage module 50 transmits all the stored images to the image fusion module 40 for fusion algorithm processing to obtain the target image.
It should be noted that the image processor 30 may also process the second color high dynamic range image first and then the first color high dynamic range image, or may perform image processing on both images at the same time, which is not limited herein. Regardless of the order in which the image processor 30 performs the image processing, the storage module 50 transmits the processed first color high dynamic range image and the processed second color high dynamic range image to the image fusion module 40 for fusion algorithm processing only after both have been stored.
Referring to fig. 18, after obtaining the processed first color high dynamic range image and the processed second color high dynamic range image, the image processor 30 transmits the processed first color high dynamic range image and the processed second color high dynamic range image to the image fusion module 40, and the image fusion module 40 is configured to perform fusion algorithm processing on the processed first color high dynamic range image and the processed second color high dynamic range image to obtain the target image.
Referring to fig. 19 to 24, the image fusion module 40 performs a fusion algorithm process on the processed first color high dynamic range image and the processed second color high dynamic range image, including the following steps: (1) performing box filtering processing on the processed first color high dynamic range image to obtain a plurality of first intermediate images corresponding to a plurality of color image pixels of different colors; performing box filtering processing on the processed second color high dynamic range image to obtain a plurality of second intermediate images corresponding to a plurality of color image pixels with different colors; (2) dividing the pixel value of each color image pixel in the first intermediate image by the pixel value of the color image pixel corresponding to the second intermediate image of the corresponding color to obtain the pixel value of the color image pixel corresponding to the third intermediate image of the corresponding color; (3) multiplying the pixel value of each color image pixel in the third intermediate image by the pixel value of the color image pixel corresponding to the second intermediate image to obtain a single-color image of the corresponding color; (4) the plurality of single-color images are fused to obtain a target image.
Specifically, for illustration, the first single-color photosensitive pixel A is taken to be a red photosensitive pixel R, the second single-color photosensitive pixel B a green photosensitive pixel G, and the third single-color photosensitive pixel C a blue photosensitive pixel Bu. Referring to fig. 19, performing box filtering on the processed first color high dynamic range image to obtain a plurality of first intermediate images corresponding to color image pixels of different colors includes: (1) decomposing the processed first color high dynamic range image into a red high dynamic range image, a first kind of green high dynamic range image, a second kind of green high dynamic range image, and a blue high dynamic range image; (2) performing box filtering on each of these four images to obtain the corresponding first intermediate images. The following description takes the box filtering of the red high dynamic range image into the red first intermediate image as an example. Referring to fig. 20, the box filtering process performed on the red high dynamic range image includes: a slidable window C1 is arranged in the red high dynamic range image; the sliding window C1 is first placed at the upper left corner (0,0) of the red high dynamic range image, and the pixel values of all red image pixels R within the sliding window C1 at that moment are added to obtain the pixel value of a new red image pixel R_S; the sliding window C1 then continues to slide stepwise to the right, and each time it reaches a new position, the pixel values of all red image pixels R within it are added to obtain the pixel values of further new red image pixels R_S; if the sliding window C1 reaches the edge of the image, the image boundary pixels can be replicated (padded) before the sum is computed; when the sliding window C1 reaches the end of a row of the red high dynamic range image, it moves to the beginning of the next row, and so on until the sliding window C1 has traversed all red image pixels R in the red high dynamic range image. The new red image pixels R_S so obtained are arranged to form the red first intermediate image.
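The sliding-window procedure above can be sketched as a plain box filter with replicated borders. This is an illustrative NumPy version, not the patented implementation; the function name and the window parameter `k` are assumptions.

```python
import numpy as np

def box_filter(single_color, k=3):
    """Slide a k x k window over a single-colour plane, summing the pixel
    values inside the window at every position. Edge pixels are replicated
    (padded) so the window can be evaluated at the image border, as the
    text describes."""
    pad = k // 2
    padded = np.pad(single_color.astype(np.float64), pad, mode="edge")
    h, w = single_color.shape
    out = np.empty((h, w))
    for y in range(h):          # move to the next row when a row ends
        for x in range(w):      # slide stepwise to the right
            out[y, x] = padded[y:y + k, x:x + k].sum()
    return out
```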
Note that the sliding window C1 is virtual and does not actually exist. The size of the sliding window C1 can be selected according to actual requirements; for example, it can be a square window such as 4 × 4, 6 × 6, 7 × 7, 8 × 8, or 9 × 9. Because the sliding window C1 is square, the weights in all directions within the window are almost the same during box filtering, avoiding any one direction being over-emphasized, which can improve the quality of the finally obtained color image. Of course, the sliding window C1 may also be a non-square box; for example, its size may be 2 × 3, 3 × 2, 4 × 5, 4 × 6, etc., which is not limited herein. Referring to fig. 21, the box filtering performed on the processed second color high dynamic range image to obtain a plurality of second intermediate images corresponding to color image pixels of different colors is implemented in the same way as the box filtering performed on the processed first color high dynamic range image in the embodiment described in fig. 19, and is not repeated here.
Referring to fig. 22, after the image fusion module 40 obtains a plurality of first intermediate images (e.g., the red first intermediate image, the first kind of green first intermediate image, the second kind of green first intermediate image, and the blue first intermediate image in fig. 20) and a plurality of second intermediate images (e.g., the red second intermediate image, the first kind of green second intermediate image, the second kind of green second intermediate image, and the blue second intermediate image in fig. 21), the image fusion module 40 divides the pixel value of each color image pixel in each first intermediate image by the pixel value of the corresponding color image pixel in the second intermediate image of the same color to obtain the pixel value of the corresponding color image pixel in the third intermediate image of that color. For example, the image fusion module 40 divides the pixel value of each red image pixel R_S in the red first intermediate image by the pixel value of the corresponding red image pixel R'_S in the red second intermediate image to obtain the pixel value of the corresponding red image pixel R''_S in the red third intermediate image. The blue third intermediate image, the first kind of green third intermediate image, and the second kind of green third intermediate image are obtained in the same manner as the red third intermediate image.
Referring to fig. 23, after obtaining the plurality of third intermediate images, the image fusion module 40 multiplies the pixel value of each color image pixel in each third intermediate image by the pixel value of the corresponding color image pixel in the second intermediate image to obtain the single-color image of the corresponding color. For example, the image fusion module 40 multiplies the pixel value of each red image pixel R''_S in the red third intermediate image by the pixel value of the red image pixel R' at the corresponding position on the second intermediate image to obtain the red image. The first kind of green image, the second kind of green image, and the blue image are obtained in the same manner as the red image and are not described herein again.
Referring to fig. 24, after obtaining the plurality of single-color images, the image fusion module 40 fuses the plurality of single-color images to obtain the target image. That is, the image fusion module 40 fuses the red image, the first kind of green image, the second kind of green image, and the blue image to obtain the target image.
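Steps (2) and (3) of the fusion algorithm can be sketched for one colour plane as follows. Note one assumption made loudly here: the ratio of the two box-filtered planes is multiplied back by the full-resolution second plane rather than by its box-filtered version, since multiplying by the box-filtered plane would exactly undo the division; this reading of step (3) is an interpretation, not the patent's literal text, and all names are illustrative.

```python
import numpy as np

def fuse_planes(first_plane, second_plane, k=3):
    """Divide-then-multiply fusion of one colour plane, as a sketch."""
    def box(p):
        # k x k box filter with replicated (edge-padded) borders.
        pad = k // 2
        pp = np.pad(p.astype(np.float64), pad, mode="edge")
        h, w = p.shape
        return np.array([[pp[y:y + k, x:x + k].sum() for x in range(w)]
                         for y in range(h)])

    eps = 1e-12                               # guard against division by zero
    first_box = box(first_plane)              # first intermediate image
    second_box = box(second_plane)            # second intermediate image
    third = first_box / (second_box + eps)    # third intermediate image
    return third * second_plane               # single-colour image
```

The four single-colour planes produced this way would then be remosaiced into the target image, per step (4).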
It should be noted that, the above embodiment only exemplifies one of the fusion algorithms that the image fusion module 40 performs the fusion algorithm processing on the processed first color high dynamic range image and the processed second color high dynamic range image, and in other embodiments, other fusion algorithms may be used to fuse the processed first color high dynamic range image and the processed second color high dynamic range image to obtain the target image, which is not limited herein. Of course, the image fusion module 40 may also perform fusion algorithm processing on the first color high dynamic range image without image processing and the second color high dynamic range image without image processing to obtain the target image. The specific implementation of the fusion algorithm processing performed on the first color high dynamic range image without image processing and the second color high dynamic range image without image processing is the same as the specific implementation of the fusion algorithm processing performed on the processed first color high dynamic range image and the processed second color high dynamic range image, and is not described herein again.
In still other embodiments, referring to fig. 25, a portion of the panchromatic photosensitive pixels W in the same subunit in the pixel array 11 are exposed to light at a fourth exposure time, and the remaining panchromatic photosensitive pixels W are exposed to light at a third exposure time. And the fourth exposure time is less than or equal to the first exposure time and is greater than the third exposure time.
Specifically, for the four photosensitive pixels 110 (shown in fig. 3) in each sub-unit of fig. 25, one single-color photosensitive pixel is exposed with the first exposure time (e.g., the long exposure time L shown in fig. 11), one single-color photosensitive pixel is exposed with the second exposure time (e.g., the short exposure time S shown in fig. 25), one full-color photosensitive pixel W is exposed with the third exposure time (e.g., the short exposure time S shown in fig. 25), and one full-color photosensitive pixel W is exposed with the fourth exposure time (e.g., the long exposure time L shown in fig. 25).
It should be noted that, in some embodiments, the exposure process of the pixel array 11 may be one of the following: (1) the photosensitive pixels 110 exposed with the first exposure time, those exposed with the second exposure time, those exposed with the third exposure time, and those exposed with the fourth exposure time are exposed sequentially (the order of the four is not limited), and the exposure periods of the four do not overlap; (2) the four groups are exposed sequentially (the order of the four is not limited), and their exposure periods partially overlap; (3) the exposure periods of all photosensitive pixels 110 exposed with a shorter exposure time fall within the exposure period of the photosensitive pixels 110 exposed with the longest exposure time; for example, the exposure periods of all single-color photosensitive pixels exposed with the second exposure time, of all full-color photosensitive pixels W exposed with the third exposure time, and of all full-color photosensitive pixels W exposed with the fourth exposure time all fall within the exposure period of the single-color photosensitive pixels exposed with the first exposure time.
In the embodiment of the present application, the image sensor 10 adopts exposure method (3). This exposure method shortens the overall exposure time required by the pixel array 11, which is favorable for increasing the image frame rate.
After the exposure of the pixel array 11 is completed, the image sensor 10 can output four original images, namely: (1) a first color original image composed of first color information generated by the plurality of single-color photosensitive pixels exposed with the long exposure time L (the first exposure time); (2) a second color original image composed of second color information generated by the plurality of single-color photosensitive pixels exposed with the short exposure time S (the second exposure time); (3) a first full-color original image composed of first full-color information generated by the plurality of full-color photosensitive pixels W exposed with the short exposure time S (the third exposure time); (4) a second full-color original image composed of second full-color information generated by the plurality of full-color photosensitive pixels W exposed with the long exposure time L (the fourth exposure time).
Referring to fig. 1 and 26, after acquiring a first color original image, a second color original image, a first full-color original image and a second full-color original image, the image sensor 10 transmits the first color original image, the second color original image, the first full-color original image and the second full-color original image to the high dynamic range image processing module 20. The high dynamic range image processing module 20 performs high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image; high dynamic range image processing is performed on the first full-color original image and the second full-color original image to obtain a first full-color high dynamic range image.
Referring to fig. 12 and 26, in some embodiments, the high dynamic range image processing module 20 includes a high dynamic range image processing unit 21 and a brightness mapping unit 23. The high dynamic range image processing unit 21 is configured to perform high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image, and perform high dynamic range image processing on the first full-color original image and the second full-color original image to obtain a second full-color high dynamic range image. The specific process of the specific high dynamic range image processing is the same as the specific process of performing the high dynamic range image processing on the first color original image and the second color original image in the embodiment shown in fig. 14, and is not repeated herein.
The luminance mapping unit 23 is configured to perform luminance mapping on the third color high dynamic range image to obtain the first color high dynamic range image, and to perform luminance mapping on the second full-color high dynamic range image to obtain the first full-color high dynamic range image. The specific luminance mapping process is the same as the process of luminance-mapping the third color high dynamic range image into the first color high dynamic range image in the above embodiment, and is not repeated here.
The lens shading correction unit 27 is configured to correct the third color high dynamic range image to obtain a color high dynamic range corrected image, and to correct the second full color high dynamic range image to obtain a full color high dynamic range corrected image. The specific correction process is the same as the process of performing lens shading correction on the third color high dynamic range image in the embodiment shown in fig. 15, and is not described herein again.
The statistical unit 25 is configured to process the color high dynamic range corrected image and the full-color high dynamic range corrected image to obtain statistical data, and to transmit the statistical data to the image processor 30, so that the image processor 30 can perform at least one of automatic exposure and automatic white balance processing according to the statistical data. Of course, the statistical unit 25 may also directly process the first color original image, the second color original image, the first full-color original image, and the second full-color original image to obtain statistical data, and transmit the statistical data to the image processor 30, so that the image processor 30 can perform at least one of automatic exposure and automatic white balance processing according to the statistical data.
The high dynamic range image processing module 20 transmits the first color high dynamic range image and the first full color high dynamic range image to the image processor 30 after obtaining the first color high dynamic range image and the first full color high dynamic range image. The image processor 30 is configured to process the first color high dynamic range image and the first full color high dynamic range image to obtain a second color high dynamic range image.
In some embodiments, the image processor 30 adds the pixel values of a plurality of color image pixels in the first color high dynamic range image to the pixel values of the corresponding panchromatic image pixels W in the first panchromatic high dynamic range image to obtain the pixel values of the corresponding color image pixels of the second color high dynamic range image. For example, referring to fig. 27, the image processor 30 obtains the pixel value of a first color image pixel a (generated by exposing the first color photosensitive pixel a) located at row 0, column 0 of the first color high dynamic range image, and obtains the pixel value of the panchromatic image pixel W at the corresponding position of the first panchromatic high dynamic range image, that is, the panchromatic image pixel W located at row 0, column 0 of the first panchromatic high dynamic range image. The image processor 30 adds the obtained pixel value of the first color image pixel a and the pixel value of the panchromatic image pixel W to obtain the pixel value of a new first color image pixel a', and places the new first color image pixel a' at row 0, column 0 of the second color high dynamic range image. The image processor 30 then obtains the pixel value of the next color image pixel and repeats the above steps until all color image pixels in the first color high dynamic range image have been processed, thereby obtaining the second color high dynamic range image.
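As a purely illustrative sketch (not part of the claimed embodiments), the pixel-wise addition described above can be expressed as follows. The function name, the assumption that both images are equally sized single-channel NumPy arrays (the color image in its Bayer layout, the panchromatic image with a W value at every position), and the 12-bit working range are all assumptions made here for illustration:

```python
import numpy as np

def add_panchromatic(color_hdr: np.ndarray, pan_hdr: np.ndarray,
                     max_val: int = 4095) -> np.ndarray:
    """Add each color pixel value to the panchromatic pixel value at the
    same position, clipping to an assumed 12-bit working range."""
    # Widen to int32 first so the sum cannot overflow the input dtype.
    out = color_hdr.astype(np.int32) + pan_hdr.astype(np.int32)
    return np.clip(out, 0, max_val).astype(np.uint16)
```

A saturating add of this kind keeps the result representable in the same bit depth; a real pipeline might instead carry the sum at a wider bit depth into later stages.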
After acquiring the first color high dynamic range image and the second color high dynamic range image, the high dynamic range image processing system 100 performs subsequent processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image. The specific process of performing subsequent processing on the first color high dynamic range image and the second color high dynamic range image is the same as the specific process of performing subsequent processing on the first color high dynamic range image and the second color high dynamic range image by the high dynamic range image processing system 100 to obtain the target image in the embodiment described in fig. 18 to 24, which is not repeated herein.
Referring to fig. 1 and fig. 28, an electronic device 1000 is also provided. The electronic device 1000 according to the embodiment of the present application includes the lens 300, the housing 200, and the high dynamic range image processing system 100 according to any of the above embodiments. The lens 300 and the high dynamic range image processing system 100 are combined with the housing 200. The lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
The electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (e.g., a smart watch, a smart bracelet, smart glasses, a smart helmet), an unmanned aerial vehicle, a head-mounted display device, etc., without limitation.
The electronic device 1000 according to the embodiment of the present application performs high dynamic range image processing and image processing on the full-color raw image and the color raw image output by the image sensor 10 through the high dynamic range image processing module 20 and the image processor 30 to obtain a first color high dynamic range image arranged in a Bayer array and a second color high dynamic range image including panchromatic information, and the image fusion module 40 then fuses the first color high dynamic range image and the second color high dynamic range image to obtain a target image with a high dynamic range. Thus, the image quality of the high dynamic range image can be improved, and the image can be processed without changing the parameters of the image processor 30, so that the manufacturing difficulty of the electronic device 1000 is reduced.
Referring to fig. 2 and fig. 29, the present application further provides a high dynamic range image processing method. The high dynamic range image processing method of the embodiment of the present application is used for the high dynamic range image processing system 100. The high dynamic range image processing system 100 includes an image sensor 10. The image sensor 10 includes a pixel array 11. The pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. A color sensitive pixel has a narrower spectral response than a panchromatic sensitive pixel. The pixel array 11 includes minimum repeating units each including a plurality of sub-units. Each sub-unit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The high dynamic range image processing method includes:
01: exposing the pixel array 11, wherein, for a plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time that is less than the first exposure time; wherein first color information generated by the single-color photosensitive pixels exposed for the first exposure time yields a first color original image, second color information generated by the single-color photosensitive pixels exposed for the second exposure time yields a second color original image, and the full-color photosensitive pixels exposed for the third exposure time generate a first full-color original image;
02: carrying out high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels which are arranged in a Bayer array;
03: processing the first full-color original image and the first color high dynamic range image to obtain a second color high dynamic range image; and
04: and carrying out fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image.
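Step 02 merges a long-exposure and a short-exposure color image into one high dynamic range image. The embodiment's exact merging procedure (fig. 14) is not reproduced here; the sketch below shows one common strategy, purely as an assumed illustration: where the long exposure is saturated, substitute the short exposure scaled by the exposure-time ratio. The function name, the saturation threshold, and the output range are all assumptions:

```python
import numpy as np

def merge_two_exposures(long_img: np.ndarray, short_img: np.ndarray,
                        ratio: float, sat: int = 4000,
                        max_val: int = 65535) -> np.ndarray:
    """Merge long/short exposures: keep the long exposure where it is
    below the saturation threshold, otherwise use the short exposure
    scaled by the exposure-time ratio (long / short)."""
    scaled_short = short_img.astype(np.float64) * ratio
    out = np.where(long_img >= sat, scaled_short, long_img.astype(np.float64))
    return np.clip(out, 0, max_val).astype(np.uint32)
```

The merged values can exceed the input bit depth, which is why a luminance mapping step (as in the embodiments below) is then needed to bring the image back to the range later stages expect.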
In some embodiments, the step of performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image includes: performing high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image; and performing brightness mapping on the third color high dynamic range image to obtain the first color high dynamic range image.
In some embodiments, the high dynamic range image processing method further comprises: carrying out high dynamic range image processing on the first color original image and the second color original image to obtain third color high dynamic range image processing; correcting the third color high dynamic range image to obtain a color high dynamic range corrected image; and processing the color high dynamic range corrected image to obtain statistical data, the statistical data being provided to an image processor for automatic exposure processing and/or automatic white balance processing.
In some embodiments, a high dynamic range image processing method includes: the first color original image, the second color original image, and the first full-color original image are processed to obtain statistical data, and the statistical data is provided to an image processor for automatic exposure processing and/or automatic white balance processing.
In some embodiments, a portion of the panchromatic photosensitive pixels in the same subunit are exposed to light for a fourth exposure time, and the remaining panchromatic photosensitive pixels are exposed to light for the third exposure time, the fourth exposure time being less than or equal to the first exposure time and greater than the third exposure time, and second panchromatic information generated by the panchromatic photosensitive pixels exposed to light for the fourth exposure time yields a second panchromatic original image; the high dynamic range image processing method further includes: performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image; performing high dynamic range image processing on the first full-color original image and the second full-color original image to obtain a first full-color high dynamic range image; and processing the first color high dynamic range image and the first full-color high dynamic range image to obtain a second color high dynamic range image.
In some embodiments, the step of performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, and performing high dynamic range image processing on the first full-color original image and the second full-color original image to obtain a first full-color high dynamic range image, includes: fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first full-color original image and the second full-color original image into a second full-color high dynamic range image; and luminance mapping the second color high dynamic range image to obtain the first color high dynamic range image, and luminance mapping the second full-color high dynamic range image to obtain the first full-color high dynamic range image.
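The luminance mapping referred to above compresses the fused high dynamic range values back into a target bit depth. The embodiment does not specify the mapping curve; the following sketch assumes a simple gamma-style curve (gamma 0.5) and a 10-bit output range purely for illustration:

```python
import numpy as np

def luminance_map(hdr: np.ndarray, in_max: float = 65535.0,
                  out_max: int = 1023) -> np.ndarray:
    """Map a high dynamic range image down to an assumed 10-bit range
    using an illustrative square-root (gamma 0.5) curve, which lifts
    shadow detail while compressing highlights."""
    norm = np.clip(hdr.astype(np.float64) / in_max, 0.0, 1.0)
    return np.round(np.sqrt(norm) * out_max).astype(np.uint16)
```

Any monotonic tone curve could be substituted here; the essential property is that the full input range maps onto the narrower output range expected by downstream processing.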
In some embodiments, the high dynamic range image processing method further comprises: fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first full-color original image and the second full-color original image into a second full-color high dynamic range image; correcting the second color high dynamic range image to obtain a color high dynamic range corrected image, and correcting the second full color high dynamic range image to obtain a full color high dynamic range corrected image; the color high dynamic correction image and the full color high dynamic correction image are processed to obtain statistical data, and the statistical data is provided to an image processor for automatic exposure processing and/or automatic white balance processing.
In some embodiments, the step of processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image comprises: the pixel values of a plurality of color image pixels on the first color high dynamic range image are added to the pixel values of panchromatic image pixels corresponding to the first panchromatic original image to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
In some embodiments, the step of processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image comprises: performing brightness alignment processing on the first full-color original image and the first color high dynamic range image to obtain a brightness-aligned first full-color original image; pixel values of a plurality of color image pixels on the first color high dynamic range image are added to pixel values of panchromatic image pixels corresponding to the first panchromatic original image in luminance alignment to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
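The brightness alignment step above brings the panchromatic image onto the same brightness scale as the color image before the addition. The patent does not detail the alignment procedure; the sketch below assumes a crude global alignment by matching mean brightness, which is only one of many possible choices:

```python
import numpy as np

def align_brightness(pan: np.ndarray, color_hdr: np.ndarray) -> np.ndarray:
    """Scale the panchromatic image so its mean brightness matches the
    color image's mean (an assumed global alignment; a real pipeline
    might align per-region or per-channel instead)."""
    scale = color_hdr.mean() / max(pan.mean(), 1e-9)  # guard against /0
    return pan.astype(np.float64) * scale
```

After alignment, the addition proceeds exactly as in the non-aligned embodiment, but the panchromatic contribution no longer dominates or vanishes when the two exposures differ strongly.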
In some embodiments, the step of processing the first color high dynamic range image and the first full color high dynamic range image to obtain the second color high dynamic range image comprises: the pixel values of a plurality of color image pixels on the first color high dynamic range image are added to the pixel values of panchromatic image pixels corresponding to the first panchromatic high dynamic range image to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
In some embodiments, the high dynamic range image processing method further comprises: performing image processing on the first color high dynamic range image to obtain a processed first color high dynamic range image; and performing image processing on the second color high dynamic range image to obtain a processed second color high dynamic range image. The method comprises the following steps of carrying out fusion algorithm processing on a first color high dynamic range image and a second color high dynamic range image to obtain a target image, wherein the fusion algorithm processing comprises the following steps: and performing fusion algorithm processing on the processed first color high dynamic range image and the processed second color high dynamic range image to obtain a target image.
In some embodiments, the image processing comprises: at least one of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing.
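The listed operations can be applied as an ordered chain, each stage consuming the previous stage's output. The following sketch illustrates this chaining with two assumed, simplified stages (a black level of 64 and a Reinhard-style tone curve); the function names and parameter values are illustrative only:

```python
import numpy as np

def black_level_correct(img: np.ndarray, black: int = 64) -> np.ndarray:
    """Subtract an assumed sensor black level, clamping at zero."""
    return np.clip(img.astype(np.int32) - black, 0, None)

def global_tone_map(img: np.ndarray, max_in: float = 4095.0) -> np.ndarray:
    """Illustrative Reinhard-style x/(1+x) global tone curve,
    producing values in [0, 1)."""
    x = img / max_in
    return x / (1.0 + x)

def process(img: np.ndarray, ops) -> np.ndarray:
    """Apply a sequence of image-processing stages in order."""
    for op in ops:
        img = op(img)
    return img
```

In such a design, any subset of the named operations ("at least one of") can be selected simply by choosing which callables appear in the list.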
In some embodiments, the high dynamic range image processing system further comprises a storage module, and the high dynamic range image processing method further comprises: storing the image after the image processing in a storage module; and acquiring the image after the image processing from the storage module, and performing fusion algorithm processing on the image after the image processing to obtain a target image.
The implementation process of the high dynamic range image processing method according to any of the above embodiments is the same as the implementation process of the high dynamic range image processing system 100 for obtaining a high dynamic range image, and will not be described herein.
Referring to fig. 30, the present application also provides a non-volatile computer-readable storage medium 400 containing a computer program. The computer program, when executed by the processor 60, causes the processor 60 to perform the high dynamic range image processing method of any of the above embodiments.
For example, referring to fig. 1, fig. 2, fig. 29, and fig. 30, the computer program, when executed by the processor 60, causes the processor 60 to perform the following steps:
01: exposing the pixel array 11, wherein, for a plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time that is less than the first exposure time, and at least one full-color photosensitive pixel is exposed for a third exposure time that is less than the first exposure time; wherein first color information generated by the single-color photosensitive pixels exposed for the first exposure time yields a first color original image, second color information generated by the single-color photosensitive pixels exposed for the second exposure time yields a second color original image, and the full-color photosensitive pixels exposed for the third exposure time generate a first full-color original image;
02: carrying out high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels which are arranged in a Bayer array;
03: processing the first full-color original image and the first color high dynamic range image to obtain a second color high dynamic range image; and
04: and carrying out fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (28)

1. A high dynamic range image processing system is characterized by comprising an image sensor, a high dynamic range image processing module, an image processor and an image fusion module;
the image sensor includes a pixel array including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels, the pixel array including minimal repeating units, each of the minimal repeating units including a plurality of sub-units, each of the sub-units including a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels, the pixel array in the image sensor being exposed, wherein for a plurality of photosensitive pixels in the same sub-unit, at least one of the single-color photosensitive pixels is exposed for a first exposure time, at least one of the single-color photosensitive pixels is exposed for a second exposure time that is less than the first exposure time, and at least one of the panchromatic photosensitive pixels is exposed for a third exposure time that is less than the first exposure time; wherein, the first color information generated by the single-color photosensitive pixels exposed with the first exposure time obtains a first color original image, the second color information generated by the single-color photosensitive pixels exposed with the second exposure time obtains a second color original image, and the panchromatic photosensitive pixels exposed with the third exposure time generates a first panchromatic original image;
the high dynamic range image processing module is used for performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels, and the plurality of first color image pixels are arranged in a Bayer array;
the image processor is configured to process the first color high dynamic range image and the first panchromatic raw image to obtain a second color high dynamic range image;
the image fusion module is used for carrying out fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image;
wherein the image fusion module is further configured to:
performing box filtering on the first color high dynamic range image to obtain a plurality of first intermediate images corresponding to a plurality of color image pixels of different colors, and performing box filtering on the second color high dynamic range image to obtain a plurality of second intermediate images corresponding to the plurality of color image pixels of different colors;
dividing the pixel value of each color image pixel in the first intermediate image by the pixel value of the color image pixel corresponding to the second intermediate image of the corresponding color to obtain the pixel value of the color image pixel corresponding to the third intermediate image of the corresponding color;
multiplying the pixel value of each color image pixel in the third intermediate image by the pixel value of the color image pixel corresponding to the second intermediate image to obtain a single-color image of the corresponding color;
and fusing a plurality of single-color images to obtain a target image.
2. The high dynamic range image processing system of claim 1, wherein the high dynamic range image processing module comprises a high dynamic range image processing unit and a luminance mapping unit;
the high dynamic range image processing unit is used for carrying out high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image;
the brightness mapping unit is used for performing brightness mapping on the third color high dynamic range image to obtain the first color high dynamic range image.
3. The high dynamic range image processing system of claim 1, wherein the high dynamic range image processing module comprises a high dynamic range image processing unit, a lens shading correction unit, and a statistics unit;
the high dynamic range image processing unit is used for carrying out high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image;
the lens shading correction unit is used for correcting the third color high dynamic range image to obtain a color high dynamic range corrected image;
the statistical unit is used for processing the color high dynamic range corrected image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
4. The high dynamic range image processing system according to claim 1, wherein the high dynamic range image processing module comprises a statistical unit for processing the first color raw image and the second color raw image to obtain statistical data, the statistical data being provided to the image processor for automatic exposure processing and/or automatic white balance processing.
5. The high dynamic range image processing system of claim 1 wherein a portion of said panchromatic photosensitive pixels in the same subunit are exposed to a fourth exposure time and the remaining panchromatic photosensitive pixels are exposed to the third exposure time, said fourth exposure time being less than or equal to said first exposure time and greater than said third exposure time, second panchromatic information generated by said panchromatic photosensitive pixels exposed to said fourth exposure time yielding a second panchromatic original image;
the high dynamic range image processing module is used for carrying out high dynamic range image processing on the first color original image and the second color original image so as to obtain a first color high dynamic range image; performing high dynamic range image processing on the first full-color original image and the second full-color original image to obtain a first full-color high dynamic range image;
the image processor is for processing the first color high dynamic range image and a first panchromatic high dynamic range image to obtain a second color high dynamic range image.
6. The high dynamic range image processing system of claim 5, wherein the high dynamic range image processing module further comprises a high dynamic range image processing unit and a brightness mapping unit;
the high dynamic range image processing unit is used for fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first panchromatic original image and the second panchromatic original image into a second panchromatic high dynamic range image;
the brightness mapping unit is configured to perform brightness mapping on the second color high dynamic range image to obtain the first color high dynamic range image, and perform brightness mapping on the second panchromatic high dynamic range image to obtain the first panchromatic high dynamic range image.
7. The high dynamic range image processing system of claim 5, wherein the high dynamic range image processing module further comprises a high dynamic range image processing unit, a lens shading correction unit, and a statistics unit;
the high dynamic range image processing unit is used for fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first panchromatic original image and the second panchromatic original image into a second panchromatic high dynamic range image;
the lens shading correction unit is used for correcting the second color high dynamic range image to obtain a color high dynamic range correction image, and correcting the second panchromatic high dynamic range image to obtain a panchromatic high dynamic range correction image;
the statistical unit is used for processing the color high dynamic range correction image and the panchromatic high dynamic range correction image to obtain statistical data, and the statistical data is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
8. The high dynamic range image processing system of claim 1, wherein the image processor is further configured to:
adding pixel values of a plurality of color image pixels on the first color high dynamic range image to pixel values of panchromatic image pixels corresponding to the first panchromatic original image to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
9. The high dynamic range image processing system of claim 1, wherein the image processor is further configured to:
performing brightness alignment processing on the first panchromatic original image and the first color high dynamic range image to obtain a brightness-aligned first panchromatic original image;
adding pixel values of a plurality of color image pixels on the first color high dynamic range image to pixel values of panchromatic image pixels corresponding to the luminance-aligned first panchromatic original image to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
10. The high dynamic range image processing system of claim 5, wherein the image processor is further configured to:
adding pixel values of a plurality of color image pixels on the first color high dynamic range image to pixel values of panchromatic image pixels corresponding to the first panchromatic high dynamic range image to obtain pixel values of color image pixels corresponding to the second color high dynamic range image.
11. The high dynamic range image processing system of claim 1 or 5, wherein said image processor is further configured to:
performing image processing on a first color high dynamic range image to obtain a processed first color high dynamic range image; and
performing image processing on a second color high dynamic range image to obtain a processed second color high dynamic range image;
the image fusion module is further configured to perform fusion algorithm processing on the processed first color high dynamic range image and the processed second color high dynamic range image to obtain the target image.
12. The high dynamic range image processing system of claim 11, wherein the image processing comprises:
at least one of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing.
13. The high dynamic range image processing system of claim 11, further comprising a storage module,
the storage module is used for storing the image after the image processing and transmitting the image after the image processing to the image fusion module for fusion algorithm processing so as to obtain the target image.
14. A high dynamic range image processing method is used for a high dynamic range image processing system, and is characterized in that the high dynamic range image processing system comprises an image sensor, a high dynamic range image processing module, an image processor and an image fusion module;
the image sensor includes a pixel array including a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels having a narrower spectral response than the panchromatic photosensitive pixels, the pixel array including minimal repeating units, each of the minimal repeating units including a plurality of sub-units, each of the sub-units including a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels, the high dynamic range image processing method including:
exposing the pixel array, wherein, for a plurality of photosensitive pixels in the same subunit, at least one single-color photosensitive pixel is exposed for a first exposure time, at least one single-color photosensitive pixel is exposed for a second exposure time less than the first exposure time, and at least one panchromatic photosensitive pixel is exposed for a third exposure time less than the first exposure time; wherein first color information generated by the single-color photosensitive pixels exposed for the first exposure time yields a first color original image, second color information generated by the single-color photosensitive pixels exposed for the second exposure time yields a second color original image, and first panchromatic information generated by the panchromatic photosensitive pixels exposed for the third exposure time yields a first panchromatic original image;
performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image, wherein the first color high dynamic range image comprises a plurality of first color image pixels, and the plurality of first color image pixels are arranged in a Bayer array;
processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image; and
performing fusion algorithm processing on the first color high dynamic range image and the second color high dynamic range image to obtain a target image;
wherein the performing a fusion algorithm process on the first color high dynamic range image and the second color high dynamic range image to obtain a target image comprises:
performing box filtering on the first color high dynamic range image to obtain a plurality of first intermediate images corresponding to color image pixels of different colors, and performing box filtering on the second color high dynamic range image to obtain a plurality of second intermediate images corresponding to the color image pixels of different colors;
dividing the pixel value of each color image pixel in the first intermediate image by the pixel value of the corresponding color image pixel in the second intermediate image of the corresponding color to obtain the pixel value of the corresponding color image pixel in the third intermediate image of the corresponding color;
multiplying the pixel value of each color image pixel in the third intermediate image by the pixel value of the corresponding color image pixel in the second intermediate image to obtain a single-color image of the corresponding color; and
fusing the plurality of single-color images to obtain the target image.
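Read as an image-processing procedure, the box-filter/divide/multiply steps of claim 14 resemble a ratio-based (guided) fusion. A minimal sketch for one color plane follows; the `box_filter` helper, the 3×3 kernel size, and the choice to multiply the ratio by the full-resolution second image (the claim's wording is ambiguous on which image the final multiplication uses) are assumptions for illustration, not part of the claim.

```python
import numpy as np

def box_filter(img, k=3):
    """k x k mean (box) filter with edge replication (kernel size assumed)."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# One color plane of the first and second color high dynamic range images
# (values are illustrative only).
rng = np.random.default_rng(0)
first = rng.uniform(0.5, 1.5, (8, 8))
second = rng.uniform(0.5, 1.5, (8, 8))

first_intermediate = box_filter(first)             # local mean of first image
second_intermediate = box_filter(second)           # local mean of second image
third_intermediate = first_intermediate / (second_intermediate + 1e-6)
# Assumed reading: multiplying the ratio by the full-resolution second image
# keeps the first image's local brightness and the second image's detail.
single_color = third_intermediate * second
```

Under this reading the scheme transfers high-frequency detail from the second image while preserving the first image's low-frequency color/brightness, which is the usual motivation for a box-filter ratio fusion.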
15. The method according to claim 14, wherein said performing high dynamic range image processing on the first color original image and the second color original image to obtain a first color high dynamic range image comprises:
performing high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image; and
performing luminance mapping on the third color high dynamic range image to obtain the first color high dynamic range image.
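The claim names the luminance mapping step without specifying a curve. A Reinhard-style global operator is one plausible choice; the function name, the 12-bit input and 10-bit output ranges, and the curve itself are assumptions for illustration.

```python
import numpy as np

def luminance_map(hdr, max_in=4095.0, max_out=1023.0):
    """Compress a wide-range (e.g. 12-bit) HDR image into a narrower
    (e.g. 10-bit) range with a Reinhard-style global curve.
    The curve and bit depths are assumptions; the claim only names
    'luminance mapping'."""
    x = np.clip(hdr, 0.0, None) / max_in   # normalize input to [0, 1]
    y = 2.0 * x / (1.0 + x)                # monotonic curve with y(1) = 1
    return y * max_out

mapped = luminance_map(np.array([0.0, 1024.0, 4095.0]))
```

Any monotonic compressive curve would serve the same structural role here: shadows are lifted relative to highlights so the wide-range fusion result fits the downstream pipeline's narrower range.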
16. The high dynamic range image processing method according to claim 14, further comprising:
performing high dynamic range image processing on the first color original image and the second color original image to obtain a third color high dynamic range image;
correcting the third color high dynamic range image to obtain a color high dynamic range corrected image;
processing the color high dynamic range corrected image to obtain statistical data, the statistical data being provided to the image processor for automatic exposure processing and/or automatic white balance processing.
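The "statistical data" for automatic exposure and automatic white balance is left abstract in the claim. A common concrete form is a grid of per-channel block means, from which the image processor derives a global brightness figure and gray-world white-balance gains; the grid layout, function name, and gain formula below are assumptions.

```python
import numpy as np

def awb_ae_stats(r, g, b, grid=4):
    """Per-channel block means over a grid x grid tiling (layout assumed)."""
    h, w = r.shape
    bh, bw = h // grid, w // grid
    stats = np.zeros((grid, grid, 3))
    for i in range(grid):
        for j in range(grid):
            sl = np.s_[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            stats[i, j] = [r[sl].mean(), g[sl].mean(), b[sl].mean()]
    return stats

# Illustrative flat-field planes of a corrected image.
r = np.full((8, 8), 0.4)
g = np.full((8, 8), 0.5)
b = np.full((8, 8), 0.25)

stats = awb_ae_stats(r, g, b)
means = stats.mean(axis=(0, 1))   # global channel means, usable for AE
awb_gains = means[1] / means      # gray-world gains relative to green
```

The block structure (rather than a single global mean) lets the processor weight regions differently, e.g. center-weighted exposure metering.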
17. The high dynamic range image processing method according to claim 14, further comprising:
the first color raw image, the second color raw image, and the first panchromatic raw image are processed to obtain statistical data, which is provided to the image processor for automatic exposure processing and/or automatic white balance processing.
18. The high dynamic range image processing method of claim 14, wherein a portion of the panchromatic photosensitive pixels in the same subunit are exposed for a fourth exposure time and the remaining panchromatic photosensitive pixels are exposed for the third exposure time, the fourth exposure time being less than or equal to the first exposure time and greater than the third exposure time, and second panchromatic information generated by the panchromatic photosensitive pixels exposed for the fourth exposure time yields a second panchromatic original image; the high dynamic range image processing method further comprising:
performing high dynamic range image processing on the first color original image and the second color original image to obtain the first color high dynamic range image; performing high dynamic range image processing on the first panchromatic original image and the second panchromatic original image to obtain a first panchromatic high dynamic range image; and
processing the first color high dynamic range image and the first panchromatic high dynamic range image to obtain the second color high dynamic range image.
19. The high dynamic range image processing method according to claim 18, wherein said performing high dynamic range image processing on the first color original image and the second color original image to obtain the first color high dynamic range image, and performing high dynamic range image processing on the first panchromatic original image and the second panchromatic original image to obtain the first panchromatic high dynamic range image, comprises:
fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first panchromatic original image and the second panchromatic original image into a second panchromatic high dynamic range image; and
performing luminance mapping on the second color high dynamic range image to obtain the first color high dynamic range image, and performing luminance mapping on the second panchromatic high dynamic range image to obtain the first panchromatic high dynamic range image.
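The fusion of a long and a short exposure into one high dynamic range image is stated without a formula. One common scheme, sketched here under assumptions (the function name, a 10-bit saturation threshold, and a fixed 4x exposure ratio are all illustrative), replaces saturated long-exposure pixels with ratio-scaled short-exposure pixels.

```python
import numpy as np

def merge_exposures(long_img, short_img, ratio=4.0, sat=0.95 * 1023):
    """Two-exposure HDR merge (assumed scheme, not from the patent):
    where the long exposure is near saturation, substitute the short
    exposure scaled by the exposure-time ratio."""
    scaled_short = short_img * ratio                 # bring short to long scale
    return np.where(long_img >= sat, scaled_short, long_img)

# One well-exposed pixel and one saturated pixel (10-bit long exposure).
long_img = np.array([[100.0, 1023.0]])
short_img = np.array([[25.0, 300.0]])
hdr = merge_exposures(long_img, short_img)
```

Production pipelines typically blend smoothly near the threshold instead of switching hard, but the hard switch shows the core idea: the short exposure supplies valid data exactly where the long exposure has clipped.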
20. The high dynamic range image processing method according to claim 18, further comprising:
fusing the first color original image and the second color original image into a second color high dynamic range image, and fusing the first panchromatic original image and the second panchromatic original image into a second panchromatic high dynamic range image;
correcting the second color high dynamic range image to obtain a color high dynamic range corrected image, correcting the second panchromatic high dynamic range image to obtain a panchromatic high dynamic range corrected image;
processing the color high dynamic range corrected image and the panchromatic high dynamic range corrected image to obtain statistical data, the statistical data being provided to the image processor for automatic exposure processing and/or automatic white balance processing.
21. The high dynamic range image processing method of claim 14, wherein said processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image comprises:
adding pixel values of a plurality of color image pixels in the first color high dynamic range image to pixel values of the corresponding panchromatic image pixels in the first panchromatic original image to obtain pixel values of the corresponding color image pixels in the second color high dynamic range image.
22. The high dynamic range image processing method of claim 14, wherein said processing the first color high dynamic range image and the first panchromatic original image to obtain a second color high dynamic range image comprises:
performing brightness alignment processing on the first panchromatic original image and the first color high dynamic range image to obtain a brightness-aligned first panchromatic original image; and
adding pixel values of a plurality of color image pixels in the first color high dynamic range image to pixel values of the corresponding panchromatic image pixels in the brightness-aligned first panchromatic original image to obtain pixel values of the corresponding color image pixels in the second color high dynamic range image.
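The two steps of claim 22 can be sketched as follows. The mean-matching form of "brightness alignment" and the function name are assumptions, since the claim does not define how the alignment is computed; the addition step follows the claim directly.

```python
import numpy as np

def fuse_color_panchromatic(color_hdr, panchromatic):
    """Assumed sketch of claim 22: (1) scale the panchromatic image so its
    mean matches the color image's mean (one possible 'brightness
    alignment'); (2) add the aligned panchromatic values to the color
    pixel values, boosting signal in dark scenes."""
    gain = color_hdr.mean() / max(panchromatic.mean(), 1e-6)
    aligned = panchromatic * gain      # brightness-aligned panchromatic image
    return color_hdr + aligned         # per-pixel addition per the claim

# Illustrative flat images: panchromatic pixels collect roughly twice the
# light of color pixels, so alignment halves them before the addition.
color_hdr = np.full((4, 4), 200.0)
pan = np.full((4, 4), 400.0)
out = fuse_color_panchromatic(color_hdr, pan)
```

The point of aligning before adding is that panchromatic pixels, having a wider spectral response, otherwise dominate the sum and shift the overall brightness.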
23. The high dynamic range image processing method of claim 18, wherein said processing the first color high dynamic range image and the first panchromatic high dynamic range image to obtain the second color high dynamic range image comprises:
adding pixel values of a plurality of color image pixels in the first color high dynamic range image to pixel values of the corresponding panchromatic image pixels in the first panchromatic high dynamic range image to obtain pixel values of the corresponding color image pixels in the second color high dynamic range image.
24. The high dynamic range image processing method according to claim 14 or 19, further comprising:
performing image processing on the first color high dynamic range image to obtain a processed first color high dynamic range image; and
performing image processing on the second color high dynamic range image to obtain a processed second color high dynamic range image;
the performing a fusion algorithm process on the first color high dynamic range image and the second color high dynamic range image to obtain a target image includes:
performing fusion algorithm processing on the processed first color high dynamic range image and the processed second color high dynamic range image to obtain the target image.
25. The high dynamic range image processing method of claim 24, wherein the image processing comprises:
at least one of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing.
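Two of the listed image-processing steps can be illustrated concretely. The black level of 64 and the quadratic radial shading model below are assumptions for illustration, not values from the patent.

```python
import numpy as np

def black_level_correction(raw, black_level=64):
    """Subtract the sensor's black (pedestal) level and clamp at zero.
    The level 64 is an assumed example value."""
    return np.clip(raw.astype(float) - black_level, 0, None)

def lens_shading_correction(img, max_gain=2.0):
    """Compensate corner light falloff with a radial gain map
    (simplified quadratic model; real pipelines use calibrated grids)."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.sqrt(((yy - cy) / max(cy, 1)) ** 2 + ((xx - cx) / max(cx, 1)) ** 2)
    gain = 1.0 + (max_gain - 1.0) * np.clip(r / np.sqrt(2.0), 0, 1) ** 2
    return img * gain

# Typical ordering: black level first, then shading, as in the claim's list.
raw = np.full((4, 4), 100.0)
corrected = lens_shading_correction(black_level_correction(raw))
```

The listed order matters in practice: the black pedestal must be removed before multiplicative corrections such as lens shading, or the gains amplify the pedestal in the corners.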
26. The high dynamic range image processing method of claim 24, wherein the high dynamic range image processing system further comprises a storage module, the high dynamic range image processing method further comprising:
storing the image after the image processing in the storage module; and
acquiring the image after the image processing from the storage module, and performing fusion algorithm processing on the acquired image to obtain the target image.
27. An electronic device, comprising:
a lens;
a housing; and
the high dynamic range image processing system of any one of claims 1 to 13, wherein the lens and the high dynamic range image processing system are integrated with the housing, and the lens cooperates with an image sensor of the high dynamic range image processing system to form an image.
28. A non-transitory computer-readable storage medium containing a computer program which, when executed by a processor, causes the processor to perform the high dynamic range image processing method of any one of claims 14 to 26.
CN202010808125.2A 2020-08-12 2020-08-12 High dynamic range image processing system and method, electronic device, and readable storage medium Active CN111970459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010808125.2A CN111970459B (en) 2020-08-12 2020-08-12 High dynamic range image processing system and method, electronic device, and readable storage medium

Publications (2)

Publication Number Publication Date
CN111970459A CN111970459A (en) 2020-11-20
CN111970459B true CN111970459B (en) 2022-02-18

Family

ID=73365318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010808125.2A Active CN111970459B (en) 2020-08-12 2020-08-12 High dynamic range image processing system and method, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN111970459B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113810622B (en) * 2021-08-12 2022-09-16 荣耀终端有限公司 Image processing method and device

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104079904A (en) * 2014-07-17 2014-10-01 广东欧珀移动通信有限公司 Color image generating method and device
CN107018298A (en) * 2015-12-24 2017-08-04 三星电子株式会社 Imaging device, electronic equipment and the method for obtaining image by it

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP2012257193A (en) * 2011-05-13 2012-12-27 Sony Corp Image processing apparatus, image pickup apparatus, image processing method, and program
CN111294522A (en) * 2019-02-28 2020-06-16 北京展讯高科通信技术有限公司 HDR image imaging method, device and computer storage medium
CN111479071B (en) * 2020-04-03 2021-05-07 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN111491110B (en) * 2020-04-17 2021-09-17 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and storage medium
CN111491111B (en) * 2020-04-20 2021-03-26 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium


Similar Documents

Publication Publication Date Title
CN111479071B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111405204B (en) Image acquisition method, imaging device, electronic device, and readable storage medium
CN111432099B (en) Image sensor, processing system and method, electronic device, and storage medium
CN111491110B (en) High dynamic range image processing system and method, electronic device, and storage medium
CN111491111B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN112261391B (en) Image processing method, camera assembly and mobile terminal
CN111757006B (en) Image acquisition method, camera assembly and mobile terminal
CN110740272B (en) Image acquisition method, camera assembly and mobile terminal
CN111314592B (en) Image processing method, camera assembly and mobile terminal
CN111586375B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111385543B (en) Image sensor, camera assembly, mobile terminal and image acquisition method
CN111050041B (en) Image sensor, control method, camera assembly and mobile terminal
CN110971799B (en) Control method, camera assembly and mobile terminal
CN111741221B (en) Image acquisition method, camera assembly and mobile terminal
CN111970460B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN112738493B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN111970459B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111835971B (en) Image processing method, image processing system, electronic device, and readable storage medium
CN111970461B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN114073068B (en) Image acquisition method, camera component and mobile terminal
CN112351172A (en) Image processing method, camera assembly and mobile terminal
CN112235485B (en) Image sensor, image processing method, imaging device, terminal, and readable storage medium
CN112738494A (en) Image processing method, image processing system, terminal device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant