CN113676636A - Method and device for generating high dynamic range image, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113676636A
Authority
CN
China
Prior art keywords
pixel
image
pixel point
array
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110937225.XA
Other languages
Chinese (zh)
Other versions
CN113676636B (en)
Inventor
杨鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110937225.XA priority Critical patent/CN113676636B/en
Publication of CN113676636A publication Critical patent/CN113676636A/en
Application granted granted Critical
Publication of CN113676636B publication Critical patent/CN113676636B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/772Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising A/D, V/T, V/F, I/T or I/F converters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The application relates to a method and an apparatus for generating a high dynamic range image, an electronic device, and a storage medium. The method comprises the following steps: exposing each pixel point in a pixel point array to obtain a first intermediate image and a second intermediate image, each of which comprises single-color photosensitive pixels and panchromatic photosensitive pixels; each pixel in the first intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points in the pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points for a second exposure duration, and the first exposure duration is longer than the second exposure duration; and generating a high dynamic range image based on the first intermediate image and the second intermediate image, the arrangement of the pixels in the high dynamic range image matching a Bayer array. With this method, the compatibility between an image sensor having panchromatic pixel points and an image processor that processes Bayer-array images can be improved.

Description

Method and device for generating high dynamic range image, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for generating a high dynamic range image, an electronic device, and a computer-readable storage medium.
Background
More and more electronic devices are equipped with a camera to provide a photographing function. An image sensor is arranged in the camera, and images are collected through the image sensor. With the development of computer technology, various types of image sensors have appeared, such as RGBW (Red, Green, Blue, White) based CMOS image sensors and CMYW (Cyan, Magenta, Yellow, White) based image sensors.
However, an image processor in an electronic device generally processes images based on a Bayer array (Bayer or Quad Bayer), so there is a compatibility problem between such image sensors and the image processor.
Disclosure of Invention
Embodiments of the present application provide a method and an apparatus for generating a high dynamic range image, an electronic device, and a computer-readable storage medium, which can improve compatibility between an image sensor having panchromatic pixel points and an image processor that processes Bayer-array images.
A generation method of a high dynamic range image is applied to an electronic device comprising an image sensor, wherein the image sensor comprises a pixel point array, the pixel point array comprises minimum pixel point repeating units, each minimum pixel point repeating unit comprises a plurality of pixel point sub-units, and each pixel point sub-unit comprises a plurality of single-color pixel points and a plurality of panchromatic pixel points; the method comprises the following steps:
exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image; each pixel in the first intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points in the pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration;
generating a high dynamic range image based on the first intermediate image and the second intermediate image; the arrangement of each pixel in the high dynamic range image is matched with a Bayer array.
A generation apparatus of a high dynamic range image, applied to an electronic device including an image sensor, the image sensor including a pixel point array including minimum pixel point repeating units, each of the minimum pixel point repeating units including a plurality of pixel point sub-units, each of the pixel point sub-units including a plurality of single-color pixel points and a plurality of panchromatic pixel points; the device comprises:
the acquisition module is used for exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image; each pixel in the first intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points in the pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by merging photosensitive data obtained by exposing a plurality of same-type pixel points in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration;
a fusion module for generating a high dynamic range image based on the first intermediate image and the second intermediate image; the arrangement of each pixel in the high dynamic range image is matched with a Bayer array.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program, the computer program, when executed by the processor, causing the processor to perform the steps of the method for generating a high dynamic range image as described above.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method as described above.
According to the method and device for generating the high dynamic range image, the electronic device and the computer readable storage medium, the image sensor comprises the pixel point array, the pixel point array comprises the minimum pixel point repeating unit, each minimum pixel point repeating unit comprises a plurality of pixel point sub-units, each pixel point sub-unit comprises a plurality of single-color pixel points and a plurality of panchromatic pixel points, and then higher light incoming quantity can be received through the panchromatic pixel points. Then, the electronic device exposes each pixel point in the pixel point array, so that the light incoming amount received by the whole pixel point array can be increased, more photosensitive data can be obtained, that is, the first intermediate image and the second intermediate image include more photosensitive data, and therefore, a more accurate high dynamic range image can be generated based on the first intermediate image and the second intermediate image. Moreover, the arrangement of each pixel in the high dynamic range image is matched with the Bayer array, so that the high dynamic range image output by the image sensor which collects more photosensitive data can also be transmitted to the image processor for image processing, and the compatibility between the image sensor with panchromatic pixel points and the image processor for processing the Bayer array image is improved.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of an electronic device in one embodiment;
FIG. 2 is an exploded view of an image sensor in one embodiment;
FIG. 3 is a flow diagram of a method for generating a high dynamic range image in one embodiment;
FIG. 4 is a schematic diagram of the arrangement of the minimum color filter repeating units in one embodiment;
FIG. 5 is a diagram illustrating an image format output by an image sensor in one embodiment;
FIG. 6 is a diagram showing an image format output from an image sensor in another embodiment;
FIG. 7 is a diagram showing an image format output from an image sensor in another embodiment;
FIG. 8 is a schematic diagram of a pixel point array exposed in a column-interval manner in one embodiment;
FIG. 9 is a diagram illustrating exposure durations of pixel points in a pixel point array according to an embodiment;
FIG. 10 is a schematic illustration of the first intermediate image and the second intermediate image obtained after exposing a pixel point array in a column-interval manner in one embodiment;
FIG. 11 is a schematic illustration of the first intermediate image and the second intermediate image obtained after exposing a pixel point array in one embodiment;
FIG. 12 is a flow diagram for deriving a high dynamic range image from a first intermediate map and a second intermediate map in one embodiment;
FIG. 13 is a block diagram showing a configuration of a high dynamic range image generating apparatus according to an embodiment;
fig. 14 is a schematic diagram of an internal structure of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
As shown in fig. 1, the electronic device includes a camera 102, and the camera 102 includes an image sensor including a color filter array and a pixel dot array.
The electronic device is described below by taking a mobile phone as an example, but the electronic device is not limited to a mobile phone. The electronic device comprises a camera, a processor and a housing. The camera and the processor are arranged in the housing, which can also hold functional modules such as the power supply device and the communication device of the electronic device; the housing thus protects these functional modules against dust, drops and water.
The camera may be a front camera, a rear camera, a side camera, an under-screen camera, etc., without limitation. The camera includes a lens and an image sensor; when the camera shoots an image, light passes through the lens and reaches the image sensor, and the image sensor converts the light signal irradiated on it into an electrical signal.
As shown in fig. 2, the image sensor includes a microlens array 21, a color filter array 22, and a pixel point array 23.
The micro lens array 21 comprises a plurality of micro lenses 211, the micro lenses 211, color filters in the color filter array 22 and pixel points in the pixel point array 23 are arranged in a one-to-one correspondence mode, the micro lenses 211 are used for gathering incident light, the gathered light can penetrate through the corresponding color filters and then is projected to the pixel points to be received by the corresponding pixel points, and the pixel points convert the received light into electric signals.
The color filter array 22 includes a plurality of minimum color filter repeating units 221. Each minimum color filter repeating unit 221 includes a plurality of color filter sub-units 222. In the present embodiment, the minimum color filter repeating unit 221 includes 4 color filter sub-units 222, and the 4 color filter sub-units 222 are arranged in a matrix. Each color filter subunit 222 includes a plurality of single-color filters 223 and a plurality of panchromatic filters 224. Different single color filters 223 are also included in different color filter subunits 222. For example, the single color filter included in the filter subunit a is a red filter, and the single color filter included in the filter subunit B is a green filter.
Similarly, the pixel array 23 includes minimum pixel repeating units 231, each minimum pixel repeating unit 231 includes a plurality of pixel sub-units 232, and each pixel sub-unit 232 includes a plurality of single-color pixels 233 and a plurality of panchromatic pixels 234. Each pixel point sub-unit 232 corresponds to one color filter sub-unit 222, and the pixel points in each pixel point sub-unit 232 also correspond to the color filters in the corresponding color filter sub-units 222 one to one. The light transmitted by the panchromatic filter 224 is projected to the panchromatic pixel points 234, and panchromatic sensitization data can be obtained; the light transmitted through the single color filter 223 is projected to the single color pixel 233, and single color sensitive data can be obtained. In the present embodiment, the minimum pixel point repeating unit 231 includes 4 pixel point sub-units 232, and the 4 pixel point sub-units 232 are arranged in a matrix.
FIG. 3 is a flow diagram of a method for generating a high dynamic range image in one embodiment. The method for generating a high dynamic range image in this embodiment is described by taking the electronic device in FIG. 1 as an example. As shown in FIG. 3, the method for generating a high dynamic range image includes steps 302 to 304.
Step 302, exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image; the first intermediate image and the second intermediate image respectively comprise single-color photosensitive pixels and panchromatic photosensitive pixels, each pixel in the first intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-type pixels in a pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by combining data obtained by exposing a plurality of same-type pixels in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration.
The first intermediate image is an image formed by the pixels exposed for the first exposure duration in the pixel point array, and the second intermediate image is an image formed by the pixels exposed for the second exposure duration. The first exposure duration is longer than the second exposure duration; that is, the first exposure duration is a long exposure and the second exposure duration is a short exposure. Both durations can be set as required. For example, the first exposure duration is 0.02 ms (milliseconds) and the second exposure duration is 0.01 ms.
A single-color filter is a filter that transmits light of a single color. The single-color filters may specifically include one or more of a red (R) filter, a green (G) filter, a blue (B) filter, and the like. The red filter transmits red light, the green filter transmits green light, and the blue filter transmits blue light.
A panchromatic filter is a filter that transmits light of all colors. The panchromatic filter may be a white (W) filter. It will be appreciated that a panchromatic filter can transmit white light, and white light is the mixture of light from all visible wavelength bands; that is, a panchromatic filter can transmit light of all colors.
Correspondingly, a single-color pixel point is a pixel point that receives light of a single color. The single-color pixel points may specifically include one or more of red (R) pixel points, green (G) pixel points, blue (B) pixel points, and the like. Panchromatic pixel points are pixel points that receive light of all colors, and may be white (W) pixel points.
The first intermediate image includes single-color photosensitive pixels and full-color photosensitive pixels, and the second intermediate image also includes single-color photosensitive pixels and full-color photosensitive pixels. Then, the pixel points exposed in the pixel point array in the first exposure time period include single-color photosensitive pixel points and panchromatic photosensitive pixel points, and the pixel points exposed in the second exposure time period also include single-color photosensitive pixel points and panchromatic photosensitive pixel points.
Photosensitive data is the data obtained by converting the optical signal into an electrical signal after a pixel point receives light. The photosensitive data may include brightness values, exposure durations, gray-scale values, and the like.
Specifically, the electronic device controls the image sensor to expose each pixel point in the pixel point array for its corresponding preset exposure duration, so as to obtain the photosensitive data of each pixel point. The preset exposure durations include the first exposure duration and the second exposure duration: exposing a pixel point for the first exposure duration yields first photosensitive data, and exposing a pixel point for the second exposure duration yields second photosensitive data. The electronic device merges the first photosensitive data obtained by a plurality of same-type pixel points according to a preset acquisition order to obtain the pixels of the first intermediate image, and merges the second photosensitive data obtained by a plurality of same-type pixel points to obtain the pixels of the second intermediate image. The preset acquisition order may be row-by-row, column-by-column, sub-unit-by-sub-unit, or the like.
In one embodiment, the electronic device may add the first photosensitive data obtained by the plurality of same-type pixel points to obtain a pixel in the first intermediate image. In another embodiment, the electronic device may average the first photosensitive data obtained by the plurality of same-type pixel points to obtain a pixel in the first intermediate image. In other embodiments, the electronic device may adopt other merging manners, which are not limited herein.
Similarly, the manner in which the electronic device merges the second photosensitive data obtained by the multiple pixels of the same type is similar to the manner in which the first photosensitive data is merged, and is not described herein again.
For example, let the first exposure duration be a and the second exposure duration be b. In a row-by-row acquisition order, the photosensitive data obtained by exposing every 2 green pixel points for duration a are merged to obtain a green pixel, the photosensitive data of every 2 blue pixel points exposed for duration a are merged to obtain a blue pixel, the photosensitive data of every 2 red pixel points exposed for duration a are merged to obtain a red pixel, and the photosensitive data of every 2 white pixel points exposed for duration a are merged to obtain a white pixel; the merged pixels are arranged in the corresponding acquisition order to obtain the first intermediate image. Likewise, the photosensitive data obtained by exposing every 2 green, blue, red, and white pixel points for duration b are merged to obtain green, blue, red, and white pixels respectively, and the merged pixels are arranged in the corresponding acquisition order to obtain the second intermediate image.
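The splitting-and-merging step above can be sketched in a few lines of numpy. This is a minimal illustration only: the pairing of adjacent same-exposure samples within a row and the use of averaging (rather than addition) are assumptions for the example, not the patent's exact readout order.

```python
import numpy as np

def build_intermediate_images(raw, exposure_mask):
    """Split one raw readout into the two intermediate images.

    raw           : 2-D array of photosensitive data from the pixel point array
    exposure_mask : same-shape array of 'L' (first/long exposure) and
                    'S' (second/short exposure) flags
    Adjacent pairs of same-exposure samples in each row are merged by
    averaging (addition would be the other merge option mentioned here).
    """
    rows = raw.shape[0]
    # boolean indexing keeps row-major order, so per-row reshape is valid
    long_vals = raw[exposure_mask == "L"].reshape(rows, -1)
    short_vals = raw[exposure_mask == "S"].reshape(rows, -1)
    first = long_vals.reshape(rows, -1, 2).mean(axis=2)    # first intermediate image
    second = short_vals.reshape(rows, -1, 2).mean(axis=2)  # second intermediate image
    return first, second

# Toy 2x4 readout in column-interval mode (L S L S):
raw = np.array([[10.0, 2.0, 20.0, 6.0],
                [30.0, 4.0, 40.0, 8.0]])
mask = np.array([["L", "S", "L", "S"],
                 ["L", "S", "L", "S"]])
first, second = build_intermediate_images(raw, mask)
# first -> [[15.], [35.]]; second -> [[4.], [6.]]
```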
Step 304, generating a high dynamic range image based on the first intermediate map and the second intermediate map; the arrangement of each pixel in the high dynamic range image matches the bayer array.
A High Dynamic Range (HDR) image is an image that can provide a larger dynamic range and more image detail than an ordinary image. The Bayer array simulates the sensitivity of human eyes to color, converting gray-scale information into color information through an arrangement of 1 red, 2 green, and 1 blue filter; it is one of the main technologies enabling CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor) sensors to capture color images.
In one embodiment, the electronic device performs a merging process on the first intermediate image and the second intermediate image to generate a high dynamic range image. In another embodiment, the electronic device extracts the required pixels from the first intermediate image and the second intermediate image respectively to generate a high dynamic range image. In yet another embodiment, the electronic device may obtain weighting factors for the first intermediate image and the second intermediate image respectively, weight the two images accordingly, and then combine the weighted images to generate a high dynamic range image.
The manner of generating the high dynamic range image based on the first intermediate image and the second intermediate image may be set as needed, and is not limited herein.
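Since the patent leaves the exact fusion method open, the weighted-combination variant can be sketched as follows. The saturation threshold (250 on an 8-bit scale) and the linear weighting ramp are illustrative assumptions: the idea is to trust the long exposure except where it nears saturation, where the brightness-aligned short exposure takes over.

```python
import numpy as np

def fuse_hdr(first, second, exposure_ratio):
    """Fuse long- and short-exposure intermediate images into an HDR image.

    first          : long-exposure intermediate image (8-bit scale assumed)
    second         : short-exposure intermediate image
    exposure_ratio : first_exposure_duration / second_exposure_duration,
                     used to align the short exposure's brightness
    """
    first = first.astype(np.float64)
    second = second.astype(np.float64) * exposure_ratio  # brightness alignment
    # weight falls linearly to 0 as the long exposure approaches saturation
    w = np.clip((250.0 - first) / 250.0, 0.0, 1.0)
    return w * first + (1.0 - w) * second

# Mid-tone pixel keeps a blend; saturated pixel uses the short exposure only:
hdr = fuse_hdr(np.array([125.0, 250.0]), np.array([100.0, 100.0]), 2.0)
# hdr -> [162.5, 200.0]
```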
In this embodiment, the image sensor includes a pixel point array, the pixel point array includes a minimum pixel point repeating unit, each minimum pixel point repeating unit includes a plurality of pixel point sub-units, and each pixel point sub-unit includes a plurality of single-color pixel points and a plurality of panchromatic pixel points, and then a higher light incident amount can be received through the panchromatic pixel points. Then, the electronic device exposes each pixel point in the pixel point array, so that the light incoming amount received by the whole pixel point array can be increased, more photosensitive data can be obtained, that is, the first intermediate image and the second intermediate image include more photosensitive data, and therefore, a more accurate high dynamic range image can be generated based on the first intermediate image and the second intermediate image. Moreover, the arrangement of each pixel in the high dynamic range image is matched with the Bayer array, so that the high dynamic range image output by the image sensor which collects more photosensitive data can also be transmitted to the image processor for image processing, and the compatibility between the image sensor with panchromatic pixel points and the image processor for processing the Bayer array image is improved. Further, the image processing accuracy is improved by matching the image sensor specific data output structure with the algorithm and hardware function of the image processor.
In one embodiment, the image sensor further comprises a color filter array. The color filter array comprises minimum color filter repeating units, each minimum color filter repeating unit comprises a plurality of color filter sub-units, and each color filter sub-unit comprises a plurality of single-color filters and a plurality of panchromatic filters. The color filters in the color filter array correspond one-to-one to the pixel points in the pixel point array, and the light filtered by each color filter is projected onto the corresponding pixel point to obtain photosensitive data.
The minimum color filter repeating unit has 8 rows and 8 columns, i.e. 64 color filters, arranged as follows:
a w a w b w b w
w a w a w b w b
a w a w b w b w
w a w a w b w b
b w b w c w c w
w b w b w c w c
b w b w c w c w
w b w b w c w c
wherein w denotes a full-color filter, and a, b, and c each denote a single-color filter.
FIG. 4 is a schematic diagram of the arrangement of the minimum color filter repeating unit in one embodiment, where w represents a panchromatic filter and accounts for 50% of the minimum color filter repeating unit, while a, b, and c each represent a single-color filter: a and c each account for 12.5% of the unit, and b accounts for 25%.
In one embodiment, w may represent a white color filter, and a, b, and c may represent a red color filter, a green color filter, and a blue color filter, respectively. In other embodiments, w may represent a white color filter, and a, b, and c may represent a cyan color filter, a magenta color filter, and a yellow color filter, respectively.
For example, w may represent a white filter, a a red filter, b a green filter, and c a blue filter. For another example, w may represent a white filter, a a green filter, b a red filter, and c a blue filter.
The assignment of colors to the single-color filters may be set as needed and is not limited.
In the present embodiment, the minimum color filter repeating unit includes a 50% full-color filter, and the light entering amount of the image sensor can be increased as much as possible, so that more photosensitive data can be acquired.
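The 8x8 minimum repeating unit and the stated fractions can be checked directly. The layout below is copied from the pattern given above; counting each symbol confirms the 50% / 25% / 12.5% split.

```python
import numpy as np

# Minimum color filter repeating unit from the text
# (w = panchromatic; a, b, c = single-color filters).
ROWS = [
    "awawbwbw",
    "wawawbwb",
    "awawbwbw",
    "wawawbwb",
    "bwbwcwcw",
    "wbwbwcwc",
    "bwbwcwcw",
    "wbwbwcwc",
]
unit = np.array([list(r) for r in ROWS])  # shape (8, 8), 64 filters

# Fraction of the unit occupied by each filter type
fractions = {f: float((unit == f).mean()) for f in "wabc"}
# fractions -> {'w': 0.5, 'a': 0.125, 'b': 0.25, 'c': 0.125}
```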
FIG. 5 is a diagram illustrating an image format output by an image sensor in one embodiment. In the full-size (Fullsize) data readout format, the image sensor controls the pixel point array so that each pixel point outputs one pixel in row-by-row or column-by-column order to generate an image. In other embodiments, the full-size data readout format may be configured as needed, which is not limited herein.
FIG. 6 is a schematic diagram of an image format output by an image sensor in another embodiment. In the first-level binning readout format, the image sensor controls the pixel point array to merge the photosensitive data obtained by every 2 same-color pixel points along a first diagonal direction 602, and to merge the photosensitive data obtained by every 2 panchromatic pixel points along a second diagonal direction 604, to generate an image. The first diagonal direction is the direction of the line connecting the upper-left and lower-right corners; the second diagonal direction is the direction of the line connecting the upper-right and lower-left corners; the two directions are perpendicular to each other. The merging method may be addition or averaging, and is not limited.
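The diagonal merge can be sketched with array slicing. The 2x2 cell layout assumed here (color pixels on the main diagonal, panchromatic pixels on the anti-diagonal, as in an "a w / w a" cell of the filter pattern) is an illustrative assumption matching the figure description.

```python
import numpy as np

def first_level_binning(raw):
    """First-level binning sketch: in each 2x2 cell, average the two
    same-color samples along the first diagonal (upper-left to lower-right)
    and the two panchromatic samples along the second diagonal
    (upper-right to lower-left). Averaging is one of the two merge options.
    """
    h, w = raw.shape
    color = (raw[0:h:2, 0:w:2] + raw[1:h:2, 1:w:2]) / 2.0  # first diagonal
    pan = (raw[0:h:2, 1:w:2] + raw[1:h:2, 0:w:2]) / 2.0    # second diagonal
    return color, pan

# One 2x2 cell: color samples 10 and 20, panchromatic samples 1 and 3
color, pan = first_level_binning(np.array([[10.0, 1.0],
                                           [3.0, 20.0]]))
# color -> [[15.]]; pan -> [[2.]]
```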
Fig. 7 is a diagram showing an image format output from the image sensor in another embodiment. In the two-level binning readout format, the image sensor controls the pixel point array to merge the 8 photosensitive data obtained by the single-color pixel points in each pixel point sub-unit 702, and to merge the 8 photosensitive data obtained by the panchromatic pixel points in each pixel point sub-unit 702, to generate an image. The merging method for the multiple exposure data may be addition or averaging, and is not limited.
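As a rough illustration of the two binning readouts described above, the following sketch (hypothetical helper names, NumPy assumed) operates on a single-channel raw mosaic whose 2x2 cells carry same-color samples on the main diagonal and panchromatic samples on the anti-diagonal, and whose 4x4 sub-units carry 8 single-color and 8 panchromatic samples; averaging is used, though the text also allows addition.

```python
import numpy as np

def bin_first_level(raw):
    """First-level binning sketch: merge each 2x2 cell of the raw mosaic.

    Within a 2x2 cell, the two same-color samples lie on the first diagonal
    (top-left / bottom-right) and the two panchromatic samples on the second
    diagonal (top-right / bottom-left).
    """
    h, w = raw.shape
    color = (raw[0:h:2, 0:w:2] + raw[1:h:2, 1:w:2]) / 2.0  # first diagonal
    pan = (raw[0:h:2, 1:w:2] + raw[1:h:2, 0:w:2]) / 2.0    # second diagonal
    return color, pan

def bin_second_level(raw):
    """Two-level binning sketch: within each 4x4 sub-unit, average the 8
    single-color samples and the 8 panchromatic samples separately."""
    h, w = raw.shape
    # Single-color sites form a checkerboard: (row + col) even in this layout.
    color_mask = np.indices((h, w)).sum(axis=0) % 2 == 0
    out_c = np.zeros((h // 4, w // 4))
    out_w = np.zeros((h // 4, w // 4))
    for i in range(0, h, 4):
        for j in range(0, w, 4):
            block = raw[i:i + 4, j:j + 4]
            mask = color_mask[i:i + 4, j:j + 4]
            out_c[i // 4, j // 4] = block[mask].mean()   # 8 single-color samples
            out_w[i // 4, j // 4] = block[~mask].mean()  # 8 panchromatic samples
    return out_c, out_w
```

An 8x8 raw frame thus yields a 4x4 image per channel after first-level binning and a 2x2 image per channel after two-level binning.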
In one embodiment, the pixel array is exposed in a row spacing mode or a column spacing mode; the line interval mode is that the exposure duration of each line of pixel points is alternately set according to the first exposure duration and the second exposure duration, and each pixel point in each line is exposed with the first exposure duration or the second exposure duration; the column interval mode is that the exposure duration of each column of pixel points is alternately set according to the first exposure duration and the second exposure duration, and each pixel point of each column is exposed with the first exposure duration or the second exposure duration.
When the pixel point array is exposed according to the column interval mode, the exposure time of each pixel point in the pixel point array is as follows:
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
or
S L S L S L S L
S L S L S L S L
S L S L S L S L
S L S L S L S L
S L S L S L S L
S L S L S L S L
S L S L S L S L
S L S L S L S L
Where L denotes a first exposure time period and S denotes a second exposure time period.
When the pixel point array is exposed according to the line interval mode, the exposure time of each pixel point in the pixel point array is as follows:
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
or
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
Where L denotes a first exposure time period and S denotes a second exposure time period.
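The two interval modes above can be sketched as a mask generator (a hypothetical helper, not from the patent; NumPy assumed) that marks each pixel site with its exposure duration, "L" for the first exposure duration and "S" for the second:

```python
import numpy as np

def exposure_mask(h, w, mode="column", start_long=True):
    """Build an exposure-duration mask for an h x w pixel point array.

    mode="column": columns alternate L / S (column interval mode).
    mode="row":    rows alternate L / S (row interval mode).
    start_long controls whether the first line is long- or short-exposed.
    """
    idx = np.arange(w) if mode == "column" else np.arange(h)
    line = np.where(idx % 2 == (0 if start_long else 1), "L", "S")
    if mode == "column":
        return np.broadcast_to(line, (h, w)).copy()       # same pattern each row
    return np.broadcast_to(line[:, None], (h, w)).copy()  # same pattern each column
```

For example, `exposure_mask(8, 8, mode="column")` reproduces the first matrix above, and `exposure_mask(8, 8, mode="row", start_long=False)` the last one.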
FIG. 8 is a diagram illustrating exposure of a pixel point array in a column-spaced manner, according to one embodiment. As shown in fig. 8, the exposure duration of each pixel point in the 1st column is the first exposure duration L, the exposure duration of each pixel point in the 2nd column is the second exposure duration S, and the exposure durations of the subsequent columns alternate between the first exposure duration L and the second exposure duration S.
FIG. 9 is a diagram illustrating exposure durations of pixels in a pixel array according to an embodiment. As shown in fig. 9, the exposure time of each pixel in the pixel array is:
L L S S L L S S
L L S S L L S S
S S L L S S L L
S S L L S S L L
L L S S L L S S
L L S S L L S S
S S L L S S L L
S S L L S S L L
where L denotes a first exposure time period and S denotes a second exposure time period.
In one embodiment, exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image includes: determining, from the pixel point array, a plurality of same-kind pixel points with the first exposure duration, obtaining first pixels based on the photosensitive data of the determined pixel points, and obtaining the first intermediate image based on the first pixels; and determining, from the pixel point array, a plurality of same-kind pixel points with the second exposure duration, obtaining second pixels based on the photosensitive data of the determined pixel points, and obtaining the second intermediate image based on the second pixels.
The first pixel is a pixel obtained based on the photosensitive data of a plurality of same-kind pixel points exposed for the first exposure duration. The second pixel is a pixel obtained based on the photosensitive data of a plurality of same-kind pixel points exposed for the second exposure duration. Because the exposure durations corresponding to the first pixel and the second pixel are different, different photosensitive data can be obtained, and a more accurate high dynamic range image can be generated based on the first intermediate image and the second intermediate image.
Fig. 10 is a schematic diagram of a first intermediate image and a second intermediate image obtained after exposure of a pixel point array in a column-spaced manner in one embodiment. As shown in fig. 10, which is the first-level binning HDR data readout format, the electronic device controls a pixel point array 1002 in the image sensor to expose in a column-spaced manner, with the exposure durations of the columns alternating between the first exposure duration L and the second exposure duration S; each pixel point is then exposed to obtain corresponding photosensitive data. The photosensitive data obtained by exposing each 2 same-kind pixel points in the same row for the first exposure duration L are combined to obtain a pixel in the first intermediate image 1004, and the photosensitive data obtained by exposing each 2 same-kind pixel points in the same row for the second exposure duration S are combined to obtain a pixel in the second intermediate image 1006.
For example, the photosensitive data obtained by exposing each 2 w pixel points in the 1st row for the first exposure duration L are combined to obtain a w pixel in the first intermediate image 1004, and the photosensitive data obtained by exposing each 2 a pixel points in the 1st row for the first exposure duration L are combined to obtain an a pixel in the first intermediate image 1004.
For another example, the photosensitive data obtained by exposing each 2 w pixel points in the 1st row for the second exposure duration S are combined to obtain a w pixel in the second intermediate image 1006, and the photosensitive data obtained by exposing each 2 b pixel points in the 1st row for the second exposure duration S are combined to obtain a b pixel in the second intermediate image 1006.
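A minimal sketch of the column-interval, first-level binning HDR readout described above (hypothetical helper name; NumPy assumed; averaging used for merging): long- and short-exposure columns are separated, and each adjacent pair of same-exposure samples in a row is merged into one pixel of the corresponding intermediate image.

```python
import numpy as np

def split_hdr_column_binning(raw):
    """Column-interval HDR sketch.

    Columns alternate long (L) / short (S) exposure, starting with L. Each
    pair of same-exposure samples in a row is averaged, producing a
    long-exposure intermediate image and a short-exposure intermediate image,
    each a quarter of the original width.
    """
    h, w = raw.shape
    long_cols = raw[:, 0:w:2]   # columns exposed for the first duration L
    short_cols = raw[:, 1:w:2]  # columns exposed for the second duration S
    first = (long_cols[:, 0::2] + long_cols[:, 1::2]) / 2.0    # first intermediate image
    second = (short_cols[:, 0::2] + short_cols[:, 1::2]) / 2.0  # second intermediate image
    return first, second
```

On an 8x8 raw frame this yields two 8x2 intermediate images, one per exposure duration.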
FIG. 11 is a schematic illustration of a first intermediate image and a second intermediate image obtained after exposure of a pixel point array in one embodiment. As shown in fig. 11, which is the two-level binning HDR data readout format, the electronic device controls a pixel point array 1102 in the image sensor to perform exposure, and each pixel point is exposed to obtain corresponding photosensitive data. The photosensitive data obtained by exposure for the first exposure duration L in each pixel point sub-unit are combined to obtain the pixels in the first intermediate image 1104, and the photosensitive data obtained by exposure for the second exposure duration S in each pixel point sub-unit are combined to obtain the pixels in the second intermediate image 1106.
It should be noted that the order between the panchromatic pixels and the single-color pixels in the first intermediate image or the second intermediate image is not limited, that is, the panchromatic pixels may be generated first in the first row and the first column, and then the single-color pixels may be generated, and so on, to obtain the first intermediate image or the second intermediate image, or the single-color pixels may be generated first in the first row and the first column, then the panchromatic pixels may be generated, and so on, to obtain the first intermediate image or the second intermediate image.
In one embodiment, generating a high dynamic range image based on the first intermediate image and the second intermediate image comprises: sequentially acquiring the single-color photosensitive pixels from the first intermediate image to generate a first color image; sequentially acquiring the panchromatic photosensitive pixels from the first intermediate image to generate a first panchromatic image; sequentially acquiring the single-color photosensitive pixels from the second intermediate image to generate a second color image; sequentially acquiring the panchromatic photosensitive pixels from the second intermediate image to generate a second panchromatic image; fusing each single-color photosensitive pixel in the first color image with the panchromatic photosensitive pixel at the corresponding position in the first panchromatic image to obtain a first fused image; fusing each single-color photosensitive pixel in the second color image with the panchromatic photosensitive pixel at the corresponding position in the second panchromatic image to obtain a second fused image; and generating a high dynamic range image based on the first fused image and the second fused image.
The first color image is an image in which each pixel is a single-color photosensitive pixel acquired from the first intermediate image. For example, the first color image includes R, G, and B pixels. As another example, the first color image includes C, M, and Y pixels. The first panchromatic image is an image in which each pixel is a panchromatic photosensitive pixel acquired from the first intermediate image.
The second color image is an image in which each pixel is a single-color photosensitive pixel acquired from the second intermediate image. For example, the second color image includes R, G, and B pixels. As another example, the second color image includes C, M, and Y pixels. The second panchromatic image is an image in which each pixel is a panchromatic photosensitive pixel acquired from the second intermediate image.
The first fused image is an image obtained by fusing the first color image and the first panchromatic image. The second fused image is an image obtained by fusing the second color image and the second panchromatic image.
Fusing each single-color photosensitive pixel in the first color image with the panchromatic photosensitive pixel at the corresponding position in the first panchromatic image to obtain a first fused image comprises the following steps: separating the color and luminance of each single-color photosensitive pixel in the first color image to obtain a first color-luminance separated image; and fusing the luminance of each single-color photosensitive pixel in the first color-luminance separated image with the luminance of the first panchromatic image to obtain the first fused image.
The first color image may be an RGB image, in which the single-color photosensitive pixels are R pixels, G pixels, and B pixels. In one embodiment, the electronic device converts the first color image in RGB space into a first color-luminance separated image in YCrCb space, where Y in YCrCb is the luminance in the first color-luminance separated image, and Cr and Cb are the colors in the first color-luminance separated image. In another embodiment, the electronic device converts the first color image in RGB space into a first color-luminance separated image in Lab space, where L in Lab is the luminance in the first color-luminance separated image, and a and b are the colors in the first color-luminance separated image.
It can be understood that the pixel value of each panchromatic photosensitive pixel in the first panchromatic image is the luminance value of that pixel. The electronic device adds the luminance value of each pixel in the first color-luminance separated image to the panchromatic photosensitive pixel at the corresponding position in the first panchromatic image to obtain luminance-corrected pixel values; these pixel values form a luminance-corrected first color-luminance separated image, on which color space conversion is then performed to obtain the first fused image.
Similarly, fusing each single-color photosensitive pixel in the second color image with the panchromatic photosensitive pixel at the corresponding position in the second panchromatic image to obtain a second fused image includes: separating the color and luminance of each single-color photosensitive pixel in the second color image to obtain a second color-luminance separated image; and fusing the luminance of each single-color photosensitive pixel in the second color-luminance separated image with the luminance of the second panchromatic image to obtain the second fused image.
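The color-luminance separation and fusion described above might be sketched as follows, assuming an RGB color image, a YCbCr-style separation with BT.601 luma weights, and additive luminance correction; the helper name and exact coefficients are illustrative, not taken from the patent.

```python
import numpy as np

def fuse_color_pan(color_rgb, pan_luma):
    """Luminance-fusion sketch: separate color and luminance, add the
    panchromatic luminance to the Y channel, and convert back to RGB.

    color_rgb: (..., 3) float array of R, G, B values.
    pan_luma:  (...) float array of panchromatic luminance values.
    """
    r, g, b = color_rgb[..., 0], color_rgb[..., 1], color_rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance (BT.601 weights)
    cb = (b - y) * 0.564                   # blue-difference chroma
    cr = (r - y) * 0.713                   # red-difference chroma
    y2 = y + pan_luma                      # luminance correction by the panchromatic image
    # Inverse transform (exact inverses of the scalings above).
    r2 = y2 + cr / 0.713
    b2 = y2 + cb / 0.564
    g2 = (y2 - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.stack([r2, g2, b2], axis=-1)
```

With a zero panchromatic image the round trip leaves the color image unchanged; a uniform panchromatic offset raises all three channels by that amount.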
It should be noted that, since the first color image and the first panchromatic image are both obtained by exposure for the first exposure duration, the exposure duration of each pixel in the first fused image obtained by fusing them is also the first exposure duration.
Similarly, since the second color image and the second panchromatic image are both obtained by exposure for the second exposure duration, the exposure duration of each pixel in the second fused image obtained by fusing them is also the second exposure duration.
Generating a high dynamic range image based on the first fused image and the second fused image comprises: performing brightness alignment processing on the first fused image and the second fused image to obtain a brightness-aligned first fused image; performing motion detection on the brightness-aligned first fused image to obtain a motion detection result; and fusing the second fused image and the brightness-aligned first fused image based on the motion detection result to generate the high dynamic range image.
The brightness alignment process includes the following steps: identifying overexposed image pixels with pixel values larger than a first preset threshold in the first fused image; for each overexposed image pixel, expanding a preset area centered on that pixel; searching the preset area for intermediate image pixels with pixel values smaller than the first preset threshold; acquiring the reference pixels at the positions in the second fused image corresponding to the overexposed image pixel and the intermediate image pixel; and determining and updating a new pixel value for the overexposed image pixel according to the relationship between the pixel values of the reference pixels and the pixel value of the intermediate image pixel, to obtain the brightness-aligned first fused image. The first preset threshold and the preset area can be set as needed.
For example, suppose the pixel value of an overexposed image pixel in the first fused image is A, an intermediate image pixel with a pixel value smaller than the first preset threshold is found in the preset area and has pixel value B, and the pixel values of the reference pixels at the positions in the second fused image corresponding to the overexposed image pixel and the intermediate image pixel are C and D, respectively. Then the new pixel value of the overexposed image pixel is B × C / D.
It can be understood that by correcting the pixel values of the overexposed image pixels in the first fused image, a more accurate brightness-aligned first fused image can be obtained.
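A naive sketch of the brightness-alignment rule above (hypothetical helper name; NumPy assumed): for each overexposed pixel, the first non-overexposed neighbour B found in the surrounding window and the short-exposure reference values C and D determine the new value B × C / D.

```python
import numpy as np

def align_overexposed(first, second, threshold, radius):
    """Brightness-alignment sketch.

    first:  long-exposure fused image (2-D array).
    second: short-exposure fused image (same shape).
    For each pixel of `first` above `threshold`, search a (2*radius+1)^2
    window for a pixel B below the threshold, read the short-exposure values
    C (at the overexposed site) and D (at B's site), and replace the
    overexposed value with B * C / D.
    """
    out = first.astype(float).copy()
    h, w = first.shape
    for i in range(h):
        for j in range(w):
            if first[i, j] <= threshold:
                continue
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            win = first[i0:i1, j0:j1]
            ii, jj = np.nonzero(win < threshold)
            if ii.size == 0:
                continue  # no usable neighbour; leave the pixel unchanged
            bi, bj = i0 + ii[0], j0 + jj[0]
            B = float(first[bi, bj])
            C = float(second[i, j])
            D = float(second[bi, bj])
            if D != 0:
                out[i, j] = B * C / D
    return out
```

A production version would pick the neighbour more carefully (e.g. nearest or median) rather than the first match, but the ratio rule is the same.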
Fusing the second fused image and the brightness-aligned first fused image based on the motion detection result to generate a high dynamic range image comprises: if the motion detection result indicates that the brightness-aligned first fused image has no motion-blurred area, directly fusing the second fused image and the brightness-aligned first fused image to obtain the high dynamic range image; and if the motion detection result indicates that the brightness-aligned first fused image has a motion-blurred area, removing the motion-blurred area from the first fused image, and fusing all areas of the second fused image with the areas of the brightness-aligned first fused image other than the motion-blurred area to obtain the high dynamic range image.
If the motion detection result indicates that the brightness-aligned first fused image has no motion-blurred area, the fusion of the first fused image and the second fused image follows these principles: (1) in the brightness-aligned first fused image, the pixel values of image pixels in overexposed areas are directly replaced with the pixel values of the corresponding image pixels in the second fused image; (2) in the brightness-aligned first fused image, the pixel values of image pixels in underexposed areas are the long-exposure pixel value divided by the long/short pixel value ratio; (3) in the brightness-aligned first fused image, the pixel values of image pixels in areas that are neither underexposed nor overexposed are likewise the long-exposure pixel value divided by the long/short pixel value ratio.
If the motion detection result indicates that the brightness-aligned first fused image has a motion-blurred area, the fusion of the first fused image and the second fused image must follow a fourth principle in addition to the three above: (4) in the brightness-aligned first fused image, the pixel values of image pixels in the motion-blurred area are directly replaced with the pixel values of the corresponding image pixels in the second fused image. In the underexposed areas and the areas that are neither underexposed nor overexposed, the pixel values of the image pixels are the long-exposure pixel value divided by the long/short pixel value ratio, i.e., VS′ = VL / (VL / VS), where VL denotes the long-exposure pixel value, VS denotes the short-exposure pixel value, and VS′ denotes the calculated pixel value in these areas. The signal-to-noise ratio of VS′ will be greater than that of VS.
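The fusion principles can be sketched as follows (hypothetical helper name; NumPy assumed): overexposed pixels, and motion-blurred pixels when a motion mask is supplied, take the short-exposure value directly, while the remaining pixels take VL / (VL / VS), which is numerically VS but, as noted above, carries a higher signal-to-noise ratio in practice.

```python
import numpy as np

def fuse_hdr(first_aligned, second, over_thresh, motion_mask=None):
    """Fusion sketch following the four principles above.

    first_aligned: brightness-aligned long-exposure fused image.
    second:        short-exposure fused image.
    motion_mask:   optional boolean mask of motion-blurred pixels.
    """
    # Per-pixel long/short ratio (guarded against division by zero).
    ratio = np.divide(first_aligned, second,
                      out=np.ones_like(second, dtype=float), where=second != 0)
    # VS' = VL / (VL / VS) for underexposed and normally exposed pixels.
    vs_prime = np.divide(first_aligned, ratio,
                         out=first_aligned.astype(float), where=ratio != 0)
    # Overexposed (and motion-blurred) pixels come straight from `second`.
    replace = first_aligned > over_thresh
    if motion_mask is not None:
        replace = replace | motion_mask
    return np.where(replace, second, vs_prime)
```

In a full pipeline the per-pixel weight factors mentioned below would scale the two inputs before this selection.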
Further, after the motion detection result is obtained, weight factors corresponding to each pixel in the brightness-aligned first fused image and each pixel in the second fused image can be obtained. Fusing the second fused image and the brightness-aligned first fused image based on the motion detection result to generate a high dynamic range image then comprises: fusing the weighted second fused image and the weighted brightness-aligned first fused image based on the motion detection result to generate the high dynamic range image. The weight factors for each pixel in the two fused images can be set as needed.
In one embodiment, as shown in fig. 12, the electronic device inputs the first and second intermediate images 1202 sequentially into the luminance alignment module 1204, the motion detection module 1206, the weight calculation module 1208, and the image fusion module 1210 to obtain the high dynamic range image 1210.
It should be understood that although the steps in the flowcharts of figs. 2 and 12 are shown sequentially as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in figs. 2 and 12 may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Fig. 13 is a block diagram showing a configuration of a high dynamic range image generating apparatus according to an embodiment. As shown in fig. 13, there is provided a high dynamic range image generating apparatus applied to an electronic device including an image sensor, wherein the image sensor includes a pixel dot array including minimum pixel dot repeating units, each of the minimum pixel dot repeating units includes a plurality of pixel dot sub-units, each of the pixel dot sub-units includes a plurality of single-color pixel dots and a plurality of panchromatic pixel dots; the device includes: an acquisition module 1302 and a fusion module 1304, wherein:
an obtaining module 1302, configured to expose each pixel point in the pixel point array to obtain a first intermediate map and a second intermediate map; the first intermediate image and the second intermediate image respectively comprise single-color photosensitive pixels and panchromatic photosensitive pixels, each pixel in the first intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-type pixels in a pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by combining data obtained by exposing a plurality of same-type pixels in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration.
A fusion module 1304, configured to generate a high dynamic range image based on the first intermediate image and the second intermediate image; the arrangement of each pixel in the high dynamic range image matches the Bayer array.
In the above apparatus for generating a high dynamic range image, the image sensor includes a pixel point array, the pixel point array includes minimum pixel point repeating units, each minimum pixel point repeating unit includes a plurality of pixel point sub-units, and each pixel point sub-unit includes a plurality of single-color pixel points and a plurality of panchromatic pixel points, so a higher light intake can be received through the panchromatic pixel points. The electronic device then exposes each pixel point in the pixel point array, which increases the light intake received by the whole pixel point array and yields more photosensitive data; that is, the first intermediate image and the second intermediate image include more photosensitive data, so a more accurate high dynamic range image can be generated based on them. Moreover, since the arrangement of the pixels in the high dynamic range image matches the Bayer array, the high dynamic range image output by the image sensor that collects more photosensitive data can still be transmitted to the image processor for image processing, which improves the compatibility between the image sensor with panchromatic pixel points and an image processor that processes Bayer-array images.
In one embodiment, the image sensor further comprises a color filter array, the color filter array comprises minimum color filter repeating units, each minimum color filter repeating unit comprises a plurality of color filter subunits, each color filter subunit comprises a plurality of single-color filters and a plurality of panchromatic color filters, each color filter in the color filter array corresponds to each pixel point in the pixel point array in a one-to-one mode, and light rays filtered by the color filters are projected to the corresponding pixel points to obtain photosensitive data; the minimum color filter repeating unit is 8 rows, 8 columns and 64 color filters, and the arrangement mode is as follows:
a w a w b w b w
w a w a w b w b
a w a w b w b w
w a w a w b w b
b w b w c w c w
w b w b w c w c
b w b w c w c w
w b w b w c w c
wherein w denotes a full-color filter, and a, b, and c each denote a single-color filter.
In one embodiment, the pixel array is exposed in a row spacing mode or a column spacing mode; the line interval mode is that the exposure duration of each line of pixel points is alternately set according to the first exposure duration and the second exposure duration, and each pixel point in each line is exposed with the first exposure duration or the second exposure duration; the column interval mode is that the exposure duration of each column of pixel points is alternately set according to the first exposure duration and the second exposure duration, and each pixel point of each column is exposed with the first exposure duration or the second exposure duration.
In one embodiment, the exposure duration of each pixel in the pixel array is:
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
L S L S L S L S
or
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
L L L L L L L L
S S S S S S S S
Where L denotes a first exposure time period and S denotes a second exposure time period.
In one embodiment, the exposure duration of each pixel in the pixel array is:
L L S S L L S S
L L S S L L S S
S S L L S S L L
S S L L S S L L
L L S S L L S S
L L S S L L S S
S S L L S S L L
S S L L S S L L
where L denotes a first exposure time period and S denotes a second exposure time period.
In an embodiment, the obtaining module 1302 is further configured to determine a plurality of same-type pixels with a first exposure duration from the pixel array, obtain first pixels based on the determined photosensitive data of each pixel, and obtain a first intermediate map based on each first pixel; and determining a plurality of same-kind pixel points with second exposure time length from the pixel point array, obtaining second pixels based on the determined photosensitive data of each pixel point, and obtaining a second intermediate image based on each second pixel.
In one embodiment, the fusion module 1304 is further configured to sequentially acquire the single-color photosensitive pixels from the first intermediate image to generate a first color image; sequentially acquire the panchromatic photosensitive pixels from the first intermediate image to generate a first panchromatic image; sequentially acquire the single-color photosensitive pixels from the second intermediate image to generate a second color image; sequentially acquire the panchromatic photosensitive pixels from the second intermediate image to generate a second panchromatic image; fuse each single-color photosensitive pixel in the first color image with the panchromatic photosensitive pixel at the corresponding position in the first panchromatic image to obtain a first fused image; fuse each single-color photosensitive pixel in the second color image with the panchromatic photosensitive pixel at the corresponding position in the second panchromatic image to obtain a second fused image; and generate a high dynamic range image based on the first fused image and the second fused image.
The division of each module in the high dynamic range image generating device is merely for illustration, and in other embodiments, the high dynamic range image generating device may be divided into different modules as needed to complete all or part of the functions of the high dynamic range image generating device.
For specific limitations of the generation apparatus of the high dynamic range image, reference may be made to the above limitations of the generation method of the high dynamic range image, and details thereof are not repeated here. The respective modules in the above-described high dynamic range image generating apparatus may be wholly or partially implemented by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
Fig. 14 is a schematic diagram of an internal structure of an electronic device in one embodiment. The electronic device may be any terminal device such as a mobile phone, a tablet computer, a notebook computer, a desktop computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, or a wearable device. The electronic device includes a processor and a memory connected by a system bus. The processor may include one or more processing units, and may be a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or the like. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the method for generating a high dynamic range image provided in the embodiments of the present application. The internal memory provides a cached execution environment for the operating system and computer programs in the non-volatile storage medium.
The implementation of each module in the generation apparatus of the high dynamic range image provided in the embodiment of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. Program modules constituted by such computer programs may be stored on the memory of the electronic device. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of a method of generating a high dynamic range image.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a method of generating a high dynamic range image.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. The non-volatile memory may include ROM (Read-Only Memory), PROM (Programmable ROM), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or flash memory. Volatile memory may include RAM (Random Access Memory), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as SRAM (Static RAM), DRAM (Dynamic RAM), SDRAM (Synchronous DRAM), DDR SDRAM (Double Data Rate SDRAM), ESDRAM (Enhanced SDRAM), SLDRAM (Synchronous Link DRAM), RDRAM (Rambus DRAM), and DRDRAM (Direct Rambus DRAM).
The above embodiments express only several implementations of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that, for those of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A generation method of a high dynamic range image is applied to an electronic device comprising an image sensor, and is characterized in that the image sensor comprises a pixel point array, the pixel point array comprises minimum pixel point repeating units, each minimum pixel point repeating unit comprises a plurality of pixel point sub-units, and each pixel point sub-unit comprises a plurality of single-color pixel points and a plurality of panchromatic pixel points; the method comprises the following steps:
exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image, wherein each pixel in the first intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-kind pixel points in the pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-kind pixel points in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration; and
generating a high dynamic range image based on the first intermediate image and the second intermediate image, wherein the arrangement of the pixels in the high dynamic range image matches a Bayer array.
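The dual-exposure merge in claim 1 can be sketched as follows. The claim does not fix the merge operator, so brightness-matching the short exposure by the exposure ratio and substituting it only where the long exposure clips (the `sat_thresh` threshold) are illustrative assumptions, not the patented method itself.

```python
import numpy as np

def merge_exposures(long_img, short_img, ratio, sat_thresh=0.9):
    """Merge a first (long) and second (short) exposure frame into one
    high-dynamic-range frame.

    long_img, short_img: float arrays normalized to [0, 1].
    ratio: first exposure duration divided by the second (L / S).
    sat_thresh: hypothetical clipping threshold for the long exposure.
    """
    # Bring the short exposure onto the long exposure's radiance scale.
    short_scaled = short_img * ratio
    # Where the long exposure clipped, substitute the scaled short data;
    # elsewhere keep the less noisy long-exposure values.
    return np.where(long_img >= sat_thresh, short_scaled, long_img)
```

For example, with `ratio = 4`, an unclipped long-exposure value is kept as-is, while a clipped value of 1.0 is replaced by four times the co-located short-exposure value.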
2. The method of claim 1, wherein the image sensor further comprises a color filter array, the color filter array comprises minimum color filter repeating units, each minimum color filter repeating unit comprises a plurality of color filter subunits, and each color filter subunit comprises a plurality of single-color filters and a plurality of panchromatic filters; the color filters in the color filter array correspond one-to-one to the pixel points in the pixel point array, and light filtered by a color filter is projected onto the corresponding pixel point to obtain the photosensitive data;
the minimum color filter repeating unit is 64 color filters in 8 rows and 8 columns, and the arrangement mode is as follows:
Figure FDA0003213266640000011
wherein w denotes a panchromatic filter, and a, b, and c each denote a single-color filter.
3. The method of claim 2, wherein the pixel point array is exposed in a row-interleaved or column-interleaved manner;
in the row-interleaved manner, the exposure durations of the rows of pixel points alternate between the first exposure duration and the second exposure duration, and every pixel point in a given row is exposed with either the first exposure duration or the second exposure duration;
in the column-interleaved manner, the exposure durations of the columns of pixel points alternate between the first exposure duration and the second exposure duration, and every pixel point in a given column is exposed with either the first exposure duration or the second exposure duration.
4. The method of claim 3, wherein the exposure duration of each pixel point in the pixel point array is:
Figure FDA0003213266640000012
or
Figure FDA0003213266640000013
Figure FDA0003213266640000021
wherein L denotes the first exposure duration and S denotes the second exposure duration.
5. The method of claim 2, wherein the exposure duration of each pixel point in the pixel point array is:
Figure FDA0003213266640000022
wherein L denotes the first exposure duration and S denotes the second exposure duration.
6. The method of claim 1, wherein exposing each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image comprises:
determining, from the pixel point array, a plurality of same-kind pixel points exposed with the first exposure duration, obtaining first pixels based on the photosensitive data of the determined pixel points, and obtaining the first intermediate image based on the first pixels; and
determining, from the pixel point array, a plurality of same-kind pixel points exposed with the second exposure duration, obtaining second pixels based on the photosensitive data of the determined pixel points, and obtaining the second intermediate image based on the second pixels.
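Claim 6 combines the photosensitive data of several same-kind pixel points into a single intermediate-image pixel. The claim fixes neither the combining operator nor the group size; averaging over 2x2 blocks of a same-kind, same-exposure plane is one plausible sketch.

```python
import numpy as np

def bin_same_kind(plane, block=2):
    """Combine the photosensitive data of block x block groups of
    same-kind pixel points into single pixels by averaging.

    plane: 2-D float array holding one kind of pixel at one exposure
    duration; block: assumed group size (the claim only says
    'a plurality of same-kind pixel points').
    """
    h, w = plane.shape
    if h % block or w % block:
        raise ValueError("plane dimensions must be multiples of block")
    # Reshape into (h/block, block, w/block, block) and average each group.
    return plane.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
```

Applying this to each same-kind plane of each exposure yields the first and second intermediate images at reduced resolution.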
7. The method of claim 1, wherein generating a high dynamic range image based on the first intermediate image and the second intermediate image comprises:
sequentially acquiring each single-color photosensitive pixel from the first intermediate image to generate a first color image, and sequentially acquiring each panchromatic photosensitive pixel from the first intermediate image to generate a first panchromatic image;
sequentially acquiring each single-color photosensitive pixel from the second intermediate image to generate a second color image, and sequentially acquiring each panchromatic photosensitive pixel from the second intermediate image to generate a second panchromatic image;
fusing each single-color photosensitive pixel in the first color image with a panchromatic photosensitive pixel at a corresponding position in the first panchromatic image to obtain a first fused image;
fusing each single-color photosensitive pixel in the second color image with the panchromatic photosensitive pixel at the corresponding position in the second panchromatic image to obtain a second fused image;
generating a high dynamic range image based on the first fused image and the second fused image.
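The claim 7 pipeline can be sketched end to end as below. The claim specifies neither the fusion operator nor the HDR merge rule, so the weighted blend (hypothetical weight `w`) and the saturation-thresholded substitution are illustrative assumptions.

```python
import numpy as np

def fuse_planes(color, pano, w=0.5):
    """Fuse each single-color pixel with the co-located panchromatic
    pixel; a weighted blend with hypothetical weight w is one
    plausible choice of operator."""
    return w * color + (1.0 - w) * pano

def hdr_from_fused(first_fused, second_fused, ratio, sat_thresh=0.9):
    """Merge the long- and short-exposure fused images: keep the long
    exposure where unsaturated, substitute the ratio-scaled short
    exposure where it clipped (sat_thresh is an assumption)."""
    return np.where(first_fused >= sat_thresh,
                    second_fused * ratio, first_fused)

def generate_hdr(first_color, first_pano, second_color, second_pano, ratio):
    """Fuse each exposure's color/panchromatic pair, then merge the
    two fused images into one high dynamic range image."""
    first_fused = fuse_planes(first_color, first_pano)
    second_fused = fuse_planes(second_color, second_pano)
    return hdr_from_fused(first_fused, second_fused, ratio)
```

A final remosaicing step (not shown) would then arrange the result to match a Bayer array, as claim 1 requires.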
8. A device for generating a high dynamic range image, applied to an electronic device comprising an image sensor, characterized in that the image sensor comprises a pixel point array, the pixel point array comprises minimum pixel point repeating units, each minimum pixel point repeating unit comprises a plurality of pixel point subunits, and each pixel point subunit comprises a plurality of single-color pixel points and a plurality of panchromatic pixel points; the device comprises:
an acquisition module, configured to expose each pixel point in the pixel point array to obtain a first intermediate image and a second intermediate image, wherein each pixel in the first intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-kind pixel points in the pixel point array for a first exposure duration, each pixel in the second intermediate image is obtained by combining photosensitive data obtained by exposing a plurality of same-kind pixel points in the pixel point array for a second exposure duration, and the first exposure duration is longer than the second exposure duration; and
a fusion module, configured to generate a high dynamic range image based on the first intermediate image and the second intermediate image, wherein the arrangement of the pixels in the high dynamic range image matches a Bayer array.
9. An electronic device comprising a memory and a processor, the memory having stored therein a computer program, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the method of generating a high dynamic range image according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202110937225.XA 2021-08-16 2021-08-16 Method and device for generating high dynamic range image, electronic equipment and storage medium Active CN113676636B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110937225.XA CN113676636B (en) 2021-08-16 2021-08-16 Method and device for generating high dynamic range image, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110937225.XA CN113676636B (en) 2021-08-16 2021-08-16 Method and device for generating high dynamic range image, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113676636A true CN113676636A (en) 2021-11-19
CN113676636B CN113676636B (en) 2023-05-05

Family

ID=78542973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110937225.XA Active CN113676636B (en) 2021-08-16 2021-08-16 Method and device for generating high dynamic range image, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113676636B (en)



Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070223904A1 (en) * 2006-03-21 2007-09-27 Bloom Daniel M Method and apparatus for interleaved image captures
CN202535464U (en) * 2012-02-27 2012-11-14 徐辰 Imaging device
CN105578080A (en) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 Imaging method, image sensor, imaging device and electronic device
CN105578076A (en) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 Imaging method, imaging device and electronic device
US20200236273A1 (en) * 2019-01-18 2020-07-23 Samsung Electronics Co., Ltd. Imaging systems for generating hdr images and operating methods thereof
CN111464754A (en) * 2019-01-18 2020-07-28 三星电子株式会社 Imaging system for generating HDR images and method of operation thereof
WO2021046691A1 (en) * 2019-09-09 2021-03-18 Oppo广东移动通信有限公司 Image collection method, camera assembly and mobile terminal
CN111432099A (en) * 2020-03-30 2020-07-17 Oppo广东移动通信有限公司 Image sensor, processing system and method, electronic device, and storage medium
CN111491110A (en) * 2020-04-17 2020-08-04 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114338988A (en) * 2021-12-29 2022-04-12 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer-readable storage medium
WO2023124607A1 (en) * 2021-12-29 2023-07-06 Oppo广东移动通信有限公司 Image generation method and apparatus, electronic device, and computer-readable storage medium
CN114338988B (en) * 2021-12-29 2024-08-02 Oppo广东移动通信有限公司 Image generation method, device, electronic equipment and computer readable storage medium
CN116847211A (en) * 2023-06-13 2023-10-03 广州城建职业学院 Color filter array and interpolation method
CN116847211B (en) * 2023-06-13 2024-03-08 广州城建职业学院 Interpolation method of color filter array
CN117014729A (en) * 2023-09-27 2023-11-07 合肥辉羲智能科技有限公司 Method and system for fusing secondary exposure image with high dynamic range image
CN117014729B (en) * 2023-09-27 2023-12-05 合肥辉羲智能科技有限公司 Method and system for fusing secondary exposure image with high dynamic range image

Also Published As

Publication number Publication date
CN113676636B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN213279832U (en) Image sensor, camera and terminal
CN113676636B (en) Method and device for generating high dynamic range image, electronic equipment and storage medium
CN111491110B (en) High dynamic range image processing system and method, electronic device, and storage medium
CN111479071B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
US10136107B2 (en) Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
WO2021179806A1 (en) Image acquisition method, imaging apparatus, electronic device, and readable storage medium
CN113676675B (en) Image generation method, device, electronic equipment and computer readable storage medium
CN111491111B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111586375B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN112118378A (en) Image acquisition method and device, terminal and computer readable storage medium
CN114125242B (en) Image sensor, camera module, electronic device, image generation method and device
CN113573030B (en) Image generation method, device, electronic equipment and computer readable storage medium
CN113840067B (en) Image sensor, image generation method and device and electronic equipment
WO2023124607A1 (en) Image generation method and apparatus, electronic device, and computer-readable storage medium
CN113676635B (en) Method and device for generating high dynamic range image, electronic equipment and storage medium
CN113676708A (en) Image generation method and device, electronic equipment and computer-readable storage medium
CN114363486B (en) Image sensor, camera module, electronic device, image generation method and device
CN114040084B (en) Image sensor, camera module, electronic device, image generation method and device
CN114157795B (en) Image sensor, camera module, electronic device, image generation method and device
CN114125318B (en) Image sensor, camera module, electronic device, image generation method and device
CN112738493B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN111970460B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN114554046B (en) Image sensor, camera module, electronic device, image generation method and device
CN112822475B (en) Image processing method, image processing apparatus, terminal, and readable storage medium
CN111970461B (en) High dynamic range image processing system and method, electronic device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant