CN114240792A - Image exposure fusion method and device and storage medium - Google Patents


Info

Publication number
CN114240792A
Authority
CN
China
Prior art keywords
image
fused
target
laplacian pyramid
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111574767.1A
Other languages
Chinese (zh)
Inventor
王朋
焦文菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd, Beijing Xiaomi Pinecone Electronic Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202111574767.1A
Publication of CN114240792A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure relates to an image exposure fusion method, an image exposure fusion device, and a storage medium. The method includes: acquiring an image sequence to be fused, where the sequence comprises a plurality of images of the same scene captured at different exposure amounts; calculating a fused Laplacian pyramid from the image sequence to be fused; replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of a target Laplacian pyramid to obtain a target fused Laplacian pyramid, where the target Laplacian pyramid is generated from a target image to be fused in the sequence; and reconstructing a target image from the target fused Laplacian pyramid. This approach mitigates the brightness-inversion problem, making the global brightness distribution of the target image more natural.

Description

Image exposure fusion method and device and storage medium
Technical Field
The present disclosure relates to the field of multi-exposure fusion technologies, and in particular, to an image exposure fusion method, an image exposure fusion device, and a storage medium.
Background
With the rapid development of electronic device technology, expectations for digital image quality keep rising. However, due to hardware limitations, the brightness dynamic range that an image capture device can record is far smaller than that of real natural scenes: from starlight in the night sky to the dazzling sun, scene brightness spans roughly nine orders of magnitude, whereas a typical image capture device records only 256 brightness levels (0-255). To obtain high-quality images despite this gap, multi-Exposure Fusion (EF) techniques were developed.
In the related art, multi-exposure fusion combines several images with different exposure levels into a single high-quality image, compensating for hardware limitations, and has wide application value in consumer electronics and related fields. Although fusion improves image quality, the fused image may still exhibit an unnatural or even unreasonable overall brightness distribution, such as brightness inversion.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image exposure fusion method, apparatus, and storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided an image exposure fusion method, the method including:
acquiring an image sequence to be fused, wherein the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images shot under different exposure quantities aiming at the same scene;
calculating a fusion Laplacian pyramid according to the image sequence to be fused;
replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of the target Laplacian pyramid to obtain a target fused Laplacian pyramid, wherein the target Laplacian pyramid is generated according to the target image to be fused in the image sequence to be fused;
and reconstructing according to the target fusion Laplacian pyramid to obtain a target image.
Optionally, the first top-level image and the second top-level image have the same number of pixel points, and the replacing, according to the luminance distribution of the first top-level image of the target laplacian pyramid, the luminance distribution of the second top-level image of the fused laplacian pyramid to obtain the target fused laplacian pyramid includes:
calculating the brightness mean value and the standard deviation of each pixel point in the first top layer image to obtain a first brightness mean value and a first brightness standard deviation;
calculating the brightness mean value of each pixel point in the second top layer image to obtain a second brightness mean value;
calculating a first target brightness value of each pixel point in the first top-layer image according to the magnitude relationship between the brightness standard deviation and a preset threshold value, and according to the first brightness mean value, to obtain a first target brightness image;
obtaining a second target brightness image according to the first target brightness image and the second brightness mean value;
and updating the second top-level image according to the second target brightness image to obtain an updated second top-level image, wherein the top-level image of the target fusion Laplacian pyramid is the updated second top-level image.
Optionally, the calculating a first target brightness value of each pixel point in the first top-level image according to the magnitude relationship between the standard deviation of the brightness and a preset threshold and the first brightness mean value includes:
and under the condition that the standard deviation of the brightness of the first top-layer image is smaller than or equal to the preset threshold, regarding each pixel point in the first top-layer image, taking the difference value between the brightness value of the pixel point and the first brightness mean value as a first target brightness value of the pixel point.
Optionally, the calculating a first target brightness value of each pixel point in the first top-level image according to the magnitude relationship between the standard deviation of the brightness and a preset threshold and the first brightness mean value includes:
and under the condition that the brightness standard deviation of the first top-layer image is larger than the preset threshold, calculating a relative standard distance value between the brightness value of each pixel point and the first brightness mean value aiming at each pixel point in the first top-layer image, and taking the product of the relative standard distance value and the preset threshold as a first target brightness value of the pixel point.
Optionally, the obtaining a second target luminance image according to the first target luminance image and the second luminance average value includes:
and increasing the brightness value of each pixel point in the first target brightness image by the second brightness mean value to obtain a second target brightness image.
Optionally, the updating the second top-level image according to the second target brightness image to obtain an updated second top-level image includes:
and assigning the second target brightness value of each pixel point in the second target brightness image to the pixel point at the corresponding position in the second top image to obtain the updated second top image.
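The replacement steps above can be sketched numerically as follows. This is a minimal illustration, not the patent's implementation: the function name and default threshold are assumptions, and the two branches follow the claimed cases (standard deviation at most the threshold, or above it).

```python
import numpy as np

def replace_top_level_luminance(top_ref, top_fused, threshold=20.0):
    """Sketch of the claimed replacement. top_ref is the first top-level
    image (from the target Laplacian pyramid); top_fused is the second
    top-level image (from the fused Laplacian pyramid)."""
    mean_ref = top_ref.mean()      # first brightness mean
    std_ref = top_ref.std()       # first brightness standard deviation
    mean_fused = top_fused.mean()  # second brightness mean

    if std_ref <= threshold:
        # difference between each pixel's brightness and the first mean
        first_target = top_ref - mean_ref
    else:
        # relative standard distance scaled by the preset threshold
        first_target = (top_ref - mean_ref) / std_ref * threshold

    # second target brightness image: add back the second brightness mean;
    # these values are assigned pixel-wise to the second top-level image
    return first_target + mean_fused
```

Either branch centres the reference luminance at zero before shifting it to the fused top level's mean; the second branch additionally clamps its spread to the threshold.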
Optionally, the calculating a fusion laplacian pyramid according to the image sequence to be fused includes:
performing laplacian pyramid decomposition on each image to be fused in the image sequence to be fused to obtain a first laplacian pyramid sequence, wherein the first laplacian pyramid sequence comprises a plurality of first laplacian pyramids, and each first laplacian pyramid corresponds to one image to be fused;
acquiring a weight image sequence corresponding to the image sequence to be fused, wherein the weight image sequence comprises a plurality of weight images, each weight image corresponds to one image to be fused, and each weight image represents the weight for fusing the image sequence to be fused;
performing gaussian pyramid decomposition on each weight image in the weight image sequence to obtain a first gaussian pyramid sequence, wherein the first gaussian pyramid sequence comprises a plurality of first gaussian pyramids, and each first gaussian pyramid corresponds to one weight image;
and generating the fused Laplacian pyramid according to the first Laplacian pyramid sequence and the first Gaussian pyramid sequence.
Optionally, the generating the fused laplacian pyramid according to the first laplacian pyramid sequence and the first gaussian pyramid sequence includes:
for each first laplacian pyramid, multiplying the first laplacian pyramid by the corresponding first gaussian pyramid to obtain a second laplacian pyramid, wherein the first laplacian pyramid and the first gaussian pyramid that are multiplied together correspond to the same image to be fused;
adding all of the second laplacian pyramids to obtain the fused laplacian pyramid.
Optionally, the first laplacian pyramid is the same as the first gaussian pyramid in layer number, and the multiplying the first laplacian pyramid by the corresponding first gaussian pyramid includes:
multiplying the Nth layer of the first laplacian pyramid by the corresponding Nth layer of the first gaussian pyramid to obtain the Nth layer of the second laplacian pyramid, wherein N is an integer greater than or equal to 0.
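Representing each pyramid as a list of arrays (layer 0 first), the layer-wise product and the summation over images can be sketched as follows; the function names are illustrative, not from the patent.

```python
import numpy as np

def weighted_pyramid(lap_pyr, gauss_pyr):
    """Multiply the Nth layer of a first Laplacian pyramid by the Nth
    layer of the matching first Gaussian pyramid (same layer count)
    to form a second Laplacian pyramid."""
    return [l * g for l, g in zip(lap_pyr, gauss_pyr)]

def fuse_pyramids(second_pyrs):
    """Add all second Laplacian pyramids layer-wise to obtain the
    fused Laplacian pyramid."""
    return [sum(layers) for layers in zip(*second_pyrs)]
```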
Optionally, the target laplacian pyramid represents the first laplacian pyramid of the target image to be fused;
the target image to be fused is determined by any one of the following modes:
randomly determining the target image to be fused from the image sequence to be fused;
and determining the image to be fused with the minimum brightness mean value in the image sequence to be fused as the target image to be fused.
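The second selection mode above reduces to a one-liner; `pick_target` is an illustrative name, not the patent's.

```python
import numpy as np

def pick_target(images_to_fuse):
    """Select the image to be fused whose brightness mean is minimal."""
    return min(images_to_fuse, key=lambda img: float(np.mean(img)))
```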
According to a second aspect of the embodiments of the present disclosure, there is provided an image exposure fusion apparatus, the apparatus including:
the image fusion system comprises an acquisition module, a fusion module and a fusion module, wherein the acquisition module is configured to acquire an image sequence to be fused, the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images shot under different exposure quantities aiming at the same scene;
a calculation module configured to calculate a fusion laplacian pyramid from the sequence of images to be fused;
a replacing module configured to replace the brightness distribution of the second top-level image of the fused laplacian pyramid according to the brightness distribution of the first top-level image of the target laplacian pyramid to obtain a target fused laplacian pyramid, wherein the target laplacian pyramid is generated according to a target image to be fused in the image sequence to be fused;
and the reconstruction module is configured to obtain a target image according to the target fusion Laplacian pyramid reconstruction.
Optionally, the first top-level image and the second top-level image have the same number of pixel points, and the replacement module includes:
the first calculation submodule is configured to calculate the brightness mean value and the standard deviation of each pixel point in the first top-layer image to obtain a first brightness mean value and a first brightness standard deviation;
the second calculation submodule is configured to calculate the brightness mean value of each pixel point in the second top-layer image to obtain a second brightness mean value;
the third calculation submodule is configured to calculate a first target brightness value of each pixel point in the first top-level image according to the magnitude relation between the brightness standard deviation and a preset threshold value and the first brightness mean value to obtain a first target brightness image;
the first determining submodule is configured to obtain a second target brightness image according to the first target brightness image and the second brightness mean value;
and the updating submodule is configured to update the second top-level image according to the second target brightness image to obtain an updated second top-level image, and the top-level image of the target fusion Laplacian pyramid is the updated second top-level image.
Optionally, the third computing submodule comprises:
the second determining submodule is configured to, when the standard deviation of the brightness of the first top-level image is smaller than or equal to the preset threshold, regarding each pixel point in the first top-level image, use a difference value between the brightness value of the pixel point and the first brightness mean value as a first target brightness value of the pixel point.
Optionally, the third computing submodule comprises:
a third determining submodule configured to, when the standard deviation of the brightness of the first top-level image is greater than the preset threshold, calculate, for each pixel point in the first top-level image, a relative standard distance value between the brightness value of the pixel point and the first brightness mean value, and take a product of the relative standard distance value and the preset threshold as a first target brightness value of the pixel point.
Optionally, the first determining submodule is configured to increase the brightness value of each pixel point in the first target brightness image by the second brightness mean value to obtain the second target brightness image.
Optionally, the update sub-module is configured to: and assigning the second target brightness value of each pixel point in the second target brightness image to the pixel point at the corresponding position in the second top image to obtain the updated second top image.
Optionally, the calculation module comprises:
the first decomposition submodule is configured to perform laplacian pyramid decomposition on each image to be fused in the image sequence to be fused to obtain a first laplacian pyramid sequence, the first laplacian pyramid sequence comprises a plurality of first laplacian pyramids, and each first laplacian pyramid corresponds to one image to be fused;
the obtaining submodule is configured to obtain a weighted image sequence corresponding to the image sequence to be fused, the weighted image sequence comprises a plurality of weighted images, each weighted image corresponds to one image to be fused, and each weighted image represents the weight for fusing the image sequence to be fused;
the second decomposition submodule is configured to perform Gaussian pyramid decomposition on each weight image in the weight image sequence to obtain a first Gaussian pyramid sequence, the first Gaussian pyramid sequence comprises a plurality of first Gaussian pyramids, and each first Gaussian pyramid corresponds to one weight image;
a generation submodule configured to generate the fused Laplacian pyramid from the first Laplacian pyramid sequence and the first Gaussian pyramid sequence.
Optionally, the generating sub-module includes:
a fourth calculation submodule configured to, for each first laplacian pyramid, multiply the first laplacian pyramid by the corresponding first gaussian pyramid to obtain a second laplacian pyramid, where the first laplacian pyramid and the first gaussian pyramid that are multiplied together correspond to the same image to be fused;
a fifth computation submodule configured to add all of the second Laplacian pyramids to obtain the fused Laplacian pyramid.
Optionally, the number of layers of the first laplacian pyramid is the same as that of the first gaussian pyramid, and the fourth computation submodule is configured to multiply the nth layer of the first laplacian pyramid with the corresponding nth layer of the first gaussian pyramid to obtain the nth layer of the second laplacian pyramid, where N is an integer greater than or equal to 0.
Optionally, the target laplacian pyramid represents the first laplacian pyramid of the target image to be fused;
the target image to be fused is determined by any one of the following modes:
randomly determining the target image to be fused from the image sequence to be fused;
and determining the image to be fused with the minimum brightness mean value in the image sequence to be fused as the target image to be fused.
According to a third aspect of the embodiments of the present disclosure, there is provided an image exposure fusion apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring an image sequence to be fused, wherein the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images shot under different exposure quantities aiming at the same scene;
calculating a fusion Laplacian pyramid according to the image sequence to be fused;
replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of the target Laplacian pyramid to obtain a target fused Laplacian pyramid, wherein the target Laplacian pyramid is generated according to the target image to be fused in the image sequence to be fused;
and reconstructing according to the target fusion Laplacian pyramid to obtain a target image.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image exposure fusion method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
by acquiring the image sequence to be fused, each image to be fused in the image sequence to be fused is an image shot under different exposure quantities aiming at the same scene. And calculating a fusion Laplacian pyramid according to the image sequence to be fused. And replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of the target Laplacian pyramid to obtain the target fused Laplacian pyramid. Since the target laplacian pyramid is generated according to the target image to be fused in the image sequence to be fused, the brightness distribution of the first top-level image of the target laplacian pyramid is the real brightness distribution acquired by the image pickup device. Because the brightness distribution of the second top-level image obtained by multi-exposure fusion is replaced according to the real brightness distribution of the first top-level image, the brightness distribution of the second top-level image obtained by multi-exposure fusion can be corrected, namely the brightness distribution of the top-level image fused with the Laplace pyramid can be corrected, and the corrected target fused Laplace pyramid is obtained. And reconstructing according to the corrected target fusion Laplacian pyramid to obtain a target image. By adopting the method, the brightness inversion phenomenon existing in the reconstructed image directly according to the fusion Laplacian pyramid reconstructed image in the related technology can be improved, and the purpose of enabling the global brightness distribution of the target image to be more reasonable is achieved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating an image exposure fusion method according to an exemplary embodiment.
FIG. 2 is an image shown according to an exemplary embodiment.
FIG. 3 is a block diagram illustrating an image exposure fusion apparatus according to an exemplary embodiment.
FIG. 4 is a block diagram illustrating an apparatus for image exposure fusion in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In order to make the technical solution of the present disclosure easier for those skilled in the art to understand, the principle of the multi-exposure fusion technology involved in the embodiment of the present disclosure is briefly explained below.
A High-Dynamic-Range (HDR) image provides a wider dynamic range and more image detail than an ordinary image. The final HDR image is synthesized from Low-Dynamic-Range (LDR) images taken at different exposure times, using the best-detailed content of the LDR image at each exposure time. HDR images therefore better reflect the visual effect of real environments.
It should be noted that, for an image, a corresponding gaussian pyramid may be constructed. Accordingly, the image may be reconstructed from the gaussian pyramid. Based on this, the following multi-exposure fusion process can be understood:
a plurality of images with different exposure amounts (exposure amount is illuminance × time, where illuminance is determined by a diaphragm and time is controlled by a shutter) are determined, and the plurality of images are taken at the same shooting angle for the same scene. And regarding each image, taking the image as a 0-layer image of the Gaussian pyramid, and performing down-sampling on the 0-layer image to obtain a 1-layer image of the Gaussian pyramid. And continuing downsampling for the layer 1 image of the Gaussian pyramid to obtain a layer 2 image of the Gaussian pyramid, and repeating the steps until the Gaussian pyramid with N layers is obtained. The downsampling process includes performing Gaussian kernel convolution on the image, and then removing all even rows and even columns in the image to obtain a downsampled image with the size of only one fourth of that of the original image.
Then, for the Gaussian pyramid of each image, a corresponding Laplacian pyramid is calculated as follows: upsample layer 1 of the Gaussian pyramid, and subtract the upsampled result from layer 0 of the Gaussian pyramid to obtain layer 0 of the Laplacian pyramid. Repeating this for successive layers yields the first N-1 layers of the Laplacian pyramid. Since the top layer of the Gaussian pyramid is the last layer, the layer N-1 image of the Gaussian pyramid (i.e., its top-level image) is used directly as the top-level image of the Laplacian pyramid. Upsampling here means enlarging the image: the image is doubled in each direction (horizontal and vertical), the newly added pixel rows and columns are filled with zeros, and the enlarged image is then convolved with the same kernel as used in downsampling (multiplied by 4) to approximate the values of the added pixels, yielding the upsampled result.
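A self-contained sketch of the decomposition and its inverse: plain decimation and nearest-neighbour upsampling stand in for the patent's kernel-based steps (the reconstruction identity holds regardless of the operators chosen, because each band stores exactly what subtraction removed), with power-of-two sizes assumed.

```python
import numpy as np

def _up(img):
    # nearest-neighbour 2x upsampling (stand-in for zero-insert + 4x kernel)
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def laplacian_pyramid(img, levels):
    """Layers 0..N-2 are Gaussian layers minus the upsampled next layer;
    the top layer is the Gaussian top layer taken directly."""
    gauss = [img.astype(float)]
    for _ in range(levels - 1):
        gauss.append(gauss[-1][::2, ::2])    # crude decimation (no blur)
    lap = [lo - _up(hi) for lo, hi in zip(gauss[:-1], gauss[1:])]
    lap.append(gauss[-1])
    return lap

def reconstruct(lap):
    """Invert the decomposition from the top layer down."""
    img = lap[-1]
    for band in reversed(lap[:-1]):
        img = band + _up(img)
    return img
```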
Then, for each image, the weight of each pixel point is calculated according to weight indices (common indices are contrast, saturation, and proper exposure), yielding the weight image for that image. For any pixel position shared by the plurality of images, the weights at that position across all images sum to 1 (the weight normalization of the related art). Gaussian pyramid decomposition is then performed on each weight image to obtain a weight Gaussian pyramid.
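As a hedged sketch using only the proper-exposure index (a Gaussian around mid-grey on a [0,1] scale; the sigma value is illustrative), with the normalization so that per-position weights sum to 1; contrast and saturation terms would multiply in similarly.

```python
import numpy as np

def exposure_weights(images, sigma=0.2):
    """Per-pixel well-exposedness weight for [0,1]-scaled images,
    normalised so the weights at each position sum to 1 across images."""
    raw = [np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)) for img in images]
    total = sum(raw) + 1e-12   # guard against all-zero positions
    return [w / total for w in raw]
```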
Then, for each image, the Laplacian pyramid of the image is multiplied layer-wise by the weight Gaussian pyramid to obtain a weighted Laplacian pyramid. The weighted Laplacian pyramids of all images are added to obtain the fused Laplacian pyramid. An image is reconstructed from the fused Laplacian pyramid; the result is the HDR image.
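Putting the pieces together, a minimal end-to-end sketch of this fusion (crude pyramid operators, power-of-two sizes, pre-normalised weight maps; all names illustrative):

```python
import numpy as np

def _down(img):
    return img[::2, ::2]   # crude decimation (a real pipeline blurs first)

def _up(img):
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def _gauss_pyr(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(_down(pyr[-1]))
    return pyr

def _lap_pyr(img, levels):
    g = _gauss_pyr(img, levels)
    return [lo - _up(hi) for lo, hi in zip(g[:-1], g[1:])] + [g[-1]]

def exposure_fusion(images, weights, levels=3):
    """Multiply each image's Laplacian pyramid layer-wise by the Gaussian
    pyramid of its weight map, sum over images, then reconstruct."""
    fused = None
    for img, w in zip(images, weights):
        prod = [l * g for l, g in zip(_lap_pyr(img.astype(float), levels),
                                      _gauss_pyr(w.astype(float), levels))]
        fused = prod if fused is None else [f + p for f, p in zip(fused, prod)]
    out = fused[-1]
    for band in reversed(fused[:-1]):   # reconstruct from the top down
        out = band + _up(out)
    return out
```

With two constant images and equal weights, the result is their weighted average, which is a quick sanity check on the pipeline.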
The following describes a detailed embodiment of the technical solution of the present disclosure based on the principle of the multi-exposure fusion technique described above.
FIG. 1 is a flow diagram illustrating an image exposure fusion method according to an exemplary embodiment. The image exposure fusion method is applied to a terminal, for example a camera, a mobile phone with a camera function, a tablet, a computer, or another terminal. As shown in FIG. 1, the image exposure fusion method may include the following steps.
In step S11, an image sequence to be fused is acquired, where the image sequence to be fused includes a plurality of images to be fused, and the images to be fused are images captured at different exposure amounts for the same scene.
In some embodiments, each image to be fused in the sequence is captured of the same scene at the same shooting angle, with each image corresponding to a different exposure amount. Since the exposure amount is the product of illuminance and time, where illuminance is determined by the aperture and time is controlled by the shutter, the aperture size and shutter duration jointly determine the exposure. A photographer can therefore capture the plurality of images to be fused by controlling the aperture size and the shutter duration.
It should be noted that the present disclosure does not limit the relative order between the images to be fused in the image sequence to be fused. The order between the images to be fused in the sequence of images to be fused can be arbitrary.
In step S12, a fused laplacian pyramid is calculated from the image sequence to be fused.
The implementation of calculating the fusion laplacian pyramid according to the image sequence to be fused is similar to the multi-exposure fusion process in the related art, and is not described herein again.
In step S13, replacing the luminance distribution of the second top-level image of the fused laplacian pyramid with the luminance distribution of the first top-level image of the target laplacian pyramid to obtain a target fused laplacian pyramid, where the target laplacian pyramid is generated according to the target to-be-fused image in the to-be-fused image sequence.
Based on the principle of the multi-exposure fusion technique, in the weight image of each image to be fused, the weight of each pixel point is computed independently from the weight indices, without reference to other pixel points. That is, the brightness distribution of the whole image is not considered when generating the weight image. Accordingly, in the fused Laplacian pyramid calculated from the weight images, the brightness distribution of each layer does not account for the overall brightness distribution of the image. As a result, brightness inversion inevitably occurs in the HDR image reconstructed from the fused Laplacian pyramid. Note that when brightness inversion is confined to a small local area, it is not easily noticed. The phenomenon is illustrated with FIG. 2: suppose pixel A lies in the night sky and pixel B on ground lit by a street light. Normally the brightness of pixel A should be less than that of pixel B; if the brightness of pixel A is greater than that of pixel B, brightness inversion has occurred.
The target image to be fused may be any image in the sequence of images to be fused; it is a real image captured by the imaging device. Because the imaging device renders the image according to the brightness distribution of the real scene, which is itself reasonable, the brightness distribution of the target image to be fused is reasonable and conforms to the overall brightness distribution of the real scene. Performing Laplacian pyramid decomposition on the target image to be fused yields the target Laplacian pyramid, whose brightness distribution conforms to that of the target image to be fused.
The fused Laplacian pyramid computed by the multi-exposure fusion algorithm does not take the overall brightness distribution into account; during fusion, because the weights are largely insensitive to brightness, the fused Laplacian pyramid cannot well preserve the overall brightness distribution of the image sequence to be fused. In the embodiment of the present disclosure, the brightness distribution of the second top-level image of the fused Laplacian pyramid is replaced according to the brightness distribution of the first top-level image of the target Laplacian pyramid, so that the brightness distribution of the top-level image of the resulting target fused Laplacian pyramid better conforms to the objective overall brightness distribution, i.e., to the global brightness distribution of the target image to be fused.
It should be noted here that, within the fused Laplacian pyramid, the closer a layer is to the top, the greater its influence on the overall luminance distribution of the reconstructed image.
In step S14, a target image is reconstructed according to the target fusion laplacian pyramid.
The method for reconstructing the target image according to the target fusion laplacian pyramid is the same as the method for reconstructing the HDR image according to the fusion laplacian pyramid in the related art, and details are not repeated here.
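The reconstruction referred to here follows the standard pyramid-collapse scheme: upsample the top level, add the next Laplacian layer, and repeat down to level 0. A minimal 1-D sketch, using pairwise averaging in place of the Gaussian filtering used in practice (all function names are illustrative, and even-length signals are assumed at every level):

```python
# 1-D sketch of Laplacian pyramid decomposition and reconstruction.
# Real implementations use Gaussian low-pass filtering; plain pairwise
# averaging stands in for that step here. Even lengths assumed per level.

def downsample(sig):
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig), 2)]

def upsample(sig):
    out = []
    for v in sig:
        out.extend([v, v])   # nearest-neighbour expansion
    return out

def build_laplacian(sig, levels):
    pyr = []
    cur = sig
    for _ in range(levels):
        low = downsample(cur)
        up = upsample(low)
        pyr.append([a - b for a, b in zip(cur, up)])  # detail layer
        cur = low
    pyr.append(cur)          # top level: residual low-pass signal
    return pyr

def reconstruct(pyr):
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = [a + b for a, b in zip(lap, upsample(cur))]
    return cur

signal = [10, 12, 30, 28, 5, 7, 60, 58]
pyr = build_laplacian(signal, 2)
restored = reconstruct(pyr)
```

Because each Laplacian layer stores exactly what the low-pass step discarded, reconstruction is lossless in this sketch.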
In this way, the image sequence to be fused is obtained, each image of which is captured at a different exposure of the same scene, and the fused Laplacian pyramid is calculated from it. The luminance distribution of the second top-level image of the fused Laplacian pyramid is then replaced according to the luminance distribution of the first top-level image of the target Laplacian pyramid to obtain the target fused Laplacian pyramid. Since the target Laplacian pyramid is generated from the target image to be fused in the image sequence to be fused, the luminance distribution of its first top-level image is the real luminance distribution captured by the image pickup device. Replacing the luminance distribution of the second top-level image, obtained by multi-exposure fusion, with this real luminance distribution corrects the top-level image of the fused Laplacian pyramid and yields the corrected target fused Laplacian pyramid, from which the target image is reconstructed. This mitigates the luminance inversion that occurs when the image is reconstructed directly from the fused Laplacian pyramid as in the related art, and makes the global luminance distribution of the target image more reasonable.
Optionally, the first top-level image and the second top-level image have the same number of pixel points, and the luminance distribution of the second top-level image of the fused laplacian pyramid is replaced according to the luminance distribution of the first top-level image of the target laplacian pyramid to obtain the target fused laplacian pyramid, including the following steps:
calculating the brightness mean value and the standard deviation of each pixel point in the first top layer image to obtain a first brightness mean value and a first brightness standard deviation; calculating the brightness mean value of each pixel point in the second top layer image to obtain a second brightness mean value; calculating a first target brightness value of each pixel point in the first top-layer image according to the size relation between the brightness standard deviation and a preset threshold value and the first brightness mean value to obtain a first target brightness image; obtaining a second target brightness image according to the first target brightness image and the second brightness mean value; and updating the second top-level image according to the second target brightness image to obtain an updated second top-level image, wherein the top-level image of the target fusion Laplacian pyramid is the updated second top-level image.
Here the first target luminance image may represent the luminance distribution of the first top-level image. The second target luminance image, obtained from the first target luminance image and the second luminance mean, may represent that luminance distribution after it has been mapped into the luminance distribution space of the second top-level image.
Optionally, the calculating a first target brightness value of each pixel point in the first top-level image according to the magnitude relationship between the standard deviation of the brightness and a preset threshold and the first brightness mean value includes:
and under the condition that the standard deviation of the brightness of the first top-layer image is smaller than or equal to the preset threshold, regarding each pixel point in the first top-layer image, taking the difference value between the brightness value of the pixel point and the first brightness mean value as a first target brightness value of the pixel point.
In the related art, the standard deviation and mean of an image's luminance can be used to evaluate image quality. The mean of the pixel values reflects the average brightness of the image; the larger the average brightness, the better the image quality. The pixel standard deviation reflects how dispersed the pixel gray values are around the mean; the larger the standard deviation, the more spread out the gray-level distribution in the image and the better the image quality. In some extreme cases, however, an excessively large standard deviation is unreasonable and degrades image quality (similar in principle to overexposure). Therefore, in the embodiment of the present disclosure, the preset threshold is an empirical value used to guard against an excessively large standard deviation.
The luminance standard deviation of the first top-level image reflects the dispersion of the luminance values of its pixel points: the larger the standard deviation, the more dispersed the luminance values of the pixel points in the first top-level image; correspondingly, the smaller the standard deviation, the less dispersed they are.
In some embodiments, the luminance standard deviation and the first luminance mean of the first top-level image may be calculated from the first top-level image, and similarly a second luminance mean may be calculated from the second top-level image. When the luminance standard deviation of the first top-level image is less than or equal to the preset threshold, the difference between the luminance value of each pixel point in the first top-level image and the first luminance mean is calculated. This difference characterizes the pixel's position within the luminance distribution space of the first top-level image and is taken as the pixel's first target luminance value.
Optionally, the obtaining a second target luminance image according to the first target luminance image and the second luminance average value includes:
and increasing the brightness value of each pixel point in the first target brightness image by the second brightness mean value to obtain a second target brightness image.
In some embodiments, for each pixel point in the first target luminance image, the sum of the pixel's first target luminance value and the second luminance mean is calculated to obtain the pixel's second target luminance value. The luminance value of each pixel point in the second target luminance image is its second target luminance value.
Optionally, the updating the second top-level image according to the second target brightness image to obtain an updated second top-level image includes:
and assigning the second target brightness value of each pixel point in the second target brightness image to the pixel point at the corresponding position in the second top image to obtain the updated second top image.
In some embodiments, for each pixel point in the second target luminance image, the second target luminance value of the pixel point is assigned to the pixel point at the corresponding position in the second top-level image, i.e., the luminance value of that pixel point in the second top-level image is replaced by the second target luminance value. In this way, the luminance distribution of the pixel points of the first top-level image is transferred to the corresponding pixel points of the second top-level image, the updated second top-level image is obtained, and the target fused Laplacian pyramid can then be obtained.
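Combining the steps above, the whole top-level luminance replacement for the ordinary case (luminance standard deviation not above the preset threshold) can be sketched as follows; plain nested lists stand in for image buffers, and the function name is illustrative:

```python
# Hedged sketch of the top-level luminance replacement: each pixel of the
# first top-level image is offset by the first luminance mean (first target
# luminance), then re-centred on the second luminance mean (second target
# luminance), and the result replaces the fused pyramid's top level.
import statistics

def replace_top_level_luminance(first_top, second_top):
    flat1 = [v for row in first_top for v in row]
    flat2 = [v for row in second_top for v in row]
    mean1 = statistics.mean(flat1)   # first luminance mean
    mean2 = statistics.mean(flat2)   # second luminance mean
    # (v - mean1) is the first target luminance value; adding mean2 maps it
    # into the luminance space of the second top-level image.
    return [[(v - mean1) + mean2 for v in row] for row in first_top]

first_top = [[10, 20], [30, 40]]     # real luminance from the target image
second_top = [[100, 90], [80, 70]]   # fused pyramid's top level
updated = replace_top_level_luminance(first_top, second_top)
```

Note that the updated top level keeps the second top-level image's mean brightness while adopting the first top-level image's relative distribution.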
Optionally, the calculating a first target brightness value of each pixel point in the first top-level image according to the magnitude relationship between the standard deviation of the brightness and a preset threshold and the first brightness mean value includes:
and under the condition that the brightness standard deviation of the first top-layer image is larger than the preset threshold, calculating a relative standard distance value between the brightness value of each pixel point and the first brightness mean value aiming at each pixel point in the first top-layer image, and taking the product of the relative standard distance value and the preset threshold as a first target brightness value of the pixel point.
In some embodiments, the embodiment of calculating the relative standard distance value between the luminance value of the pixel point and the first luminance mean value is as follows:
the relative standard distance value is (the brightness value of the pixel point-the first brightness mean value)/the brightness standard deviation. It should be noted that relative standard distance values, also referred to as standard scores (standard scores) or z-scores (z-scores), are characterized by the process of dividing the difference between a number and a mean by the standard deviation. The z-fraction can truly reflect the relative standard distance between the brightness value of one pixel point and the first brightness mean value. If we convert the brightness value of each pixel point into z-scores, each z-score will represent the distance or dispersion from the brightness value of a specific pixel point to the first brightness mean value by taking a brightness standard deviation as a unit. The z-scores of all the pixel points represent the brightness distribution of all the pixel points under the condition of taking the standard deviation of the brightness as a unit.
The product of the relative standard distance value and the preset threshold expresses the distance, or dispersion, of a pixel's luminance value from the first luminance mean in units of N luminance standard deviations, where N is the quotient of the preset threshold and the luminance standard deviation.
In some embodiments, the luminance standard deviation and first luminance mean of the first top-level image are calculated, along with the second luminance mean of the second top-level image. When the luminance standard deviation of the first top-level image is greater than the preset threshold, the relative standard distance value between the luminance value of each pixel point in the first top-level image and the first luminance mean is calculated, giving the distance, or dispersion, between that pixel's luminance value and the first luminance mean. The product of the relative standard distance value and the preset threshold is then taken as the pixel's first target luminance value. In this way, the dispersion of the luminance values of the pixel points in the first target luminance image is controlled by the preset threshold.
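Both branches for computing the first target luminance values can be sketched together; the function name is illustrative, and the population standard deviation is an assumption (the source does not say whether the sample or population form is used):

```python
# Hedged sketch: first target luminance values under both branches.
# If the luminance standard deviation is within the preset threshold, each
# value is simply offset by the mean; otherwise each offset is expressed as
# a z-score and rescaled by the threshold, capping the spread.
import statistics

def first_target_luminance(pixels, threshold):
    mean1 = statistics.mean(pixels)
    std1 = statistics.pstdev(pixels)   # population standard deviation (assumed)
    if std1 <= threshold:
        # ordinary branch: plain offset from the first luminance mean
        return [v - mean1 for v in pixels]
    # over-dispersed branch: z-score times threshold
    return [(v - mean1) / std1 * threshold for v in pixels]

pixels = [0, 10, 20, 30]               # pstdev ~ 11.18
clamped = first_target_luminance(pixels, threshold=5.0)
wide = first_target_luminance(pixels, threshold=50.0)
```

In the over-dispersed branch the result's standard deviation equals the threshold exactly, which is how the spread is controlled.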
Optionally, the calculating a fusion laplacian pyramid according to the image sequence to be fused includes:
performing laplacian pyramid decomposition on each image to be fused in the image sequence to be fused to obtain a first laplacian pyramid sequence, wherein the first laplacian pyramid sequence comprises a plurality of first laplacian pyramids, and each first laplacian pyramid corresponds to one image to be fused;
acquiring a weight image sequence corresponding to the image sequence to be fused, wherein the weight image sequence comprises a plurality of weight images, each weight image corresponds to one image to be fused, and each weight image represents the weight for fusing the image sequence to be fused;
performing gaussian pyramid decomposition on each weight image in the weight image sequence to obtain a first gaussian pyramid sequence, wherein the first gaussian pyramid sequence comprises a plurality of first gaussian pyramids, and each first gaussian pyramid corresponds to one weight image;
and generating the fused Laplacian pyramid according to the first Laplacian pyramid sequence and the first Gaussian pyramid sequence.
It should be noted that each first laplacian pyramid includes a plurality of image layers from high to low in level, and the higher the image level is, the lower the resolution is. Accordingly, each first gaussian pyramid comprises a plurality of image layers from high to low in level, the higher the image level, the lower the resolution. The first laplacian pyramid and the first gaussian pyramid have the same layer number, and the image resolution of the same layer in the first laplacian pyramid and the first gaussian pyramid is the same. The specific number of layers of the first laplacian pyramid and the first gaussian pyramid can be customized based on requirements.
For example, if the resolution of the image to be fused is 4000 × 3000, the resolution of the top-level image of the first laplacian pyramid and the first gaussian pyramid may be 4 × 3.
For example, if the resolution of the 0th-level image of the first Laplacian pyramid is 4000 × 3000, the resolution of the 1st-level image is 2000 × 1500, that of the 2nd-level image is 1000 × 750, that of the 3rd-level image is 500 × 375, and that of the 4th-level image is 250 × 188 (375 divided by 2 is 187.5, which is rounded up to 188); in this way, the resolution of each level of the first Laplacian pyramid can be determined.
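The per-level resolutions follow from repeated halving with odd dimensions rounded up, as the text states; a small sketch (the function name is illustrative):

```python
# Sketch: resolution of each pyramid level, halving width and height per
# level and rounding odd dimensions up (e.g. 375 -> 188).
import math

def pyramid_resolutions(width, height, levels):
    res = [(width, height)]
    for _ in range(levels):
        width = math.ceil(width / 2)
        height = math.ceil(height / 2)
        res.append((width, height))
    return res

levels = pyramid_resolutions(4000, 3000, 4)
```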
Each image to be fused corresponds to a weight image whose resolution is the same as that of the image to be fused. The value of each pixel point in the weight image is a weight value, calculated from the weight indices of the pixel point at the same position in the corresponding image to be fused. As mentioned above, the weight indices include at least one of contrast, saturation, and exposure moderation.
Among the plurality of images to be fused, the weight values corresponding to the pixel points at the same position sum to 1.
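The constraint that per-pixel weights sum to 1 is typically enforced by normalizing across the sequence; a hedged sketch with plain nested lists standing in for weight maps (the `eps` guard against an all-zero pixel is an assumption, not stated in the source):

```python
# Sketch: normalize per-pixel weights across the image sequence so that,
# at each pixel position, the weights of all images to be fused sum to 1.
def normalize_weights(weight_maps, eps=1e-12):
    h = len(weight_maps[0])
    w = len(weight_maps[0][0])
    out = [[[0.0] * w for _ in range(h)] for _ in weight_maps]
    for i in range(h):
        for j in range(w):
            total = sum(wm[i][j] for wm in weight_maps) + eps
            for k, wm in enumerate(weight_maps):
                out[k][i][j] = wm[i][j] / total
    return out

# two 1x2 weight maps for a two-image sequence
weights = normalize_weights([[[2.0, 1.0]], [[2.0, 3.0]]])
```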
Optionally, the generating the fused laplacian pyramid according to the first laplacian pyramid sequence and the first gaussian pyramid sequence includes:
for each first Laplacian pyramid, multiplying the first Laplacian pyramid by the corresponding first Gaussian pyramid to obtain a second Laplacian pyramid, wherein the first Laplacian pyramid and the first Gaussian pyramid that are multiplied correspond to the same image to be fused; and adding all of the second Laplacian pyramids to obtain the fused Laplacian pyramid.
In detail, the first Laplacian pyramid has the same number of layers as the first Gaussian pyramid, and multiplying the first Laplacian pyramid by the corresponding first Gaussian pyramid includes: multiplying the Nth layer of the first Laplacian pyramid by the corresponding Nth layer of the first Gaussian pyramid to obtain the Nth layer of the second Laplacian pyramid, where N is an integer greater than or equal to 0.
For example, in multiplying the Nth layer of the first Laplacian pyramid by the corresponding Nth layer of the first Gaussian pyramid, the pixel value of the pixel point at coordinates (i, j) in the Nth layer of the first Laplacian pyramid is multiplied by the weight value of the pixel point at coordinates (i, j) in the Nth layer of the first Gaussian pyramid, yielding the pixel value of the pixel point at coordinates (i, j) in the Nth layer of the second Laplacian pyramid. The pixel values include the luminance value and the R, G, B color-channel values.
The process of adding all the second Laplacian pyramids to obtain the fused Laplacian pyramid is exemplified as follows: the sum of the pixel values at coordinates (i, j) in the Nth layer of every second Laplacian pyramid is calculated to obtain the fused pixel value of the pixel point at coordinates (i, j) in the Nth layer of the fused Laplacian pyramid.
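The multiply-and-sum over one pyramid layer can be sketched as follows, with nested lists standing in for single-channel layers (the function name is illustrative):

```python
# Sketch: fuse one pyramid layer. Each image's Laplacian layer is weighted
# per pixel by the matching layer of its Gaussian weight pyramid, then the
# weighted layers are summed to give the fused layer.
def fuse_layer(lap_layers, weight_layers):
    h = len(lap_layers[0])
    w = len(lap_layers[0][0])
    fused = [[0.0] * w for _ in range(h)]
    for lap, wgt in zip(lap_layers, weight_layers):
        for i in range(h):
            for j in range(w):
                fused[i][j] += lap[i][j] * wgt[i][j]
    return fused

# two images, one 1x2 layer each; weights sum to 1 per pixel
lap = [[[10.0, 20.0]], [[30.0, 40.0]]]
wgt = [[[0.25, 0.5]], [[0.75, 0.5]]]
fused = fuse_layer(lap, wgt)
```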
Optionally, the target laplacian pyramid represents the first laplacian pyramid of the target image to be fused; the target image to be fused is determined by any one of the following modes:
randomly determining the target image to be fused from the image sequence to be fused; and determining the image to be fused with the minimum brightness mean value in the image sequence to be fused as the target image to be fused.
And the image to be fused with the minimum brightness mean value in the image sequence to be fused is the darkest image to be fused in the image sequence to be fused.
In some implementations, the image to be fused with the largest luminance standard deviation in the image sequence to be fused may instead be determined as the target image to be fused, or the image to be fused corresponding to the middle exposure value in the sequence may be determined as the target image to be fused. The present disclosure is not limited thereto.
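Two of the selection strategies mentioned above can be sketched as follows; images are nested lists of luminance values, the function returns the index of the chosen image, and the function name and strategy keys are illustrative assumptions:

```python
# Hedged sketch of target-image selection: either the darkest image
# (smallest luminance mean) or the image with the largest luminance spread
# (largest standard deviation).
import statistics

def pick_target_index(images, strategy="darkest"):
    flat = lambda img: [v for row in img for v in row]
    if strategy == "darkest":
        return min(range(len(images)),
                   key=lambda k: statistics.mean(flat(images[k])))
    if strategy == "max_spread":
        return max(range(len(images)),
                   key=lambda k: statistics.pstdev(flat(images[k])))
    raise ValueError(strategy)

# three 1x2 "images": means 55, 10, 100; pstdevs 5, 5, 100
seq = [[[50, 60]], [[5, 15]], [[0, 200]]]
```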
FIG. 3 is a block diagram illustrating an image exposure fusion apparatus according to an exemplary embodiment. Referring to fig. 3, the apparatus 300 includes:
an obtaining module 310, configured to obtain an image sequence to be fused, where the image sequence to be fused includes a plurality of images to be fused, and the images to be fused are images captured at different exposure amounts for the same scene;
a calculation module 320 configured to calculate a fused laplacian pyramid from the sequence of images to be fused;
a replacing module 330, configured to replace, according to the luminance distribution of the first top-level image of the target laplacian pyramid, the luminance distribution of the second top-level image of the fused laplacian pyramid to obtain a target fused laplacian pyramid, where the target laplacian pyramid is generated according to the target image to be fused in the image sequence to be fused;
a reconstruction module 340 configured to reconstruct a target image according to the target fusion laplacian pyramid.
With this apparatus, the image sequence to be fused is obtained, each image of which is captured at a different exposure of the same scene, and the fused Laplacian pyramid is calculated from it. The luminance distribution of the second top-level image of the fused Laplacian pyramid is then replaced according to the luminance distribution of the first top-level image of the target Laplacian pyramid to obtain the target fused Laplacian pyramid. Since the target Laplacian pyramid is generated from the target image to be fused in the image sequence to be fused, the luminance distribution of its first top-level image is the real luminance distribution captured by the image pickup device. Replacing the luminance distribution of the second top-level image, obtained by multi-exposure fusion, with this real luminance distribution corrects the top-level image of the fused Laplacian pyramid and yields the corrected target fused Laplacian pyramid, from which the target image is reconstructed. This mitigates the luminance inversion that occurs when the image is reconstructed directly from the fused Laplacian pyramid as in the related art, and makes the global luminance distribution of the target image more reasonable.
Optionally, the first top-level image and the second top-level image have the same number of pixel points, and the replacing module 330 includes:
the first calculation submodule is configured to calculate the brightness mean value and the standard deviation of each pixel point in the first top-layer image to obtain a first brightness mean value and a first brightness standard deviation;
the second calculation submodule is configured to calculate the brightness mean value of each pixel point in the second top-layer image to obtain a second brightness mean value;
the third calculation submodule is configured to calculate a first target brightness value of each pixel point in the first top-level image according to the magnitude relation between the brightness standard deviation and a preset threshold value and the first brightness mean value to obtain a first target brightness image;
the first determining submodule is configured to obtain a second target brightness image according to the first target brightness image and the second brightness mean value;
and the updating submodule is configured to update the second top-level image according to the second target brightness image to obtain an updated second top-level image, and the top-level image of the target fusion Laplacian pyramid is the updated second top-level image.
Optionally, the third computing submodule comprises:
the second determining submodule is configured to, when the standard deviation of the brightness of the first top-level image is smaller than or equal to the preset threshold, regarding each pixel point in the first top-level image, use a difference value between the brightness value of the pixel point and the first brightness mean value as a first target brightness value of the pixel point.
Optionally, the third computing submodule comprises:
a third determining submodule configured to, when the standard deviation of the brightness of the first top-level image is greater than the preset threshold, calculate, for each pixel point in the first top-level image, a relative standard distance value between the brightness value of the pixel point and the first brightness mean value, and take a product of the relative standard distance value and the preset threshold as a first target brightness value of the pixel point.
Optionally, the first determining submodule is configured to increase the brightness value of each pixel point in the first target brightness image by the second brightness mean value to obtain the second target brightness image.
Optionally, the update sub-module is configured to: and assigning the second target brightness value of each pixel point in the second target brightness image to the pixel point at the corresponding position in the second top image to obtain the updated second top image.
Optionally, the calculation module 320 includes:
the first decomposition submodule is configured to perform laplacian pyramid decomposition on each image to be fused in the image sequence to be fused to obtain a first laplacian pyramid sequence, the first laplacian pyramid sequence comprises a plurality of first laplacian pyramids, and each first laplacian pyramid corresponds to one image to be fused;
the obtaining submodule is configured to obtain a weighted image sequence corresponding to the image sequence to be fused, the weighted image sequence comprises a plurality of weighted images, each weighted image corresponds to one image to be fused, and each weighted image represents the weight for fusing the image sequence to be fused;
the second decomposition submodule is configured to perform Gaussian pyramid decomposition on each weight image in the weight image sequence to obtain a first Gaussian pyramid sequence, the first Gaussian pyramid sequence comprises a plurality of first Gaussian pyramids, and each first Gaussian pyramid corresponds to one weight image;
a generation submodule configured to generate the fused Laplacian pyramid from the first Laplacian pyramid sequence and the first Gaussian pyramid sequence.
Optionally, the generating sub-module includes:
a fourth calculation submodule configured to, for each first Laplacian pyramid, multiply the first Laplacian pyramid by the corresponding first Gaussian pyramid to obtain a second Laplacian pyramid, where the first Laplacian pyramid and the first Gaussian pyramid that are multiplied correspond to the same image to be fused;
a fifth computation submodule configured to add all of the second Laplacian pyramids to obtain the fused Laplacian pyramid.
Optionally, the number of layers of the first laplacian pyramid is the same as that of the first gaussian pyramid, and the fourth computation submodule is configured to multiply the nth layer of the first laplacian pyramid with the corresponding nth layer of the first gaussian pyramid to obtain the nth layer of the second laplacian pyramid, where N is an integer greater than or equal to 0.
Optionally, the target laplacian pyramid represents the first laplacian pyramid of the target image to be fused;
the target image to be fused is determined by any one of the following modes:
randomly determining the target image to be fused from the image sequence to be fused;
and determining the image to be fused with the minimum brightness mean value in the image sequence to be fused as the target image to be fused.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image exposure fusion method provided by the present disclosure.
FIG. 4 is a block diagram illustrating an apparatus 800 for image exposure blending according to an exemplary embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, the apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the image exposure fusion method described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 806 provides power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800, the relative positioning of components, such as a display and keypad of the device 800, the sensor assembly 814 may also detect a change in the position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the image exposure fusion methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the apparatus 800 to perform the image exposure fusion method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the image exposure fusion method described above when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. An image exposure fusion method, characterized in that the method comprises:
acquiring an image sequence to be fused, wherein the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images of the same scene captured at different exposure amounts;
calculating a fused Laplacian pyramid according to the image sequence to be fused;
replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of a target Laplacian pyramid to obtain a target fused Laplacian pyramid, wherein the target Laplacian pyramid is generated according to a target image to be fused in the image sequence to be fused;
and reconstructing a target image according to the target fused Laplacian pyramid.
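Read as an algorithm, the four steps of claim 1 can be sketched in numpy. This is an illustrative sketch only, not the claimed implementation: `down2`/`up2` use 2×2 averaging and pixel repetition rather than Gaussian filtering, the well-exposedness weight is a hypothetical choice, and the claim's top-level brightness replacement is omitted for brevity.

```python
import numpy as np

def down2(x):
    # Halve resolution with 2x2 averaging (a crude stand-in for blur + subsample).
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    x = x[:h, :w]
    return 0.25 * (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2])

def up2(x, shape):
    # Upsample by pixel repetition back to `shape`.
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)[:shape[0], :shape[1]]

def lap_pyr(img, levels):
    # Laplacian pyramid: band-pass layers plus a low-frequency top level.
    pyr, cur = [], img
    for _ in range(levels - 1):
        nxt = down2(cur)
        pyr.append(cur - up2(nxt, cur.shape))
        cur = nxt
    return pyr + [cur]                 # last entry = top level

def gauss_pyr(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(down2(pyr[-1]))
    return pyr

def reconstruct(pyr):
    # Invert the decomposition: upsample the top and add each band back in.
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = up2(cur, lap.shape) + lap
    return cur

def exposure_fuse(images, levels=3):
    # Hypothetical well-exposedness weight, normalised so weights sum to 1 per pixel.
    ws = [np.exp(-((im - 0.5) ** 2) / 0.08) for im in images]
    total = np.sum(ws, axis=0) + 1e-12
    ws = [w / total for w in ws]
    # Fused Laplacian pyramid = layer-wise weighted sum (claims 7-9).
    lps = [lap_pyr(im, levels) for im in images]
    gps = [gauss_pyr(w, levels) for w in ws]
    fused = [sum(lp[n] * gp[n] for lp, gp in zip(lps, gps)) for n in range(levels)]
    # Reconstruct the target image (the top-level replacement of claim 1 is omitted).
    return reconstruct(fused)

seq = [np.full((8, 8), v) for v in (0.2, 0.5, 0.8)]   # toy "exposure bracket"
out = exposure_fuse(seq)
```

Because this toy bracket is symmetric around mid-grey, the fused result sits near 0.5; the actual weight measures are supplied by the weight images of claim 7.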
2. The method of claim 1, wherein the first top-level image and the second top-level image have the same number of pixels, and wherein the replacing the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of the target Laplacian pyramid to obtain the target fused Laplacian pyramid comprises:
calculating the mean value and standard deviation of the brightness values of the pixel points in the first top-level image to obtain a first brightness mean value and a first brightness standard deviation;
calculating the mean value of the brightness values of the pixel points in the second top-level image to obtain a second brightness mean value;
calculating a first target brightness value for each pixel point in the first top-level image according to the magnitude relationship between the first brightness standard deviation and a preset threshold and according to the first brightness mean value, to obtain a first target brightness image;
obtaining a second target brightness image according to the first target brightness image and the second brightness mean value;
and updating the second top-level image according to the second target brightness image to obtain an updated second top-level image, wherein the top-level image of the target fused Laplacian pyramid is the updated second top-level image.
3. The method according to claim 2, wherein the calculating a first target brightness value for each pixel point in the first top-level image according to the magnitude relationship between the first brightness standard deviation and a preset threshold and according to the first brightness mean value comprises:
under the condition that the first brightness standard deviation of the first top-level image is smaller than or equal to the preset threshold, for each pixel point in the first top-level image, taking the difference between the brightness value of the pixel point and the first brightness mean value as the first target brightness value of the pixel point.
4. The method according to claim 2, wherein the calculating a first target brightness value for each pixel point in the first top-level image according to the magnitude relationship between the first brightness standard deviation and a preset threshold and according to the first brightness mean value comprises:
under the condition that the first brightness standard deviation of the first top-level image is larger than the preset threshold, for each pixel point in the first top-level image, calculating a relative standard distance between the brightness value of the pixel point and the first brightness mean value, and taking the product of the relative standard distance and the preset threshold as the first target brightness value of the pixel point.
5. The method according to any one of claims 2-4, wherein the obtaining a second target brightness image according to the first target brightness image and the second brightness mean value comprises:
adding the second brightness mean value to the brightness value of each pixel point in the first target brightness image to obtain the second target brightness image.
6. The method of claim 2, wherein the updating the second top-level image according to the second target brightness image to obtain an updated second top-level image comprises:
and assigning the second target brightness value of each pixel point in the second target brightness image to the pixel point at the corresponding position in the second top-level image to obtain the updated second top-level image.
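Claims 2-6 amount to a mean/standard-deviation transfer onto the fused pyramid's top level. A minimal numpy sketch, assuming `first_top` and `second_top` are equal-sized single-channel top-level images and `T` is the preset threshold (the function name and inputs are illustrative, not from the patent):

```python
import numpy as np

def replace_top_level(first_top, second_top, T):
    """Transfer the brightness distribution of first_top onto second_top.

    Follows claims 2-6: centre the first top-level image on its own mean,
    cap its spread at T when its standard deviation exceeds T, then
    re-centre the result on the mean of the second top-level image.
    """
    mu1, sigma1 = first_top.mean(), first_top.std()   # first mean / std (claim 2)
    mu2 = second_top.mean()                           # second mean (claim 2)
    if sigma1 <= T:
        # Claim 3: keep each pixel's deviation from the first mean as-is.
        target = first_top - mu1
    else:
        # Claim 4: relative standard distance times the threshold.
        target = (first_top - mu1) / sigma1 * T
    # Claims 5-6: shift by the second mean and use the result as the new top level.
    return target + mu2

rng = np.random.default_rng(0)
first = rng.normal(0.6, 0.2, (4, 4))
second = rng.normal(0.4, 0.1, (4, 4))
updated = replace_top_level(first, second, T=0.1)
```

By construction the updated top level keeps the mean of the second top-level image while its standard deviation never exceeds the threshold `T`.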
7. The method according to claim 1, wherein the calculating a fused Laplacian pyramid according to the image sequence to be fused comprises:
performing Laplacian pyramid decomposition on each image to be fused in the image sequence to be fused to obtain a first Laplacian pyramid sequence, wherein the first Laplacian pyramid sequence comprises a plurality of first Laplacian pyramids, and each first Laplacian pyramid corresponds to one image to be fused;
acquiring a weight image sequence corresponding to the image sequence to be fused, wherein the weight image sequence comprises a plurality of weight images, each weight image corresponds to one image to be fused, and each weight image represents the weights used for fusing the image sequence to be fused;
performing Gaussian pyramid decomposition on each weight image in the weight image sequence to obtain a first Gaussian pyramid sequence, wherein the first Gaussian pyramid sequence comprises a plurality of first Gaussian pyramids, and each first Gaussian pyramid corresponds to one weight image;
and generating the fused Laplacian pyramid according to the first Laplacian pyramid sequence and the first Gaussian pyramid sequence.
8. The method of claim 7, wherein the generating the fused Laplacian pyramid according to the first Laplacian pyramid sequence and the first Gaussian pyramid sequence comprises:
for each first Laplacian pyramid, multiplying the first Laplacian pyramid by the corresponding first Gaussian pyramid to obtain a second Laplacian pyramid, wherein the first Laplacian pyramid and the first Gaussian pyramid that are multiplied together correspond to the same image to be fused;
and adding all of the second Laplacian pyramids to obtain the fused Laplacian pyramid.
9. The method of claim 8, wherein the first Laplacian pyramid has the same number of layers as the first Gaussian pyramid, and wherein the multiplying the first Laplacian pyramid by the corresponding first Gaussian pyramid comprises:
multiplying the Nth layer of the first Laplacian pyramid by the corresponding Nth layer of the first Gaussian pyramid to obtain the Nth layer of the second Laplacian pyramid, wherein N is an integer greater than or equal to 0.
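A minimal sketch of the layer-wise multiply-accumulate of claims 8 and 9, assuming each pyramid is stored as a list of numpy arrays ordered from layer 0 up to the top layer, with the weight pyramids coming from the Gaussian decomposition of claim 7 (the toy values below are hypothetical):

```python
import numpy as np

def fuse_pyramids(lap_pyrs, weight_gauss_pyrs):
    """Claims 8-9: multiply each image's Laplacian pyramid by the Gaussian
    pyramid of its weight map, layer by layer, and sum over all images."""
    n_layers = len(lap_pyrs[0])
    fused = []
    for n in range(n_layers):
        # Nth layer times the corresponding Nth layer (claim 9), summed (claim 8).
        fused.append(sum(lp[n] * gp[n]
                         for lp, gp in zip(lap_pyrs, weight_gauss_pyrs)))
    return fused

# Toy example: two "images", two layers each; weights sum to 1 per pixel.
lapA = [np.full((4, 4), 2.0), np.full((2, 2), 1.0)]
lapB = [np.full((4, 4), 4.0), np.full((2, 2), 3.0)]
wA = [np.full((4, 4), 0.25), np.full((2, 2), 0.25)]
wB = [np.full((4, 4), 0.75), np.full((2, 2), 0.75)]
fused = fuse_pyramids([lapA, lapB], [wA, wB])
```

Each fused layer keeps the shape of that pyramid level, so the result can be handed directly to an ordinary Laplacian reconstruction.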
10. The method according to any one of claims 7-9, wherein the target Laplacian pyramid is the first Laplacian pyramid corresponding to the target image to be fused;
the target image to be fused is determined in either of the following ways:
mode 1: randomly determining the target image to be fused from the image sequence to be fused;
mode 2: determining the image to be fused with the smallest brightness mean value in the image sequence to be fused as the target image to be fused.
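Mode 2 of claim 10 can be sketched in one line of numpy; `seq` and the toy brightness values below are hypothetical:

```python
import numpy as np

seq = [np.full((4, 4), v) for v in (0.7, 0.2, 0.5)]   # toy exposure bracket
# Mode 2: the image with the smallest brightness mean value becomes the target.
target_idx = int(np.argmin([im.mean() for im in seq]))
target_image = seq[target_idx]
```

The darkest frame of a bracket is typically the shortest exposure, whose low-frequency content best preserves highlights.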
11. An image exposure fusion apparatus, characterized in that the apparatus comprises:
an acquisition module configured to acquire an image sequence to be fused, wherein the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images of the same scene captured at different exposure amounts;
a calculation module configured to calculate a fused Laplacian pyramid according to the image sequence to be fused;
a replacing module configured to replace the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of a target Laplacian pyramid to obtain a target fused Laplacian pyramid, wherein the target Laplacian pyramid is generated according to a target image to be fused in the image sequence to be fused;
and a reconstruction module configured to reconstruct a target image according to the target fused Laplacian pyramid.
12. An image exposure fusion apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquire an image sequence to be fused, wherein the image sequence to be fused comprises a plurality of images to be fused, and the images to be fused are images of the same scene captured at different exposure amounts;
calculate a fused Laplacian pyramid according to the image sequence to be fused;
replace the brightness distribution of the second top-level image of the fused Laplacian pyramid according to the brightness distribution of the first top-level image of a target Laplacian pyramid to obtain a target fused Laplacian pyramid, wherein the target Laplacian pyramid is generated according to a target image to be fused in the image sequence to be fused;
and reconstruct a target image according to the target fused Laplacian pyramid.
13. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 10.
CN202111574767.1A 2021-12-21 2021-12-21 Image exposure fusion method and device and storage medium Pending CN114240792A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111574767.1A CN114240792A (en) 2021-12-21 2021-12-21 Image exposure fusion method and device and storage medium

Publications (1)

Publication Number Publication Date
CN114240792A true CN114240792A (en) 2022-03-25

Family

ID=80760750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111574767.1A Pending CN114240792A (en) 2021-12-21 2021-12-21 Image exposure fusion method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114240792A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116630218A (en) * 2023-07-02 2023-08-22 中国人民解放军战略支援部队航天工程大学 Multi-exposure image fusion method based on edge-preserving smooth pyramid
CN116630218B (en) * 2023-07-02 2023-11-07 中国人民解放军战略支援部队航天工程大学 Multi-exposure image fusion method based on edge-preserving smooth pyramid

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination