CN110751608A - Night scene high dynamic range image fusion method and device and electronic equipment - Google Patents

Night scene high dynamic range image fusion method and device and electronic equipment

Info

Publication number
CN110751608A
Authority
CN
China
Prior art keywords
image
fused
brightness
images
adjustment
Prior art date
Legal status
Granted
Application number
CN201911009224.8A
Other languages
Chinese (zh)
Other versions
CN110751608B
Inventor
Wang Tao (王涛)
Current Assignee
Beijing Maigewei Technology Co Ltd
Original Assignee
Beijing Maigewei Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Maigewei Technology Co Ltd
Priority to CN201911009224.8A
Publication of CN110751608A
Application granted
Publication of CN110751608B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides a night scene high dynamic range image fusion method and device and electronic equipment, wherein the method comprises the following steps: acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposure images; determining a reference image according to the images to be fused; adjusting the brightness of the images to be fused according to the reference image to obtain adjusted images to be fused; and fusing the adjusted images to be fused to obtain an HDR fused image. Therefore, HDR fusion can be carried out on the images to be fused without determining camera parameter information or prior scene information, the time complexity is low, the noise difference between images of different exposures is reduced, and abrupt bright areas and unnatural noise transitions in the fused image are greatly reduced or even avoided. A good fusion effect can therefore be achieved at low time complexity.

Description

Night scene high dynamic range image fusion method and device and electronic equipment
Technical Field
The invention relates to the technical field of image fusion, in particular to a night scene high-dynamic-range image fusion method and device and electronic equipment.
Background
A High-Dynamic Range (HDR) image is synthesized from images taken with different exposure times, using the best-detailed regions of each exposure. Compared with an ordinary image, an HDR image therefore provides a greater dynamic range and more image detail, and better reflects the visual effect of the real environment.
At present there are many algorithms for HDR image generation, but they broadly fall into two classes. The first is the traditional HDR algorithm, which needs camera parameter information or prior scene information; its preconditions are relatively harsh, accurate prior information is difficult to compute, and its time complexity is high. The second directly fuses multiple images with different exposure values by fixed rules (for example, papers on the direct fusion of images with different exposure values); this approach needs no camera parameters or scene priors and is fast. However, it is hard to apply to night scene HDR: in a night scene environment the brightness differences within the same scene are large and the noise differences between images of different exposures are severe, so the resulting image easily shows abrupt bright areas and unnatural noise transitions.
Therefore, there is an urgent need for a method and apparatus that require no prior information and achieve a good night scene high dynamic range image fusion effect.
Disclosure of Invention
The problem solved by the invention is how to fuse night scene images of different exposures into a high dynamic range image without prior information, while avoiding abrupt bright areas and unnatural noise transitions.
In order to solve the above problems, the present invention first provides a night scene high dynamic range image fusion method, which includes:
acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposed images;
determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
and fusing the adjusted images to be fused to obtain an HDR fused image.
Therefore, HDR fusion can be carried out on the images to be fused without determining camera parameter information or prior scene information, the time complexity is low, the noise difference between images of different exposures is reduced, and abrupt bright areas and unnatural noise transitions in the fused image are greatly reduced or even avoided. A good fusion effect can thus be achieved at low time complexity, and the likelihood of abrupt bright areas and unnatural noise in the fused image is reduced.
Optionally, during the adjusting of the brightness of the image to be fused according to the reference image, the image to be fused includes the underexposed image or the normal exposed image used for determining the reference image.
Optionally, during the adjustment of the brightness of the images to be fused according to the reference image, the images to be fused are all images to be fused except the underexposed image or normal exposure image used to determine the reference image; and
the fusing of the adjusted images to be fused comprises: fusing the reference image with the adjusted images to be fused.
Optionally, the adjusting the brightness of the image to be fused according to the reference image includes:
for each image to be fused, integrally adjusting the average brightness of the image to be fused according to the average brightness of the reference image, wherein the average brightness of the image to be fused after integral adjustment is the same as the average brightness of the reference image;
finely adjusting the brightness of the integrally adjusted image to be fused to obtain an adjusted image to be fused;
traversing all the images to be fused to obtain the corresponding images to be fused after adjustment.
Therefore, the image to be fused is adjusted to be the adjusted image to be fused which is very close to the reference image through integral adjustment and fine adjustment, so that the brightness difference between the image to be fused and the reference image is eliminated (greatly reduced), and meanwhile, noise interference is reduced through fine adjustment.
Optionally, the integrally adjusting the average brightness of the image to be fused according to the average brightness of the reference image includes:
determining the average brightness of the image to be fused and the reference image according to the brightness of each pixel point in the image to be fused and the reference image;
determining the brightness difference proportion of the image to be fused and the reference image according to the average brightness of the image to be fused and the reference image;
and integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference proportion, wherein the average brightness of the image to be fused is the same as the average brightness of the reference image after the integral adjustment.
Therefore, the brightness of the image to be fused can quickly be brought close to that of the reference image through the integral adjustment, and because all pixel points are adjusted in equal proportion, the probability of distortion caused by the integral adjustment is reduced.
Optionally, the integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference ratio includes: and carrying out the integral adjustment by multiplying the brightness of each pixel point in the image to be fused with the brightness difference proportion.
Optionally, the fine adjustment of the brightness of the integrally adjusted image to be fused, to obtain the adjusted image to be fused, comprises:
downscaling the reference image and the integrally adjusted image to be fused;
applying guided filtering with identical parameters to the downscaled reference image and the downscaled image to be fused;
determining a per-pixel difference ratio from the brightness of each pixel point in the filtered reference image and the filtered image to be fused, and storing the ratios in an adjustment mask;
upscaling the adjustment mask to the size of the image to be fused before the integral adjustment;
and adjusting the brightness of the pixel points at the corresponding positions in the integrally adjusted image to be fused according to the difference ratios in the upscaled adjustment mask, to obtain the adjusted image to be fused.
In this way, the fine adjustment is computed on downscaled copies of the images whose average brightness has already been adjusted (integrally adjusted) and the resulting mask is then upscaled, which keeps the pixel intensities of nearby similar regions close and avoids noise interference.
Optionally, the adjusting the brightness of the pixel point at the corresponding position in the image to be fused after the overall adjustment includes: and adjusting by multiplying the brightness of the pixel points at the corresponding positions in the integrally adjusted image to be fused with the corresponding difference proportion.
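A minimal NumPy sketch of this downscale, filter, mask, upscale loop follows. A plain box (mean) filter stands in for the guided filter of the claim, block averaging stands in for the downscaling, and nearest-neighbour replication for the upscaling; all three substitutions and the parameter values are simplifying assumptions for illustration only:

```python
import numpy as np

def box_blur(img, k=3):
    # Mean filter as a stand-in for guided filtering with identical parameters.
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fine_tune(img, ref, factor=2, eps=1e-6):
    h, w = img.shape
    # 1) Shrink both images (block averaging as a stand-in for resizing).
    shrink = lambda x: x.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    s_img, s_ref = shrink(img), shrink(ref)
    # 2) Filter both small images with the same parameters.
    f_img, f_ref = box_blur(s_img), box_blur(s_ref)
    # 3) The per-pixel difference ratio becomes the adjustment mask.
    mask = f_ref / (f_img + eps)
    # 4) Upscale the mask back to the full pre-adjustment size.
    full_mask = np.kron(mask, np.ones((factor, factor)))
    # 5) Multiply each pixel's brightness by the corresponding ratio.
    return img * full_mask
```

When the two images already match, the mask is close to 1 and the image passes through unchanged; when the reference is locally brighter, each region is lifted by its local ratio rather than by a single global gain.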
Optionally, the fusing the reference image and the adjusted image to be fused to obtain an HDR fused image includes:
determining the fusion weight of the adjusted image to be fused and the fusion weight of the reference image according to the adjusted image to be fused and the reference image;
and fusing the image to be fused after the adjustment and the reference image according to the fusion weight of the image to be fused after the adjustment and the fusion weight of the reference image to obtain an HDR fusion image.
In this way, by performing image fusion after assigning the fusion weight, the overall luminance noise of the finally obtained HDR fusion image can be natural.
Optionally, the determining a reference image after performing brightness adjustment according to the underexposed image or the normal exposed image includes:
selecting an underexposed image or a normal exposed image from the images to be fused as images to be referred to;
determining an overexposure mask of each pixel point in the image to be referenced according to the image to be referenced;
acquiring a preset adjusting parameter, and determining a brightness adjusting parameter of each pixel point in the image to be referenced according to the adjusting parameter and the overexposure mask;
and adjusting the brightness of each pixel point in the image to be referenced according to the brightness adjustment parameter to obtain the reference image.
In this way, by presetting the adjustment parameter and adjusting the brightness of the pixel points, the adjusted brightness serves as the overall brightness of the finally generated HDR fused image, so that the overall brightness matches the style the client prefers, improving the client's experience and comfort of use. Detecting the overexposed region prevents bright areas from being blown out during the brightness adjustment, and because the overexposure mask transitions smoothly and naturally, no transition zones with abnormal brightness jumps occur.
Optionally, the overexposure mask is obtained in the following manner:
E=1-min(n,1)
n = max(p - p_t, 0) / s
wherein E is the overexposure mask of the pixel point, p is the brightness of the pixel point in the image to be referenced, p_t is the brightness threshold, s is the smoothness and n is an intermediate variable.
Optionally, the brightness adjustment parameter is obtained in the following manner:
a=(A-1)×E+1
in the formula, a is the brightness adjustment parameter of the pixel point, A is the preset adjustment parameter, and E is the overexposure mask of the pixel point.
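As a concrete illustration, the mask and adjustment parameter can be computed with NumPy. The closed form of the intermediate variable n is not legible in the source, so the expression n = max(p - p_t, 0)/s used below is an assumption chosen to match E = 1 - min(n, 1); the threshold, smoothness and adjustment-parameter values are likewise illustrative:

```python
import numpy as np

def overexposure_mask(p, p_t=200.0, s=50.0):
    # Assumed form of the intermediate variable: how far the brightness
    # exceeds the threshold p_t, measured in units of the smoothness s.
    n = np.maximum(p - p_t, 0.0) / s
    # E = 1 - min(n, 1): 1 for dark pixels, falling smoothly to 0 above p_t + s.
    return 1.0 - np.minimum(n, 1.0)

def brightness_adjust_param(E, A=2.0):
    # a = (A - 1) * E + 1: full gain A where E = 1, no gain where E = 0.
    return (A - 1.0) * E + 1.0

# Adjust a candidate reference image pixel by pixel and clip to the 255.0 cap.
img = np.array([[10.0, 100.0], [225.0, 250.0]])
a = brightness_adjust_param(overexposure_mask(img), A=2.0)
reference = np.clip(img * a, 0.0, 255.0)
```

Dark pixels (10.0, 100.0) are doubled, while the pixel at 250.0 lies past p_t + s, gets E = 0 and is left unchanged, which is exactly the overexposure protection described above.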
Optionally, when the image to be fused is the overexposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused is as follows:
w+ = 1 - min(max(p - p_t, 0) / s, 1)
in the formula, w+ is the fusion weight of the adjusted image to be fused corresponding to the overexposed image, p is the brightness of the pixel point in the adjusted image to be fused, p_t is the brightness threshold and s is the smoothness.
Optionally, when the image to be fused is the under-exposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused is as follows:
w- = min(max(p - p_t, 0) / s, 1)
in the formula, w- is the fusion weight of the adjusted image to be fused corresponding to the underexposed image, p is the brightness of the pixel point in the adjusted image to be fused, p_t is the brightness threshold and s is the smoothness.
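The two weights can then drive a normalized weighted blend. The exact closed forms of w+ and w- are not reproduced legibly in the source, so the complementary smooth-step expressions below, built from the same p_t and s as the overexposure mask, are assumptions, and the frame values are illustrative:

```python
import numpy as np

def weight_over(p, p_t=200.0, s=50.0):
    # Assumed form: the (low-noise) overexposed frame loses weight where it is bright.
    return 1.0 - np.minimum(np.maximum(p - p_t, 0.0) / s, 1.0)

def weight_under(p, p_t=200.0, s=50.0):
    # Assumed complementary form: the underexposed frame supplies bright regions.
    return np.minimum(np.maximum(p - p_t, 0.0) / s, 1.0)

def fuse_pair(img_over, img_under, p_t=200.0, s=50.0, eps=1e-6):
    # Each adjusted frame is weighted by its own pixel brightness,
    # then the weighted sum is normalized (eps guards against a zero sum).
    w_p = weight_over(img_over, p_t, s)
    w_m = weight_under(img_under, p_t, s)
    return (w_p * img_over + w_m * img_under) / (w_p + w_m + eps)
```

At a bright pixel the result comes from the underexposed frame; at a dark pixel it comes from the low-noise overexposed frame, with a smooth hand-over in between.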
There is further provided a night scene high dynamic range image fusion apparatus, comprising:
the acquisition unit is used for acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposure images;
the determining unit is used for determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
the adjusting unit is used for adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
and the fusion unit is used for fusing the adjusted images to be fused to obtain an HDR fused image.
Therefore, HDR fusion can be carried out on the images to be fused without determining camera parameter information or prior scene information, the time complexity is low, the noise difference between images of different exposures is reduced, and abrupt bright areas and unnatural noise transitions in the fused image are greatly reduced or even avoided. A good fusion effect can thus be achieved at low time complexity, and the likelihood of abrupt bright areas and unnatural noise in the fused image is reduced.
There is further provided an electronic device, which includes a processor and a memory, wherein the memory stores a control program, and the control program, when executed by the processor, implements the night scene high dynamic range image fusion method.
Finally, a computer-readable storage medium is provided, which stores instructions, wherein the instructions are loaded and executed by a processor to implement the night scene high dynamic range image fusion method.
Drawings
FIG. 1 is a fused image obtained by directly fusing a plurality of night scene images with different exposure values;
FIG. 2 is a flowchart of a night scene high dynamic range image fusion method according to an embodiment of the present invention;
FIG. 3 is a flowchart of the night-scene high dynamic range image fusion method step 30 according to an embodiment of the invention;
FIG. 4 is a flowchart of the night-scene high dynamic range image fusion method step 31 according to an embodiment of the present invention;
FIG. 5 is a flowchart of the night-scene high dynamic range image fusion method step 32 according to an embodiment of the invention;
FIG. 6A is an adjusted to-be-fused image obtained by direct thumbnail magnification according to an embodiment of the present invention;
FIG. 6B is an adjusted image to be fused, obtained by adjusting the average brightness and then enlarging the thumbnail, according to an embodiment of the invention;
FIG. 7 is a flowchart of a night scene high dynamic range image fusion method step 40 according to an embodiment of the invention;
FIG. 8 is a flowchart of the night scene high dynamic range image fusion method step 20 according to an embodiment of the present invention;
FIG. 9 is a block diagram of a night scene high dynamic range image fusion apparatus according to an embodiment of the present invention;
fig. 10 is a block diagram of an electronic device according to an embodiment of the present invention;
FIG. 11 is a block diagram of another electronic device according to an embodiment of the invention.
Description of reference numerals:
1-acquisition unit, 2-determination unit, 3-adjustment unit, 4-fusion unit, 12-electronic device, 14-external device, 16-processing unit, 18-bus, 20-network adapter, 22-input/output (I/O) interface, 24-display, 28-system memory, 30-random access memory, 32-cache memory, 34-storage system, 40-utility, 42-program module.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
For easy understanding, in the present invention, technical problems therein need to be elaborated.
A High-Dynamic Range (HDR) image is synthesized from LDR (Low-Dynamic Range) images taken with different exposure times: the best-detailed regions of the LDR image of each exposure time are combined into the final HDR image. It can therefore provide a greater dynamic range and more image detail than an ordinary image, and better reflects the visual effect of the real environment.
An LDR image is named according to its exposure time: if the exposure time is insufficient, the LDR image obtained is an underexposed image; if the exposure time is in the normal range, it is a normal exposure image; and if the exposure time is too long, it is an overexposed image.
One class of existing HDR algorithms first determines camera parameter information or prior scene information and then fuses the images according to it; because this information must be determined first, the computation is difficult and the time complexity is high. The other class directly fuses multiple images with different exposure values by fixed rules, but it applies only when the noise difference between images is small: once in a night scene environment, the brightness differences within the same scene are large and the noise differences between images of different exposures are severe, so the fused image shows abrupt bright areas and unnatural noise transitions.
Fig. 1 shows a fused image obtained by directly fusing a plurality of night scene images with different exposure values (i.e., a fused image produced by the second method); many abrupt bright areas are clearly visible in it, and its noise transitions are unnatural.
The embodiment of the disclosure provides a night scene high dynamic range image fusion method, which can be executed by a night scene high dynamic range image fusion device, and the night scene high dynamic range image fusion device can be integrated in electronic equipment such as a mobile phone, a notebook, a server, a video camera, a PAD and the like. Fig. 2 is a first flowchart of a night-scene high-dynamic-range image fusion method according to an embodiment of the present invention; the night scene high dynamic range image fusion method comprises the following steps:
step 10, acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposed images;
the acquisition mode of the image to be fused can be shooting directly through the electronic device, inputting into the electronic device through other devices, or other feasible acquisition modes.
It should be noted that the images to be fused are all images of the same object with different exposure durations, that is, each pixel point in the images to be fused corresponds to each other one by one, and only the brightness (pixel value or pixel intensity value of the pixel point) is different.
Optionally, the number of the images to be fused is at least 2.
The image to be fused at least comprises any two of an overexposed image, an underexposed image and a normal exposure image.
The judgment of whether the image to be fused is an overexposed image, an underexposed image or a normal exposure image can be judged according to a preset judgment condition in the electronic equipment for shooting the image to be fused, and can also be determined according to the exposure duration of the image to be fused or according to the actual situation.
An overexposed image has the advantage of low noise. Its disadvantage is that, since the brightness of a pixel point (the pixel value or pixel intensity value) is at most 255.0, any part that would exceed this is clipped to 255.0; overexposure thus uniformly turns part of the information into a brightness of 255.0 and the corresponding information is lost, i.e., the overexposed image carries less information.
For an underexposed image the opposite is true: its advantage is that the regions that would be overexposed retain more information, and its disadvantage is that the image noise is higher.
Therefore, the images to be fused cannot consist only of overexposed images or only of underexposed images; otherwise the corresponding disadvantage could not be eliminated by image fusion.
Step 20, determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
wherein the reference image is obtained by adjusting the brightness of one underexposed image or normal exposure image. Because the reference image is generated from the images to be fused themselves, HDR fusion can be carried out without determining camera parameter information or prior scene information, and the time complexity is low.
A normal exposure image or an underexposed image is selected because the overexposed regions of an overexposed image carry little information. If the reference image were generated from an overexposed image, its overexposed regions would carry too little information, and when the other images to be fused were adjusted against it, the parts corresponding to those regions would easily distort, seriously distorting the fused HDR image; not selecting the overexposed image avoids this distortion.
Step 30, adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
in this way the images to be fused are processed so that the adjusted images to be fused are closer to the reference image, which reduces the noise difference between images of different exposures and greatly reduces or even avoids abrupt bright areas and unnatural noise transitions in the fused image.
And step 40, fusing the adjusted images to be fused to obtain an HDR fused image.
Therefore, HDR fusion can be carried out on the images to be fused without determining camera parameter information or prior scene information, the time complexity is low, the noise difference between images of different exposures is reduced, and abrupt bright areas and unnatural noise transitions in the fused image are greatly reduced or even avoided. A good fusion effect can thus be achieved at low time complexity, and the likelihood of abrupt bright areas and unnatural noise in the fused image is reduced.
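Steps 10 to 40 can be summarized in a short NumPy sketch. Only the integral (average-brightness) part of step 30 is shown, and an unweighted mean stands in for the weighted fusion of step 40; both simplifications, and the frame values, are assumptions for illustration:

```python
import numpy as np

def match_average_brightness(img, ref):
    # Step 30 (integral part): scale so the frame's mean equals the reference mean.
    return img * (ref.mean() / max(img.mean(), 1e-6))

def fuse_night_hdr(frames, ref):
    # Step 30: brightness-align every frame to the reference.
    adjusted = [match_average_brightness(f, ref) for f in frames]
    # Step 40: fuse (plain mean as a stand-in for weighted fusion) and clip.
    return np.clip(np.mean(adjusted, axis=0), 0.0, 255.0)

# Step 10: two hypothetical exposures of the same scene; step 20: a reference image.
over = np.full((2, 2), 127.0)   # brighter frame
under = np.full((2, 2), 30.0)   # darker frame
ref = np.full((2, 2), 65.0)     # reference derived beforehand
hdr = fuse_night_hdr([over, under], ref)
```

After alignment both frames share the reference's average brightness, so the fused result carries the reference's overall exposure without needing camera parameters or scene priors.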
Optionally, during the adjusting of the brightness of the image to be fused according to the reference image, the image to be fused includes the underexposed image or the normal exposed image used for determining the reference image.
It should be noted that, during the processing of the images to be fused, since one of the images to be fused determines the reference image (or generates the reference image) after the brightness adjustment, the processing of the image to be fused (the image to be fused for which the reference image is determined) in the subsequent step may be processed in two ways, which is the first processing way.
That is, in step 30, the brightness of the image to be fused is adjusted according to the reference image, so that in the adjusted image to be fused, the brightness of the image to be fused (the image to be fused for which the reference image is determined) is still adjusted, so that the adjusted image to be fused is obtained; then, in step 40, the images to be fused after adjustment are fused to obtain an HDR fused image, and all the images to be fused after adjustment (that is, the images to be fused after adjustment including brightness adjustment of the images to be fused for which the reference image is determined) are directly fused to obtain the HDR fused image.
Therefore, the difference between the image to be fused of which the reference image is determined in the image to be fused and the rest images to be fused does not need to be particularly distinguished, brightness adjustment and fusion are directly carried out on all the images to be fused, and the method is simple and convenient.
Optionally, during the adjustment of the brightness of the image to be fused according to the reference image, the image to be fused is all images to be fused except for the underexposed image or the normal exposed image used for determining the reference image; and is
The fusing the adjusted image to be fused comprises the following steps: and fusing the reference image and the adjusted image to be fused.
This is the second processing method among the aforementioned processing methods.
That is, in step 30 the brightness of the images to be fused is adjusted according to the reference image, but the image from which the reference image was determined is no longer adjusted; then, in step 40, the reference image is fused directly with all the other adjusted images to be fused (the reference image takes the place of an adjusted version of the image from which it was determined) to obtain the HDR fused image.
Therefore, the image to be fused from which the reference image was determined does not need brightness adjustment; the reference image takes the place of its adjusted version, and brightness adjustment and fusion are carried out directly on all the other images to be fused, which is simple and convenient.
Optionally, as shown in fig. 3, the adjusting of the brightness of the image to be fused according to the reference image in step 30 includes:
step 31, for each image to be fused, integrally adjusting the average brightness of the image to be fused according to the average brightness of the reference image, wherein the average brightness of the image to be fused after integral adjustment is the same as the average brightness of the reference image;
in order to adjust all the images to be fused, adjusting one of the images to be fused, and then respectively adjusting the rest of the images to be fused until all the images to be fused are adjusted; when the scheme is specifically implemented, a plurality of images to be fused can be processed simultaneously in a parallel mode.
The reference image and the image to be fused are provided with a plurality of pixel points, and each pixel point is provided with corresponding brightness (also called as a pixel value or a pixel intensity value of the pixel point) or a brightness value; and the average brightness of the reference image and the image to be fused is the average value of the brightness of all pixel points of the image.
The overall adjustment in this step means that all the pixel points are adjusted in equal proportion at the same time, so that distortion is avoided in the adjustment process.
Step 32, fine-tuning the brightness of the image to be fused after the overall adjustment to obtain an adjusted image to be fused;
the whole adjustment is simple and convenient, but simultaneously, the noise part can be amplified or reduced in equal proportion, and certain pixel points cannot be independently adjusted.
Through the fine adjustment in the step, the noise part of the image to be fused can be finely adjusted, so that noise interference is avoided, and finally the image to be fused after fine adjustment (the image to be fused after adjustment) is closer to the reference image.
And step 33, traversing all the images to be fused to obtain the corresponding adjusted images to be fused.
And repeatedly executing the steps 31-32 to adjust other unadjusted images to be fused to obtain the corresponding adjusted images to be fused.
Therefore, through overall adjustment and fine adjustment, the image to be fused becomes an adjusted image to be fused that is very close to the reference image, so that the brightness difference between them is eliminated (greatly reduced), while the fine adjustment also reduces noise interference.
Optionally, as shown in fig. 4, in step 31, the overall adjusting of the average brightness of the image to be fused according to the average brightness of the reference image includes:
step 311, determining the average brightness of the image to be fused and the reference image according to the brightness of each pixel point in the image to be fused and the reference image;
the average brightness of the reference image and the image to be fused is the average value of the brightness of all pixel points of the image.
Step 312, determining the brightness difference ratio between the image to be fused and the reference image according to the average brightness of the image to be fused and the reference image;
and the brightness difference proportion is the ratio of the average brightness of the reference image to the average brightness of the image to be fused.
For example, the average brightness of the image to be fused is 127.0, and the average brightness of the reference image is 65.0; the brightness difference ratio between the image to be fused and the reference image is then 65/127. (The brightness difference ratio is only an intermediate variable for the overall adjustment, so the positions of the numerator and the denominator may be swapped, as long as the matching multiplication or division is chosen in the subsequent overall adjustment.)
And 313, integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference proportion, wherein the average brightness of the image to be fused is the same as the average brightness of the reference image after the integral adjustment.
The overall adjustment in this step is to multiply the brightness of each pixel point in the image to be fused by the brightness difference ratio, so that the average brightness of the image to be fused after the overall adjustment is the same as the average brightness of the reference image.
For example, if the average brightness of the image to be fused is 127.0 and the average brightness of the reference image is 65.0: if the brightness difference ratio is recorded as 127/65, the overall adjustment divides the brightness of each pixel point in the image to be fused by the ratio; if it is recorded as 65/127, the overall adjustment multiplies the brightness of each pixel point by the ratio. Either way, the average brightness of the image to be fused after the overall adjustment is 65.0.
Therefore, the brightness of the image to be fused can be quickly attached to the reference image through integral adjustment, and the probability of distortion caused by the integral adjustment is reduced through the equal-proportion adjustment.
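The overall adjustment of steps 311-313 can be sketched in a few lines of NumPy; the function name and array shapes below are illustrative and not taken from the patent:

```python
import numpy as np

def overall_adjust(image, reference):
    # Step 312: brightness difference ratio = reference average brightness
    # over image average brightness.
    ratio = reference.mean() / image.mean()
    # Step 313: scale every pixel by the same ratio, so the adjusted
    # image's average brightness matches the reference's.
    return image * ratio

# Matching the example above: averages 127.0 and 65.0.
ref = np.full((4, 4), 65.0)
fused = np.full((4, 4), 127.0)
adjusted = overall_adjust(fused, ref)
```

After this call, `adjusted.mean()` coincides with the reference's average brightness of 65.0.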
Optionally, as shown in fig. 5, in step 32, the fine-tuning the brightness of the image to be fused after the overall adjustment to obtain the image to be fused after the adjustment, includes:
step 321, reducing the reference image and the integrally adjusted image to be fused;
the step of reducing the reference image and the image to be fused refers to reducing them from a large image to a small image; for example, a large image of 4000 pixels × 3000 pixels is reduced to a small image of 1000 pixels × 750 pixels, i.e. to 1/16 of the original area (1/4 per side).
The specific reduction mode for reducing the large image into the small image can be determined according to actual conditions, but it is clear that the reference image and the image to be fused after the overall adjustment need to be reduced in the same mode.
It should be noted that, when there are several images to be fused, step 32 (steps 321-325) is executed repeatedly. In the first execution of this step, the reference image and the overall-adjusted image to be fused are reduced in the same proportion; in subsequent executions, the reference image need not be reduced again — the reduced reference image obtained in the first execution is read directly, and only the overall-adjusted image to be fused is reduced.
Step 322, performing guided filtering with the same parameters on the reduced reference image and the reduced image to be fused;
in this step, the two guided-filtering operations use identical parameters; only the input image and the guide image differ (the input image itself may serve as the guide image).
It should be noted that, when there are several images to be fused, step 32 (steps 321-325) is executed repeatedly. In the first execution of this step, guided filtering with the same parameters is performed on both the reference image and the image to be fused; in subsequent executions, the reference image is not filtered again — the filtered reference image obtained in the first execution is read directly, and guided filtering with the same parameters is performed only on the image to be fused.
Through guiding filtering, the pixel intensities of the similar areas nearby in the image to be fused and the reference image can be close, and noise interference is avoided.
Step 323, determining the difference proportion of each pixel point according to the brightness of each pixel point in the filtered reference image and the filtered image to be fused, and storing the difference proportion into an adjusting mask;
the difference ratio of each pixel point between the image to be fused and the reference image is the ratio of the brightness of the corresponding pixel point of the reference image to the brightness of the corresponding pixel point of the image to be fused. For example, the brightness of a certain pixel point in the image to be fused is 139.0 and the brightness of the pixel point at the corresponding position in the reference image is 61.0; the difference ratio for that pixel point is then 61/139. (The difference ratio is only an intermediate variable for fine-tuning, so the positions of the numerator and the denominator may be swapped, as long as the matching multiplication or division is chosen in the subsequent fine-tuning.)
It should be noted that the difference ratios are calculated per pixel point: if the reference image has n pixel points, n calculations yield n difference ratios. If the reduced reference image (and image to be fused) is a small image of 1000 pixels × 750 pixels, there are 750,000 pixel points, so 750,000 difference ratios need to be calculated.
The adjustment mask is a matrix or an image corresponding to the image to be fused, in which all n calculated difference ratios (750,000 in the example above) are recorded at the corresponding positions.
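As a sketch of step 323, the adjustment mask is simply the element-wise ratio of the two filtered small images; the function name here is hypothetical:

```python
import numpy as np

def adjustment_mask(filtered_ref, filtered_fused):
    # One difference ratio per pixel point: reference brightness over
    # fused-image brightness, stored in a mask the same shape as the
    # reduced images.
    return filtered_ref / filtered_fused

# Matching the example above: 61.0 in the reference vs 139.0 in the
# image to be fused gives the ratio 61/139 at that position.
mask = adjustment_mask(np.array([[61.0, 50.0]]), np.array([[139.0, 100.0]]))
```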
Step 324, amplifying the adjustment mask to be the same as the size of the image to be fused before integral adjustment;
in this step, the adjustment mask is enlarged from the small image to the large image, and the size of the large image is the same as that of the image to be fused before the whole adjustment.
The specific amplification mode for amplifying the small image into the large image can be determined according to actual conditions.
For example, if the adjustment mask is a small image of 1000 pixels × 750 pixels (regarding the adjustment mask as an image), this step enlarges it to a large image of 4000 pixels × 3000 pixels (16 times the area). After enlargement, the corresponding difference ratio is marked at the position of each pixel point of the adjustment mask.
Step 325, adjusting the brightness of the pixel points at the corresponding positions in the image to be fused after the overall adjustment according to the difference proportion of the amplified adjustment mask, so as to obtain an adjusted image to be fused.
Optionally, the adjusting the brightness of the pixel point at the corresponding position in the image to be fused after the overall adjustment includes: and adjusting by multiplying the brightness of the pixel points at the corresponding positions in the integrally adjusted image to be fused with the corresponding difference proportion.
The difference ratio of each pixel point in the image to be fused and the reference image is the ratio of the brightness of the corresponding pixel point of the reference image to the brightness of the corresponding pixel point of the image to be fused.
The difference ratio is used as an intermediate variable to fine-tune the image to be fused, so that the positions of the numerator and the denominator can be reversed, and only the corresponding division operation needs to be selected in the subsequent fine tuning.
For example, if the brightness of a certain pixel point in the image to be fused is 139.0, the brightness of the pixel point at the corresponding position in the reference image is 61.0; if the difference proportion of the pixel point in the image to be fused and the reference image is recorded as 139/61, fine adjustment is to divide the brightness of the pixel point in the image to be fused by the difference proportion; if the difference ratio of the pixel point in the image to be fused and the reference image is recorded as 61/139, the fine adjustment is to multiply the brightness of the pixel point in the image to be fused by the difference ratio.
In this way, fine adjustment via small-image reduction and enlargement is applied to the image whose average brightness has already been adjusted (overall adjustment), which ensures that the pixel intensities of nearby similar areas in the image to be fused and the reference image are close and avoids noise interference.
It should be noted that pixel diffusion can occur during small-image enlargement, causing the per-pixel ratio weights to leak into neighbouring areas. If the brightness of the image to be fused has not first been pulled to the level of the reference image, some areas retain a large brightness difference; when the leaked ratio weights are then used for brightness adjustment, a pixel may apply the brightness ratio of the wrong area, and where the brightness ratio of adjacent areas changes sharply, black or white edges easily appear in the brightness-adjusted image. As shown in fig. 6A, the adjusted image to be fused obtained by directly enlarging the small image exhibits many black and white edges and serious distortion.
Therefore, adjusting the average brightness first and only then enlarging the small image prevents the many black and white edges that pixel diffusion during enlargement would otherwise cause. As shown in fig. 6B, the adjusted image to be fused obtained by first adjusting the average brightness and then enlarging the small image is essentially free of black and white edges.
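The whole fine-adjustment pipeline of steps 321-325 might look as follows in NumPy. This is an illustrative sketch only: a simple box mean stands in for the guided filter of step 322, and nearest-neighbour slicing and `np.kron` stand in for the reduction and enlargement methods, which the text leaves open:

```python
import numpy as np

def box_blur(img, r=1):
    # Crude stand-in for guided filtering: a (2r+1) x (2r+1) box mean
    # with edge padding. A real implementation would apply a guided
    # filter with identical parameters to both images.
    pad = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += pad[r + dy : r + dy + h, r + dx : r + dx + w]
    return out / (2 * r + 1) ** 2

def fine_tune(fused_big, ref_big, scale=4):
    # Step 321: reduce both images in the same way (nearest-neighbour here).
    small_f = fused_big[::scale, ::scale]
    small_r = ref_big[::scale, ::scale]
    # Step 322: filter both reduced images with the same parameters.
    small_f, small_r = box_blur(small_f), box_blur(small_r)
    # Step 323: per-pixel difference ratio stored in the adjustment mask.
    mask = small_r / small_f
    # Step 324: enlarge the mask back to the full size.
    mask_big = np.kron(mask, np.ones((scale, scale)))
    # Step 325: adjust each pixel by its difference ratio.
    return fused_big * mask_big
```

When the overall-adjusted image already matches the reference, the mask is all ones and the image passes through unchanged, which is the intended behaviour of the fine adjustment.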
Optionally, as shown in fig. 7, in the step 40, the fusing the reference image and the adjusted image to be fused to obtain an HDR fused image, includes:
step 41, determining the fusion weight of the adjusted image to be fused and the fusion weight of the reference image according to the adjusted image to be fused and the reference image;
and determining the fusion weight of the image to be fused after the adjustment according to the image to be fused after the adjustment, and determining the fusion weight of the reference image according to the reference image. It should be noted that the determination of the fusion weight is performed one by one in units of a single pixel point.
Since the reference image is, or closely approximates, one of the adjusted images to be fused, the fusion weight of the reference image is determined in a way that is very similar or identical to that of the adjusted images to be fused; here, for ease of calculation, the weights are determined in the same way for all of them.
According to different exposure time in the image forming process, the images to be fused are divided into overexposed images, underexposed images and normal exposure images, and different types of images to be fused have different advantages and disadvantages, so that the determination mode of the fusion weight is different.
For an overexposed image, the lower the brightness of a pixel point of the overexposed image is, the higher the distributed fusion weight is; for an underexposed image, the higher the brightness of a pixel point of the underexposed image is, the higher the distributed fusion weight is; for a normally exposed image, the more the brightness of a pixel is close to the middle position (the brightness of the middle position is determined by a brightness threshold), the higher the assigned fusion weight is, and the more the brightness of the pixel is close to the two end positions, the lower the assigned fusion weight is.
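The patent's exact weight formulas are given only as equation images that are not reproduced in this text. As an illustrative assumption, smooth sigmoid-shaped weights parameterised by the brightness threshold pt and the smoothness s reproduce the qualitative behaviour described above; all function names and the precise forms below are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def weight_over(p, p_t=128.0, s=5.0):
    # Overexposed frame: weight falls as pixel brightness rises.
    return sigmoid((p_t - p) / s)

def weight_under(p, p_t=128.0, s=5.0):
    # Underexposed frame: weight rises with pixel brightness.
    return sigmoid((p - p_t) / s)

def weight_normal(p, p_t=128.0, s=5.0):
    # Normally exposed frame: weight peaks near the threshold and falls
    # off towards both ends (the factor 4 normalises the peak to 1).
    return weight_over(p, p_t, s) * weight_under(p, p_t, s) * 4.0
```

Because the sigmoid has no hard cut-offs, the smoothness s makes the weights transition naturally, in line with the behaviour the text attributes to its formulas.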
Optionally, when the image to be fused is the overexposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused is as follows:
(The formula is given in the original as an equation image and is not reproduced here.)
in the formula, w+ is the fusion weight of the adjusted image to be fused corresponding to the overexposed image, p is the brightness of the pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
Optionally, when the image to be fused is the under-exposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused (or the reference image) is as follows:
in the formula, w- is the fusion weight of the adjusted image to be fused corresponding to the underexposed image, p is the brightness of the pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
The value range of the smoothness s is [1, 10]; by setting the smoothness, the fusion weight transitions naturally and smoothly, without truncation or abrupt segmentation, which makes fusion more convenient.
The brightness threshold pt is determined according to actual conditions or by engineering tuning.
Optionally, when the image to be fused is the normal exposure image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused (or the reference image) is as follows:
in the formula, w is the fusion weight of the adjusted image to be fused corresponding to the normally exposed image, p is the brightness of the pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
And 42, fusing the image to be fused after the adjustment and the reference image according to the fusion weight of the image to be fused after the adjustment and the fusion weight of the reference image to obtain an HDR fusion image.
It should be noted that image fusion is performed pixel by pixel, with a single pixel point as the unit; that is, the brightnesses of the pixel points at a given position in the adjusted images to be fused and the reference image are combined, and the resulting new brightness is the brightness of the pixel point at the corresponding position of the HDR fused image. The pixel points at every position are calculated one by one, finally yielding the brightness of all pixel points of the HDR fusion image.
The brightness of the pixel point at each position of the HDR fused image is calculated from the brightnesses of the pixel points at the corresponding positions of the adjusted images to be fused and the reference image: each brightness is multiplied by the fusion weight of its image, the products are summed, and the sum is divided by the sum of all the fusion weights at that position (over the adjusted images to be fused and the reference image).
In this way, by assigning fusion weights before image fusion, the overall luminance and noise of the finally obtained HDR fusion image transition naturally.
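The per-pixel weighted fusion of step 42 can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def fuse(images, weights):
    # Per-pixel weighted average over all adjusted images to be fused
    # plus the reference image: multiply each brightness by its image's
    # fusion weight, sum, and divide by the sum of weights at each position.
    num = sum(img * w for img, w in zip(images, weights))
    den = sum(weights)
    return num / den

a = np.full((2, 2), 100.0)
b = np.full((2, 2), 200.0)
out = fuse([a, b], [np.full((2, 2), 1.0), np.full((2, 2), 3.0)])
# each pixel: (100*1 + 200*3) / (1 + 3) = 175.0
```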
Optionally, as shown in fig. 8, in step 20, determining a reference image after performing brightness adjustment according to the underexposed image or the normal exposed image includes:
step 21, selecting an underexposed image or a normal exposed image from the images to be fused as images to be referenced;
a normally exposed image or an underexposed image is selected because the overexposed region of an overexposed image carries little information: if the reference image were generated from an overexposed image, the information in its overexposed region would be too scarce, and when the other images to be fused are adjusted against such a reference, the parts corresponding to the overexposed region distort easily, seriously distorting the fused HDR image. Not selecting an overexposed image avoids the distortion of the fused HDR image caused by the overexposed area.
Therefore, the image to be referred is a normal exposure image or an underexposed image, and the distortion of an overexposed area caused by insufficient information amount in the generated reference image can be avoided.
Optionally, if the image to be fused includes a normal exposure image and an underexposure image, selecting one of the normal exposure images as the image to be referred to; and if the image to be fused comprises an underexposed image but not a normal exposure image, selecting one of the underexposed images as the image to be referred to.
Step 22, determining an overexposure mask of each pixel point in the image to be referenced according to the image to be referenced;
the overexposure mask reflects whether the corresponding pixel point is located in an overexposure area: an overexposure mask with a value of 0 represents that the corresponding pixel point is not overexposed and is not located in the overexposure area, while a positive overexposure mask represents that the corresponding pixel point is overexposed and located in the overexposure area.
Optionally, the overexposure mask is obtained in the following manner:
E=min(n,1)
wherein E is the overexposure mask of the pixel point, p is the brightness of the pixel point in the image to be referenced, pt is the brightness threshold, s is the smoothness, and n is an intermediate variable.
Through the obtaining formula, it can be seen that, if the brightness is less than or equal to the brightness threshold, the overexposure mask is 0; if the brightness is larger than the brightness threshold, the overexposure mask is a positive value which is larger than 0 and smaller than or equal to 1.
Therefore, setting the brightness threshold makes it possible to judge whether a pixel point is overexposed, and setting the smoothness makes the detection result for the overexposed area (the area formed by pixel points whose overexposure mask is positive) transition naturally and smoothly, without truncation or abrupt segmentation, which makes fusion more convenient.
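A minimal sketch of the mask computation follows. The method section gives E = min(n, 1) (the device section later writes E = 1 − min(n, 1)), but the formula image defining the intermediate n is not reproduced in this text; n = (p − pt)/s clipped at 0 is an ASSUMED form chosen only to match the stated behaviour of the first variant — E = 0 at or below the threshold, 0 < E ≤ 1 above it:

```python
import numpy as np

def overexposure_mask(p, p_t=200.0, s=5.0):
    # ASSUMED intermediate: smoothed excess over the threshold, clipped
    # at 0 so that pixels at or below p_t get E = 0.
    n = np.maximum((p - p_t) / s, 0.0)
    # E = min(n, 1): overexposed pixels get a positive value capped at 1.
    return np.minimum(n, 1.0)
```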
Step 23, acquiring preset adjustment parameters, and determining brightness adjustment parameters of each pixel point in the image to be referenced according to the adjustment parameters and the overexposure mask;
the preset adjustment parameter can be stored in the electronic device in advance so that it can be read when needed; the adjustment parameter may be a default value of an electronic device such as a camera, or a value set by the user according to preference.
It should be noted that the brightness adjustment parameter is the brightness amplification factor of the pixel point; therefore, for a fully overexposed pixel (with a brightness of 255.0), the magnification is 1 (i.e., remains unchanged and no longer magnifies).
Optionally, the brightness adjustment parameter is obtained in the following manner:
a=(A-1)×E+1
in the formula, a is the brightness adjustment parameter of the pixel point, A is the adjustment parameter, and E is the overexposure mask of the pixel point.
Therefore, the brightness adjustment parameters are determined in a calculation mode, and the method is simple, quick and convenient.
And 24, adjusting the brightness of each pixel point in the image to be referenced according to the brightness adjustment parameter to obtain the reference image.
In this step, the brightness of each pixel point in the image to be referred is adjusted according to the brightness adjustment parameter, that is, the brightness adjustment parameter is multiplied by the brightness of each pixel point, so as to obtain the adjusted reference image.
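Steps 23-24 can be sketched directly from the given formula a = (A − 1) × E + 1; the function name is illustrative, and E is taken to be the per-pixel overexposure mask computed earlier, whatever convention it uses:

```python
import numpy as np

def adjust_to_reference(image, E, A=1.5):
    # Step 23: per-pixel brightness adjustment parameter a = (A - 1) * E + 1.
    # Pixels with E = 0 get a = 1 (brightness unchanged); pixels with
    # E = 1 get the full preset factor A.
    a = (A - 1.0) * E + 1.0
    # Step 24: multiply each pixel's brightness by its parameter.
    return image * a
```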
In this way, by presetting the adjustment parameter and adjusting the brightness of the pixel points accordingly, the adjusted brightness serves as the overall brightness of the finally generated HDR fusion image, so that the overall brightness matches the user's preferred style, improving user experience and comfort. Detecting the overexposed area prevents bright-area overexposure during brightness adjustment, and since the overexposure mask transitions smoothly and naturally, no transition zones with abnormal brightness jumps appear.
The embodiment of the present disclosure provides a night-scene high-dynamic-range image fusion device for executing the night-scene high-dynamic-range image fusion method described above; the device is described in detail below.
As shown in fig. 9, a night-scene high-dynamic-range image fusion apparatus includes:
an acquisition unit 1, configured to acquire a plurality of images to be fused, wherein the images to be fused at least comprise any two of an overexposed image, an underexposed image and a normally exposed image;
the determining unit 2 is used for determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
the adjusting unit 3 is used for adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
and the fusion unit 4 is used for fusing the adjusted images to be fused to obtain an HDR fused image.
Therefore, HDR fusion can be performed on the images to be fused without determining camera parameter information or prior scene information; the time complexity is low, the noise difference between images of different exposures is reduced, and sharp bright areas and unnatural noise transitions in the fused image are greatly reduced or even avoided. A good fusion effect is thus achieved at low time complexity, reducing the likelihood of sharp bright areas and unnatural noise in the fused image.
Optionally, the adjusting unit 3 is further configured to: for each image to be fused, integrally adjusting the average brightness of the image to be fused according to the average brightness of the reference image, wherein the average brightness of the image to be fused after integral adjustment is the same as the average brightness of the reference image; finely adjusting the brightness of the integrally adjusted image to be fused to obtain an adjusted image to be fused; traversing all the images to be fused to obtain the corresponding images to be fused after adjustment.
Optionally, the adjusting unit 3 is further configured to: determining the average brightness of the image to be fused and the reference image according to the brightness of each pixel point in the image to be fused and the reference image; determining the brightness difference proportion of the image to be fused and the reference image according to the average brightness of the image to be fused and the reference image; and integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference proportion, wherein the average brightness of the image to be fused is the same as the average brightness of the reference image after the integral adjustment.
Optionally, the adjusting unit 3 is further configured to: reducing the reference image and the integrally adjusted image to be fused; performing guiding filtering with the same parameters on the reduced reference image and the reduced image to be fused; determining the difference proportion of each pixel point according to the brightness of each pixel point in the filtered reference image and the filtered image to be fused, and storing the difference proportion into an adjusting mask; amplifying the adjustment mask to be the same as the size of the image to be fused before integral adjustment; and adjusting the brightness of pixel points at corresponding positions in the integrally adjusted image to be fused according to the difference proportion of the amplified adjustment mask to obtain the adjusted image to be fused.
Optionally, the image to be fused at least includes any two of an overexposed image, an underexposed image, and a normal exposure image.
Optionally, the fusion unit 4 is further configured to: determining the fusion weight of the adjusted image to be fused and the fusion weight of the reference image according to the adjusted image to be fused and the reference image; and fusing the image to be fused after the adjustment and the reference image according to the fusion weight of the image to be fused after the adjustment and the fusion weight of the reference image to obtain an HDR fusion image.
Optionally, the determining unit 2 is further configured to: selecting an underexposed image or a normal exposed image from the images to be fused as images to be referred to; determining an overexposure mask of each pixel point in the image to be referenced according to the image to be referenced; acquiring a preset adjusting parameter, and determining a brightness adjusting parameter of each pixel point in the image to be referenced according to the adjusting parameter and the overexposure mask; and adjusting the brightness of each pixel point in the image to be referenced according to the brightness adjustment parameter to obtain the reference image.
Optionally, the overexposure mask is obtained in the following manner:
E=1-min(n,1)
wherein E is the overexposure mask of the pixel point, p is the brightness of the pixel point in the image to be referenced, pt is the brightness threshold, s is the smoothness, and n is an intermediate variable.
Optionally, the brightness adjustment parameter is obtained in the following manner:
a=(A-1)×E+1
in the formula, a is the brightness adjustment parameter of the pixel point, A is the adjustment parameter, and E is the overexposure mask of the pixel point.
Optionally, when the image to be fused is the overexposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused is as follows:
(The formula is given in the original as an equation image and is not reproduced here.)
in the formula, w+ is the fusion weight of the adjusted image to be fused corresponding to the overexposed image, p is the brightness of the pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
Optionally, when the image to be fused is the under-exposed image, the corresponding obtaining manner of the fusion weight of the adjusted image to be fused is as follows:
(The formula is given in the original as an equation image and is not reproduced here.)
in the formula, w- is the fusion weight of the adjusted image to be fused corresponding to the underexposed image, p is the brightness of the pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
It should be noted that the above-described device embodiments are merely illustrative, for example, the division of the units is only one logical function division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
Having described the internal functions and structure of the night-scene high-dynamic-range image fusion apparatus, as shown in fig. 10, in practice the apparatus may be implemented as an electronic device comprising a processor and a memory, wherein the memory stores a control program which, when executed by the processor, implements the night-scene high-dynamic-range image fusion method described above.
Fig. 11 is a block diagram illustrating another electronic device according to an embodiment of the invention. The electronic device 12 shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 11, the electronic device 12 may be implemented in the form of a general-purpose electronic device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. These architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, to name a few.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/non-volatile computer-readable storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, non-volatile magnetic media (not shown, commonly referred to as a "hard drive"). Although not shown in FIG. 11, a disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a Compact Disc Read-Only Memory (CD-ROM), a Digital Versatile Disc Read-Only Memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the application.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally perform the functions and/or methodologies of the embodiments described herein.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with the electronic device 12, and/or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 12 to communicate with one or more other electronic devices. Such communication may occur through an input/output (I/O) interface 22. The electronic device 12 may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the electronic device 12 via the bus 18. It should be noted that, although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processing unit 16 executes various functional applications and data processing, for example, implementing the methods mentioned in the foregoing embodiments, by executing programs stored in the system memory 28.
The electronic device of the invention may be a server or a terminal device with limited computing power; the lightweight network structure of the invention is particularly suitable for the latter. Typical implementations of the terminal device include, but are not limited to: smart mobile communication terminals, unmanned aerial vehicles, robots, portable image processing devices, security devices, and the like. An embodiment of the present disclosure provides a computer-readable storage medium storing instructions which, when loaded and executed by a processor, implement the night-scene high-dynamic-range image fusion method.
The technical solutions of the embodiments of the present invention, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Although the present disclosure has been described above, the scope of the present disclosure is not limited thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the spirit and scope of the present disclosure, and these changes and modifications are intended to be within the scope of the present disclosure.

Claims (17)

1. A night scene high dynamic range image fusion method is characterized by comprising the following steps:
acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposed images;
determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
and fusing the adjusted images to be fused to obtain an HDR fused image.
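The four steps of claim 1 can be sketched end to end as follows. This is a minimal illustration on grayscale images represented as nested lists; the helper names and the simple mean-matching and equal-weight averaging choices are assumptions for illustration, not the patented formulas.

```python
# Minimal sketch of the claimed pipeline (claim 1); grayscale images as
# nested lists of floats in [0, 255]. Helper names and the simple
# mean-matching / equal-weight averaging choices are illustrative assumptions.

def mean_brightness(img):
    """Average brightness over all pixels."""
    return sum(sum(row) for row in img) / (len(img) * len(img[0]))

def adjust_to_reference(img, ref):
    """Step 3: scale the image so its average brightness matches the reference."""
    ratio = mean_brightness(ref) / mean_brightness(img)
    return [[p * ratio for p in row] for row in img]

def fuse(images):
    """Step 4: fuse the adjusted images (equal weights here, for illustration)."""
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[i][j] for img in images) / len(images)
             for j in range(w)] for i in range(h)]

# Step 1: acquire images to be fused (an under-exposed and an over-exposed frame).
under = [[20.0, 40.0], [30.0, 50.0]]
over = [[180.0, 220.0], [200.0, 240.0]]
# Step 2: here the reference is the under-exposed frame after a brightness lift
# (the lift factor 2.0 is an illustrative stand-in for the claimed adjustment).
reference = [[p * 2.0 for p in row] for row in under]
# Steps 3-4: adjust every frame to the reference, then fuse into the HDR result.
hdr = fuse([adjust_to_reference(img, reference) for img in [under, over]])
```

Because every frame is first matched to the reference brightness, the fused result keeps the reference's overall exposure while mixing detail from both frames.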
2. The night-scene high-dynamic-range image fusion method according to claim 1, wherein in the adjusting of the brightness of the image to be fused according to the reference image, the image to be fused includes the under-exposed image or the normal-exposed image used for determining the reference image.
3. The night-scene high-dynamic-range image fusion method according to claim 1, wherein in the adjusting of the brightness of the image to be fused according to the reference image, the image to be fused is all the images to be fused except for the under-exposed image or the normal-exposed image used for determining the reference image; and is
The fusing the adjusted image to be fused comprises the following steps: and fusing the reference image and the adjusted image to be fused.
4. The night-scene high-dynamic-range image fusion method according to claim 1, wherein the adjusting the brightness of the image to be fused according to the reference image comprises:
for each image to be fused, integrally adjusting the average brightness of the image to be fused according to the average brightness of the reference image, wherein the average brightness of the image to be fused after integral adjustment is the same as the average brightness of the reference image;
finely adjusting the brightness of the integrally adjusted image to be fused to obtain an adjusted image to be fused;
traversing all the images to be fused to obtain the corresponding images to be fused after adjustment.
5. The night-scene high-dynamic-range image fusion method according to claim 4, wherein the overall adjustment of the average brightness of the image to be fused according to the average brightness of the reference image comprises:
determining the average brightness of the image to be fused and the reference image according to the brightness of each pixel point in the image to be fused and the reference image;
determining the brightness difference proportion of the image to be fused and the reference image according to the average brightness of the image to be fused and the reference image;
and integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference proportion, wherein the average brightness of the image to be fused is the same as the average brightness of the reference image after the integral adjustment.
6. The night scene high dynamic range image fusion method according to claim 5, wherein the integrally adjusting the brightness of each pixel point in the image to be fused according to the brightness difference ratio comprises: and carrying out the integral adjustment by multiplying the brightness of each pixel point in the image to be fused with the brightness difference proportion.
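The overall adjustment of claims 5 and 6 reduces to three steps: compute both average brightnesses, form the brightness-difference ratio, and multiply every pixel by it. A sketch on a flat list of pixel luminances (the flat layout is an illustrative simplification):

```python
# Overall adjustment per claims 5-6: average brightnesses -> difference
# ratio -> per-pixel multiplication. Flat pixel lists are an illustrative
# simplification of the image layout.

def overall_adjust(fuse_img, ref_img):
    # Step 1: average brightness of both images from their pixel values.
    mean_fuse = sum(fuse_img) / len(fuse_img)
    mean_ref = sum(ref_img) / len(ref_img)
    # Step 2: brightness-difference ratio between the two images.
    ratio = mean_ref / mean_fuse
    # Step 3 (claim 6): multiply each pixel by the ratio.
    return [p * ratio for p in fuse_img]

to_fuse = [10.0, 20.0, 30.0, 40.0]    # average brightness 25
reference = [50.0, 60.0, 70.0, 80.0]  # average brightness 65
adjusted = overall_adjust(to_fuse, reference)
# After the overall adjustment, the average brightness of the image to be
# fused equals that of the reference, as claim 4 requires.
```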
7. The night-scene high-dynamic-range image fusion method according to claim 4, wherein the fine-tuning of the brightness of the image to be fused after the overall adjustment to obtain the image to be fused after the adjustment comprises:
reducing the reference image and the integrally adjusted image to be fused;
performing guiding filtering with the same parameters on the reduced reference image and the reduced image to be fused;
determining the difference proportion of each pixel point according to the brightness of each pixel point in the filtered reference image and the filtered image to be fused, and storing the difference proportion into an adjusting mask;
amplifying the adjustment mask to be the same as the size of the image to be fused before integral adjustment;
and adjusting the brightness of pixel points at corresponding positions in the integrally adjusted image to be fused according to the difference proportion of the amplified adjustment mask to obtain the adjusted image to be fused.
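The five fine-adjustment steps of claim 7 (with the multiplication of claim 8) can be sketched on one-dimensional luminance rows for brevity. Note the hedges: a simple moving-average filter stands in for the guided filter named in the claim, and the 2x downscale/nearest-neighbour upscale are illustrative assumptions.

```python
# Fine adjustment per claims 7-8, on 1-D luminance rows for brevity.
# A moving-average filter stands in for the guided filter of claim 7;
# the 2x downscale and nearest-neighbour upscale are assumptions.

def downscale(sig):
    # Step 1: reduce the image (average adjacent pixel pairs).
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig) - 1, 2)]

def smooth(sig):
    # Step 2: filter both reduced images with the same parameters.
    out = []
    for i in range(len(sig)):
        lo, hi = max(0, i - 1), min(len(sig), i + 2)
        out.append(sum(sig[lo:hi]) / (hi - lo))
    return out

def fine_adjust(adjusted, ref):
    small_adj, small_ref = downscale(adjusted), downscale(ref)
    filt_adj, filt_ref = smooth(small_adj), smooth(small_ref)
    # Step 3: per-pixel difference ratio stored as an adjustment mask.
    mask = [r / a for r, a in zip(filt_ref, filt_adj)]
    # Step 4: enlarge the mask back to the pre-adjustment size.
    big_mask = [mask[i // 2] for i in range(len(adjusted))]
    # Step 5 (claim 8): multiply corresponding pixels by the mask ratios.
    return [p * m for p, m in zip(adjusted, big_mask)]

row = [40.0, 44.0, 120.0, 124.0]
ref_row = [50.0, 54.0, 100.0, 104.0]
fine = fine_adjust(row, ref_row)
```

Computing the ratio mask at reduced resolution and on filtered images keeps the correction smooth and cheap, so only low-frequency brightness differences are corrected rather than per-pixel noise.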
8. The night scene high dynamic range image fusion method according to claim 7, wherein the adjusting the brightness of the pixel points at the corresponding positions in the integrally adjusted image to be fused comprises: and adjusting by multiplying the brightness of the pixel points at the corresponding positions in the integrally adjusted image to be fused with the corresponding difference proportion.
9. The night scene high dynamic range image fusion method according to any one of claims 1 to 8, wherein the fusing the reference image and the adjusted image to be fused to obtain an HDR fused image comprises:
determining the fusion weight of the adjusted image to be fused and the fusion weight of the reference image according to the adjusted image to be fused and the reference image;
and fusing the image to be fused after the adjustment and the reference image according to the fusion weight of the image to be fused after the adjustment and the fusion weight of the reference image to obtain an HDR fusion image.
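The weighted fusion of claim 9 can be illustrated with a per-pixel normalized weighted sum. This normalization is one standard choice, not necessarily the patent's; the weight formulas themselves (claims 10 and 11) are given in the patent as figures and are not reproduced here, so the weights below are arbitrary example values.

```python
# Weighted fusion per claim 9: combine the adjusted images and the reference
# pixel by pixel using fusion weights. The normalized weighted sum is one
# standard choice; the example weights are arbitrary illustrative values.

def fuse_weighted(images, weights):
    """images: list of flat pixel lists; weights: matching per-pixel weight lists."""
    n_pix = len(images[0])
    fused = []
    for j in range(n_pix):
        total_w = sum(w[j] for w in weights)
        fused.append(sum(img[j] * w[j] for img, w in zip(images, weights)) / total_w)
    return fused

ref = [100.0, 200.0]     # reference image
adj = [80.0, 240.0]      # adjusted image to be fused
w_ref = [1.0, 3.0]       # fusion weights of the reference
w_adj = [1.0, 1.0]       # fusion weights of the adjusted image
hdr = fuse_weighted([ref, adj], [w_ref, w_adj])
```

At the second pixel the reference carries three times the weight of the adjusted image, so the fused value sits closer to the reference, which is how the per-pixel weights steer the HDR result.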
10. The night-scene high-dynamic-range image fusion method according to claim 9, wherein when the image to be fused is the overexposed image, the corresponding acquisition mode of the fusion weight of the adjusted image to be fused is as follows:
Figure FDA0002243687910000031
in the formula, w+ is the fusion weight of the adjusted image to be fused corresponding to the overexposed image, p is the brightness of a pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
11. The night-scene high-dynamic-range image fusion method according to claim 9, wherein when the image to be fused is the under-exposed image, the corresponding acquisition mode of the fusion weight of the adjusted image to be fused is as follows:
Figure FDA0002243687910000032
in the formula, w- is the fusion weight of the adjusted image to be fused corresponding to the underexposed image, p is the brightness of a pixel point in the adjusted image to be fused, pt is the brightness threshold, and s is the smoothness.
12. The night scene high dynamic range image fusion method according to any one of claims 1 to 8, wherein determining a reference image after performing brightness adjustment according to the underexposed image or the normal exposed image comprises:
selecting an underexposed image or a normal exposed image from the images to be fused as images to be referred to;
determining an overexposure mask of each pixel point in the image to be referenced according to the image to be referenced;
acquiring a preset adjusting parameter, and determining a brightness adjusting parameter of each pixel point in the image to be referenced according to the adjusting parameter and the overexposure mask;
and adjusting the brightness of each pixel point in the image to be referenced according to the brightness adjustment parameter to obtain the reference image.
13. The night scene high dynamic range image fusion method according to claim 12, wherein the overexposure mask is obtained in a manner that:
E=1-min(n,1)
Figure FDA0002243687910000041
wherein E is the overexposure mask of the pixel point, p is the brightness of the pixel point in the image to be referenced, pt is the brightness threshold, s is the smoothness, and n is an intermediate variable.
14. The night scene high dynamic range image fusion method according to claim 12, wherein the brightness adjustment parameter is obtained in a manner that:
a=(A-1)×E+1
in the formula, a is the brightness adjustment parameter of the pixel point, A is the adjustment parameter, and E is the overexposure mask of the pixel point.
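The two formulas of claims 13 and 14 compose directly: E = 1 - min(n, 1) and a = (A - 1) x E + 1. The intermediate variable n is defined in the patent by a figure formula that is not reproduced here, so it is taken as a given input below; applying a by multiplying the pixel brightness is an assumed application mode.

```python
# Overexposure mask and brightness-adjustment parameter per claims 13-14.
# The intermediate variable n is defined by a figure formula in the patent
# (not reproduced here), so it is taken as an input.

def overexposure_mask(n):
    # Claim 13: E = 1 - min(n, 1); fully overexposed regions get E = 0.
    return 1.0 - min(n, 1.0)

def brightness_adjustment_parameter(A, E):
    # Claim 14: a = (A - 1) * E + 1.
    return (A - 1.0) * E + 1.0

A = 4.0  # preset adjustment parameter (illustrative value)
a_dark = brightness_adjustment_parameter(A, overexposure_mask(0.0))  # non-overexposed pixel
a_over = brightness_adjustment_parameter(A, overexposure_mask(2.5))  # overexposed pixel
# Non-overexposed pixels are brightened by the full factor A (a = 4.0), while
# overexposed pixels are left unchanged (a = 1.0), protecting highlights when
# the reference image is brightened.
```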
15. A night scene high dynamic range image fusion device is characterized by comprising:
the image fusion device comprises an acquisition unit (1) for acquiring a plurality of images to be fused, wherein the images to be fused at least comprise any two of overexposed images, underexposed images and normal exposure images;
the determining unit (2) is used for determining a reference image after brightness adjustment is carried out according to the underexposed image or the normal exposed image;
the adjusting unit (3) is used for adjusting the brightness of the image to be fused according to the reference image to obtain an adjusted image to be fused;
and the fusion unit (4) is used for fusing the adjusted images to be fused to obtain an HDR fused image.
16. An electronic device comprising a processor and a memory, wherein the memory stores a control program which, when executed by the processor, implements the night scene high dynamic range image fusion method according to any one of claims 1 to 14.
17. A computer readable storage medium storing instructions which, when loaded and executed by a processor, carry out a night-scene high dynamic range image fusion method according to any one of claims 1 to 14.
CN201911009224.8A 2019-10-23 2019-10-23 Night scene high dynamic range image fusion method and device and electronic equipment Active CN110751608B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911009224.8A CN110751608B (en) 2019-10-23 2019-10-23 Night scene high dynamic range image fusion method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN110751608A true CN110751608A (en) 2020-02-04
CN110751608B CN110751608B (en) 2022-08-16

Family

ID=69279411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911009224.8A Active CN110751608B (en) 2019-10-23 2019-10-23 Night scene high dynamic range image fusion method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110751608B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105894449A (en) * 2015-11-11 2016-08-24 乐卡汽车智能科技(北京)有限公司 Method and system for overcoming abrupt color change in image fusion processes
CN105976325A (en) * 2016-06-29 2016-09-28 上海小蚁科技有限公司 Method for adjusting brightness of multiple images
CN108205796A (en) * 2016-12-16 2018-06-26 大唐电信科技股份有限公司 A kind of fusion method and device of more exposure images
WO2018136373A1 (en) * 2017-01-20 2018-07-26 Microsoft Technology Licensing, Llc Image fusion and hdr imaging
US20180260941A1 (en) * 2017-03-07 2018-09-13 Adobe Systems Incorporated Preserving color in image brightness adjustment for exposure fusion
CN109767413A (en) * 2019-01-11 2019-05-17 深圳岚锋创视网络科技有限公司 A kind of the HDR method, apparatus and portable terminal of anti-motion artifacts
CN110166709A (en) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 Night scene image processing method, device, electronic equipment and storage medium


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021082580A1 (en) * 2019-10-31 2021-05-06 北京迈格威科技有限公司 Night scene high dynamic range image generation method, device, and electronic apparatus
WO2021184496A1 (en) * 2020-03-17 2021-09-23 捷开通讯(深圳)有限公司 Image fusion method and apparatus, storage medium and mobile terminal
CN111311532A (en) * 2020-03-26 2020-06-19 深圳市商汤科技有限公司 Image processing method and device, electronic device and storage medium
WO2021223094A1 (en) * 2020-05-06 2021-11-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for reducing noise, and computer usable medium storing software for implementing the method
CN112651918A (en) * 2021-01-15 2021-04-13 北京小米松果电子有限公司 Image processing method and device, electronic device and storage medium
CN113781370A (en) * 2021-08-19 2021-12-10 北京旷视科技有限公司 Image enhancement method and device and electronic equipment
WO2023020201A1 (en) * 2021-08-19 2023-02-23 北京旷视科技有限公司 Image enhancement method and electronic device
CN117061841A (en) * 2023-06-12 2023-11-14 深圳市博盛医疗科技有限公司 Dual-wafer endoscope imaging method and imaging device

Also Published As

Publication number Publication date
CN110751608B (en) 2022-08-16

Similar Documents

Publication Publication Date Title
CN110751608B (en) Night scene high dynamic range image fusion method and device and electronic equipment
CN110611750B (en) Night scene high dynamic range image generation method and device and electronic equipment
CN108335279B (en) Image fusion and HDR imaging
CN109218628B (en) Image processing method, image processing device, electronic equipment and storage medium
US10410327B2 (en) Shallow depth of field rendering
EP3694203A1 (en) Method and device for obtaining exposure compensation value of high-dynamic-range image
US9826149B2 (en) Machine learning of real-time image capture parameters
US9311901B2 (en) Variable blend width compositing
US8520090B2 (en) Methods and apparatuses for image processing
US20180308225A1 (en) Systems and techniques for automatic image haze removal across multiple video frames
CN110009587B (en) Image processing method, image processing device, storage medium and electronic equipment
CN112565636B (en) Image processing method, device, equipment and storage medium
TWI697867B (en) Metering compensation method and related monitoring camera apparatus
CN112541868B (en) Image processing method, device, computer equipment and storage medium
Guthier et al. Flicker reduction in tone mapped high dynamic range video
CN113674193A (en) Image fusion method, electronic device and storage medium
CN111369471A (en) Image processing method, device, equipment and storage medium
CN112164007A (en) Image display method and apparatus, terminal and readable storage medium
CN110493515B (en) High dynamic range shooting mode starting method and device, storage medium and electronic equipment
Choi et al. A method for fast multi-exposure image fusion
Yang et al. Mantissa-exponent-based tone mapping for wide dynamic range image sensors
CN112561906A (en) Image processing method, device, equipment and medium
CN111917986A (en) Image processing method, medium thereof, and electronic device
CN112312035B (en) Exposure parameter adjusting method, exposure parameter adjusting device and electronic equipment
CN110874816B (en) Image processing method, device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant