CN113538265A - Image denoising method and device, computer readable medium and electronic device - Google Patents


Info

Publication number
CN113538265A
CN113538265A
Authority
CN
China
Prior art keywords
image
denoising
parameter
under
exposure condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110764536.0A
Other languages
Chinese (zh)
Inventor
王舒瑶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110764536.0A priority Critical patent/CN113538265A/en
Publication of CN113538265A publication Critical patent/CN113538265A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to the field of image processing technologies, and in particular to an image denoising method and apparatus, a computer-readable medium, and an electronic device. The method comprises the following steps: acquiring a current image and a corresponding reference image under a preset exposure condition, and determining a first image parameter of the current image; querying a noise parameter lookup table according to the preset exposure condition to obtain a second image parameter; comparing the first image parameter with the second image parameter, and performing first denoising processing on the current image according to the comparison result to obtain a preliminary denoised image under the preset exposure condition, while generating a denoising degree parameter according to the denoising result of the current image; performing second fusion denoising processing on the preliminary denoised image and the corresponding reference image based on the denoising degree parameter to obtain a fusion image; and performing third fusion denoising processing on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image. The method can effectively eliminate noise in exposure images and ensure the image denoising effect.

Description

Image denoising method and device, computer readable medium and electronic device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image denoising method, an image denoising device, a computer readable medium, and an electronic device.
Background
In the field of image denoising, some schemes fuse images with different exposure effects to remove image noise, and control the detail rendering of bright and dark areas by means of the exposure duration. However, when such a scheme is used for image fusion, residual noise inevitably remains in the fused image.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides an image denoising method, an image denoising device, a computer readable medium, and an electronic device, which can effectively eliminate exposure image noise, ensure image denoising effect, and avoid image detail loss.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image denoising method, including:
acquiring a current image and a corresponding reference image under a preset exposure condition, and determining a first image parameter of the current image;
inquiring a noise parameter lookup table according to the preset exposure condition to obtain a corresponding second image parameter;
comparing the first image parameter with the second image parameter under the preset exposure condition, performing first denoising processing on the current image according to an image parameter comparison result to obtain a preliminary denoised image under the preset exposure condition, and generating a denoising degree parameter according to a denoising result of the current image;
performing second fusion denoising processing on the preliminary denoising image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter to obtain a fusion image;
and performing third fusion denoising processing on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
According to a second aspect of the present disclosure, there is provided an image denoising apparatus, comprising:
the image acquisition module is used for acquiring a current image and a corresponding reference image under a preset exposure condition and determining first image parameters of the current image and the reference image; wherein the preset exposure conditions comprise at least two different exposure conditions;
the noise parameter lookup table query module is used for querying a noise parameter lookup table according to the preset exposure condition to acquire a corresponding second image parameter;
the first denoising processing module is used for comparing the first image parameter and the second image parameter under the preset exposure condition, performing first denoising processing on the current image according to an image parameter comparison result to obtain a preliminary denoising image under the preset exposure condition, and generating a denoising degree parameter according to a denoising result of the current image;
the second denoising processing module is used for carrying out second fusion denoising processing on the preliminary denoising image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter so as to obtain a fusion image;
and the third denoising processing module is used for performing third fusion denoising processing on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the image denoising method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image denoising method described above.
According to the image denoising method provided by the embodiments of the present disclosure, the second image parameter is obtained by querying the noise parameter lookup table, so that the first image parameter of the current image can first be compared with the second image parameter, and the first denoising processing can be performed according to the comparison result; this eliminates noise under the exposure environment while avoiding loss of image detail, yielding a preliminary denoised image and a denoising degree parameter. The denoising degree parameter then guides the image fusion of the preliminary denoised image and the reference image, and second denoising processing is performed to eliminate random noise. Finally, the fusion images under different exposure conditions are fused, and third denoising processing is performed to generate a denoised high dynamic range image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a diagram of an image denoising method in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a diagram of a method of constructing a noise parameter look-up table in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a gray gradient image in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a gradient gray scale checkerboard image of a grid size in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a gradient gray scale checkerboard image of another grid size in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a gradient gray scale checkerboard image of another grid size in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a flow chart of an image denoising method in an exemplary embodiment of the present disclosure;
fig. 8 schematically illustrates a composition diagram of an image denoising apparatus according to an exemplary embodiment of the present disclosure;
fig. 9 schematically illustrates a structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art, when a scheme of fusing and denoising multi-frame exposure images is used, the detail expressiveness of bright and dark areas can be controlled according to the exposure duration. In theory, a long-exposure frame has low noise in dark areas but is overexposed in bright areas, while a short-exposure frame has rich detail in bright areas but heavy noise in dark areas. However, during image fusion, it is inevitable that the dark areas of the short-exposure frame and the bright areas of the long-exposure frame are sometimes selected for fusion, which leaves residual noise in the fused image.
In view of the above-mentioned shortcomings and drawbacks of the prior art, the exemplary embodiment provides an image denoising method, which performs denoising before fusion of multiple frames of exposed images, thereby avoiding the problem of noise in the fused images. Referring to fig. 1, the image denoising method may include the following steps:
s11, acquiring a current image and a corresponding reference image under a preset exposure condition, and determining a first image parameter of the current image;
s12, inquiring a noise parameter lookup table according to the preset exposure condition to obtain a corresponding second image parameter;
s13, comparing the first image parameter and the second image parameter under the preset exposure condition, performing first denoising processing on the current image according to an image parameter comparison result to obtain a preliminary denoised image under the preset exposure condition, and generating a denoising degree parameter according to a denoising result of the current image;
s14, performing second fusion denoising processing on the preliminary denoising image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter to obtain a fusion image;
and S15, performing third fusion denoising processing on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
In the image denoising method provided in this exemplary embodiment, on one hand, the second image parameter is obtained by querying the noise parameter lookup table, so that the first image parameter of the current image can first be compared with the second image parameter, and the first denoising processing can be performed according to the comparison result; this removes noise generated in the exposure environment while avoiding loss of image detail, and yields a preliminary denoised image and a denoising degree parameter. On another hand, the denoising degree parameter guides the image fusion of the preliminary denoised image and the reference image, and second denoising processing is performed to eliminate random noise. Finally, the fusion images under different exposure conditions are fused, and third denoising processing is performed to generate a denoised high dynamic range image.
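The overall flow of steps S11-S15 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the global-variance shortcut, the toy spatial filter, the linear blends, and the equal-weight exposure fusion are all assumptions, since the patent leaves the exact filters open.

```python
import numpy as np

def denoise_hdr(frames, refs, lut, d1=0.5, var_diff_max=4.0, para_adjust=1.0):
    """End-to-end sketch of steps S11-S15 for one scene.

    frames/refs: dicts mapping exposure name ("short"/"normal"/"over")
    to float grayscale arrays; lut maps exposure name to a (256, 2)
    array of per-gray-level (mean, variance).
    """
    fused = {}
    for exp, img in frames.items():
        # S11: first image parameter (global variance, reused per pixel)
        var_img = ((img - img.mean()) ** 2).mean()
        # S12: second image parameter looked up from the noise LUT
        var_lut = lut[exp][img.astype(int).clip(0, 255), 1]
        # S13: spatial strength d2 = d1 * (1 - varDiff / varDiffMax)
        var_diff = np.clip(var_img - var_lut, 0.0, var_diff_max)
        d_snr = d1 * (1.0 - var_diff / var_diff_max)
        denoised = (1 - d_snr) * img + d_snr * img.mean()   # toy spatial filter
        # S14: temporal fusion guided by dTnr = (1 - dSnr) * paraAdjust
        d_tnr = (1.0 - d_snr) * para_adjust
        fused[exp] = (1 - d_tnr) * denoised + d_tnr * refs[exp]
    # S15: fuse the exposures (equal weights here, for illustration only)
    return sum(fused.values()) / len(fused)
```

With a flat input and matching reference, the sketch returns the input unchanged, which is a quick sanity check that the blending weights sum to one at every stage.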
Hereinafter, the steps of the image denoising method in this exemplary embodiment will be described in more detail with reference to the drawings and the examples.
In this exemplary embodiment, the method described above may, for example, be applied to a server: a user may upload captured image data, or image frame data obtained by decomposing video data, to the server through a terminal device, and the server performs the image denoising calculation in response to the received image data. Alternatively, the method may be applied to an intelligent terminal device with sufficient computing capability, such as a mobile phone, tablet, or computer. The user can start the image denoising calculation after selecting an existing image or capturing a new one.
In step S11, a current image under a preset exposure condition and a corresponding reference image are acquired, and a first image parameter of the current image is determined.
In the present exemplary embodiment, the above-described method is applied to a terminal device with a shooting function, for example, a mobile phone equipped with a camera. The preset exposure conditions may include a short exposure condition, a normal exposure condition, and an overexposure condition, which may be configured by controlling the exposure duration. For example, the overexposure duration may be 40 ms, the normal exposure duration 20 ms, and the short exposure duration 10 ms. Alternatively, when the user captures an image with the camera function, the overexposure and short exposure durations may be configured according to the normal exposure duration of the current scene. For example, in a night view mode, a portrait mode, or a general mode, different overexposure and short exposure durations may be configured according to the current normal exposure duration; that is, different shooting modes and/or shooting scenes may each be configured with their own short exposure, normal exposure, and overexposure durations.
For example, when a user takes a picture with a camera, the image content currently captured by the lens can be cached in real time. For example, three consecutive frames of images may be buffered; and for each frame of image which is cached, obtaining a corresponding short-exposure image, a corresponding normal-exposure image and a corresponding overexposure image according to a preset short-exposure condition, a preset normal-exposure condition and a preset overexposure condition. Specifically, the image content in the preview interface is photographed under the short exposure condition, the normal exposure condition and the overexposure condition, respectively, to obtain the corresponding short exposure image, normal exposure image and overexposure image. That is, the buffered consecutive three-frame images correspond to nine images. Or, only the normal exposure image corresponding to the normal exposure condition may be stored in the buffer, and then the calculation is performed on the normal exposure image based on the short exposure condition and the overexposure condition to obtain the corresponding short exposure image and overexposure image.
When a user clicks a shutter to shoot and acquire a current image, shooting is performed by using a short exposure condition, a normal exposure condition and an overexposure condition, and when the current image with normal exposure is acquired, a corresponding short exposure image and an overexposure image are acquired. Meanwhile, an adjacent frame image before the current image may be used as the reference image. In some embodiments, only images of the current image under the three exposure conditions may be acquired. In other embodiments, the same short exposure condition, normal exposure condition, and overexposure condition as the current image may be used simultaneously to obtain a normally exposed reference image, and a short exposed reference image and an overexposed reference image.
Alternatively, the current image may be a frame in a video, and the reference image is the frame immediately preceding the current image, with the two frames being consecutive. In this case the reference image has already gone through the calculation once (as the previous current image), so its images under the short exposure or overexposure conditions may already be buffered.
The first image parameter may be an image variance parameter. After acquiring the current image with its corresponding short-exposure and overexposure images, and the reference image with its corresponding short-exposure and overexposure reference images, the image variance corresponding to each image may be calculated. The image variance can be computed from the gray value of each pixel in the image: the variance is the sum of the squared differences between each pixel's gray value and the image's average gray value, divided by the total number of pixels. The first image parameter thus expresses the variance of the pixel gray values of the current image under the normal exposure condition, and the same applies to images under the other exposure conditions.
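The variance calculation just described can be sketched as follows; this is an illustrative helper, not code from the patent:

```python
import numpy as np

def image_variance(img):
    """Full-frame variance of the pixel gray values: the sum of squared
    deviations from the mean gray value, divided by the total number of
    pixels (i.e., the population variance of the gray values)."""
    gray = img.astype(np.float64)
    mu = gray.mean()
    return float(((gray - mu) ** 2).mean())
```

In practice the same helper would be applied to the current image and to each of its short-exposure and overexposure counterparts.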
Alternatively, in some embodiments, only the image variances of the current image and its corresponding short-exposure and overexposure images may be calculated. If the current image is the first frame of the video, the current image itself may be used as the reference image.
In step S12, the noise parameter lookup table is queried according to the preset exposure condition to obtain a corresponding second image parameter.
In this exemplary embodiment, when the current image is obtained, the noise parameter lookup table may be queried according to three different exposure conditions, and the second image parameters corresponding to the three different exposure conditions may be determined. The second image parameter may be an image variance value of the corresponding gray-scale value under the three exposure conditions.
In this exemplary embodiment, specifically, the method may further include: and constructing the noise parameter lookup table in advance. Specifically, the image parameters of the image corresponding to the target gray scale condition may be collected under a preset exposure condition, and the noise parameter lookup table may be constructed according to the statistical result of the image parameters.
Referring to fig. 2, the specific step of constructing the noise parameter lookup table may include:
step S21, configuring any one or combination of any multiple of basic image sequence, gray scale gradient image sequence and gradient gray scale grid pattern image sequence; the basic image sequence comprises a plurality of gray pure-color basic images which change according to preset gray intervals, the gray scale gradient image sequence comprises a plurality of gray scale gradient images with different gray scale numbers, and the gradient gray scale grid pattern image sequence comprises a plurality of gradient gray scale grid pattern images with different grid sizes;
step S22, acquiring exposure images corresponding to the basic image, the gray scale gradient image and the gradient gray scale grid pattern image under the preset exposure condition, and calculating image parameters corresponding to each exposure image; wherein the image parameters comprise an image mean parameter and an image variance parameter;
step S23, constructing the noise parameter lookup table based on the image parameters of each exposure image under the preset exposure condition.
For example, a first test scene may be set. The basic image sequence may include solid-color gray images with a gray interval of 8; for example, 32 groups may be configured, where the gray value of the first group is 0, the gray value of the second group is 8, and each successive group increases by a gray interval of 8. The 32 groups of images are collected under each of the short exposure, normal exposure, and overexposure conditions. For the 3 x 32 groups of acquired solid-color gray images, the full-frame mean and variance of each image are calculated. These solid-color gray images captured under different exposure conditions reveal the noise characteristics of different gray regions under each exposure condition.
A second test scene is set for the gray-scale gradient image sequence, such as the gray-scale gradient image shown in fig. 3; the number of gradient gray levels may be configured as 2, 4, 8, 16, 32, and 64. Under the short exposure, normal exposure, and overexposure conditions, 6 x 3 groups of images are collected, and the mean and variance of each image are calculated.
A third test scene is set for the gradient gray-scale checkerboard image sequence, in which checkerboard patterns of different cell sizes can be configured, as shown in fig. 4, fig. 5, and fig. 6; for example, large, medium, and small sizes are configured, where the large cell size is 512 x 512, the medium cell size is 256 x 256, and the small cell size is 128 x 128. The gray scale may change in gradations of 8, 16, and 32 levels, or the gradation of the cell gray scale may vary randomly. Checkerboard patterns of the three grid sizes are collected under the three exposure conditions, and the gray mean and variance of each image are calculated.
Combining the image data of the three types of images under the three exposure conditions yields the variance and mean corresponding to different gray scales. Interpolation and averaging operations then produce the mean and variance for each of the 256 gray scales (0-255) under the three exposure conditions. Through images in these various forms, noise information under the different exposure conditions can be obtained. Since the brightness of adjacent gray scales may influence the noise, testing a large number of sample images in different test scenes captures the noise characteristics under different exposure conditions.
Based on the means and variances obtained for the three exposure conditions, a noise parameter lookup table (LUT), which can also be called a noise model, can be constructed to represent the noise characteristics under the three exposure conditions. The noise model contains 256 x 2 entries for each of the long exposure, normal exposure, and short exposure conditions, i.e., 3 x 256 x 2 entries in total, where 256 corresponds to the gray scale values, 2 corresponds to the variance and mean values, and 3 corresponds to the three exposure conditions.
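The 3 x 256 x 2 table above can be sketched as follows. The input format and the use of plain linear interpolation are assumptions; the patent only says that interpolation and averaging fill in the 0-255 gray scales from the measured charts:

```python
import numpy as np

def build_noise_lut(samples):
    """Build a hypothetical noise LUT from chart measurements.

    samples: dict mapping exposure name -> list of (gray_level, mean, var)
    tuples measured from the test charts. Missing gray levels are filled
    by linear interpolation over 0..255. Returns a dict mapping exposure
    name -> (256, 2) array, column 0 = mean, column 1 = variance.
    """
    lut = {}
    for exp, rows in samples.items():
        rows = sorted(rows)                                   # sort by gray level
        levels = np.array([r[0] for r in rows], dtype=np.float64)
        means = np.array([r[1] for r in rows], dtype=np.float64)
        varis = np.array([r[2] for r in rows], dtype=np.float64)
        grid = np.arange(256, dtype=np.float64)
        lut[exp] = np.stack([np.interp(grid, levels, means),
                             np.interp(grid, levels, varis)], axis=1)
    return lut
```

Building one such table per exposure condition gives exactly the 3 x 256 x 2 data layout the text describes.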
In step S13, the first image parameter and the second image parameter under the preset exposure condition are compared, so as to perform a first denoising process on the current image according to an image parameter comparison result to obtain a preliminary denoised image under the preset exposure condition, and generate a denoising degree parameter according to a denoising result of the current image.
In the present exemplary embodiment, under the three exposure conditions, three noise variance maps, i.e., the second image parameters, are obtained from the variances under different gray scale conditions. At the same calculation scale, the image variances of the current image and its corresponding short-exposure and long-exposure images, i.e., the first image parameters, can be obtained through calculation. Alternatively, in some embodiments, the second image parameter may be a noise variance map formed by querying the noise parameter lookup table with the gray value of each pixel of the current image under the normal exposure condition and taking the corresponding variance value.
Under the normal exposure condition, the variance value of the first image parameter of the current image is compared pixel by pixel with the variance value of the second image parameter, and spatial domain denoising is performed according to the comparison result. For example, spatial domain denoising may use a non-local means (NLM) denoising algorithm, a Gaussian filter, or a Laplacian-pyramid-based filter, with the variance values compared on a pixel-by-pixel basis. If the variance value of the current image is greater than the variance value of the noise model, the pixel is considered to be detail, and the spatial filtering strength for that pixel is weakened. For example, suppose the user-set (or default) spatial denoising strength parameter is d1, and the pixel-by-pixel difference between the variance calculated from the current image and the LUT variance is varDiff. A maximum value varDiffMax is set empirically, and varDiff is clamped to the range [0, varDiffMax]. After normalization, degreePara = varDiff / varDiffMax, and the attenuated strength is d2 = d1 * (1 - degreePara). The short-exposure and long-exposure images corresponding to the current image are processed with the same method.
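The attenuation rule above (varDiff clamped to [0, varDiffMax], degreePara = varDiff / varDiffMax, d2 = d1 * (1 - degreePara)) can be sketched per pixel as follows; function and parameter names are illustrative:

```python
import numpy as np

def spatial_strength(var_img, var_lut, d1, var_diff_max):
    """Per-pixel spatial filtering strength.

    var_img: variance computed from the current image (per pixel or map).
    var_lut: variance from the noise LUT for the same gray levels.
    Pixels whose variance exceeds the LUT variance are treated as detail
    and filtered less; d2 falls to 0 when varDiff reaches varDiffMax.
    """
    var_diff = np.clip(var_img - var_lut, 0.0, var_diff_max)  # clamp to [0, max]
    degree_para = var_diff / var_diff_max                     # normalize to [0, 1]
    return d1 * (1.0 - degree_para)                           # attenuated strength d2
```

Normalizing the resulting d2 map yields the spatial domain denoising degree map that step S13 passes on as the denoising degree parameter.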
Meanwhile, when the spatial domain denoising is performed for the first time, the spatial domain denoising degree is recorded pixel by pixel, normalized and recorded as a spatial domain denoising degree map, namely a denoising degree parameter.
In step S14, a second fusion denoising process is performed on the preliminary denoised image under the preset exposure condition and the corresponding reference image based on the denoising degree parameter, so as to obtain a fusion image.
In this exemplary embodiment, after the first spatial domain denoising processing is performed on the current image, short-exposure image, and long-exposure image (corresponding to the normal exposure, short exposure, and long exposure conditions, respectively) to obtain the preliminary denoised images under the three exposure conditions, temporal denoising can be performed on each preliminary denoised image and the reference image under the same exposure condition, implementing the second fusion denoising processing.
Specifically, for the normal exposure condition, the preliminary denoised image corresponding to the current image and the corresponding reference image are fused, with the denoising degree parameter under the normal exposure condition, i.e., the spatial domain denoising degree map, guiding the inter-frame fusion ratio. For the short exposure condition, temporal denoising is performed on the preliminary denoised image of the current image's short-exposure image and the reference image under the corresponding short exposure condition, with the spatial domain denoising degree map under the short exposure condition guiding the inter-frame fusion ratio. For the long exposure condition, temporal denoising is performed on the preliminary denoised image of the current image's long-exposure image and the reference image under the corresponding long exposure condition, with the spatial domain denoising degree map under the long exposure condition guiding the inter-frame fusion ratio. The fusion ratio between temporal denoising frames is guided by the spatial domain denoising degree map, and the formula may comprise:
dTnr = (1 - dSnr) * paraAdjust
where dSnr represents the spatial domain denoising degree and dTnr represents the temporal denoising degree; paraAdjust may be a user-set value or a default value, and is used to adjust the overall strength of the temporal denoising.
According to this formula, the inter-frame fusion ratio of the temporal denoising, that is, the temporal denoising degree, can be calculated from the denoising degree parameter. Once the denoising degree parameters under the three exposure conditions are obtained, the temporal inter-frame fusion ratios under the three exposure conditions can be determined.
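As a minimal sketch (in Python with NumPy; the function name and clipping to [0, 1] are assumptions, since the patent gives only the formula), the mapping from the spatial denoising degree map to the temporal denoising degree can be written as:

```python
import numpy as np

def temporal_denoise_degree(dSnr, paraAdjust=1.0):
    """Compute the temporal denoising degree map from the spatial one
    via dTnr = (1 - dSnr) * paraAdjust, clipped to [0, 1]."""
    dSnr = np.clip(np.asarray(dSnr, dtype=np.float32), 0.0, 1.0)
    return np.clip((1.0 - dSnr) * paraAdjust, 0.0, 1.0)
```

Pixels that received little spatial denoising (small dSnr) are thus given stronger temporal fusion.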
In some embodiments, the reference image used in the temporal fusion may be an original image captured under the three exposure conditions, or an image that has already undergone spatial domain denoising. When temporal denoising is applied to the spatially denoised image, consecutive frames are fused and the inter-frame fusion ratio is guided by the denoising degree parameter: for a given pixel, the weaker the spatial denoising degree, the stronger the temporal denoising degree can be, which effectively removes random noise between frames. For example, temporal denoising may average pixels whose gray value difference between the current image and the reference image is smaller than a certain threshold and use the result as the gray value of the current frame; or it may keep the larger of the two pixel gray values as the gray value of the current frame. Alternatively, temporal denoising may be performed based on Gaussian pyramid filtering.
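One hedged way to realize the threshold-averaging variant described above (the function name and the motion threshold value are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def temporal_denoise(cur, ref, dTnr, thresh=10):
    """Fuse the current frame with the reference frame only where the
    gray-value difference is below `thresh` (treated as random noise,
    not motion); the per-pixel blend weight is the temporal degree dTnr."""
    cur_f = cur.astype(np.float32)
    ref_f = ref.astype(np.float32)
    static = np.abs(cur_f - ref_f) < thresh        # likely noise, safe to fuse
    blended = (1.0 - dTnr) * cur_f + dTnr * ref_f  # inter-frame fusion ratio
    out = np.where(static, blended, cur_f)         # keep moving pixels as-is
    return np.clip(out, 0, 255).astype(np.uint8)
```

Where the frames differ by more than the threshold, the pixel is treated as motion and left untouched, which avoids ghosting at the cost of residual noise in moving regions.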
In step S15, a third fusion denoising process is performed on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
In this exemplary embodiment, after the three temporally denoised fusion images under the short exposure, normal exposure, and long exposure (overexposure) conditions are obtained, high dynamic range image fusion can be applied to the three fusion images to complete the third fusion denoising process, thereby fully denoising the current image. The fused result is a denoised high dynamic range image that retains the highlight details of the short-exposure image and the shadow details of the long-exposure image.
For example, the third fusion denoising process may proceed as follows. A first gray threshold thd1 is set to 64 and a second gray threshold thd2 is set to 196. For each pixel in the image, let the gray value lumMap be Y. If lumMap < thd1, the weight value mask is set to 0; if thd1 <= lumMap < thd2, mask is set to 1; if thd2 <= lumMap <= 255, mask is set to 2. The output image of the high dynamic range fusion is then outImg = shortExpImg * (mask == 0) + midExpImg * (mask == 1) + longExpImg * (mask == 2), where shortExpImg is the short-exposure image, midExpImg is the normal-exposure image, and longExpImg is the long-exposure image.
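The mask-based selection above can be sketched as follows (a literal transcription of the example mapping; the identifier midExpImg for the normal-exposure image is an assumption, since the original text garbles that name):

```python
import numpy as np

def hdr_fuse(shortExpImg, midExpImg, longExpImg, lumMap, thd1=64, thd2=196):
    """Per-pixel HDR fusion following the example mapping: mask 0
    (lum < thd1) selects the short-exposure image, mask 1 the
    normal-exposure image, and mask 2 (lum >= thd2) the long-exposure one."""
    mask = np.full(lumMap.shape, 1, dtype=np.uint8)
    mask[lumMap < thd1] = 0
    mask[lumMap >= thd2] = 2
    return np.where(mask == 0, shortExpImg,
                    np.where(mask == 1, midExpImg, longExpImg))
```

In practice the hard mask would usually be feathered (e.g. with a small blur) to avoid visible seams at the threshold boundaries.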
The image denoising method provided by the embodiments of the present disclosure tests noise images of various gray levels under different exposure levels in advance, summarizes the noise characteristics at each exposure level, and builds the noise parameter table in the form of a noise model, which provides the noise fluctuation range under each exposure condition and determines the noise characteristics. Referring to fig. 7, images are captured under the normal exposure, short exposure (underexposure), and long exposure (overexposure) conditions; the first-level spatial domain denoising degree is guided by the noise model, removing the noise of each exposure image while avoiding loss of detail and producing a spatial domain denoising degree map. The second-level temporal denoising fusion is then guided by the spatial domain denoising degree map of each exposure image under the same exposure condition, and random noise is removed by superimposing the spatially denoised processing frame onto the reference frame. Finally, the third level fuses the images under each exposure condition to generate a denoised high dynamic range image. The three-level denoising logic combines spatial domain denoising and temporal denoising and coordinates the strengths of the two denoising modes, reducing the noise of the fused image. The resulting high dynamic range image is of higher quality and displays more image detail.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 8, an embodiment of the present example further provides an image denoising apparatus 80, including: an image acquisition module 801, a noise parameter lookup table query module 802, a first denoising processing module 803, a second denoising processing module 804, and a third denoising processing module 805.
the image acquisition module 801 may be configured to acquire a current image and a corresponding reference image under a preset exposure condition, and determine first image parameters of the current image and the reference image; wherein the preset exposure condition comprises at least two different exposure conditions.
The noise parameter lookup table query module 802 may be configured to query a noise parameter lookup table according to the preset exposure condition to obtain a corresponding second image parameter.
The first denoising processing module 803 may be configured to compare the first image parameter and the second image parameter under the preset exposure condition, perform a first denoising process on the current image according to an image parameter comparison result to obtain a preliminary denoising image under the preset exposure condition, and generate a denoising degree parameter according to a denoising result of the current image.
The second denoising processing module 804 may be configured to perform a second fusion denoising process on the preliminary denoised image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter to obtain a fusion image.
The third denoising processing module 805 may be configured to perform a third fusion denoising process on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
In one example of the present disclosure, the preset exposure condition includes: short exposure conditions, normal exposure conditions, and overexposure conditions.
In one example of the present disclosure, the first and second image parameters comprise an image variance parameter;
the first denoising processing module 803 may be configured to compare the first image parameter with the second image parameter under the preset exposure condition to obtain an image variance difference, and to guide spatial domain denoising of the current image according to the image variance difference to obtain a preliminary denoised image.
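A minimal sketch of how the variance comparison could guide spatial denoising follows. The patent does not fix a particular spatial filter, so a classical local-statistics (Lee/Wiener-style) filter is assumed here: where the local variance is close to the calibrated noise variance the area is flat and is denoised strongly, and where it greatly exceeds the noise variance (edges, texture) the pixel is preserved.

```python
import numpy as np

def box_mean(x, k=5):
    """Local mean via a padded sliding window (simple box filter)."""
    pad = k // 2
    xp = np.pad(x, pad, mode='edge')
    win = np.lib.stride_tricks.sliding_window_view(xp, (k, k))
    return win.mean(axis=(-1, -2))

def spatial_denoise(img, lut_var, k=5):
    """Blend toward the local mean where local variance is close to the
    calibrated noise variance `lut_var`; returns the denoised image and
    the spatial denoising degree map dSnr used by the temporal stage."""
    img = img.astype(np.float32)
    mean = box_mean(img, k)
    var = box_mean(img * img, k) - mean * mean
    signal_var = np.clip(var - lut_var, 0.0, None)
    gain = signal_var / (signal_var + lut_var + 1e-6)  # 0 in flat areas
    out = mean + gain * (img - mean)
    dSnr = 1.0 - gain          # strong spatial denoising where area is flat
    return np.clip(out, 0, 255).astype(np.uint8), dSnr
```

The returned dSnr map is exactly the kind of per-pixel denoising degree parameter the temporal stage can invert into an inter-frame fusion ratio.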
In an example of the present disclosure, the first denoising processing module 803 may be configured to determine a corresponding denoising strength according to a variance difference between the first image parameter and the second image parameter of each pixel in the current image, and construct the denoising degree parameter based on the denoising strength.
In an example of the present disclosure, the second denoising processing module 804 may be configured to guide the fusion ratio between the preliminary denoised image and the corresponding reference image based on the denoising degree parameter, and perform temporal denoising on the preliminary denoised image to obtain the fusion image.
In one example of the present disclosure, the apparatus 80 may include a noise parameter lookup table building module.
The noise parameter lookup table construction module may be configured to pre-construct the noise parameter lookup table by acquiring, under the preset exposure condition, image parameters of images corresponding to target gray scale conditions, and constructing the noise parameter lookup table from the statistical results of the image parameters.
In one example of the present disclosure, the noise parameter lookup table construction module may be configured to: configure any one or a combination of a basic image sequence, a gray scale gradient image sequence, and a gradient gray scale grid pattern image sequence, where the basic image sequence includes a plurality of solid gray basic images varying by a preset gray interval, the gray scale gradient image sequence includes a plurality of gray scale gradient images with different numbers of gray levels, and the gradient gray scale grid pattern image sequence includes a plurality of gradient gray scale grid pattern images with different grid sizes; acquire exposure images corresponding to the basic images, the gray scale gradient images, and the gradient gray scale grid pattern images under the preset exposure condition, and calculate the image parameters of the exposure images, the image parameters including an image mean parameter and an image variance parameter; and construct the noise parameter lookup table based on the image parameters of the exposure images under the preset exposure condition.
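An illustrative sketch of such a lookup-table construction follows (the dict-of-lists structure and function names are assumptions; the patent specifies only that mean and variance statistics are collected per exposure condition and gray level):

```python
import numpy as np

def build_noise_lut(captures):
    """captures maps an exposure condition to a list of captured images of
    flat gray patches; each patch contributes one (mean, variance) entry."""
    lut = {}
    for exposure, patches in captures.items():
        entries = [(float(np.mean(p)), float(np.var(p))) for p in patches]
        entries.sort(key=lambda e: e[0])      # index by mean gray level
        lut[exposure] = entries
    return lut

def query_noise_lut(lut, exposure, gray):
    """Return the calibrated noise variance whose mean gray level is
    closest to `gray` under the given exposure condition."""
    return min(lut[exposure], key=lambda e: abs(e[0] - gray))[1]
```

At denoising time, query_noise_lut supplies the "second image parameter" that the observed local variance of the current image is compared against.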
The specific details of each module in the image denoising device are described in detail in the corresponding image denoising method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
FIG. 9 shows a schematic structural diagram of an electronic device suitable for implementing an embodiment of the present invention.
It should be noted that the electronic device 500 shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of the embodiments of the present disclosure.
As shown in fig. 9, the electronic apparatus 500 includes a Central Processing Unit (CPU) 501 that can perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for system operation are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An Input/Output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read therefrom is installed into the storage section 508 as necessary.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. When executed by the Central Processing Unit (CPU) 501, the computer program performs the various functions defined in the system of the present application.
Specifically, the electronic device may be an intelligent mobile terminal device such as a mobile phone, a tablet computer, or a notebook computer. Alternatively, the electronic device may be an intelligent terminal device such as a desktop computer.
It should be noted that the computer readable medium shown in the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
It should be noted that, as another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiment; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 1 or fig. 2.
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image denoising method, comprising:
acquiring a current image and a corresponding reference image under a preset exposure condition, and determining a first image parameter of the current image;
inquiring a noise parameter lookup table according to the preset exposure condition to obtain a corresponding second image parameter;
comparing the first image parameter with the second image parameter under the preset exposure condition, performing first denoising processing on the current image according to an image parameter comparison result to obtain a preliminary denoised image under the preset exposure condition, and generating a denoising degree parameter according to a denoising result of the current image;
performing second fusion denoising processing on the preliminary denoising image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter to obtain a fusion image;
and performing third fusion denoising treatment on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
2. The image denoising method of claim 1, wherein the preset exposure condition comprises: short exposure conditions, normal exposure conditions, and overexposure conditions.
3. The image denoising method of claim 2, wherein the first image parameter and the second image parameter comprise an image variance parameter;
the comparing the first image parameter and the second image parameter under the preset exposure condition to perform a first denoising process on the current image according to an image parameter comparison result to obtain a preliminary denoising image under the preset exposure condition includes:
and comparing the first image parameter with the second image parameter under the preset exposure condition to obtain an image variance difference, and guiding the current image to carry out spatial domain denoising processing according to the image variance difference to obtain a preliminary denoised image.
4. The image denoising method according to claim 2 or 3, wherein the generating a denoising degree parameter according to the denoising result of the current image comprises:
determining corresponding denoising intensity according to variance difference between the first image parameter and the second image parameter of each pixel in the current image, and constructing the denoising degree parameter based on the denoising intensity.
5. The image denoising method according to claim 1 or 2, wherein performing a second fusion denoising process on the preliminary denoised image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter to obtain a fusion image comprises:
and guiding the fusion proportion between the preliminary de-noised image and the corresponding reference image based on the de-noising degree parameter, and carrying out time domain de-noising processing on the preliminary de-noised image to obtain the fusion image.
6. The image denoising method according to claim 1, further comprising: pre-constructing the noise parameter lookup table, including:
and under the preset exposure condition, acquiring image parameters of the image corresponding to the target gray scale condition, and constructing the noise parameter lookup table according to the statistical result of the image parameters.
7. The image denoising method of claim 6, wherein the acquiring image parameters of the image corresponding to the target gray scale condition under the preset exposure condition, and constructing the noise parameter lookup table according to the statistical result of the image parameters comprises:
any one item or combination of any multiple items in the basic image sequence, the gray scale gradient image sequence and the gradient gray scale grid pattern image sequence is configured; the basic image sequence comprises a plurality of gray pure-color basic images which change according to preset gray intervals, the gray scale gradient image sequence comprises a plurality of gray scale gradient images with different gray scale numbers, and the gradient gray scale grid pattern image sequence comprises a plurality of gradient gray scale grid pattern images with different grid sizes;
acquiring exposure images corresponding to the basic image, the gray scale gradient image and the gradient gray scale grid pattern image under the preset exposure condition, and calculating image parameters corresponding to the exposure images; wherein the image parameters comprise an image mean parameter and an image variance parameter;
and constructing the noise parameter lookup table based on the image parameters of the exposure images under the preset exposure condition.
8. An image denoising apparatus, comprising:
the image acquisition module is used for acquiring a current image and a corresponding reference image under a preset exposure condition and determining first image parameters of the current image and the reference image; wherein the preset exposure conditions comprise at least two different exposure conditions;
the noise parameter lookup table query module is used for querying a noise parameter lookup table according to the preset exposure condition to acquire a corresponding second image parameter;
the first denoising processing module is used for comparing the first image parameter and the second image parameter under the preset exposure condition, performing first denoising processing on the current image according to an image parameter comparison result to obtain a preliminary denoising image under the preset exposure condition, and generating a denoising degree parameter according to a denoising result of the current image;
the second denoising processing module is used for carrying out second fusion denoising processing on the preliminary denoising image and the corresponding reference image under the preset exposure condition based on the denoising degree parameter so as to obtain a fusion image;
and the third denoising processing module is used for performing third fusion denoising processing on the fusion images under different preset exposure conditions to obtain a denoised image corresponding to the current image.
9. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out an image denoising method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image denoising method as recited in any one of claims 1 to 7.
CN202110764536.0A 2021-07-06 2021-07-06 Image denoising method and device, computer readable medium and electronic device Pending CN113538265A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110764536.0A CN113538265A (en) 2021-07-06 2021-07-06 Image denoising method and device, computer readable medium and electronic device


Publications (1)

Publication Number Publication Date
CN113538265A true CN113538265A (en) 2021-10-22

Family

ID=78097875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110764536.0A Pending CN113538265A (en) 2021-07-06 2021-07-06 Image denoising method and device, computer readable medium and electronic device

Country Status (1)

Country Link
CN (1) CN113538265A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107635098A (en) * 2017-10-30 2018-01-26 广东欧珀移动通信有限公司 High dynamic range images noise remove method, apparatus and equipment
CN108205796A (en) * 2016-12-16 2018-06-26 大唐电信科技股份有限公司 A kind of fusion method and device of more exposure images
CN110246101A (en) * 2019-06-13 2019-09-17 Oppo广东移动通信有限公司 Image processing method and device
CN112513936A (en) * 2019-11-29 2021-03-16 深圳市大疆创新科技有限公司 Image processing method, device and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination