CN111192226B - Image fusion denoising method, device and system - Google Patents


Info

Publication number
CN111192226B
CN111192226B
Authority
CN
China
Prior art keywords
image, noise, denoising, residual, filtering
Prior art date
Legal status: Active
Application number
CN202010292705.0A
Other languages
Chinese (zh)
Other versions
CN111192226A
Inventor
刘剑君
Current Assignee
NANJING SUNING ELECTRONIC INFORMATION TECHNOLOGY Co.,Ltd.
Original Assignee
Suning Cloud Computing Co Ltd
Priority date
Filing date
Publication date
Application filed by Suning Cloud Computing Co Ltd filed Critical Suning Cloud Computing Co Ltd
Priority to CN202010292705.0A priority Critical patent/CN111192226B/en
Publication of CN111192226A publication Critical patent/CN111192226A/en
Application granted granted Critical
Publication of CN111192226B publication Critical patent/CN111192226B/en


Classifications

    • G06T5/70
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G06T2207/20024 Filtering details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

    (All under G06T — Image data processing or generation, in general; G06T2207/00 — Indexing scheme for image analysis or image enhancement; G06T2207/20 — Special algorithmic details.)


Abstract

The invention discloses an image fusion denoising method, device and system. The method comprises the following steps: processing a noise image with a filtering algorithm to obtain a first denoised image, and computing a first noise residual image from the noise image and the first denoised image by the residual image method; processing the noise image with a first neural network model to obtain a second denoised image, and computing a second noise residual image from the noise image and the second denoised image by the residual image method; and inputting the noise image, the first noise residual image and the second noise residual image together into a second neural network model for fusion denoising, obtaining a target denoised image. By fusing the outputs of the collaborative filtering algorithm and the neural network model, the method obtains a target denoised image that combines a strong denoising effect with preserved image detail, better ensuring image denoising quality.

Description

Image fusion denoising method, device and system
Technical Field
The invention relates to the technical field of image processing, in particular to an image fusion denoising method, device and system.
Background
Image noise is random variation of brightness or color information in an image; the more noise an image contains, the poorer its imaging quality. Image noise affects not only visual judgments of image clarity and accuracy, but also the quality of higher-level image processing results such as edge detection, image segmentation and feature matching. Therefore, to meet the requirements of further image processing, various image denoising methods have appeared in the art, for example: filtering-based methods such as BM3D, deep-learning-based methods such as DnCNN, and prior-model-based methods such as WNNM.
However, each of these denoising methods has drawbacks it cannot overcome. Filtering-based methods are designed for Gaussian white noise; because the assumed noise model does not match the noise distribution of real images, their results retain many noise points and false-color residues. Deep-learning-based methods can meet real-time processing requirements with GPU acceleration, but they discard image information aggressively, so details are lost in the denoised image. Prior-model-based methods are computationally complex and time-consuming and cannot meet real-time processing requirements.
Disclosure of Invention
In order to solve the technical problems of the existing denoising method, the invention provides an image fusion denoising method, a device and a system. The technical scheme is as follows:
in one aspect, an image fusion denoising method is provided, where the method includes:
processing a noise image by using a filtering algorithm to obtain a first de-noised image, and calculating the noise image and the first de-noised image by using a residual image method to obtain a first noise residual image;
processing the noise image by using a first neural network model to obtain a second denoised image, and calculating the noise image and the second denoised image by using the residual image method to obtain a second noise residual image;
and inputting the noise image, the first noise residual image and the second noise residual image into a second neural network model together for fusion denoising, so as to obtain a target denoising image.
Further, inputting the noise image, the first noise residual image and the second noise residual image together into a second neural network model for fusion denoising, and obtaining a target denoised image includes:
cascading the noise image, the first noise residual image and the second noise residual image together according to a color channel, and inputting the cascaded images into a second neural network model together for denoising to obtain the target denoising image.
Further, the processing the noise image by using the filtering algorithm to obtain the first denoised image includes:
carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and adjusting the filtering parameters of the filtering algorithm by using the noise variance, and obtaining the first denoised image by using the filtering algorithm.
Further, adjusting the filter parameters of the filter algorithm using the noise variance comprises:
dividing the sub-images into image blocks, and traversing and searching similar blocks of each image block in the sub-images by taking the mean square error between the image blocks of the sub-images as a distance to construct a similar block group of the sub-images;
performing three-dimensional collaborative hard threshold filtering on the similar block group of the sub-image to obtain a hard threshold filtering and denoising result of the sub-image, wherein a filtering threshold in a hard threshold filtering formula is adjusted according to the noise variance;
reconstructing a complete image by using the hard threshold filtering denoising result of each subimage to obtain a primary denoising image;
and carrying out collaborative wiener filtering on the preliminary de-noised image to obtain the first de-noised image.
Further, performing collaborative wiener filtering on the preliminary denoised image to obtain the first denoised image includes:
dividing image blocks of the preliminary de-noised image, and traversing and searching similar blocks of each image block on the preliminary de-noised image to construct a similar block group of the preliminary de-noised image;
dividing the noise image according to the image blocks of the preliminary de-noised image to obtain image blocks corresponding to the positions of the image blocks of the preliminary de-noised image, and traversing and searching similar blocks for the image blocks of the noise image to construct a similar block group of the noise image;
and carrying out three-dimensional collaborative wiener filtering on the similar block group of the noise image by taking the similar block group of the preliminary de-noising image as a guide to obtain the first de-noising image, wherein the noise variance mean value in a wiener filtering formula is adjusted according to the noise variance.
Further, the method further comprises: calculating the total loss of the noise image by combining a first noise residual loss and a second noise residual loss, and evaluating the quality of the target de-noised image by using the total loss, wherein the first noise residual loss is as follows: the first neural network model processes the loss of forward propagation computation generated by the noise image, and the second noise residual loss is: the second neural network model processes the loss of forward propagation computations produced by the noisy image.
In a second aspect, an image fusion denoising apparatus is provided, the apparatus including:
the first de-noising image acquisition module is used for processing the noise image by using a filtering algorithm to obtain a first de-noising image;
the second denoising image obtaining module is used for processing the noise image by utilizing the first neural network model to obtain a second denoising image;
a residual image calculation module, configured to calculate the noise image and the first denoised image by using a residual image method to obtain a first noise residual image, and calculate the noise image and the second denoised image by using a residual image method to obtain a second noise residual image;
and the fusion denoising module is used for inputting the noise image, the first noise residual image and the second noise residual image into a second neural network model together for fusion denoising to obtain a target denoising image.
The fusion denoising module is specifically configured to cascade the noise image, the first noise residual image and the second noise residual image together according to color channel, and input the cascaded images into the second neural network model for denoising to obtain the target denoised image;
the second neural network model includes: a convolution filter layer; ten repeatedly stacked convolutional layers, each followed by a batch normalization layer and an activation function; and a reconstruction convolutional layer.
Further, the first denoised image obtaining module comprises:
the down-sampling module is used for carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
the noise estimation module is used for respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and the filtering and denoising module is used for adjusting the filtering parameters of the filtering algorithm by using the noise variance and obtaining a first denoised image by using the filtering algorithm.
Further, the filtering and denoising module includes:
a similar block group construction module for dividing the sub-image into image blocks, ergodically searching similar blocks in the sub-image for each image block with a mean square error between the image blocks as a distance, constructing a similar block group of the sub-image, and,
dividing the preliminary denoised image into image blocks, searching similar blocks for each image block on the preliminary denoised image in a traversal manner, constructing a similar block group of the preliminary denoised image, and,
dividing the noise image according to the image blocks of the preliminary de-noised image to obtain image blocks corresponding to the positions of the image blocks of the preliminary de-noised image, traversing similar blocks of the image blocks of the noise image, and constructing a similar block group of the noise image;
the collaborative hard threshold filtering and denoising module is used for carrying out three-dimensional collaborative hard threshold filtering on the similar block group of the subimage to obtain a hard threshold filtering and denoising result of the subimage, wherein the filtering threshold in the hard threshold filtering formula is adjusted according to the noise variance;
the recombination module is used for reconstructing a hard threshold filtering denoising result of each subimage into a complete image to obtain a primary denoising image;
and the collaborative wiener filtering denoising module is used for carrying out three-dimensional collaborative wiener filtering on the similar block group of the noise image by taking the similar block group of the preliminary denoising image as a guide to obtain a first denoising image, wherein the noise variance mean value in the wiener filtering formula is adjusted according to the noise variance.
Further, the apparatus further comprises: the target de-noising image quality evaluation module is used for calculating the total loss of the noise image by combining a first noise residual loss and a second noise residual loss, and evaluating the quality of the target de-noising image by using the total loss, wherein the first noise residual loss is: the first neural network model processes the loss of forward propagation computation generated by the noise image, and the second noise residual loss is: the second neural network model processes the loss of forward propagation computations produced by the noisy image.
In a third aspect, a computer system comprises:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform any of the image fusion denoising methods described above.
The technical scheme provided by the invention has the beneficial effects that:
1. the method adopts a fusion method to combine the noise images processed by the collaborative filtering algorithm and the neural network model algorithm together, so that a target denoising image with both denoising effect and image details is obtained, and the image denoising quality is better ensured;
2. according to the invention, the Bayer pattern is combined to carry out downsampling to obtain the sub-images before noise estimation, so that the noise estimation is more accurate and accords with the actual noise distribution condition of the image;
3. the invention utilizes noise estimation to adjust the parameters of the collaborative hard threshold filtering algorithm, so that the denoising effect of the collaborative hard threshold filtering algorithm is more accurate.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts;
FIG. 1 is a flowchart of an image fusion denoising method according to an embodiment of the present invention;
fig. 2 is a flowchart of a first denoising image obtaining method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of downsampling and full image reconstruction provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of the noise estimation effect provided by the embodiment of the present invention;
FIG. 5 is a schematic diagram of a similar block searching process provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating comparison of denoising effects according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating comparison of denoising effects according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an image fusion denoising apparatus module according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a computer system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Existing image denoising methods mainly comprise: denoising methods based on prior models, denoising methods based on filtering, and denoising methods based on deep learning. Among these three common approaches, prior-model-based methods cannot meet real-time processing requirements and have low processing efficiency; filtering-based methods denoise insufficiently, leaving residual noise points and false colors in the result, although they preserve the detail content of the original image; deep-learning-based methods denoise well but tend to remove part of the original image's information along with the noise, losing image detail. To solve the technical problems of these three existing methods, the embodiments of the present invention combine the strengths of filtering-based and deep-learning-based denoising and provide an image fusion denoising method and device. The specific technical scheme is as follows:
an image fusion denoising method as shown in fig. 1 includes:
processing the noise image by using a filtering algorithm to obtain a first de-noised image, and calculating the noise image and the first de-noised image by using a residual image method to obtain a first noise residual image;
processing the noise image by using the first neural network model to obtain a second denoised image, and calculating the noise image and the second denoised image by using a residual image method to obtain a second noise residual image;
and inputting the noise image, the first noise residual image and the second noise residual image into a second neural network model together for fusion denoising, and obtaining a target denoising image.
It should be noted that, in the embodiment of the present invention, the order of the steps of obtaining the first noise residual image and obtaining the second noise residual image is not limited. The noise image is the original image with noise, denoted I. The first denoised image is the denoising result produced by processing the noise image with the filtering algorithm, denoted I_I. The second denoised image is the denoising result produced by processing the noise image with the neural network model, denoted I_E. The residual image method subtracts each denoised image from the noise image pixel by pixel, so that each residual image carries the noise (and any detail) that the corresponding algorithm removed. The specific calculation is:

$$R_I = I - I_I$$

$$R_E = I - I_E$$

where R_I is the first noise residual image and R_E is the second noise residual image.
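As a minimal sketch of the residual image method above (array values are illustrative, not from the patent), the two residual images are obtained by pixel-wise subtraction:

```python
import numpy as np

def noise_residual(noisy, denoised):
    """Residual image method: subtract the denoised result from the noisy
    input, leaving the noise (plus any detail) the algorithm removed."""
    return noisy.astype(np.float64) - denoised.astype(np.float64)

# Toy 2x2 single-channel example
I   = np.array([[120., 130.], [140., 150.]])  # noise image I
I_I = np.array([[118., 131.], [139., 152.]])  # filter-based result
I_E = np.array([[121., 129.], [141., 148.]])  # network-based result

R_I = noise_residual(I, I_I)  # first noise residual image
R_E = noise_residual(I, I_E)  # second noise residual image
```

In practice I, I_I and I_E would be full H × W × 3 images; the subtraction is unchanged.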
Finally, the noise image, the first noise residual image and the second noise residual image are cascaded together according to color channels, and the cascaded images are input into the second neural network model for denoising. The second neural network model is a trained neural network model capable of fusion-denoising the cascaded noise image, first noise residual image and second noise residual image. It comprises a first layer of 64 convolution filters of size 3 × 3 × 9; ten repeatedly stacked middle layers, each a convolutional layer of 64 kernels of size 3 × 3 × 64, each followed by a batch normalization layer (BN) and a ReLU activation function; and finally a 3 × 3 × 64 convolutional layer that reconstructs the residual between the noise image and the target denoised image.
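The color-channel cascading step can be sketched as follows: for an H × W RGB noise image and its two 3-channel residual images, concatenating along the channel axis yields the 9-channel input that the 3 × 3 × 9 first-layer filters expect (a sketch; the channel-last layout is an assumption of this example):

```python
import numpy as np

H, W = 8, 8
rng = np.random.default_rng(0)
I   = rng.random((H, W, 3))   # noise image (RGB)
R_I = rng.random((H, W, 3))   # first noise residual image
R_E = rng.random((H, W, 3))   # second noise residual image

# Cascade according to color channel -> 9-channel input for the fusion network
fusion_input = np.concatenate([I, R_I, R_E], axis=-1)
```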
As shown in fig. 2, the method for obtaining the first denoised image further includes the following steps:
carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and adjusting the filtering parameters of the filtering algorithm by using the noise variance, and obtaining a first denoised image by using the filtering algorithm.
In the method, before the noise image is processed by the filtering algorithm, Bayer pattern down-sampling is performed on it to obtain sub-images. The Bayer pattern is a color pattern widely used in CCD and CMOS cameras. Taking an RGB image as an example, it contains red, green and blue color channels. As shown in fig. 3, positions 1, 5, 4, 8, 9, 13, 12 and 16 are green, positions 2, 6, 10 and 14 are red, and positions 3, 7, 11 and 15 are blue; sampling according to the Bayer pattern yields four sub-images. Down-sampling with the Bayer pattern is used mainly because the noise variance of a real noise image exhibits a Bayer pattern in its spatial distribution, which affects the noise estimation of the image. Noise estimation can adopt the uniform-region method or the block method of the prior art. The uniform-region method selects a uniform region in a sub-image and obtains the noise estimate from the standard deviation of that region. The block method assumes the image is composed of a large number of uniform small blocks and estimates the noise from the variance of each block. For a more accurate estimate, the embodiment of the present invention uses the block method and first constructs a noise variance map. FIG. 4 illustrates noise estimation: (a) is the input noise image; (b) is a normalized display of the noise variance map of the red channel, where the value at each point is the noise variance of the pixel at the corresponding position of the original noise image, and higher variance appears brighter; (c) is a magnified view of the boxed area in (b) — the noise variance in this area presents a Bayer pattern in space, verifying the rationality of Bayer pattern down-sampling.
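The Bayer-phase down-sampling and block-method noise estimation described above can be sketched with strided slicing (a sketch under stated assumptions: one channel only, and an illustrative 4 × 4 block size):

```python
import numpy as np

def bayer_downsample(channel):
    """Split one channel into its four Bayer-phase sub-images by taking
    every second pixel starting at offsets (0,0), (0,1), (1,0), (1,1)."""
    return [channel[0::2, 0::2], channel[0::2, 1::2],
            channel[1::2, 0::2], channel[1::2, 1::2]]

def block_noise_variance(sub, block=4):
    """Block-method sketch: treat the sub-image as a tiling of uniform
    small blocks and average the per-block variances as a noise estimate."""
    h, w = sub.shape
    variances = [np.var(sub[i:i + block, j:j + block])
                 for i in range(0, h - block + 1, block)
                 for j in range(0, w - block + 1, block)]
    return float(np.mean(variances))

channel = np.random.default_rng(1).random((16, 16))
subs = bayer_downsample(channel)
sigmas2 = [block_noise_variance(s) for s in subs]
```

A real implementation would build the full per-pixel noise variance map rather than a single scalar per sub-image; the slicing pattern, however, is exactly the 2 × 2 Bayer sampling of FIG. 3.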
As shown in fig. 2, adjusting the filter parameters of the filtering algorithm using the noise variance mainly involves two stages: collaborative hard-threshold filtering and collaborative Wiener filtering.
The cooperative hard threshold filtering includes:
dividing the sub-image into image blocks, traversing and searching similar blocks similar to the image blocks in the sub-image for each image block by taking the mean square error between the image blocks of the sub-image as a distance, and constructing a similar block group of the sub-image;
carrying out three-dimensional collaborative hard threshold filtering on the similar block group of the subimage to obtain a hard threshold filtering and denoising result of the subimage;
and reconstructing a complete image by using the hard threshold filtering and denoising result of each sub-image to obtain a primary denoising image.
It should be noted that, in the collaborative hard-threshold filtering method disclosed above, balancing the amount of computation against the filtering effect, the number of similar blocks searched in a sub-image may be set to 32; that is, 32 similar blocks constitute the similar block group of one sub-image, denoted $B_{3D}$. Hard-threshold filtering is then performed on it:

$$\hat{B}_{3D} = T_{3D}^{-1}\big(\Upsilon\big(T_{3D}(B_{3D})\big)\big)$$

where $T_{3D}$ and $T_{3D}^{-1}$ are the forward and inverse three-dimensional transforms: a two-dimensional DCT is first applied to each image block, and a Hadamard transform is then applied along the third dimension of the DCT-transformed blocks. $\Upsilon$ denotes the hard-threshold filtering operation:

$$\Upsilon(x) = \begin{cases} x, & |x| > thr \\ 0, & \text{otherwise} \end{cases}$$

with filtering threshold

$$thr = \lambda \cdot \bar{\sigma}$$

where $\bar{\sigma}$ is the noise variance mean of the current image block, computed from the noise variance at each position of the sub-image obtained in the noise estimation step — so the hard-threshold filtering formula is adjusted by the noise variance — and $\lambda$ is a constant. After the denoising results of the four sub-images are obtained, the preliminary denoised image of the whole noise image is reconstructed in the manner shown in FIG. 3 and denoted $R_{basic}$. FIG. 5 is a diagram of similar-block search, in which (b) illustrates the similar blocks found for (a).
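A minimal sketch of the adaptive hard-threshold step follows (the 3-D transform is omitted; the coefficient values and the value of λ here are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def hard_threshold(coeffs, thr):
    """Upsilon: zero every transform coefficient whose magnitude does not
    exceed the threshold; keep the remaining coefficients unchanged."""
    return np.where(np.abs(coeffs) > thr, coeffs, 0.0)

lam = 2.7                # constant lambda (illustrative value)
sigma_bar = 0.5          # noise variance mean of the current block group
thr = lam * sigma_bar    # adaptive filtering threshold, thr = lambda * sigma_bar

# Stand-in 3-D transform coefficients of a similar block group
group = np.array([[-2.0, 0.3],
                  [ 1.4, -1.2]])
filtered = hard_threshold(group, thr)
```

In the full algorithm this operator is applied to the DCT + Hadamard coefficients of $B_{3D}$, followed by the inverse transform $T_{3D}^{-1}$.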
After the collaborative hard threshold filtering is completed, collaborative wiener filtering is required to be performed on the preliminary de-noised image, and the collaborative wiener filtering includes:
dividing image blocks of the preliminary de-noised image, traversing and searching similar blocks of each image block on the preliminary de-noised image, and constructing a similar block group of the preliminary de-noised image;
dividing the noise image according to the image blocks of the preliminary de-noised image to obtain image blocks corresponding to the positions of the image blocks of the preliminary de-noised image, traversing similar blocks of the image blocks of the noise image, and constructing a similar block group of the noise image;
and carrying out three-dimensional collaborative wiener filtering on the similar block group of the noise image by taking the similar block group of the preliminary de-noised image as a guide to obtain a first de-noised image.
It should be noted that the similar block group constructed from the preliminary denoised image is denoted $B_{3D}^{basic}$ and the similar block group constructed from the noise image is denoted $B_{3D}^{noisy}$. With $B_{3D}^{basic}$ as the guide, the collaborative Wiener filtering of $B_{3D}^{noisy}$ is:

$$\hat{B}_{3D} = T_{3D}^{-1}\big(W \cdot T_{3D}(B_{3D}^{noisy})\big)$$

where $T_{3D}$ and $T_{3D}^{-1}$ respectively denote the forward and inverse three-dimensional frequency-domain transforms, and the Wiener shrinkage coefficients $W$ are computed as:

$$W = \frac{\big|T_{3D}(B_{3D}^{basic})\big|^2}{\big|T_{3D}(B_{3D}^{basic})\big|^2 + c \cdot \bar{\sigma}^2}$$

where $c$ is a constant and $\bar{\sigma}$ is the noise variance mean of the currently processed image block, computed from the noise variance at each position of the sub-image obtained in the noise estimation step; the Wiener filter formula is thereby adjusted by the noise variance.
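The Wiener shrinkage weights can be sketched as below (coefficient values and c = 1 are illustrative assumptions; the transforms themselves are omitted):

```python
import numpy as np

def wiener_weights(guide_coeffs, sigma_bar, c=1.0):
    """Empirical Wiener shrinkage: energy of the guide (preliminary denoised)
    group's transform coefficients over that energy plus the noise power."""
    energy = np.abs(guide_coeffs) ** 2
    return energy / (energy + c * sigma_bar ** 2)

guide = np.array([3.0, 1.0, 0.0])   # transform coefficients of the guide group
sigma_bar = 1.0                     # noise variance mean of the current block
W = wiener_weights(guide, sigma_bar)

noisy_coeffs = np.array([2.5, 1.5, 0.7])
shrunk = W * noisy_coeffs           # applied before the inverse transform
```

Strong guide coefficients are kept almost unchanged (W near 1), while coefficients the guide says are pure noise are suppressed toward zero.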
The method for acquiring the second denoised image further comprises the following steps:
denoising the noise image with a convolutional neural network to obtain the second denoised image, recording the loss computed in forward propagation as the first noise residual loss (Ext.Loss), and adopting the mean square error between the ideal noise-free image and the second denoised image as the loss function.
Specifically, the convolutional neural network denoises using a feedforward convolutional denoising neural network (DnCNN).
Combining the first noise residual loss (Ext.Loss) and the second noise residual loss (Fusion.Loss), the final total loss function of the image is:

$$L(\Theta) = \frac{1}{2N}\sum_{j=1}^{N}\Big(\big\|\hat{R}_j^{ext} - R_j\big\|_F^2 + \big\|\hat{R}_j^{fus} - R_j\big\|_F^2\Big)$$

where $R_j$ is the residual of the ideal noise-free image and the noise image, $j$ indexes the image blocks of each training batch of size $N$, the first term is the first noise residual loss ($\hat{R}_j^{ext}$ being the residual predicted by the first neural network model), the second term is the second noise residual loss ($\hat{R}_j^{fus}$ being the residual predicted by the second neural network model), $\Theta$ denotes the weight parameters of the neural networks, and $\|\cdot\|_F$ is the Frobenius norm. This loss function is used as the total loss of the whole noise image to evaluate the quality of the target denoised image.
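A sketch of the combined residual loss (the block values are toy data; the factor 1/(2N) and the sum of the two branch losses follow the total-loss formula above):

```python
import numpy as np

def residual_loss(predicted_residuals, ideal_residuals):
    """Mean squared Frobenius-norm error over the N image blocks of a batch."""
    n = len(predicted_residuals)
    return sum(np.linalg.norm(p - r, 'fro') ** 2
               for p, r in zip(predicted_residuals, ideal_residuals)) / (2 * n)

# Two 2x2 blocks: ideal residuals R_j and the two branches' predictions
R      = [np.array([[1., 0.], [0., 1.]]), np.zeros((2, 2))]
pred_1 = [np.array([[1., 0.], [0., 0.]]), np.zeros((2, 2))]  # ext branch
pred_2 = [np.array([[1., 0.], [0., 1.]]), np.zeros((2, 2))]  # fusion branch

ext_loss    = residual_loss(pred_1, R)   # first noise residual loss
fusion_loss = residual_loss(pred_2, R)   # second noise residual loss
total_loss  = ext_loss + fusion_loss     # total loss of the noise image
```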
As shown in fig. 6 and fig. 7, (a) is the original noise image, (b) is an image block of the noise image, (c) is the denoising result of the collaborative filtering algorithm alone, (d) is the denoising result of the DnCNN method alone, and (e) is the denoising result of the fusion denoising method disclosed in the embodiment of the present invention. Comparing (c) with (d), the image processed by the collaborative filtering algorithm retains detailed information such as texture better than the image processed by DnCNN, but noise still remains in it, while the DnCNN-processed image loses more detail than the filtered one. The denoising effect of (e) lies between the two: it achieves a good denoising effect while retaining the details of the image.
As shown in fig. 8, based on the image fusion denoising method, an embodiment of the present invention further provides an image fusion denoising device, including:
the first de-noising image acquisition module is used for processing the noise image by using a filtering algorithm to obtain a first de-noising image;
the second denoising image obtaining module is used for processing the noise image by utilizing the first neural network model to obtain a second denoising image;
the residual image calculation module is used for calculating a noise image and a first de-noised image by using a residual image method to obtain a first noise residual image, and calculating the noise image and a second de-noised image by using the residual image method to obtain a second noise residual image;
and the fusion denoising module is used for cascading the noise image, the first noise residual image and the second noise residual image together according to the color channel, and inputting the cascaded images into the second neural network model for denoising to obtain a target denoising image.
In the device, after the first denoised image obtaining module and the second denoised image obtaining module respectively obtain the first and second denoised images, these are input to the residual image calculation module to obtain the first and second noise residual images. Once the residual image calculation module finishes computing the first noise residual image and the second noise residual image, both are input into the fusion denoising module, where they are cascaded and fused for denoising to obtain the target denoised image.
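The residual computation and channel-wise cascading described above can be sketched as follows: each residual image is the noise image minus a denoised image, and the fusion network's input stacks all three along the color-channel axis (the 3-channel RGB layout, giving a 9-channel input, is an illustrative assumption):

```python
import numpy as np

def fusion_input(noisy, denoised1, denoised2):
    """Build the fusion network input: the noise image and the two
    noise residual images, cascaded along the color-channel axis."""
    residual1 = noisy - denoised1   # first noise residual image
    residual2 = noisy - denoised2   # second noise residual image
    return np.concatenate([noisy, residual1, residual2], axis=-1)

noisy = np.full((8, 8, 3), 0.5)
x = fusion_input(noisy, np.zeros((8, 8, 3)), np.ones((8, 8, 3)))
# x stacks three HxWx3 images into one HxWx9 tensor
```

The second neural network model then maps this cascaded tensor to the target denoised image.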
Corresponding to the method, the first denoised image obtaining module comprises:
the down-sampling module is used for carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
the noise estimation module is used for respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and the filtering and denoising module is used for adjusting the filtering parameters of the filtering algorithm by using the noise variance and obtaining a first denoised image by using the filtering algorithm.
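The first two modules above can be sketched as follows. The Bayer-pattern downsampling splits the image into the four sub-images defined by the 2×2 sampling grid; the median-absolute-deviation variance estimator is an illustrative assumption, as the patent does not specify the estimator:

```python
import numpy as np

def bayer_downsample(img):
    """Split a single-channel image into the four Bayer-pattern
    sub-images (the four sample positions of each 2x2 tile)."""
    return [img[0::2, 0::2], img[0::2, 1::2],
            img[1::2, 0::2], img[1::2, 1::2]]

def noise_variance(sub):
    """Illustrative robust noise estimate: MAD of horizontal first
    differences, converted to a Gaussian sigma, then squared."""
    d = np.diff(sub, axis=1).ravel()
    sigma = np.median(np.abs(d - np.median(d))) / 0.6745 / np.sqrt(2)
    return sigma ** 2

# Pure Gaussian noise with sigma = 0.1, so true variance is 0.01.
img = np.random.default_rng(1).normal(0.5, 0.1, (64, 64))
subs = bayer_downsample(img)
variances = [noise_variance(s) for s in subs]
```

Each estimated variance then drives the filter-parameter adjustment of the subsequent filtering and denoising module.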
And the filtering and denoising module comprises a collaborative hard threshold filtering and denoising module, a collaborative wiener filtering and denoising module, a similar block group construction module and a recombination module.
The similar block group building module is used for dividing the sub-image into image blocks, and traversing and searching similar blocks in the sub-image for each image block by taking the mean square error between the image blocks as the distance to build a similar block group of the sub-image. In fig. 8, a line without an arrow indicates the module attribution relationship, and a line with an arrow indicates the data transfer relationship.
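The traversal-based block matching described above, ranking candidate blocks by mean-square-error distance to the reference block, can be sketched as follows (the block size, group size, and search stride are illustrative parameters, not values fixed by the patent):

```python
import numpy as np

def find_similar_blocks(img, ref_xy, block=8, group_size=4, stride=4):
    """Traverse the image and return the top-`group_size` block
    positions most similar (by MSE distance) to the reference
    block at `ref_xy`."""
    ry, rx = ref_xy
    ref = img[ry:ry + block, rx:rx + block]
    candidates = []
    for y in range(0, img.shape[0] - block + 1, stride):
        for x in range(0, img.shape[1] - block + 1, stride):
            cand = img[y:y + block, x:x + block]
            mse = np.mean((cand - ref) ** 2)
            candidates.append((mse, y, x))
    candidates.sort(key=lambda t: t[0])
    return [(y, x) for _, y, x in candidates[:group_size]]

img = np.random.default_rng(2).random((32, 32))
group = find_similar_blocks(img, (0, 0))
```

The reference block itself always appears first in the group, since its distance to itself is zero.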
And the collaborative hard threshold filtering and denoising module is used for carrying out three-dimensional collaborative hard threshold filtering on the similar block group of the sub-image to obtain a hard threshold filtering and denoising result of the sub-image.
And the recombination module is used for reconstructing the hard threshold filtering denoising result of each sub-image into a complete image to obtain a primary denoising image.
The similar block group building module is further used for dividing the preliminary denoised image into image blocks and, for each block, traversing the preliminary denoised image to search for similar blocks, thereby building the similar block groups of the preliminary denoised image. It then divides the noise image according to the image blocks of the preliminary denoised image, so that each noise-image block corresponds in position to a block of the preliminary denoised image, and searches similar blocks for the noise-image blocks by traversal to build the similar block groups of the noise image.
And the collaborative wiener filtering denoising module is used for carrying out three-dimensional collaborative wiener filtering on the similar block group of the noise image by taking the similar block group of the preliminary denoising image as a guide to obtain a first denoising image.
In the device, the downsampling module passes the obtained sub-images to the noise estimation module. After performing noise estimation, the noise estimation module outputs the estimation result to the collaborative hard threshold filtering denoising module within the filtering and denoising module. The similar block group construction module outputs the similar block groups to both the collaborative hard threshold filtering denoising module and the collaborative wiener filtering denoising module for denoising. The hard threshold module outputs its denoising result to the recombination module, which constructs the preliminary denoised image, and the collaborative wiener filtering denoising module then performs collaborative wiener filtering guided by the preliminary denoised image to obtain the first denoised image.
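The collaborative hard-threshold step stacks a group of similar blocks into a 3-D array, transforms it, zeroes small coefficients, and inverts the transform, with the threshold scaled by the estimated noise level. A sketch using a 3-D FFT as the transform; the patent does not fix the transform, so both the FFT and the `lam * sigma * sqrt(N)` threshold rule are illustrative assumptions:

```python
import numpy as np

def collaborative_hard_threshold(group, sigma, lam=2.7):
    """3-D transform-domain hard thresholding of a stack of similar
    blocks (shape: n_blocks x block x block). Coefficients whose
    magnitude falls below lam * sigma * sqrt(N) are set to zero,
    which suppresses noise while keeping the shared structure."""
    coeffs = np.fft.fftn(group)
    thresh = lam * sigma * np.sqrt(group.size)
    coeffs[np.abs(coeffs) < thresh] = 0.0
    return np.real(np.fft.ifftn(coeffs))

# A group of four blocks sharing a constant structure plus noise.
rng = np.random.default_rng(3)
group = 1.0 + 0.1 * rng.standard_normal((4, 8, 8))
filtered = collaborative_hard_threshold(group, sigma=0.1)
```

Raising `sigma` (as estimated per sub-image) raises the threshold, which is how the noise estimate adjusts the filtering strength.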
Corresponding to the method, the second denoised image obtaining module is specifically configured to denoise the noise image with a convolutional neural network to obtain the second denoised image; the loss calculated by forward propagation is recorded as the first noise residual loss, and the loss function is the mean square error between the ideal noise-free image and the second denoised image.
In order to evaluate the quality of the target denoised image, the device disclosed by the embodiment of the invention further comprises: the target denoising image quality evaluation module is used for receiving and combining a first noise residual loss sent by the second denoising image acquisition module and a second noise residual loss sent by the fusion denoising module to calculate the total loss of the noise image, and evaluating the quality of the target denoising image by using the total loss, wherein the first noise residual loss is: the first neural network model processes the loss of forward propagation computation generated by the noise image, and the second noise residual loss is: the second neural network model processes the loss of forward propagation computations produced by the noisy image.
With the image fusion denoising device described above, a technician only needs to input the noise image to obtain the target denoised image after fusion denoising; no further manual participation is required, which improves image processing efficiency.
In addition, an embodiment of the present application further provides a computer system, including:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the image fusion denoising method described above.
Fig. 9 illustrates an architecture of a computer system, which may include, in particular, a processor 910, a video display adapter 911, a disk drive 912, an input/output interface 913, a network interface 914, and a memory 920. The processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, and the network interface 914 may be communicatively connected to the memory 920 via a communication bus 930.
The processor 910 may be implemented as a general-purpose CPU (central processing unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solution provided in the present application.
The memory 920 may be implemented in the form of ROM (read-only memory), RAM (random-access memory), a static storage device, a dynamic storage device, or the like. The memory 920 may store an operating system 921 for controlling the operation of the electronic device 900 and a basic input/output system (BIOS) 922 for controlling low-level operations of the electronic device 900. In addition, a web browser 923, a data storage management system 924, a device identification information processing system 925, and the like may also be stored. The device identification information processing system 925 may be an application program that implements the operations of the foregoing steps in this embodiment. In summary, when the technical solution provided in the present application is implemented by software or firmware, the relevant program code is stored in the memory 920 and invoked by the processor 910 for execution.
The input/output interface 913 is used to connect the input/output module to realize information input and output. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The network interface 914 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
The bus 930 includes a path to transfer information between the various components of the device, such as the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, and the memory 920.
In addition, the electronic device 900 may also obtain information of specific pickup conditions from the virtual resource object pickup condition information database 941 for performing condition judgment, and the like.
It should be noted that although the above device only shows the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, the memory 920, the bus 930, and so on, in a specific implementation the device may also include other components necessary for normal operation. Furthermore, it will be understood by those skilled in the art that the apparatus described above may also include only the components necessary to implement the solution of the present application, and not necessarily all of the components shown in the figures.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The embodiment of the invention has the beneficial effects that:
1. The method uses fusion to combine the noise images processed by the collaborative filtering algorithm and by the neural network model, obtaining a denoised image that achieves both a good denoising effect and preserved image detail, better guaranteeing the image denoising quality;
2. The invention performs Bayer-pattern downsampling to obtain sub-images before noise estimation, so that the noise estimation is more accurate and matches the actual noise distribution of the image;
3. The invention uses the noise estimate to adjust the parameters of the collaborative hard threshold filtering algorithm, making its denoising effect more accurate.
All the above-mentioned optional technical solutions can be combined arbitrarily to form the optional embodiments of the present invention, and are not described herein again.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. An image fusion denoising method is characterized by comprising the following steps:
processing a noise image by using a filtering algorithm to obtain a first de-noised image, and calculating the noise image and the first de-noised image by using a residual image method to obtain a first noise residual image;
processing the noise image by using a first neural network model to obtain a second denoised image, and calculating the noise image and the second denoised image by using the residual image method to obtain a second noise residual image;
cascading the noise image, the first noise residual image and the second noise residual image together according to a color channel, inputting the cascaded images into a second neural network model together for fusion denoising, and obtaining a target denoising image.
2. The image fusion denoising method of claim 1, wherein the processing a noise image by using a filtering algorithm to obtain a first denoised image comprises:
carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and adjusting the filtering parameters of the filtering algorithm by using the noise variance, and obtaining the first denoised image by using the filtering algorithm.
3. The image fusion denoising method of claim 2, wherein adjusting the filter parameters of the filter algorithm using the noise variance comprises:
dividing the sub-images into image blocks, and traversing and searching similar blocks of each image block in the sub-images by taking the mean square error between the image blocks of the sub-images as a distance to construct a similar block group of the sub-images;
performing three-dimensional collaborative hard threshold filtering on the similar block group of the sub-image to obtain a hard threshold filtering and denoising result of the sub-image, wherein a filtering threshold in a hard threshold filtering formula is adjusted according to the noise variance;
reconstructing a complete image by using the hard threshold filtering denoising result of each subimage to obtain a primary denoising image;
and carrying out collaborative wiener filtering on the preliminary de-noised image to obtain the first de-noised image.
4. The image fusion denoising method of claim 3, wherein the performing collaborative wiener filtering on the preliminary denoised image to obtain the first denoised image comprises:
dividing image blocks of the preliminary de-noised image, and traversing and searching similar blocks of each image block on the preliminary de-noised image to construct a similar block group of the preliminary de-noised image;
dividing the noise image according to the image blocks of the preliminary de-noised image to obtain image blocks corresponding to the positions of the image blocks of the preliminary de-noised image, and traversing and searching similar blocks for the image blocks of the noise image to construct a similar block group of the noise image;
and carrying out three-dimensional collaborative wiener filtering on the similar block group of the noise image by taking the similar block group of the preliminary de-noising image as a guide to obtain the first de-noising image, wherein the noise variance mean value in a wiener filtering formula is adjusted according to the noise variance.
5. The image fusion denoising method of claim 1, wherein the method further comprises: calculating the total loss of the noise image by combining a first noise residual loss and a second noise residual loss, and evaluating the quality of the target de-noised image by using the total loss, wherein the first noise residual loss is as follows: the first neural network model processes the loss of forward propagation computation generated by the noise image, and the second noise residual loss is: the second neural network model processes the loss of forward propagation computations produced by the noisy image.
6. An image fusion denoising apparatus, comprising:
the first de-noising image acquisition module is used for processing the noise image by using a filtering algorithm to obtain a first de-noising image;
the second denoising image obtaining module is used for processing the noise image by utilizing the first neural network model to obtain a second denoising image;
a residual image calculation module, configured to calculate the noise image and the first denoised image by using a residual image method to obtain a first noise residual image, and calculate the noise image and the second denoised image by using a residual image method to obtain a second noise residual image;
and the fusion denoising module is used for cascading the noise image, the first noise residual image and the second noise residual image together according to a color channel, inputting the cascaded images into a second neural network model together for fusion denoising, and obtaining a target denoising image.
7. The image fusion denoising apparatus of claim 6, wherein the apparatus further comprises:
the target de-noising image quality evaluation module is used for calculating the total loss of the noise image by combining a first noise residual loss and a second noise residual loss, and evaluating the quality of the target de-noising image by using the total loss, wherein the first noise residual loss is: the first neural network model processes the loss of forward propagation computation generated by the noise image, and the second noise residual loss is: the second neural network model processes the loss of forward propagation computations produced by the noisy image.
8. The image fusion denoising apparatus of claim 6, wherein the first denoising image obtaining module comprises:
the down-sampling module is used for carrying out Bayer pattern down-sampling on the noise image to obtain a sub-image;
the noise estimation module is used for respectively carrying out noise estimation on the sub-images to obtain the noise variance of each sub-image;
and the filtering and denoising module is used for adjusting the filtering parameters of the filtering algorithm by using the noise variance and obtaining a first denoised image by using the filtering algorithm.
9. A computer system, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the method of any of claims 1-5.
CN202010292705.0A 2020-04-15 2020-04-15 Image fusion denoising method, device and system Active CN111192226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010292705.0A CN111192226B (en) 2020-04-15 2020-04-15 Image fusion denoising method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010292705.0A CN111192226B (en) 2020-04-15 2020-04-15 Image fusion denoising method, device and system

Publications (2)

Publication Number Publication Date
CN111192226A CN111192226A (en) 2020-05-22
CN111192226B true CN111192226B (en) 2020-07-31

Family

ID=70710335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010292705.0A Active CN111192226B (en) 2020-04-15 2020-04-15 Image fusion denoising method, device and system

Country Status (1)

Country Link
CN (1) CN111192226B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815535B (en) * 2020-07-14 2023-11-10 抖音视界有限公司 Image processing method, apparatus, electronic device, and computer readable medium
WO2022163440A1 (en) * 2021-01-29 2022-08-04 富士フイルム株式会社 Information processing apparatus, imaging apparatus, information processing method, and program
CN112862717B (en) * 2021-02-10 2022-09-20 山东英信计算机技术有限公司 Image denoising and blurring method, system and medium
CN112907480B (en) * 2021-03-11 2023-05-09 北京格灵深瞳信息技术股份有限公司 Point cloud surface ripple removing method and device, terminal and storage medium
CN116385280B (en) * 2023-01-09 2024-01-23 爱芯元智半导体(上海)有限公司 Image noise reduction system and method and noise reduction neural network training method
CN115836867B (en) * 2023-02-14 2023-06-16 中国科学技术大学 Deep learning electroencephalogram noise reduction method, equipment and medium with double-branch fusion

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108764270B (en) * 2018-04-03 2022-06-14 上海大学 Information hiding detection method integrated by using convolutional neural network
CN109035163B (en) * 2018-07-09 2022-02-15 南京信息工程大学 Self-adaptive image denoising method based on deep learning
EP3857268A4 (en) * 2018-09-30 2022-09-14 ConocoPhillips Company Machine learning based signal recovery
CN111008943B (en) * 2019-12-24 2023-04-14 广州柏视医疗科技有限公司 Low-dose DR image noise reduction method and system

Also Published As

Publication number Publication date
CN111192226A (en) 2020-05-22

Similar Documents

Publication Publication Date Title
CN111192226B (en) Image fusion denoising method, device and system
EP3948764B1 (en) Method and apparatus for training neural network model for enhancing image detail
US11562498B2 (en) Systems and methods for hybrid depth regularization
CN108198154B (en) Image denoising method, device, equipment and storage medium
Liu et al. Automatic estimation and removal of noise from a single image
Mairal et al. Sparse representation for color image restoration
US8908989B2 (en) Recursive conditional means image denoising
US8818125B2 (en) Scene adaptive filter design for improved stereo matching
Rajput et al. Noise robust face hallucination algorithm using local content prior based error shrunk nearest neighbors representation
CN112541877B (en) Defuzzification method, system, equipment and medium for generating countermeasure network based on condition
KR20200140713A (en) Method and apparatus for training neural network model for enhancing image detail
CN107767358B (en) Method and device for determining ambiguity of object in image
CN111047543A (en) Image enhancement method, device and storage medium
Yeh et al. Single image dehazing via deep learning-based image restoration
Saleem et al. A non-reference evaluation of underwater image enhancement methods using a new underwater image dataset
CN115131229A (en) Image noise reduction and filtering data processing method and device and computer equipment
CN113723317B (en) Reconstruction method and device of 3D face, electronic equipment and storage medium
Chang et al. UIDEF: A real-world underwater image dataset and a color-contrast complementary image enhancement framework
Ponomaryov et al. Fuzzy color video filtering technique for sequences corrupted by additive Gaussian noise
Zhang et al. Deep joint neural model for single image haze removal and color correction
Zhao et al. Saliency map-aided generative adversarial network for raw to rgb mapping
Jung et al. Multispectral fusion of rgb and nir images using weighted least squares and convolution neural networks
US9077963B2 (en) Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data
CN114764803B (en) Noise evaluation method and device based on real noise scene and storage medium
CN108805816B (en) Hyperspectral image denoising method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210506

Address after: 210042 no.1-9 Suning Avenue, Xuanwu District, Nanjing City, Jiangsu Province (Jiangsu Province)

Patentee after: NANJING SUNING ELECTRONIC INFORMATION TECHNOLOGY Co.,Ltd.

Address before: No.1-1 Suning Avenue, Xuzhuang Software Park, Xuanwu District, Nanjing, Jiangsu Province, 210000

Patentee before: Suning Cloud Computing Co.,Ltd.