CN116091364B - Image blurring processing method, device, electronic equipment and storage medium


Info

Publication number
CN116091364B
Authority
CN
China
Prior art keywords
image
pixel
blurred
saturated
original
Prior art date
Legal status
Active
Application number
CN202310362201.5A
Other languages
Chinese (zh)
Other versions
CN116091364A (en)
Inventor
沈孝慈
郭辉
唐波
徐乙峰
黄鑫
林森
Current Assignee
Tianjin Hanyun Industrial Internet Co ltd
XCMG Hanyun Technologies Co Ltd
Original Assignee
Tianjin Hanyun Industrial Internet Co ltd
XCMG Hanyun Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Tianjin Hanyun Industrial Internet Co ltd, XCMG Hanyun Technologies Co Ltd
Priority to CN202310362201.5A
Publication of CN116091364A
Application granted
Publication of CN116091364B


Classifications

    • G06T 5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics

Abstract

The invention provides an image blurring processing method, an image blurring processing device, electronic equipment and a storage medium. Saturated pixels are located in a plurality of original images continuously shot by a camera, and the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image is used to control the density of frame interpolation between adjacent original images, so that averaging the pixel values of the original images and the interpolated images does not excessively weaken the pixel value intensity of potential saturated pixels in the resulting first blurred image. Saturated pixel synthesis processing is then carried out on the first blurred image based on the positions of the saturated pixels in each original image to obtain a second blurred image, noise is added to the second blurred image, and the result is converted into sRGB format to obtain the final blurred image, reducing the difference between the final blurred image and a real blurred image.

Description

Image blurring processing method, device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image blurring processing method, an image blurring processing device, an electronic device, and a storage medium.
Background
In various image segmentation or image analysis scenarios, the image samples used for segmentation or analysis often need to be sharp in order to avoid missed or false detections. In many cases, however, the captured image is blurred by camera shake during exposure or by movement of the subject, so a deblurring operation on the image samples is required to guarantee the effect of downstream tasks.
The image deblurring approaches currently in use mostly rely on deep learning: a network is trained with pairs of sharp and blurred images so that it can autonomously learn the mapping between them and thereby acquire the ability to convert a blurred image into a sharp one. However, the training effect of a deep learning network depends on the number of training samples; the more training samples, the better the training effect, and with too few samples the model overfits, resulting in poor deblurring performance. The number of samples in existing blurred-image data sets can hardly meet the training requirements of current deep learning networks with huge numbers of parameters, and generating blurred images by actually shooting them involves too much work and collects samples too slowly. How to conveniently generate the required number of blurred images is therefore a problem that needs to be solved. Unfortunately, current methods of generating blurred images convert sharp images into blurred images with a single blurring algorithm (e.g., Gaussian blur); the resulting blur patterns are too uniform and differ too much from the complex blur found in real environments, so models trained on such data deblur poorly in real scenes.
Disclosure of Invention
The invention provides an image blurring processing method, an image blurring processing device, electronic equipment and a storage medium, which are used for overcoming the defect in the prior art that generated blurred images follow too uniform a pattern and differ too much from complex blurred images in a real environment.
The invention provides an image blurring processing method, which comprises the following steps:
acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; the saturated pixels in any image are pixels whose normalized pixel value is 1;
performing frame interpolation processing on the continuous original images based on the pixel values of the saturated pixels in each original image and the pixel values of the corresponding pixels in adjacent original images, to obtain a plurality of interpolated images; the larger the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in an adjacent original image, the more interpolated images are generated between that original image and the adjacent original image;
converting the original images and the interpolated images into a RAW format and then carrying out pixel value averaging processing to obtain a first blurred image;
carrying out saturated pixel synthesis processing on the first blurred image based on the positions of the saturated pixels in each original image, to obtain a second blurred image;
and adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
According to the image blurring processing method provided by the invention, carrying out saturated pixel synthesis processing on the first blurred image based on the positions of the saturated pixels in each original image to obtain a second blurred image specifically comprises the following steps:
establishing saturated pixel masks corresponding to each original image respectively based on the positions of saturated pixels in each original image;
determining potential saturated pixel masks which correspond to each original image in common based on the saturated pixel masks which correspond to each original image respectively; the positions of pixels in the potential saturated pixel mask are located in a saturated pixel mask area corresponding to at least one original image;
and enhancing the pixel values of pixels in the first blurred image that are located in the potential saturated pixel mask area, so as to synthesize saturated pixels in the first blurred image and obtain a second blurred image.
According to the image blurring processing method provided by the invention, enhancing the pixel values of pixels in the first blurred image located in the potential saturated pixel mask area to synthesize saturated pixels in the first blurred image and obtain the second blurred image specifically comprises the following steps:
scaling pixel values of pixels in the potential saturated pixel mask based on a preset coefficient to obtain a saturated pixel synthetic map;
and superposing pixel values at the same position in the first blurred image and the saturated pixel synthetic image to obtain a second blurred image.
According to the image blurring processing method provided by the invention, the third blurred image is converted into the sRGB format to obtain the final blurred image, and the method specifically comprises the following steps:
performing image feature coding on the third blurred image based on a plurality of feature extraction modules to obtain a depth feature vector of the third blurred image;
selecting an intermediate original image from the continuous multiple original images as a reference image;
after Gaussian blur is carried out on the reference image, color feature codes of the Gaussian blurred reference image are extracted based on two feature extraction modules;
fusing the depth feature vector of the third blurred image with the color feature code of the Gaussian blurred reference image to obtain the color feature vector of the third blurred image;
And processing the color feature vector of the third blurred image based on a feature extraction module, a convolution module and an up-sampling module to obtain a final blurred image in an sRGB format.
According to the image blurring processing method provided by the invention, the depth feature vector of the third blurred image is fused with the color feature code of the reference image after Gaussian blurring to obtain the color feature vector of the third blurred image, and the method specifically comprises the following steps:
performing point multiplication on the depth feature vector of the third blurred image and the color feature code of the Gaussian blurred reference image to obtain a color association vector of the third blurred image;
and fusing the depth feature vector of the third blurred image with the color association vector of the third blurred image to obtain the color feature vector of the third blurred image.
According to the image blurring processing method provided by the invention, the original image and the interpolation image are converted into the RAW format, and the method specifically comprises the following steps:
performing image feature coding on any image based on a plurality of feature extraction modules to obtain a depth feature vector of any image; the arbitrary image is an original image or an interpolation image;
Performing convolution operation on the depth feature vector of any image based on a convolution module to obtain a converted image of any image;
sampling the converted image of any image, and removing the color channel of the converted image of any image to convert any image into a RAW format.
According to the image blurring processing method provided by the invention, the feature extraction module comprises a plurality of serially connected dual-attention modules and a convolution module; any dual attention module comprises a space attention module and a channel attention module; a jump connection is established between the input of the first dual-attention module and the output of the convolution module.
The invention also provides an image blurring processing device, which comprises:
the saturated pixel acquisition unit is used for acquiring a plurality of continuous original images shot by the camera and determining saturated pixels in each original image; the saturated pixels in any image are pixels whose normalized pixel value is 1;
a frame interpolation unit, configured to perform frame interpolation processing on the continuous original images based on the pixel values of the saturated pixels in each original image and the pixel values of the corresponding pixels in adjacent original images, to obtain a plurality of interpolated images; the larger the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in an adjacent original image, the more interpolated images are generated between that original image and the adjacent original image;
the first blurring unit is used for converting the original images and the interpolated images into a RAW format and then carrying out pixel value averaging processing to obtain a first blurred image;
the second blurring unit is used for carrying out saturated pixel synthesis processing on the first blurred image based on the positions of the saturated pixels in each original image to obtain a second blurred image;
and the third blurring unit is used for adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the image blurring processing method according to any of the above when executing the program.
The present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image blur processing method as described in any one of the above.
The invention also provides a computer program product comprising a computer program which when executed by a processor implements the image blur processing method as described in any one of the above.
The image blurring processing method, device, electronic equipment and storage medium provided by the invention locate saturated pixels in a plurality of original images continuously shot by a camera and use the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image to control the density of frame interpolation between adjacent original images. This ensures, as far as possible, that the pixel value intensity of potential saturated pixel points in the first blurred image, obtained by averaging the pixel values of the original images and the interpolated images converted into RAW format, is not excessively weakened by the averaging operation; when saturated pixel synthesis processing is subsequently performed on the first blurred image, the potential saturated pixel points are therefore not ignored during synthesis, which improves the synthesis effect of the saturated pixels and of the blurred image as a whole. Saturated pixel synthesis processing is carried out on the first blurred image based on the positions of the saturated pixels in each original image to obtain a second blurred image, noise is added to the second blurred image to obtain a third blurred image, and the third blurred image is converted into sRGB format to obtain the final blurred image, which reduces the difference between the final blurred image and a real blurred image and improves the training effect of an image deblurring model.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an image blurring processing method provided by the invention;
FIG. 2 is a schematic flow chart of a saturated pixel synthesis method provided by the invention;
FIG. 3 is a schematic flow chart of a format conversion method according to the present invention;
FIG. 4 is a second flow chart of the format conversion method according to the present invention;
fig. 5 is a schematic diagram of the structure of an image blurring processing device provided by the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic flow chart of an image blurring processing method provided by the present invention, as shown in fig. 1, the method includes:
step 110, acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; the saturated pixels in any image are pixels whose normalized pixel value is 1;
step 120, performing frame interpolation processing on the continuous original images based on the pixel values of the saturated pixels in each original image and the pixel values of the corresponding pixels in adjacent original images, so as to obtain a plurality of interpolated images; the larger the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in an adjacent original image, the more interpolated images are generated between that original image and the adjacent original image;
step 130, converting the original images and the interpolated images into a RAW format, and then carrying out pixel value averaging processing to obtain a first blurred image;
step 140, performing saturated pixel synthesis processing on the first blurred image based on the positions of saturated pixels in each original image to obtain a second blurred image;
and step 150, adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
Specifically, a plurality of sharp original images continuously shot by the camera are obtained; the original images may be in sRGB format, and there is a certain temporal continuity among them. Each original image is searched to locate its saturated pixels, where a saturated pixel is a pixel whose normalized pixel value is 1 in the corresponding image. For example, the pixel values of any original image may be read and normalized so that the value of each pixel lies in the [0,1] interval, and each pixel is then traversed to find the saturated pixels in the image. The saturated pixels of each original image are located because, on the one hand, their positions can be used to determine the density of the subsequent frame interpolation and, on the other hand, the positions of the saturated pixels to be synthesized in the blurred image can be determined from them. In both cases the final purpose is to synthesize, in the blurred image, saturated pixels close to those of a real blurred image, thereby reducing the difference between the generated blurred image and a real blurred image; the specific mechanism of action of the saturated pixels of each original image will be described later.
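The search described above can be illustrated with a short sketch; it assumes the originals are 8-bit sRGB frames held as NumPy arrays, and the function name and the per-channel saturation test are illustrative choices rather than details taken from the patent:

```python
import numpy as np

def find_saturated_pixels(image_8bit: np.ndarray) -> np.ndarray:
    """Return a boolean mask marking saturated pixels (normalized value of 1)."""
    # Normalize 8-bit sRGB values into the [0, 1] interval.
    normalized = image_8bit.astype(np.float32) / 255.0
    # A pixel counts as saturated if any of its channels reaches the maximum.
    return np.any(normalized >= 1.0, axis=-1)
```

A list comprehension such as `masks = [find_saturated_pixels(img) for img in original_images]` would then give one saturation mask per original image.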
In order to simulate the blurring factors in a real scene, the original images can be fused to generate a motion blur effect. In many cases, however, there are large differences between original images continuously shot by the camera because of the frame rate and other factors, and directly fusing the original images may produce a poor blurring effect. Therefore, a frame interpolation operation can be performed on the original images to obtain a plurality of interpolated images that compensate for the larger differences between adjacent original images. In addition, considering that blur usually occurs in the RAW domain in a real scene, the pixel value averaging processing can be performed after the original images and the interpolated images are converted into RAW format, yielding the first blurred image. When determining the interpolation density between adjacent images (i.e., the number of images newly inserted between them), the usual consideration is to interpolate as many frames as possible so as to minimize the difference in pixel values at the same position of adjacent images (including original and interpolated images) and thus synthesize a smooth intermediate image.
However, when the embodiment of the present invention applies the frame interpolation operation to image blurring processing, the main consideration is not image smoothness but how to generate a synthetic blurred image that is closer to a real blurred image. The embodiment of the present invention observes that one very important characteristic of real blurred images is the presence of saturated pixels, so when the original images are interpolated and the resulting images are averaged to obtain the first blurred image, it must be ensured that the pixel values of the potential saturated pixels in the first blurred image are not excessively weakened. To this end, for any two adjacent original images, the number of interpolated images between them may be determined based on the difference between the pixel value of a saturated pixel in one of the original images and the pixel value at the same position in the other original image: the larger this difference, the more interpolated images are generated between the two original images. By controlling the interpolation density with this difference, it can be ensured that the pixel value intensity of the potential saturated pixel points in the first blurred image (a potential saturated pixel in the first blurred image corresponds to a saturated pixel in at least one original image), obtained by averaging the pixel values of the original images and interpolated images converted into RAW format, is not excessively weakened by the averaging operation. When saturated pixel synthesis processing is subsequently performed on the first blurred image, the potential saturated pixel points are therefore not ignored during synthesis, which improves the synthesis effect of the saturated pixels.
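How the interpolation-density rule might look in code is sketched below. The patent only states that a larger pixel-value difference at saturated positions should yield more interpolated images, so the proportional mapping, the channel-wise maximum, and the base/max_extra parameters are all assumptions; the interpolated frames themselves could be produced by any frame interpolation method, which the text does not prescribe:

```python
import numpy as np

def num_interpolated_frames(img_a, img_b, sat_mask_a, sat_mask_b,
                            base=1, max_extra=15):
    """Decide how many frames to insert between two adjacent original images.

    The larger the pixel-value difference at saturated positions, the more
    interpolated frames are generated (a simple proportional rule here).
    """
    sat_positions = sat_mask_a | sat_mask_b
    if not sat_positions.any():
        return base
    diff = np.abs(img_a.astype(np.float32) - img_b.astype(np.float32)) / 255.0
    # Average per-pixel difference (max over channels) at saturated positions.
    mean_diff = float(diff.max(axis=-1)[sat_positions].mean())
    return base + int(round(mean_diff * max_extra))
```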
After the first blurred image is obtained, the potential saturated pixels in the first blurred image can be determined based on the positions of the saturated pixels in each original image; the positions of the potential saturated pixels are the positions of the saturated pixels to be synthesized in the first blurred image. The potential saturated pixels in the first blurred image are enhanced to synthesize saturated pixels in the first blurred image, giving the second blurred image. Considering that a real blurred image contains a certain amount of noise, noise can be added to the second blurred image to make the synthesized blur as realistic as possible, yielding the third blurred image; for example, Poisson noise and Gaussian noise may be added to the second blurred image to simulate the noise in a real image. The third blurred image is then converted into the same sRGB format as the original images, giving the final blurred image.
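A hedged sketch of the noise-addition step is given below; it assumes the second blurred image is a RAW-domain array normalized to [0, 1], and the photon scale and read-noise standard deviation are purely illustrative parameters:

```python
import numpy as np

def add_raw_noise(raw_image, photon_scale=1000.0, read_noise_std=0.002, seed=0):
    """Add Poisson (shot) and Gaussian (read) noise to a normalized RAW image."""
    rng = np.random.default_rng(seed)
    clean = np.clip(raw_image, 0.0, 1.0)
    # Poisson shot noise: treat values as pseudo photon counts, sample, rescale.
    shot = rng.poisson(clean * photon_scale).astype(np.float32) / photon_scale
    # Gaussian read noise on top of the shot-noise image.
    noisy = shot + rng.normal(0.0, read_noise_std, size=raw_image.shape)
    return np.clip(noisy, 0.0, 1.0)
```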
According to the method provided by the embodiment of the invention, saturated pixels are located in a plurality of original images continuously shot by the camera, and the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image is used to control the density of frame interpolation between adjacent original images. This ensures that the pixel value intensity of the potential saturated pixel points in the first blurred image, obtained by averaging the original images and the interpolated images converted into RAW format, is not excessively weakened by the averaging operation, so that when saturated pixel synthesis processing is performed on the first blurred image, the potential saturated pixel points are not ignored during synthesis, improving the synthesis effect of the saturated pixels and of the blurred image as a whole. Saturated pixel synthesis processing is carried out on the first blurred image based on the positions of the saturated pixels in each original image to obtain the second blurred image, noise is added to the second blurred image to obtain the third blurred image, and the third blurred image is converted into sRGB format to obtain the final blurred image, which reduces the difference between the final blurred image and a real blurred image and improves the training effect of an image deblurring model.
Based on the above embodiment, as shown in fig. 2, based on the positions of the saturated pixels in each original image, the saturated pixels synthesis process is performed on the first blurred image to obtain a second blurred image, which specifically includes:
step 210, based on the positions of the saturated pixels in each original image, establishing saturated pixel masks corresponding to each original image respectively;
step 220, determining potential saturated pixel masks corresponding to each original image in common based on the saturated pixel masks corresponding to each original image respectively; the positions of pixels in the potential saturated pixel mask are located in a saturated pixel mask area corresponding to at least one original image;
and step 230, enhancing pixel values of pixels in the first blurred image, which are located in the potential saturated pixel mask area, so as to synthesize saturated pixels in the first blurred image, and obtain a second blurred image.
Specifically, a saturated pixel mask corresponding to any original image is constructed according to the positions of the saturated pixels in that image: the pixels of the mask correspond to the saturated pixels of the image, and the pixel value of each pixel in the mask may be a preset value (for example, the highest pixel value of a RAW-format image). The saturated pixel masks of the individual original images are then fused to determine the potential saturated pixel mask that corresponds to all the original images jointly. The position of a pixel in the potential saturated pixel mask (i.e., a potential saturated pixel) lies in the mask region of at least one original image; that is, it coincides with the position of a saturated pixel in at least one original image. The pixel value of each pixel in the potential saturated pixel mask may be determined from the pixel values of the corresponding pixel in the individual saturated pixel masks; for example, for any pixel of the potential mask, the average of its values over the individual saturated pixel masks (taking the value 0 where the pixel is outside a mask region) may be computed and used as its pixel value in the potential saturated pixel mask.
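A sketch of this mask construction, following the averaging rule just described; treating the preset mask value as the maximum of a normalized RAW range is one of the options the text mentions, and the helper name is hypothetical:

```python
import numpy as np

RAW_MAX = 1.0  # assumed preset mask value: highest value of a normalized RAW image

def potential_saturated_mask(sat_masks):
    """Fuse per-image saturated-pixel masks into the joint potential mask.

    Each boolean mask marks the saturated pixels of one original image; the
    value of the potential mask at a pixel is the average of that pixel's
    values over the individual masks (RAW_MAX where saturated, 0 elsewhere).
    """
    stacked = np.stack([m.astype(np.float32) * RAW_MAX for m in sat_masks])
    return stacked.mean(axis=0)  # non-zero wherever at least one image saturates
```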
Based on the obtained potential saturated pixel mask region, the pixel value of the pixel in the potential saturated pixel mask region in the first blurred image can be pertinently enhanced, so that the pixel value of the pixel in the potential saturated pixel mask region in the first blurred image reaches or approaches to the maximum pixel value of the RAW format image, and saturated pixels in a blurred image are synthesized to obtain a second blurred image.
Based on any of the above embodiments, enhancing the pixel value of the pixel in the first blurred image in the potentially saturated pixel mask area to synthesize the saturated pixel in the first blurred image to obtain a second blurred image, specifically includes:
scaling pixel values of pixels in the potential saturated pixel mask based on a preset coefficient to obtain a saturated pixel synthetic map;
and superposing pixel values at the same position in the first blurred image and the saturated pixel synthetic image to obtain a second blurred image.
Specifically, when the pixel values of the pixels located in the region of the potential saturated pixel mask in the first blurred image are emphasized, the pixel values of the pixels in the potential saturated pixel mask may be scaled based on a preset coefficient first, so as to obtain a saturated pixel composite image. The preset coefficient may be determined based on the pixel values of the pixels in the potentially saturated pixel mask, and if the pixel values of the pixels in the potentially saturated pixel mask are distributed at a higher level (for example, the average value or the median of the pixel values of the pixels in the potentially saturated pixel mask is higher than a certain pixel threshold), the preset coefficient may be set to a value greater than 1, otherwise, the preset coefficient may be set to a value less than 1. Here, the initial value of the preset coefficient may be set to 1, the initial value of the preset coefficient may be adjusted based on the average value or the median of the pixel values of each pixel in the potentially saturated pixel mask and the difference between the pixel threshold values, for example, after normalizing the difference to be within the [ -1,1] interval, the normalized difference is added to the initial value of the preset coefficient to obtain an adjusted preset coefficient, and the adjusted preset coefficient is used to perform scaling processing on the pixel value of each pixel in the potentially saturated pixel mask. And then, overlapping the pixel values at the same position in the first blurred image and the saturated pixel synthetic image to obtain a second blurred image. It should be noted that, if the pixel value of the pixel at any position after superposition is higher than the highest pixel value of the RAW format image or lower than the lowest pixel value of the RAW format image, the pixel value of the pixel at the position is set as the highest pixel value or the lowest pixel value of the RAW format image correspondingly.
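The scaling-and-superposition step might be sketched as follows. The initial coefficient of 1, the normalization of the difference into [-1, 1], and the clamping to the RAW range follow the description above, while the concrete pixel threshold and the use of the mean (rather than the median) are assumptions:

```python
import numpy as np

RAW_MIN, RAW_MAX = 0.0, 1.0   # assumed normalized RAW value range
PIXEL_THRESHOLD = 0.8         # illustrative threshold separating "high" mask levels

def synthesize_saturated_pixels(first_blurred, potential_mask):
    """Scale the potential mask, superpose it on the first blurred image, clamp."""
    nonzero = potential_mask[potential_mask > 0]
    if nonzero.size == 0:
        return first_blurred.copy()
    # Start the preset coefficient at 1 and adjust it by the normalized
    # difference between the mask's mean level and the pixel threshold.
    diff = float(nonzero.mean()) - PIXEL_THRESHOLD
    coeff = 1.0 + float(np.clip(diff / (RAW_MAX - RAW_MIN), -1.0, 1.0))
    saturated_map = potential_mask * coeff
    second_blurred = first_blurred + saturated_map
    # Superposed values outside the RAW range are set to the range limits.
    return np.clip(second_blurred, RAW_MIN, RAW_MAX)
```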
Based on any of the above embodiments, as shown in fig. 3, the converting the third blurred image into sRGB format to obtain a final blurred image specifically includes:
step 310, performing image feature encoding on the third blurred image based on a plurality of feature extraction modules to obtain a depth feature vector of the third blurred image;
step 320, selecting an intermediate original image of the continuous multiple original images as a reference image;
step 330, after performing gaussian blur on the reference image, extracting color feature codes of the reference image after gaussian blur based on two feature extraction modules;
step 340, fusing the depth feature vector of the third blurred image with the color feature code of the reference image after Gaussian blur to obtain the color feature vector of the third blurred image;
and 350, processing the color feature vector of the third blurred image based on a feature extraction module, a convolution module and an up-sampling module to obtain a final blurred image in the sRGB format.
Specifically, when converting an image in a RAW format into an image in an sRGB format, a method of mapping the RAW format and the sRGB format using an image signal processing algorithm of a camera is generally adopted. However, in the case where the image signal processing algorithm of the camera is unknown, how to accurately convert the third blurred image in RAW format into the final blurred image in sRGB format is a difficult problem. One solution is to train one conversion network for each possible image signal processing algorithm separately, however this solution is not scalable and has poor generalization capability. In view of the fact that the largest difference between the image in the RAW format and the image in the sRGB format is color information, the color information can be added on the basis of the third blurred image in the RAW format after the third blurred image in the RAW format is subjected to feature encoding, and the final blurred image in the sRGB format is obtained by performing integral feature encoding after the color information is superimposed, so that the problem that the image is accurately converted from the RAW domain to the sRGB domain under the condition that an image signal processing algorithm is unknown is solved.
Specifically, image feature encoding may be performed on the third blurred image by a plurality of feature extraction modules (denoted F1, F2, ..., Fn) to obtain the depth feature vector of the third blurred image, which contains the semantic information of each layer of the third blurred image. Since color information is lacking in the RAW-format third blurred image, a source of color information may be selected from the continuous original images described above. Considering that the original images are continuous and the third blurred image is obtained by pixel-averaging the original images and the interpolated images, an original image close to the third blurred image may be selected as the reference image providing color information; for example, the intermediate original image located at the center of the continuous sequence may be chosen (e.g., for consecutive original images I1, I2, ..., I9, the middle image I5). The reference image is then Gaussian-blurred; this strong blurring operation retains essentially only the color information, while semantic information such as structural content and fine texture is provided by the depth feature vector of the third blurred image. The color feature code of the Gaussian-blurred reference image is extracted by two feature extraction modules (the feature extraction modules used for encoding the third blurred image can be reused, e.g., F1 and F2). Because the Gaussian-blurred reference image has filtered out most semantic information other than color, a small number of feature extraction modules suffice to extract its features, so the color information of the reference image can be obtained efficiently and the color feature code constructed.
After the color feature codes are extracted from the reference image, the depth feature vector of the third blurred image can be fused with the color feature codes of the reference image after Gaussian blur, so that fusion of image structure content information, fine texture information and the like with the color information is realized, and the color feature vector of the third blurred image is obtained. And performing feature arrangement and scale transformation on the color feature vector of the third blurred image based on a feature extraction module (Fn+1), a convolution module and an up-sampling module to obtain a final blurred image in the sRGB format.
Based on any one of the above embodiments, fusing the depth feature vector of the third blurred image with the color feature code of the reference image after gaussian blur to obtain the color feature vector of the third blurred image, specifically including:
performing point multiplication on the depth feature vector of the third blurred image and the color feature code of the Gaussian blurred reference image to obtain a color association vector of the third blurred image;
and fusing the depth feature vector of the third blurred image with the color association vector of the third blurred image to obtain the color feature vector of the third blurred image.
Specifically, the depth feature vector of the third blurred image is multiplied element-wise by the color feature code of the Gaussian-blurred reference image, which superimposes the color information and yields the color association vector of the third blurred image. The depth feature vector of the third blurred image is then fused with this color association vector to obtain the color feature vector of the third blurred image; the fusion can be realized by adding the depth feature vector of the third blurred image and its color association vector.
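In tensor terms this fusion is an element-wise product followed by an addition; a minimal sketch, assuming the two feature maps already share the same shape:

```python
import torch

def fuse_color_features(depth_feat: torch.Tensor, color_code: torch.Tensor) -> torch.Tensor:
    """Fuse the third blurred image's depth features with the reference color code.

    depth_feat: depth feature vector of the third blurred image.
    color_code: color feature code of the Gaussian-blurred reference image.
    """
    color_assoc = depth_feat * color_code   # element-wise (point) multiplication
    return depth_feat + color_assoc         # fusion by addition
```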
Based on any of the above embodiments, as shown in fig. 4, the converting the original image and the interpolated image into a RAW format specifically includes:
step 410, performing image feature encoding on any image based on a plurality of feature extraction modules to obtain a depth feature vector of the any image; the arbitrary image is an original image or an interpolation image;
step 420, performing convolution operation on the depth feature vector of any image based on a convolution module to obtain a converted image of any image;
step 430, sampling the converted image of the any image, and removing the color channel of the converted image of the any image, so as to convert the any image into a RAW format.
Specifically, image feature encoding is performed on the original image or interpolated image to be converted by a plurality of feature extraction modules (denoted S1, S2, ..., Sk), giving the depth feature vector of the corresponding image. The depth feature vector of the original or interpolated image contains the semantic information of each layer, including color information (because an sRGB image contains rich color features).
Note that the feature extraction module employed in the branch for converting the sRGB image into the RAW image is the same as the feature extraction module employed in the branch for converting the RAW image into the sRGB image in structure, but the two branches are trained separately, and thus the parameters of the feature extraction modules in the two branches are different. The branch for converting the sRGB image into the RAW image can be obtained by adjusting model parameters by calculating the difference between a predicted image obtained by converting the sample image by the branch and a real converted image based on the sample image in the sRGB format and the real converted image in the RAW format; similarly, the branch for converting the RAW image into the sRGB image may be obtained by adjusting the model parameters by calculating the difference between the predicted image obtained by converting the sample image by the branch and the real converted image based on the sample image in the RAW format and the real converted image in the sRGB format thereof.
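A training step for the sRGB-to-RAW branch could then look roughly as below; the L1 difference is an illustrative choice, since the text only says the parameters are adjusted according to the difference between the predicted and the real converted image, and the RAW-to-sRGB branch would be trained symmetrically with the roles of the two formats swapped:

```python
import torch
import torch.nn.functional as F

def train_step(branch, optimizer, sample_srgb, real_raw):
    """One optimisation step for the sRGB-to-RAW branch.

    The branch predicts a RAW image from an sRGB sample; its parameters are
    adjusted to shrink the difference from the real converted image.
    """
    optimizer.zero_grad()
    pred_raw = branch(sample_srgb)
    loss = F.l1_loss(pred_raw, real_raw)  # difference between prediction and real conversion
    loss.backward()
    optimizer.step()
    return loss.item()
```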
And then, performing convolution calculation on the depth feature vector of the original image or the interpolation image by using a convolution module to realize the demosaicing effect. In the above manner, the effects of tone mapping, gamma correction, color correction, white balance, and other conversions can be reversed. Then, the converted image of the original image or the interpolated image is sampled, and color channels of the converted image of the original image or the interpolated image are removed to convert the original image or the interpolated image into a RAW format. Here, sampling may be performed using a Bayer sampling function, omitting two color channels at each pixel.
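The Bayer-style sampling that keeps one colour channel per pixel might look as follows; the RGGB layout is an assumption, as the text only states that two colour channels are omitted at each pixel:

```python
import numpy as np

def bayer_sample(rgb_image):
    """Mosaic an H x W x 3 image into a single-channel RAW-like plane (RGGB assumed)."""
    h, w, _ = rgb_image.shape
    raw = np.zeros((h, w), dtype=rgb_image.dtype)
    raw[0::2, 0::2] = rgb_image[0::2, 0::2, 0]  # R at even rows, even columns
    raw[0::2, 1::2] = rgb_image[0::2, 1::2, 1]  # G at even rows, odd columns
    raw[1::2, 0::2] = rgb_image[1::2, 0::2, 1]  # G at odd rows, even columns
    raw[1::2, 1::2] = rgb_image[1::2, 1::2, 2]  # B at odd rows, odd columns
    return raw
```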
Based on any of the above embodiments, the feature extraction module includes a plurality of dual-attention modules and a convolution module connected in series; any dual attention module comprises a space attention module and a channel attention module; a jump connection is established between the input of the first dual-attention module and the output of the convolution module.
Specifically, the feature extraction module employed in the branch that converts sRGB images into RAW images has the same structure as the feature extraction module employed in the branch that converts RAW images into sRGB images: each feature extraction module comprises a plurality of serially connected dual-attention modules and a convolution module. Any dual-attention module includes a spatial attention module and a channel attention module, which are used to suppress less informative features and propagate only the features carrying more information. The spatial attention module exploits the spatial relationships of features to compute a spatial attention map, which is then used to rescale the incoming features, while the channel attention module exploits the inter-channel dependencies of the convolution features to encode a global context over space and fully capture the relationships between channels. The spatial attention module and the channel attention module may be built on existing spatial and channel attention mechanisms, which the embodiment of the present invention does not specifically limit. In addition, a skip connection is established between the input of the first dual-attention module and the output of the convolution module, which alleviates the performance degradation that would otherwise result from stacking many dual-attention modules in series and making the network deeper.
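A schematic PyTorch sketch of such a feature extraction module is given below. The channel count, kernel sizes and the internal design of the channel and spatial attention blocks are assumptions; only the overall layout (several dual-attention modules in series, a convolution module, and a skip connection from the first dual-attention module's input to the convolution module's output) comes from the text:

```python
import torch
import torch.nn as nn

class DualAttention(nn.Module):
    """One dual-attention module: channel attention followed by spatial attention."""
    def __init__(self, channels):
        super().__init__()
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                  # global spatial context per channel
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid())
        self.spatial_att = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),  # spatial attention map
            nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel_att(x)     # rescale channels by inter-channel dependencies
        return x * self.spatial_att(x)  # rescale positions by the spatial attention map

class FeatureExtractionModule(nn.Module):
    """Serially connected dual-attention modules plus a convolution module,
    with a skip connection from the first dual-attention module's input
    to the convolution module's output."""
    def __init__(self, channels=64, num_dual_attention=4):
        super().__init__()
        self.blocks = nn.Sequential(*[DualAttention(channels)
                                      for _ in range(num_dual_attention)])
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        return x + self.conv(self.blocks(x))
```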
The image blur processing apparatus provided by the present invention will be described below, and the image blur processing apparatus described below and the image blur processing method described above may be referred to correspondingly to each other.
Based on any of the above embodiments, fig. 5 is a schematic structural diagram of an image blurring processing device according to the present invention, as shown in fig. 5, the device includes: a saturated pixel acquisition unit 510, a frame interpolation unit 520, a first blurring unit 530, a second blurring unit 540, and a third blurring unit 550.
The saturated pixel acquiring unit 510 is configured to acquire a plurality of continuous original images captured by the camera, and determine saturated pixels in each original image; the saturated pixels in either image are pixels with normalized pixel values of 1;
the frame interpolation unit 520 is configured to perform frame interpolation processing on the continuous multiple original images based on pixel values of saturated pixels in each original image and pixel values of pixels corresponding to saturated pixels in each original image in adjacent original images, so as to obtain multiple interpolated images; the larger the difference between the pixel value of the saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image of the corresponding saturated pixel in the any original image, the more the interpolation image between the any original image and the adjacent original image;
The first blurring unit 530 is configured to convert the original image and the interpolated image into a RAW format, and perform pixel value averaging processing to obtain a first blurred image;
the second blurring unit 540 is configured to perform saturated pixel synthesis processing on the first blurred image based on the position of the saturated pixel in each original image, so as to obtain a second blurred image;
the third blurring unit 550 is configured to add noise to the second blurred image, obtain a third blurred image, and convert the third blurred image into sRGB format, so as to obtain a final blurred image.
According to the device provided by the embodiment of the invention, saturated pixels are located in a plurality of original images continuously shot by the camera, and the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image is used to control the density of frame interpolation between adjacent original images. This ensures that the pixel value intensity of the potential saturated pixel points in the first blurred image, obtained by averaging the original images and the interpolated images converted into RAW format, is not excessively weakened by the averaging operation, so that when saturated pixel synthesis processing is performed on the first blurred image, the potential saturated pixel points are not ignored during synthesis, improving the synthesis effect of the saturated pixels and of the blurred image as a whole. Saturated pixel synthesis processing is carried out on the first blurred image based on the positions of the saturated pixels in each original image to obtain the second blurred image, noise is added to the second blurred image to obtain the third blurred image, and the third blurred image is converted into sRGB format to obtain the final blurred image, which reduces the difference between the final blurred image and a real blurred image and improves the training effect of an image deblurring model.
Based on any of the above embodiments, based on the position of the saturated pixel in each original image, performing saturated pixel synthesis processing on the first blurred image to obtain a second blurred image, which specifically includes:
establishing saturated pixel masks corresponding to each original image respectively based on the positions of saturated pixels in each original image;
determining potential saturated pixel masks which correspond to each original image in common based on the saturated pixel masks which correspond to each original image respectively; the positions of pixels in the potential saturated pixel mask are located in a saturated pixel mask area corresponding to at least one original image;
and reinforcing pixel values of pixels in the first blurred image, which are positioned in the potential saturated pixel mask area, so as to synthesize saturated pixels in the first blurred image, and obtain a second blurred image.
Based on any of the above embodiments, enhancing the pixel value of the pixel in the first blurred image in the potentially saturated pixel mask area to synthesize the saturated pixel in the first blurred image to obtain a second blurred image, specifically includes:
scaling pixel values of pixels in the potential saturated pixel mask based on a preset coefficient to obtain a saturated pixel synthetic map;
And superposing pixel values at the same position in the first blurred image and the saturated pixel synthetic image to obtain a second blurred image.
Based on any of the above embodiments, converting the third blurred image into an sRGB format to obtain a final blurred image, specifically including:
performing image feature coding on the third blurred image based on a plurality of feature extraction modules to obtain a depth feature vector of the third blurred image;
selecting an intermediate original image from the continuous multiple original images as a reference image;
after Gaussian blur is carried out on the reference image, color feature codes of the Gaussian blurred reference image are extracted based on two feature extraction modules;
fusing the depth feature vector of the third blurred image with the color feature code of the Gaussian blurred reference image to obtain the color feature vector of the third blurred image;
and processing the color feature vector of the third blurred image based on a feature extraction module, a convolution module and an up-sampling module to obtain a final blurred image in an sRGB format.
Based on any one of the above embodiments, fusing the depth feature vector of the third blurred image with the color feature code of the reference image after gaussian blur to obtain the color feature vector of the third blurred image, specifically including:
Performing point multiplication on the depth feature vector of the third blurred image and the color feature code of the Gaussian blurred reference image to obtain a color association vector of the third blurred image;
and fusing the depth feature vector of the third blurred image with the color association vector of the third blurred image to obtain the color feature vector of the third blurred image.
Based on any of the above embodiments, converting the original image and the interpolated image into a RAW format specifically includes:
performing image feature coding on any image based on a plurality of feature extraction modules to obtain a depth feature vector of any image; the arbitrary image is an original image or an interpolation image;
performing convolution operation on the depth feature vector of any image based on a convolution module to obtain a converted image of any image;
sampling the converted image of any image, and removing the color channel of the converted image of any image to convert any image into a RAW format.
Based on any of the above embodiments, the feature extraction module includes a plurality of dual-attention modules and a convolution module connected in series; any dual attention module comprises a space attention module and a channel attention module; a jump connection is established between the input of the first dual-attention module and the output of the convolution module.
Fig. 6 is a schematic structural diagram of an electronic device according to the present invention, and as shown in fig. 6, the electronic device may include: processor 610, memory 620, communication interface 630, and communication bus 640, wherein processor 610, memory 620, and communication interface 630 communicate with each other via communication bus 640. The processor 610 may invoke logic instructions in the memory 620 to perform an image blur processing method comprising: acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; the saturated pixels in either image are pixels with normalized pixel values of 1; performing frame interpolation processing on the continuous multiple original images based on pixel values of saturated pixels in each original image and pixel values of pixels corresponding to saturated pixels in each original image in adjacent original images to obtain multiple interpolation images; the larger the difference between the pixel value of the saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image of the corresponding saturated pixel in the any original image, the more the interpolation image between the any original image and the adjacent original image; converting the original image and the interpolation image into a RAW format, and then carrying out pixel value averaging treatment to obtain a first blurred image; based on the positions of saturated pixels in each original image, carrying out saturated pixel synthesis processing on the first blurred image to obtain a second blurred image; and adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
Further, the logic instructions in the memory 620 described above may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a mobile hard disk, a read-only memory, a random access memory, a magnetic disk or an optical disk.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the image blur processing method provided by the above methods, the method comprising: acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; the saturated pixels in either image are pixels with normalized pixel values of 1; performing frame interpolation processing on the continuous multiple original images based on pixel values of saturated pixels in each original image and pixel values of pixels corresponding to saturated pixels in each original image in adjacent original images to obtain multiple interpolation images; the larger the difference between the pixel value of the saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image of the corresponding saturated pixel in the any original image, the more the interpolation image between the any original image and the adjacent original image; converting the original image and the interpolation image into a RAW format, and then carrying out pixel value averaging treatment to obtain a first blurred image; based on the positions of saturated pixels in each original image, carrying out saturated pixel synthesis processing on the first blurred image to obtain a second blurred image; and adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
In still another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform the above-provided image blurring processing methods, the method comprising: acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; the saturated pixels in either image are pixels with normalized pixel values of 1; performing frame interpolation processing on the continuous multiple original images based on pixel values of saturated pixels in each original image and pixel values of pixels corresponding to saturated pixels in each original image in adjacent original images to obtain multiple interpolation images; the larger the difference between the pixel value of the saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image of the corresponding saturated pixel in the any original image, the more the interpolation image between the any original image and the adjacent original image; converting the original image and the interpolation image into a RAW format, and then carrying out pixel value averaging treatment to obtain a first blurred image; based on the positions of saturated pixels in each original image, carrying out saturated pixel synthesis processing on the first blurred image to obtain a second blurred image; and adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image.
The apparatus embodiments described above are merely illustrative, wherein the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or, of course, by means of hardware. Based on this understanding, the foregoing technical solution, in essence, or the part thereof contributing to the prior art, may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and replacements do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. An image blurring processing method, characterized by comprising:
acquiring a plurality of continuous original images shot by a camera, and determining saturated pixels in each original image; a saturated pixel in any original image is a pixel whose normalized pixel value is 1;
performing frame interpolation processing on the plurality of continuous original images based on the pixel values of the saturated pixels in each original image and the pixel values of the corresponding pixels in the adjacent original images, to obtain a plurality of interpolated images; the larger the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image, the more interpolated images are generated between that original image and the adjacent original image;
converting the original images and the interpolated images into a RAW format, and then performing pixel value averaging processing to obtain a first blurred image;
based on the positions of saturated pixels in each original image, carrying out saturated pixel synthesis processing on the first blurred image to obtain a second blurred image;
adding noise to the second blurred image to obtain a third blurred image, and converting the third blurred image into an sRGB format to obtain a final blurred image;
wherein the performing saturated pixel synthesis processing on the first blurred image based on the positions of the saturated pixels in each original image to obtain the second blurred image specifically comprises the following steps:
establishing saturated pixel masks corresponding to each original image respectively based on the positions of saturated pixels in each original image;
determining a potential saturated pixel mask common to all the original images based on the saturated pixel masks respectively corresponding to the original images; each pixel in the potential saturated pixel mask is located in the saturated pixel mask area corresponding to at least one original image;
and enhancing the pixel values of the pixels in the first blurred image that are located in the potential saturated pixel mask area, so as to synthesize saturated pixels in the first blurred image and obtain the second blurred image.
2. The method according to claim 1, wherein the enhancing the pixel value of the pixel in the first blurred image in the potentially saturated pixel mask area to synthesize the saturated pixel in the first blurred image to obtain the second blurred image comprises:
scaling, based on a preset coefficient, the pixel values of the pixels located in the potential saturated pixel mask area to obtain a saturated pixel synthesis map;
and superposing the pixel values at the same positions in the first blurred image and the saturated pixel synthesis map to obtain the second blurred image.
3. The image blurring processing method according to claim 1, wherein the converting the third blurred image into sRGB format to obtain a final blurred image specifically includes:
performing image feature coding on the third blurred image based on a plurality of feature extraction modules to obtain a depth feature vector of the third blurred image;
selecting an intermediate original image from the continuous multiple original images as a reference image;
performing Gaussian blur on the reference image, and extracting color feature codes of the Gaussian blurred reference image based on two feature extraction modules;
fusing the depth feature vector of the third blurred image with the color feature code of the Gaussian blurred reference image to obtain the color feature vector of the third blurred image;
and processing the color feature vector of the third blurred image based on a feature extraction module, a convolution module and an up-sampling module to obtain a final blurred image in an sRGB format.
4. The method for image blur processing according to claim 3, wherein the fusing the depth feature vector of the third blurred image with the color feature code of the gaussian blurred reference image to obtain the color feature vector of the third blurred image specifically includes:
performing point multiplication on the depth feature vector of the third blurred image and the color feature code of the Gaussian blurred reference image to obtain a color association vector of the third blurred image;
and fusing the depth feature vector of the third blurred image with the color association vector of the third blurred image to obtain the color feature vector of the third blurred image.
5. The image blurring processing method according to claim 1, wherein the converting the original images and the interpolated images into a RAW format specifically includes:
performing image feature encoding on any one of the images based on a plurality of feature extraction modules to obtain a depth feature vector of that image, the image being an original image or an interpolated image;
performing a convolution operation on the depth feature vector of the image based on a convolution module to obtain a converted image;
and sampling the converted image and removing the color channel of the converted image, so as to convert the image into the RAW format.
6. The image blurring processing method according to any one of claims 3 to 5, wherein the feature extraction module comprises a plurality of dual attention modules connected in series and a convolution module; any dual attention module comprises a spatial attention module and a channel attention module; and a skip connection is established between the input of the first dual attention module and the output of the convolution module.
7. An image blurring processing device, characterized by comprising:
the saturated pixel acquisition unit is used for acquiring a plurality of continuous original images shot by the camera and determining saturated pixels in each original image; a saturated pixel in any original image is a pixel whose normalized pixel value is 1;
a frame interpolation unit, configured to perform frame interpolation processing on the plurality of continuous original images based on the pixel values of the saturated pixels in each original image and the pixel values of the corresponding pixels in the adjacent original images, to obtain a plurality of interpolated images; the larger the difference between the pixel value of a saturated pixel in any original image and the pixel value of the corresponding pixel in the adjacent original image, the more interpolated images are generated between that original image and the adjacent original image;
the first blurring unit is used for converting the original images and the interpolated images into a RAW format and then performing pixel value averaging processing to obtain a first blurred image;
the second blurring unit is used for carrying out saturated pixel synthesis processing on the first blurring image based on the position of the saturated pixel in each original image to obtain a second blurring image;
the third blurring unit is used for adding noise into the second blurring image to obtain a third blurring image, and converting the third blurring image into an sRGB format to obtain a final blurring image;
wherein the performing saturated pixel synthesis processing on the first blurred image based on the positions of the saturated pixels in each original image to obtain the second blurred image specifically comprises the following steps:
establishing saturated pixel masks corresponding to each original image respectively based on the positions of saturated pixels in each original image;
determining a potential saturated pixel mask common to all the original images based on the saturated pixel masks respectively corresponding to the original images; each pixel in the potential saturated pixel mask is located in the saturated pixel mask area corresponding to at least one original image;
and enhancing the pixel values of the pixels in the first blurred image that are located in the potential saturated pixel mask area, so as to synthesize saturated pixels in the first blurred image and obtain the second blurred image.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the image blur processing method according to any one of claims 1 to 6 when executing the program.
9. A non-transitory computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the image blur processing method according to any one of claims 1 to 6.
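To give a concrete picture of the network components recited in claims 3, 4 and 6, the PyTorch sketch below shows one plausible construction of a feature extraction module built from serial dual attention modules (a spatial attention module followed by a channel attention module) and a convolution module, with a skip connection from the first module's input to the convolution output, together with the point-multiplication color fusion of claim 4. The CBAM-style attention design, the channel counts and the kernel sizes are assumptions; the claims do not fix them.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention: squeeze spatially, then re-weight channels."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class SpatialAttention(nn.Module):
    """Spatial attention: re-weight positions from pooled channel statistics."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class DualAttentionModule(nn.Module):
    """One dual attention module: spatial attention then channel attention
    (the ordering is an assumption)."""
    def __init__(self, channels: int):
        super().__init__()
        self.spatial = SpatialAttention()
        self.channel = ChannelAttention(channels)

    def forward(self, x):
        return self.channel(self.spatial(x))

class FeatureExtraction(nn.Module):
    """Serial dual attention modules plus a convolution module, with a skip
    connection from the first module's input to the convolution output."""
    def __init__(self, channels: int = 64, n_modules: int = 3):
        super().__init__()
        self.blocks = nn.Sequential(*[DualAttentionModule(channels) for _ in range(n_modules)])
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        return x + self.conv(self.blocks(x))  # skip connection

def fuse_color_features(depth_feat: torch.Tensor, color_code: torch.Tensor) -> torch.Tensor:
    """Point-multiply the depth features with the color feature code of the
    Gaussian blurred reference to get a color association vector, then fuse it
    back onto the depth features (addition is used for the unspecified fusion)."""
    color_assoc = depth_feat * color_code
    return depth_feat + color_assoc

# Example shapes: a batch of 64-channel feature maps.
feats = torch.randn(1, 64, 32, 32)
fused = fuse_color_features(FeatureExtraction()(feats), torch.rand(1, 64, 32, 32))
```

The residual skip connection is a common choice in restoration networks; whether the patented modules use exactly this arrangement is not specified beyond what claim 6 states.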
CN202310362201.5A 2023-04-07 2023-04-07 Image blurring processing method, device, electronic equipment and storage medium Active CN116091364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310362201.5A CN116091364B (en) 2023-04-07 2023-04-07 Image blurring processing method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310362201.5A CN116091364B (en) 2023-04-07 2023-04-07 Image blurring processing method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116091364A CN116091364A (en) 2023-05-09
CN116091364B true CN116091364B (en) 2023-06-06

Family

ID=86199452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310362201.5A Active CN116091364B (en) 2023-04-07 2023-04-07 Image blurring processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116091364B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101568908A (en) * 2006-02-14 2009-10-28 Fotonation Vision Ltd Image blurring
CN103198453A (en) * 2011-09-26 2013-07-10 Canon Inc Image processing apparatus and method
CN107431793A (en) * 2015-03-26 2017-12-01 Sony Corp Image processing apparatus and image processing method and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7469071B2 (en) * 2006-02-14 2008-12-23 Fotonation Vision Limited Image blurring
JP5847471B2 (en) * 2011-07-20 2016-01-20 Canon Inc Image processing apparatus, imaging apparatus, image processing method, and image processing program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101568908A (en) * 2006-02-14 2009-10-28 Fotonation Vision Ltd Image blurring
CN103198453A (en) * 2011-09-26 2013-07-10 Canon Inc Image processing apparatus and method
CN107431793A (en) * 2015-03-26 2017-12-01 Sony Corp Image processing apparatus and image processing method and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Saturated-Pixel Enhancement for Color Images";Di Xu et al.;《International Symposium on Circuits and Systems》;第1-4页 *
Hou Chenggang et al.; "Single-Image Highlight Removal Method Based on an Illumination Restoration Operator"; Industrial Control Computer, 2020, Vol. 33, No. 10, pp. 90-93. *

Also Published As

Publication number Publication date
CN116091364A (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN112330574B (en) Portrait restoration method and device, electronic equipment and computer storage medium
CN108604369B (en) Method, device and equipment for removing image noise and convolutional neural network
EP3488388B1 (en) Video processing method and apparatus
CN111062872A (en) Image super-resolution reconstruction method and system based on edge detection
US20230080693A1 (en) Image processing method, electronic device and readable storage medium
CN110248096A (en) Focusing method and device, electronic equipment, computer readable storage medium
CN110473185A (en) Image processing method and device, electronic equipment, computer readable storage medium
JP2009509417A (en) Extraction of moving object boundary
CN112529776B (en) Training method of image processing model, image processing method and device
CN116664450A (en) Diffusion model-based image enhancement method, device, equipment and storage medium
CN110852965A (en) Video illumination enhancement method and system based on generation countermeasure network
CN113724136A (en) Video restoration method, device and medium
CN115131229A (en) Image noise reduction and filtering data processing method and device and computer equipment
CN112489103B (en) High-resolution depth map acquisition method and system
CN116091364B (en) Image blurring processing method, device, electronic equipment and storage medium
CN117333398A (en) Multi-scale image denoising method and device based on self-supervision
Ponomaryov et al. Fuzzy color video filtering technique for sequences corrupted by additive Gaussian noise
CN116823662A (en) Image denoising and deblurring method fused with original features
Guan et al. NODE: Extreme low light raw image denoising using a noise decomposition network
CN114339030B (en) Network live video image stabilizing method based on self-adaptive separable convolution
CN116208812A (en) Video frame inserting method and system based on stereo event and intensity camera
Tian et al. Deformable convolutional network constrained by contrastive learning for underwater image enhancement
CN114841870A (en) Image processing method, related device and system
Xu et al. Joint learning of super-resolution and perceptual image enhancement for single image
CN115311149A (en) Image denoising method, model, computer-readable storage medium and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant