CN115239580A - Image noise reduction processing method and device

Info

Publication number
CN115239580A
Authority
CN
China
Prior art keywords
image
frame
processing
pixel
coordinate information
Prior art date
Legal status
Pending
Application number
CN202210752510.9A
Other languages
Chinese (zh)
Inventor
马璐
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210752510.9A
Publication of CN115239580A
Legal status: Pending

Classifications

    • G06T5/70
    • G06T3/4053 Super resolution, i.e. output image resolution higher than sensor resolution
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10024 Color image
    • G06T2207/20084 Artificial neural networks [ANN]

Abstract

The application discloses an image noise reduction processing method and device, and belongs to the technical field of image processing. The method comprises the following steps: processing the first image based on a first processing mode to obtain a second image; processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image corresponds to one frame of third image respectively; and performing combined denoising processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined denoising processing.

Description

Image noise reduction processing method and device
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image noise reduction processing method and device.
Background
At present, after a user shoots a video through an electronic device, the electronic device can perform noise reduction processing on the video according to an image algorithm to ensure that the video shot by the electronic device is played smoothly and has no flicker.
However, in the above noise reduction method, the electronic device reduces the image size of the shot video so that it can use a high-performance, low-power-consumption image algorithm for noise reduction, and as a result the definition of the video image shot by the electronic device is low.
Disclosure of Invention
The embodiments of the present application aim to provide an image noise reduction processing method and apparatus, an electronic device, and a readable storage medium, which can solve the problem of low definition of a captured video image.
In a first aspect, an embodiment of the present application provides an image denoising processing method, including: processing the acquired first image based on a first processing mode to obtain a second image; processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image in the at least one frame of fourth image corresponds to one frame of third image respectively; and performing combined denoising processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined denoising processing.
In a second aspect, an embodiment of the present application provides an image noise reduction processing apparatus, including a processing module. The processing module is configured to process the acquired first image based on a first processing mode to obtain a second image; process at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image in the at least one frame of fourth image corresponds to one frame of third image respectively; and perform combined noise reduction processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined noise reduction processing.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the method according to the first aspect.
In the embodiments of the present application, the electronic device may process the acquired first image based on the first processing mode to obtain the second image, process at least one frame of image acquired before the first image (i.e., at least one frame of third image) based on the second processing mode to obtain at least one frame of fourth image, and then perform combined noise reduction processing on the second image with the at least one frame of fourth image to obtain the fifth image after the noise reduction processing. In this scheme, the electronic device adopts different processing modes for the currently acquired first image and the at least one frame of image acquired before it, so different image information can be obtained from the first image and the at least one frame of third image; that is, the first image and the at least one frame of third image undergo different image processing, and the obtained second image and at least one frame of fourth image contain different image information, which can then be combined during the joint noise reduction to improve the definition of the captured image.
Drawings
FIG. 1 is a schematic diagram of an example of down-sampling of an image in the related art;
FIG. 2 is a flowchart of an image denoising processing method according to an embodiment of the present application;
fig. 3 is a schematic diagram of an example of an image denoising processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image noise reduction processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a second schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any inventive effort, shall fall within the scope of protection of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the terms so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. Objects distinguished by "first", "second", etc. generally belong to one class, and the number of objects is not limited; for example, the first object may be one or more than one. In addition, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The image denoising processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
At present, a user can realize more and more functions through an electronic device; for example, the user can shoot videos with it. With the development of communication technology, users' demands on the image quality of videos shot by electronic devices gradually increase. In the related art, the electronic device performs image noise reduction on the shot video through an image noise reduction algorithm: for example, it reduces the size of the video images (i.e., reduces the resolution of the video) and then processes them with a high-performance, low-power-consumption image noise reduction algorithm, so as to obtain a video that plays smoothly without flicker. However, downsampling on a Raw-domain Bayer array has a greater impact on image sharpness than downsampling in the RGB or YUV domain. As shown in fig. 1 (a), downsampling in the RGB or YUV domain first upsamples the single-channel Bayer image to a three-channel image through Demosaic (image interpolation) and then downsamples it to the target size; as shown in fig. 1 (B), Bayer-domain downsampling first downsamples to the target size and then upsamples to a three-channel image through Demosaic. Obviously, the sharpness loss of downsampling before upsampling is greater than that of upsampling before downsampling.
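As a rough illustration of the two orderings compared above, the following Python sketch contrasts demosaic-then-downsample (fig. 1 (a)) with downsample-then-demosaic (fig. 1 (B)) on a toy RGGB mosaic. It is an assumption for illustration only and is not taken from the patent; the naive 2x2 demosaic and the Bayer-cell decimation are placeholder choices.

    import numpy as np

    def naive_demosaic_rggb(bayer):
        """Very rough demosaic: average each 2x2 RGGB cell into one RGB pixel."""
        r = bayer[0::2, 0::2]
        g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
        b = bayer[1::2, 1::2]
        return np.stack([r, g, b], axis=-1)          # (H/2, W/2, 3)

    def downsample_rgb(rgb, factor=2):
        """Box-average downsampling in the RGB domain."""
        h, w, c = rgb.shape
        return rgb.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

    def downsample_bayer(bayer, factor=2):
        """Raw-domain downsampling: keep one 2x2 RGGB cell out of every factor x factor cells."""
        h, w = bayer.shape
        cells = bayer.reshape(h // 2, 2, w // 2, 2)   # group pixels into 2x2 RGGB cells
        kept = cells[::factor, :, ::factor, :]        # drop whole cells, preserving the mosaic
        return kept.reshape(h // factor, w // factor)

    bayer = np.random.rand(8, 8)                      # toy 8x8 RGGB mosaic
    rgb_then_down = downsample_rgb(naive_demosaic_rggb(bayer))    # fig. 1 (a) ordering
    down_then_rgb = naive_demosaic_rggb(downsample_bayer(bayer))  # fig. 1 (B) ordering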
In the embodiments of the present application, the electronic device processes the acquired first image based on the first processing mode to obtain the second image, processes at least one frame of image acquired before the first image (i.e., at least one frame of third image) based on the second processing mode to obtain at least one frame of fourth image, and then performs combined noise reduction processing on the second image with the at least one frame of fourth image to obtain the fifth image after the noise reduction processing. In this scheme, the electronic device can adopt different processing modes for the currently acquired first image and the at least one frame of image acquired before it, so different image information can be obtained from the first image and the at least one frame of third image; that is, the first image and the at least one frame of third image undergo different image processing, and the obtained second image and at least one frame of fourth image contain different image information.
An embodiment of the present application provides an image denoising processing method, and fig. 2 shows a flowchart of the image denoising processing method provided in the embodiment of the present application. As shown in fig. 2, the image noise reduction processing method provided in the embodiment of the present application may include steps 201 to 203 described below.
Step 201, the electronic device processes the acquired first image based on the first processing mode to obtain a second image.
In the embodiment of the application, when a video is shot and a first image is acquired, the electronic device processes the first image by using a central interpolation processing mode (i.e., the first processing mode) to obtain a second image.
In the embodiment of the application, the electronic device can acquire a first image with an original size through the image sensor, and then process the first image in a central interpolation processing mode to obtain a second image with a second size.
In addition, in the embodiments of the present application, in order to solve the problem of the loss of sharpness of the down-sampled Bayer image, the first image is a Bayer image.
It can be understood that the first image is the current frame image acquired by the electronic device during the shooting of the video; that is, the first image is not a fixed image but changes as the object shot by the electronic device changes, and when the electronic device finishes shooting, the first image is the last frame image of the shot video.
Optionally, in this embodiment of the application, the Bayer arrangement in the image sensor may be any of the following: RGGB, RYYB, RGBW or RGBIR, where IR is a filter.
Optionally, in this embodiment of the present application, the filter may be any one of the following: infrared filters, interference filters or raman filters.
Optionally, in this embodiment of the application, the original size is larger than the second size.
Optionally, in this embodiment of the application, in a case that the shooting preview interface is displayed, the electronic device may receive a first input of a user, so that the electronic device starts to shoot a video.
Optionally, in this embodiment of the application, the first input may be input by a user to a shooting control, or input by a user to a physical button, or input by a user to an electronic device by voice. The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Specifically, the first input may be a click input, a long-press input, a sliding input, or a preset trajectory input of the user on the shooting control; or a physical key combination (e.g., power key and volume key). The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in this embodiment of the present application, the step 201 may be specifically implemented by the following steps 201a to 201c.
Step 201a, the electronic device performs interpolation processing on at least one first pixel point according to actual coordinate information of the at least one first pixel point in the first image to obtain first coordinate information of the at least one first pixel point.
In this embodiment, the first coordinate information is coordinate information obtained by interpolating corresponding pixel points.
Specifically, the first coordinate information is coordinate information obtained by performing interpolation processing on each first pixel point in the at least one first pixel point by the electronic device.
In this embodiment, the electronic device may perform interpolation processing on at least one first pixel point participating in interpolation according to the target magnification and the first interpolation mode, so as to obtain first coordinate information of the at least one first pixel point.
It should be noted that, due to the specific structure of the Bayer array, the target magnification is an integer power of 2 (e.g., 2, 4, or 8).
Optionally, in this embodiment of the application, the target magnification may be preset by a user; or the electronic device is determined based on the size of the first image.
Alternatively, the first interpolation method may be any one of the following methods: nearest neighbor interpolation, bilinear interpolation, mean interpolation, or median interpolation.
For example, take the target magnification as 2 and the first image as an RGGB-arranged Bayer image. As shown in fig. 3 (a), the electronic device interpolates a 4 × 4 Bayer image at a magnification of 2: it extracts the pixel points corresponding to the R channel (R1, R2, R3 and R4 in fig. 3), the pixel points corresponding to the first G channel, denoted G1 (G11, G12, G13 and G14 in fig. 3), the pixel points corresponding to the second G channel, denoted G2 (G21, G22, G23 and G24 in fig. 3), and the pixel points corresponding to the B channel (B1, B2, B3 and B4 in fig. 3), and then takes, for each of the R, G1, G2 and B channels, one pixel point as the middle pixel point of the Bayer image to obtain the interpolated Bayer image. As shown in fig. 3 (B), the electronic device interpolates the 4 × 4 Bayer image at a magnification of 2 in another way: it extracts the same R, G1, G2 and B pixel points, and then, for each of the R, G1, G2 and B channels, averages the 3 pixel points located in the middle to obtain one pixel point per channel after the averaging, thereby obtaining the averaged interpolated Bayer image.
It should be noted that the at least one first pixel point participating in the interpolation, and its coordinate information, may differ depending on the interpolation algorithm: the electronic device may determine a different set of first pixel points according to different interpolation algorithms and obtain the corresponding coordinate information. That is, the number and the coordinate information of the pixel points participating in the interpolation are determined by the interpolation algorithm, and may be set according to actual use requirements, which is not limited in the embodiments of the present application. A simple sketch of the 2× example above is given below.
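The following sketch is an assumption for illustration only: it gathers the four samples of each channel (R, G1, G2, B) from a 4x4 RGGB block and reduces them either by picking one sample as the middle pixel (fig. 3 (a); which sample counts as the "middle" is an assumption) or by averaging (fig. 3 (B); averaged here over all four samples rather than the three middle ones, for brevity).

    import numpy as np

    def split_rggb_channels(block):
        """Return the R, G1, G2, B samples of a 4x4 RGGB Bayer block, each as a 2x2 array."""
        return {
            "R":  block[0::2, 0::2],   # R1..R4
            "G1": block[0::2, 1::2],   # G11..G14
            "G2": block[1::2, 0::2],   # G21..G24
            "B":  block[1::2, 1::2],   # B1..B4
        }

    def interpolate_block(block, mode="center"):
        """Collapse a 4x4 RGGB block into one 2x2 RGGB cell at 2x magnification."""
        ch = split_rggb_channels(block)
        if mode == "center":                      # fig. 3 (a): keep one sample per channel
            pick = lambda c: c[1, 0]              # choice of "middle" sample is an assumption
        else:                                     # fig. 3 (B): mean interpolation
            pick = lambda c: c.mean()
        return np.array([[pick(ch["R"]),  pick(ch["G1"])],
                         [pick(ch["G2"]), pick(ch["B"])]])

    block = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 RGGB block
    print(interpolate_block(block, "center"), interpolate_block(block, "mean"), sep="\n")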
Step 201b, the electronic device obtains equivalent center coordinate information of at least one first pixel point according to the target reference multiplying power, the actual coordinate information of at least one first pixel point and at least one first weight coefficient.
In an embodiment of the present application, the first weight coefficient is a ratio of a pixel value of the corresponding pixel point to a pixel value of the first image.
In the embodiment of the application, under the condition that the target reference magnification is determined, the electronic device may obtain the equivalent center coordinates of the at least one first pixel point according to the equivalent center coordinate formula (1):

$\bar{x} = \dfrac{\sum_i \omega_i x_i}{\sum_i \omega_i}, \qquad \bar{y} = \dfrac{\sum_i \omega_i y_i}{\sum_i \omega_i} \qquad (1)$

where $\omega_i$ is the first weight coefficient, $(x_i, y_i)$ are the actual coordinates of the at least one first pixel point participating in the interpolation, $(\bar{x}, \bar{y})$ is the equivalent center coordinate of the interpolated at least one first pixel point in the original image, and $i$ is the pixel sequence number.
It should be noted that the equivalent center coordinates are not coordinates of pixel points actually existing in the first image, but are used to indicate that the phase of at least one interpolated first pixel point is not changed.
Optionally, in this embodiment of the application, the target reference magnification may be preset by a user; or the electronic device is determined according to the first image size.
It should be noted that, when the electronic device acquires the equivalent center coordinate information of the at least one first pixel, the target reference magnification is the same as the target magnification when the electronic device performs interpolation processing, that is, the electronic device performs interpolation processing on the at least one first pixel in the first image at the same magnification and acquires the interpolated center equivalent coordinate of the at least one first pixel.
Optionally, in an implementation of the present application, the at least one first weight coefficient may be preset by a user; or the electronic equipment determines according to the ratio of the pixel value of the first image to the pixel value of at least one first pixel point.
Optionally, in this embodiment of the present application, the pixel value may include at least one of the following: a pixel brightness value, a pixel saturation value, a pixel color temperature value, and a pixel exposure value.
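For concreteness, the following sketch computes an equivalent center under an assumed reading of formula (1): the equivalent center is taken as the weighted centroid of the actual coordinates of the pixel points participating in one interpolation, with the first weight coefficients as weights. The normalization by the weight sum and the choice of reference pixel value for the first image are assumptions for illustration.

    import numpy as np

    def equivalent_center(coords, pixel_values, image_value):
        """coords: (N, 2) actual (x, y) coordinates; pixel_values: (N,) values of those pixels;
        image_value: reference pixel value of the first image (e.g. its mean brightness)."""
        weights = np.asarray(pixel_values, dtype=float) / float(image_value)  # first weight coefficients
        coords = np.asarray(coords, dtype=float)
        return (weights[:, None] * coords).sum(axis=0) / weights.sum()        # (x_bar, y_bar)

    # e.g. the four R samples of a 4x4 RGGB block
    print(equivalent_center([(0, 0), (2, 0), (0, 2), (2, 2)], [100, 120, 110, 130], 115.0))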
Step 201c, the electronic device determines image areas corresponding to the N first pixel points in the first image as a second image.
In this embodiment of the application, the N first pixel points are pixel points of at least one first pixel point where first coordinate information and equivalent center coordinate information satisfy a preset condition, and N is a positive integer.
In the embodiment of the application, the electronic device may substitute the coordinates of the at least one first pixel point into the center equivalence relation, and determine, as the second image, the image area corresponding to the N first pixel points that satisfy the center equivalence relation of formula (2):

$(\bar{x}, \bar{y}) = Q \cdot (\tilde{x}, \tilde{y}) \qquad (2)$

where $(\bar{x}, \bar{y})$ is the equivalent center coordinate of the interpolated at least one first pixel point in the original image, $(\tilde{x}, \tilde{y})$ is the actual coordinate of the at least one first pixel point in the original image after interpolation, and $Q$ is the target reference magnification.
It can be understood that the actual coordinates under a given reference magnification correspond to the equivalent center coordinates under the same reference magnification; that is, if the reference magnification is different, the equivalent center coordinates corresponding to the same actual coordinates are also different.
In the embodiment of the application, the electronic device obtains the equivalent center coordinates of the at least one interpolated first pixel point through the equivalent center coordinate algorithm, and determines, as the second image, the image area corresponding to the N first pixel points (among the at least one first pixel point) that satisfy the equivalence relation. In this way, for the pixel points participating in the interpolation in the first image, the electronic device can ensure that the relative position of each pixel point in the second image obtained after interpolation does not change, which improves the accuracy of the image processing. A sketch of this selection, together with the complementary selection used in step 202c below, is given after this paragraph.
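The sketch below assumes the comparison rule of formulas (2) and (4): a pixel point is kept for the "central" image of step 201c when its equivalent center matches its interpolated coordinate scaled by the target reference magnification Q (within a tolerance), and for the "eccentric" image of step 202c when it does not. The exact comparison rule is an assumption.

    import numpy as np

    def select_pixels(interp_coords, equiv_centers, Q, central=True, tol=1e-6):
        """interp_coords, equiv_centers: (N, 2) arrays; returns indices of the selected pixel points."""
        interp_coords = np.asarray(interp_coords, dtype=float)
        equiv_centers = np.asarray(equiv_centers, dtype=float)
        matches = np.all(np.abs(equiv_centers - Q * interp_coords) <= tol, axis=1)
        return np.flatnonzero(matches if central else ~matches)

    # central selection (step 201c) vs. eccentric selection (step 202c) on toy data
    idx_N = select_pixels([(0.5, 0.5), (1.0, 1.5)], [(1.0, 1.0), (2.5, 3.5)], Q=2, central=True)
    idx_M = select_pixels([(0.5, 0.5), (1.0, 1.5)], [(1.0, 1.0), (2.5, 3.5)], Q=2, central=False)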
Step 202, the electronic device processes the at least one frame of third image based on the second processing mode to obtain at least one frame of fourth image.
In an embodiment of the present application, the at least one frame of third image is an image acquired before the first image, and each frame of fourth image in the at least one frame of fourth image corresponds to one frame of third image.
In this embodiment, the electronic device may process the at least one frame of the third image by using an eccentric interpolation processing method (i.e., the second processing method), so as to obtain at least one frame of the fourth image.
In this embodiment of the application, the electronic device may store at least one frame of the third image with the original size, and then process the at least one frame of the third image in an eccentric interpolation processing manner to obtain at least one frame of the fourth image with the third size.
It should be noted that the at least one third image is a Bayer image.
Optionally, in this embodiment of the application, the at least one third image may be a frame of image acquired by the electronic device before the first image; alternatively, the at least one third image may be all images acquired by the electronic device before the first image.
Optionally, in this embodiment of the present application, the third size is smaller than the original size, and the third size may be different from the second size.
Alternatively, in this embodiment of the application, the step 202 may be specifically implemented by the following steps 202a to 202c.
Step 202a, aiming at each frame of third image in at least one frame of third image, the electronic device performs interpolation processing on at least one second pixel point according to actual coordinate information of at least one second pixel point in one frame of third image to obtain second coordinate information of at least one second pixel point.
In this embodiment, the second coordinate information is coordinate information obtained by performing interpolation processing on the corresponding pixel point.
Specifically, the second coordinate information is coordinate information obtained by interpolating, by the electronic device, each of the at least one second pixel point.
In this embodiment, the electronic device may perform interpolation processing on the at least one second pixel point participating in the interpolation according to the target magnification and the first interpolation mode, so as to obtain the second coordinate information of the at least one second pixel point.
It should be noted that the at least one second pixel point participating in the interpolation, and its coordinate information, may differ depending on the interpolation algorithm: the electronic device may determine a different set of second pixel points according to different interpolation algorithms and obtain the corresponding coordinate information. That is, the number and the coordinate information of the pixel points participating in the interpolation are determined by the interpolation algorithm, and may be set according to actual use requirements, which is not limited in the embodiments of the present application.
Step 202b, the electronic device obtains equivalent center coordinate information of at least one second pixel point according to the target reference multiplying power, the actual coordinate information of at least one second pixel point and at least one second weight coefficient.
In the embodiment of the present application, the second weight coefficient is the ratio of the pixel value of the corresponding pixel point to the pixel value of the corresponding frame of third image.
In the embodiment of the application, in the case that the target reference magnification is determined, the electronic device may obtain the equivalent center coordinates of the at least one second pixel point according to the equivalent center coordinate formula (3):

$\bar{x} = \dfrac{\sum_i \omega_i x_i}{\sum_i \omega_i}, \qquad \bar{y} = \dfrac{\sum_i \omega_i y_i}{\sum_i \omega_i} \qquad (3)$

where $\omega_i$ is the second weight coefficient, $(x_i, y_i)$ are the actual coordinates of the at least one second pixel point participating in the interpolation, $(\bar{x}, \bar{y})$ is the equivalent center coordinate of the interpolated at least one second pixel point in the original image, and $i$ is the pixel sequence number.
Step 202c, the electronic device determines image areas corresponding to the M second pixel points in the frame of third image as a frame of fourth image.
In this embodiment, the M second pixel points are pixel points of at least one second pixel point where the second coordinate information and the equivalent center coordinate information do not satisfy the preset condition, and M is a positive integer.
In the embodiment of the application, the electronic device may substitute the coordinates of the at least one second pixel point into the center equivalence relation, and determine, as one frame of fourth image, the image area corresponding to the M second pixel points that do not satisfy the center equivalence relation of formula (4):

$(\bar{x}, \bar{y}) = Q \cdot (\tilde{x}, \tilde{y}) \qquad (4)$

where $(\bar{x}, \bar{y})$ is the equivalent center coordinate of the interpolated at least one second pixel point in the original image, $(\tilde{x}, \tilde{y})$ is the actual coordinate of the at least one second pixel point in the original image after interpolation, and $Q$ is the target reference magnification.
It should be noted that, for each frame of the third image in the at least one frame of the third image, the electronic device may perform the above steps 202a to 202c to obtain the at least one frame of the fourth image, which is not repeated herein to avoid repetition.
In the embodiment of the present application, the electronic device obtains the equivalent center coordinates of the at least one interpolated second pixel point through the equivalent center coordinate algorithm, and determines, as one frame of fourth image, the image area corresponding to the M second pixel points (among the at least one second pixel point) that do not satisfy the equivalence relation. In this way, for the pixel points participating in the interpolation, the electronic device changes the relative position of each pixel point in the frame of fourth image obtained after interpolation, thereby simulating pixel shift, so that the electronic device can retain image information to the greatest extent over a continuous time period.
And 203, the electronic device performs combined noise reduction processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined noise reduction processing.
In this embodiment, the electronic device may input at least one frame of the fourth image and the second image into the target neural network to obtain a fifth image after the joint noise reduction processing.
Optionally, in this embodiment of the application, the target neural network may be any one of the following: a convolutional neural network, a generative adversarial network, or a recurrent neural network. The specific choice can be determined according to actual use requirements, and is not limited in the embodiments of the application.
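The patent does not specify a network structure; as a minimal sketch under that caveat, the following assumes the frames have been brought to a common resolution, concatenates the second image with the frames of the fourth image along the channel dimension, and lets a small convolutional network predict the jointly denoised fifth image. The layer sizes and the residual prediction are assumptions.

    import torch
    import torch.nn as nn

    class JointDenoiseNet(nn.Module):
        def __init__(self, num_fourth_frames=3, channels=3):
            super().__init__()
            in_ch = channels * (1 + num_fourth_frames)       # second image + fourth images
            self.net = nn.Sequential(
                nn.Conv2d(in_ch, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(32, 32, kernel_size=3, padding=1),   nn.ReLU(inplace=True),
                nn.Conv2d(32, channels, kernel_size=3, padding=1),
            )

        def forward(self, second_img, fourth_imgs):
            """second_img: (B, C, H, W); fourth_imgs: list of (B, C, H, W) tensors,
            assumed already resized to the same resolution as second_img."""
            x = torch.cat([second_img] + list(fourth_imgs), dim=1)
            return second_img + self.net(x)                  # residual prediction of the fifth image

    net = JointDenoiseNet(num_fourth_frames=3)
    fifth = net(torch.rand(1, 3, 64, 64), [torch.rand(1, 3, 64, 64) for _ in range(3)])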
Optionally, in this embodiment of the application, the at least one fourth image may be an image that is a frame before the first image and is processed in an eccentric interpolation processing manner; or the image is processed by the eccentric interpolation processing mode in all the previous frames of the first image.
Alternatively, in this embodiment of the application, the step 203 may be specifically implemented by the step 203a described below.
Step 203a, the electronic device adds pixel point information of at least one frame of fourth image to the second image based on a target processing mode to obtain a fifth image after combined denoising processing.
In an embodiment of the present application, the target processing manner is any one of the following: weighting, convolution, and high-pass filtering.
It can be understood that the at least one frame of the fourth image and the second image are obtained through different processing manners, so that image information in the at least one frame of the fourth image and the second image is different, and thus, the electronic device may add pixel point information of the at least one frame of the fourth image to the second image by using a target processing manner to obtain a fifth image after the joint noise reduction processing.
Optionally, in this embodiment of the application, the electronic device may add, in a target processing manner, pixel point information of at least one frame of the fourth image to the second image to obtain an RGB image of the fifth image after the joint noise reduction processing.
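As an illustration of the high-pass-filtering option of the target processing manner (the filter choice and blending strength below are assumptions, not taken from the patent), the following sketch adds the high-frequency detail of each fourth image to the second image.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def add_highpass_detail(second_img, fourth_imgs, strength=0.5, size=3):
        """second_img: (H, W) array; fourth_imgs: list of arrays of the same shape."""
        out = second_img.astype(float).copy()
        for img in fourth_imgs:
            img = img.astype(float)
            highpass = img - uniform_filter(img, size=size)   # detail = image minus its low-pass
            out += strength * highpass / max(len(fourth_imgs), 1)
        return out

    fifth = add_highpass_detail(np.random.rand(64, 64), [np.random.rand(64, 64) for _ in range(2)])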
The embodiments of the present application provide an image noise reduction processing method. The electronic device can process the acquired first image based on the first processing mode to obtain the second image, process at least one frame of image acquired before the first image (i.e., at least one frame of third image) based on the second processing mode to obtain at least one frame of fourth image, and then perform combined noise reduction processing on the second image with the at least one frame of fourth image to obtain the fifth image after the noise reduction processing. In this scheme, the electronic device adopts different processing modes for the currently acquired first image and the at least one frame of image acquired before it, so different image information can be obtained from the first image and the at least one frame of third image; that is, the first image and the at least one frame of third image undergo different image processing, and the obtained second image and at least one frame of fourth image contain different image information.
Optionally, in this embodiment of the present application, before the step 203, the image denoising process provided in this embodiment of the present application further includes the following step 301 and step 303, and the step 203 may be specifically realized by the following step 203 b.
Step 301, the electronic device obtains a pixel value of each pixel point in the second image and a pixel value of each pixel point in the sixth image.
In the embodiment of the present application, the sixth image is the image that is the frame previous to the first image and has been processed by the first processing mode.
In this embodiment of the application, the electronic device may obtain the pixel value of each pixel point in the second image through the corresponding relationship between the coordinate of each pixel point in the second image and the pixel value, and obtain the pixel value of each pixel point in the sixth image through the corresponding relationship between the coordinate of each pixel point in the sixth image and the pixel value.
It should be noted that the correspondence is preset by the user.
Step 302, the electronic device determines, according to the pixel value of each pixel point in the second image and the pixel value of each pixel point in the sixth image, an image area corresponding to a pixel point with the same pixel value in the second image and the sixth image as a similarity image of the second image and the sixth image.
In the embodiment of the application, the electronic device may obtain the similarity image of the second image and the sixth image according to the image similarity formula (5):

$A(x, y) = \mathrm{clip}\big(k \cdot f_{\mathrm{lowpass}}(M),\, 0,\, 1\big) \qquad (5)$

where $\mathrm{clip}(z, 0, 1)$ is a numerical constraint function that limits the value to $[0, 1]$, $k$ is a normalization coefficient, $z$ denotes the data processed by the clip function, i.e. $f_{\mathrm{lowpass}}(M)$, $f_{\mathrm{lowpass}}$ is a low-pass filter function, and $M$ denotes the data processed by the low-pass filter function, i.e. the pixel-wise difference between the second image and the sixth image.
Specifically, the electronic device may obtain a pixel value of each pixel point in the second image and a pixel value of each pixel point in the sixth image, and then obtain a similarity image of the second image and the sixth image in a mean filtering manner.
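A minimal sketch of such a similarity computation is given below; it assumes M is the absolute frame difference, the low-pass filter is a mean filter, and k is a fixed normalization coefficient, all of which are illustrative choices rather than values from the patent.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def similarity_image(second_img, sixth_img, k=4.0, size=3):
        """A(x, y) is near 0 where the frames agree and near 1 where they differ (e.g. motion)."""
        M = np.abs(second_img.astype(float) - sixth_img.astype(float))
        return np.clip(k * uniform_filter(M, size=size), 0.0, 1.0)

    A = similarity_image(np.random.rand(64, 64), np.random.rand(64, 64))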
And step 303, the electronic device performs weighting processing on the second image, the sixth image and the similarity image to obtain a seventh image subjected to time domain noise reduction processing.
In the embodiment of the application, the electronic device may perform weighting processing on the second image, the sixth image and the similarity image through the weighting formula (6) to obtain the seventh image after the time-domain noise reduction processing:

$I_7(x, y) = A(x, y) \cdot I_2(x, y) + \big(1 - A(x, y)\big) \cdot I_6(x, y) \qquad (6)$

where $A(x, y)$ is the similarity image, $I_2(x, y)$ is the second image, $I_6(x, y)$ is the sixth image, and $I_7(x, y)$ is the seventh image.
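The following sketch assumes the blending form of formula (6): the second and sixth images are mixed per pixel with the similarity image as the weight. With the A(x, y) sketched above, static regions lean on the previous (sixth) frame while changing regions keep the current (second) frame; the blend direction is an assumption.

    import numpy as np

    def temporal_blend(second_img, sixth_img, A):
        """second_img, sixth_img, A: arrays of the same spatial shape; returns the seventh image."""
        return A * second_img.astype(float) + (1.0 - A) * sixth_img.astype(float)

    seventh = temporal_blend(np.random.rand(64, 64), np.random.rand(64, 64),
                             np.full((64, 64), 0.25))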
And 203b, the electronic device performs combined denoising on the seventh image after the time domain denoising processing according to the at least one frame of the fourth image to obtain a fifth image after the combined denoising processing.
In the embodiment of the application, the seventh image subjected to time domain noise reduction and the at least one frame of fourth image are input into a target neural network to obtain a fifth image subjected to joint time domain noise reduction.
In the embodiment of the application, because time-domain noise reduction can eliminate noise jitter between frames, after the time-domain denoised seventh image is further denoised with the at least one frame of fourth image, the resolution of the fifth image can be improved, achieving a super-resolution effect on the image information and thereby improving the definition of the images processed by the electronic device.
It should be noted that, in the image noise reduction processing method provided in the embodiments of the present application, the execution body may be an image noise reduction processing apparatus, an electronic device, or a functional module or entity in the electronic device. In the embodiments of the present application, an image noise reduction processing apparatus executing the image noise reduction processing method is taken as an example to describe the image noise reduction processing apparatus provided in the embodiments of the present application.
Fig. 4 shows a schematic diagram of a possible structure of the image noise reduction processing apparatus according to the embodiment of the present application. As shown in fig. 4, the image noise reduction processing apparatus 70 may include: a processing module 71.
The processing module 71 is configured to process the acquired first image based on a first processing manner to obtain a second image; processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image in the at least one frame of fourth image respectively corresponds to one frame of third image; and performing combined noise reduction processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined noise reduction processing.
In a possible implementation manner, the processing module 71 is specifically configured to perform interpolation processing on at least one first pixel according to actual coordinate information of at least one first pixel in the first image, so as to obtain first coordinate information of at least one first pixel, where the first coordinate information is coordinate information after interpolation processing is performed on a corresponding first pixel; obtaining equivalent center coordinate information of at least one first pixel point according to the target reference multiplying power, the actual coordinate information of at least one first pixel point and at least one first weight coefficient, wherein the first weight coefficient is the proportion of the pixel value of the corresponding pixel point relative to the pixel value of the first image; and determining an image area corresponding to N first pixel points in the first image as a second image, wherein the N first pixel points are pixel points of which the first coordinate information and the equivalent center coordinate information in at least one first pixel point meet preset conditions, and N is a positive integer.
In a possible implementation manner, the processing module 71 is specifically configured to perform interpolation processing on at least one second pixel point according to actual coordinate information of at least one second pixel point in one frame of third image, so as to obtain second coordinate information of at least one second pixel point, where the second coordinate information is coordinate information obtained after interpolation processing is performed on a corresponding pixel point, for each frame of third image in the at least one frame of third image; obtaining equivalent center coordinate information of at least one second pixel point according to the target reference multiplying power, the actual coordinate information of at least one second pixel point and at least one second weight coefficient, wherein the second weight coefficient is the proportion of the corresponding pixel value relative to the pixel value of a frame of third image; and determining image areas corresponding to M second pixel points in a frame of third image as a frame of fourth image, wherein the M second pixel points are pixel points of at least one second pixel point, the second coordinate information and the equivalent center coordinate information of which do not meet preset conditions, and M is a positive integer.
In a possible implementation manner, an image noise reduction processing apparatus provided in an embodiment of the present application further includes: the device comprises an acquisition module and a determination module. And the acquisition module is used for acquiring the pixel value of each pixel point in the second image and the pixel value of each pixel point in a sixth image before the processing module performs combined noise reduction processing on the second image according to at least one frame of fourth image to obtain a fifth image after the combined noise reduction processing, wherein the sixth image is an image which is a frame before the first image and is processed by the first processing mode. And the determining module is used for determining an image area corresponding to a pixel point with the same pixel value in the second image and the sixth image as a similarity image of the second image and the sixth image according to the pixel value of each pixel point in the second image and the pixel value of each pixel point in the sixth image which are obtained by the obtaining module. The processing module 71 is further configured to perform weighting processing on the second image, the sixth image, and the similarity image to obtain a seventh image after time domain noise reduction processing; the processing module 71 is specifically configured to perform, according to at least one frame of the fourth image, joint denoising processing on the time-domain denoising processed seventh image to obtain a fifth image after the joint denoising processing.
In a possible implementation manner, the processing module is specifically configured to add pixel point information of at least one frame of the fourth image to the second image based on a target processing manner to obtain a fifth image subjected to joint noise reduction processing, where the target processing manner is any one of the following: weighting, convolution, and high-pass filtering.
The embodiments of the present application provide an image noise reduction processing apparatus. Because the apparatus can adopt different processing modes for the currently acquired first image and the at least one frame of image acquired before the first image, different image information can be obtained from the first image and the at least one frame of third image; that is, the first image and the at least one frame of third image undergo different image processing, and the obtained second image and at least one frame of fourth image contain different image information. It can be understood that, by processing the preceding frame images and the current frame image in different ways, the image noise reduction processing apparatus can simulate pixel shift, so that image information is retained to the greatest extent over a continuous time period. Through the continuous multi-frame images, the electronic device can, on the one hand, stabilize the temporal information transition and, on the other hand, recover the image information of the acquired first image by using the information difference between frames. This avoids the low definition of the video image that results when the apparatus reduces the image size of the shot video in order to use a high-performance, low-power-consumption image algorithm for noise reduction, thereby improving the definition of the video image.
The image noise reduction processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in an electronic device. The device can be mobile electronic equipment or non-mobile electronic equipment. The Mobile electronic Device may be, for example, a Mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic Device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) Device, a robot, a wearable Device, an ultra-Mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not limited in particular.
The image noise reduction processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The image noise reduction processing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments in fig. 1 to fig. 3, and is not described here again to avoid repetition.
Optionally, as shown in fig. 5, an electronic device 90 is further provided in this embodiment of the present application, and includes a processor 91 and a memory 92, where the memory 92 stores a program or an instruction that can be executed on the processor 91, and when the program or the instruction is executed by the processor 91, the steps of the embodiment of the image noise reduction processing method are implemented, and the same technical effect can be achieved, and details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, and processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 110 is configured to process the acquired first image based on a first processing manner to obtain a second image; processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image in the at least one frame of fourth image respectively corresponds to one frame of third image; and performing combined denoising processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined denoising processing.
The embodiments of the present application provide an electronic device. Because the electronic device can adopt different processing modes for the currently acquired first image and the at least one frame of image acquired before the first image, different image information can be obtained from the first image and the at least one frame of third image; that is, the first image and the at least one frame of third image undergo different image processing, and the obtained second image and at least one frame of fourth image contain different image information.
Optionally, in this embodiment of the present application, the processor 110 is specifically configured to perform interpolation processing on at least one first pixel according to actual coordinate information of the at least one first pixel in the first image, to obtain first coordinate information of the at least one first pixel, where the first coordinate information is coordinate information obtained after interpolation processing is performed on a corresponding pixel; obtaining equivalent center coordinate information of at least one first pixel point according to the target reference multiplying power, the actual coordinate information of at least one first pixel point and at least one first weight coefficient, wherein the first weight coefficient is the proportion of the pixel value of the corresponding pixel point relative to the pixel value of the first image; and determining an image area corresponding to N first pixel points in the first image as a second image, wherein the N first pixel points are pixel points of which the first coordinate information and the equivalent center coordinate information in at least one first pixel point meet preset conditions, and N is a positive integer.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to, for each frame of the third image in the at least one frame of the third image, perform interpolation processing on the at least one second pixel according to actual coordinate information of the at least one second pixel in the one frame of the third image, to obtain second coordinate information of the at least one second pixel, where the second coordinate information is coordinate information obtained after interpolation processing is performed on a corresponding pixel; obtaining equivalent center coordinate information of at least one second pixel point according to the target reference multiplying power, the actual coordinate information of at least one second pixel point and at least one second weight coefficient, wherein the second weight coefficient is the proportion of the pixel value of the corresponding pixel point relative to the pixel value of a third image of one frame; and determining image areas corresponding to M second pixel points in a frame of third image as a frame of fourth image, wherein the M second pixel points are pixel points of at least one second pixel point, the second coordinate information and the equivalent center coordinate information of which do not meet preset conditions, and M is a positive integer.
Optionally, in this embodiment of the application, the processor 110 is further configured to, before performing joint noise reduction processing on the second image according to at least one frame of the fourth image to obtain a fifth image after the joint noise reduction processing, obtain a pixel value of each pixel point in the second image and a pixel value of each pixel point in a sixth image, where the sixth image is an image that is a previous frame of the first image and that is processed in the first processing manner; determining an image area corresponding to a pixel point with the same pixel value in the second image and the sixth image as a similarity image of the second image and the sixth image according to the pixel value of each pixel point in the second image and the pixel value of each pixel point in the sixth image; and weighting the second image, the sixth image and the similarity image to obtain a seventh image subjected to time domain noise reduction processing. The processor 110 is specifically configured to perform, according to the at least one frame of the fourth image, joint denoising on the seventh image after time-domain denoising processing, to obtain a fifth image after joint denoising processing.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to add pixel point information of at least one frame of the fourth image to the second image by using a target processing method to obtain a fifth image after the joint noise reduction processing, where the target processing method is any one of the following: weighting, convolution, and high-pass filtering.
The electronic device provided by the embodiment of the application can realize each process realized by the method embodiment, and can achieve the same technical effect, and for avoiding repetition, the details are not repeated here.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in further detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing a program or an instruction and a second storage area storing data, where the first storage area may store an operating system, an application program or an instruction required for at least one function (such as a sound playing function or an image playing function), and the like. Further, the memory 109 may include volatile memory or nonvolatile memory, or the memory 109 may include both volatile and nonvolatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, DDR SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 110 may include one or more processing units. Optionally, the processor 110 integrates an application processor and a modem processor, where the application processor primarily handles operations related to the operating system, user interface, and application programs, and the modem processor, such as a baseband processor, primarily handles wireless communication signals. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip.
The embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing embodiments of the image noise reduction processing method and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order, depending on the functionality involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods in the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (which may be a mobile phone, a computer, a server, or a network device) to execute the methods according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the scope of the invention as defined by the appended claims.

Claims (10)

1. An image noise reduction processing method, characterized by comprising:
processing the acquired first image based on a first processing mode to obtain a second image;
processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of fourth image corresponds to one frame of third image;
and performing combined noise reduction processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined noise reduction processing.
2. The method according to claim 1, wherein processing the acquired first image based on the first processing manner to obtain a second image comprises:
performing interpolation processing on at least one first pixel point according to actual coordinate information of the at least one first pixel point in the first image to obtain first coordinate information of the at least one first pixel point, wherein the first coordinate information is coordinate information of a corresponding pixel point after interpolation processing;
obtaining equivalent center coordinate information of the at least one first pixel point according to a target reference multiplying power, actual coordinate information of the at least one first pixel point and at least one first weight coefficient, wherein the first weight coefficient is the proportion of pixel values of corresponding pixel points relative to pixel values of the first image;
and determining image areas corresponding to N first pixel points in the first image as the second image, wherein the N first pixel points are pixel points of which the first coordinate information and the equivalent center coordinate information in the at least one first pixel point meet preset conditions, and N is a positive integer.
3. The method according to claim 1 or 2, wherein the processing the at least one frame of the third image based on the second processing manner to obtain at least one frame of the fourth image comprises:
for each frame of third image in the at least one frame of third image, performing interpolation processing on at least one second pixel point according to actual coordinate information of at least one second pixel point in one frame of third image to obtain second coordinate information of the at least one second pixel point, wherein the second coordinate information is coordinate information of a corresponding pixel point after interpolation processing;
obtaining equivalent center coordinate information of the at least one second pixel point according to the target reference multiplying power, the actual coordinate information of the at least one second pixel point and at least one second weight coefficient, wherein the second weight coefficient is the proportion of the pixel value of the corresponding pixel point to the pixel value of the third image of the frame;
and determining image areas corresponding to M second pixel points in the frame of third image as a frame of fourth image, wherein the M second pixel points are pixel points of which second coordinate information and equivalent center coordinate information in the at least one second pixel point do not meet preset conditions, and M is a positive integer.
4. The method according to claim 1, wherein before performing the joint noise reduction processing on the second image according to the at least one frame of the fourth image to obtain a fifth image after the joint noise reduction processing, the method further comprises:
acquiring a pixel value of each pixel point in the second image and a pixel value of each pixel point in a sixth image, wherein the sixth image is an image which is one frame before the first image and is processed by the first processing mode;
determining an image area corresponding to a pixel point with the same pixel value in the second image and the sixth image as a similarity image of the second image and the sixth image according to the pixel value of each pixel point in the second image and the pixel value of each pixel point in the sixth image;
weighting the second image, the sixth image and the similarity image to obtain a seventh image subjected to time domain noise reduction;
performing, according to the at least one frame of fourth image, joint denoising on the second image to obtain a fifth image after the joint denoising, including:
and performing combined noise reduction processing on the seventh image subjected to time domain noise reduction processing according to the at least one frame of fourth image to obtain the fifth image subjected to combined noise reduction processing.
5. The method according to claim 1 or 4, wherein the performing, according to the at least one frame of fourth image, a joint noise reduction process on the second image to obtain a fifth image after the joint noise reduction process includes:
adding pixel point information of the at least one frame of fourth image to the second image based on a target processing mode to obtain a fifth image subjected to combined noise reduction processing, wherein the target processing mode is any one of the following modes: weighting, convolution, and high-pass filtering.
6. An image noise reduction processing apparatus, characterized by comprising: a processing module;
the processing module is used for processing the acquired first image based on a first processing mode to obtain a second image; processing at least one frame of third image based on a second processing mode to obtain at least one frame of fourth image, wherein the at least one frame of third image is an image acquired before the first image, and each frame of the fourth image corresponds to one frame of the third image; and performing combined denoising processing on the second image according to the at least one frame of fourth image to obtain a fifth image after the combined denoising processing.
7. The apparatus according to claim 6, wherein the processing module is specifically configured to perform interpolation processing on at least one first pixel according to actual coordinate information of the at least one first pixel in the first image, so as to obtain first coordinate information of the at least one first pixel, where the first coordinate information is coordinate information obtained after interpolation processing is performed on a corresponding pixel; obtaining equivalent center coordinate information of the at least one first pixel point according to a target reference multiplying power, actual coordinate information of the at least one first pixel point and at least one first weight coefficient, wherein the first weight coefficient is the proportion of pixel values of corresponding pixel points relative to pixel values of the first image; and determining image areas corresponding to N first pixel points in the first image as the second image, wherein the N first pixel points are pixel points of which the first coordinate information and the equivalent center coordinate information in the at least one first pixel point meet preset conditions, and N is a positive integer.
8. The apparatus according to claim 6 or 7, wherein the processing module is specifically configured to, for each frame of the third image in the at least one frame of third image, perform interpolation processing on at least one second pixel point in one frame of third image according to actual coordinate information of the at least one second pixel point to obtain second coordinate information of the at least one second pixel point, where the second coordinate information is coordinate information obtained after interpolation processing is performed on a corresponding pixel point; obtaining equivalent center coordinate information of the at least one second pixel point according to the target reference multiplying power, the actual coordinate information of the at least one second pixel point and at least one second weight coefficient, wherein the second weight coefficient is the proportion of the pixel value of the corresponding pixel point to the pixel value of the third image of the frame; and determining image areas corresponding to M second pixel points in the frame of third image as a frame of fourth image, wherein the M second pixel points are pixel points of which second coordinate information and equivalent center coordinate information in the at least one second pixel point do not meet preset conditions, and M is a positive integer.
9. The apparatus according to claim 6, wherein the image noise reduction processing apparatus further comprises: the device comprises an acquisition module and a determination module;
the acquiring module is configured to, before the processing module performs joint noise reduction on the second image according to the at least one frame of fourth image to obtain a fifth image after the joint noise reduction, acquire a pixel value of each pixel in the second image and a pixel value of each pixel in a sixth image, where the sixth image is an image that is a previous frame of the first image and that is processed in the first processing manner;
the determining module is configured to determine, according to the pixel value of each pixel point in the second image and the pixel value of each pixel point in the sixth image obtained by the obtaining module, an image area corresponding to a pixel point with the same pixel value in the second image and the sixth image as a similarity image of the second image and the sixth image;
the processing module is further configured to perform weighting processing on the second image, the sixth image and the similarity image to obtain a seventh image subjected to time-domain noise reduction processing;
the processing module is specifically configured to perform, according to the at least one frame of fourth image, joint denoising on the seventh image subjected to time-domain denoising processing, so as to obtain the fifth image subjected to joint denoising processing.
10. The apparatus according to claim 6 or 9, wherein the processing module is specifically configured to add pixel point information of the at least one frame of fourth image to the second image based on a target processing mode, to obtain the fifth image after the joint noise reduction processing, wherein the target processing mode is any one of the following: weighting, convolution, and high-pass filtering.
CN202210752510.9A 2022-06-28 2022-06-28 Image noise reduction processing method and device Pending CN115239580A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210752510.9A CN115239580A (en) 2022-06-28 2022-06-28 Image noise reduction processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210752510.9A CN115239580A (en) 2022-06-28 2022-06-28 Image noise reduction processing method and device

Publications (1)

Publication Number Publication Date
CN115239580A true CN115239580A (en) 2022-10-25

Family

ID=83670758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210752510.9A Pending CN115239580A (en) 2022-06-28 2022-06-28 Image noise reduction processing method and device

Country Status (1)

Country Link
CN (1) CN115239580A (en)

Similar Documents

Publication Publication Date Title
US8861846B2 (en) Image processing apparatus, image processing method, and program for performing superimposition on raw image or full color image
CN113014801B (en) Video recording method, video recording device, electronic equipment and medium
WO2023056950A1 (en) Image processing method and electronic device
CN113794829A (en) Shooting method and device and electronic equipment
CN115103126A (en) Shooting preview method and device, electronic equipment and storage medium
CN111429371A (en) Image processing method and device and terminal equipment
CN107395983B (en) Image processing method, mobile terminal and computer readable storage medium
CN112508820A (en) Image processing method and device and electronic equipment
CN115239580A (en) Image noise reduction processing method and device
CN113393391B (en) Image enhancement method, image enhancement device, electronic apparatus, and storage medium
CN115439386A (en) Image fusion method and device, electronic equipment and storage medium
US11195247B1 (en) Camera motion aware local tone mapping
JP2009076984A (en) Image processor, image processing method, program, and recording medium
CN114785957A (en) Shooting method and device thereof
CN114339051A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN114298889A (en) Image processing circuit and image processing method
CN112672056A (en) Image processing method and device
CN114390188A (en) Image processing method and electronic equipment
CN112367470B (en) Image processing method and device and electronic equipment
CN116012262B (en) Image processing method, model training method and electronic equipment
CN113709372B (en) Image generation method and electronic device
CN115619642A (en) Model training method and device, electronic equipment and medium
CN115456882A (en) Image processing method, image processing apparatus, electronic device, and medium
CN116128844A (en) Image quality detection method, device, electronic equipment and medium
CN117541507A (en) Image data pair establishing method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination