CN113658065A - Image noise reduction method and device, computer readable medium and electronic equipment


Info

Publication number
CN113658065A
CN113658065A (application CN202110908689.8A)
Authority
CN
China
Prior art keywords
image
processing
brightness
noise reduction
alignment
Prior art date
Legal status
Granted
Application number
CN202110908689.8A
Other languages
Chinese (zh)
Other versions
CN113658065B (en)
Inventor
王振 (Wang Zhen)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110908689.8A
Publication of CN113658065A
Application granted
Publication of CN113658065B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides an image noise reduction method and device, a computer readable medium and electronic equipment, and relates to the technical field of image processing. The method comprises the following steps: acquiring a first image and at least one second image, wherein the first image is generated by exposing a target scene for a first exposure duration, the second image is generated by exposing the target scene for a second exposure duration, and the second exposure duration is longer than the first exposure duration; performing brightness alignment processing on the second image based on the first image to obtain a brightness-aligned second image; performing image content alignment processing on the brightness-aligned second image based on the first image to obtain a reference image; and performing noise reduction processing on the first image according to the reference image to obtain a noise-reduced output image. The method and device can improve the noise reduction effect while reducing the complexity of the noise reduction algorithm.

Description

Image noise reduction method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image noise reduction method, an image noise reduction device, a computer readable medium, and an electronic device.
Background
With the continuous improvement of people's living standards, videos and images shot in daily life are, owing to the limitations of shooting conditions, often degraded by noise, which lowers their quality and harms their visual effect. Denoising videos and images is therefore essential to improving their quality.
At present, most noise reduction algorithms have difficulty distinguishing fine textures from noise during image processing. To remove the noise, texture sharpness is almost inevitably sacrificed, so only a compromise can be struck between the noise reduction effect and texture sharpness. As a result, such noise reduction algorithms have high complexity, low computational efficiency, and a limited noise reduction effect.
Disclosure of Invention
The present disclosure aims to provide an image noise reduction method, an image noise reduction device, a computer readable medium and an electronic device, so as to overcome, at least to a certain extent, the problems in the related art of high algorithm complexity, low computational efficiency and poor noise reduction effect.
According to a first aspect of the present disclosure, there is provided an image noise reduction method, comprising:
acquiring a first image and at least one second image, wherein the first image is generated by exposing a target scene for a first exposure duration, the second image is generated by exposing the target scene for a second exposure duration, and the second exposure duration is longer than the first exposure duration;
performing brightness alignment processing on the second image based on the first image to obtain a second image with aligned brightness;
performing image content alignment processing on the second image after the brightness alignment based on the first image to obtain a reference image;
and carrying out noise reduction processing on the first image according to the reference image to obtain a noise-reduced output image.
According to a second aspect of the present disclosure, there is provided an image noise reduction device comprising:
the image acquisition module is used for acquiring a first image and at least one second image, wherein the first image is generated by carrying out exposure processing on a target scene for a first exposure duration, the second image is generated by carrying out exposure processing on the target scene for a second exposure duration, and the second exposure duration is greater than the first exposure duration;
the brightness alignment module is used for carrying out brightness alignment processing on the second image based on the first image to obtain a second image with aligned brightness;
the image content alignment module is used for carrying out image content alignment processing on the second image after the brightness alignment based on the first image to obtain a reference image;
and the image denoising module is used for denoising the first image according to the reference image to obtain a denoised output image.
According to a third aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the above-described method.
The image noise reduction method provided by an embodiment of the present disclosure acquires a first image, generated by exposing a target scene for a first exposure duration, and at least one second image, generated by exposing the target scene for a second exposure duration. Brightness alignment processing is first performed on the second image based on the first image to obtain a brightness-aligned second image; image content alignment processing is then performed on the brightness-aligned second image based on the first image to obtain a reference image; finally, noise reduction processing is performed on the first image according to the reference image to obtain a noise-reduced output image. On one hand, the long-exposure image is brightness-aligned and content-aligned with the short-exposure image to obtain the reference image, and the short-exposure image is denoised according to the reference image, so the noise reduction process has low processing complexity and a small amount of calculation. On the other hand, the reference image is generated from the long-exposure image, which has less noise and clearer texture, and denoising the short-exposure image according to it effectively improves the noise reduction effect, preserves the texture sharpness of the output image, and improves the quality of the output image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a method of image denoising in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart for luminance alignment of a second image in an exemplary embodiment of the disclosure;
FIG. 5 schematically illustrates another flow chart for luminance alignment of a second image in an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a flow chart for image content alignment of a second image in an exemplary embodiment of the disclosure;
fig. 7 schematically shows a composition diagram of an image noise reduction apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an image denoising method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having an image processing function, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The image noise reduction method provided by the embodiment of the present disclosure is generally executed by the terminal devices 101, 102, and 103, and accordingly, the image noise reduction apparatus is generally disposed in the terminal devices 101, 102, and 103. However, it is easily understood by those skilled in the art that the image denoising method provided in the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the image denoising device may also be disposed in the server 105, which is not particularly limited in the exemplary embodiment. For example, in an exemplary embodiment, a user may acquire a first image and a second image corresponding to a target scene through the terminal devices 101, 102, and 103, and upload the first image and the second image to the server 105, and after the server generates an output image through the image denoising method provided by the embodiment of the present disclosure, the server transmits the output image to the terminal devices 101, 102, and 103, and so on.
An exemplary embodiment of the present disclosure provides an electronic device for implementing an image noise reduction method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image noise reduction method via execution of the executable instructions.
The following takes the mobile terminal 200 in fig. 2 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 2 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also interface differently than shown in fig. 2, or a combination of multiple interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. Wherein the sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 2803. The gyro sensor 2803 can be used to photograph anti-shake, navigation, body-feel game scenes, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
At present, digital images generated by electronic photographing devices contain noise. One source of this noise is the photoelectric conversion process of the image sensor: the darker the environment, the greater the image noise, because less light enters the sensor and the signal-to-noise ratio of the image drops. For a typical photographing apparatus, an image captured at night is noisier than one captured in the daytime. Extending the exposure duration in a night scene can reduce image noise, but the exposure duration has certain limitations: if it is too long, the frame rate drops and highlight areas in the image are overexposed, so software and hardware algorithms are also needed to suppress noise. Moreover, most image-capturing devices have several cameras with different functions, such as a telephoto camera or a macro camera, but the images these cameras capture are denoised independently, so the images captured by each camera must be denoised separately.
In the related art, most software and hardware noise reduction algorithms have difficulty distinguishing fine textures from noise during image processing; to remove the noise, texture sharpness is almost inevitably sacrificed, so only a compromise can be struck between the noise reduction effect and texture sharpness. Generally speaking, such noise reduction algorithms have high complexity and limited effect. For products with multiple cameras, the complexity is higher still if the images captured by the individual cameras are denoised separately.
The following describes an image denoising method according to an exemplary embodiment of the present disclosure in detail by taking an example including a terminal device.
Fig. 3 shows a flowchart of an image denoising method in the present exemplary embodiment, including the following steps S310 to S340:
in step S310, a first image and at least one second image are obtained, where the first image is generated by performing exposure processing on a target scene for a first exposure duration, the second image is generated by performing exposure processing on the target scene for a second exposure duration, and the second exposure duration is greater than the first exposure duration.
In an exemplary embodiment, the target scene is a scene shot by a camera on the terminal device, the first image is generated by exposing the target scene for a first exposure duration by the camera of the terminal device, and the second image is generated by exposing the target scene for a second exposure duration by the camera of the terminal device.
The terminal device may include at least two cameras, for example, the terminal device may include a main camera and at least one camera with a long exposure function except for the main camera, and at this time, a target scene may be shot through the terminal device, so that a first image with a first exposure duration corresponding to the target scene, that is, a short exposure image, and a second image with at least one second exposure duration corresponding to the target scene, that is, a long exposure image, may be obtained simultaneously.
It should be noted that the short-exposure image in the present exemplary embodiment may be a normal image captured with normal shooting parameters (i.e., the camera's default parameters); it is "short" relative to the long-exposure image, and is not necessarily an image captured with a specially set short exposure duration.
Of course, in some specific scenarios, such as when the terminal device is set at a fixed position to shoot still content, the terminal device may also include only one camera: a first image corresponding to the scene, i.e., a short-exposure image, is captured by the camera, then one or more exposure duration parameters greater than that used when capturing the first image are set, and at least one second image corresponding to the scene, i.e., a long-exposure image, is captured again. The present exemplary embodiment sets no particular limitation on the manner of acquiring the first image and the second image.
For cameras with the same specification, the darker the light, the higher the noise of the image, and the longer the exposure time, the lower the noise.
In step S320, a brightness alignment process is performed on the second image based on the first image, so as to obtain a brightness-aligned second image.
In an exemplary embodiment, brightness alignment refers to the process of adjusting the brightness distribution of the second image to be consistent with that of the first image. For example, the brightness of the second image may be aligned according to the ratio of the exposure durations of the first image and the second image, or aligned with the brightness of the first image by a histogram matching method; of course, brightness alignment may also be another process capable of aligning the brightness of the two images, which is not limited in this exemplary embodiment.
In the same target scene, the textures of the first image and the second image are relatively similar, but the two images differ greatly in brightness; brightness alignment is therefore required to correct the brightness of the second image to be consistent with that of the first image, improving the quality of the generated reference image.
In step S330, the second image after the brightness alignment is subjected to image content alignment processing based on the first image, so as to obtain a reference image.
In an exemplary embodiment, image content alignment refers to the process of aligning the textures or contents of the first image and the second image. For example, the two images may be aligned through Scale-Invariant Feature Transform (SIFT)-based image alignment, or through the Oriented FAST and Rotated BRIEF (ORB) feature extraction and description algorithm; of course, image content alignment may also be another process capable of aligning the contents of the two images, which is not particularly limited in this exemplary embodiment.
Even when the first image and the second image are brightness-aligned, they still differ somewhat in image texture. The differences mainly come from two aspects: first, the positions of the two cameras differ to some extent (or a terminal device at a fixed position shakes during shooting), so the acquired images differ by a certain angle; second, when moving objects are present in the images, their positions and postures differ between images with different exposure durations. Image content alignment is therefore needed, and it further ensures the accuracy and quality of the reference image.
In step S340, a noise reduction process is performed on the first image according to the reference image, so as to obtain an output image after noise reduction.
In an exemplary embodiment, the output image is the noise-reduced first image obtained by performing noise reduction processing on the first image according to the reference image; it serves as the final noise reduction result displayed to the user.
The noise reduction processing refers to a processing procedure of fusing the content in the reference image into the first image to realize noise reduction, for example, the noise reduction processing may be a processing procedure of performing guided filtering on the first image by using the reference image as a guide map, or a processing procedure of performing noise reduction on the first image by fusing the reference image and the first image according to a ratio, of course, the noise reduction processing may also be other processing procedures capable of performing noise reduction on the first image by combining the reference image, which is not particularly limited in this example embodiment.
Next, step S310 to step S340 will be described in detail.
In an exemplary embodiment, when the first image is subjected to noise reduction processing according to the reference image, the reference image may be used as a guide graph to perform guide filtering on the first image, so as to obtain an output image after noise reduction.
Guided filtering (Guided Filter) is an image filtering technique: the image to be processed (i.e., the first image) is filtered through a guide map (i.e., the reference image), so that the final output image is similar to the first image as a whole while its texture resembles that of the reference image. This effectively improves the texture sharpness of the first image while keeping its overall content unchanged, producing an output image with a good noise reduction effect; at the same time, the guided filter has low complexity and high computational efficiency.
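As an illustration of this step, the following is a minimal sketch of guided filtering with the reference image as the guide map, assuming OpenCV's ximgproc contrib module (opencv-contrib-python) is available; the function name denoise_with_guide and the radius/eps values are illustrative choices, not values specified by this disclosure.

```python
import cv2
import numpy as np

def denoise_with_guide(first_image: np.ndarray, reference_image: np.ndarray,
                       radius: int = 8, eps: float = 1e-3) -> np.ndarray:
    """Guided-filter the short-exposure image using the reference image as guide."""
    src = first_image.astype(np.float32) / 255.0
    guide = reference_image.astype(np.float32) / 255.0
    # The output stays close to `src` overall while adopting the low-noise
    # edge/texture structure of `guide`.
    out = cv2.ximgproc.guidedFilter(guide=guide, src=src, radius=radius, eps=eps)
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)
```

A larger radius smooths more aggressively, while eps controls how strictly the output follows the guide's edge structure.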
Further, after the first image is subjected to the guided filtering, in order to further ensure the quality of the output image, post-processing may be performed on the guided filtered first image to obtain a noise-reduced output image, where the post-processing may include detail restoration processing and/or artifact removal processing.
Detail restoration processing refers to image restoration for an image with missing details. For example, it may estimate the missing detail information based on deep learning, extracting effective semantic information from the image and generating a new image (for example, producing an output image with completed details through a generative adversarial network); or, based on image patches, it may reconstruct the missing area by searching for a target area whose structure is similar to the detail-missing area and copying the corresponding patch; of course, other processes capable of restoring image details may also be used, and this exemplary embodiment is not particularly limited thereto.
Artifact removal processing refers to removing artifacts from the image after the noise reduction processing. Although the image contents of the first image and the second image are aligned, texture details of some image areas may remain incompletely aligned; in this case, denoising the first image according to the reference image may superimpose unaligned image texture onto the final output image, producing artifacts. Specifically, artifacts in the output image may be eliminated through edge-preserving filtering, or through a convolutional neural network; of course, artifact removal may also be performed in other ways capable of eliminating artifacts in the output image, which is not particularly limited in this example embodiment.
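A minimal sketch of the edge-preserving-filtering variant of artifact suppression is given below; cv2.edgePreservingFilter is an existing OpenCV function, but blending its result with the denoised image at a fixed strength is an illustrative assumption, not a step mandated by the text.

```python
import cv2
import numpy as np

def suppress_artifacts(denoised: np.ndarray, strength: float = 0.5) -> np.ndarray:
    # Edge-preserving smoothing attenuates ghost edges left by residual
    # misalignment while keeping genuine strong edges intact.
    smoothed = cv2.edgePreservingFilter(denoised, flags=cv2.RECURS_FILTER,
                                        sigma_s=60, sigma_r=0.4)
    out = (strength * smoothed.astype(np.float32)
           + (1.0 - strength) * denoised.astype(np.float32))
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```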
In an exemplary embodiment, in addition to performing guided filtering on the first image by using the reference image as a guide graph to obtain a noise-reduced output image, the noise-reduced output image may also be obtained by performing alignment and superposition processing on the reference image and the first image.
If the size of the reference image is consistent with that of the first image, the reference image and the first image can be aligned in a coordinate system, and pixel values of the same position in the reference image and the first image are directly linearly superposed to obtain an output image subjected to noise reduction.
In another exemplary embodiment, in addition to performing guided filtering on the first image by using the reference image as a guide graph to obtain the noise-reduced output image, a fusion ratio between the reference image and the first image may be calculated, and the reference image and the first image are subjected to weighted fusion according to the fusion ratio to obtain the noise-reduced output image.
The reference image and the first image may be divided into image regions. For example, the regions may be divided according to the scene type of the target scene corresponding to the reference image and the first image; scene types may include a face scene, a night scene, a sky scene, a beach scene, a cloud scene, a plant scene, a bird scene, and the like. Of course, these scene types are only schematic illustrations: in an actual application scene they are defined by the developer, and this exemplary embodiment places no limitation on them. The region division may also be performed through the brightness distribution maps of the reference image and the first image, or through their texture complexity distribution maps, which is not particularly limited in this exemplary embodiment.
Specifically, after the image regions of the reference image and the first image are obtained by division, different fusion ratios may be determined for different image regions. For example, if the regions are divided according to scene type, the fusion ratio corresponding to the scene type may be obtained to realize weighted fusion of the reference image and the first image. If the regions are divided according to the brightness distribution map, the fusion ratio is determined from the brightness difference of the same image region in the two images, and weighted fusion is performed according to the determined ratio. If the regions are divided according to the texture complexity distribution map, the fusion ratio can be determined from the texture complexity in each region: a region with lower texture complexity may use a lower reference-image fusion ratio, and a region with higher texture complexity may use a higher reference-image fusion ratio, as sketched below. Of course, the fusion-ratio determination and region-division manners in this exemplary embodiment are only schematic illustrations and should not impose any special limitation on this exemplary embodiment.
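The following sketch illustrates the texture-complexity variant of the weighted fusion, assuming 8-bit BGR inputs of equal size; the gradient-based complexity map and the [0.2, 0.8] ratio range are illustrative assumptions rather than values from the disclosure.

```python
import cv2
import numpy as np

def fuse_by_texture(first_image: np.ndarray, reference_image: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(reference_image, cv2.COLOR_BGR2GRAY)
    # Smoothed local gradient magnitude as a crude texture-complexity map.
    grad = np.abs(cv2.Laplacian(gray, cv2.CV_32F))
    complexity = cv2.GaussianBlur(grad, (0, 0), sigmaX=5)
    # Higher texture complexity -> higher reference-image fusion ratio.
    w = 0.2 + 0.6 * complexity / (complexity.max() + 1e-6)
    w = w[..., None]  # broadcast the ratio over the color channels
    fused = (w * reference_image.astype(np.float32)
             + (1.0 - w) * first_image.astype(np.float32))
    return np.clip(fused, 0.0, 255.0).astype(np.uint8)
```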
In an exemplary embodiment, step S320 may include steps S410 to S420, and specifically, the performing of the luminance alignment process on the second image based on the first image may be implemented by the steps in fig. 4, and as shown in fig. 4, the performing of the luminance alignment process on the second image may include:
step S410, if the first image and the second image are not subjected to nonlinear transformation processing, acquiring a first exposure duration of the first image and acquiring a second exposure duration of the second image;
step S420, correcting the brightness of the second image according to the ratio of the first exposure time to the second exposure time, to obtain a second image with aligned brightness.
Nonlinear transformation processing refers to the process by which the image sensor nonlinearly converts the collected optical-signal power values into the RGB values observed by human eyes. For example, it may be the processing corresponding to Gamma correction: Gamma derives from the response curve of a CRT display, i.e., the nonlinear relationship between brightness and input voltage. The RGB values are not a simple linear function of the optical signal but a power function of it; the exponent of this function is called the Gamma value, typically 2.2, and the process of correcting for it is called Gamma correction.
The first exposure duration refers to an exposure duration parameter adopted when a first image is shot and collected, and the second exposure duration refers to an exposure duration parameter adopted when a second image is shot and collected.
Specifically, if cameras of the same specification are used to acquire the first image and the second image, and the output first and second images have not undergone nonlinear transformation such as Gamma correction, the brightness of the second image can be corrected directly using the ratio of the exposure durations.
For example, the luminance value of the second image may be calculated by the relation (1):
Img3=Img2*(d1/d2)*δ (1)
wherein Img3 may represent the pixel values of the second image after brightness alignment, Img2 may represent the pixel values of the second image, d1 may represent the first exposure duration of the first image, d2 may represent the second exposure duration of the second image, and δ may represent a correction factor for eliminating the nonlinear relationship between image brightness and exposure duration; δ is a parameter of the image sensor itself and is a priori data.
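A minimal sketch of relation (1) follows, assuming linear (pre-Gamma) 8-bit pixel data; the default δ of 1.0 is only a placeholder, since δ is a priori data of the specific sensor.

```python
import numpy as np

def align_brightness_linear(img2: np.ndarray, d1: float, d2: float,
                            delta: float = 1.0) -> np.ndarray:
    """Scale the long-exposure image by the exposure-duration ratio d1/d2."""
    img3 = img2.astype(np.float32) * (d1 / d2) * delta
    return np.clip(img3, 0.0, 255.0).astype(np.uint8)

# Example: align a 40 ms long exposure to an 8 ms short exposure.
# aligned = align_brightness_linear(img2, d1=8.0, d2=40.0)
```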
In an exemplary embodiment, if the first image and the second image are both subjected to the non-linear transformation processing, the histogram matching is performed on the second image based on the first image, so as to obtain a second image after brightness alignment.
Histogram matching (also called histogram specification) refers to an image enhancement method in which the histogram of one image is transformed into a histogram of a specified shape, i.e., the histogram of an image or a region is matched to that of another image so that the tones of the two images stay consistent. The histogram matching method can be applied to single-band and multi-band images alike; before the two images are compared, their histograms are made consistent. Histograms have the following characteristics: a specific image has a unique histogram, but two images having the same histogram are not necessarily the same image; the histogram of a particular object in an image is shift-invariant; and the histogram of a particular object in an image is rotation-invariant.
Specifically, step S320 may include step S510 to step S530, and specifically, the histogram matching of the second image based on the first image may be implemented by the steps in fig. 5, and as shown in fig. 5, the histogram matching may include:
step S510, if the first image and the second image are both subjected to nonlinear transformation processing, calculating cumulative probability density functions of the first image and the second image, respectively;
step S520, histogram equalization is carried out on the cumulative probability density function, and a transformation mapping relation is determined;
and step S530, performing histogram matching on the second image based on the transformation mapping relation to obtain a second image with aligned brightness.
The principle of histogram matching is to perform histogram equalization on the histograms of the two images so that both become the same normalized uniform histogram; the uniform histogram acts as an intermediary, after which the inverse of the equalization is applied to the first image. Histogram equalization is thus the bridge for histogram matching.
For example, suppose P_r(r) and P_z(z) are the probability density functions of the first image and the second image, respectively, where r and z represent the gray levels of the first image and the second image. Histogram equalization is first performed on the first image, i.e., the gray level S is found, which can be calculated by relation (2):

S = T(r) = ∫₀ʳ P_r(w) dw (2)

Then histogram equalization is performed on the second image, i.e., the gray level V is obtained, which can be calculated by relation (3):

V = G(z) = ∫₀ᶻ P_z(w) dw (3)

The inverse transformation Z of the gray level V is then calculated, which can be expressed as relation (4):

Z = G⁻¹(V) (4)

Since the equalization has been performed, the probability density function P_S(S) of the processed first image and the probability density function P_V(V) of the second image are equal, so the gray level S of the transformed first image can replace the gray level V in relation (4), which yields the transformation mapping relation (5):

Z = G⁻¹(S) (5)

The gray level Z in relation (5) is the gray level of the second image; the brightness alignment of the second image is realized through this transformation mapping relation, obtaining the brightness-aligned second image.
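A numpy-only sketch of relations (2)-(5) for 8-bit images is shown below: both cumulative distributions are computed, and the inverse mapping Z = G⁻¹(S) is approximated per channel by interpolation. This is an illustrative implementation, not code from the disclosure.

```python
import numpy as np

def match_histogram(second: np.ndarray, first: np.ndarray) -> np.ndarray:
    """Align the brightness of `second` to `first` per channel (uint8 images)."""
    out = np.empty_like(second)
    for c in range(second.shape[2]):
        src = second[..., c].ravel()
        ref = first[..., c].ravel()
        # Cumulative distributions corresponding to relations (2) and (3).
        src_cdf = np.cumsum(np.bincount(src, minlength=256)) / src.size
        ref_cdf = np.cumsum(np.bincount(ref, minlength=256)) / ref.size
        # Inverse mapping Z = G^-1(S): for each source gray level, pick the
        # reference gray level whose cumulative value matches.
        lut = np.round(np.interp(src_cdf, ref_cdf, np.arange(256)))
        out[..., c] = lut[src].reshape(second.shape[:2]).astype(np.uint8)
    return out
```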
In an exemplary embodiment, step S330 may include step S610 to step S620, and specifically, the performing, by the steps in fig. 6, the image content alignment process on the second image after brightness alignment based on the first image may specifically include:
step S610, respectively extracting feature points in the first image and the second image;
step S620, carrying out feature point matching on the feature points, and determining a transformation matrix between the first image and the second image;
step S630, performing image content alignment processing on the second image after brightness alignment according to the transformation matrix to obtain a reference image.
The feature points refer to key points in an image together with their corresponding feature descriptors. For example, the feature points may be SIFT feature points, comprising the key points of local features detected in the image, their SIFT descriptors, and the orientations of the key points; the feature points may also be ORB feature points, comprising FAST key points detected in the image, their BRIEF descriptors, and the orientations of the key points. Of course, the feature points in this exemplary embodiment may also be other types of feature points capable of supporting image content alignment, which is not particularly limited in this exemplary embodiment.
The transformation matrix, computed from the matched feature points, is the matrix that warps an image so that its content is consistent with that of the image it is aligned to. The brightness-aligned second image is deformed through the transformation matrix to realize image content alignment, obtaining the reference image used for the noise reduction processing.
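The sketch below illustrates steps S610-S630 with ORB feature points and a homography as the transformation matrix, using the OpenCV API; the feature count and RANSAC threshold are illustrative, and the degenerate case of too few matches is not handled.

```python
import cv2
import numpy as np

def align_content(first: np.ndarray, second_aligned: np.ndarray) -> np.ndarray:
    """Warp the brightness-aligned second image onto the first image's geometry."""
    orb = cv2.ORB_create(nfeatures=2000)
    g1 = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(second_aligned, cv2.COLOR_BGR2GRAY)
    kp1, des1 = orb.detectAndCompute(g1, None)   # S610: feature extraction
    kp2, des2 = orb.detectAndCompute(g2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)[:200]
    src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # S620: robustly estimate the transformation matrix with RANSAC.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    # S630: warp the second image to align its content with the first.
    h, w = first.shape[:2]
    return cv2.warpPerspective(second_aligned, H, (w, h))
```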
In this embodiment of the present disclosure, noise reduction is performed using two images obtained by exposure processing of different durations through two cameras (or one camera), which avoids the heavy computation of conventional complex noise reduction algorithms. The reference image obtained from the second image (i.e., the long-exposure image) has low noise and clear texture, so when it serves as the reference for denoising the first image (i.e., the short-exposure image) it makes the flat regions and texture regions of the image easier to distinguish, and the noise reduction is therefore more accurate. The brightness-aligned and content-aligned reference image can also support accurate detail restoration, dynamic enhancement and other processing of the first image, which helps improve the overall effect of the image.
To sum up, in the exemplary embodiment, a first image generated by short-exposure processing of a target scene and at least one second image generated by long-exposure processing of the target scene are acquired. Brightness alignment processing is first performed on the second image based on the first image to obtain a brightness-aligned second image; image content alignment processing is then performed on the brightness-aligned second image based on the first image to obtain a reference image; finally, noise reduction processing is performed on the first image according to the reference image to obtain a noise-reduced output image. On one hand, the long-exposure image is brightness-aligned and content-aligned with the short-exposure image to obtain the reference image, and the short-exposure image is denoised according to the reference image, so the noise reduction process has low processing complexity and a small amount of calculation. On the other hand, the reference image is generated from the long-exposure image, which has less noise and clearer texture, and denoising the short-exposure image according to it effectively improves the noise reduction effect, preserves the texture sharpness of the output image, and improves the quality of the output image.
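Tying the steps together, a minimal end-to-end driver under the assumptions of the earlier sketches (their helper functions and 8-bit BGR inputs) might look as follows; it is an illustrative composition of steps S310-S340, not the disclosure's reference implementation.

```python
def denoise_pipeline(first, second, d1_ms, d2_ms, gamma_corrected=True):
    # Step S320: brightness alignment of the long-exposure second image.
    if gamma_corrected:
        second = match_histogram(second, first)                 # nonlinear data
    else:
        second = align_brightness_linear(second, d1_ms, d2_ms)  # linear data
    # Step S330: image content alignment -> reference image.
    reference = align_content(first, second)
    # Step S340: guided filtering of the short-exposure first image.
    return denoise_with_guide(first, reference)
```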
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 7, an image denoising apparatus 700 is further provided in this example embodiment, and may include an image obtaining module 710, a brightness alignment module 720, an image content alignment module 730, and an image denoising module 740. Wherein:
the image obtaining module 710 is configured to obtain a first image and at least one second image, where the first image is generated by performing exposure processing on a target scene for a first exposure duration, the second image is generated by performing exposure processing on the target scene for a second exposure duration, and the second exposure duration is greater than the first exposure duration;
the brightness alignment module 720 is configured to perform brightness alignment processing on the second image based on the first image to obtain a second image with aligned brightness;
the image content alignment module 730 is configured to perform image content alignment processing on the second image after the brightness alignment based on the first image to obtain a reference image;
the image denoising module 740 is configured to perform denoising processing on the first image according to the reference image to obtain a denoised output image.
In an exemplary embodiment, the image denoising module 740 may be configured to:
and performing guiding filtering on the first image by taking the reference image as a guiding graph to obtain an output image subjected to noise reduction.
In an exemplary embodiment, the image denoising apparatus 700 may further include a post-processing module, which may be configured to:
and performing post-processing on the first image after the guide filtering to obtain an output image after noise reduction, wherein the post-processing comprises detail recovery processing and/or artifact elimination processing.
In an exemplary embodiment, the image denoising module 740 may be further configured to:
aligning and overlapping the reference image and the first image to obtain an output image subjected to noise reduction; or
calculating the fusion ratio between the reference image and the first image, and performing weighted fusion of the reference image and the first image according to the fusion ratio to obtain a noise-reduced output image.
In an exemplary embodiment, the brightness alignment module 720 may be configured to:
if the first image and the second image are not subjected to nonlinear transformation processing, acquiring a first exposure duration of the first image and a second exposure duration of the second image;
and correcting the brightness of the second image according to the ratio of the first exposure time length to the second exposure time length to obtain a second image with aligned brightness.
In an exemplary embodiment, the brightness alignment module 720 may further be configured to:
and if the first image and the second image are subjected to nonlinear transformation processing, performing histogram matching on the second image based on the first image to obtain a second image with aligned brightness.
In an exemplary embodiment, the image content alignment module 730 may be configured to:
respectively extracting feature points in the first image and the second image;
performing feature point matching on the feature points, and determining a transformation matrix between the first image and the second image;
and carrying out image content alignment processing on the second image after the brightness alignment according to the transformation matrix to obtain a reference image.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module" or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 to 6 may be performed.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image noise reduction method, comprising:
acquiring a first image and at least one second image, wherein the first image is generated by exposing a target scene for a first exposure duration, the second image is generated by exposing the target scene for a second exposure duration, and the second exposure duration is longer than the first exposure duration;
performing brightness alignment processing on the second image based on the first image to obtain a second image with aligned brightness;
performing image content alignment processing on the second image after the brightness alignment based on the first image to obtain a reference image;
and carrying out noise reduction processing on the first image according to the reference image to obtain a noise-reduced output image.
2. The method of claim 1, wherein performing noise reduction processing on the first image according to the reference image to obtain the noise-reduced output image comprises:
performing guided filtering on the first image by using the reference image as the guide image to obtain the noise-reduced output image.
3. The method of claim 2, further comprising:
performing post-processing on the guided-filtered first image to obtain the noise-reduced output image, wherein the post-processing comprises detail recovery processing and/or artifact elimination processing.
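(Illustrative note, not part of the claimed subject matter.) A minimal sketch of how the guided-filtering step of claim 2 might look, assuming 8-bit images and OpenCV's contrib module cv2.ximgproc; the function name and the radius and eps values are illustrative choices, not taken from the disclosure:

```python
import cv2
import numpy as np

def guided_denoise(first_image, reference, radius=8, eps=1e-3):
    # Work in [0, 1] floats so eps is independent of the 8-bit scale.
    first = first_image.astype(np.float32) / 255.0
    guide = reference.astype(np.float32) / 255.0
    # The low-noise, content-aligned reference serves as the guide image:
    # edge structure comes from the guide while the noise in the
    # short-exposure frame is smoothed away.
    out = cv2.ximgproc.guidedFilter(guide, first, radius, eps)
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)
```

The post-processing of claim 3 (detail recovery and/or artifact elimination) would then operate on the returned image.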
4. The method of claim 1, wherein performing noise reduction processing on the first image according to the reference image to obtain the noise-reduced output image comprises:
aligning and superimposing the reference image and the first image to obtain the noise-reduced output image; or
calculating a fusion ratio between the reference image and the first image, and performing weighted fusion of the reference image and the first image according to the fusion ratio to obtain the noise-reduced output image.
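(Illustrative note, not part of the claimed subject matter.) A sketch of the weighted-fusion branch of claim 4, assuming 8-bit images; the disclosure leaves open how the fusion ratio is computed, so the fixed default below is a placeholder:

```python
import numpy as np

def weighted_fuse(reference, first_image, fusion_ratio=0.7):
    # fusion_ratio weights the cleaner reference image; the remainder
    # weights the noisy short-exposure first image.
    fused = (fusion_ratio * reference.astype(np.float32)
             + (1.0 - fusion_ratio) * first_image.astype(np.float32))
    return np.clip(fused, 0.0, 255.0).astype(np.uint8)
```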
5. The method of claim 1, wherein performing the brightness alignment processing on the second image based on the first image to obtain the brightness-aligned second image comprises:
if the first image and the second image have not been subjected to nonlinear transformation processing, acquiring the first exposure duration of the first image and the second exposure duration of the second image;
and correcting the brightness of the second image according to the ratio of the first exposure duration to the second exposure duration to obtain the brightness-aligned second image.
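(Illustrative note, not part of the claimed subject matter.) The correction in claim 5 presupposes linear pixel data: with no nonlinear transformation applied, brightness scales with exposure time, so multiplying the long-exposure frame by t1/t2 aligns it with the short-exposure frame. A sketch assuming 8-bit linear data (for RAW data the clip bound would be the sensor's white level rather than 255):

```python
import numpy as np

def align_brightness_linear(second_image, first_exposure, second_exposure):
    # t1 / t2 < 1 because the second exposure is the longer one, so the
    # long-exposure frame is darkened to the first frame's brightness.
    ratio = first_exposure / second_exposure
    scaled = second_image.astype(np.float32) * ratio
    return np.clip(scaled, 0.0, 255.0).astype(np.uint8)
```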
6. The method according to claim 1 or 5, further comprising:
if the first image and the second image have been subjected to nonlinear transformation processing, performing histogram matching on the second image based on the first image to obtain the brightness-aligned second image.
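(Illustrative note, not part of the claimed subject matter.) Once a nonlinear transformation such as gamma correction has been applied, a single exposure ratio no longer relates the two brightnesses, which is why claim 6 falls back to histogram matching. A sketch using scikit-image (version 0.19 or later for the channel_axis argument):

```python
from skimage.exposure import match_histograms

def align_brightness_nonlinear(second_image, first_image):
    # Remap the second image's per-channel intensity distribution onto
    # the first image's, aligning brightness without an exposure ratio.
    matched = match_histograms(second_image, first_image, channel_axis=-1)
    return matched.astype(second_image.dtype)
```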
7. The method according to claim 1, wherein performing image content alignment processing on the brightness-aligned second image based on the first image to obtain a reference image comprises:
extracting feature points from the first image and the second image, respectively;
performing feature point matching on the extracted feature points, and determining a transformation matrix between the first image and the second image;
and performing image content alignment processing on the brightness-aligned second image according to the transformation matrix to obtain the reference image.
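(Illustrative note, not part of the claimed subject matter.) A sketch of the three steps of claim 7 using OpenCV; the disclosure does not mandate a particular feature detector or transformation model, so ORB features and a RANSAC-fitted homography are assumptions here:

```python
import cv2
import numpy as np

def build_reference(second_aligned, first_image):
    # Step 1: extract feature points in both images.
    gray1 = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(second_aligned, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(gray1, None)
    kp2, des2 = orb.detectAndCompute(gray2, None)
    # Step 2: match feature points and estimate the transformation
    # matrix between the two images.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)
    src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    transform, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Step 3: warp the brightness-aligned second image into the first
    # image's coordinates; the warped result is the reference image.
    h, w = first_image.shape[:2]
    return cv2.warpPerspective(second_aligned, transform, (w, h))
```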
8. An image noise reduction apparatus, comprising:
the image acquisition module is used for acquiring a first image and at least one second image, wherein the first image is generated by carrying out exposure processing on a target scene for a first exposure duration, the second image is generated by carrying out exposure processing on the target scene for a second exposure duration, and the second exposure duration is longer than the first exposure duration;
the brightness alignment module is used for carrying out brightness alignment processing on the second image based on the first image to obtain a second image with aligned brightness;
the image content alignment module is used for carrying out image content alignment processing on the second image after the brightness alignment based on the first image to obtain a reference image;
and the image noise reduction module is used for performing noise reduction processing on the first image according to the reference image to obtain a noise-reduced output image.
9. A computer-readable medium, on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.
CN202110908689.8A 2021-08-09 2021-08-09 Image noise reduction method and device, computer readable medium and electronic equipment Active CN113658065B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110908689.8A CN113658065B (en) 2021-08-09 2021-08-09 Image noise reduction method and device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110908689.8A CN113658065B (en) 2021-08-09 2021-08-09 Image noise reduction method and device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113658065A (en) 2021-11-16
CN113658065B (en) 2024-07-23

Family

ID=78490592

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110908689.8A Active CN113658065B (en) 2021-08-09 2021-08-09 Image noise reduction method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113658065B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012160852A (en) * 2011-01-31 2012-08-23 Olympus Corp Image composition device, imaging device, image composition method, and image composition program
CN108492245A * 2018-02-06 2018-09-04 Zhejiang University Low-light image pair fusion method based on wavelet decomposition and bilateral filtering
CN109040524A * 2018-08-16 2018-12-18 Guangdong Oppo Mobile Telecommunications Corp Ltd Artifact eliminating method, device, storage medium and terminal
CN110971781A * 2019-11-08 2020-04-07 Guangdong Oppo Mobile Telecommunications Corp Ltd Image processing method, image processing device, storage medium and electronic equipment
CN111402135A * 2020-03-17 2020-07-10 Guangdong Oppo Mobile Telecommunications Corp Ltd Image processing method, image processing device, electronic equipment and computer readable storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114821030A * 2022-04-11 2022-07-29 Suzhou Zhenwang Optoelectronics Co., Ltd. Planet image processing method, system and device
CN116095517A * 2022-08-31 2023-05-09 Honor Device Co., Ltd. Blurring method and blurring device
CN116095517B (en) * 2022-08-31 2024-04-09 Honor Device Co., Ltd. Blurring method, terminal device and readable storage medium

Also Published As

Publication number Publication date
CN113658065B (en) 2024-07-23

Similar Documents

Publication Publication Date Title
CN111368685B (en) Method and device for identifying key points, readable medium and electronic equipment
CN111598776B (en) Image processing method, image processing device, storage medium and electronic apparatus
CN112602088B (en) Method, system and computer readable medium for improving quality of low light images
CN112562019A (en) Image color adjusting method and device, computer readable medium and electronic equipment
CN112348747A (en) Image enhancement method, device and storage medium
US11948280B2 (en) System and method for multi-frame contextual attention for multi-frame image and video processing using deep neural networks
CN113658065B (en) Image noise reduction method and device, computer readable medium and electronic equipment
US10929961B2 (en) Electronic device and method for correcting images using external electronic device
CN113706414A (en) Training method of video optimization model and electronic equipment
CN111866483A (en) Color restoration method and device, computer readable medium and electronic device
CN112348738B (en) Image optimization method, image optimization device, storage medium and electronic equipment
CN113610720A (en) Video denoising method and device, computer readable medium and electronic device
CN113066020A (en) Image processing method and device, computer readable medium and electronic device
CN113283319A (en) Method and device for evaluating face ambiguity, medium and electronic equipment
CN115471413A (en) Image processing method and device, computer readable storage medium and electronic device
CN113205011B (en) Image mask determining method and device, storage medium and electronic equipment
CN113902636A (en) Image deblurring method and device, computer readable medium and electronic equipment
CN113920023A (en) Image processing method and device, computer readable medium and electronic device
CN111507142A (en) Facial expression image processing method and device and electronic equipment
CN113409204A (en) Method and device for optimizing image to be processed, storage medium and electronic equipment
CN112950641B (en) Image processing method and device, computer readable storage medium and electronic equipment
CN113610724A (en) Image optimization method and device, storage medium and electronic equipment
CN111798385B (en) Image processing method and device, computer readable medium and electronic equipment
CN113362260A (en) Image optimization method and device, storage medium and electronic equipment
CN114119413A (en) Image processing method and device, readable medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant