CN113610724A - Image optimization method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN113610724A
Authority
CN
China
Prior art keywords
image
filtering
filtered
output
parameter
Legal status
Pending
Application number
CN202110887297.8A
Other languages
Chinese (zh)
Inventor
Wang Shuyao (王舒瑶)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110887297.8A
Publication of CN113610724A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/90 - Dynamic range modification of images or parts thereof

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to the field of image processing technologies, and in particular, to an image optimization method and apparatus, a computer-readable storage medium, and an electronic device, where the method includes: acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter, wherein the second filtering parameter is greater than the first filtering parameter; performing spatial filtering on the reference image to obtain a reference filtered image of the reference image under a first filtering parameter; performing time domain filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output; performing contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output; and obtaining and outputting an optimized image according to the first image to be output and the second image to be output. According to the technical scheme, the calculation amount and the power consumption during image optimization are reduced, and the image optimization precision is improved.

Description

Image optimization method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image optimization method and apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of mobile technology, camera photographing technology has advanced rapidly. However, during video transmission, various kinds of noise are often mixed into the video, and the contrast of some video segments is weak, which degrades the visual effect. Optimization such as noise reduction and contrast enhancement is therefore required for videos and images.
In related-art methods, image noise reduction and contrast enhancement are executed independently of each other, which results in a large amount of computation and high power consumption.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an image optimization method, an image optimization apparatus, a computer-readable medium, and an electronic device, thereby reducing the amount of computation and power consumption at the time of image optimization at least to some extent.
According to a first aspect of the present disclosure, there is provided an image optimization method, comprising:
acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter; wherein the second filter parameter is greater than the first filter parameter;
performing spatial filtering on the reference image to obtain a reference filtering image of the reference image under a first filtering parameter;
performing time domain filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output;
carrying out contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output;
and obtaining an optimized image according to the first image to be output and the second image to be output and outputting the optimized image.
According to a second aspect of the present disclosure, there is provided an image optimization apparatus including:
the first filtering module is used for acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter; wherein the second filter parameter is greater than the first filter parameter;
the second filtering module is used for carrying out spatial filtering on the reference image to obtain a reference filtering image of the reference image under a first filtering parameter;
the time domain filtering module is used for performing time domain filtering on the first filtering image according to the reference filtering image to obtain a guide image and a first image to be output;
the image enhancement module is used for carrying out contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output;
and the image optimization module is used for obtaining and outputting an optimized image according to the first image to be output and the second image to be output.
According to a third aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the above-described method.
The image optimization method provided by the embodiments of the present disclosure acquires a current image and a reference image, and performs spatial filtering on the current image to obtain a first filtered image under a first filtering parameter and a second filtered image under a second filtering parameter; performs spatial filtering on the reference image to obtain a reference filtered image of the reference image under the first filtering parameter; performs temporal filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output; performs contrast enhancement on the current image according to the second filtered image and the guide image to obtain a second image to be output; and obtains and outputs an optimized image according to the first image to be output and the second image to be output. Compared with the prior art, on the one hand, the second filtered image obtained during denoising is reused directly for contrast enhancement, so multiplexing already-computed results reduces the amount of computation and the power consumption; on the other hand, contrast enhancement is performed on the result of spatial filtering, which avoids amplifying noise while enhancing details. Further, the contrast-enhanced result is refined with the temporally filtered first image to be output, which improves the optimization precision of the image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a method of image optimization in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a data flow diagram of an image optimization method in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart for acquiring a first filtered image and a second filtered image in an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a data flow diagram for acquiring a first filtered image and a second filtered image in an exemplary embodiment of the disclosure;
FIG. 7 schematically illustrates a flow chart for acquiring a guide image in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a data flow diagram for temporal filtering and contrast enhancement in an exemplary embodiment of the disclosure;
fig. 9 schematically shows a composition diagram of an image optimization apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an image optimization method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having an image processing function, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The image optimization method provided by the embodiment of the present disclosure is generally executed by the terminal devices 101, 102, and 103, and accordingly, the image optimization apparatus is generally disposed in the terminal devices 101, 102, and 103. However, it is easily understood by those skilled in the art that the image optimization method provided in the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the image optimization apparatus may also be disposed in the server 105, which is not particularly limited in the present exemplary embodiment. For example, in an exemplary embodiment, the user may acquire the current image and the reference image through the terminal devices 101, 102, 103, and then upload the current image and the reference image to the server 105, and after the server generates the optimized image by the image optimization method provided by the embodiment of the present disclosure, transmit the optimized image to the terminal devices 101, 102, 103, and so on.
The exemplary embodiment of the present disclosure provides an electronic device for implementing an image optimization method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image optimization method via execution of the executable instructions.
The following takes the mobile terminal 200 in fig. 2 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 2 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also interface differently than shown in fig. 2, or a combination of multiple interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 2803. The gyro sensor 2803 can be used to photograph anti-shake, navigation, body-feel game scenes, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
In the related art, when an image is optimized, registration, matching, and difference calculation are mainly performed on a main image and a secondary image, and bilateral filtering is performed on the main image to achieve contrast enhancement; alternatively, for low-illumination images, the brightness is decomposed, the reflection component is solved, and a retinex model is used for contrast enhancement and noise suppression. These approaches suffer from a large amount of computation, high power consumption, and limited optimization precision.
The image optimization method and the image optimization apparatus according to exemplary embodiments of the present disclosure are specifically described below.
Fig. 3 shows a flow of an image optimization method in the present exemplary embodiment, including the following steps:
step S310, acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter;
step S320, performing spatial filtering on the reference image to obtain a reference filtered image of the reference image under a first filtering parameter;
step S330, performing time domain filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output;
step S340, carrying out contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output;
and step S350, obtaining an optimized image according to the first image to be output and the second image to be output and outputting the optimized image.
Compared with the prior art, on the one hand, the second filtered image obtained during denoising is reused directly for contrast enhancement, so multiplexing already-computed results reduces the amount of computation and the power consumption; on the other hand, contrast enhancement is performed on the result of spatial filtering, which avoids amplifying noise while enhancing details. Further, the contrast-enhanced result is refined with the temporally filtered first image to be output, which improves the optimization precision of the image.
The above steps are described in detail with reference to the following examples.
In step S310, a current image and a reference image are obtained, and spatial filtering is performed on the current image to obtain a first filtered image under a first filtering parameter and a second filtered image under a second filtering parameter.
In an example embodiment of the present disclosure, both the reference image and the current image may be frames of a video, where the frame currently being processed may be defined as the current image. The number of reference images may be one, in which case the frame preceding the current image may be defined as the reference image; the number of reference images may also be multiple, in which case the several frames preceding the current image may be defined as the reference images. The number of reference images is not specifically limited in the present exemplary embodiment.
In this exemplary embodiment, as shown in fig. 4, the current image and the reference image may be first spatially filtered 410, then temporally filtered 420, and finally contrast enhanced 430, so as to obtain an optimized image.
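The data flow of fig. 4 can be sketched end to end as follows. This is a heavily simplified, single-channel sketch: a trivial blend toward the global mean stands in for the guided spatial filter, and the blend weights and detail gain are illustrative assumptions rather than values from the disclosure:

```python
import numpy as np

def blur(img, strength):
    # Stand-in spatial filter: blend each pixel toward the global mean.
    # `strength` plays the role of the filtering parameter (larger = smoother).
    return (1.0 - strength) * img + strength * img.mean()

def optimize_frame(current, reference, eps1=0.2, eps2=0.6):
    # Spatial filtering 410 (the second parameter is larger than the first)
    q1 = blur(current, eps1)        # first filtered image
    q2 = blur(current, eps2)        # second filtered image
    ref_q1 = blur(reference, eps1)  # reference filtered image

    # Temporal filtering 420: blend the current result with the reference result
    first_out = 0.7 * q1 + 0.3 * ref_q1
    guide = first_out               # guide image for the enhancement stage

    # Contrast enhancement 430: re-add the detail the stronger filter removed
    detail = current - q2
    second_out = guide + 1.5 * detail

    # Fuse the two images to be output into the optimized image
    return 0.5 * first_out + 0.5 * second_out
```

For a static, noise-free frame the pipeline is an identity, which makes the stand-in easy to sanity-check; the real method replaces `blur` with the guided filtering described in steps S510 to S550.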
In the present exemplary embodiment, the spatial filtering may be performed on the current image to obtain a first filtered image under a first filtering parameter and a second filtered image under a second filtering parameter, which may specifically include steps S510 to S550 as shown in fig. 5.
In step S510, the current image is down-sampled to obtain two identical sub-images.
In step S520, the sub-image is subjected to mean filtering under a preset window to obtain a first mean value.
The down-sampling module 610 may be used to down-sample the current image by a factor of 2 to obtain two identical sub-images. Then, mean filtering may be performed on the sub-image under a preset window to obtain a first mean value; specifically, as shown in fig. 6, the box filter 620, whose window size equals the preset window, performs mean filtering on the sub-image to obtain the first mean value.
In the present exemplary embodiment, the size of the preset window is smaller than the size of the sub-image; it may be 0.25 times, 0.2 times, 0.1 times, or the like, of the size of the sub-image, and may also be customized according to user requirements, which is not specifically limited in the present exemplary embodiment.
Step S530, performing mean filtering on the product of the two sub-images under a preset window to obtain a second mean value, and obtaining a first variance according to the first mean value and the second mean value;
in this exemplary embodiment, the average filtering may be performed on the product of the two sub-images in a preset window, and referring to fig. 5, the second average value may still be obtained by performing the average filtering on the product of the sub-images by using a box filter with a window size of the preset window.
In addition, when calculating the product of the two sub-images, the pixel matrices of the two sub-images may be multiplied element by element.
The variance calculating module 630 may then calculate the first variance from the first mean and the second mean; specifically, the first variance may be the second mean minus the square of the first mean.
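Steps S520 and S530 amount to computing var(I) = E[I·I] - E[I]^2 from two box-filtered means. A minimal sketch, assuming a single-channel float image and using an integral image in place of box filter 620 (the border handling is an illustrative choice):

```python
import numpy as np

def box_mean(img, r):
    """Local mean over a (2r+1) x (2r+1) window via an integral image.
    The window is clipped at the borders, so each pixel divides by the
    number of samples actually inside the image."""
    h, w = img.shape
    ii = np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    out = np.empty((h, w))
    for y in range(h):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        for x in range(w):
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            s = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            out[y, x] = s / ((y1 - y0) * (x1 - x0))
    return out

def local_variance(img, r):
    m1 = box_mean(img, r)        # first mean:  E[I]
    m2 = box_mean(img * img, r)  # second mean: E[I*I] (element-wise product)
    return m2 - m1 ** 2          # first variance: E[I*I] - E[I]^2
```

On a constant image the variance is zero everywhere, which is a quick way to verify the mean/variance plumbing.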
Step S540, performing guided filtering on the current image according to the first variance and the first filtering parameter to obtain the first filtered image.
In this exemplary embodiment, a first filtering parameter may be determined first, and the first filtering parameter filtering module 640 may then perform guided filtering on the current image according to the first filtering parameter and the first variance to obtain the first filtered image. In this exemplary embodiment, both the guide map of the guided filtering and the input image may be the current image, and the first filtered image may be calculated by the following formulas:
Q1 = a1·I + b1
a1 = cov(I, p)/(var(I) + ε1)
b1 = mean(p) - a1·mean(I)
wherein Q1 is the first filtered image, var(I) denotes the first variance, I denotes the current image, ε1 denotes the first filtering parameter, p denotes the guide map (which is the same as the current image I in this exemplary embodiment), cov denotes the covariance calculation function, and mean denotes the mean value under the preset window.
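Since the guide map p equals the input I here, cov(I, p) reduces to var(I), and the coefficients simplify to a1 = var(I)/(var(I) + ε1) and b1 = (1 - a1)·mean(I). A minimal self-guided-filter sketch under that simplification (the window radius and border handling are illustrative choices):

```python
import numpy as np

def box_mean(img, r):
    # Integral-image local mean over a (2r+1) x (2r+1) window, clipped at borders.
    h, w = img.shape
    ii = np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    out = np.empty((h, w))
    for y in range(h):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        for x in range(w):
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            n = (y1 - y0) * (x1 - x0)
            out[y, x] = (ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]) / n
    return out

def guided_self_filter(img, eps, r=2):
    """Guided filtering with the input image as its own guide: Q = a*I + b."""
    mean_i = box_mean(img, r)
    var_i = box_mean(img * img, r) - mean_i ** 2  # the first variance
    a = var_i / (var_i + eps)                     # cov(I, p)/(var(I)+eps) with p = I
    b = (1.0 - a) * mean_i                        # mean(p) - a*mean(I)
    # Average the per-window coefficients before forming the output,
    # as in the standard guided filter.
    return box_mean(a, r) * img + box_mean(b, r)
```

Flat regions (var(I) ≪ ε) are pulled toward the local mean, while strong edges (var(I) ≫ ε) pass through almost unchanged, which is the edge-preserving behavior the method relies on.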
Step S550, performing guided filtering on the current image according to the first variance and the second filtering parameter to obtain the second filtered image.
In this exemplary embodiment, a second filtering parameter may be determined first, and the second filtering parameter filtering module 650 may then perform guided filtering on the current image according to the second filtering parameter and the first variance to obtain the second filtered image. In this exemplary embodiment, both the guide map of the guided filtering and the input image may be the current image, and the second filtered image may be calculated by the following formulas:
Q2 = a2·I + b2
a2 = cov(I, p)/(var(I) + ε2)
b2 = mean(p) - a2·mean(I)
wherein Q2 is the second filtered image, var(I) denotes the first variance, I denotes the current image, ε2 denotes the second filtering parameter, p denotes the guide map (which is the same as the current image I in this exemplary embodiment), cov denotes the covariance calculation function, and mean denotes the mean value under the preset window.
In the present exemplary embodiment, the second filtering parameter is larger than the first filtering parameter, so the second filtered image retains only the larger edges and does not cause noise enhancement when used for contrast enhancement, which can improve the optimization effect of the image.
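The role of the filtering parameter can be read directly from the simplified coefficient a = var(I)/(var(I) + ε): the larger ε2 drives a toward 0 in low-variance (noise) regions while leaving it near 1 where var(I) is large (strong edges). The variance values below are illustrative:

```python
# a = var/(var + eps) is the per-pixel edge-preservation weight of the
# simplified guided filter. A larger eps flattens low-variance regions
# (noise) while high-variance regions (strong edges) are kept.
eps1, eps2 = 0.01, 0.1  # second filtering parameter > first

for var in (0.001, 0.01, 1.0):  # noise, weak texture, strong edge (illustrative)
    a1 = var / (var + eps1)
    a2 = var / (var + eps2)
    print(f"var={var:<6} a1={a1:.2f} a2={a2:.2f}")
```

For every variance level a2 < a1, but at edge-level variance both weights stay close to 1, which is why the stronger filter can be used for contrast enhancement without amplifying noise.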
In step S320, spatial filtering is performed on the reference image to obtain a reference filtered image of the reference image under a first filtering parameter.
In this exemplary embodiment, spatial filtering may be performed on the reference image to obtain a reference filtered image under the first filtering parameter. Because the reference image was already spatially filtered when it was itself optimized as the current image, its reference filtered image under the first filtering parameter may be obtained directly by reusing that result.
In this exemplary embodiment, if the reference filtered image of the reference image cannot be directly obtained, the reference image may be first downsampled to obtain two identical sub-reference images; carrying out mean value filtering on the sub-reference image under a preset window to obtain a third mean value; performing mean filtering on the product of the two sub-reference images under a preset window to obtain a fourth mean value, and obtaining a second variance according to the third mean value and the fourth mean value; and performing guided filtering on the reference image according to the second variance and the first filtering parameter to obtain a reference filtering image.
Specifically, the reference image may first be downsampled, that is, downsampled by a factor of 2, to obtain two identical sub-reference images. Then, mean filtering may be performed on the sub-reference image under a preset window using a box filter whose window size is the preset window, to obtain the third mean value.
In this exemplary embodiment, mean filtering may likewise be performed on the product of the two sub-reference images under the preset window, again using a box filter whose window size is the preset window, to obtain the fourth mean value.
The second variance may then be calculated based on the third mean and the fourth mean, and may specifically be the fourth mean minus the square of the third mean.
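The variance computed in this step is simply the identity var(x) = E[x²] − E[x]², evaluated per window. A minimal check on a single hypothetical window patch:

```python
import numpy as np

rng = np.random.default_rng(0)
patch = rng.random((5, 5))            # one preset-window patch of the reference image

third_mean = patch.mean()             # mean-filter output at this pixel
fourth_mean = (patch * patch).mean()  # mean of the element-wise product
second_variance = fourth_mean - third_mean ** 2

# Matches the direct definition of variance:
assert np.isclose(second_variance, patch.var())
```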
And then, performing guided filtering on the reference image according to the second variance and the first filtering parameter to obtain the reference filtered image, where the process of guided filtering is described in detail above, and therefore, the description is omitted here.
Step S330, performing time-domain filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output.
In this exemplary embodiment, the first filtered image may be first temporally filtered according to the reference filtered image to obtain a guide image; and then fusing the first filtered image and the reference filtered image according to the guide image to form a first image to be output.
In the present exemplary embodiment, when the first filtered image is temporally filtered according to the reference filtered image to obtain the guide image, as shown in fig. 7, steps S710 to S750 may be included.
In step S710, a difference image of the first filtered image and the reference filtered image is obtained.
In the present exemplary embodiment, as shown in fig. 8, the difference image 810 may be obtained by subtracting the reference filtered image from the first filtered image.
In step S720, a salient region image in the first filtered image is acquired.
In this exemplary embodiment, referring to fig. 8, a salient region may be extracted from the first filtered image using a saliency detection algorithm, and the salient region may be highlighted to obtain a salient region image 820.
In the present exemplary embodiment, the significance detection algorithm may be an LC algorithm, an HC algorithm, an AC algorithm, an FT algorithm, or the like, and is not particularly limited in the present exemplary embodiment.
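As a sketch of one of these options, the LC algorithm scores each pixel by its summed intensity distance to every other pixel, which can be computed efficiently through the intensity histogram. How it would be wired into this pipeline is an assumption; this is only an illustration of the algorithm itself:

```python
import numpy as np

def lc_saliency(gray):
    """LC saliency: saliency(p) = sum over all pixels q of |I(p) - I(q)|,
    computed in O(256^2) via the intensity histogram. `gray` is uint8."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    levels = np.arange(256, dtype=float)
    # dist[v] = sum over u of hist[u] * |v - u|
    dist = np.abs(levels[:, None] - levels[None, :]) @ hist
    sal = dist[gray]
    rng = sal.max() - sal.min()
    return (sal - sal.min()) / rng if rng > 0 else np.zeros_like(sal)
```

An intensity that is rare in the image (far from the dominant gray levels) receives a high saliency score, so isolated bright or dark structures stand out.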
In step S730, edge detection is performed on the first filtered image to obtain an edge image.
In this exemplary embodiment, referring to fig. 8, the first filtered image may be subjected to edge detection to obtain an edge image 830, where the edge image is an image in which the edges of the first filtered image are highlighted.
In this exemplary embodiment, the edge detection of the image may be implemented by using a Sobel operator, a Laplace operator, or a Canny operator, which is not specifically limited in this exemplary embodiment.
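For instance, a plain-NumPy Sobel gradient magnitude (one of the operators mentioned; a production pipeline would more likely call cv2.Sobel) might look like:

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude of a 2-D grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                                  # vertical-gradient kernel
    pad = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()        # horizontal gradient
            gy[i, j] = (win * ky).sum()        # vertical gradient
    return np.hypot(gx, gy)                    # gradient magnitude
```

Flat regions produce zero response, while intensity steps produce a strong response along the edge, which is exactly what the edge image 830 highlights.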
In step S740, the salient region image and the edge image are fused to obtain a fused image.
In the present exemplary embodiment, as shown in fig. 8, a fused image may be obtained through image fusion 840 of the salient region image and the edge image; specifically, the fused image may be obtained by adding the salient region image and the edge image. Because the fused image contains the detail regions to which human eyes are sensitive, the accuracy of image optimization can be improved.
In step S750, the guide image is obtained according to the fusion image and the difference image.
In the present exemplary embodiment, as shown in fig. 8, the fusion image and the difference image may be used as the guide image.
In the present exemplary embodiment, referring to fig. 8, after the guide image is acquired, the first filtered image and the reference filtered image may be subjected to guided fusion 850 according to the guide image to obtain the first image to be output. Specifically, the image may be divided into a plurality of regions; if the pixels of the difference image in a region are smaller than a preset threshold, the pixel value of each point of the first image to be output in that region is the average of the pixel values of the first filtered image and the reference filtered image at that point. If the pixels of the difference image in the region are larger than the preset threshold, the fused result for the region is the current image. The specific value of the preset threshold may be customized according to the requirements of the user and is not specifically limited in this exemplary embodiment.
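The fusion rule above can be sketched per pixel (a simplification of the region-wise rule, with a hypothetical threshold value): where the frame difference is small, the two filtered frames are averaged for temporal denoising; where it is large, indicating motion, the current frame's result is kept to avoid ghosting.

```python
import numpy as np

def guided_temporal_fuse(curr_filt, ref_filt, diff, thresh=0.1):
    # Where |diff| is below the preset threshold, average the two frames;
    # elsewhere keep the current frame's filtered result.
    small = np.abs(diff) < thresh
    return np.where(small, 0.5 * (curr_filt + ref_filt), curr_filt)
```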
In this exemplary embodiment, the fused image may also be used as the guide, so that details in the salient region and the edge region are not easily lost and the accuracy of the obtained first image to be output is better.
In another example embodiment, a fusion weight may be determined according to the difference image: the larger the pixel value of the difference image, the larger the fusion weight of the first filtered image, so that the first image to be output is more similar to the current image.
Step S340, performing contrast enhancement on the current image according to the second filtered image and the guide image to obtain a second image to be output.
In an exemplary embodiment of the present disclosure, referring to fig. 8, a high-pass filtered image may be obtained from the current image and the second filtered image, specifically by subtracting the second filtered image from the current image; the high-pass filtered image is then contrast-enhanced 860 using the guide image as a guide to obtain the second image to be output.
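A minimal sketch of this step, with the guide image assumed normalized to [0, 1] and the gain value hypothetical (the patent does not specify how the guide weights the enhancement):

```python
import numpy as np

def enhance_contrast(curr, second_filt, guide, gain=1.5):
    # High-pass detail layer: current image minus its strongly filtered version.
    high_pass = curr - second_filt
    # The guide image weights where detail is amplified (salient/edge regions).
    return curr + gain * guide * high_pass
```

Where the guide is 0 the output equals the current image; where it is 1 the full amplified detail layer is added back.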
In this exemplary embodiment, the fused image is used as a guide to emphasize the detail regions to which human eyes are sensitive, and the difference image is then used to correct the result of the contrast enhancement, further improving the accuracy of image optimization.
In step S350, an optimized image is obtained according to the first image to be output and the second image to be output, and is output.
In the present exemplary embodiment, as shown in fig. 8, after the second image to be output is obtained, the second image to be output and the first image to be output may be subjected to optimization fusion 870 to obtain the optimized image; specifically, the pixel matrix of the optimized image may be obtained by averaging the pixel matrix of the first image to be output and the pixel matrix of the second image to be output.
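The final fusion described above reduces to an element-wise mean of the two pixel matrices (the values below are hypothetical):

```python
import numpy as np

out1 = np.array([[0.2, 0.4], [0.6, 0.8]])   # first image to be output
out2 = np.array([[0.4, 0.4], [0.2, 1.0]])   # second image to be output
optimized = 0.5 * (out1 + out2)             # element-wise mean of the two matrices
assert np.allclose(optimized, [[0.3, 0.4], [0.4, 0.9]])
```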
In summary, in the exemplary embodiment, on one hand, the second filtered image obtained during denoising is reused directly for contrast enhancement, so that multiplexing the computation result reduces the computation amount and the power consumption; on the other hand, performing contrast enhancement with the spatial filtering result avoids noise being amplified during detail enhancement. Furthermore, the temporally filtered first image to be output is used to refine the contrast enhancement result, improving the optimization precision of the image.
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 9, an image optimization apparatus 900 is further provided in the present exemplary embodiment, and includes a first filtering module 910, a second filtering module 920, a temporal filtering module 930, an image enhancement module 940, and an image optimization module 950. Wherein:
the first filtering module 910 is configured to obtain a current image and a reference image, and perform spatial filtering on the current image to obtain a first filtered image under a first filtering parameter and a second filtered image under a second filtering parameter; wherein the second filter parameter is greater than the first filter parameter.
In an exemplary embodiment, the first filtering module 910 may be configured to down-sample the current image to obtain two identical sub-images; perform mean value filtering on the sub-images under a preset window to obtain a first mean value; perform mean value filtering on the product of the two sub-images under the preset window to obtain a second mean value, and obtain a first variance according to the first mean value and the second mean value; perform guided filtering on the current image according to the first variance and the first filtering parameter to obtain the first filtered image; and perform guided filtering on the current image according to the first variance and the second filtering parameter to obtain the second filtered image. The second filtering parameter is larger than the first filtering parameter, and the reference image comprises a previous frame image of the current image.
The second filtering module 920 may be configured to perform spatial filtering on the reference image to obtain a reference filtered image of the reference image under the first filtering parameter.
In an example embodiment, the second filtering module 920 may down-sample the reference picture to obtain two identical sub-reference pictures; carrying out mean value filtering on the sub-reference image under a preset window to obtain a third mean value; performing mean filtering on the product of the two sub-reference images under a preset window to obtain a fourth mean value, and obtaining a second variance according to the third mean value and the fourth mean value; and performing guided filtering on the reference image according to the second variance and the first filtering parameter to obtain a reference filtering image.
The temporal filtering module 930 may be configured to perform temporal filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first to-be-output image.
In an example embodiment, the temporal filtering module 930 may first perform temporal filtering on the first filtered image according to the reference filtered image to obtain a guide image, and specifically, may obtain a difference image between the first filtered image and the reference filtered image; acquiring a salient region image in the first filtering image; performing edge detection on the first filtering image to obtain an edge image, and fusing the salient region image and the edge image to obtain a fused image; obtaining a guide image according to the fusion image and the difference image; and then fusing the first filtered image and the reference filtered image according to the guide image to obtain a first image to be output, specifically, fusing the first filtered image and the reference filtered image according to the guide image and the difference image to obtain the first image to be output.
The image enhancement module 940 may be configured to perform contrast enhancement on the current image according to the second filtered image and the guide image to obtain a second image to be output.
In an example embodiment of the present disclosure, the image enhancement module 940 may obtain a high-pass filtered image from the second filtered image and the current image; and performing contrast enhancement on the high-pass filtering image by using the guide image as a reference to obtain a second image to be output.
The image optimization module 950 may be configured to obtain an optimized image according to the first image to be output and the second image to be output, and output the optimized image.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (11)

1. An image optimization method, comprising:
acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter; wherein the second filter parameter is greater than the first filter parameter;
performing spatial filtering on the reference image to obtain a reference filtering image of the reference image under a first filtering parameter;
performing time domain filtering on the first filtered image according to the reference filtered image to obtain a guide image and a first image to be output;
carrying out contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output;
and obtaining an optimized image according to the first image to be output and the second image to be output and outputting the optimized image.
2. The method of claim 1, wherein the spatially filtering the current image to obtain a first filtered image under a first filtering parameter and a second filtered image under a second filtering parameter comprises:
down-sampling the current image to obtain two identical sub-images;
performing mean value filtering on the subimages under a preset window to obtain a first mean value;
performing mean value filtering on the product of the two sub-images under the preset window to obtain a second mean value, and obtaining a first variance according to the first mean value and the second mean value;
performing guided filtering on the current image according to the first variance and a first filtering parameter to obtain a first filtering image;
and performing guiding filtering on the current image according to the first variance and the second filtering parameter to obtain a second filtering image.
3. The method of claim 1, wherein temporally filtering the first filtered image according to the reference filtered image to obtain a guide image and a first to-be-output image comprises:
performing time domain filtering on the first filtering image according to the reference filtering image to obtain a guide image;
and fusing the first filtering image and the reference filtering image according to the guide image to obtain the first image to be output.
4. The method of claim 3, wherein temporally filtering the first filtered image from the reference filtered image to obtain a guide image comprises:
acquiring a difference image of the first filtered image and the reference filtered image;
acquiring a salient region image in the first filtering image;
performing edge detection on the first filtered image to obtain an edge image;
fusing the salient region image and the edge image to obtain a fused image;
and obtaining the guide image according to the fusion image and the difference image.
5. The method according to claim 4, wherein fusing the first filtered image and the reference filtered image according to the guide image comprises:
and fusing the first filtering image and the reference filtering image according to the guide image and the difference image to obtain a first image to be output.
6. The method of claim 1, wherein performing contrast enhancement on the current image according to the second filtered image and the guide image to obtain a second image to be output comprises:
obtaining a high-pass filtering image according to the second filtering image and the current image;
and performing contrast enhancement on the high-pass filtering image by using the guide image as a reference to obtain a second image to be output.
7. The method according to claim 1, wherein the spatially filtering the reference image to obtain a reference filtered image of the reference image under a first filtering parameter comprises:
down-sampling the reference image to obtain two same sub-reference images;
carrying out mean value filtering on the sub-reference image under a preset window to obtain a third mean value;
performing mean filtering on the product of the two sub-reference images under a preset window to obtain a fourth mean value, and obtaining a second variance according to the third mean value and the fourth mean value;
and performing guiding filtering on the reference image according to the second variance and the first filtering parameter to obtain a reference filtering image.
8. The method of claim 1, wherein the current picture and the reference picture belong to the same video data, and wherein the reference picture comprises a picture of a frame previous to the current picture.
9. An image optimization apparatus, comprising:
the first filtering module is used for acquiring a current image and a reference image, and performing spatial filtering on the current image to obtain a first filtering image under a first filtering parameter and a second filtering image under a second filtering parameter;
the second filtering module is used for carrying out spatial filtering on the reference image to obtain a reference filtering image of the reference image under a first filtering parameter;
the time domain filtering module is used for performing time domain filtering on the first filtering image according to the reference filtering image to obtain a guide image and a first image to be output;
the image enhancement module is used for carrying out contrast enhancement on the current image according to the second filtering image and the guide image to obtain a second image to be output;
and the image optimization module is used for obtaining and outputting an optimized image according to the first image to be output and the second image to be output.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image optimization method according to any one of claims 1 to 8.
11. An electronic device, comprising:
a processor; and
memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image optimization method of any one of claims 1 to 8.
CN202110887297.8A 2021-08-03 2021-08-03 Image optimization method and device, storage medium and electronic equipment Pending CN113610724A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110887297.8A CN113610724A (en) 2021-08-03 2021-08-03 Image optimization method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113610724A true CN113610724A (en) 2021-11-05

Family

ID=78306626



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination