CN113362246A - Image banding artifact removing method, device, equipment and medium

Info

Publication number
CN113362246A
Authority
CN
China
Prior art keywords
image, value, determining, matrix, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110647537.7A
Other languages
Chinese (zh)
Inventor
李昆明
宋秉一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bigo Technology Pte Ltd
Original Assignee
Bigo Technology Pte Ltd
Application filed by Bigo Technology Pte Ltd
Priority application: CN202110647537.7A
Publication: CN113362246A
PCT application: PCT/CN2022/094771 (WO2022257759A1)

Classifications

    All under G PHYSICS, G06 COMPUTING; CALCULATING OR COUNTING, G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL:
    • G06T5/73 Image enhancement or restoration: Deblurring; Sharpening
    • G06T5/30 Image enhancement or restoration using local operators: Erosion or dilatation, e.g. thinning
    • G06T5/80 Image enhancement or restoration: Geometric correction
    • G06T7/13 Image analysis, segmentation: Edge detection
    • G06T2207/20028 Indexing scheme, filtering details: Bilateral filtering
    • G06T2207/20192 Indexing scheme, image enhancement details: Edge enhancement; Edge preservation
    • G06T2207/30204 Indexing scheme, subject or context of image processing: Marker


Abstract

The invention discloses a method, an apparatus, a device and a medium for removing banding artifacts from an image. The method comprises: performing a filtering operation on an image to be processed and determining a filtered image from which the banding artifacts have been removed; determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference between the pixel values of the pixel points at each corresponding position in the two images; for each element point in the difference absolute value matrix, determining the target size relationship between the element value and a first preset threshold and a second preset threshold, where the second preset threshold is greater than the first preset threshold; and determining, according to a pre-stored value function of the pixel value corresponding to each size relationship, the target pixel value of the target pixel point at the corresponding position of each element point in the target image, thereby obtaining a target image with edge textures preserved. The banding artifact is thus removed from the target image while blurring of the image edges is avoided, and because the computational complexity is low, the speed of determining the target image is improved.

Description

Image banding artifact removing method, device, equipment and medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a medium for removing image banding artifacts.
Background
Current image and video coding standards commonly use quantization when compressing images. The image coding standard is, for example, the Joint Photographic Experts Group (JPEG) standard, and the video coding standard may be the compression standard for digital storage media moving pictures and audio (MPEG-2), the highly compressed digital video coding standards H.264 or H.265, or the Audio Video coding Standard (AVS).
Quantization maps a residual signal or a large-range input signal to a small-range output signal, thereby reducing the number of bits. Although quantization effectively improves the compression efficiency of an image, the presence of quantization errors introduces artificial noise, typically blocking artifacts and banding artifacts.
Blocking artifacts mainly manifest as discontinuous brightness at the edges of coded blocks and can occur anywhere in the image. Banding artifacts are mainly caused by the loss of low-amplitude detail information during quantization, so they mainly appear in regions of the image where the gradient changes gently, such as flat regions and brightness gradient regions. In bandwidth-sensitive scenarios a larger compression ratio is adopted, which can cause more serious banding artifacts; in addition, if contrast enhancement is subsequently applied to an image containing banding artifacts, the artifacts can become even more serious.
Fig. 1 is a schematic diagram of an image without quantization processing provided in the prior art; as shown in fig. 1, there is no banding artifact. Fig. 2 is a schematic diagram of an image after quantization processing provided by the prior art; as shown in fig. 2, the quantization parameter (QP) was 17, and the banding artifact in fig. 2 is not obvious. Fig. 3 is a schematic diagram of an image after quantization processing provided in the prior art; as shown in fig. 3, the QP was 20, and a more obvious banding artifact exists in fig. 3. Fig. 4 is a schematic diagram of an image after quantization processing according to the prior art; as shown in fig. 4, the QP was 27, and a relatively serious banding artifact exists in fig. 4.
Fig. 5 is a schematic diagram of a quantized video frame image provided by the prior art. As shown in fig. 5, a relatively obvious banding artifact exists in the upper left region of the figure, which exhibits a step change in brightness. When the corresponding video is played, the brightness steps in the upper left region shift as the video frames change, so that the image appears to ripple dynamically in circles, resulting in a poor viewing experience for the user.
In order to remove the banding artifacts in the quantized image, the prior art includes dither error transfer methods and post-processing methods.
The dither error transfer method processes the banding artifact by adding correlated noise to the image, but this method can only reduce the banding artifact in the image; the removal effect is not obvious.
The post-processing methods include removing the banding artifact by loop filtering, removing the banding artifact with the gradfun plug-in of the multimedia video processing tool FFmpeg (Fast Forward MPEG), and removing the banding artifact with a deep learning model.
When loop filtering is used to remove the banding artifact, the effect is not obvious; when the gradfun plug-in is used, the effect is likewise not obvious and the image edges become blurred; and when a deep learning model is used, the computational complexity is high and the removal is slow.
Therefore, how to avoid blurring of the image edges while removing the banding artifact, and how to increase the speed at which it is removed, has become an urgent problem to be solved.
Disclosure of Invention
The invention provides a method, an apparatus, a device and a medium for removing image banding artifacts, which address the prior-art problems of avoiding image edge blurring when the banding artifacts are removed and of improving the speed at which they are removed.
The invention provides an image banding artifact removing method, which comprises the following steps:
performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed after the banding artifact is removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference between the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold;
and determining a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
Accordingly, the present invention provides an image banding artifact removal apparatus, said apparatus comprising:
the first determining module is used for performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed after the banding artifact is removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference between the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
a second determining module, configured to determine, for each element point in the difference absolute value matrix, a target size relationship between the element value and the first preset threshold and a target size relationship between the element value and the second preset threshold according to an element value of the element point, a first preset threshold and a second preset threshold, where the second preset threshold is greater than the first preset threshold;
and the third determining module is used for determining the target pixel value of the target pixel point at the corresponding position of each element point in the target image with edge texture preserved, according to the target size relationship corresponding to the element value of each element point and the pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image with edge texture preserved.
Accordingly, the present invention provides an electronic device comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus; the memory has stored therein a computer program which, when executed by the processor, causes the processor to perform the steps of any of the above-described image banding artifact removal methods.
Accordingly, the present invention provides a computer readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of any of the above-mentioned image banding artifact removal methods.
The invention provides a method, a device, equipment and a medium for removing image banding artifacts, wherein the method comprises the steps of carrying out filtering operation on an image to be processed, determining a filtered image of the image to be processed after the banding artifacts are removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference value of pixel values of pixel points at each corresponding position in the filtered image and the image to be processed in order to solve the problem of edge blurring of the filtered image; for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold; determining a target pixel value of a target pixel point at a corresponding position of each element point in a target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture; therefore, the target image is removed from the banding artifact and the image edge blurring is avoided, and only simple subtraction and multiplication are adopted in the method, so that the computational complexity is low, and the speed of determining the target image is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an image without quantization processing provided in the prior art;
FIG. 2 is a diagram illustrating a quantized image according to the prior art;
FIG. 3 is a diagram illustrating a quantized image according to the prior art;
FIG. 4 is a diagram illustrating a quantized image according to the prior art;
FIG. 5 is a diagram illustrating a video frame image after quantization processing according to the prior art;
fig. 6 is a schematic process diagram of an image banding artifact removing method according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a default error transfer matrix according to an embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a video frame image after banding artifact removal according to an embodiment of the present invention;
fig. 9 is a schematic process diagram of an image banding artifact removing method according to an embodiment of the present invention;
FIG. 10 is a process diagram of an image banding artifact removal method according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of an image banding artifact removing apparatus according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to remove the banding artifact, avoid blurring the edge of the image and improve the speed of removing the banding artifact of the image, the embodiment of the invention provides a method, a device, equipment and a medium for removing the banding artifact of the image.
Example 1:
fig. 6 is a schematic process diagram of an image banding artifact removing method according to an embodiment of the present invention, where the process includes the following steps:
s601: the method comprises the steps of carrying out filtering operation on an image to be processed, determining a filtering image obtained after strip-shaped artifacts are removed from the image to be processed, and determining a difference matrix and a difference absolute value matrix of the filtering image and the image to be processed according to the difference value of pixel values of pixel points at each corresponding position in the filtering image and the image to be processed.
The method for removing the image banding artifact provided by the embodiment of the invention is applied to electronic equipment, wherein the electronic equipment can be intelligent terminal equipment such as a PC (personal computer), a tablet personal computer, a smart phone and the like, and can also be a local server, a cloud server and the like.
Banding artifacts in the image to be processed are caused by quantization in the encoder. For example, in the short video field, a video uploaded by a user may have been compressed by a third-party encoder, or compressed at a high compression ratio, so that the video frame images in the video contain banding artifacts.
In order to remove the banding artifact in the image to be processed, the electronic device performs a filtering operation on it. The image to be processed is an image with banding artifacts; it may be a single image or a video frame image obtained by decoding video data. Its color space is not limited: it may be a red-green-blue (RGB) image, a luminance-chrominance (YUV) image, or a Lab color model image. The filtering operation may be low-pass filtering or edge-preserving filtering.
Specifically, the low-pass filtering operation may be performed with an existing box filter, Gaussian filter, or the like, and the edge-preserving filtering operation may be performed with an existing bilateral filter, non-local filter, guided filter, side window filter, or the like.
After the filtering operation is performed on the image to be processed, a filtered image of the image to be processed with the banding artifact removed can be determined, where the number of pixel points in the filtered image is the same as that in the image to be processed.
Specifically, for the pixel point at each position in the filtered image, the difference of the pixel values is determined according to the pixel value of that pixel point and the pixel value of the pixel point at the corresponding position in the image to be processed; from the differences at all positions, the difference matrix of the filtered image and the image to be processed can be determined. The number of columns of the difference matrix is the number of pixel points across the width of the image to be processed, and the number of rows is the number of pixel points across the height. From the determined difference matrix, the electronic device further determines the difference absolute value matrix, specifically by applying an existing absolute value function.
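For illustration only (this sketch is not part of the patent disclosure), step S601 could be realized as follows in Python with OpenCV and NumPy; the choice of a bilateral filter and its parameters are assumptions, since the embodiment permits any low-pass or edge-preserving filter.

```python
import cv2
import numpy as np

def compute_difference_matrices(img: np.ndarray):
    """Sketch of S601: filter, then build textDiff and textDiffAbs."""
    # Edge-preserving low-pass step that suppresses banding artifacts
    # (bilateral filter assumed; parameters are illustrative only).
    blur_filter_img = cv2.bilateralFilter(img, d=9, sigmaColor=40, sigmaSpace=40)

    # Signed per-pixel difference matrix (textDiff) and its absolute value
    # matrix (textDiffAbs); int16 avoids uint8 wrap-around on subtraction.
    text_diff = blur_filter_img.astype(np.int16) - img.astype(np.int16)
    text_diff_abs = np.abs(text_diff)
    return blur_filter_img, text_diff, text_diff_abs
```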
S602: and aiming at each element point in the difference absolute value matrix, determining the target size relation between the element value and the first preset threshold and the second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold.
In order to make the edge of the filtered image clear, the electronic device further stores a first preset threshold and a second preset threshold, wherein the first preset threshold and the second preset threshold are both preset, and the second preset threshold is greater than the first preset threshold.
Since the element value of each element point of the difference absolute value matrix relates differently to the first preset threshold and the second preset threshold, the way the pixel value at the corresponding position of the element point is determined differs. Thus, for each element point in the difference absolute value matrix, the target size relationship between the element value and the first preset threshold and between the element value and the second preset threshold is determined according to the element value of the element point, the first preset threshold and the second preset threshold. The element value may be less than the first preset threshold, between the first and second preset thresholds, or greater than the second preset threshold.
S603: and determining a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
In order to determine a target pixel value of a target pixel point at a corresponding position of each element point, in the embodiment of the present invention, a value function of the pixel value corresponding to each size relationship is also pre-stored, where the size relationship includes that the element value is not greater than a first preset threshold, the element value is greater than the first preset threshold and less than a second preset threshold, and the element value is not less than the second preset threshold.
A target value function for the pixel value at the corresponding position of each element point is determined according to the pre-stored value function of the pixel value corresponding to each size relationship and the target size relationship corresponding to the element value of that element point; the target pixel value of the target pixel point at the corresponding position of each element point in the target image with edge texture preserved is then determined according to that target value function, so that the target image with edge texture preserved is obtained from the target pixel values of all target pixel points.
In the embodiment of the invention, the filtering operation is carried out on the image to be processed, the filtering image of the image to be processed after the strip-shaped artifact is removed is determined, and in order to solve the problem of edge blurring of the filtering image, a difference matrix and a difference absolute value matrix of the filtering image and the image to be processed are determined according to the difference value of the pixel value of each pixel point at the corresponding position in the filtering image and the image to be processed; for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold; determining a target pixel value of a target pixel point at a corresponding position of each element point in a target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture; therefore, the target image is removed from the banding artifact and the image edge blurring is avoided, and only simple subtraction and multiplication are adopted in the method, so that the computational complexity is low, and the speed of determining the target image is improved.
Example 2:
in order to determine a target pixel value of a target pixel point at a corresponding position of each element point in a target image after an edge texture is retained, on the basis of the above embodiment, in an embodiment of the present invention, the size relationship includes that the element value is not greater than the first preset threshold, the element value is greater than the first preset threshold and less than the second preset threshold, and the element value is not less than the second preset threshold, and determining the target pixel value of the target pixel point at the corresponding position of each element point in the target image after the edge texture is retained according to the target size relationship corresponding to the element value of each element point and a pre-stored value-taking function of the pixel value corresponding to each size relationship includes:
for the target size relationship corresponding to the element value of each element point, if the target size relationship is that the element value of the element point is not greater than the first preset threshold or not less than the second preset threshold, respectively determining the pixel value of the pixel point at the corresponding position of the element point in the filtered image or in the image to be processed, and determining that pixel value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image; if the target size relationship is that the element value of the element point is greater than the first preset threshold and smaller than the second preset threshold, determining a first difference between the second preset threshold and the element value of the element point, determining a second difference between the second preset threshold and the first preset threshold, determining the ratio of the first difference to the second difference, determining the product of the element value at the corresponding position of the element point in the difference matrix and the ratio, determining the sum of the product and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determining the sum as the target pixel value of the target pixel point at the corresponding position of the element point in the target image.
In order to determine the target pixel value of each target pixel point in the target image, in the embodiment of the present invention, for the target size relationship corresponding to the element value of each element point, the target pixel value of the target pixel point at the corresponding position of the element point in the target image is determined according to the target size relationship corresponding to the element value of the element point, the image to be processed, and the filtered image.
Specifically, according to a target size relationship corresponding to the element value of the element point, if the target size relationship is that the element value of the element point is not greater than a first preset threshold, according to a predetermined filtering image, determining a pixel value of a pixel point at a position corresponding to the element point in the filtering image.
For example, if the position of the element point in the difference absolute value matrix is a first row and a second column, the pixel value of the pixel point in the first row and the second column is determined in the filtered image, where the number of rows and the number of columns of the difference absolute value matrix are the same as the number of rows and the number of columns of the filtered image, and therefore the pixel value of the pixel point in the corresponding position can be found in the filtered image according to the position of the element point in the difference absolute value matrix.
And if the target size relationship is that the element value of the element point is not less than a second preset threshold, determining the pixel value of the pixel point at the corresponding position of the element point in the image to be processed according to the predetermined image to be processed. The number of rows and columns of the difference absolute value matrix is the same as the number of rows and columns of the image to be processed, so that the pixel value of the pixel point at the corresponding position can be found in the image to be processed according to the position of the element point in the difference absolute value matrix.
And after determining the pixel value of the pixel point of the corresponding position of the element point in the filtering image or the image to be processed, determining the pixel value as the target pixel value of the target pixel point of the corresponding position of the element point in the target image.
If the target size relationship is that the element value of the element point is larger than a first preset threshold and smaller than a second preset threshold, determining a first difference value between the second preset threshold and the element value of the element point according to the element value of the element point and the second preset threshold, determining a second difference value between the second preset threshold and the first preset threshold according to the second preset threshold and the first preset threshold, determining a proportional value between the first difference value and the second difference value according to the first difference value and the second difference value, and multiplying the proportional value and the element value of the corresponding position of the element point in the difference matrix according to the determined proportional value and the element value of the corresponding position of the element point in the difference matrix to obtain a product value. And determining the sum of the product value and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determining the sum as the target pixel value of the target pixel point at the corresponding position of the element point in the target image.
Specifically, in the embodiment of the present invention, a difference matrix textDiff between the filtered image blurFilterImg and the image to be processed Img is determined from the image to be processed Img and the filtered image blurFilterImg obtained by removing the banding artifact from the image to be processed: textDiff = blurFilterImg - Img. That is, the element value of each element point in the difference matrix is the difference between the pixel value of the pixel point at the corresponding position in the filtered image and that in the image to be processed, and the difference matrix textDiff is determined from the difference corresponding to each element point. The difference absolute value matrix textDiffAbs corresponding to the difference matrix textDiff is determined with an existing absolute value function: textDiffAbs = abs(textDiff).
Specifically, from the determined difference matrix textDiff and difference absolute value matrix textDiffAbs, the target image textPresImg with edge textures preserved can be determined by the following formula, applied element-wise at each position:

textPresImg = blurFilterImg, if textDiffAbs <= thrLow
textPresImg = Img + textDiff * (thrHigh - textDiffAbs) / (thrHigh - thrLow + eps), if thrLow < textDiffAbs < thrHigh
textPresImg = Img, if textDiffAbs >= thrHigh

where thrLow is the first preset threshold, thrHigh is the second preset threshold, and eps is a small constant that prevents division by zero.
according to the element value of each element point in the difference absolute value matrix textDiffAbs, the first preset threshold thrLow and the second preset threshold thrHigh, if the element value is not greater than the first preset threshold thrLow, that is, the element value is not greater than thrLow, the pixel value of the pixel point at the corresponding position of the element point in the filtered image is obtained, and the pixel value is determined as the target pixel value of the target pixel point at the corresponding position of the element point in the target image textPresImg after the edge texture is retained.
If the element value is not less than a second preset threshold thrHigh, that is, the element value is not less than thrHigh, the pixel value of the pixel point at the corresponding position of the element point in the image to be processed is obtained, and the pixel value is determined as the target pixel value of the target pixel point at the corresponding position of the element point in the target image textPresImg after the edge texture is reserved.
If the element value is greater than the first preset threshold and smaller than the second preset threshold, a first difference thrHigh - textDiffAbs between the second preset threshold thrHigh and the element value of the element point is determined, a second difference thrHigh - thrLow between the second preset threshold thrHigh and the first preset threshold thrLow is determined, and the ratio of the first difference to the second difference, (thrHigh - textDiffAbs) / (thrHigh - thrLow), is determined. Because the second difference thrHigh - thrLow may be 0, the denominator is updated to the sum of the second difference and eps. The product of the element value at the corresponding position of the element point in the difference matrix and the ratio, textDiff * (thrHigh - textDiffAbs) / (thrHigh - thrLow + eps), is then determined; the sum of this product and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, Img + textDiff * (thrHigh - textDiffAbs) / (thrHigh - thrLow + eps), is determined; and this sum is determined as the target pixel value of the target pixel point at the corresponding position of the element point in the target image textPresImg.
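For illustration only, the piecewise rule above can be vectorized as in the following sketch, assuming single-channel 8-bit inputs; the values of thrLow, thrHigh and eps are placeholders, not values fixed by the patent.

```python
import numpy as np

def texture_preserving_blend(img, blur_img, text_diff, text_diff_abs,
                             thr_low=2.0, thr_high=8.0, eps=1e-6):
    """Sketch of the textPresImg formula; thresholds are assumed values."""
    img_f = img.astype(np.float32)
    # Middle branch: scale the difference by how far |diff| is below thrHigh.
    ratio = (thr_high - text_diff_abs) / (thr_high - thr_low + eps)
    mid = img_f + text_diff.astype(np.float32) * ratio

    out = np.where(text_diff_abs <= thr_low, blur_img.astype(np.float32),
                   np.where(text_diff_abs >= thr_high, img_f, mid))
    return np.clip(out, 0, 255).astype(np.uint8)
```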
Example 3:
in order to make the target image more realistic, on the basis of the above embodiments, in an embodiment of the present invention, the method further includes:
detecting and determining a banding artifact edge image and a non-flat area image corresponding to the image to be processed;
performing edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, determining a banding artifact area weight matrix and a non-flat area weight matrix, and determining a fusion weight matrix according to a pre-stored fusion formula;
determining a first product matrix of a first matrix formed by the fusion weight matrix and the pixel value of each pixel point in the image to be processed;
determining a second product matrix of a first difference matrix and a second matrix according to a first difference matrix determined by the difference between a preset numerical value and the element value of each element point in the fusion weight matrix and the second matrix formed by the pixel value of each pixel point in the target image;
and determining a sum matrix of the first product matrix and the second product matrix, and updating the target image according to the element value of each element point in the sum matrix.
In the embodiment of the invention, in order to make the target image more real, the electronic device can also keep the texture of the original banding artifact area on the basis of the target image, so as to update the target image, and thus, the updated target image is more real.
In order to retain the texture of the original banding artifact area on the basis of the target image, the electronic device further detects and determines a banding artifact edge image and a non-flat area image corresponding to the to-be-processed image, wherein when the banding artifact edge image and the non-flat area image are detected, the to-be-processed image may be an image with an original resolution, an image after downsampling, or an image after upsampling, which is not limited in the embodiment of the present invention.
An edge detection algorithm is adopted to determine the banding artifact edge image corresponding to the image to be processed; for example, a Canny-based, Sobel-based, or Laplacian-based edge detection algorithm may be used. The non-flat area image corresponding to the image to be processed can be determined using the variance, or based on the gradient.
As a possible implementation manner, before the detection, denoising processing may be performed on the image to be processed, for example, denoising processing is performed by using mean filtering, edge-preserving filtering, and the like, so as to improve the accuracy of subsequent detection.
Specifically, when determining the banding artifact edge region image of the image to be processed with the Canny-based edge detection algorithm, three thresholds are preset: canyThLow, canyThMid and canyThHigh, where canyThLow is smaller than canyThMid and canyThMid is smaller than canyThHigh.
The thresholds canyThLow and canyThMid form one threshold range, with which edge detection is performed on the image to be processed to obtain a first edge map edgeMapLow; the thresholds canyThMid and canyThHigh form another threshold range, with which edge detection is performed to obtain a second edge map edgeMapHigh. Specifically, the gradient value of each pixel point of the image to be processed is calculated, and the pixel points whose gradient values fall within each threshold range are determined, thereby determining the first edge map edgeMapLow and the second edge map edgeMapHigh.
Specifically, edgeMapLow = canny(Img, [canyThLow, canyThMid]) and edgeMapHigh = canny(Img, [canyThMid, canyThHigh]), where each formula indicates that the pixel points in the edge map are those pixel points of the image to be processed whose gradient values fall within the corresponding threshold range.
The first edge map edgeMapLow and the second edge map edgeMapHigh are used to obtain a rough banding artifact edge map bandEdgeMapRough corresponding to the image to be processed. Specifically, bandEdgeMapRough is determined according to the difference between the pixel value of each pixel point in edgeMapLow and the pixel value of the pixel point at the corresponding position in edgeMapHigh: bandEdgeMapRough = edgeMapLow - edgeMapHigh.
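As a non-authoritative sketch of this dual-range detection, assuming OpenCV's Canny as the underlying detector and illustrative threshold values:

```python
import cv2

def band_edge_map_rough(img, cany_th_low=10, cany_th_mid=30, cany_th_high=100):
    """Rough banding artifact edge map: weak edges minus strong edges."""
    edge_map_low = cv2.Canny(img, cany_th_low, cany_th_mid)    # weak + strong edges
    edge_map_high = cv2.Canny(img, cany_th_mid, cany_th_high)  # strong edges only
    # bandEdgeMapRough = edgeMapLow - edgeMapHigh (saturating subtraction
    # keeps the low-amplitude edges typical of banding boundaries).
    return cv2.subtract(edge_map_low, edge_map_high)
```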
After the banding artifact edge image and the non-flat area image corresponding to the image to be processed are detected and determined, edge-preserving filtering operation is carried out on the banding artifact edge image to determine a banding artifact weight matrix of the banding artifact edge image, edge-preserving filtering operation is carried out on the non-flat area image to determine a non-flat area weight matrix of the non-flat area image.
Specifically, any one of a joint bilateral filter (joint bilateral filter), a guided filter (guided filter), a guided side window filter (guided side window filter) and an improved filter thereof in the prior art is used to perform edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, so as to determine a banding artifact weight matrix of the banding artifact edge image and a non-flat area weight matrix of the non-flat area image.
The banding artifact weight matrix and the non-flat region weight matrix are then substituted into a pre-stored fusion formula to determine the fusion weight matrix of the banding artifact weight matrix and the non-flat region weight matrix.
Forming a first matrix according to the pixel value of each pixel point in the image to be processed, wherein the column number of the first matrix is the number of the pixel points with the width of the image to be processed, and the row number of the first matrix is the number of the pixel points with the height of the image to be processed; and according to the fusion weight matrix and the first matrix of the image to be processed, multiplying the fusion weight matrix and the first matrix to determine a first product matrix of the fusion weight matrix and the first matrix.
And subtracting the element value of each element point from the preset value to obtain each difference value according to the preset value and the element value of each element point in the fusion weight matrix, and determining a first difference matrix according to the difference value of the corresponding position of each element point.
And forming a second matrix by the pixel values of all the pixel points according to the determined pixel value of each pixel point in the target image, wherein the column number of the second matrix is the number of the pixel points with the width of the target image, and the row number of the second matrix is the number of the pixel points with the height of the target image.
According to the first difference matrix and the second matrix, the first difference matrix is multiplied by the second matrix to obtain a second product matrix of the first difference matrix and the second matrix, wherein the multiplication of the matrices is in the prior art, which is not described in detail in the embodiments of the present invention.
A sum matrix of the first product matrix and the second product matrix is determined, and the pixel value of the pixel point at the corresponding position in the target image is replaced with the element value of each element point in the sum matrix, thereby updating the target image.
In order to determine the fusion weight matrix, on the basis of the foregoing embodiments, in an embodiment of the present invention, the determining the fusion weight matrix according to a pre-stored fusion formula includes:
determining a second difference matrix according to the difference value between the first preset value and the element value of each element point in the banding artifact area weight matrix;
and determining a product matrix of the second difference matrix and the non-flat area weight matrix as a fusion weight matrix.
In order to accurately determine the fusion weight matrix, in the embodiment of the invention, the element value of each element point in the banding artifact area weight matrix is subtracted from the first preset value to determine the difference at the corresponding position of each element point, and the second difference matrix is determined from the difference at the corresponding position of each element point. The first preset value is a positive integer; preferably, the first preset value is 1.
And multiplying the second difference matrix by the non-flat area weight matrix to obtain a product matrix of the second difference matrix and the non-flat area weight matrix, and determining the product matrix as a fusion weight matrix.
Specifically, guided edge-preserving filtering is performed on the banding artifact edge image to obtain the banding artifact area weight matrix bandEdgeWeight, and guided edge-preserving filtering is performed on the non-flat area image to obtain the non-flat area weight matrix notFlatWeight. The final fusion weight matrix blendWeight is generated from bandEdgeWeight, notFlatWeight and the pre-stored fusion formula blendWeight = (1 - bandEdgeWeight) × notFlatWeight.
Here (1 - bandEdgeWeight) denotes the second difference matrix formed by the differences between 1 and the element value of each element point in the banding artifact area weight matrix bandEdgeWeight. The edge-preserving filtering can be performed with a joint bilateral filter, a guided filter, a guided side window filter, or improved variants thereof in the prior art.
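Purely as an illustrative sketch of this fusion step, assuming single-channel float processing and the guided filter from opencv-contrib (cv2.ximgproc.guidedFilter); the radius and eps parameters are placeholders. The blend weight is also returned because the dithering step of Example 6 reuses it.

```python
import cv2
import numpy as np

def fuse_and_blend(img, text_pres_img, band_edge_map, not_flat_map,
                   radius=8, eps=1e-3):
    """blendWeight = (1 - bandEdgeWeight) * notFlatWeight, then pixel-wise blend."""
    guide = img.astype(np.float32)
    band_edge_weight = cv2.ximgproc.guidedFilter(
        guide, band_edge_map.astype(np.float32) / 255.0, radius, eps)
    not_flat_weight = cv2.ximgproc.guidedFilter(
        guide, not_flat_map.astype(np.float32) / 255.0, radius, eps)

    blend_weight = (1.0 - band_edge_weight) * not_flat_weight

    # Keep the original texture where the weight is high; use the de-banded
    # target image textPresImg elsewhere.
    out = blend_weight * guide + (1.0 - blend_weight) * text_pres_img.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8), blend_weight
```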
Example 4:
in order to make the detected banding artifact edge image more accurate, on the basis of the foregoing embodiments, in an embodiment of the present invention, after the detecting determines the banding artifact edge image corresponding to the image to be processed, the method further includes:
and performing filtering operation on the banding artifact edge image, and determining an image with the edge length smaller than a preset length threshold removed as an updated image of the banding artifact edge image.
In order to make the detected banding artifact edge image more accurate, in the embodiment of the present invention, the electronic device performs a filtering operation on the banding artifact edge image, removes a target edge of which the edge length in the banding artifact edge image is smaller than a preset length threshold, obtains an image with the target edge removed, and uses the image with the target edge removed as an updated image of the banding artifact edge image.
Specifically, for each banding artifact edge in the banding artifact edge image, within a region of a set radius centered on any pixel point on that edge, if a pixel point of another banding artifact edge exists, the two edges are considered the same banding artifact edge. The mutually independent banding artifact edges in the banding artifact edge image are thus determined and marked i, where i = {0, 1, ..., n} and n is the number of banding artifact edges.
And determining a target edge with the edge length smaller than a preset length threshold value in the banding artifact edge mark image according to the length of each banding artifact edge in the banding artifact edge image and the preset length threshold value.
For example, taking any pixel point Ap1 on a banding artifact edge A in the banding artifact edge image bandEdgeMapRough as the center, if a pixel point Bp1 of another banding artifact edge B exists within a region win of fixed radius r, edges A and B are determined to be the same edge. This operation is performed for all pixel points on the banding artifact edges, and a banding artifact edge marker image bandEdgeLabelMap is determined; compared with bandEdgeMapRough, each independent banding artifact edge in it carries an incrementing marker i.
And determining a target edge of which the edge length in the banding artifact edge marker image bandEdgeLabelMap is smaller than a preset length threshold, and taking pixel points in other edges except the target edge as pixel points in the image bandEdge after the target edge is removed.
bandEdge(x, y) = 1, if len[bandEdgeLabelMap(x, y)] > edgeLenTh
bandEdge(x, y) = 0, otherwise

where x and y respectively refer to the x-axis and y-axis coordinates of the pixel point, len[] denotes the length statistic of the edge the pixel belongs to, and edgeLenTh denotes the preset length threshold. bandEdge(x, y) = 1 indicates that the pixel point at the corresponding position in the banding artifact edge map lies on a banding artifact edge whose length is greater than the preset length threshold, and bandEdge(x, y) = 0 indicates that the pixel point lies on a banding artifact edge whose length is not greater than the preset length threshold, or on a non-banding-artifact edge.
As a possible implementation manner, in the embodiment of the present invention, the area of the image bandEdge after the target edge is removed may also be expanded, so as to obtain a banding artifact area image bandEdge map. Specifically, the region expansion method includes an expansion method, a binarization process after filtering, and the like.
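A possible sketch of the length filtering and region expansion described above, assuming that 8-connected components approximate the radius-r grouping of edge pixels and that a component's pixel count stands in for its edge length; edgeLenTh and the dilation kernel size are illustrative.

```python
import cv2
import numpy as np

def filter_short_edges(band_edge_map_rough, edge_len_th=50, dilate_ksize=5):
    """Keep only banding artifact edges longer than edge_len_th, then dilate."""
    n, labels, stats, _ = cv2.connectedComponentsWithStats(
        band_edge_map_rough, connectivity=8)
    band_edge = np.zeros_like(band_edge_map_rough)
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] > edge_len_th:  # pixel count as edge length
            band_edge[labels == i] = 255

    # Region expansion into the banding artifact area image bandEdgeMap.
    kernel = np.ones((dilate_ksize, dilate_ksize), np.uint8)
    return cv2.dilate(band_edge, kernel)
```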
Example 5:
in order to determine the non-flat area image corresponding to the image to be processed, on the basis of the foregoing embodiments, in an embodiment of the present invention, the determining the non-flat area image corresponding to the image to be processed includes:
performing a dilation operation and an erosion operation on the image to be processed to determine a dilated image and an eroded image;
determining a difference image of the dilated image and the eroded image according to the difference between the pixel values of the pixel points at each corresponding position in the two images;
and performing a binarization filtering operation on the difference image, determining a binary difference image of the difference image, performing area filtering on the binary difference image, and determining the image obtained after removing regions whose area is smaller than a preset area threshold as the non-flat area image.
In order to determine the non-flat area image corresponding to the image to be processed, in the embodiment of the invention, a dilation operation and an erosion operation are performed on the image to be processed to obtain a dilated image and an eroded image. For example, the existing imdilate function can be used for the dilation operation and the existing imerode function for the erosion operation.
According to the determined dilated image and eroded image, the pixel values of the pixel points at each corresponding position are subtracted to determine the difference at each position, and the difference image of the dilated image and the eroded image is determined from the difference at each corresponding position.
A binarization filtering operation is performed on the difference image to determine its binary difference image. Specifically, according to the pixel value of each pixel point in the difference image and a preset texture difference threshold, the pixel points whose pixel values are not less than the threshold are set to 255, and the pixel points whose pixel values are less than the threshold are set to 0.
And performing area filtering on the determined binarization difference image, removing a target area with the area smaller than a preset area threshold value in the binarization difference image to obtain an image with the target area removed, and taking the image with the target area removed as a non-flat area image.
Specifically, a dilation operation is performed on the image to be processed to obtain the dilated image dilateImg, an erosion operation is performed on the image to be processed to obtain the eroded image erodeImg, and the difference image diffImg is determined from them as diffImg = dilateImg - erodeImg; that is, the difference obtained by subtracting the pixel values of the pixel points at each corresponding position of dilateImg and erodeImg is taken as the pixel value of the pixel point at the corresponding position in diffImg.
The difference image diffImg is binarized to obtain the binary difference image binaryDiffImg:

binaryDiffImg(x, y) = 255, if diffImg(x, y) >= textThLow
binaryDiffImg(x, y) = 0, otherwise

where textThLow represents the preset texture difference threshold.
Area filtering is performed on the binary difference image binaryDiffImg to obtain the non-flat region map notFlatMap:

notFlatMap(x, y) = 1, if S(binaryDiffImg_j) > areaTh for the region j containing (x, y)
notFlatMap(x, y) = 0, otherwise

where j = {0, 1, 2, ..., m} is the region identification of the binary difference image binaryDiffImg, m represents the number of regions in binaryDiffImg, x and y refer to the x-axis and y-axis coordinates of the pixel point respectively, S() represents a function that calculates a region's area, and areaTh is the preset area threshold.
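An illustrative sketch of this non-flat region detection, assuming a 3x3 structuring element; textThLow and areaTh are placeholder thresholds.

```python
import cv2
import numpy as np

def compute_not_flat_map(img, text_th_low=6, area_th=100):
    """Dilate minus erode, binarize, then drop small regions."""
    kernel = np.ones((3, 3), np.uint8)
    diff_img = cv2.subtract(cv2.dilate(img, kernel), cv2.erode(img, kernel))

    # binaryDiffImg: 255 where the local texture difference >= textThLow
    # (THRESH_BINARY keeps values strictly greater than the threshold argument).
    _, binary_diff = cv2.threshold(diff_img, text_th_low - 1, 255, cv2.THRESH_BINARY)

    # Area filtering: keep only connected regions with area above areaTh.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary_diff, connectivity=8)
    not_flat = np.zeros_like(binary_diff)
    for j in range(1, n):
        if stats[j, cv2.CC_STAT_AREA] > area_th:
            not_flat[labels == j] = 255
    return not_flat
```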
Example 6:
in order to make the determined target image more realistic, on the basis of the foregoing embodiments, in an embodiment of the present invention, the method further includes:
for each pixel point in the target image, performing a bitwise AND operation on the abscissa value of the pixel point and the difference between the width of a preset error transfer matrix and a second preset value to determine the target abscissa value corresponding to the pixel point, performing a bitwise AND operation on the ordinate value of the pixel point and the difference between the height of the preset error transfer matrix and the second preset value to determine the target ordinate value corresponding to the pixel point, determining, from the error transfer matrix, the element value of the element point corresponding to the target abscissa value and the target ordinate value, and determining a third matrix according to the element value, the abscissa value and the ordinate value corresponding to each pixel point;
determining a fourth product matrix of the first difference matrix and the third matrix, determining a third sum matrix of the fourth product matrix and a fourth matrix composed of the pixel value of each pixel point in the target image, and determining the pixel value of the pixel point at the corresponding position of the updated target image according to the element value of each element point in the third sum matrix.
In order to make the determined target image more realistic, in the embodiment of the present invention, the electronic device further stores a preset error transfer matrix. The error transfer matrix is used to diffuse a small error into the target image so that the target image does not become overly smooth, which makes the target image appear more authentic.
For each pixel point of the target image, the difference between the width of the error transfer matrix and a second preset value is determined, and a bitwise AND operation is performed between the abscissa value of the pixel point and that difference to obtain the target abscissa value; the second preset value can be any positive integer, and preferably, the second preset value is 1.
Similarly, the difference between the height of the error transfer matrix and the second preset value is determined, and a bitwise AND operation is performed between the ordinate value of the pixel point and that difference to obtain the target ordinate value. With the target abscissa value and the target ordinate value corresponding to each pixel point determined, the element value at that position of the error transfer matrix is taken as the element value of the third matrix at the position of the pixel point, thereby determining the third matrix.
For example, fig. 7 is a schematic diagram of a preset error transfer matrix according to an embodiment of the present invention, and as shown in fig. 7, the width dw of the error transfer matrix is 8, and the height dh is also 8.
According to the determined third matrix and the first difference matrix, the two are multiplied to obtain a fourth product matrix of the first difference matrix and the third matrix; a fourth matrix is determined according to the pixel value of each pixel point in the target image; the fourth matrix and the fourth product matrix are added to obtain a third sum matrix; and the target image is updated according to the element value of each element point in the third sum matrix, the pixel value at each position in the updated target image being the element value of the element point at the corresponding position in the third sum matrix.
Specifically, according to the preset error transfer matrix shown in fig. 7, a mask dither operation is performed on the target image blendImgOut to obtain the updated target image ImgOut, where ImgOut = blendImgOut + (1 − blendWeight) × ditherMatrix[x & (dw − 1), y & (dh − 1)].
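A minimal sketch of this mask-dither step follows, assuming a dw × dh error transfer matrix of small signed values whose dimensions are powers of two (the actual 8 × 8 values of fig. 7 are not reproduced here) and a blendWeight matrix from the fusion step:

import numpy as np

def mask_dither(blend_img_out, blend_weight, dither_matrix):
    # blend_img_out: single-channel target image after fusion; blend_weight:
    # fusion weight matrix in [0, 1]; dither_matrix: dh x dw error transfer matrix.
    h, w = blend_img_out.shape
    dh, dw = dither_matrix.shape            # powers of two, e.g. 8 x 8
    ys, xs = np.mgrid[0:h, 0:w]
    # Bitwise AND with (dw - 1) / (dh - 1) tiles the error transfer matrix over
    # the whole image; this builds the "third matrix" of the text (NumPy indexes
    # rows first, so the y coordinate comes before x here).
    third = dither_matrix[ys & (dh - 1), xs & (dw - 1)]
    # ImgOut = blendImgOut + (1 - blendWeight) * ditherMatrix[x & (dw-1), y & (dh-1)]
    out = blend_img_out.astype(np.float32) + (1.0 - blend_weight) * third
    return np.clip(out, 0, 255).astype(np.uint8)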
Fig. 8 is a schematic diagram of a video frame image after banding artifacts are removed according to an embodiment of the present invention. As shown in fig. 8, the method of the embodiment of the present invention can effectively remove or alleviate banding artifacts in flat areas while also protecting image edges well; in texture-rich areas, the details of the image are hardly affected.
Example 7:
The image banding artifact removing method of the present invention is described below with a specific embodiment, in which the image to be processed is a video frame image. Fig. 9 is a schematic process diagram of the image banding artifact removing method provided by an embodiment of the present invention; as shown in fig. 9, the process includes the following steps:
S901: A video frame image obtained after decoding the video data is acquired as the image to be processed, and S902, S903 and S904 are performed in parallel.
S902: A filtering operation is performed on the image to be processed to determine a filtered image of the image to be processed with banding artifacts removed; an edge-texture-preserving filtering operation is performed according to the filtered image and the image to be processed to determine a target image of the image to be processed in which edge texture is preserved; proceed to S906.
S903: The banding artifact edge image corresponding to the image to be processed is detected and determined; proceed to S905.
S904: The non-flat area image corresponding to the image to be processed is detected and determined; proceed to S905.
S905: An edge-preserving filtering operation is performed on the banding artifact edge image and the non-flat area image to determine a banding artifact area weight matrix and a non-flat area weight matrix, and a fusion weight matrix is determined according to a pre-stored fusion formula; proceed to S906.
S906: A first product matrix of the fusion weight matrix and a first matrix composed of the pixel value of each pixel point in the image to be processed is determined; a second product matrix of a first difference matrix, determined by the difference between a first preset value and the element value of each element point in the fusion weight matrix, and a second matrix composed of the pixel value of each pixel point in the target image is determined; a sum matrix of the first product matrix and the second product matrix is determined, and the target image is updated according to the element value of each element point in the sum matrix.
S907: A dithering operation is performed on the target image, and the finally output target image with banding artifacts removed is determined. The whole flow of these steps is sketched in code below.
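Read end to end, S901–S907 can be sketched roughly as follows (single-channel frames assumed). The smoothing filter, the blur radii and the detect_banding_edges helper are illustrative stand-ins — the description does not fix them in this passage — while detect_not_flat_map, preserve_edge_texture and mask_dither are the helpers sketched in the other embodiments of this description.

import cv2
import numpy as np

def detect_banding_edges(img):
    # Hypothetical stand-in: the actual banding-artifact edge detection is a
    # separate step of the patent not detailed in this passage; a band-pass
    # gradient threshold is used here only so the sketch can run.
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1)
    mag = cv2.magnitude(gx, gy)
    return np.where((mag > 2) & (mag < 12), 255, 0).astype(np.uint8)

def remove_banding(src, dither_matrix):
    # S902: de-banding filter plus edge-texture-preserving blend -> target image.
    filtered = cv2.blur(src, (9, 9))                  # illustrative smoothing filter
    target = preserve_edge_texture(src, filtered)     # sketched in embodiment 8 below

    # S903 / S904: banding-artifact edge map and non-flat map (0/255 masks).
    banding_w = detect_banding_edges(src).astype(np.float32) / 255.0
    not_flat_w = detect_not_flat_map(src).astype(np.float32) / 255.0

    # S905: edge-preserving filtering of both maps (a box filter stands in),
    # then the fusion formula fusionW = (1 - bandingW) * notFlatW.
    banding_w = cv2.blur(banding_w, (5, 5))
    not_flat_w = cv2.blur(not_flat_w, (5, 5))
    fusion_w = (1.0 - banding_w) * not_flat_w

    # S906: updated target = fusionW * src + (1 - fusionW) * target.
    out = fusion_w * src.astype(np.float32) + (1.0 - fusion_w) * target.astype(np.float32)
    out = np.clip(out, 0, 255).astype(np.uint8)

    # S907: final dithering pass (embodiment 6's mask dither).
    return mask_dither(out, fusion_w, dither_matrix)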
In order to increase the speed of determining the finally output target image with banding artifacts removed, fig. 10 is a schematic process diagram of another image banding artifact removing method provided by an embodiment of the present invention; as shown in fig. 10, the process includes the following steps:
S1001: A video frame image obtained after decoding the video data is acquired, and S1002 and S1003 are performed in parallel.
S1002: A filtering operation is performed on the image to be processed to determine a filtered image of the image to be processed with banding artifacts removed; an edge-texture-preserving filtering operation is performed according to the filtered image and the image to be processed to determine a target image of the image to be processed in which edge texture is preserved; proceed to S1008.
S1003: The image to be processed is subjected to down-sampling processing, and S1004 and S1005 are performed in parallel.
S1004: The banding artifact edge image corresponding to the (down-sampled) image to be processed is detected and determined; proceed to S1006.
S1005: The non-flat area image corresponding to the (down-sampled) image to be processed is detected and determined; proceed to S1006.
S1006: An edge-preserving filtering operation is performed on the banding artifact edge image and the non-flat area image to determine a banding artifact area weight matrix and a non-flat area weight matrix, and a fusion weight matrix is determined according to a pre-stored fusion formula.
S1007: The fusion weight matrix is up-sampled; proceed to S1008.
S1008: A first product matrix of the fusion weight matrix and a first matrix composed of the pixel value of each pixel point in the image to be processed is determined; a second product matrix of a first difference matrix, determined by the difference between a first preset value and the element value of each element point in the fusion weight matrix, and a second matrix composed of the pixel value of each pixel point in the target image is determined; a sum matrix of the first product matrix and the second product matrix is determined, and the target image is updated according to the element value of each element point in the sum matrix.
S1009: A dithering operation is performed on the target image, and the finally output target image with banding artifacts removed is determined. The down-sampling variant of the weight computation is sketched below.
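A minimal sketch of the S1003–S1007 speed-up, under the same assumptions as the previous sketch; the 4× down-sampling factor and the interpolation modes are illustrative choices:

import cv2
import numpy as np

def fused_weight_fast(src, scale=4):
    # S1003: down-sample the image to be processed.
    h, w = src.shape[:2]
    small = cv2.resize(src, (w // scale, h // scale), interpolation=cv2.INTER_AREA)

    # S1004 / S1005: run both detectors on the small image.
    banding_w = detect_banding_edges(small).astype(np.float32) / 255.0
    not_flat_w = detect_not_flat_map(small).astype(np.float32) / 255.0

    # S1006: edge-preserving filtering (box filter stand-in) and fusion.
    fusion_small = (1.0 - cv2.blur(banding_w, (5, 5))) * cv2.blur(not_flat_w, (5, 5))

    # S1007: up-sample the fusion weight matrix back to full resolution.
    return cv2.resize(fusion_small, (w, h), interpolation=cv2.INTER_LINEAR)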
Example 8:
On the basis of the foregoing embodiments, fig. 11 is a schematic structural diagram of an image banding artifact removing apparatus according to an embodiment of the present invention, where the apparatus includes:
the first determining module 1101 is configured to perform a filtering operation on an image to be processed, determine a filtered image of the image to be processed after removing a strip artifact, and determine a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to a difference between pixel values of pixel points at each corresponding position in the filtered image and the image to be processed;
a second determining module 1102, configured to determine, for each element point in the difference absolute value matrix, a target size relationship between the element value and the first preset threshold and a target size relationship between the element value and the second preset threshold according to an element value of the element point, a first preset threshold and a second preset threshold, where the second preset threshold is greater than the first preset threshold;
a third determining module 1103, configured to determine, according to the target size relationship corresponding to the element value of each element point and a pre-stored value taking function of the pixel value corresponding to each size relationship, a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the edge texture retention, to obtain the target image after the edge texture retention.
Further, the size relationships include: the element value is not greater than the first preset threshold; the element value is greater than the first preset threshold and less than the second preset threshold; and the element value is not less than the second preset threshold. The third determining module is specifically configured to: for the target size relationship corresponding to the element value of each element point, if the target size relationship is that the element value of the element point is not greater than the first preset threshold or not less than the second preset threshold, respectively determine the pixel value of the pixel point at the corresponding position of the element point in the filtered image or the image to be processed, and determine that pixel value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image; if the target size relationship is that the element value of the element point is greater than the first preset threshold and smaller than the second preset threshold, determine a first difference value between the second preset threshold and the element value of the element point, determine a second difference value between the second preset threshold and the first preset threshold, determine a ratio value corresponding to the first difference value and the second difference value, determine a product value of the element value at the corresponding position of the element point in the difference matrix and the ratio value, determine a sum value of the product value and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determine that sum value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image. A sketch of this value-taking rule follows.
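By way of illustration, a minimal sketch of this piecewise rule (the preserve_edge_texture helper referenced in the example of embodiment 7 above), with th1 and th2 standing for the first and second preset thresholds; their default values here are assumptions:

import numpy as np

def preserve_edge_texture(src, filtered, th1=2, th2=8):
    # src: image to be processed; filtered: de-banded filtered image.
    src_f = src.astype(np.float32)
    diff = filtered.astype(np.float32) - src_f    # difference matrix
    abs_diff = np.abs(diff)                       # difference absolute value matrix

    target = np.empty_like(src_f)
    low = abs_diff <= th1                         # flat: take the filtered pixel
    high = abs_diff >= th2                        # strong edge/texture: keep the original
    mid = ~low & ~high                            # in between: fade from filtered to original
    target[low] = filtered.astype(np.float32)[low]
    target[high] = src_f[high]
    # src + diff * (th2 - |diff|) / (th2 - th1): this equals the filtered pixel
    # at |diff| = th1 and the original pixel at |diff| = th2, so the transition
    # between the two branches is continuous.
    target[mid] = src_f[mid] + diff[mid] * (th2 - abs_diff[mid]) / (th2 - th1)
    return np.clip(target, 0, 255).astype(np.uint8)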
Further, the apparatus further comprises:
the detection module is used for detecting and determining a banding artifact edge image and a non-flat area image corresponding to the image to be processed;
the fusion module is used for performing edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, determining a banding artifact area weight matrix and a non-flat area weight matrix, and determining a fusion weight matrix according to a pre-stored fusion formula;
the fourth determining module is used for determining a first product matrix of a first matrix formed by the fusion weight matrix and the pixel value of each pixel point in the image to be processed; determining a second product matrix of a first difference matrix and a second matrix according to a first difference matrix determined by the difference between a first preset numerical value and the element value of each element point in the fusion weight matrix and the second matrix composed of the pixel value of each pixel point in the target image;
and the updating module is used for determining a sum matrix of the first product matrix and the second product matrix and updating the target image according to the element value of each element point in the sum matrix.
Further, the fusion module is specifically configured to determine a second difference matrix according to a difference between a first preset value and an element value of each element point in the banding artifact area weight matrix; and determining a product matrix of the second difference matrix and the non-flat area weight matrix as a fusion weight matrix.
Further, the updating module is further configured to perform a filtering operation on the banding artifact edge image, and determine that an image from which a target edge with an edge length smaller than a preset length threshold is removed is an updated image of the banding artifact edge image.
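A minimal sketch of this edge-length filter, approximating the length of each edge by the pixel count of its connected component (an assumption, as is the lenTh threshold value):

import cv2
import numpy as np

def filter_short_edges(edge_img, len_th=20):
    # edge_img: 0/255 banding artifact edge image.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(edge_img, connectivity=8)
    out = edge_img.copy()
    for j in range(1, n):                         # label 0 is the background
        if stats[j, cv2.CC_STAT_AREA] < len_th:   # target edge too short -> remove
            out[labels == j] = 0
    return out                                    # updated banding artifact edge image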
Further, the detection module is specifically configured to perform a dilation operation and an erosion operation on the image to be processed, and determine a dilated image and an eroded image; determine a difference image of the dilated image and the eroded image according to the difference between the pixel values of the pixel points at each corresponding position in the dilated image and the eroded image; and perform a binarization filtering operation on the difference image, determine a binary difference image of the difference image, perform area filtering on the binary difference image, and determine the image from which regions with an area smaller than a preset area threshold have been removed as the non-flat area image.
Further, the fourth determining module is further configured to, for each pixel point in the target image, perform a bitwise AND operation between the abscissa value of the pixel point and the difference between the width of a preset error transfer matrix and a second preset value to determine a target abscissa value corresponding to the pixel point, perform a bitwise AND operation between the ordinate value of the pixel point and the difference between the height of the preset error transfer matrix and the second preset value to determine a target ordinate value corresponding to the pixel point, determine, according to the target abscissa value and the target ordinate value corresponding to the pixel point, the element value of the element point corresponding to the target abscissa value and the target ordinate value from the error transfer matrix, and determine a third matrix according to the element value, the abscissa value and the ordinate value of the element point corresponding to each pixel point;
The updating module is further configured to determine a fourth product matrix of the first difference matrix and the third matrix, determine a third sum matrix of the fourth product matrix and a fourth matrix composed of the pixel value of each pixel point in the target image, and determine, according to the element value of each element point in the third sum matrix, the pixel value of the pixel point at the corresponding position of the updated target image.
Example 9:
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. On the basis of the foregoing embodiments, an embodiment of the present invention further provides an electronic device, which includes a processor 1201, a communication interface 1202, a memory 1203 and a communication bus 1204, where the processor 1201, the communication interface 1202 and the memory 1203 communicate with each other through the communication bus 1204;
the memory 1203 has stored therein a computer program which, when executed by the processor 1201, causes the processor 1201 to perform the steps of:
performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed with banding artifacts removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference between the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold;
and determining a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
Further, the size relationships include: the element value is not greater than the first preset threshold; the element value is greater than the first preset threshold and less than the second preset threshold; and the element value is not less than the second preset threshold. The determining, by the processor 1201, of the target pixel value of the target pixel point at the corresponding position of each element point in the target image after edge texture retention, according to the target size relationship corresponding to the element value of each element point and the pre-stored value taking function of the pixel value corresponding to each size relationship, includes:
for the target size relationship corresponding to the element value of each element point, if the target size relationship is that the element value of the element point is not greater than the first preset threshold or not less than the second preset threshold, respectively determining the pixel value of a pixel point at the corresponding position of the element point in the filtered image or the image to be processed, and determining the pixel value as the target pixel value of a target pixel point at the corresponding position of the element point in the target image; if the target size relationship is that the element value of the element point is greater than the first preset threshold and smaller than the second preset threshold, determining a first difference value between the second preset threshold and the element value of the element point, determining a second difference value between the second preset threshold and the first preset threshold, determining a ratio value corresponding to the first difference value and the second difference value, determining a product value between the element value of the corresponding position of the element point in the difference matrix and the ratio value, determining a sum value between the product value and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determining the sum value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image.
Further, the processor 1201 is further configured to detect and determine a banding artifact edge image and a non-flat area image corresponding to the image to be processed;
performing edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, determining a banding artifact area weight matrix and a non-flat area weight matrix, and determining a fusion weight matrix according to a pre-stored fusion formula;
determining a first product matrix of the fusion weight matrix and a first matrix composed of the pixel value of each pixel point in the image to be processed;
determining a second product matrix of a first difference matrix and a second matrix according to a first difference matrix determined by the difference between a first preset numerical value and the element value of each element point in the fusion weight matrix and the second matrix composed of the pixel value of each pixel point in the target image;
and determining a sum matrix of the first product matrix and the second product matrix, and updating the target image according to the element value of each element point in the sum matrix.
Further, the processor 1201 is specifically configured to determine a second difference matrix according to a difference between a first preset value and an element value of each element point in the banding artifact area weight matrix;
and determining a product matrix of the second difference matrix and the non-flat area weight matrix as a fusion weight matrix.
Further, the processor 1201 is further configured to, after detecting and determining the banding artifact edge image corresponding to the image to be processed, perform a filtering operation on the banding artifact edge image, and determine the image from which target edges with an edge length smaller than a preset length threshold have been removed as the updated image of the banding artifact edge image.
Further, the processor 1201 is specifically configured to perform a dilation operation and an erosion operation on the image to be processed, and determine a dilated image and an eroded image;
determining a difference image of the dilated image and the eroded image according to the difference between the pixel values of the pixel points at each corresponding position in the dilated image and the eroded image;
and performing a binarization filtering operation on the difference image, determining a binary difference image of the difference image, performing area filtering on the binary difference image, and determining the image from which regions with an area smaller than a preset area threshold have been removed as the non-flat area image.
Further, the processor 1201 is further configured to, for each pixel point in the target image, perform a bitwise AND operation between the abscissa value of the pixel point and the difference between the width of a preset error transfer matrix and a second preset value to determine a target abscissa value corresponding to the pixel point, perform a bitwise AND operation between the ordinate value of the pixel point and the difference between the height of the preset error transfer matrix and the second preset value to determine a target ordinate value corresponding to the pixel point, determine, according to the target abscissa value and the target ordinate value corresponding to the pixel point, the element value of the element point corresponding to the target abscissa value and the target ordinate value from the error transfer matrix, and determine a third matrix according to the element value, the abscissa value and the ordinate value of the element point corresponding to each pixel point;
and determine a fourth product matrix of the first difference matrix and the third matrix, determine a third sum matrix of the fourth product matrix and a fourth matrix composed of the pixel value of each pixel point in the target image, and determine, according to the element value of each element point in the third sum matrix, the pixel value of the pixel point at the corresponding position of the updated target image.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface 1202 is used for communication between the electronic apparatus and other apparatuses.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Alternatively, the memory may be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a central processing unit, a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like.
Example 10:
On the basis of the foregoing embodiments, an embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program, where the computer program, when executed by a processor, performs the following steps:
performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed with banding artifacts removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference between the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold;
and determining a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
Further, the determining, according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, the target pixel value of the target pixel point at the corresponding position of each element point in the target image after the edge texture is retained includes:
for the target size relationship corresponding to the element value of each element point, if the target size relationship is that the element value of the element point is not greater than the first preset threshold or not less than the second preset threshold, respectively determining the pixel value of a pixel point at the corresponding position of the element point in the filtered image or the image to be processed, and determining the pixel value as the target pixel value of a target pixel point at the corresponding position of the element point in the target image; if the target size relationship is that the element value of the element point is greater than the first preset threshold and smaller than the second preset threshold, determining a first difference value between the second preset threshold and the element value of the element point, determining a second difference value between the second preset threshold and the first preset threshold, determining a ratio value corresponding to the first difference value and the second difference value, determining a product value between the element value of the corresponding position of the element point in the difference matrix and the ratio value, determining a sum value between the product value and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determining the sum value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image.
Further, the method further comprises:
detecting and determining a banding artifact edge image and a non-flat area image corresponding to the image to be processed;
performing edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, determining a banding artifact area weight matrix and a non-flat area weight matrix, and determining a fusion weight matrix according to a pre-stored fusion formula;
determining a first product matrix of the fusion weight matrix and a first matrix composed of the pixel value of each pixel point in the image to be processed;
determining a second product matrix of a first difference matrix and a second matrix according to a first difference matrix determined by the difference between a first preset numerical value and the element value of each element point in the fusion weight matrix and the second matrix composed of the pixel value of each pixel point in the target image;
and determining a sum matrix of the first product matrix and the second product matrix, and updating the target image according to the element value of each element point in the sum matrix.
Further, the determining a fusion weight matrix according to a pre-stored fusion formula includes:
determining a second difference matrix according to the difference value between the first preset value and the element value of each element point in the banding artifact area weight matrix;
and determining a product matrix of the second difference matrix and the non-flat area weight matrix as a fusion weight matrix.
Further, after the detecting determines the banding artifact edge image corresponding to the image to be processed, the method further includes:
performing a filtering operation on the banding artifact edge image, and determining the image from which target edges with an edge length smaller than a preset length threshold have been removed as the updated image of the banding artifact edge image.
Further, the determining the non-flat area image corresponding to the image to be processed includes:
performing a dilation operation and an erosion operation on the image to be processed to determine a dilated image and an eroded image;
determining a difference image of the dilated image and the eroded image according to the difference between the pixel values of the pixel points at each corresponding position in the dilated image and the eroded image;
and performing a binarization filtering operation on the difference image, determining a binary difference image of the difference image, performing area filtering on the binary difference image, and determining the image from which regions with an area smaller than a preset area threshold have been removed as the non-flat area image.
Further, the method further comprises:
for each pixel point in the target image, performing a bitwise AND operation between the abscissa value of the pixel point and the difference between the width of a preset error transfer matrix and a second preset value, determining a target abscissa value corresponding to the pixel point; performing a bitwise AND operation between the ordinate value of the pixel point and the difference between the height of the preset error transfer matrix and the second preset value, determining a target ordinate value corresponding to the pixel point; determining, according to the target abscissa value and the target ordinate value corresponding to the pixel point, the element value of the element point corresponding to the target abscissa value and the target ordinate value from the error transfer matrix; and determining a third matrix according to the element value, the abscissa value and the ordinate value of the element point corresponding to each pixel point;
determining a fourth product matrix of the first difference matrix and the third matrix, determining a third sum matrix of the fourth product matrix and a fourth matrix composed of the pixel value of each pixel point in the target image, and determining, according to the element value of each element point in the third sum matrix, the pixel value of the pixel point at the corresponding position of the updated target image.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. An image banding artifact removal method, said method comprising:
performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed with banding artifacts removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference value of the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
for each element point in the difference absolute value matrix, determining a target size relation between the element value and a first preset threshold and a second preset threshold according to the element value of the element point, the first preset threshold and the second preset threshold, wherein the second preset threshold is larger than the first preset threshold;
and determining a target pixel value of a target pixel point at a corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and a pre-stored value function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
2. The method according to claim 1, wherein the size relationship includes that the element value is not greater than the first preset threshold, the element value is greater than the first preset threshold and less than the second preset threshold, and the element value is not less than the second preset threshold, and determining the target pixel value of the target pixel point at the corresponding position of each element point in the target image after the edge texture is retained according to the target size relationship corresponding to the element value of each element point and a pre-stored value-taking function of the pixel value corresponding to each size relationship includes:
for the target size relationship corresponding to the element value of each element point, if the target size relationship is that the element value of the element point is not greater than the first preset threshold or not less than the second preset threshold, respectively determining the pixel value of a pixel point at the corresponding position of the element point in the filtered image or the image to be processed, and determining the pixel value as the target pixel value of a target pixel point at the corresponding position of the element point in the target image; if the target size relationship is that the element value of the element point is greater than the first preset threshold and smaller than the second preset threshold, determining a first difference value between the second preset threshold and the element value of the element point, determining a second difference value between the second preset threshold and the first preset threshold, determining a ratio value corresponding to the first difference value and the second difference value, determining a product value between the element value of the corresponding position of the element point in the difference matrix and the ratio value, determining a sum value between the product value and the pixel value of the pixel point at the corresponding position of the element point in the image to be processed, and determining the sum value as the target pixel value of the target pixel point at the corresponding position of the element point in the target image.
3. The method of claim 2, further comprising:
detecting and determining a banding artifact edge image and a non-flat area image corresponding to the image to be processed;
performing edge-preserving filtering operation on the banding artifact edge image and the non-flat area image, determining a banding artifact area weight matrix and a non-flat area weight matrix, and determining a fusion weight matrix according to a pre-stored fusion formula;
determining a first product matrix of the fusion weight matrix and a first matrix composed of the pixel value of each pixel point in the image to be processed;
determining a second product matrix of a first difference matrix and a second matrix according to a first difference matrix determined by the difference between a first preset numerical value and the element value of each element point in the fusion weight matrix and the second matrix composed of the pixel value of each pixel point in the target image;
and determining a sum matrix of the first product matrix and the second product matrix, and updating the target image according to the element value of each element point in the sum matrix.
4. The method of claim 3, wherein determining a fusion weight matrix according to a pre-stored fusion formula comprises:
determining a second difference matrix according to the difference value between the first preset value and the element value of each element point in the banding artifact area weight matrix;
and determining a product matrix of the second difference matrix and the non-flat area weight matrix as a fusion weight matrix.
5. The method of claim 3, wherein after the detecting determines the banding artifact edge image corresponding to the image to be processed, the method further comprises:
performing a filtering operation on the banding artifact edge image, and determining the image from which target edges with an edge length smaller than a preset length threshold have been removed as the updated image of the banding artifact edge image.
6. The method according to claim 3, wherein the determining the non-flat area image corresponding to the image to be processed comprises:
performing a dilation operation and an erosion operation on the image to be processed to determine a dilated image and an eroded image;
determining a difference image of the dilated image and the eroded image according to the difference value of the pixel values of the pixel points at each corresponding position in the dilated image and the eroded image;
and performing a binarization filtering operation on the difference image, determining a binary difference image of the difference image, performing area filtering on the binary difference image, and determining the image from which regions with an area smaller than a preset area threshold have been removed as the non-flat area image.
7. The method of claim 3, further comprising:
for each pixel point in the target image, performing a bitwise AND operation between the abscissa value of the pixel point and the difference between the width of a preset error transfer matrix and a second preset value, determining a target abscissa value corresponding to the pixel point; performing a bitwise AND operation between the ordinate value of the pixel point and the difference between the height of the preset error transfer matrix and the second preset value, determining a target ordinate value corresponding to the pixel point; determining, according to the target abscissa value and the target ordinate value corresponding to the pixel point, the element value of the element point corresponding to the target abscissa value and the target ordinate value from the error transfer matrix; and determining a third matrix according to the element value, the abscissa value and the ordinate value of the element point corresponding to each pixel point;
determining a fourth product matrix of the first difference matrix and the third matrix, determining a third sum matrix of the fourth product matrix and a fourth matrix composed of the pixel value of each pixel point in the target image, and determining, according to the element value of each element point in the third sum matrix, the pixel value of the pixel point at the corresponding position of the updated target image.
8. An image banding artifact removal apparatus, said apparatus comprising:
the first determining module is used for performing a filtering operation on an image to be processed, determining a filtered image of the image to be processed with banding artifacts removed, and determining a difference matrix and a difference absolute value matrix of the filtered image and the image to be processed according to the difference value of the pixel values of the pixel points at each corresponding position in the filtered image and the image to be processed;
a second determining module, configured to determine, for each element point in the difference absolute value matrix, a target size relationship between the element value and the first preset threshold and a target size relationship between the element value and the second preset threshold according to an element value of the element point, a first preset threshold and a second preset threshold, where the second preset threshold is greater than the first preset threshold;
and the third determining module is used for determining the target pixel value of the target pixel point at the corresponding position of each element point in the target image after the retention of the edge texture according to the target size relationship corresponding to the element value of each element point and the pre-stored value taking function of the pixel value corresponding to each size relationship, so as to obtain the target image after the retention of the edge texture.
9. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory has stored therein a computer program which, when executed by the processor, causes the processor to perform the method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program executable by a processor, which program, when run on the processor, causes the processor to carry out the method of any one of claims 1-7.
CN202110647537.7A 2021-06-08 2021-06-08 Image banding artifact removing method, device, equipment and medium Pending CN113362246A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110647537.7A CN113362246A (en) 2021-06-08 2021-06-08 Image banding artifact removing method, device, equipment and medium
PCT/CN2022/094771 WO2022257759A1 (en) 2021-06-08 2022-05-24 Image banding artifact removal method and apparatus, and device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110647537.7A CN113362246A (en) 2021-06-08 2021-06-08 Image banding artifact removing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN113362246A true CN113362246A (en) 2021-09-07

Family

ID=77533785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110647537.7A Pending CN113362246A (en) 2021-06-08 2021-06-08 Image banding artifact removing method, device, equipment and medium

Country Status (2)

Country Link
CN (1) CN113362246A (en)
WO (1) WO2022257759A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838001B (en) * 2021-08-24 2024-02-13 内蒙古电力科学研究院 Ultrasonic wave full focusing image defect processing method and device based on nuclear density estimation
CN115797343B (en) * 2023-02-06 2023-04-21 山东大佳机械有限公司 Livestock and poultry breeding environment video monitoring method based on image data
CN116485819B (en) * 2023-06-21 2023-09-01 青岛大学附属医院 Ear-nose-throat examination image segmentation method and system
CN116523924B (en) * 2023-07-05 2023-08-29 吉林大学第一医院 Data processing method and system for medical experiment
CN117437279B (en) * 2023-12-12 2024-03-22 山东艺达环保科技有限公司 Packing box surface flatness detection method and system
CN117911716B (en) * 2024-03-19 2024-06-21 天津医科大学总医院 Arthritis CT image feature extraction method based on machine vision


Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US6859565B2 (en) * 2001-04-11 2005-02-22 Hewlett-Packard Development Company, L.P. Method and apparatus for the removal of flash artifacts
CA2674164A1 (en) * 2006-12-28 2008-07-17 Thomson Licensing Detecting block artifacts in coded images and video
EP2105023A2 (en) * 2006-12-28 2009-09-30 Thomson Licensing Banding artifact detection in digital video content
US9747673B2 (en) * 2014-11-05 2017-08-29 Dolby Laboratories Licensing Corporation Systems and methods for rectifying image artifacts
US10275894B2 (en) * 2016-01-28 2019-04-30 Interra Systems Methods and systems for detection of artifacts in a video after error concealment
CN106780649B (en) * 2016-12-16 2020-04-07 上海联影医疗科技有限公司 Image artifact removing method and device
CN111402172B (en) * 2020-03-24 2023-08-22 湖南国科微电子股份有限公司 Image noise reduction method, system, equipment and computer readable storage medium
CN113362246A (en) * 2021-06-08 2021-09-07 百果园技术(新加坡)有限公司 Image banding artifact removing method, device, equipment and medium

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN1129383A (en) * 1994-10-31 1996-08-21 大宇电子株式会社 Post-processing method for use in an image signal decoding system
US20150117793A1 (en) * 2013-10-31 2015-04-30 Stmicroelectronics Asia Pacific Pte. Ltd. Recursive de-banding filter for digital images
CN103761718A (en) * 2014-02-12 2014-04-30 北京空间机电研究所 Satellite remote sensing image region stripe noise suppression device and method thereof

Non-Patent Citations (2)

Title
TAE-YONG PARK et al.: "Multitoning Method Based on Threshold Modulation Using MJBNM for Banding Artifact Reduction", Conference on Colour in Graphics, Imaging, and Vision, vol. 3, 31 January 2006 (2006-01-31), pages 1-4 *
LAI Shengsheng et al.: "A fast method for correcting CT ring artifacts", Journal of Clinical Rehabilitative Tissue Engineering Research, vol. 15, no. 13, 26 March 2011 (2011-03-26), pages 2412-2415 *

Cited By (3)

Publication number Priority date Publication date Assignee Title
WO2022257759A1 (en) * 2021-06-08 2022-12-15 百果园技术(新加坡)有限公司 Image banding artifact removal method and apparatus, and device and medium
CN116452465A (en) * 2023-06-13 2023-07-18 江苏游隼微电子有限公司 Method for eliminating JPEG image block artifact
CN116452465B (en) * 2023-06-13 2023-08-11 江苏游隼微电子有限公司 Method for eliminating JPEG image block artifact

Also Published As

Publication number Publication date
WO2022257759A1 (en) 2022-12-15

Similar Documents

Publication Publication Date Title
CN113362246A (en) Image banding artifact removing method, device, equipment and medium
CN110324664B (en) Video frame supplementing method based on neural network and training method of model thereof
CN111275626B (en) Video deblurring method, device and equipment based on ambiguity
CN111402258A (en) Image processing method, image processing device, storage medium and electronic equipment
CN109427047B (en) Image processing method and device
CN111784603A (en) RAW domain image denoising method, computer device and computer readable storage medium
CN106971399B (en) Image-mosaics detection method and device
CN111028165B (en) High-dynamic image recovery method for resisting camera shake based on RAW data
CN113068034B (en) Video encoding method and device, encoder, equipment and storage medium
JP2018107797A (en) Encoding and decoding for image data
WO2023226584A1 (en) Image noise reduction method and apparatus, filtering data processing method and apparatus, and computer device
WO2016022322A1 (en) System and method for increasing the bit depth of images
CN111741290B (en) Image stroboscopic detection method and device, storage medium and terminal
US11373279B2 (en) Image processing method and device
CN117333398A (en) Multi-scale image denoising method and device based on self-supervision
CN110717864B (en) Image enhancement method, device, terminal equipment and computer readable medium
CN111563517A (en) Image processing method, image processing device, electronic equipment and storage medium
CN110415175B (en) Method for rapidly removing flat region coding mosaic
CN111160340B (en) Moving object detection method and device, storage medium and terminal equipment
JP2007334457A (en) Image processor and image processing method
CN108668166B (en) Coding method, device and terminal equipment
CN110766117A (en) Two-dimensional code generation method and system
CN113395475B (en) Data processing method and device, electronic equipment and storage equipment
CN114359183A (en) Image quality evaluation method and device, and lens occlusion determination method
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination